'High-Performance Training and Inference on GPUs for NLP Models' - Lei Li

UCSB's Institute for Energy Efficiency
2022 Emerging Technologies Review
Original Presentation Date: January 21, 2022

Title: High-Performance Training and Inference on GPUs for NLP Models
Speaker: Lei Li, UCSB

Biography: Lei Li is an assistant professor in the Computer Science Department at the University of California, Santa Barbara. His research interests lie in natural language processing, machine translation, and AI-powered drug discovery. He received his B.S. from Shanghai Jiao Tong University and his Ph.D. from Carnegie Mellon University. His dissertation on fast algorithms for mining co-evolving time series was runner-up for the ACM KDD Best Dissertation Award. His recent work on the AI writer Xiaomingbot received the second-class Wu Wen-tsün AI Prize in 2017. He is a recipient of the ACL 2021 Best Paper Award, the CCF Young Elite Award in 2019, and was a CCF Distinguished Speaker in 2017. His team won first place in five language translation directions and the corpus filtering challenge at WMT 2020. Previously, he worked in the EECS department at UC Berkeley, at Baidu's Institute of Deep Learning in Silicon Valley, and at ByteDance as the founding director of its AI Lab. He has served as an Associate Editor of TPAMI and as an organizer and area chair/senior PC member for multiple conferences, including KDD, ACL, EMNLP, NeurIPS, AAAI, IJCAI, WSDM, and CIKM. He has published over 100 technical papers in machine learning, NLP, and data mining and holds more than 10 patents. He launched ByteDance's machine translation system (VolcTrans), and many of his algorithms have been deployed in production, serving over a billion users.