HuggingFace + Ray AIR Integration: A Python Developer’s Guide to Scaling Transformers | Anyscale

ABOUT THE TALK:
Hugging Face Transformers is a popular open-source library that provides cutting-edge machine learning (ML) models. Still, meeting the computational requirements of the advanced models it offers often means scaling beyond a single machine. This session explores the integration between Hugging Face and Ray AI Runtime (AIR), which lets users scale model training and data loading seamlessly. We dive deep into the implementation and API and show how Ray AIR can be used to build an end-to-end Hugging Face workflow, from data ingest through fine-tuning and hyperparameter optimization (HPO) to inference and serving.
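
EXAMPLE CODE SKETCH:
The snippet below is a minimal sketch of the fine-tuning step described above, written against the Ray ~2.x AIR API (HuggingFaceTrainer and ScalingConfig); later Ray releases renamed parts of this API. The model name, dataset paths, and the assumption that the CSV files already contain tokenized features are placeholders for illustration, not details taken from the talk.

import ray
from ray.air.config import ScalingConfig
from ray.train.huggingface import HuggingFaceTrainer

def trainer_init_per_worker(train_dataset, eval_dataset, **config):
    # Runs once on each Ray worker and returns a standard transformers.Trainer.
    from transformers import (AutoModelForSequenceClassification, Trainer,
                              TrainingArguments)
    model = AutoModelForSequenceClassification.from_pretrained(
        "distilbert-base-uncased", num_labels=2)
    args = TrainingArguments(
        output_dir="/tmp/hf_air_output",
        per_device_train_batch_size=16,
        num_train_epochs=1,
        evaluation_strategy="epoch",
    )
    return Trainer(model=model, args=args,
                   train_dataset=train_dataset, eval_dataset=eval_dataset)

# Ray Datasets handle distributed data ingest. The paths are placeholders, and the
# files are assumed to already hold tokenized columns (input_ids, attention_mask, label).
train_ds = ray.data.read_csv("s3://example-bucket/train_tokenized.csv")
eval_ds = ray.data.read_csv("s3://example-bucket/eval_tokenized.csv")

trainer = HuggingFaceTrainer(
    trainer_init_per_worker=trainer_init_per_worker,
    scaling_config=ScalingConfig(num_workers=4, use_gpu=True),  # scale beyond one machine
    datasets={"train": train_ds, "evaluation": eval_ds},
)
result = trainer.fit()
print(result.metrics)

In the Ray 2.x AIR design, the same trainer object can be wrapped by Ray Tune's Tuner for the HPO stage mentioned above, and the checkpoint returned in result can then feed batch inference and serving.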

ABOUT THE SPEAKERS:
Jules S. Damji is a lead developer advocate at Anyscale, an MLflow contributor, and co-author of Learning Spark, 2nd Edition. He is a hands-on developer with over 25 years of experience and has worked at leading companies such as Sun Microsystems, Netscape, @Home, Opsware/LoudCloud, VeriSign, ProQuest, Hortonworks, and Databricks, building large-scale distributed systems.

Antoni Baum is a software engineer at Anyscale, working on Ray Tune, XGBoost-Ray, Ray AIR, and other ML libraries. In his spare time, he contributes to various open source projects, trying to make machine learning more accessible and approachable.

ABOUT DATA COUNCIL:
Make sure to subscribe to our channel for the most up-to-date talks from technical professionals on data-related topics, including data infrastructure, data engineering, ML systems, analytics, and AI from top startups and tech companies.
