Training Billions of Parameter LLMs with MosaicML

Featuring Hanlin Tang, CEO and Co-Founder of MosaicML, this talk dives into how MosaicML can help with training large language models with billions of parameters.

This talk was originally delivered at Arize:Observe 2023, a conference on the intersection of large language models, generative AI, and machine learning observability in the era of LLMOps.
