Scaling Laws for Neural Language Models

When you train a language model, there are a number of competing factors you can set. This week Roger walked through a paper presenting empirical observations on how these factors affect model performance. The paper compares scale versus shape, examines batch size and other factors, and concludes with a summary of the scaling laws for model parameters.
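As a taste of the talk, here is a minimal sketch of the kind of power-law relationship the paper reports for parameter count, L(N) ≈ (N_c / N)^α_N, using the constants from Kaplan et al. (2020); the function name is our own illustration, not code from the paper:

```python
def loss_from_params(n_params, n_c=8.8e13, alpha_n=0.076):
    """Predicted cross-entropy loss (nats/token) for a model with n_params
    non-embedding parameters, with data and compute not limiting."""
    return (n_c / n_params) ** alpha_n

# Predicted loss falls smoothly as the parameter count grows.
for n in (1e6, 1e8, 1e10):
    print(f"{n:.0e} params -> predicted loss {loss_from_params(n):.2f}")
```

The striking point of the paper is that a simple power law like this fits across many orders of magnitude of model size.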

*Links*

*Content*
00:00 Introduction
07:30 Model parameters
20:51 Batch size
36:12 Scale vs. Shape
55:56 Scaling law summary
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
😊About Us

West Coast Machine Learning is a channel dedicated to exploring the exciting world of machine learning! Our group of techies is passionate about deep learning, neural networks, computer vision, tiny ML, and other cool geeky machine learning topics. We love to dive deep into the technical details and stay up to date with the latest research developments.

Our Meetup group and YouTube channel are the perfect place to connect with other like-minded individuals who share your love of machine learning. We offer a mix of research paper discussions, coding reviews, and other data science topics. So, if you're looking to stay up to date with the latest developments in machine learning, connect with other techies, and learn something new, be sure to subscribe to our channel and join our Meetup community today!

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
#AI #ML #LLMs #MachineLearning #InferenceOptimization #ModelPerformance #Transformers #GenerativeAI