Generative AI with Large Language Models: Hands-On Training feat. Hugging Face and PyTorch Lightning

TOPIC SUMMARY
Module 1: Introduction to Large Language Models (LLMs)
- A Brief History of Natural Language Processing (NLP)
- Transformers
- Subword Tokenization (see the tokenizer sketch after this list)
- Autoregressive vs Autoencoding Models
- ELMo, BERT and T5
- The GPT (Generative Pre-trained Transformer) Family
- LLM Application Areas
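
For instance, here is a minimal sketch of subword tokenization with the Hugging Face Transformers library; the checkpoint and example sentence are illustrative choices, not taken from the training:

```python
# Minimal subword-tokenization sketch with Hugging Face Transformers.
# The checkpoint and example sentence are illustrative choices.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

text = "Generative AI with Large Language Models"
tokens = tokenizer.tokenize(text)    # subword pieces; rare words are split
token_ids = tokenizer.encode(text)   # integer IDs, with [CLS]/[SEP] added

print(tokens)
print(token_ids)
```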

Module 2: The Breadth of LLM Capabilities
- LLM Playgrounds
- Staggering GPT-Family progress
- Key Updates with GPT-4
- Calling OpenAI APIs, including GPT-4 (as sketched in the code example below)
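
As a hedged sketch of the API-calling portion, the snippet below uses the current openai Python client (which may differ from the interface shown in the recording) and assumes an OPENAI_API_KEY environment variable:

```python
# Minimal chat-completion call against the OpenAI API.
# Assumes: `pip install openai` and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY automatically

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You are a helpful data science assistant."},
        {"role": "user", "content": "Summarize what a transformer is in one sentence."},
    ],
)
print(response.choices[0].message.content)
```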

Module 3: Training and Deploying LLMs
- Hardware Options (e.g., CPU, GPU, TPU, IPU, AWS chips)
- The Hugging Face Transformers Library
- Best Practices for Efficient LLM Training
- Parameter-efficient fine-tuning (PEFT) with low-rank adaptation (LoRA), as sketched in the code example after this list
- Open-Source Pre-Trained LLMs
- LLM Training with PyTorch Lightning
- Multi-GPU Training
- LLM Deployment Considerations
- Monitoring LLMs in Production
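
As a rough sketch of the PEFT/LoRA idea (not the training's exact notebook; the base model, target modules, and hyperparameters are illustrative):

```python
# Wrap a small open-source causal LM with LoRA adapters via the peft library,
# so that only the low-rank adapter weights are trained.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "EleutherAI/gpt-neo-125m"  # small open-source LLM, for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

lora_config = LoraConfig(
    r=8,                                  # rank of the low-rank update matrices
    lora_alpha=16,                        # scaling applied to the LoRA updates
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections; names vary by model
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only a small fraction of weights are trainable
```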

Module 4: Getting Commercial Value from LLMs
- Supporting ML with LLMs
- Tasks that can be Automated
- Tasks that can be Augmented
- Best Practices for Successful A.I. Teams and Projects
- What's Next for A.I.

CHAPTERS
0:00 Intro
5:56 Module 1: Introduction to Large Language Models (LLMs)
18:17 The Models that Shaped the Field
34:40 The GPT Family: A Closer Look
38:09 Module 2: The Breadth of LLM Capabilities
49:12 GPT-4 and OpenAI APIs
57:48 Module 3: Training and Deploying LLMs
1:09:52 Advanced Training Techniques and Open-Source Options
1:45:56 Training with PyTorch Lightning and Multi-GPU Training
2:09:33 Deployment and Monitoring of LLMs
2:10:30 Module 4: Getting Commercial Value from LLMs

ABSTRACT
At an unprecedented pace, Large Language Models like GPT-4 are transforming the world in general and the field of data science in particular. This two-hour training introduces deep learning transformer architectures, including the LLMs built on them. Critically, it also demonstrates the breadth of capabilities that state-of-the-art LLMs like GPT-4 can deliver, including how they can revolutionize the development of machine learning models and commercially successful data-driven products, accelerating the creative capacities of data scientists and pushing them toward becoming data product managers. Brought to life via hands-on code demos that leverage the Hugging Face and PyTorch Lightning Python libraries, the training covers the full lifecycle of LLM development, from training to production deployment.
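
As a hedged illustration of how the two libraries fit together (a minimal sketch, not the training's exact code; the model name, learning rate, and Trainer settings are assumptions):

```python
# Fine-tune a Hugging Face causal LM with PyTorch Lightning; the commented-out
# Trainer call shows how multi-GPU training via DDP would be launched.
import torch
import pytorch_lightning as pl
from transformers import AutoModelForCausalLM

class LitCausalLM(pl.LightningModule):
    def __init__(self, model_name: str = "EleutherAI/gpt-neo-125m", lr: float = 5e-5):
        super().__init__()
        self.model = AutoModelForCausalLM.from_pretrained(model_name)
        self.lr = lr

    def training_step(self, batch, batch_idx):
        # batch is a dict with input_ids, attention_mask, and labels
        outputs = self.model(**batch)
        self.log("train_loss", outputs.loss)
        return outputs.loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.lr)

# train_loader would be a DataLoader of tokenized text batches (not shown here):
# trainer = pl.Trainer(max_epochs=1, accelerator="gpu", devices=2, strategy="ddp")
# trainer.fit(LitCausalLM(), train_dataloaders=train_loader)
```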

ABOUT THE PRESENTER
Jon Krohn is Co-Founder and Chief Data Scientist at the machine learning company Nebula. He authored the book Deep Learning Illustrated, an instant #1 bestseller that was translated into seven languages. He is also the host of SuperDataScience, the data science industry’s most listened-to podcast. Jon is renowned for his compelling lectures, which he offers at leading universities and conferences, as well as via his award-winning YouTube channel. He holds a PhD from Oxford and has been publishing on machine learning in prominent academic journals since 2010.

STAY IN TOUCH

COMMENTS

I invested a ton of time into preparing this training so I hope you find it valuable! Let me know what you like and what could be improved in the future, please :)

JonKrohnLearns

Hi Jon, I haven't gone through this video yet, but from other comments I gather that this is a great tutorial. I have seen your other videos on the math behind ML and they are very impressive. You are doing this as a service, to educate people on concepts that are not always easy to understand. This inspires me to be of service to others in whichever way I can.

johndebritto

Exceptionally clear and concise, despite being 2+ hours long. An excellent introduction to LLMs, with enough advanced bits to keep more seasoned viewers hooked as well!

jordhanus

Extremely well done, Jon. I was able to immediately get to work and learn thanks to your concise explanations of everything. I'm blown away by how much you covered, without any fluff, in a manner that was very approachable.

texasdaveodell

This is the best instruction for getting started with LLMs for laymen. A true lifesaver. I appreciate your generosity in unlocking this treasure trove. Can't wait to dig deep into this tutorial. It's difficult to suggest improvements, but I do want to know, say, how to allow the custom inputs of 10 different types of restaurants to have a stronger influence on the output, since these 10 restaurants speak the same way but differ slightly, so their outputs should be similar. The use case: custom chatbots for each industry that save time on manual inputs for each new restaurant. Subscribed and liked. Thanks a lot!

scammersnightmare

Thank you so much Jon. I appreciate your time and efforts. Please keep uploading videos on NLP.

shankargupta

Hey Jon, I wanted to express my sincere gratitude for the incredible training session on generative AI with LLMs you shared with us. The depth of knowledge and insight you provided in such a short time was truly amazing.

luisrodriguesphd

Awesome, Jon. You have covered everything from A to Z in one session. I really appreciate your time and efforts to grow the community.

upskillwithchetan

Hi Jon! I tried to run the example in Colab, but the GPT4All-inference notebook reported that there is no module named 'nomic.gpt4all', so it won't take me past the second step of the download. I'm sure it's user error, but could you point me in the right direction please? Thanks!

josephburak

Pure 🔥, Jon! Just like your Deep Learning Illustrated! 🎉🏆

AP-hvdh

Hi Jon, many thanks for the effort and time you put into preparing this content. As always, your sessions are simple and crisp, with complex concepts explained very nicely along with hands-on examples.

abhijitdarwade

Dude you still got it!!! Great video!!!

tadandergart

Thank you, sir. Your Deep Learning Illustrated videos (all 3) are interesting; now I am following all your lectures.

sridevi

Loved this, really insightful and helpful. Thank you! And did anyone ever tell you that you sound a lot like Sean Carroll? Which is also very cool :-)

JamesBradyGames

Jon Krohn! Thanks my teacher! I'm going to enjoy and eat this content! ❤

temiwale

Great session Jon! This was very helpful 😁

ShawhinTalebi

God bless you for putting this up. I have no other means to train.

ireneaustin

It's really a great training. It covers all the different aspects in a graceful manner. Greatly appreciated. Thank you so much!

Just curious to know if there is any plan to make the tutorial on "Deep Learning with PyTorch and TensorFlow" available to the community.

divyeshrajpura

Hi Jon, thank you for all your awesome content. I'm trying to follow along but get a module-not-found error when first importing the model in Google Colab. Any idea what's changed?

AaranDanielMusic

Sir, can I start ML in parallel with this ML Foundations series, or should I start ML only after completing the maths, beginning with algebra? That is my biggest problem 😞. Please answer.

himanshuchouhan