NanoGPT using Simpsons Data: Get Started with Large Language Models
NanoGPT is a simple, fast repository for training/finetuning medium-sized GPTs. I recommend it as a way to get a better handle on how large language models work. This video walks through using it on a Simpsons dataset. It covers why I chose nanoGPT, how I munged the Simpsons dataset, how I trained my first model, and ways to keep learning.
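For reference, here is a minimal sketch of the kind of munging step the video describes: flattening the Simpsons dialogue into a single input.txt that a nanoGPT-style char-level prepare script can encode. The file name and columns (simpsons_script_lines.csv, raw_character_text, spoken_words) follow the common Kaggle Simpsons scripts dataset and are assumptions; the repo in the video may organize the data differently.

import pandas as pd

# Assumed Kaggle layout: simpsons_script_lines.csv with
# raw_character_text / spoken_words columns.
df = pd.read_csv("simpsons_script_lines.csv", low_memory=False)
df = df.dropna(subset=["raw_character_text", "spoken_words"])

# Keep a "Character: line" structure so the model learns script-style dialogue.
lines = df["raw_character_text"].astype(str).str.strip() + ": " + df["spoken_words"].astype(str).str.strip()

with open("input.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines.tolist()))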
Chapters:
00:00:00 intro
00:00:17 Why NanoGPT
00:00:52 Simpsons dataset
00:01:47 Using the Google Colab notebooks
00:02:24 pull into nanogpt_simpsons repo
00:04:18 using the config files
00:05:36 training the model
00:06:12 getting predictions
00:07:16 using Weights & Biases for experiment management (config sketch below)
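The chapters on config files, training, sampling, and Weights & Biases map onto nanoGPT's usual workflow: a config is just a Python file whose variables override the defaults in train.py. The sketch below uses a hypothetical config name (config/train_simpsons_char.py) and small-model settings chosen to fit a single Colab GPU; the exact values in the video's repo may differ.

# config/train_simpsons_char.py (hypothetical name) -- overrides for nanoGPT's train.py
out_dir = 'out-simpsons-char'
dataset = 'simpsons_char'      # expects data/simpsons_char/train.bin and val.bin

eval_interval = 250
eval_iters = 200

# a small GPT that trains quickly on one GPU
n_layer = 6
n_head = 6
n_embd = 384
dropout = 0.2
block_size = 256
batch_size = 64
learning_rate = 1e-3
max_iters = 5000
lr_decay_iters = 5000

# Weights & Biases experiment tracking (the 00:07:16 chapter)
wandb_log = True
wandb_project = 'simpsons-nanogpt'
wandb_run_name = 'mini-gpt'

Training and sampling then follow nanoGPT's standard commands: python train.py config/train_simpsons_char.py to train, then python sample.py --out_dir=out-simpsons-char to print generated script lines.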
━━━━━━━━━━━━━━━━━━━━━━━━━
★ Rajistics Social Media »
━━━━━━━━━━━━━━━━━━━━━━━━━
NanoGPT meets the Simpsons #machinelearning #largelanguagemodels #datascience #gpt4
Train your own language model with nanoGPT | Let’s build a songwriter
nanoGPT result demo and repo with tutorial
karpathy/nanoGPT - Gource visualisation
NanoGPT: Train your own generative AI in record time
RajaGPT | Ep-1 | Starting with NanoGPT from Karpathy's code | ft. Raja Ayyanar | June 04, 2023
Crafting a nanoGPT from Scratch on Your Terminal without OpenAI | FREE
Nvidia Earnings and Other AI News (August 2023)
Making SimpleGPT2 — a GPT-2 implementation that prioritizes readability and education
Nano’s (like this) are the smallest size of cache that you can find! 👀
GPT (nanoGPT) from a beginner's perspective (Part 1)
Unlocking the Power of Tiny Language Models: Training, Performance, and Future Potential
DoctorGPT Coding
minGPT code walkthrough
309 - Training your own Chatbot using GPT
LLama 2: Andrej Karpathy, GPT-4 Mixture of Experts - AI Paper Explained
Ronen Eldan | The TinyStories Dataset: How Small Can Language Models Be And Still Speak Coherent
how to use GPT-J notebook
[80] Solving NLP (Natural Language Processing) Tasks Using Chat GPTs & LLMs (Large Language Mode...
New in AI - DoctorGPT, Fooocus, gpt-llm-trainer, Platypus
Reinforcement Learning with AI Feedback (RLAIF) for Large Language Models
Eat Melon! Reinforcement Learning Demo
Efficient Large Language Model training with LoRA and Hugging Face PEFT