Phi-1: A 'Textbook' Model
After a conversation with one of the 'Textbooks Are All You Need' authors, I can now bring you insights from the new phi-1 tiny language model. See if you agree with me that it tells us much more than how to write good code: it affects AGI timelines by telling us whether data will be a bottleneck.
I cover 5 other papers, including WizardCoder, Data Constraints (how more epochs could be used), TinyStories, and more, to give context to the results and end with what I think timelines might be and how public messaging could be targeted.
With extracts from Sarah Constantin in Asterisk and Carl Shulman on Dwarkesh Patel, Andrej Karpathy and Jack Clark (co-founder of Anthropic), as well as the Textbooks and TinyStories co-author himself, Ronen Eldan, I hope you get something from this one. And yes, the title of the paper isn't the best.