Grok-1 Open Source: 314B Mixture-of-Experts Model by xAI | Blog post, GitHub/Source Code

We'll take a look at the Grok-1 release and its GitHub repository: the model architecture and other interesting parts.

00:00 - Intro
00:31 - Blog Post
03:44 - GitHub Repository
05:32 - Model Architecture
11:03 - Run Configuration

Join this channel to get access to perks and support my work:

#artificialintelligence #llm #chatgpt #rust #python #chatbot
Comments

Good luck trying to install it on a PC; it will need a serious tutorial because of the size.
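
For scale: with 314B parameters, even an int8-quantized checkpoint (which is reportedly how the released weights are stored) is on the order of 300 GB before any runtime overhead. Here is a back-of-the-envelope sketch in Python; the uniform-precision assumption is a simplification, since real checkpoints mix tensor dtypes and add metadata.

# Back-of-the-envelope checkpoint sizes for a 314B-parameter model.
# Simplifying assumption: every parameter is stored at the same precision.
PARAMS = 314e9

for fmt, bytes_per_param in [("int8", 1), ("bf16", 2), ("fp32", 4)]:
    print(f"{fmt}: ~{PARAMS * bytes_per_param / 1e9:,.0f} GB")
# int8: ~314 GB   bf16: ~628 GB   fp32: ~1,256 GB

So just holding the weights takes hundreds of gigabytes of disk and memory, which is why running it on a typical PC is impractical.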

RpgBlasterRpg

I remember going through your detectron2 video step by step. I loved that video; it helped me a lot. Seeing you again today brings back memories. Great work! Excited to see your work in the future.

mehermanoj

You should have asked GPT-4 what else could be done to improve the model's architecture, and asked it to compare itself to Grok. This is the first architecture video I've seen, though, and it went a little deeper than I was able to go myself, so thank you. I'd like to fully understand this.

Also, why 2 experts? What happens when you increase that number to 3 or 8? Do all 8 try to chime in at once?
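
A note on the routing question: in a Mixture-of-Experts layer like Grok-1's, all 8 experts exist in the weights, but a learned router scores them per token and only the top-2 actually run, so they don't all chime in at once. Raising that number to 3 or 8 makes more experts run per token, which costs proportionally more compute and erodes the sparsity that makes MoE cheap. Below is a minimal top-k routing sketch in JAX; the names and shapes are illustrative, not taken from the Grok-1 code.

import jax
import jax.numpy as jnp

# Minimal sketch of top-k Mixture-of-Experts routing (illustrative only;
# not the actual Grok-1 implementation).
num_experts, top_k, d_model = 8, 2, 16

def route(x, router_w):
    # x: (d_model,) token embedding; router_w: (d_model, num_experts)
    logits = jnp.dot(x, router_w)                       # one score per expert
    top_logits, top_idx = jax.lax.top_k(logits, top_k)  # keep the k best experts
    gates = jax.nn.softmax(top_logits)                  # mixing weights over the winners
    return top_idx, gates

x = jax.random.normal(jax.random.PRNGKey(0), (d_model,))
router_w = jax.random.normal(jax.random.PRNGKey(1), (d_model, num_experts))
idx, gates = route(x, router_w)
print(idx, gates)  # only 2 of the 8 experts are evaluated for this token

The token's output is then the gate-weighted sum of just those two experts' outputs, so the model gets 8 experts' worth of capacity for roughly 2 experts' worth of compute per token.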

texasfossilguy

I just watched your videos on ECG. I study IT at a university in Brazil and work in the healthcare field. I really want to make an app that can take a photo of an ECG and provide a diagnosis afterward. Is it possible to do this with an LSTM?
Can you help me?

gasperpb

Hi, please help me: how can I create a custom model from many PDFs in the Persian language? Thank you.

mohsenghafari

Please share your opinion on this Grok release.

seakyle