NousRedditGPT-8x7B: Legendary Mistral MoE Trained on 10k Reddit threads (on Apple MLX!)

Unleash the power of language with NousHermes-Mixtral-8x7B-Reddit🚀

The next-gen LLM fine-tuned on a dataset of 10,000 Reddit threads! Trained entirely on an Apple M3 MacBook using Apple's MLX framework, the model delivers cutting-edge AI performance within your reach. Witness its mastery of language tasks, from generating creative text formats to engaging in insightful conversations. Dive into the video and discover how Mixtral 8x7B pushes the boundaries of AI, powered by Apple's M3 chip and the MLX framework. Don't miss this glimpse into the future of language models!
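The video doesn't show the data pipeline, but as a rough illustration, here is one way a Reddit-thread dataset like this could be flattened into prompt/completion pairs for supervised fine-tuning. The field names (`title`, `selftext`, `comments`, `body`) mirror Reddit's public JSON but the overall schema and function are hypothetical, not taken from the video:

```python
def thread_to_pairs(thread):
    """Flatten one Reddit thread (hypothetical schema) into
    prompt/completion pairs for supervised fine-tuning."""
    pairs = []
    # Start the context with the post title and body text.
    context = thread["title"] + "\n" + thread.get("selftext", "")
    for comment in thread.get("comments", []):
        pairs.append({
            "prompt": context.strip(),
            "completion": comment["body"].strip(),
        })
        # Grow the context with each reply so later comments
        # are conditioned on the conversation so far.
        context += "\n" + comment["body"]
    return pairs

example = {
    "title": "What's new in MLX?",
    "selftext": "Apple's array framework for ML on Apple silicon.",
    "comments": [{"body": "It now supports LoRA fine-tuning."}],
}
print(thread_to_pairs(example))
```

Each pair could then be tokenized and fed to an MLX LoRA training loop; with ~10k threads this stays small enough to fit on a single M3 MacBook.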

Let us know what you think in the comments!

Comments

Reddit is full of halfwits now, but maybe Alexis can manage to sell it? 🎉🎉😢

GerryPrompt

While this is cool, don't you think the major players already use these datasets quietly?

dmitrisochlioukov

This video would benefit from more examples of what the model does.

LetondAtreides

This is hard to watch because of these trippy fractals.

GodFearingPookie

It's a big, big mistake to follow Apple in AI too. We already spend tons of money developing apps for Apple products, and ironically we have to keep replacing our hardware just to stay able to develop for them, on top of paying Apple $100 yearly to keep the app alive … now they will do that in AI too … I will not touch Apple for AI.

Pouya..

A GPT-4chan retrain merged with this, when? I want chaos.

nonetrix

Honestly, having a liberal bias in moderation probably serves as an unintentional way to improve the quality of the data.

RyanSmith-rbch