The FASTEST LLM Inference Endpoint Ever

Subscribe and press the bell for more videos
Here we test the new Groq Inference API for free. The results are mind-blowing.
#groq
#llms
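The video doesn't show any code, but a call like the test above can be sketched against Groq's OpenAI-compatible REST endpoint. The endpoint URL, the `mixtral-8x7b-32768` model id, and the `GROQ_API_KEY` environment variable are assumptions based on Groq's public API, not taken from the video:

```python
# Minimal sketch of a Groq chat-completion request, stdlib only.
# Assumes Groq's OpenAI-compatible endpoint and an API key in GROQ_API_KEY.
import json
import os
import urllib.request

API_URL = "https://api.groq.com/openai/v1/chat/completions"  # assumed endpoint


def build_chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask_groq(prompt: str, model: str = "mixtral-8x7b-32768") -> str:
    """Send one prompt to Groq and return the first completion's text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_chat_payload(model, prompt)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['GROQ_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Only hit the network if a key is actually configured.
if os.environ.get("GROQ_API_KEY"):
    print(ask_groq("What does the acronym LLM stand for?"))
```

Timing this call versus the same model on another provider is essentially what the video's speed comparison amounts to.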
Comments
Author

Legal language model 😂
The speed seems great, but I'm guessing the trade-off is quality. Mixtral 8x7B normally knows the definition of the term "LLM".

Do you have any benchmark data for the same models when using Groq for inference?

DanielVagg