RouteLLM achieves 90% of GPT-4o Quality AND is 80% CHEAPER


Go from shiny demos to reliable AI products that delight your customers with Langtrace. Visit the website to learn more and join the community of innovators.

Join My Newsletter for Regular AI Updates 👇🏼

Need AI Consulting? 📈

My Links 🔗

Media/Sponsorship Inquiries ✅
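
For context on the claim in the title: RouteLLM sits in front of a strong and a weak model and decides, per prompt, which one should answer, so most traffic goes to the cheap model while harder prompts still reach the expensive one. Below is a minimal sketch using the open-source lm-sys/RouteLLM Controller; the router name, cost threshold, and model identifiers are illustrative assumptions, not the exact setup from the video.

```python
# Minimal sketch of query routing with lm-sys/RouteLLM.
# Assumptions: router "mf", the 0.11593 threshold, and the model ids below
# are placeholders; check them against the RouteLLM docs for your setup.
import os

from routellm.controller import Controller

os.environ["OPENAI_API_KEY"] = "sk-..."   # key for the strong model
# The weak model also needs its provider key, e.g. GROQ_API_KEY for Groq.

client = Controller(
    routers=["mf"],                        # "mf" = matrix-factorization router
    strong_model="gpt-4o",                 # expensive, high-quality model
    weak_model="groq/llama3-70b-8192",     # cheap model (LiteLLM-style id)
)

# The model string picks the router and its cost threshold; a higher
# threshold sends more prompts to the strong model.
response = client.chat.completions.create(
    model="router-mf-0.11593",
    messages=[{"role": "user", "content": "Summarize RouteLLM in one sentence."}],
)
print(response.choices[0].message.content)
```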

Comments

My "AI Stack" is RouteLLM, MoA, and CrewAI. What about you?

matthew_berman

Would love to see a tutorial on how to set this up.

davtech

Please do a tutorial on installing this locally. Thanks

cool

Matthew - for those of us who develop line-of-business apps for SMEs - local LLM deployment is a must. Would certainly like to see you demo RouteLLM with orchestration - Thanks!

velocityerp

it'd be cool if you did a vid on setting it up and running it locally

clapppo

I am a cybersecurity analyst who knows very little about coding, so between your videos and just straight asking ChatGPT or Claude, I am ham-fisting my way through getting AI to run locally. Please keep making tutorial videos - I am excited to see how to implement RouteLLM!

josephremick
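
Several commenters ask about running this locally. Here is a rough sketch of what that might look like, assuming RouteLLM's LiteLLM backend accepts Ollama-style model identifiers and that an Ollama server is running on the same machine; the model names and threshold are placeholders to verify against the RouteLLM docs.

```python
# Sketch of routing to a locally hosted weak model instead of a hosted API.
# Assumptions: `ollama serve` is running on localhost, and RouteLLM's LiteLLM
# backend accepts "ollama_chat/<model>" ids - verify both before relying on it.
import os

from routellm.controller import Controller

os.environ["OPENAI_API_KEY"] = "sk-..."    # the strong model is still hosted

client = Controller(
    routers=["mf"],
    strong_model="gpt-4o",                  # hard prompts still go to the hosted model
    weak_model="ollama_chat/llama3",        # easy prompts stay on the local model
)

response = client.chat.completions.create(
    model="router-mf-0.11593",              # placeholder cost threshold
    messages=[{"role": "user", "content": "Explain what an LLM router does."}],
)
print(response.choices[0].message.content)
```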

Yes! Please show us a comprehensive breakdown of this great tool!
I’m also interested in your sponsor’s product, LangTrace. Can you possibly show us how to use it?

bernieapodaca

Great breakdown, much appreciated. I definitely foresee local LLMs becoming dominant for organisations as soon as next year. My advice during consults is for them not to invest a massive amount in high-end, data-secure cloud systems, but to hang on a little, work with dummy data on current models to build up foundational knowledge, and then, once capable local options exist, start diving into more sensitive analytics.

aiforculture

Just popping up to say thanks, Matthew. You have become almost my only required source for AI news because your take is right up my street every time. Great work, keep it coming!

jamesvictor

Hey Matt, yes, it would be great if you could show a demo of how to set up this model on Azure OpenAI or Azure Databricks and then use it in an application.

AshishKumar-hgcl

Yes, please provide a tutorial on setting up the described language model.

MichaelLloydMobile

I feel like everyone is realising things at the same time. I started two projects: the first is an LLM co-ordination system, and the second is chain-of-thought processing on specific models.

AngeloXification

Absolutely do a detailed tutorial on how to get this up and running!

caseyvallett

Yes! The tutorial. Great vid. Sharing with my crew... Just beginning an AI consulting agency, and cost is an existential threat!!!

CookTheBruce

I also saw that 20 TB M.2 drives are expected within a couple of years. Running this LLM locally will be really cool.

mrbrent

How would this affect Mixture of Agents? Could we combine multiple RouteLLM routers, since they use so much less compute?

DJMaster

There seems to be a hold-up on the highest-end models as the leading companies continually try to improve safety while watching their competition. Nobody seems to want to jump in and release a new, better model at the risk of having the "dangerous" label applied to them. So a lot of the progress remains hidden in the lab, waiting for competition to finally engage.

joe_limon

Can this be used to route between agents as opposed to model instances? For example, routing to a chain-of-thought agent vs. a simple Q&A agent?

MEvansMusic

It would be interesting to see how this performs on your AI benchmark. Please do a setup and a test.

madelles

Thanks for this video. Very informative.

Please make a full tutorial on setting up RouteLLM and what the recommended specs for a local PC should be. Thank you in advance!

wardehaj