How To Use AutoGen STUDIO with ANY Open-Source LLM Tutorial

Step-by-step setup guide for a fully local LLM with LM Studio and Ollama using AutoGen Studio. One thing to note: there is still a small issue with LM Studio and the 2-token limit in Studio. I hope this gets fixed soon, but either way, I show you how to connect.
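For reference, here is a minimal Python sketch (not from the video) of pointing an AutoGen config at the two local servers. It assumes a recent pyautogen with the 0.2-style config and the default ports for Ollama's OpenAI-compatible endpoint and LM Studio's local server; the model names and API keys are placeholders you'd swap for whatever you have loaded.

```python
import autogen

# Both Ollama and LM Studio expose OpenAI-compatible endpoints locally.
# These are the default ports; adjust them if you changed your server settings.
config_list = [
    {
        "model": "mistral",                       # placeholder: any model you've pulled in Ollama
        "base_url": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        "api_key": "ollama",                      # local servers accept any non-empty key
    },
    {
        "model": "local-model",                   # placeholder: the model loaded in LM Studio
        "base_url": "http://localhost:1234/v1",   # LM Studio's local server (Start Server first)
        "api_key": "lm-studio",                   # placeholder; the key is ignored locally
    },
]

# A bare-bones agent pair to confirm the connection works.
assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
user_proxy = autogen.UserProxyAgent(
    name="user",
    human_input_mode="NEVER",
    code_execution_config=False,
)

user_proxy.initiate_chat(assistant, message="Say hello in one sentence.")
```

In AutoGen Studio itself, the same values (model name, base URL, API key) go into the Model settings in the UI rather than a config_list.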

You can download the IDE I use, and you can grab the Conda environment with the following download as well:

— — — — — — — — —

📰 📰 📰 Don't forget to sign up for the 🆓 newsletter below to get updates on AI, what I'm working on, and struggles I've dealt with (which you may have too!):


— — — — — — — — —

Connect With Me:
* 🐦 X (twitter): @TylerReedAI
* 📸 Instagram: TylerReedAI

— — — — — — — — —


📖 Chapters:
00:00 Welcome to the Course!
00:47 Studio Start
01:27 Ollama
02:04 Model, Agent, Workflow
04:53 LM Studio
06:54 Model, Agent, Workflow
10:11 Outro

💬 If you have any issues, let me know in the comments and I will help you out!
Comments

I've been waiting very patiently for this. Thank you, Tyler.

steveknows

Super, thanks! I was searching for exactly this.

theprocess-YT

Why use both Ollama and LM Studio when LM Studio has a Multi Model Session and you can run multiple LLMs in there?

kine

Hi, where do I find the localhost URL for Ollama during setup? Please help.

theprocess-YT

Is this new AutoGen nerfed? There are only 3 available model types in my version (0.1.3): OpenAI, Google, Azure.

madimakes

I am getting the following error when I run the playground:
Error occurred while processing message: ConversableAgent.__init__() got an unexpected keyword argument 'description'

Thoughts on this?

vincentnestler