Anthropic MCP with Ollama, No Claude? Watch This!

Anthropic released the Model Context Protocol (MCP), which lets you connect LLMs to your own data and tools. In this video Chris shows how to decouple MCP from Claude Desktop and Claude 3.5 Sonnet, run MCP natively in your own applications, and use open-source LLMs such as Qwen 2.5, Llama 3.2, or IBM Granite via Ollama, or even GPT-4o and GPT-4o-mini from OpenAI.
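The decoupling described above boils down to a backend-agnostic loop: the host sends the conversation plus tool definitions to whichever LLM is configured, and when the model asks for a tool, the host routes the call to the matching MCP server and feeds the result back. Below is a minimal, hypothetical sketch of that routing step; the names (`handle_turn`, `query_db`, the reply shape) are illustrative stand-ins, not code from the video's repo or the MCP SDK.

```python
# Sketch of the tool-routing step an MCP host performs, independent of
# which LLM backend (Ollama, OpenAI, etc.) produced the reply.
# All names here are illustrative, not taken from the actual repo.

def handle_turn(llm_reply, tool_registry):
    """Route one model reply: execute a requested tool, or pass text through."""
    if "tool_call" in llm_reply:
        call = llm_reply["tool_call"]
        tool = tool_registry[call["name"]]           # look up the tool by name
        result = tool(**call["arguments"])           # invoke with model-supplied args
        return {"role": "tool", "content": result}   # result goes back to the model
    return {"role": "assistant", "content": llm_reply["text"]}

# Example: a fake "query_db" tool standing in for an MCP SQLite server.
registry = {"query_db": lambda sql: f"rows for: {sql}"}
reply = {"tool_call": {"name": "query_db", "arguments": {"sql": "SELECT 1"}}}
print(handle_turn(reply, registry))  # {'role': 'tool', 'content': 'rows for: SELECT 1'}
```

In a real host the registry is populated from each MCP server's advertised tool list during the initialization handshake, and the loop repeats until the model returns plain text.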

GitHub repo:
Comments

This is what I expected from MCP, not having it tied to Claude Desktop. Good job!

Moukrea

Legend. Just came across MCP and was hoping someone would build a client as a demo. Thanks.

RheinardKorf

This is tremendous, Chris! I learned so much, and I really enjoy the way you present the information. Thank you!

KenBurge

Works well! Tried it with Ollama and the SQLite server. An interesting new CLI option could be a "server" mode able to receive and respond to messages coming from API calls, so the chat+MCP would be available to remote computers/agents.

securedev-fr

Great job Chris in creating and explaining this!

goyalnitj

Thank you so much for this video! It helped me add MCP tools to my Node.js local LLM project.

coolmcdude

This is exactly the video I was looking for. AI is crazy; every day there's news worth seeing.

henriquematias

Would be cool if you could dig into the initialization handshake of the tools, how the LLM decides which tools to use, and the eventual conversation loop between the CLI and the tools on the way to the response to the user 😊

CarlintVeld

Excellent stuff, thank you. Would love to see a video on how to use JSON-RPC in other projects.

MicheleHjorleifsson

A wonderful implementation and explanation!

adisegev

Amazing, Chris! I saw your video with Claude Desktop, which seems perfect but is kind of a black box for me. Now you've changed it into a developer-friendly playground with a source-code explanation. Big thank you!!!

michalsestrienka

Fantastic! Cannot wait to check out the source code!!

terryliu

Great video! Can you make a follow-up on how to get this integrated into something like Open WebUI?

Boopdaloop

Great job, Chris! Any plans to support multiple servers like Claude Desktop can? I've built an MCP server that can build PowerPoint decks; I can chain it with the SQLite MCP server to pull data from a database and dynamically build tables and charts in PowerPoint, all in one prompt.

RussellAshby

Works pretty well. However, I would mention that uv can be installed onto your base system, and with this setup it will complain if you use your own venv; it's not a good idea to install Python packages to your base system. Just a note for anyone else. BTW, I used it with Llama 3.2.

hand-eye

Very nice! Can it handle multiple steps in one shot like Claude Desktop?

Dave-cp

I've been wondering: Anthropic has many optimizations, such as in-context RAG, prompt caching, and some others I can't remember. Does this MCP setup come with those optimizations ready, or do we have to optimize it ourselves?

NLPprompter

Very cool. I added the memory MCP and Tavily search to the config, and it seems to work. The current setup requires one CLI per server, is that correct?

peterdecrem

How does this compare to AutoGen, and how could they work together? There seems to be some overlap, but not entirely... I can't quite wrap my head around integrating them.

roygatz

This would be cool to use as a host with FastMCP for the servers.

JD-hkiw