Function Calling with Llama 3.1 and Ollama | Langchain

In this video, we explore how to implement function (or tool) calling with Llama 3.1 and Ollama locally.
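The video presumably demonstrates LangChain's tool-binding helpers; as a dependency-free illustration, here is a minimal sketch of the mechanics underneath: a tool registry, an OpenAI-style tool schema of the kind Ollama's chat API accepts for Llama 3.1, and a dispatcher that runs whatever tool call the model emits. The tool name and weather example are illustrative assumptions, not taken from the video.

```python
import json

# Hypothetical tool the model may call (illustrative, not from the video).
def get_current_weather(city: str) -> str:
    """Return a canned weather report for the given city."""
    return f"Sunny, 22C in {city}"

TOOLS = {"get_current_weather": get_current_weather}

# Schema advertised to the model; Ollama accepts an OpenAI-style "tools" list
# like this for tool-capable models such as llama3.1.
TOOL_SPECS = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def dispatch_tool_call(tool_call: dict) -> str:
    """Look up and execute the tool named in a model tool-call message."""
    name = tool_call["function"]["name"]
    args = tool_call["function"]["arguments"]
    if isinstance(args, str):  # some clients serialize arguments as a JSON string
        args = json.loads(args)
    return TOOLS[name](**args)

# Example message shaped like a llama3.1 tool call returned by Ollama:
fake_call = {"function": {"name": "get_current_weather",
                          "arguments": {"city": "Paris"}}}
print(dispatch_tool_call(fake_call))  # Sunny, 22C in Paris
```

In a real loop you would send `TOOL_SPECS` with the chat request, run `dispatch_tool_call` on each tool call in the response, and feed the results back to the model as tool messages.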

#llama3.1 #functioncalling #ollama #llama3 #llm #langchain #opensource #nlp #machinelearning #coding #python #datascience #ai
Comments

Great video. God bless you bro for kindly sharing your knowledge.

jofjofjof

Cool bro, I really wanted this, thank you so much! Please add a logo to your channel and stay consistent with this quality; your content is very nice. I'm also building an AI tool: I made an AI agent using the ollama library and some other libraries (including a package of mine that isn't public), and it got even better thanks to this tutorial. The last thing I need is a video on how to add unlimited memory (that is, long-term context) to my agent. I use ollama.generate, so I can't use the LLM's built-in context, and even if I could, I wouldn't, because it's very short. I watched AI Austin's unlimited-memory video, but I couldn't apply it to my agent, since he implemented it on his own agent instead of showing how to do it in any project. Also, please use local storage instead of any cloud vector database or other cloud database.
Thank you, I hope you can do this today; it's an emergency, otherwise I wouldn't ask for it instantly. Please bro! 🙏🙏😥😥

siddhubhai
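The local-storage memory the commenter asks about can be sketched without any cloud service. A minimal version, assuming (as the comment states) an agent built on `ollama.generate` with a flat prompt string: store turns in a local JSON file and inject the most relevant ones back into the prompt yourself. The class name, file layout, and keyword-overlap scoring are all illustrative assumptions; a real system would use local embeddings for recall.

```python
import json
import os
import tempfile

class LocalMemory:
    """Persist conversation turns to a local JSON file and recall them by keyword overlap."""

    def __init__(self, path: str):
        self.path = path
        self.turns = []
        if os.path.exists(path):
            with open(path) as f:
                self.turns = json.load(f)

    def remember(self, role: str, text: str) -> None:
        """Append a turn and write the whole history back to disk."""
        self.turns.append({"role": role, "text": text})
        with open(self.path, "w") as f:
            json.dump(self.turns, f)

    def recall(self, query: str, k: int = 3) -> list:
        """Return up to k stored turns ranked by naive word overlap with the query."""
        words = set(query.lower().split())
        scored = sorted(
            self.turns,
            key=lambda t: len(words & set(t["text"].lower().split())),
            reverse=True,
        )
        return scored[:k]

# Demo: fresh temp file so each run starts empty.
path = os.path.join(tempfile.mkdtemp(), "agent_memory.json")
mem = LocalMemory(path)
mem.remember("user", "My favourite city is Paris")
print([t["text"] for t in mem.recall("Which city do I like?")])
# -> ['My favourite city is Paris']
```

Before each `ollama.generate` call, you would prepend the recalled turns to the prompt, then `remember` both the user input and the model's reply afterwards.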