Ollama Function Calling: LangChain & Llama 3.1 🦙

In this video, I introduce Ollama's tool support feature, which enables local function calling on your machine. We'll cover the easy setup, deployment options (local and cloud), and how to use different models, notably Llama 3.1 and Groq's function-calling model. I'll demonstrate practical examples, including weather queries for San Francisco, opening applications like the calculator and Chrome, and asking Claude AI questions. I'll walk through the implementation using Bun, LangChain, and TypeScript, and share the code via a GitHub repository. Perfect for enhancing your chat applications and more!

Links:

00:00 Introduction to Ollama Tool Support
00:38 Setting Up and Running Models
01:59 Example Functions and Use Cases
03:16 Implementing Function Calling
04:48 Handling Different Operating Systems
08:26 Final Thoughts and Conclusion
Comments

The best way to support this channel? Comment, like, and subscribe!

DevelopersDigest

Hi, great video! Thank you! Can you share the repo please?

Cyberspider

I'm building a Java API around this Llama chat. Can Llama call a function I implement in Java?
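For what it's worth, the model never executes code itself: Ollama's `/api/chat` endpoint returns a JSON tool call, so any language that can speak HTTP (Java included) can send the request and dispatch the result. A minimal sketch in TypeScript, with an illustrative tool schema:

```typescript
// Request body for Ollama's /api/chat endpoint with a tool definition.
// The schema below is illustrative; a Java client would build the same
// JSON and POST it to http://localhost:11434/api/chat.
const payload = {
  model: "llama3.1",
  stream: false,
  messages: [
    { role: "user", content: "What's the weather in San Francisco?" },
  ],
  tools: [
    {
      type: "function",
      function: {
        name: "get_current_weather",
        description: "Get the current weather for a city",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  ],
};

// The response's message.tool_calls field names the function to run;
// actually executing it (in any language) is entirely up to the caller.
async function requestToolCalls() {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    body: JSON.stringify(payload),
  });
  const data = await res.json();
  return data.message?.tool_calls;
}
```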

bryanwilliam

Hi, when you say Llama 3.1, do you mean the 8B and 70B models?
And if so, should we replace "Llama 3.1" in the code with the specific model tag, like "Llama 3.1 8b"?

adamkabli

I guess this is a JavaScript example.

wryltxw