LLM Structured Output for Function Calling with Ollama

I explain how function calling works with LLMs. This concept is often misunderstood: the LLM doesn't call a function itself - it returns a JSON response with the values to be used for a function call from your environment. In this example I'm using the Sparrow agent to call a function.
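To make the idea concrete, here is a minimal sketch (not Sparrow's actual code; the function name and JSON shape are hypothetical) of what happens after a model served by Ollama emits structured output: your own environment parses the JSON and performs the real call.

```python
import json

# Hypothetical local function the LLM "calls" indirectly.
def get_weather(city: str, unit: str = "celsius") -> str:
    # In a real app this would hit a weather API.
    return f"Weather in {city} ({unit}): sunny"

# What a model might return when prompted to emit structured output.
# The model never executes anything itself - it only produces this text.
llm_response = '{"function": "get_weather", "arguments": {"city": "Paris", "unit": "celsius"}}'

# Your environment parses the JSON, looks up the named function,
# and performs the actual call with the supplied arguments.
call = json.loads(llm_response)
dispatch = {"get_weather": get_weather}
result = dispatch[call["function"]](**call["arguments"])
print(result)
```

The dispatch table is the key design point: the model only ever selects a function name and arguments from a set you control, so it can never execute arbitrary code.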

Sparrow GitHub repo:

Reference:

0:00 LLM and Function Calling
1:45 Example
4:25 Code
7:00 Summary

CONNECT:
- Subscribe to this YouTube channel

#llm #functions #ollama
Comments

Thank you Andrej for clarifying what function calling is.

MrLablanco

This is what I was waiting for :) Thank You ;)

himenatika

What if we want the output in csv form?

MuhammadDanyalKhan