LLM Function Calling - AI Tools Deep Dive

Tool and function calling with LLMs is becoming one of the most crucial capabilities to understand, and it can elevate the way you interact with and build AI applications to the next level. I’ve put together this video to give a comprehensive overview of what tool calling is, how it works, and how you can make your own. Cheers!

I put these videos together on my own time with my own funding, if you find these resources useful and have the means, consider leaving a donation via the Super Thanks function!

Chapters:
00:00 - What is Tool/Function Calling?
03:09 - Defining Custom Tools
05:51 - LLM Tool Response
08:20 - Executing Tools
12:59 - Additional Tool Behavior
14:45 - Advanced Tools with Pydantic Schemas
17:53 - Executing Advanced Tools
19:58 - Universal Tools and Functions
20:57 - Defining Models & Tools w/LangChain
22:42 - Binding Tools w/LangChain
24:27 - Executing Tools w/LangChain
26:56 - Tools & LLMs as Agents
27:40 - Tool Calling Agent
29:13 - ReAct Agent
30:47 - Outro
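The chapters above walk through defining a tool schema, reading the LLM's tool response, and executing the call. As a rough sketch of that core loop (the `get_weather` function is a hypothetical stand-in, and the schema follows the OpenAI-style `tools` format; the model's response is simulated here rather than fetched from an API):

```python
import json

# Hypothetical local function the model may ask us to call
def get_weather(city: str) -> str:
    # Stub: a real implementation would query a weather service
    return f"Sunny in {city}"

# JSON schema advertised to the model (OpenAI-style "tools" format)
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Dispatch table mapping tool names to Python callables
available_tools = {"get_weather": get_weather}

def execute_tool_call(name: str, arguments_json: str) -> str:
    """Run the function the model requested, with its JSON-encoded arguments."""
    args = json.loads(arguments_json)
    return available_tools[name](**args)

# Simulated model response: the LLM returns a tool name plus a JSON argument string
result = execute_tool_call("get_weather", '{"city": "Berlin"}')
print(result)  # Sunny in Berlin
```

In a real application the result string would be sent back to the model in a follow-up message so it can compose its final answer.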

#ai #coding #openai
Comments

The BEST walkthrough and sample code on tool calling on YouTube!

IdPreferNot

Adam, your content is very helpful and thoughtfully put together. It’s clear a lot of hours go into its preparation.

therealsergio

An intriguing exploration of LLM function calling! Investigating further AI tools may deepen your understanding even more.

sirishkumar-mz

Nice video! I was just thinking about function calling…and your video showed up! Thanks 😊

sunitjoshi

Great video. Dude's been speaking nonstop for 30min straight. Now, I need a 2 hour break.

Horizont.

How is this possible with an LLM running on your own server, i.e. without an AI API? Like a Llama model. Is this possible with a specific model?

youtubemensch

Awesome video! Is the code shared anywhere? 😊

aishwaryaallada

The tutorial doesn't say how to insert your API key. To do this, replace

client = OpenAI()

with:

import os
from dotenv import load_dotenv

# Load environment variables from a .env file
load_dotenv()

# Pass the key explicitly when constructing the client
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

Then you can use a .env file with:
OPENAI_API_KEY="sk-proj..."

DaleIsWigging

where can we see the functions that are available with each model?

ganian

Kinda hate this dude's voice ngl. Anyone else with me?

maxpalmer

Hi Adam, all the content is well explained. I need to disable parallel calling, but I'm not sure where to put parallel_function_tool: false. Can you help me in this case?

siddheshwarpandhare