Function Calling - EVERYTHING you NEED to know!


Affiliate Links (support the channel):

Chapters:
0:00 The power of function calling LLMs
0:56 Video Overview
2:45 What are function calling models?
6:45 Function calling fine-tuning datasets
11:52 Training an LLM for function calling
37:49 Inferencing function calling LLMs
41:04 OpenAI Function calling
47:15 Inferencing function calling Llama 7B
1:00:29 Inferencing OpenChat 7B
1:04:27 Inferencing DeepSeek 67B
1:08:55 Function calling EN ESPAÑOL
1:14:27 Choosing a function calling model
Comments

Great video!! I am interested in the performance of models with function calling. Do you know how a function calling model would perform if used for customer service, and what the requirements (server, GPU, RAM) would be?

gdelco

I really enjoy these longer-form videos. Thanks for the effort and details.
Interesting observation on how the model types influence function calling performance.
Do the DeepSeek and OpenChat licenses allow commercial use?

brennonwilliams

I really appreciate your videos; you make these things so accessible and easy to understand. Have you thought of making a Discord server? Nearly everyone around LLMs and their mother has a server: Mistral, Qwen, Nous Research (except for companies like Microsoft, Facebook, Google). I'm asking because there is a huge number of people hanging around these LLM-focused servers looking to learn exactly what you're showing. People would be able to collaborate on dataset curation and research in a better way than a YouTube format allows. You could also get video suggestions or ideas, and since you offer a place to sell datasets and models, it would help if people had a place to collaborate with others on their own individual datasets. Too many places aren't exactly friendly to beginners.

okj

Fantastic overview, with great detail and clarity. Kudos for creating the dataset.

mamotivated

Brilliant video! Great content. Thank you very much!

RemekKinas

Hi! Thanks so much for this video! In the dataset, I was wondering whether the function column of each row provides a list of all the functions available to the model. Why or why not?

carrietam
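
For readers wondering what such a row could look like, here is a purely hypothetical example of a training sample that carries a function-list column; the field names are illustrative and not the exact schema of the dataset discussed in the video.

```python
# Hypothetical training row: a "functions" column lists every tool available
# for this sample, alongside the user turn and the target assistant output.
row = {
    "functions": [
        {
            "name": "get_stock_price",
            "description": "Get the latest price for a ticker symbol.",
            "parameters": {
                "type": "object",
                "properties": {"ticker": {"type": "string"}},
                "required": ["ticker"],
            },
        },
        {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    ],
    "user": "What's Apple trading at right now?",
    "assistant": '{"name": "get_stock_price", "arguments": {"ticker": "AAPL"}}',
}
```

Listing the available functions in every row lets the model learn to choose among several tools, or to ignore them when none apply, at the cost of a longer prompt per sample.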

Can we call predefined JavaScript functions? I was thinking of using Llama 2 as a chatbot that calls functions I have already defined in my Angular app, with UI changes occurring if the response from the model is correct. Is this even possible with function calling?

VijayDChauhaan
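
The pattern described here is essentially a dispatch table: the model returns a structured call, and the app decides whether to run one of its predefined functions. A minimal Python sketch of that idea follows (the function names are made up); the same pattern translates directly to TypeScript in an Angular app.

```python
import json

# Hypothetical app functions the model is allowed to trigger.
def show_banner(message: str) -> None:
    print(f"[UI] banner: {message}")

def open_settings_page() -> None:
    print("[UI] navigating to settings")

REGISTRY = {"show_banner": show_banner, "open_settings_page": open_settings_page}

def dispatch(model_output: str) -> bool:
    """Parse the model's function-call JSON and invoke the matching app function.
    Returns False (and changes nothing) if the output is not a valid call."""
    try:
        call = json.loads(model_output)
        fn = REGISTRY[call["name"]]
        fn(**call.get("arguments", {}))
        return True
    except (json.JSONDecodeError, KeyError, TypeError):
        return False

# e.g. the model replies with a JSON function call:
dispatch('{"name": "show_banner", "arguments": {"message": "Order placed"}}')
```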

Really helpful video. I have one question: if I already have an LLM that's fine-tuned using instruction tuning, can I fine-tune it again to add function calling?

augustyasharma
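
One common way to add a new skill to an already instruction-tuned model is to attach LoRA adapters and train only those on the new dataset. A minimal sketch, assuming a Llama-2 chat checkpoint (the model id is an assumption, not necessarily what the video uses):

```python
# Attach LoRA adapters to an instruction-tuned model, then continue training
# on a function-calling dataset while the base weights stay frozen.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-chat-hf")

lora_cfg = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # attention projections in Llama-style models
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)   # only the small adapter weights will train
model.print_trainable_parameters()

# From here, train as usual on a function-calling dataset; freezing the base
# weights limits catastrophic forgetting of the original instruction tuning.
```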

Thanks for uploading this video.
I have a question about the dataset. What percentage of the questions involve a function call versus no function call? And why did you build the data with that split?

B-ixlo

What a great video; everything was very well explained. I wonder how much the function calling results would improve if examples in Spanish were added to the training dataset?

raestrada

Can I use this method to fine-tune the GPT-3.5 Turbo model so it better understands and uses the parameter values for my function calling, where it's calling an API?

varunmehra

What do you think the performance of Mixtral-8x7B would be for function calling? Will it be better than DeepSeek, given the comparable parameter count and Mixtral-8x7B's overall better performance on normal (logical) tasks?

shubhashish

How long will it take to train a 34B model? Do you offer a GPTQ script?

yiouyou

Very good video, I learnt a lot.
I wish you a Merry Christmas and please keep up the good work.

gustavstressemann

SUCH A SCAM FOR NEWCOMERS. WHY WOULD ANYONE PAY FOR THE MODEL THAT YOU FINE-TUNED?

ArtemiiKhristich

Amazing video. Wouldn't it make sense to use constrained decoding? Since you have the function definitions, you know the parameters that have to come back for function calling.

frazuppi
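
True constrained decoding masks invalid tokens at every generation step (for example via a JSON grammar). A lighter-weight stand-in that needs no special decoder support is to validate the generated call against the function's parameter schema and re-sample on failure; a sketch of that fallback, assuming any text-completion callable `generate`:

```python
import json
import jsonschema  # pip install jsonschema

def generate_validated_call(generate, schema, prompt, max_retries=3):
    """Re-sample until the model's output parses as JSON and matches the
    function's parameter schema. A validate-and-retry fallback, not true
    constrained decoding, which would mask invalid tokens at each step."""
    for _ in range(max_retries):
        raw = generate(prompt)  # `generate` is any text-completion callable
        try:
            call = json.loads(raw)
            jsonschema.validate(call, schema)
            return call          # valid function call
        except (json.JSONDecodeError, jsonschema.ValidationError):
            continue
    return None                  # give up after max_retries failures
```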

Which branch has the code from this video? Thanks.

StevenPack-nhns

It's cool using the function list column. I wanted to use that but was worried I would cram up the context window. But a vector database with function lists should be awesome. Thanks for sharing, man.

befikerbiresaw
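
A sketch of the retrieval idea mentioned above: embed each function description once, then at query time keep only the best-matching definitions in the prompt, so the full function list never has to fit in the context window. The function names and the embedding model choice here are illustrative.

```python
import numpy as np
from sentence_transformers import SentenceTransformer  # pip install sentence-transformers

# Hypothetical function descriptions; these are what we embed and search over.
FUNCTIONS = {
    "get_weather": "Get the current weather for a city.",
    "search_products": "Search the product catalogue by keyword.",
    "create_ticket": "Open a customer-support ticket.",
}

model = SentenceTransformer("all-MiniLM-L6-v2")
names = list(FUNCTIONS)
doc_vecs = model.encode([FUNCTIONS[n] for n in names], normalize_embeddings=True)

def relevant_functions(user_message: str, top_k: int = 2) -> list[str]:
    """Return the top_k function names whose descriptions best match the
    message, so only those definitions need to go into the prompt."""
    query = model.encode([user_message], normalize_embeddings=True)
    scores = (doc_vecs @ query.T).squeeze()        # cosine similarity (vectors are normalized)
    return [names[i] for i in np.argsort(scores)[::-1][:top_k]]

print(relevant_functions("What's the weather like in Madrid?"))
```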

Thank you for the good video! :) I understand that the LIMA methodology uses full-parameter fine-tuning. In that case, if fine-tuning is performed on the function-calling dataset, it seems likely that a function call will be incorrectly triggered when given a general prompt that does not require function calling. Is this the case in practice? If so, what is the solution?

soc

@TrelisResearch thank you for the great content. I wonder whether padding on the right is mandatory for function calling, since my understanding is that for decoder models we usually pad on the left. Might any unexpected behavior be caused by padding on the right? Thanks

MW-ezmw
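
For reference, one common convention with Hugging Face tokenizers is right padding during training and left padding for batched generation with decoder-only models; a small sketch (the model id is just an example, not necessarily the one used in the video):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
tok.pad_token = tok.eos_token   # Llama-style models ship without a pad token

# Training: right padding keeps the prompt tokens left-aligned.
tok.padding_side = "right"
train_batch = tok(["short prompt", "a much longer training prompt"],
                  padding=True, return_tensors="pt")

# Batched generation: left padding so the last token of every row is a real
# token, which is what a decoder-only model conditions on when generating.
tok.padding_side = "left"
gen_batch = tok(["short prompt", "a much longer generation prompt"],
                padding=True, return_tensors="pt")
```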