Function Calling with Mistral AI

- Notebook:

Comments

Excellent. Love you, Mistral! Thank you for your lack of censorship and for treating customers like adults.

rodvik

Excited to bring Mistral into Taskade with our upcoming Multi-Agent update! 😊

Taskade

Congratulations on the launch of the channel ☺ great video, looking forward to the next ones!

andfanilo

Good stuff, the tools lifecycle looks clean and straightforward. Trying it out by porting existing OpenAI tool-calling setups to Mistral. Thanks for sharing, and please keep sharing ...

mhsnalm

What is the purpose of the Mistral client? Can we replace it with a model run locally?

pabloe

I don't want to use it as a hosted open-source LLM; instead I want it local, deployed in my own cloud service. If I deploy it on Azure, what are the CPU and GPU requirements? And can I use LangChain?

RiteshKumar-xoll

How does the model know which function to run? I have to explain to it somehow what each function does...

markus_EU_AT
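The model only learns what a function does from the JSON schema passed in the `tools` list, in particular each function's `description` and parameter descriptions; it matches the user's request against those. A minimal sketch, assuming the `mistralai` Python client shown in the notebook (the `get_weather` tool, its fields, and the API key are made-up placeholders; import paths may differ between SDK versions):

```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

# The "description" fields below are the only information the model has
# about what each function does; it uses them to decide which one to call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical example function
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "City name, e.g. 'Paris'.",
                    }
                },
                "required": ["city"],
            },
        },
    }
]

client = MistralClient(api_key="YOUR_API_KEY")
response = client.chat(
    model="mistral-large-latest",
    messages=[ChatMessage(role="user", content="What's the weather in Paris?")],
    tools=tools,
    tool_choice="auto",  # let the model decide whether and which tool to call
)

# If the model chose a tool, the call (name + JSON arguments) comes back
# in tool_calls rather than in the plain text content.
print(response.choices[0].message.tool_calls)
```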

Great! Can we change ENDPOINT = "localhost" (or base_url) and set api_key="NONE"? That would be excellent!

joelwalther
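For what it's worth, the Python client does take a custom endpoint, so pointing it at a self-hosted server that speaks the same API is at least plausible; whether function calling then works depends on the model behind that endpoint. A minimal sketch, with an assumed local URL and model name (not from the video) and assuming the client's `endpoint` parameter:

```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

# Hypothetical local deployment exposing a Mistral-compatible API.
# The URL, the api_key placeholder, and the model name are assumptions.
client = MistralClient(
    api_key="NONE",                    # a local server may ignore the key entirely
    endpoint="http://localhost:8000",  # custom endpoint instead of api.mistral.ai
)

response = client.chat(
    model="mistral-7b-instruct",       # whatever model the local server serves
    messages=[ChatMessage(role="user", content="Hello!")],
)
print(response.choices[0].message.content)
```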

There's a mention of tool execution on the "model" side. What's the use case for that?

AurobindoTripathy

I tried to do this with the OpenAI client and base_url set to my local Mistral-7B endpoint, basically using Mistral-7B as a stand-in replacement for the OpenAI models. The tools format should be the same, right? It works with the GPT models but not with Mistral. Any idea why?

shackyalla
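Matching the request format is only half of it: the serving layer has to translate `tools` into the prompt format the model was trained on, and the Mistral-7B instruct checkpoints available at the time were not fine-tuned for function calling, so they tend to ignore the tools or dump the JSON into the content. For reference, the usual OpenAI-compatible setup looks like the sketch below (the local URL, key, and model name are assumptions):

```python
from openai import OpenAI

# Hypothetical: an OpenAI-compatible server (e.g. vLLM) hosting Mistral-7B.
client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local endpoint
    api_key="not-needed",                 # placeholder for a local server
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool, same schema as with GPT models
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="mistral-7b-instruct",
    messages=[{"role": "user", "content": "Weather in Paris?"}],
    tools=tools,
    tool_choice="auto",
)

# If the model/server doesn't actually support tool calling, tool_calls is
# typically None and any JSON the model writes ends up in message.content.
print(response.choices[0].message.tool_calls)
print(response.choices[0].message.content)
```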

What other types of functions are supported? Is there any good documentation?

pvp

Are function calling and a system prompt compatible features? With tool_choice set to "auto" but a use case that demands a function call, the model writes the JSON to call the function but includes it as part of the content, instead of using tool calls explicitly.

MaximoPower
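They should be compatible; a system prompt doesn't disable tool calls. When the use case always requires a call, one thing worth trying is `tool_choice="any"`, which in the Mistral API forces the model to call one of the provided tools instead of leaving the decision to `"auto"`. A minimal sketch under the same assumptions as above (hypothetical tool and system prompt, `mistralai` Python client):

```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

client = MistralClient(api_key="YOUR_API_KEY")
response = client.chat(
    model="mistral-large-latest",
    messages=[
        ChatMessage(role="system", content="You are a terse weather assistant."),  # hypothetical system prompt
        ChatMessage(role="user", content="What's the weather in Paris?"),
    ],
    tools=tools,
    tool_choice="any",  # force a tool call instead of a free-form content answer
)
print(response.choices[0].message.tool_calls)
```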

How does a request get translated into LLM input? Are you using special tokens to denote function-call or response messages? Thanks for the help.

ayubsubhaniya

Really exciting, guys 🙌 Is the new function calling only available via the new Mistral Large model, or is it also available with the smaller models?

roke

Does Mixtral 8x7B have function calling, or just the Large API model?

parkersettle

Is the API format for function calling identical to the OpenAI format?

alex-r

Great, is Mistral 7B capable of function calling?

Techonsapevole

Thank you for using the same format as OpenAI. Could the integration be seamless using the OpenAI JS/TS client SDK?

ZiperRom

Is there function-calling support for TypeScript and Next.js, or is it only possible with Python?

derax

Hey, that's great, Mistral. But I think the API is not free and doesn't offer any free tier, right?

DevsDoCode