Get Structured JSON Output from the OpenAI GPT API Consistently!



Includes few-shot learning with function calling demo!

Using OpenAI's Chat Completion API, we explore two different ways of getting structured JSON output out of GPT rather than a blob of free text - for example, if you wanted the output returned as a list, as a dictionary of key-value pairs, or even as a more complicated schema!

In the first method, we just use prompt engineering to describe what we want the output schema to be in free text. However, this gets a bit hairy when the schema gets more complicated. So as an alternative, we also explore a second method.
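Here is a minimal sketch of that first approach (the model name, schema, and example text are illustrative, not taken from the video): describe the desired schema in the system prompt and parse the reply with json.loads.

```python
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Describe the schema we want entirely in free text.
SYSTEM_PROMPT = (
    "Extract every person mentioned in the user's text. "
    'Respond ONLY with JSON of the form {"people": [{"name": "...", "age": 0}]} '
    "and no other text."
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "Alice is 31 and her brother Bob just turned 40."},
    ],
)

# Parsing can still fail if the model wraps the JSON in prose or code fences,
# which is exactly where this approach gets hairy for complex schemas.
data = json.loads(response.choices[0].message.content)
print(data["people"])
```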

In this second method, we use function calling. But instead of using it to actually call a function, we just take the arguments as our output.
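As a rough sketch of that trick (again, the function name and schema here are illustrative): define a dummy function whose parameters are your output schema, force the model to "call" it, and read the arguments back instead of ever executing anything.

```python
import json
from openai import OpenAI

client = OpenAI()

# A "function" we never actually run - its parameters ARE the output schema.
extract_people = {
    "name": "extract_people",
    "description": "Record every person mentioned in the text.",
    "parameters": {
        "type": "object",
        "properties": {
            "people": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "name": {"type": "string"},
                        "age": {"type": "integer"},
                    },
                    "required": ["name", "age"],
                },
            }
        },
        "required": ["people"],
    },
}

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Alice is 31 and her brother Bob just turned 40."}],
    functions=[extract_people],
    function_call={"name": "extract_people"},  # force the model to fill in our schema
)

# Take the function-call arguments as the structured output; nothing is ever called.
print(json.loads(response.choices[0].message.function_call.arguments))
```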

However, when the schema gets more complicated, GPT has trouble filling in all of the required arguments. So we also show you how to do "Few-Shot Learning" for function calling, teaching GPT with a few real worked examples to improve its ability to consistently deliver the output you want!
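One way to sketch that few-shot idea (reusing the client and extract_people schema from the sketch above; the example texts are made up): prepend a worked example as a user message followed by an assistant message whose function_call is already filled in the way you want, then put the real query last.

```python
import json

few_shot_messages = [
    # Worked example: sample input...
    {"role": "user", "content": "Carol, 29, went hiking with Dave, 35."},
    # ...and the fully filled-in function call we expect back for it.
    {
        "role": "assistant",
        "content": None,
        "function_call": {
            "name": "extract_people",
            "arguments": json.dumps(
                {"people": [{"name": "Carol", "age": 29}, {"name": "Dave", "age": 35}]}
            ),
        },
    },
    # The real query comes last.
    {"role": "user", "content": "Alice is 31 and her brother Bob just turned 40."},
]

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=few_shot_messages,
    functions=[extract_people],
    function_call={"name": "extract_people"},
)

print(json.loads(response.choices[0].message.function_call.arguments))
```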
Comments

I'm glad I found the hidden gem your channel is! Thank you for sharing

valentind.

Ah loving this chan. Forgot about it. It’s the bomb!!!

JOHNSMITH-verq

Just what I was looking for. Thank you for this amazing info!! 😊

AnkitSharma-rhhs

This is a great video! Can you please make another one on getting back consistent responses using the JSON format now that DevDay happened? I am curious about using the API to parse input data and return it consistently.

Very helpful channel!

bobjake

thanks, this is exactly what I needed

notgreen

Thank you! This helped out a lot with the project I am building!

arun

Great video. I think ChatGPT will have problems doing math, like word counts, for the foreseeable future. At its core, the LLM predicts the next word; using it to do math is just using the wrong tool for the job until OpenAI implements some additional math layer on top of the model.

InnocenceVVX

You mention providing examples at the end, but am I seeing that correctly that you never actually feed those into an API call to OpenAI?

thisiscrispin