Spring AI - Bring your own data by Stuffing the Prompt

In this tutorial we will take a look at a technique known as stuffing the prompt, which lets us add some context to our prompts before sending a request to the LLM. In this example we use OpenAI's GPT-4 model, but the technique works with all of the supported LLMs.
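The core of the technique can be sketched in plain Java: take a prompt template, substitute your own document into a context placeholder, and send the resulting string to the model. The class, method, and template text below are illustrative assumptions for this sketch, not Spring AI's actual API (Spring AI ships `PromptTemplate`/`SystemPromptTemplate` classes that do this substitution for you):

```java
import java.util.Map;

// Minimal illustration of "stuffing the prompt": the user's own data is
// inserted into the prompt text so the LLM can answer from that context.
public class PromptStuffing {

    // A template with placeholders for the stuffed document and the question.
    static final String TEMPLATE = """
            Answer the question using only the information in the DOCUMENT section.
            If the answer is not in the document, say "I don't know".

            DOCUMENT:
            {context}

            QUESTION:
            {question}
            """;

    // Replace each {key} placeholder with its value from the map.
    static String stuff(String template, Map<String, String> values) {
        String result = template;
        for (var entry : values.entrySet()) {
            result = result.replace("{" + entry.getKey() + "}", entry.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        String context = "The 2024 Olympics were held in Paris.";
        String prompt = stuff(TEMPLATE, Map.of(
                "context", context,
                "question", "Where were the 2024 Olympics held?"));
        // The stuffed prompt is what would be sent to the chat model.
        System.out.println(prompt);
    }
}
```

In the video, the same idea is expressed with Spring AI's template support and the filled-in prompt is passed to the chat client instead of printed.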

Spring AI 1.0 Updates

The syntax for some of the features covered in this video may have changed slightly with the release of 1.0. Please check out the video I did on the changes in 1.0.0 M1.

🔗Resources & Links mentioned in this video:

👋🏻Connect with me:

Comments

As a junior developer, I find your explanations really good! Thanks again for the awesome content.

jodye

Dan! You're the man! These Spring AI OpenAI videos have been tremendously helpful in building out my own Spring Boot chatbot that uses the OpenAI Assistants API to get real-time data with an external function call. A video dealing with using Assistants to trigger external API calls would be great!

hfk

Thanks. It would be great if the next video extended this example to use a vector database.

LordLamers

Hi Dan, this is a nicely explained video on AI using Spring. Trust me, hardly anyone talks about it. I have just one concern: as far as I know, the "context" we stuff into the prompt is limited by the model's token window. That is why we need to learn the RAG architecture. In short, solving the problem with this approach alone isn't feasible in the real world.

satyaprakashnayak