🦙LlamaIndex | Text-To-SQL (LlamaIndex + DuckDB)

LlamaIndex is a simple, flexible data framework for connecting
custom data sources to large language models. It provides the key tools to augment your LLM applications with data.

In this video, I will walk you through building text-to-SQL capabilities using the different query engines of LlamaIndex together with DuckDB.
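
For anyone who wants to experiment before watching, here is a minimal sketch of a LlamaIndex + DuckDB text-to-SQL setup. It assumes the llama-index >= 0.10 package layout, the duckdb-engine SQLAlchemy driver, and an OPENAI_API_KEY in the environment; the city_stats table is made up for illustration and is not from the video.

# Minimal sketch (assumptions: llama-index >= 0.10, duckdb-engine installed,
# OPENAI_API_KEY set; the city_stats table is illustrative only).
from sqlalchemy import create_engine, text
from llama_index.core import SQLDatabase
from llama_index.core.query_engine import NLSQLTableQueryEngine

# In-memory DuckDB database through its SQLAlchemy driver
engine = create_engine("duckdb:///:memory:")
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE city_stats (city VARCHAR, population BIGINT)"))
    conn.execute(text("INSERT INTO city_stats VALUES ('Toronto', 2930000), ('Tokyo', 13960000)"))

# Wrap the database; the query engine asks the LLM to write and run the SQL
sql_database = SQLDatabase(engine, include_tables=["city_stats"])
query_engine = NLSQLTableQueryEngine(sql_database=sql_database)

response = query_engine.query("Which city has the highest population?")
print(response)

For databases with many tables, LlamaIndex also provides SQLTableRetrieverQueryEngine, which first retrieves only the relevant table schemas before generating the SQL.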

Happy Learning 😎

#llamaindex #llm #texttosql #datasciencebasics
Comments

Hi sir, can you show how to do this with an open-source model?

asdasdaa

What if my company's database has table and column names with alphanumeric characters? Could you please let me know how to train the LLM (Llama 2 with the LangChain libraries) to read those tables?

abhishekfnu

Hey, last question... I have 2 tables, and so far the query has never done an inner join. I'm trying to get info from the two tables but I'm failing. Do you have an example?

renaudgg

How do I persist the object index that is created in this demo?

rekha

Thanks for your video. The above configuration will point to the tables and views, but how can we access synonym tables? If anyone knows, please guide me.

achradwasad

Why is it not working in a Jupyter notebook?

jayrn

In the advanced section with the many-tables example, you don't use the OpenAI LLM anymore, and yet a natural-language question still works? I don't understand.

renaudgg

Can you deploy this model to a web application?

siriuppuluri

If I have a large CSV with over 100 columns, it will be tedious to create a schema. Is there a way for the engine to learn the table schema on its own instead of it being explicitly provided?

nishkarve

Thank you for sharing.
I'm constantly looking for a way to use the Meta Llama 2 model locally, rather than always relying on the OpenAI API key. Even though I genuinely believe that the OpenAI model might be superior, Llama 2 should ideally suffice. Isn't this possible solely with Llama 2?

xppoigj