Discover LlamaIndex: Ask Complex Queries over Multiple Documents

In this video, we show how to ask complex comparison queries over multiple documents with LlamaIndex. Specifically, we show how to use our SubQuestionQueryEngine object which can break down complex queries into a query plan over subsets of documents.
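The flow described above can be sketched in plain Python. Note this is an illustrative sketch of the sub-question pattern, not the actual LlamaIndex API: the function names and the string-based "answers" here are stand-ins for an LLM-driven query planner and real per-document indexes.

```python
# Illustrative sketch of the sub-question query pattern:
# 1) break a complex comparison query into one sub-question per document,
# 2) answer each sub-question against that document's own index,
# 3) synthesize the partial answers into a final response.

def decompose(query: str, doc_names: list[str]) -> dict[str, str]:
    """Produce one sub-question per document (an LLM does this step in practice)."""
    return {name: f"{query} (with respect to {name})" for name in doc_names}

def answer_sub_question(doc_name: str, sub_q: str) -> str:
    """Stand-in for querying a per-document index."""
    return f"[{doc_name}] answer to: {sub_q}"

def sub_question_query(query: str, doc_names: list[str]) -> str:
    sub_questions = decompose(query, doc_names)
    partials = [answer_sub_question(d, q) for d, q in sub_questions.items()]
    # Final synthesis step: combine the per-document answers into one reply.
    return "\n".join(partials)

print(sub_question_query("Compare revenue growth", ["uber_10k", "lyft_10k"]))
```

In the real engine, the decomposition and synthesis steps are each an LLM call, and each sub-question is routed to a query engine tool scoped to one subset of documents.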

Comments

Enjoyed the tutorial. It would be ideal if you could also upload in 1080p in the future.

nicholas-axilla

This tutorial is so good and easy to grasp compared to the ones I have watched previously in this playlist.

SivakumarSaminathan-so

The Sub Question Query Engine doesn't work with open-source LLMs because there is a dependency on OpenAI. One of your pages says some LLMs support the sub question query engine, like Zephyr, phi3-mini-4k-instruct, and llama2:70b, but when I tried them it didn't work with the sub question query engine. Is there an alternative tool to do the same thing, or will one come in the future? If I use any open LLM and an OpenAI key is present in the .env file, the Sub Question Query Engine takes the key and works with OpenAI automatically. It's a curse: when we run pip install llama-index, three OpenAI-related libraries are downloaded automatically, and if we remove them manually, nothing works.

HarikrishnanK-pn

An excellent addition to querying. There are many use cases that require querying over multiple documents in a “parallel” manner. I’m just curious how the service_context variable specifying the language model to be used is passed to the query.

AndrewGruskin

Simple and excellent. Though I have a question about the library choice: why didn't you choose LlamaIndex with Llama 2? Why did you choose LangChain's OpenAI wrapper?

mariozupan

The documentation and the videos are lacking in quality and specificity. I would focus on listing more parameters in the documentation and fewer emojis.

AndrewMagee

I am getting the error below:
ImportError: cannot import name 'ServiceContext' from 'llama_index'

basantsingh

Can you please provide the tutorial link for this?

manaranjanpradhan

What if we have thousands of documents? Do we categorize them into classes and build a query engine for each class?

tqzmyqb

Could this be done as multimodal? Image index and text index


Great! Please make more tutorials like this one.

aa-xnhc

This doesn't seem like a very scalable solution. If you have 1000 docs, will you build 1000 query engines? That doesn't seem workable...

longhaowang

bruh how can u get away with this fab video but no link to the notebook my g

jaminthalaivaa