Question Answering Research - Ep. 4 - Retrieval Augmented Generation (RAG)

Weekly Research Group, July 15th, 2021

This week, Nick walked us through more of the details of the RAG architecture, looking at both the RAG-Sequence and RAG-Token variants. He also demonstrated some example code showing how to apply RAG to your own dataset; the example answers some questions about Game of Thrones :)
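The key difference between the two variants discussed above is where the marginalization over retrieved documents happens: RAG-Sequence uses one document for the whole output sequence and sums over documents once, while RAG-Token sums over documents at every output token. A toy numeric sketch (the probabilities below are made up for illustration, not from any real model):

```python
import math

# Toy setup: 2 retrieved documents, 2 output tokens.
# doc_prior[z] = p(z|x), the retriever's probability for document z.
doc_prior = [0.7, 0.3]
# tok_prob[z][i] = p(y_i | x, z, y_<i), the generator's probability of
# the i-th gold token when conditioned on document z.
tok_prob = [
    [0.9, 0.8],  # conditioned on document 0
    [0.2, 0.6],  # conditioned on document 1
]

def rag_sequence(doc_prior, tok_prob):
    # Marginalize once per sequence:
    # p(y|x) = sum_z p(z|x) * prod_i p(y_i | x, z, y_<i)
    return sum(p_z * math.prod(probs)
               for p_z, probs in zip(doc_prior, tok_prob))

def rag_token(doc_prior, tok_prob):
    # Marginalize at every token:
    # p(y|x) = prod_i [ sum_z p(z|x) * p(y_i | x, z, y_<i) ]
    n_tokens = len(tok_prob[0])
    prob = 1.0
    for i in range(n_tokens):
        prob *= sum(p_z * probs[i]
                    for p_z, probs in zip(doc_prior, tok_prob))
    return prob

print(rag_sequence(doc_prior, tok_prob))  # 0.7*0.72 + 0.3*0.12 = 0.540
print(rag_token(doc_prior, tok_prob))     # 0.69 * 0.74 = 0.5106
```

With the same per-document probabilities, the two variants give different sequence likelihoods, which is why they can prefer different answers at decode time.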

Next week, I’ll walk us through some example code for applying BERT to an AirBnb pricing dataset (containing text, numerical, and categorical features) using the concatenation approach for combining the features. Sign up here:
Comments

Hello Nick,

I noticed that they use a dataset with question-and-answer pairs for fine-tuning the BART generator model (correct me if I am wrong here).

If I only have data with questions and their corresponding passages, how can I fine-tune the generator model to improve answers for my custom data?

adarshkm

Hello Nick! Thanks for the amazing conversation. I'm deciding between the RAG and LFQA techniques and need some clarity on that. From what I've found, both work the same way (retrieving the relevant documents and generating answers), except that LFQA can generate longer, more detailed responses.

My question is: is it possible to use some model that can absorb all of my domain-specific data, so that when I ask it a question it generates the answer from its own knowledge, without retrieving documents first?

Thanks! Please enlighten me.

aayushsmarten

Can we use BERT as the retriever and BLOOM as the generator in a RAG model on custom data?

owaizdero

Thank you, Nick and Chris, for your nice research on RAG. Would it also be interesting to evaluate the new BlenderBot 2.0 from FB research?

alelasantillan

Thank you for this great explanation. I wonder if you know of any model that uses RAG for language generation rather than QA. (By language generation, I'm thinking of GPT-3, where you feed it a few words and it generates a whole passage.)

riverdong

Why does your model generate only one line?

abhishekprakash