How to Build Q&A Models in Python (Transformers)

In this video we'll cover how to build a question-answering model in Python using HuggingFace's Transformers.

You will need to install the transformers library with:
pip install transformers

You will also need either TensorFlow or PyTorch (to follow this video exactly, you will need PyTorch). To install TensorFlow, just type:
pip install tensorflow
OR
conda install tensorflow

And for PyTorch follow the instructions under 'Install PyTorch' here:
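
Once Transformers and a backend are installed, the snippet below is a minimal sketch of the kind of extractive Q&A covered in the video (the checkpoint name and example text are assumptions, not necessarily what the video uses):

from transformers import pipeline  # requires transformers plus PyTorch or TensorFlow

# NOTE: the checkpoint below is an assumption; any extractive QA checkpoint works.
qa = pipeline("question-answering", model="deepset/bert-base-cased-squad2")

context = ("The Transformers library is maintained by Hugging Face "
           "and supports both PyTorch and TensorFlow.")
result = qa(question="Who maintains the Transformers library?", context=context)
print(result)  # dict with 'score', 'start', 'end' and 'answer'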

🤖 70% Discount on the NLP With Transformers in Python course:

Link to Q&A fine-tuning video:

You can find a link to the Medium article below:
Comments

I am enrolled in your NLP in Python course. I haven't been actively participating yet, but it is really good!

imdadood

This video is so simple to understand and so helpful for realising my project idea.

mamunurrahman

Excellent work. Even with some minor changes to the recommended HF code, I found the tutorial very helpful. Thank you.

siamakshams

How do I input a question directly? I want to know how to connect this model to messaging apps like Telegram or WhatsApp, do you know how?

alfafa

I want to get answers from a table in an image, how can I do that?

prashun

Is there a way to pre-load a context? I.e., I don't want the model to read in the entire context (e.g. a book) every time I want to ask a question.

dato

The video is helpful. Can you please tell me something about the "score", the confidence score, and how it is calculated? Is there a formula, or is it based on some metric?

rashmisingh
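
A rough sketch of where that score comes from, assuming the pipeline scores a span as the product of its start and end probabilities (the checkpoint name and example text here are assumptions):

import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

name = "deepset/bert-base-cased-squad2"  # assumed checkpoint; any extractive QA model is similar
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForQuestionAnswering.from_pretrained(name)

inputs = tokenizer("Who maintains Transformers?",
                   "Transformers is maintained by Hugging Face.",
                   return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Softmax the start and end logits into probabilities over token positions.
start_probs = outputs.start_logits.softmax(dim=-1)[0]
end_probs = outputs.end_logits.softmax(dim=-1)[0]
start_idx = int(start_probs.argmax())
end_idx = int(end_probs.argmax())

# The span score is (roughly) the product of the two probabilities;
# the pipeline additionally masks out invalid spans (e.g. an end before a start).
score = float(start_probs[start_idx] * end_probs[end_idx])
answer = tokenizer.decode(inputs["input_ids"][0][start_idx:end_idx + 1])
print(answer, score)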

I want to ask: what will happen if the question we ask has nothing to do with the context? And if we ask the same question multiple times, how do we make the program recognize that?

xuejingfu

Thank you for your helpful video 😍. How can we recognize that a question is not answerable?

niloozh
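
On unanswerable questions: a model fine-tuned on SQuAD 2.0 (which includes unanswerable questions) can return an empty answer, and the question-answering pipeline exposes a handle_impossible_answer flag. A minimal sketch, with the checkpoint name and example text as assumptions:

from transformers import pipeline

# Assumed SQuAD 2.0 checkpoint (trained with unanswerable questions).
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

result = qa(
    question="What colour is the car?",
    context="Hugging Face maintains the Transformers library.",
    handle_impossible_answer=True,  # lets the pipeline return an empty answer
)
print(result)  # an empty answer and/or a very low score suggests the question is not answerable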

Can you talk a bit about how to further train (fine-tune) the model? Do I just pass the start index and end index as the labels to BertForQuestionAnswering? I followed the Hugging Face page's example, but it doesn't work well: the loss is decreasing but the accuracy is very low. I calculated accuracy as an exact match, i.e. the predicted start index matches the label start index AND the predicted end index matches the label end index.

shanefeng
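
On the fine-tuning question above: BertForQuestionAnswering accepts start_positions and end_positions as labels and returns a loss. Below is a minimal sketch of that, plus the exact-match accuracy described in the comment; the token indices are illustrative values, not real labels:

import torch
from transformers import BertTokenizerFast, BertForQuestionAnswering

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForQuestionAnswering.from_pretrained("bert-base-uncased")

# Toy batch: in practice the start/end positions come from your annotated dataset,
# mapped from character offsets to token indices (e.g. via the tokenizer's offset_mapping).
enc = tokenizer("Who maintains Transformers?",
                "Transformers is maintained by Hugging Face.",
                return_tensors="pt")
start_positions = torch.tensor([12])  # token index of the answer start (illustrative value)
end_positions = torch.tensor([13])    # token index of the answer end (illustrative value)

outputs = model(**enc, start_positions=start_positions, end_positions=end_positions)
loss = outputs.loss  # cross-entropy over start and end positions

# Exact-match accuracy in the sense described above: both indices must match.
pred_start = outputs.start_logits.argmax(dim=-1)
pred_end = outputs.end_logits.argmax(dim=-1)
exact_match = ((pred_start == start_positions) & (pred_end == end_positions)).float().mean()
print(loss.item(), exact_match.item())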

Is it possible to implement this as a mobile app?

khaoyaa

Video idea: what are the challenges of running transformers at scale?

wryltxw

What if we want long answers? For example, if we need to extract all of the preferred qualifications from a job-description context, what should we do? I tried this, but it gives a very short answer and doesn't pull out all the details or requirements. This is just one example. Any suggestions here?

shakshuka

Great video! Do you know how to increase the number of words in the answer? Thanks :)

mariiii
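
On answer length: the question-answering pipeline accepts a max_answer_len argument that caps the length of the extracted span, and top_k to return several candidate spans. An extractive model still returns contiguous spans, so it will not stitch together scattered requirements. A sketch, with the checkpoint and example text as assumptions:

from transformers import pipeline

qa = pipeline("question-answering", model="deepset/roberta-base-squad2")  # assumed checkpoint

result = qa(
    question="What qualifications are preferred?",
    context="Preferred qualifications: 3+ years of Python, experience with NLP, "
            "familiarity with PyTorch, and strong communication skills.",
    max_answer_len=50,  # raise the cap on answer span length (the default is much shorter)
    top_k=3,            # return several candidate spans instead of just the best one
)
print(result)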