HuggingFace Finetuning Seq2Seq Transformer Model Coding Tutorial

In this video, we're going to fine-tune a T5 model using HuggingFace to solve a seq2seq problem.
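
Since the page doesn't reproduce the video's code, here is a minimal sketch of what this setup typically looks like with the transformers Seq2SeqTrainer; the t5-small checkpoint, the hyperparameters, and the one-example toy dataset are placeholders, not the values used in the video.

```python
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

# Toy parallel data standing in for a real seq2seq dataset.
raw = Dataset.from_dict({
    "source": ["translate English to German: Hello, how are you?"],
    "target": ["Hallo, wie geht es dir?"],
})

checkpoint = "t5-small"  # placeholder; any T5 checkpoint works the same way
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

def preprocess(batch):
    # `text_target` routes the target text through the tokenizer and
    # stores the resulting token ids under "labels".
    return tokenizer(
        batch["source"],
        text_target=batch["target"],
        max_length=128,
        truncation=True,
    )

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="t5-finetuned",
    per_device_train_batch_size=8,
    num_train_epochs=3,
    learning_rate=3e-4,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```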

Comments

Thanks, I learned a lot. I've got a question though: to train a chatbot model (like GODEL), how do we tokenize the data? That is, what goes in as the input and what as the output?
P.S. Please keep uploading tutorials like this.

aleefbilal
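
On the chatbot question above: the usual seq2seq framing is that the flattened dialogue history is the input and the next response is the target. A sketch of the tokenization, where the turn prefixes and the separator are illustrative assumptions, not GODEL's exact format:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")  # placeholder checkpoint

# One training example: the dialogue history is the model input,
# the next bot reply is the target.
history = [
    "User: Hi, can you recommend a movie?",
    "Bot: Sure, what genre do you like?",
    "User: Something with time travel.",
]
response = "Bot: You might enjoy Primer or Looper."

# Joining turns with " EOS " is an assumption; each model family
# defines its own history format and separator tokens.
source = " EOS ".join(history)

features = tokenizer(source, text_target=response, truncation=True, max_length=256)
# features["input_ids"] -> tokenized history, features["labels"] -> tokenized reply
```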

Hey there! Thank you for the insightful video. I am trying to fine-tune a pre-trained BART model from HuggingFace for conversational tasks. Do you have any resources for this? Thank you!

ftay

Hello! Pretty good video, mate, hope you can make more content. I just have a quick question: how can I turn an MLM model into a seq2seq model in HuggingFace with PyTorch? Do you have any documentation to accomplish this? Many thanks!

samuelparada
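
On the MLM question above: the transformers library documents a warm-starting route via its EncoderDecoderModel class, which glues two pretrained encoder-only checkpoints into a seq2seq model (the decoder gains randomly initialized cross-attention layers, so the combined model still needs fine-tuning). A sketch with BERT, the checkpoint names being placeholders:

```python
from transformers import AutoTokenizer, EncoderDecoderModel

# Tie two pretrained MLM checkpoints together as encoder and decoder;
# the decoder's cross-attention is added fresh and must be trained.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# BERT has no native decoder-start or pad conventions for generation,
# so these must be set explicitly before training or generating.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
```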

Hello! Can this be used for semantic search? Thank you in advance.

etherealshift

Can this be used for sequential sentence classification?

abdullahalsefat

Thank you so much! I was wondering which tokenizer to use in the seq2seq trainer: the one for the input or the one for the target? I am trying to use different tokenizers for each side.

oyusuphgmail
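
On the last question: Seq2SeqTrainer doesn't take separate tokenizers for the input and the target. The usual pattern is a single tokenizer for both sides, with the target routed through the `text_target` argument (older transformers versions used the `as_target_tokenizer` context manager instead). For multilingual models the language pair is configured on that one tokenizer rather than by swapping tokenizers; a sketch using mBART-50 as an assumed example:

```python
from transformers import AutoTokenizer

# One tokenizer object handles both sides; the source and target
# languages are set on it rather than by using two tokenizers.
tokenizer = AutoTokenizer.from_pretrained("facebook/mbart-large-50")
tokenizer.src_lang = "en_XX"
tokenizer.tgt_lang = "de_DE"

batch = tokenizer(
    "Hello, how are you?",
    text_target="Hallo, wie geht es dir?",
    return_tensors="pt",
)
# batch["input_ids"] carries the English source, batch["labels"] the German target.
```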