GPT-3 Alternative - OPT-175B Hugging Face Language Model Tutorial

Meta AI recently shared Open Pretrained Transformer (OPT-175B), a language model with 175 billion parameters trained on publicly available datasets.

For the first time for a language technology system of this size, the release includes both the pretrained models and the code needed to train and use them.

This video contains three parts:

00:00 Quick Intro about OPT-175B
03:51 OPT-175B Live Demo with Alpa
08:07 OPT-1.3B Text Generation Hands-on Coding using Hugging Face Transformers
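The hands-on segment uses the smaller OPT-1.3B checkpoint through Hugging Face Transformers. A minimal sketch of that workflow, assuming the public `facebook/opt-1.3b` Hub checkpoint; the prompt and sampling settings are illustrative, not taken from the video:

```python
# Minimal OPT-1.3B text generation with Hugging Face Transformers.
# Requires: pip install transformers torch
from transformers import pipeline, set_seed

set_seed(42)  # make sampling reproducible

# Downloads ~2.6 GB of weights from the Hugging Face Hub on first run.
generator = pipeline("text-generation", model="facebook/opt-1.3b")

outputs = generator(
    "Meta AI released OPT-175B because",
    max_length=50,            # total tokens, prompt included
    do_sample=True,           # sample instead of greedy decoding
    num_return_sequences=2,   # generate two completions
)

for out in outputs:
    print(out["generated_text"])
```

The pipeline returns a list of dicts, one per requested sequence, each with a `generated_text` key containing the prompt plus the continuation.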


Comments

This being Meta, I'm scared it might become the new Google someday. It may not, but the possibility is there.

mystwalker

Davinci has a 4,000-token limit. Does OPT state what its token limit is?

knowledgelover

Can you try China's trillion-parameter language model?

mvrdara

Hmm 🤔 We should have a contest for the most efficient Wikipedia “fact check” of an output using the model.

ScriptureFirst

All in all, can we say there aren't good results with this technique?

neilsal

It throws an error: NameError: name 'generator' is not defined

alvarochuiso
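That `NameError` usually means the cell that creates the pipeline never ran: `generator` must be defined before it is called. A minimal sketch of the fix, using the smaller public `facebook/opt-125m` checkpoint to keep the download light (the same pattern applies to `facebook/opt-1.3b`):

```python
# Fix for "NameError: name 'generator' is not defined":
# create the pipeline before calling it.
from transformers import pipeline

# This line must run (and finish downloading weights) first.
generator = pipeline("text-generation", model="facebook/opt-125m")

# Greedy decoding so the output is deterministic.
result = generator("Hello, my name is", max_length=20, do_sample=False)
print(result[0]["generated_text"])
```

In a notebook, re-running the generation cell after a kernel restart without re-running the setup cell produces exactly this error.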

The title says 175B, but you're using the 1.3B model.

kutupbear