Advanced AI Text Generation With Only a Few Lines of Code Using GPT-2 #NLP #AI

Implement GPT-2 with just a few lines of code using an all-new Python library called Chatting Transformer.

GPT-2 is a Natural Language Processing (NLP) Transformer model released by OpenAI that can generate high-quality text. With Chatting Transformer, you can select one of four models: gpt2, gpt2-medium, and gpt2-large, or even the largest model, gpt2-xl, with 1.5 billion parameters. You can then select one of five text generation methods: greedy, beam-search, generic-sampling, top-k-sampling, and top-p-nucleus sampling.
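
For reference, these five method names map onto standard decoding options in the Hugging Face transformers library, which Chatting Transformer is presumably built on. A minimal sketch using transformers directly (the model size and parameter values here are illustrative choices, not values from the video) might look like this:

from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
inputs = tokenizer("GPT-2 is a language model that", return_tensors="pt")

# greedy: always pick the single most probable next token
greedy = model.generate(**inputs, max_length=50)

# beam-search: keep the 5 most probable partial sequences
beam = model.generate(**inputs, max_length=50, num_beams=5, early_stopping=True)

# generic-sampling: sample from the full next-token distribution
generic = model.generate(**inputs, max_length=50, do_sample=True, top_k=0)

# top-k-sampling: sample only from the 50 most likely tokens
top_k = model.generate(**inputs, max_length=50, do_sample=True, top_k=50)

# top-p-nucleus sampling: sample from the smallest set of tokens whose
# cumulative probability exceeds 0.95
top_p = model.generate(**inputs, max_length=50, do_sample=True, top_p=0.95, top_k=0)

print(tokenizer.decode(greedy[0], skip_special_tokens=True))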

From here, only one line of code is required to start generating text using an advanced artificial intelligence model!
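
As a rough sketch of that one-line workflow (the import path, class name ChattingGPT2, and method name generate_text below are assumptions for illustration, not the library's confirmed API):

# Hypothetical sketch; the exact Chatting Transformer names may differ.
from chattingtransformer import ChattingGPT2  # assumed import path

model = ChattingGPT2("gpt2-medium")  # any of the four model sizes

# a single call generates text from a prompt
print(model.generate_text("Artificial intelligence will"))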

Note: GPT-3 is currently only available through an exclusive API offered by OpenAI.

Comments

Excellent video! Would you make a similar one for a Q&A (text summarization) chatbot?

jorgerios

I have a question that may sound pretty dumb, but I'm new to this. What would happen if you used the output of this model for a certain string to build a library of information in a tree structure: the topic being talked about at the top, and below that layer what the AI "thinks" of it, calculated by analyzing the positivity/negativity of the vocabulary used? Then you make slight random variations to the output string, use this new string as an input again, and loop this. Would it develop a personality like a human would? Thanks for the great video!

Leo-rhrq

Is it possible to tell GPT-2 to output rhymes for words you type, or even whole poems? Or is that a task only GPT-3 will achieve?

KrisTiasMusic