Generative AI - Transformers - BERT - GPT - Text Sentiment Analysis - The Secret of Dark Knowledge
Generative AI refers to a kind of artificial intelligence that aims to generate new data, such as images, music, or text, from a set of training data. Generative AI systems use machine learning techniques, such as neural networks, to analyze and learn patterns in the data, and then generate new data that follows those patterns. Several factors have led to a significant acceleration in the pace of research and development in generative AI, resulting in a range of exciting applications and possibilities for the future. Indeed, transformers, based on the attention mechanism, are at the origin of this revolution. Models such as GPT-3 and BERT have led to remarkable improvements in natural language understanding and text generation. These models can perform a wide range of tasks, including machine translation, text summarization, sentiment analysis, and conversational AI.
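As a taste of the attention mechanism behind these models, here is a minimal pure-Python sketch of scaled dot-product attention as defined in the transformer paper: softmax(QKᵀ/√d_k)V. The function names and toy matrices are illustrative, not taken from the presentation's demos.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    Q, K, V are lists of vectors (lists of floats)."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Output is the attention-weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

A query aligned with the first key draws most of its output from the first value vector, which is exactly how a transformer lets each token "look at" the most relevant other tokens.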
This presentation aims to shed light on the ecosystem of Generative AI and the models behind ChatGPT and Google Bard. After recalling the limitations of traditional models such as RNNs, LSTMs, and GRUs, the presentation explains, through the famous article "Attention Is All You Need", how the attention mechanism and the encoders and decoders behind the BERT and GPT models have managed to perform reasoning thanks to the dark knowledge revealed by such models. Live demos show how to implement NLP examples based on the BERT model.
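The "dark knowledge" in the title refers to the information hidden in a model's soft output probabilities: even wrong classes receive small, informative probability mass. It is commonly revealed by dividing the logits by a temperature before the softmax, as in knowledge distillation. The sketch below uses made-up logits for illustration; the helper name is my own, not from the talk.

```python
import math

def softmax_with_temperature(logits, T=1.0):
    # Divide logits by the temperature T before the softmax.
    # T = 1 gives the usual "hard" distribution; T > 1 softens it,
    # exposing the relative scores of the non-winning classes.
    scaled = [z / T for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical logits for three sentiment classes.
logits = [5.0, 2.0, 1.0]
hard = softmax_with_temperature(logits, T=1.0)  # winner takes almost all
soft = softmax_with_temperature(logits, T=4.0)  # runners-up become visible
```

At T = 1 the top class dominates, while at T = 4 the smaller classes keep noticeable probability: that residual structure is the dark knowledge a student model can learn from a teacher.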