Orca 2 by Microsoft: Teaching Small Language Models How to Reason

In this video we dive into the Orca 2 model, presented in a recent research paper by Microsoft titled "Orca 2: Teaching Small Language Models How to Reason".

We first provide background on the previous Orca paper, which was released earlier this year, so prior knowledge of the first Orca model is not required to follow this video. We discuss what imitation learning is, and how explanation tuning helps boost the knowledge gained with imitation learning, as shown in Orca 1.
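As a rough illustration of explanation tuning, the sketch below shows how a single training example could be assembled. The system prompt wording and the query_teacher helper are hypothetical stand-ins for querying a strong teacher model such as GPT-4, not the exact setup from the paper:

```python
# Illustrative sketch only: explanation tuning in the spirit of Orca 1.
# `query_teacher` is a hypothetical placeholder for calling a strong teacher
# model; the system prompt wording is an assumption, not the paper's exact text.

EXPLANATION_SYSTEM_PROMPT = (
    "You are a helpful assistant. Think step by step and explain your "
    "reasoning before giving the final answer."
)

def query_teacher(system_prompt: str, user_prompt: str) -> str:
    """Placeholder for a call to the teacher model's API."""
    raise NotImplementedError

def build_explanation_tuning_example(user_prompt: str) -> dict:
    # The teacher is asked to explain its reasoning rather than just answer,
    # so the student imitates the reasoning process, not only the final output.
    teacher_response = query_teacher(EXPLANATION_SYSTEM_PROMPT, user_prompt)
    return {
        "system": EXPLANATION_SYSTEM_PROMPT,
        "user": user_prompt,
        "assistant": teacher_response,
    }
```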

Then, we explain the two key improvements that Orca 2 brings to the table. The first improves the quality of the training data by selecting prompts tailored to different language tasks, and the second is Cautious Reasoning, a new term introduced in this paper for teaching the model to choose the proper solution strategy for answering a given user instruction.
Orca 2 gains this capability thanks to Prompt Erasing, a novel technique introduced in the paper, which we also cover in this video.
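For intuition, here is a minimal sketch of how prompt erasing could be applied when building a cautious-reasoning training example. The strategy prompts, the generic system prompt, and the query_teacher helper are assumptions for illustration, not the paper's exact prompts:

```python
# Illustrative sketch only: prompt erasing for cautious reasoning in Orca 2.
# The strategy prompts, the generic system prompt, and `query_teacher` are
# hypothetical stand-ins. The idea: the teacher answers under a detailed,
# task-specific strategy prompt, which is then erased so the student only
# sees a generic prompt and must learn to pick the strategy itself.

GENERIC_SYSTEM_PROMPT = "You are Orca, a cautious reasoning assistant."

STRATEGY_PROMPTS = {
    "step_by_step": "Solve the problem by reasoning step by step.",
    "direct_answer": "Answer directly and concisely, without explanation.",
}

def query_teacher(system_prompt: str, user_prompt: str) -> str:
    """Placeholder for a call to the teacher model's API."""
    raise NotImplementedError

def build_cautious_reasoning_example(user_prompt: str, strategy: str) -> dict:
    # 1) Elicit the teacher's answer using the strategy chosen for this task.
    teacher_response = query_teacher(STRATEGY_PROMPTS[strategy], user_prompt)
    # 2) Erase the strategy prompt: the stored example pairs the generic
    #    system prompt with the strategy-following answer.
    return {
        "system": GENERIC_SYSTEM_PROMPT,
        "user": user_prompt,
        "assistant": teacher_response,
    }
```

The point of the erasure is that the strategy signal survives only implicitly in the teacher's answer, which is what pushes the smaller student model to learn when each strategy is appropriate instead of relying on being told.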

-----------------------------------------------------------------------------------------------

👍 Please like & subscribe if you enjoy this content

-----------------------------------------------------------------------------------------------

Chapters:
0:00 Introducing Orca 2
0:48 Orca 1 Recap
3:00 What's New With Orca 2
5:48 Orca 2 Results