Microsoft Orca 2 LLM achieves Llama 2 70B performance (notebook included)

Orca 2 is the latest step in our efforts to explore the capabilities of smaller LMs (on the order of 10 billion parameters or less). With Orca 2, we continue to show that improved training signals and methods can empower smaller language models to achieve enhanced reasoning abilities, which are typically found only in much larger language models.

Orca 2 significantly surpasses models of similar size (including the original Orca model) and attains performance levels similar to or better than models 5-10 times larger, as assessed on complex tasks that test advanced reasoning abilities in zero-shot settings.

Orca 2 comes in two sizes (7 billion and 13 billion parameters); both are created by fine-tuning the corresponding Llama 2 base models on tailored, high-quality synthetic data. We open-source Orca 2 to encourage further research on the development, evaluation, and alignment of smaller LMs.
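As a quick way to try the released checkpoints, the sketch below builds a prompt in the ChatML-style format described in the Orca 2 model cards on Hugging Face. The helper name `build_orca2_prompt` and the example messages are illustrative; the template itself (`<|im_start|>`/`<|im_end|>` markers) is assumed from the model card.

```python
def build_orca2_prompt(system_message: str, user_message: str) -> str:
    """Assemble a ChatML-style prompt as described in the Orca 2 model cards.

    The model expects a system turn, a user turn, and an open
    assistant turn that the model then completes.
    """
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        f"<|im_start|>assistant"
    )

prompt = build_orca2_prompt(
    "You are Orca, an AI language model. Give concise, reasoned answers.",
    "Why can smaller models match larger ones on reasoning tasks?",
)
print(prompt)

# To run generation with the open weights (illustrative; large download):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("microsoft/Orca-2-13b")
# model = AutoModelForCausalLM.from_pretrained("microsoft/Orca-2-13b", device_map="auto")
# out = model.generate(**tok(prompt, return_tensors="pt").to(model.device), max_new_tokens=256)
# print(tok.decode(out[0], skip_special_tokens=False))
```

The 7B variant (`microsoft/Orca-2-7b`) can be substituted for the 13B model id in the commented generation code.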

#microsoft #orca2 #llm #ml #ai #largelanguagemodels #python #deeplearning #transformers #llama2