How to use Llama 3.1?

This video explains the code to run Llama 3.1 by Meta on a local system using Hugging Face and tests it on various use cases.
#ai #ml #datascience #meta #llm #llama3
Comments

Quick, to the point but the exact content we're looking for. Thanks for making this.

shaileshrana

Introduction to Llama 3.1 Release - 00:00:00
Performance and Features Overview - 00:00:36
Setting Up Llama 3.1 Locally - 00:01:06
Installing and Upgrading Transformers - 00:01:06
Loading the Model and Creating the Pipeline - 00:01:37
Using the Chat Interface - 00:02:05
Testing Pirate Language Prompt - 00:02:35
Testing Mathematical Problem - 00:03:19
Testing Language Translation - 00:03:53
Conclusion and Encouragement to Try Larger Models - 00:04:35

SouhailEntertainment
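
The steps in the chapter list above can be sketched in code. This is a minimal sketch, not the video's exact notebook, assuming the gated meta-llama/Llama-3.1-8B-Instruct checkpoint (access must be requested and approved on Hugging Face first) and transformers >= 4.43; the pirate system prompt mirrors the one tested in the video. `build_chat` is a hypothetical helper name.

```python
# Hedged sketch of the workflow: install/upgrade transformers
# (pip install --upgrade transformers), load the model, create the
# pipeline, and chat with it. Assumes a GPU with enough memory and
# an approved Hugging Face access request for the gated checkpoint.
def build_chat(
    user_prompt: str,
    system_prompt: str = "You are a pirate chatbot who always responds in pirate speak!",
) -> list[dict]:
    """Assemble the chat-format messages list the pipeline expects."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

if __name__ == "__main__":
    import torch
    from transformers import pipeline  # heavy: downloads ~16 GB of weights

    generator = pipeline(
        "text-generation",
        model="meta-llama/Llama-3.1-8B-Instruct",
        torch_dtype=torch.bfloat16,  # halves memory versus float32
        device_map="auto",           # spread layers across available devices
    )
    result = generator(build_chat("Who are you?"), max_new_tokens=128)
    # For chat-style input, generated_text holds the whole conversation;
    # the last message is the assistant's reply.
    print(result[0]["generated_text"][-1]["content"])
```

The same `build_chat` helper covers the video's other tests (the math and translation prompts) by swapping the user message.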

raise ValueError(
ValueError: `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}

Why am I getting this error? I tried everything you mentioned.

Compact
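
The `rope_scaling` ValueError quoted above typically means the installed transformers release predates Llama 3.1's extended rope config (`'rope_type': 'llama3'`), so the usual fix is upgrading to 4.43 or newer with `pip install --upgrade transformers`. A small stdlib-only sketch of the version check, with a hypothetical helper name:

```python
# Hedged sketch: older transformers expected rope_scaling to be exactly
# {'type': ..., 'factor': ...} and rejects Llama 3.1's richer config.
def supports_llama31_rope(version: str) -> bool:
    """Return True if a transformers version string is >= 4.43.0,
    the first release that understands Llama 3.1's rope_scaling format."""
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (4, 43)

print(supports_llama31_rope("4.42.4"))  # False: raises the ValueError above
print(supports_llama31_rope("4.43.0"))  # True
```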

It's open source, but we need to apply to use it and Meta needs to accept, right?

SouhailEntertainment

Will running this Llama 3.1 with lots of GPUs and running it on Google Colab give the same quality of results? If the quality differs, why? Or are lots of GPUs only for saving time with a large user base?

manivannana
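
On the question above: extra GPUs change throughput and how large a model fits in memory, not the learned weights, so the same checkpoint at the same precision should give the same quality on Colab as on a GPU cluster; what does move quality is reduced precision or quantization, which is often what makes the model fit on smaller hardware in the first place. A back-of-the-envelope sketch (parameter count and bytes-per-parameter are the only inputs; activations and KV cache are ignored):

```python
# Rough memory estimate for just the weights of an 8B-parameter model
# at different precisions - the reason smaller setups often quantize.
def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GB, ignoring activations/KV cache."""
    return n_params * bytes_per_param / 1e9

for dtype, nbytes in [("float32", 4), ("bfloat16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"8B weights in {dtype}: ~{weight_memory_gb(8e9, nbytes):.0f} GB")
# 8B weights in float32: ~32 GB
# 8B weights in bfloat16: ~16 GB
# 8B weights in int8: ~8 GB
# 8B weights in int4: ~4 GB
```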

The default prompts (pirate, who are you ...) are lame

rusticagenerica