Llama 3.2 on Windows using Hugging Face Llama-3.2-1B (Run LLM Locally!)

Llama 3.2 from Meta has just been released. It adds small text-only models with 1B and 3B parameters, as well as multimodal vision models with 11B and 90B parameters. In this video, I go over the vision capabilities and show you how to run the Llama 3.2 1B model locally using Hugging Face.
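As a quick reference, here is a minimal sketch of the setup portion (virtual environment, modules, and Hugging Face login). It assumes a Windows machine with Python installed; the environment name, the package list, and the "hf_xxx" token are placeholders, not the exact values from the video:

# Setup commands (run once in a terminal):
#   python -m venv .venv
#   .venv\Scripts\activate              (activates the venv on Windows)
#   pip install torch transformers accelerate huggingface_hub
from huggingface_hub import login

# Log in with a read-access token created under Settings -> Access Tokens
# on huggingface.co ("hf_xxx" is a placeholder, not a real token).
login(token="hf_xxx")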

0:00 Introduction
0:26 Llama 3.2 Model Types
0:57 Llama 3.2 Vision Model Demo
1:33 Getting Access to Llama 3.2 on Hugging Face
1:57 Access Token
2:21 Installing Modules and Virtual Env
2:35 Logging In to Hugging Face from VS Code
2:49 Running Llama 3.2 1B Model Locally
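For the last chapter (2:49), a minimal generation sketch could look like the code below. This is a sketch under assumptions, not the exact script from the video: the prompt and token count are arbitrary, and meta-llama/Llama-3.2-1B is the gated base model on Hugging Face, so its license must be accepted on the model page first.

import torch
from transformers import pipeline

model_id = "meta-llama/Llama-3.2-1B"  # gated: requires an accepted license and a valid token

generator = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.bfloat16,   # load weights in bfloat16 to roughly halve memory use
    device_map="auto",            # GPU if available, otherwise CPU (requires accelerate)
)

result = generator("Running small language models locally is", max_new_tokens=50)
print(result[0]["generated_text"])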

Thanks for watching! If you found this video helpful, please like, subscribe, and share.

Social:
Comments

How about training it to chat with a PDF locally, completely offline without internet?

vinsmoke.sanji.

Does this run on a Windows CPU, or only on a GPU?

SundranyCars

In the preview readme.txt, the link to download PyTorch is hidden. Can you clarify this point, please? Where can I get this file?
Thanks!

marouahammami