Use Ollama to test Phi 3 on your local PC



Microsoft's new small language model (SLM), Phi 3, is compact enough to run locally on your device. In this video, we'll learn how to do that and why you might want a small language model running locally. We'll install Ollama, download Phi 3, put it through its paces, and talk about why you might want this option.
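For reference, the usual flow is to install Ollama from ollama.com, then pull and run the model from a terminal with `ollama pull phi3` and `ollama run phi3`. Once the Ollama server is running you can also query the model programmatically. The sketch below is a minimal example, assuming the default local endpoint (http://localhost:11434) and the `requests` package; the prompt text is just an illustration.

```python
# Minimal sketch: query a locally running Phi 3 model through Ollama's HTTP API.
# Assumes Ollama is installed, `ollama pull phi3` has completed, and the server
# is listening on its default port (11434).
import requests

def ask_phi3(prompt: str) -> str:
    """Send a single prompt to the local Phi 3 model and return its reply."""
    response = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "phi3", "prompt": prompt, "stream": False},
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["response"]

if __name__ == "__main__":
    print(ask_phi3("Explain in two sentences why a small language model can run on a laptop."))
```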

Links:

Chapters:
0:00 Introduction
1:33 How to get Phi 3
3:02 Testing Phi 3
3:45 What is the point of using this locally?

Apart from publicly accessible information, all user data or other related information shared in this video is created for demonstration purposes only. User accounts, passwords, or other data used as part of any demonstrations shown in this video have been created specifically for that purpose and are not any individual or company's private information or data.

Comments

Hi Nick,
I have experimented with local access to LLMs. I have a PC with a GPU and have set up LM Studio and Docker to run models. My aim was to ingest my own documents and enable summarisation, etc., with reasonable success once I worked out how to get it to read my own docs in Dropbox. I'd appreciate more videos on LLMs on PC 🙏

DavidWhite-citc
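Building on the idea in this comment, here is a minimal sketch of feeding a local text file to Phi 3 for summarisation through Ollama's HTTP API. The file path and prompt wording are placeholders for illustration; the default local endpoint is assumed.

```python
# Minimal sketch: summarise a local document with a locally hosted Phi 3 model.
# Assumes Ollama is serving phi3 on the default port; the file path is a placeholder.
from pathlib import Path
import requests

def summarise(path: str) -> str:
    text = Path(path).read_text(encoding="utf-8")
    prompt = f"Summarise the following document in five bullet points:\n\n{text}"
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "phi3", "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]

if __name__ == "__main__":
    print(summarise("notes/meeting.txt"))  # hypothetical file path
```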

Great, so now we can install a fully trained, uncensored AI on our local machines and ask it any illegal question, such as how to build a weapon of mass destruction. Isn't this exactly what AI developers said they would protect us from being able to do?

chillzwinter

Hi Nick,
Great video! Just had a question: what are the specs of your local PC? Does it have a GPU? I'm asking to understand the throughput of this model using CPU only.

debarghyamaity
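On the CPU-only throughput question: Ollama's generate response includes token counts and timings, so you can estimate tokens per second on your own hardware without extra tooling. A rough sketch, assuming the default local endpoint; it reads the eval_count and eval_duration (nanoseconds) fields from the API response.

```python
# Rough sketch: estimate generation throughput (tokens/sec) of phi3 on this machine.
# eval_count is the number of generated tokens; eval_duration is in nanoseconds.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "phi3", "prompt": "Write a short paragraph about local LLMs.", "stream": False},
    timeout=300,
).json()

tokens = resp["eval_count"]
seconds = resp["eval_duration"] / 1e9
print(f"{tokens} tokens in {seconds:.1f}s -> {tokens / seconds:.1f} tokens/sec")
```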