Run LLMs without GPUs | local-llm

Run Large Language Models (LLMs) without a GPU using local-llm.
With local-llm, you can run LLMs locally or on Cloud Workstations.
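The install-and-run flow shown in the video roughly follows the public GoogleCloudPlatform/localllm quickstart. This is a sketch from that repo's README as I recall it, not a transcript of the video's exact commands; the model name and port are illustrative assumptions:

```shell
# Sketch of the local-llm workflow (assumption: based on the
# GoogleCloudPlatform/localllm quickstart; the video's exact
# commands, model, and port may differ).

# Clone the tool and install its CLI
git clone https://github.com/GoogleCloudPlatform/localllm.git
cd localllm
pip3 install ./llm-tool/.

# Download a quantized GGUF model from Hugging Face and serve it
# on port 8000 (runs on CPU via llama-cpp-python)
llm run TheBloke/Llama-2-7B-Chat-GGUF 8000

# Stop the model when done
llm kill TheBloke/Llama-2-7B-Chat-GGUF
```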

Join this channel to get access to perks:

Timestamps:
0:00 intro
0:42 key benefits of running LLMs locally
1:25 what is local-llm
3:00 installing local-llm
6:00 running a model with local-llm
8:45 outro
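Once a model is serving (the 6:00 step above), it can be queried over HTTP. A minimal client sketch, assuming local-llm exposes the llama-cpp-python OpenAI-compatible completions endpoint on localhost:8000 (both the port and the endpoint path are assumptions):

```python
import json
import urllib.request

# Assumption: local-llm's default serving address and the
# OpenAI-style /v1/completions path; adjust for your setup.
API_URL = "http://localhost:8000/v1/completions"

def build_payload(prompt: str, max_tokens: int = 64) -> bytes:
    """Encode an OpenAI-style completions request body as JSON bytes."""
    return json.dumps({"prompt": prompt, "max_tokens": max_tokens}).encode("utf-8")

def complete(prompt: str) -> str:
    """Send the prompt to the locally served model and return its completion."""
    req = urllib.request.Request(
        API_URL,
        data=build_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]
```

Because the server speaks the OpenAI wire format, the same request shape works whether the model runs locally or on a Cloud Workstation, only `API_URL` changes.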

Resources:

Support this channel:

Connect with me:

#llm #localllm
Comments

Hey Rishab, great video! Can we fine-tune the model using local-llm?

rakeshreddy

Great, thanks! Question: can I use this to run a downloaded LLM, instead of accessing Hugging Face?

armanshirzad

Hello Sir,
Thank you for reading my message.
I just finished my UG (BSc IT) and I'm interested in the cloud computing field. As a fresher, should I start preparing for DevOps or cloud engineering to land a cloud computing job as soon as possible? Any career-growth advice would also help.

Heet