Run Llama 3.1 Locally as a Code Assistant in VS Code with Ollama

This video shows how to use the Llama 3.1 model locally as a code assistant in VS Code with Ollama.
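For reference, here is a minimal command-line sketch of the setup the video walks through, assuming the default llama3.1 tag (the 8B variant) and the CodeGPT extension; adjust the model tag to the variant you want:

# Pull the Llama 3.1 model with Ollama (the default llama3.1 tag is the 8B variant)
ollama pull llama3.1

# Sanity-check the model from the terminal before wiring it into VS Code
ollama run llama3.1 "Write a Python function that reverses a string."

# Ollama serves a local API on http://localhost:11434 that VS Code
# extensions such as CodeGPT can use as a provider; run this only if
# the server is not already running as a background service
ollama serve

From there, the remaining steps happen inside VS Code: install the CodeGPT extension, pick Ollama as the provider, and select llama3.1 as the model.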

🔥 Get a 50% discount on any A6000 or A5000 GPU rental using the following link and coupon:

Coupon code: FahdMirza

#llama3.1 #llama405b #ollama #codegpt

PLEASE FOLLOW ME:

RELATED VIDEOS:

All rights reserved © 2021 Fahd Mirza
Comments


Thank you for explaining it nicely and clearly. It's working for me. I am stunned to see your internet download speed of 729 MB/s at 5:30. Where is that? Is this speed publicly available in your country, or is it for some secret military weapons laboratory? Mine is 1.2 MB/s.

RamrachaiMarma

Hi Fahd, thanks for all the guides. Could you please do a step-by-step guide to using Microsoft GraphRAG with Ollama local LLMs? It should take PDFs / docs / code files as input and run completely offline.

-xx-

I'm trying this extension but it doesn't work: it doesn't recognize opened files, and when I press Explain nothing happens.

VietNam-vz

Hey, please tell me how to use this on my website as a chatbot. Is that possible?

khushigupta

No one is talking about the "interactions" limit; they're ignoring it like it doesn't exist. These guides are useless. That "interactions" count is how many times you can chat with the LLM, even when it's running locally, which makes this absolutely garbage.

omegablast