FINALLY! Open-Source 'LLaMA Code' Coding Assistant (Tutorial)

This is a free, 100% open-source coding assistant (a Copilot alternative) based on Code LLaMA, living in VSCode. It is super fast and works incredibly well. Plus, no internet connection is required!

Join My Newsletter for Regular AI Updates 👇🏼

My Links 🔗

Media/Sponsorship Inquiries 📈
Comments

I typically don't rely too heavily on AI when coding. I use TabbyML, which has a limited model selection, but it works for me. It's completely open source and includes a VSCode extension too. It's free and doesn't require a login. I run the DeepSeekCoder 6.7B model locally.

rohithgoud
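
For anyone who wants to try the TabbyML setup mentioned above, here's a minimal sketch using Tabby's Docker image. The model id, device flag, and port follow Tabby's README, but they are assumptions that may differ across versions:

    # Serve a local DeepseekCoder model with Tabby (assumes Docker plus an
    # NVIDIA GPU with the container toolkit; model id may vary by version).
    docker run -it --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
      tabbyml/tabby serve --model TabbyML/DeepseekCoder-6.7B --device cuda
    # Then point the Tabby VSCode extension at http://localhost:8080.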

Need to sign in to use the plugin. No thanks. That is not completely local.

Komma

Matt Williams, a member of the Ollama team, shows how to make this work 100% free and open source in his video "writing better code with ollama".

jbo

How is it local if I have to authorize with a 3rd party? 😮

AlexanderBukh

🎯 Key Takeaways for quick navigation:

00:00 💻 *Introduction to Local Coding Assistants*
- Introduction to the concept of a local coding assistant and its advantages,
- Mention of the coding assistant "Cody" set up with "Ollama" for local development.
01:07 🔧 *Setting Up the Coding Environment*
- Guide on installing Visual Studio Code and the Cody extension,
- Instructions on signing in and authorizing the Cody extension for use.
02:00 🚀 *Enabling Local Autocomplete with Ollama*
- Steps to switch from GPT-4 to local model support using Ollama (see the command sketch after this comment),
- Downloading and setting up the Ollama model for local inference.
03:39 🛠️ *Demonstrating Local Autocomplete in Action*
- A practical demonstration of the local autocomplete feature,
- Examples include writing a Fibonacci method and generating code snippets.
05:27 🌟 *Exploring Additional Features of Cody*
- Description of other useful features in Cody not powered by local models,
- Examples include chatting with the assistant, adding documentation, and generating unit tests.
07:04 📣 *Conclusion and Sponsor Acknowledgment*
- Final thoughts on the capabilities of Cody and its comparison to GitHub Copilot,
- Appreciation for Cody's sponsorship of the video.

Made with HARPA AI

warezit
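
For reference, here is roughly what the Ollama side of the setup (the 02:00 step) looks like from a terminal. This is a sketch under assumptions: the install-script URL and the codellama tag are taken from the Ollama site and model library and may change over time.

    # Install Ollama (macOS/Linux install script; URL may change over time).
    curl -fsSL https://ollama.com/install.sh | sh
    # Pull a Code Llama variant tuned for code completion
    # (tag assumed from the Ollama model library).
    ollama pull codellama:7b-code
    # The server listens on http://localhost:11434 by default; point Cody's
    # experimental Ollama autocomplete provider at it in the extension
    # settings (exact setting names vary by Cody version).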

I know it is a sponsored video, but is there any open-source alternative to the Cody extension?
We need a completely local solution, because Cody may use telemetry and gather information behind the scenes.

KodandocomFaria

The problem with Cody is that it only does autocomplete with local models, something many VSCode extensions like LLaMA Coder (and many more) can already do. All the nice features use the online version, which is extremely limited in the number of requests on the free plan (expanding the monthly quota a bit would make it easier to test things out and grow a serious interest, leading to a paid plan later). There is also a fair number of extensions that offer those same nice features (chat, documentation, smells, refactoring, explain, and tests) all in one extension, for free, using local models (Ollama or OpenAI-compatible endpoints). Cody does these features a little better and has better interaction with the codebase, probably due to the bigger context window (at least in my tests) and a nicer implementation/integration in VSCode, but unless you pay, you won't really benefit from them because of the low number of free requests, which isn't enough to seriously dive in.

mayorc
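
Worth noting for the comparison above: extensions that do "autocomplete with local models" are generally just calling Ollama's local HTTP API (or an OpenAI-compatible endpoint). A minimal sketch, assuming `ollama serve` is running and the codellama:7b-code tag has been pulled:

    # Request a completion from a locally served Code Llama via Ollama's API.
    curl http://localhost:11434/api/generate -d '{
      "model": "codellama:7b-code",
      "prompt": "def fibonacci(n):",
      "stream": false
    }'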

Is it Cody that understands? I think it is the LLM that does. Also, why pay $9 if I am running everything locally?

RichardGetzPhotography

Since you have to sign in, does it send any data upstream when you use local models?

atr

The only time I'm coding is while on a flight. I'm so glad I can use an LLM from now on!

Resursator

I'm looking for a local code assistant. I don't mind supporting the project, with a license for example, but I don't want to log in on each use, or at all. How often does this phone home? Will it work if my IDE is offline? Pass.

Joe_Brig

Wait, you have GitHub Copilot enabled there too; it shows up in your editor.
Are you sure the completion is being provided by Cody with the local model and not by the GitHub Copilot extension?

supercurioTube
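
One way to settle the Copilot/Cody overlap raised above: relaunch VSCode with the Copilot extension disabled and see whether inline completions still appear. A quick sketch using VSCode's CLI (GitHub.copilot is the published extension id, but verify with `code --list-extensions`):

    # Open VSCode with Copilot disabled for this session, so any inline
    # completions must be coming from Cody and the local model.
    code --disable-extension GitHub.copilot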

Excellent. Thanks for showing this. Exactly what I've been looking for.

daverobey

Am I misunderstanding something, or are you advertising this as an open-source solution while it's still dependent on a 3rd-party service? What exactly is Cody? I would have assumed that if it is completely local, it's just a plugin that lets you use local models on your machine. Yet you describe it as having multiple versions with different features in each tier, including a paid tier. How exactly does that qualify as open source?

SageGoatKing

ollama : The term 'ollama' is not recognized as the name of a cmdlet, function, script file, or operable program

Is there a working tutorial for Windows 10?

WhiteDragon
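
At the time of this video, Ollama shipped for macOS and Linux only, which is why PowerShell doesn't recognize the command. A rough sketch for Windows 10/11 via WSL2 (this also covers the WSL question below; the install URL is an assumption, so check Ollama's docs):

    # Run these inside a WSL2 Ubuntu shell, not PowerShell.
    curl -fsSL https://ollama.com/install.sh | sh
    # Start the server manually if your WSL distro has no systemd,
    # then pull the model:
    ollama serve &
    ollama pull codellama:7b-code
    # WSL2 forwards localhost, so VSCode on the Windows side can reach
    # the server at http://localhost:11434.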

Is it possible to use it on a Windows and WSL system? If yes, how should we install LLaMA?

olimpialucio

Does anyone know if the 500-autocompletions-per-month limit on the free tier also applies if we run codellama locally?

ruifigueiredo

Don't think this is an option unless you've got a pretty good graphics card. I set mine up and gave it something to autocomplete. I heard my Mac's CPU fan going crazy, and it took about 20 seconds to get a 5-token suggestion (it was correct tho :P).

jeffspaulding
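
If the hardware struggles like this, a smaller or more heavily quantized model usually helps on CPU-only machines. A sketch; the quantization tag is an assumption, so check what the Ollama library actually publishes:

    # Pull a quantized Code Llama variant for lighter CPU inference
    # (tag assumed; verify against the Ollama model library).
    ollama pull codellama:7b-code-q4_0
    # Compare response times with a one-shot prompt.
    ollama run codellama:7b-code-q4_0 "def fibonacci(n):"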

Thanks for the video, this is absolutely a blessing of an assistant.

vivekpadman

This is so cool, but doesn't the Cody login kind of invalidate the local benefits? A 3rd party still gets access to your code.

Ludecan