Install Aider with Ollama for AI Pair Programming

This video shows how to install Aider locally with Ollama so you can pair program with LLMs to edit code in your local git repository.
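
For reference, a minimal sketch of the commands involved (the model name and local API port are assumptions, not necessarily what the video uses; check the Aider and Ollama docs for your setup):

python -m pip install aider-chat                 # install Aider
ollama pull llama3.1                             # pull a local model with Ollama
export OLLAMA_API_BASE=http://127.0.0.1:11434    # point Aider at the local Ollama server
cd /path/to/your/git/repo
aider --model ollama/llama3.1                    # start pair programming in the repo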

🔥 Get a 50% discount on any A6000 or A5000 GPU rental, using the following link and coupon:

Coupon code: FahdMirza

#aiderchat #aider #ollama

PLEASE FOLLOW ME:

RELATED VIDEOS:

All rights reserved © 2021 Fahd Mirza
Comments

I tried this past week with Llama 3.1 7B and a 96,000 context window. I turned the temperature down to 0 and also turned on the repo map function (which was turned off by default for unknown models). It's somewhat apparent that it's off here too (low sent-token count). It acted weird at first, so I removed the default template code from the Ollama model file. Then it seemed to work well, except when I asked it a question, it responded by trying to delete a bunch of highly tuned and detailed code. The repo isn't that big; some files are up to 1000 lines. No matter what, it just kept doing that. It's insane for me. 48GB of VRAM producing gibberish. lol

jofus

Is it worth using Ollama models? Part of the issue is that the AI has to translate the user's intent into code, which is a different kind of intelligence than just coding.

ytubeanon

Sir, could you please provide guidance on how to use Aider with MiniCPM-V2.6?

NLPprompter

Excellent tool for pair programming; however, remember to look at other options that may improve your coding experience with additional capabilities.

sirishkumar-mz

Dude, you need a more complex example. If it works with huge projects, just clone some project and show it...

beckbeckend

Great video, will definitely try it. Will it create documentation such as usage documents/README files, UML diagrams, and flowcharts from an existing codebase?

joydeepbhattacharjee