Local chat and code completion with Cody and Ollama (Experimental)

Learn how to enable experimental local inference for Cody for Visual Studio Code, which allows you to use local LLMs for both chat and code completion when Internet connectivity is out of reach.

This feature is limited to Cody Free and Pro users at this time.
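
For reference, the setup shown in the video comes down to a few entries in VS Code's settings.json. The snippet below is a sketch based on Sourcegraph's documentation for the experimental Ollama autocomplete provider at the time; the exact keys ("cody.autocomplete.advanced.provider" and "cody.autocomplete.experimental.ollamaOptions"), the example model name, and the default Ollama URL are assumptions and may have changed, since the feature is experimental.

```jsonc
// VS Code settings.json (JSONC): route Cody autocomplete to a local Ollama server.
// Experimental feature: key names and defaults may have changed.
{
  "cody.autocomplete.advanced.provider": "experimental-ollama",
  "cody.autocomplete.experimental.ollamaOptions": {
    // Default local Ollama endpoint; adjust if Ollama runs on another host or port.
    "url": "http://localhost:11434",
    // Example completion model; use any code model you have pulled locally.
    "model": "deepseek-coder:6.7b-base-q4_K_M"
  }
}
```

Ollama itself must be running and the model pulled beforehand (for example, ollama pull deepseek-coder:6.7b-base-q4_K_M). On the chat side, once a local Ollama server is detected, its models should appear in Cody's model selector alongside the hosted ones.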
Comments

The "Experimental" Ollama Models are not being shown in the select models tab at my work computer. On my home computer they appear.

carlosmagnoabreu

Does Cody with Ollama also have a limit like "500 autocompletions per month" on the free version?

mikexie-dodd

My settings.json file does not look like this. I do not have a … but my settings are pointed at experimental-ollama. I cannot select an Ollama model, and I feel like it's just defaulting to Claude 2.

chrishardwick

Is this available for JetBrains editors?

AdityaLadwa

When trying to use Cody with Ollama and a local LLM, chat works fine, but with the setup recommended in this video, autocomplete returns chat-style suggestions instead of code. Any idea what's causing this and how to fix it?

ConfusedSourcerer

Somehow my autocompletion has been broken since I moved to another place. Nothing is suggested even when I "trigger autocomplete at cursor".

maxint

When I install Cody, I see no option at all to select Ollama chat. Has it been removed?

adamvelazquez