How to Use Local LLM in Cursor

Cursor is the leading AI coding editor right now. Here's how you can use local LLMs in Cursor as a coding assistant!

#cursor #aicoding #llm
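
In outline, the setup looks something like the following (a minimal sketch: the model name, the ngrok tunnel, and the Cursor menu path are illustrative assumptions, not details confirmed by the video):

    # Start Ollama so it accepts cross-origin requests from Cursor
    OLLAMA_ORIGINS="*" ollama serve

    # Pull a coding model to serve as the assistant (model choice is an example)
    ollama pull codellama

    # Cursor's backend must be able to reach the endpoint, so it is
    # typically exposed through a tunnel, e.g.:
    ngrok http 11434

Then, in Cursor's model settings, override the OpenAI base URL with the tunnel URL plus /v1 (Ollama's OpenAI-compatible endpoint) and add the model name so it shows up in the model list.
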
Comments

Thank you, a very valuable video for me.

— Jake-kyed

Nice video and very useful. Is there a way to get AI assistance similar to OpenAI's GPT, and to use its API?

— amrsalem
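
Ollama does expose an OpenAI-compatible API at /v1 on its default port, so the same endpoint Cursor talks to can be called directly; a quick check (the model name is assumed):

    curl http://localhost:11434/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{
            "model": "codellama",
            "messages": [{"role": "user", "content": "Say hello"}]
          }'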

Thank you for the video! But one question: how do you change the context length of your model when running it this way? Normally you would set the context length when using “ollama run [model]”, but it seems you don’t get that chance with this configuration. Any help with this would be appreciated, thank you!

— cheyannehutson
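
One workaround, assuming a recent Ollama: context length is a model parameter (num_ctx), so it can be baked into a derived model with a Modelfile instead of being set interactively (the model and tag names below are examples):

    # Derive a model with a larger context window
    cat > Modelfile <<'EOF'
    FROM codellama
    PARAMETER num_ctx 8192
    EOF
    ollama create codellama-8k -f Modelfile

Pointing Cursor at codellama-8k instead of codellama should then apply the larger context without ever touching ollama run.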

The ollama_origins=* command is not recognized for me.

— nielsdebont
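
Likely cause: OLLAMA_ORIGINS is an environment variable, not a command, so it has to be set in the environment before the server starts (syntax differs per shell):

    # Linux/macOS: set it inline for the server process
    OLLAMA_ORIGINS="*" ollama serve

    # Windows PowerShell: set it, then start the server
    $env:OLLAMA_ORIGINS="*"
    ollama serve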

Thanks for the video. Why would anyone want to do this?

— boiserunner

This is ridiculously dangerous advice, given that Ollama has no authentication and you are suggesting making it open to the entire Internet.

— moresignal
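
For anyone following along, a more cautious sketch: keep Ollama bound to 127.0.0.1 (its default) rather than setting OLLAMA_HOST=0.0.0.0, and if a public endpoint is unavoidable, put authentication in front of the tunnel (the flag below is from ngrok v3; the credentials are placeholders):

    # Expose the local server only behind basic auth
    ngrok http 11434 --basic-auth "user:long-random-password"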