Give Internet Access to your Local LLM (Python, Ollama, Local LLM)

Give your local LLM internet access using Python. The model will be able to search the internet for information relevant to the user's questions.
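The video's code isn't shown here, so as a rough illustration of the idea, here is a minimal sketch of one way to do it: fetch snippets from a web search, stuff them into the prompt, and send the prompt to a locally running Ollama server. The DuckDuckGo instant-answer endpoint and Ollama's default REST endpoint are real, but the function names and prompt format below are my own assumptions, not the video's code.

```python
"""Sketch: give a local Ollama model web context before answering.

Assumptions: DuckDuckGo's instant-answer API for search, and an Ollama
server running at its default address http://localhost:11434.
"""
import json
import urllib.parse
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint


def search_web(query: str) -> list[str]:
    """Return up to three text snippets from DuckDuckGo's instant-answer API."""
    url = ("https://api.duckduckgo.com/?format=json&q="
           + urllib.parse.quote(query))
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    snippets = [t["Text"] for t in data.get("RelatedTopics", []) if "Text" in t]
    if data.get("AbstractText"):
        snippets.insert(0, data["AbstractText"])
    return snippets[:3]


def build_prompt(question: str, snippets: list[str]) -> str:
    """Combine the web snippets and the user's question into one prompt."""
    context = "\n".join(f"- {s}" for s in snippets)
    return ("Use these web snippets to answer the question.\n"
            f"Snippets:\n{context}\n\n"
            f"Question: {question}\nAnswer:")


def ask_ollama(prompt: str, model: str = "llama3") -> str:
    """Send the prompt to the local Ollama server and return its reply."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    req = urllib.request.Request(
        OLLAMA_URL,
        data=body.encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=120) as resp:
        return json.load(resp)["response"]


# Example usage (requires network access and a running Ollama server):
#   q = "What is the Ollama project?"
#   print(ask_ollama(build_prompt(q, search_web(q))))
```

A production version would want real search-result pages (not just instant answers), error handling, and some trimming so the snippets fit in the model's context window.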

Library Used:
Comments

Sir, can this method explain more and provide information from a big article?

HeRaUSA

What about reading from a website, or giving you a link based on a question like "find me 3-inch PVC pipes"?

shortvideosfullofstupidity

This was great, thanks for the introduction.
I'd love to see a deep dive on adding this as an extra, so you get a command prompt with internet search, like running "ollama, but now with internet".

Oxxygen_io

Hello my friend, thank you for the helpful video! I learned what I was actually looking for. Can you do me a favour: open OBS, click the three-dot button under your Mic/Aux, choose Filters, press the + (plus) button in the left corner, and add a "Noise Suppression" filter with its default settings. Thank you very much <3

MrOnePieceRuffy

Amazing video, extremely underrated channel. Good work! I needed this to complete my program for an assistant model using Ollama that can create files, run files, edit file contents, search the web, and maintain a persistent memory. This was the second-to-last thing I needed; now I just have to finish the run-files part.

_areck_

Can this be used with the Ollama API? If so, how?

paleostressmanagement

Is it better to open separate chats with the same LLM when I want to ask about different subjects, like we do in ChatGPT? Or can I use one chat for everything?

edengate

"Join the Discord" => "invalid invite" 😒

themaxgo

How much more advanced can we make a local LLM than the censored versions available publicly?

jumpersfilmedinvr