Flowise Ollama Tutorial | How to Load Local LLM on Flowise

In this Flowise Ollama tutorial video I will show you how to load local LLMs in Flowise using Ollama.
Want to learn how to create Flowise Ollama agents? This is the video for you!

🙏 Support My Channel:

📑 Useful Links:

💬 Chat with Like-Minded Individuals on Discord:

🧠 I can build your chatbots for you!

🕒 TIMESTAMPS:
00:00 - Intro
00:21 - Local Models overview
01:00 - Ollama setup
02:13 - Start Ollama server
03:49 - Local Conversation Chain
05:22 - Local RAG Chatbot
08:26 - Open Source Limitations
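The setup steps outlined in the timestamps above (Ollama setup, starting the server) can be sketched roughly as follows. This is a minimal sketch using Ollama's default port and install script; the model name `llama3` is just an example and may differ from the one used in the video:

```shell
# Install Ollama (macOS/Linux; Windows users download the installer from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a local model (llama3 here is an example; any Ollama model tag works)
ollama pull llama3

# Start the Ollama server; it listens on http://localhost:11434 by default
ollama serve

# Quick sanity check from another terminal
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

In Flowise, the ChatOllama node's Base URL then points at http://localhost:11434, with the same model name entered in the node's Model field.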
Comments

Hi Leon, how are you? I'm from Brazil, and I'm advancing my studies in automation in general using AI resources. While looking for videos about Flowise, I found your channel. Since yesterday I've been watching all the videos in the Flowise playlist. I want to express my gratitude for your dedication and teaching. Thank you very much!

marcioricardoluciano

Thank you! Please make more videos on open-source models.

sadyaz

Hi Leon. Good work! Keep rolling out tutorials like this with Ollama! In my case, I didn't have to use the MMAP parameter. Everything works fine.

redrhino

Big thanks for these videos Leon! You do such a great job of editing as well.

swhitings

This is exactly what I was looking for! Thank you so much! I've been trying to get Ollama working with Flowise over the last few days…

conneyk

Thank you, man, for all these vids, you are the best. I hope you add more videos on how to add more complexity with creative ideas. Thanks <3

moroccangamereviews

Oh! thank you very much. Really appreciate your work! 🎉

BadBite

Hello and thank you so much… I started testing Flowise with this and it worked.

Kartratte

Thanks for your awesome videos. Please add more videos on open-source models.
Thanks again.

toursian

Thank you very much. Please make more videos on open-source models!

abelpouillet

Thanks for this video. Can we please have a video with Ollama and folder loader?

fatehkerrache

Great stuff. I tried some of the earlier tutorials using Ollama instead of OpenAI (the positive/negative review reply tutorial) and found the If/Else didn't work with llama3. I'd like to learn more about how to make agents that are completely local and don't use any third-party services like OpenAI, Pinecone, etc., but maybe that's not possible without losing too much functionality.

cyborgmetropolis

I've been watching your work on YouTube for a very long time and wanted to say thank you very much for what you do, and to wish you the best of luck with everything you hope for!

ojgecbz

Thanks for your videos, Leon. I learn a lot from you, and it's great to see a fellow South African talking about AI and LLMs.

whackojaco

Been mind-blown by pretty much every single one of your videos / tutorials 👏, I'm super grateful! 🙏
Just a thought: how about an entirely open-source, local stack running on Docker? For example, Flowise, ChromaDB, and Mixtral (a quantized model running through Ollama?) running in different containers locally. And then, wow, a deployment pipeline to, say, Google Cloud with a CI/CD script via GitHub Actions. So: develop locally with the Docker containers, push to GitHub when satisfied, automatically build and deploy with GitHub Actions, and boom, the app is available on Google Cloud. That would be incredible! I'm going to try it right now; lots of research and trial and error ahead... 😁

BruWozniak
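The local stack described in the comment above could be sketched with plain `docker run` commands. The images are the official ones on Docker Hub; the network name and host ports are assumptions/defaults, and a real setup would also mount volumes so models and flows persist:

```shell
# Shared network so the containers can reach each other by name
docker network create local-llm

# Ollama serves models on 11434 (its default port)
docker run -d --name ollama --network local-llm -p 11434:11434 ollama/ollama

# ChromaDB vector store on its default port 8000
docker run -d --name chroma --network local-llm -p 8000:8000 chromadb/chroma

# Flowise UI on its default port 3000
docker run -d --name flowise --network local-llm -p 3000:3000 flowiseai/flowise
```

Inside the network, Flowise would address the other services by container name, e.g. http://ollama:11434 as the ChatOllama base URL and http://chroma:8000 for the Chroma node.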

Thanks for your videos. They really help me a lot!

RuiminWang-hkwz

Hi Leon, thanks a lot for the video. For my purposes I used nomic-embed-text as the embedding model; it's faster. I managed to connect my Ollama + Flowise custom tool directly to my CRM API, but it only worked with the llama2 model and not Mistral. I struggled for a while, then found the trick! No need to interface with Make or n8n. I'm still working on it. Cheers

mehdibelkhayat
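For reference, the nomic-embed-text model mentioned above can be exercised directly against a running Ollama server. This is a sketch using Ollama's embeddings endpoint with its default port; the prompt text is arbitrary:

```shell
# Pull the embedding model once
ollama pull nomic-embed-text

# Request an embedding vector for a piece of text
curl http://localhost:11434/api/embeddings \
  -d '{"model": "nomic-embed-text", "prompt": "Flowise local RAG test"}'
```

In Flowise, the same model name would go in the Ollama Embeddings node, so both chat and embeddings run fully locally.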

Thanks for another great video. We would really appreciate it if you made a bigger project, like a chatbot for a real estate or e-commerce website.

DarkKnight-ukmq

Hi Leon, great content. Really love your content and presentation. Can you please make a video on RAG where you embed different types of files, like CSV, PDF, and DOC, and chat with them? Thanks

rickyS-D

A video about Ollama function calling with Flowise would be very nice :)

KevinBahnmuller