LLM Chat App in Python w/ Ollama-py and Streamlit

In this video I walk through the new Ollama Python library, and use it to build a chat app with UI powered by Streamlit. After reviewing some important methods from this library, I touch on Python generators as we construct our chat app, step by step.
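The chat loop built in the video can be sketched roughly as follows. This is a minimal sketch under my own assumptions: the helper names `append_turn` and `ask` are mine, the model name `llama2` is a placeholder, and it presumes the `ollama` Python package is installed and an Ollama server is running locally.

```python
# Sketch of the message-history + chat() pattern the video builds.
# History is a list of {"role": ..., "content": ...} dicts, the format
# ollama.chat() expects for its `messages` argument.

def append_turn(history, role, content):
    """Append one chat turn to the running message history."""
    history.append({"role": role, "content": content})
    return history

def ask(history, prompt, model="llama2"):
    """Send the history plus a new user prompt to the model (assumes a
    locally running Ollama server)."""
    import ollama  # requires `pip install ollama`
    append_turn(history, "user", prompt)
    response = ollama.chat(model=model, messages=history)
    reply = response["message"]["content"]
    append_turn(history, "assistant", reply)
    return reply
```

In the Streamlit app, the same history list is typically kept in `st.session_state` so it survives reruns.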

Links:

Timestamps:
00:00 - Intro
00:26 - Why not use the CLI?
01:17 - Looking at the ollama-py library
02:26 - Setting up Python environment
04:05 - Reviewing Ollama functions
04:14 - list()
04:52 - show()
05:44 - chat()
06:55 - Looking at Streamlit
07:59 - Start writing our app
08:51 - App: user input
11:16 - App: message history
13:09 - App: adding ollama response
15:00 - App: choosing a model
17:07 - Introducing generators
18:52 - App: streaming responses
21:22 - App: review
22:10 - Where to find the code
22:27 - Thank you for 2k
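The generator-based streaming covered from 17:07 onward can be sketched like this. The function name `content_chunks` is mine; the chunk shape `{"message": {"content": ...}}` is what `ollama.chat(..., stream=True)` yields, and `st.write_stream` is available in recent Streamlit versions.

```python
# Sketch of streaming: ollama.chat(..., stream=True) returns an iterator
# of chunks; a small generator extracts just the text so Streamlit can
# render the reply token by token.

def content_chunks(stream):
    """Yield only the text content from each streamed chunk."""
    for chunk in stream:
        yield chunk["message"]["content"]

# In the Streamlit app (not run here):
#   stream = ollama.chat(model=model, messages=history, stream=True)
#   reply = st.write_stream(content_chunks(stream))
```

`st.write_stream` consumes the generator, renders each piece as it arrives, and returns the concatenated text, which can then be appended to the message history.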
Comments

Keep the videos coming; your editing and teaching style are top-notch!

Aberger

Straight to the point! Thank you for sharing.

MrEdinaldolaroque

High quality stuff. Clear and concise. Keep making more such videos.

akashrawat

Great stuff as always! I really appreciate you breaking down the code line-by-line. Very clear explanation.

Bearistotle_

This really helped me at my internship! Thanks a lot and keep the videos coming! 😁

childisch

We want more videos from you; please keep up the pace!

replymeasapp

Really clear and concise videos that actually show how to do things, instead of the tons of videos that only read and attempt to explain research papers. Keep up the good work; this will get you really far as people gradually discover your content.

spartacusnobu

Insanely good playlist and very well presented.

natekidwell

Great video! Precise and really easy to follow.

rijeanirso

You should consider making more and more tutorials. You are the best!!

theubiquitousanomaly

Very good job, I can't wait to see RAG applications. You are an awesome teacher.

mbrihoum

Great video! A great next video would be to add ollama functions, so that a question about the weather, for example, would trigger something like the good ol' get_weather("san francisco") example: you call some external API to get the result and then return it to the user.

tommymalm

I'm glad I found your videos. I was wondering if you're ever going to do a Udemy class. Also, what text editor are you using? It's really clean. Thank you for taking the time to make these videos.

PenicheJose

Fantastic stuff. I am just starting up my company and (it seems that) new clients are queueing up. Your videos are absolutely spot on, so thanks again. If you need more ideas about content: I was wondering whether you could create one about fine-tuning or training a downloaded model (perhaps Phi)? I know OpenAI has this sleek interface of uploading 'ideal' question-answer pairs and getting a trained model as a result. This surely should be possible using your own model, right?

And while we are at it: how would you deploy your own model to a production server?

Take care and keep up the good work!

bjaburg

Thank you. This is very informative. Could you post videos utilizing a Chroma DB persistent state to work with PDF documents and a SQL database?

anurajms

Great video. I would love to see how you would tackle having the output of one model fed to another model, but in a chat environment. For example, qwen1.5 receives input from the user in Chinese and translates it into English, sends it to openhermes mistral 7B as input, and then openhermes responds to the user. Or LLAVA receives a picture from the user and a question based on that picture; LLAVA recognizes the image, sends its output and the user's question about the picture to openhermes mistral 7B, which then responds to the user. The frontend could be simple React code or Streamlit... Not sure if this can be considered agents, but anyway, that would be an awesome video and kind of an extension to this one.

sushicommander

Great video. Could you do a video using langchain, RAG, and streamlit? This would be very helpful.

guanjwcn

Great videos, how about a llama3, streamlit, groq video?

CodeShockDev

Using the ollama API is good, but perhaps you could show a bit about langchain, using ollama as an example? Langchain is as close as it gets to an industry standard for accessing all sorts of models, so showing how to use it would be valuable to your growing community.

mohl-bodell

How do I run the virtual environment on Windows? I think the example you gave was for Linux.

Sammich