How to RUN GEMMA with LANGCHAIN and OLLAMA Locally

In this video, I'll show you how to use Gemma with LangChain and Ollama. First, we'll take a look at Ollama. Next, we'll learn how to use an Ollama model with LangChain. Finally, we'll cover how to work with an Ollama chat model.
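
If you want to follow along locally, the rough flow is: install Ollama, pull the Gemma weights with "ollama pull gemma:2b" (or gemma:7b), and then point LangChain at the local Ollama server. Below is a minimal sketch of the plain LLM usage, assuming the langchain-community package is installed and the Ollama server is running on its default port; exact import paths can vary between LangChain releases.

# Minimal sketch: Gemma 2B served by Ollama, used as a LangChain LLM.
# Assumes "pip install langchain-community" and "ollama pull gemma:2b" have been run,
# and that the Ollama server is listening on the default localhost:11434.
from langchain_community.llms import Ollama

llm = Ollama(model="gemma:2b")  # wraps the locally served Gemma model
print(llm.invoke("Explain in one sentence what Ollama does."))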

00:01 Intro
00:50 Installing Ollama
02:34 LangChain & Ollama
04:31 Working with LLMs
06:00 Working with Chat Models
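
For the "Working with Chat Models" part, LangChain also provides a chat wrapper around Ollama. Here is a minimal sketch under the same assumptions as above (langchain-community installed, gemma:2b already pulled); the class name and import path may differ in newer LangChain versions.

# Minimal sketch: Gemma 2B via Ollama as a LangChain chat model.
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_community.chat_models import ChatOllama

chat = ChatOllama(model="gemma:2b")  # talks to the local Ollama server
messages = [
    SystemMessage(content="You are a concise assistant."),
    HumanMessage(content="What is Gemma and who released it?"),
]
response = chat.invoke(messages)  # returns an AIMessage
print(response.content)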

▶️ LangChain Tutorials:
▶️ Generative AI Tutorials:
▶️ LLMs Tutorials:
▶️ HuggingFace Tutorials:

🔥 Thanks for watching. Don't forget to subscribe, like the video, and leave a comment.

#ai #gemma #generativeai
Comments

amoahs:
Thanks so much for this informative tutorial. What keyboard are you using? It sounds very nice 😊

MavrickMania:
Hi, I have an error while running "ollama run gemma:2b" --> Error: error loading model. TIA!