Beyond Basic LLM Applications: Getting Started With Redis and LangChain

In this video, we're going to take a closer look at how to use LangChain with Redis. First, I'm going to make a case for why you should consider Redis as a backend for your LLM application.

Link to the Colab notebook:

Link to the data:

A full course on how to develop and deploy contextual language model applications with LangChain is available in The Rabbitmetrics Community:

▬▬▬▬▬▬ V I D E O C H A P T E R S & T I M E S T A M P S ▬▬▬▬▬▬

0:00 Introduction and overview

0:28 Why Redis

2:10 Setting up a Redis database

3:42 Redis-py and LangChain

4:07 Connecting LangChain to Redis
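The connection step covered in the chapters above can be sketched roughly as follows. This is a hedged example, not the notebook's exact code: it assumes the `langchain-community` and `redis` packages, a Redis Stack instance whose credentials are supplied via `REDIS_HOST`, `REDIS_PORT`, and `REDIS_PASSWORD` environment variables, and a placeholder index name.

```python
import os

# Redis Cloud shows the host, port, and password in its console; LangChain's
# Redis integration takes them as a single connection URL.
REDIS_HOST = os.environ.get("REDIS_HOST", "localhost")
REDIS_PORT = os.environ.get("REDIS_PORT", "6379")
REDIS_PASSWORD = os.environ.get("REDIS_PASSWORD", "")

if REDIS_PASSWORD:
    REDIS_URL = f"redis://:{REDIS_PASSWORD}@{REDIS_HOST}:{REDIS_PORT}"
else:
    REDIS_URL = f"redis://{REDIS_HOST}:{REDIS_PORT}"

def index_texts(texts, embeddings, index_name="demo-index"):
    """Embed `texts` and index them in Redis as a LangChain vector store.

    The import is deferred so the URL-building code above also works in an
    environment without LangChain installed.
    """
    from langchain_community.vectorstores.redis import Redis
    return Redis.from_texts(
        texts,
        embeddings,
        redis_url=REDIS_URL,
        index_name=index_name,
    )
```

With a running Redis Stack instance you would call something like `vectorstore = index_texts(docs, OpenAIEmbeddings())` and then `vectorstore.similarity_search(query)`; these names follow the LangChain API of the video's era and may differ in newer releases.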
Comments

Most underrated YouTube channel out there for LangChain. I cannot believe that this course is far better than the stuff I have seen in paid courses on Udemy. It is very clear and concise, and you really put all the prerequisites that could be useful upfront, such as what to install, what to import, and additional blog posts for more in-depth research. Thank you.

martindeveloper

How can I find the environment variables for Redis?

mariacamila

Will this work with the Redis alternatives, now that they are persona non grata?

paulmiller

Thank you for this nice tutorial! Do you use the "raw review data", which is 34 GB?

fabianaltendorfer

Thank you for the video

I get this error: Vector index initial capacity 20000 exceeded server limit (1021 with the given parameters)

Is anybody able to help?
Thanks

davidreilly