LangChain - Conversations with Memory (explanation & code walkthrough)


Creating Chat Agents that can manage their memory is a big advantage of LangChain. This video goes through the various types of memory and how to implement them in a LangChain Conversation chain.

My Links:

Github:

#LangChain #BuildingAppswithLLMs
Comments

A video on custom memory will be helpful. Thanks for the series.

aanchalagarwal

Great explanation! Would love to see the custom/combinations one :)

resistance_tn

Thank you so much! I had an app built without any conversation memory, just using chains, and was struggling to convert it to use memory.

You made this very easy to follow and understand.

atylerblack

Love the videos! Thank you for making them. Dying at the b-film footage

ketolm

Another great video. I want to create my own agent with memory, and I'm thinking a vector database is the best way of doing it. It would be great if you could do a similar video outlining some of the different vector database options, with the pros and cons of each.

Jasonknash

I've been experimenting with entity memory in my own ways, and it's pretty wild and probably the most useful for general use. I imagine word-for-word memory would really only matter in something like a story generator or whatnot.

aiamfree

Oh how much I missed that voice. Keep the videos coming and maybe get some sunglasses and a webcam.

m_ke

This is exactly what I needed, thanks so much!

hikariayana

Indeed, this was helpful. Thank you for this video series. The more I work through them, the more my questions are answered :-)

kenchang

This is awesome! I love the way you explain things, Sam! If you ever create an in-depth video course about using LangChain and LLMs, especially regarding extracting particular knowledge from a personal or business knowledge base, let me know please, I'll be the first one to buy it 😍

krisszostak

wow!!! super helpful and thanks a ton for making this tutorial!!

abhirj

I love these tutorials. Learning so much. Thanks.

ygbclow

I like the iced out parrot thumbnails 😎

starmorph

Is it cost effective to use ConversationSummary? From my understanding it needs to summarize our conversation every time.

owszystkim

I think the best approach is to create a small AI handler that manages all of the memory on your device, then sends a very brief summary to the LLM with the necessary info about what the user means. That way we avoid sending too much data, with much more effective prompts than all of the options mentioned above.

abdoualgerian

Great explanation, you deserved my sub!

xxthxforkillxx

Great video! I'm such a big fan of your work now! I'm sure this channel is going places once LLMs become a bit more mainstream in the programming stack. Please keep up the awesome work!
I have a question with regard to the knowledge graph memory section. The sample code given shows that the relevant section never gets populated. Furthermore, the prompt structure has two inputs, {history} and {input}, but we only pass on the {input} part, which might explain why the relevant information is empty. In this case, do you know if there is any use for the relevant information section?

A second query is in regard to the knowledge graph. Since the prompt seems to be contextually aware, even though the buffer doesn't show the chat history, is it safe to say that in addition to the chat log shown (as part of verbose), it also sends the knowledge graph triplets created to the llm to process the response?

noone-jqxw

How do we add a custom prompt with some extra variables while using memory in ConversationChain?
Like I'm trying this but getting the validation error:
PROMPT = PromptTemplate(
    input_variables=["chat_history_lines", "input", "tenant_prompt", "context"],
    template=_DEFAULT_TEMPLATE,
)
llm = OpenAI(temperature=0)
conversation = ConversationChain(
    llm=llm,
    verbose=True,
    memory=memory,
    prompt=PROMPT,
)

Error: 1 validation error for ConversationChain
__root__
Got unexpected prompt input variables. The prompt expects ['chat_history_lines', 'input', 'tenant_prompt', 'context'], but got ['chat_history_lines', 'history'] as inputs from memory, and input as the normal input key. (type=value_error)

hussamsayeed

Amazing explanation! I'm currently trying to use LangChain's JavaScript library to "persist" memory across multiple "sessions" or reloads. Do you have a video on the types of memory that can do that?

z-ro

How do you use the different conversation memory types with LCEL?

pec