LangChain 🦜️ - COMPLETE TUTORIAL - Basics to Advanced Concepts!


Timestamps:
0:00 Course Overview
1:24 Talk to the OpenAI API without LangChain
5:04 Models, Prompts, Chains and Parser Basics
12:16 Chains: SequentialChains and RouterChains
21:11 Memory - Buffer vs. Summary
24:50 Index - Loaders, Splitters, VectorDBs, RetrievalChains
32:44 Agents - LLMs plus Tools
39:00 How ChatGPT Plugins work
41:35 Evaluation - Use LLMs to evaluate LLMs

#langchain, #chatgpt
Comments

This channel is a real gem. Excellent teaching style, and the tutorials are concise yet filled with a treasure trove of information and application. Thank you for making these, and I hope you will continue to explore AI with us. Will definitely look forward to your first masterclass.

AdrianMark

Best Langchain video I have ever watched. 🎉

azizultex

Exactly what I was looking for! Awesome job. Keep it up 🎉

anikcabidenov

That was a lot of work. Thank you for making this very enlightening video.

mjmtaiwan

Really helpful videos ❤!! I would like to add: at 11:10, if you are getting the warning 'LangChainDeprecationWarning: The function `__call__` was deprecated in LangChain 0.1.7 and will be removed in 0.2.0. Use invoke instead.', change the line `response = chat(messages)` to `response = chat.invoke(messages)` :)

RipNonoRasta

This is awesome, thanks! I wonder how to add the recently announced function calling functionality to our project, since it makes it much easier to work with structured JSON objects.

canerakca

Thank you so much for sharing your knowledge and expertise.

Canna_Science_and_Technology

answers all my doubts, great job sir!!

nikhilsharma

Amazing video! I have one question. In the Index part of the video, you create the prompt_template with two arguments, 'question' and 'context'. But then you never pass the 'question' and 'context' values, so I think your query is not taking your template into consideration.

To verify it, I added to the prompt 'In all your answers, start them by saying "Dear Pedro" ' and it didn't follow the instruction.

I'm looking for ways to combine PromptTemplate and Embeddings, to be able to give the model a prompt while also using data from .txt files.

PedroSotomayor-lhfp

It's a very good video to understand LangChain.

arunakumari

Excellent instruction. How would you approach getting SEO keywords from a blog article using LangChain? Perhaps make a video about how that would be done.

StephenPasco

Marcus!! Are you going to make a 2024 version of this with LCEL and the updated APIs?
A hug from Spain!

weqokjq

Minute 2: it needs updating for the new OpenAI client:

from openai import OpenAI

client = OpenAI(api_key=api_key)

def chat(input):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": input}],
        temperature=0.7,
        max_tokens=256,
        top_p=1,
    )
    return response.choices[0].message.content

otimistarj

The problem will always be token usage. Even though it's already reduced a lot, the retrieved context is still sent to the OpenAI API in full, so it will burn through your token credits quickly depending on how big your embedded data is.

savire.ergheiz

Awesome video! What about a locally running open-source LLM? That would be a game changer for German companies, especially in finance, since the burdens and regulations are very strict.

monkeyy

What modifications are needed to run against other LLM models, like those on Hugging Face?

a

Issues while using RetrievalQA

```
ValidationError: 2 validation errors for RetrievalQA
retriever
field required (type=value_error.missing)
retriver
extra fields not permitted (type=value_error.extra)
```

joxa

Nice! Can you run the notebooks on Google Colab as is, or are modifications needed?

a

31:00 Why don't we need to pass a parameter for the "context" variable, and how does the qa object know our query goes into the "question" variable?

nervous

Hello 👋 I am new to AI. I just subbed to your channel and will be coding along soon, but a quick question: what is the difference between "AutoGPT" and "function calling"? I hear them quite a lot yet still don't understand them well.

ihateorangecat