Getting Started with LangChain: Load Custom Data, Run OpenAI Models, Embeddings and ChatGPT

In this video, you'll learn about the LangChain Python library and how to set up a Google Colab notebook to run a simple OpenAI model. We'll then load an external webpage, specifically the Twitter Recommendation Engine blog post, and ask questions about it using OpenAI embeddings and ChatGPT. Finally, we'll explore prompt templates and chains, combining all of the components for a brief look at the inner workings of LangChain.
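
For readers who want to follow along outside the video, here is a minimal sketch of that question-answering workflow. It assumes a pre-0.1 LangChain release with `openai`, `chromadb`, `tiktoken`, and `beautifulsoup4` installed; the blog-post URL, chunk sizes, model name, and question are illustrative placeholders, not the exact values used in the video.

```python
import os

from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import WebBaseLoader
from langchain.embeddings import OpenAIEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import Chroma

os.environ["OPENAI_API_KEY"] = "sk-..."  # your OpenAI API key

# Load the blog post from the web (placeholder URL).
loader = WebBaseLoader("https://example.com/twitter-recommendation-algorithm")
documents = loader.load()

# Split the page into overlapping chunks and index them with OpenAI embeddings.
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_documents(documents)
vectorstore = Chroma.from_documents(chunks, OpenAIEmbeddings())

# Answer questions with ChatGPT, grounded in the retrieved chunks.
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    retriever=vectorstore.as_retriever(),
)
print(qa.run("How does the recommendation engine rank tweets?"))
```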

00:00 - Intro
00:45 - LangChain Overview
02:36 - Prompt Engineering GitHub Repository
03:00 - Google Colab Setup
03:57 - Use an OpenAI model
05:38 - Q&A on Blog Post
11:27 - Prompt Templates
12:49 - Chain for Q&A
16:36 - Conclusion
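
The Prompt Templates (11:27) and Chain for Q&A (12:49) sections combine a reusable prompt with a model into a chain. A minimal sketch, again assuming the pre-0.1 LangChain API, with placeholder prompt text and inputs:

```python
from langchain import LLMChain, PromptTemplate
from langchain.chat_models import ChatOpenAI

# A reusable prompt with named input variables.
template = """Answer the question using only the context below.

Context: {context}

Question: {question}"""
prompt = PromptTemplate(template=template, input_variables=["context", "question"])

# A chain ties the prompt and the model together.
chain = LLMChain(llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0), prompt=prompt)

answer = chain.run(
    context="<text retrieved from the blog post>",
    question="Which signals does the ranking model use?",
)
print(answer)
```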

#chatgpt #promptengineering #gpt4 #artificialintelligence #python #langchain
Comments

Hey everyone,
Thank you for watching!

venelin_valkov

Great video with a lot of useful information. I had never used Google Colab before, and this was just what I was looking for. Thank you.

vlmsuon

Please explain how to count OpenAI tokens so I can estimate my token spending on ChatGPT. Thank you!

BonardTitoSaragihGaringging
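
For counting, OpenAI's `tiktoken` library (`pip install tiktoken`) exposes the tokenizer the chat models use. A rough sketch (the prompt is a placeholder, and the few extra per-message tokens added by the chat format are ignored here):

```python
import tiktoken

# Tokenizer used by the gpt-3.5-turbo / gpt-4 chat models.
encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")

prompt = "Summarize the Twitter recommendation algorithm in one paragraph."
num_tokens = len(encoding.encode(prompt))

# Multiply by the model's per-1K-token price to estimate spending.
print(f"{num_tokens} tokens")
```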

And how can I use big models from Hugging Face? I can't load them into memory because many of them are bigger than 15 GB, and some are 130 GB+. Any thoughts?

botondvasvari
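
One way around local memory limits is LangChain's Hugging Face Hub wrapper, which calls the hosted Inference API instead of downloading weights. A minimal sketch, assuming the same pre-0.1 LangChain API, a valid Hugging Face token, and a model that the Inference API actually serves (the repo id below is just an example):

```python
import os

from langchain.llms import HuggingFaceHub

os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."  # your Hugging Face token

# The model runs on Hugging Face's servers, so its weights never have to fit in local memory.
llm = HuggingFaceHub(
    repo_id="google/flan-t5-xl",  # placeholder model id
    model_kwargs={"temperature": 0.5, "max_length": 128},
)
print(llm("What is LangChain?"))
```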