Build a CHATBOT with AWS Amazon Bedrock - Llama 2, LangChain and Streamlit [HANDS-ON]

This use case is part of my best-selling Udemy course on AWS Bedrock and Generative AI, which can be accessed from the link below:

In this video we will learn how to build a chatbot with Amazon Bedrock (Llama 2), LangChain and Streamlit.

This is a hands-on tutorial where the following will be demonstrated; a minimal code sketch follows the list.
- End-to-End Demo
- Architecture
- Step-by-Step Coding
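For context before the walkthrough, here is a minimal sketch of the pattern the video builds: a Bedrock-backed LangChain LLM wrapped in a ConversationChain with buffer memory, served through a Streamlit chat UI. The model ID, region, and generation parameters are assumptions, not the video's exact values; substitute whatever your Bedrock account has enabled.

```python
# chatbot_app.py - minimal sketch of a Bedrock + LangChain + Streamlit chatbot.
# Assumed: Llama 2 Chat access is enabled in Bedrock and AWS credentials are configured.
import boto3
import streamlit as st
from langchain_community.llms import Bedrock        # newer code should use langchain_aws.BedrockLLM
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# Bedrock runtime client; the region is an example - pick one where the model is available.
bedrock_client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Llama 2 13B Chat with example generation parameters.
llm = Bedrock(
    client=bedrock_client,
    model_id="meta.llama2-13b-chat-v1",
    model_kwargs={"temperature": 0.5, "top_p": 0.9, "max_gen_len": 512},
)

# Keep the conversation chain in session state so memory survives Streamlit reruns.
if "chain" not in st.session_state:
    st.session_state.chain = ConversationChain(llm=llm, memory=ConversationBufferMemory())

st.title("Chatbot with Amazon Bedrock (Llama 2)")
user_input = st.chat_input("Ask me anything")
if user_input:
    reply = st.session_state.chain.predict(input=user_input)
    st.chat_message("user").write(user_input)        # for brevity, only the latest turn is rendered
    st.chat_message("assistant").write(reply)
```

Run it with `streamlit run chatbot_app.py`; earlier turns stay in the chain's memory even though this sketch only renders the latest exchange.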

Check out other videos in the Amazon Bedrock playlist:

Use Case 2: Build a CHATBOT with AWS Amazon Bedrock - Llama 2, LangChain and Streamlit

Use Case 3: Build an HR Q&A App with Retrieval Augmented Generation (RAG), AWS Bedrock, FAISS, LangChain, Streamlit

Use Case 4: Build a Serverless e-Learning App using Amazon Bedrock Knowledge Base + AWS Lambda + Claude FM + API Gateway:

Video 5: Basics of Vectors, Vector Embeddings, Embedding Models, Chunking, Vector DBs, Cosine Similarity, and the KNN Algorithm:

Check out my other Udemy courses below:

Comments

Thank you for making an amazing video on Bedrock 🎉

shashipal

@rahul thanks for the video - can we use our local text files as input to an LLM like Llama 2 that can also be deployed locally?

rachitsharma

Hi, Llama 2 Chat models are showing as unavailable in all regions. In which region did you access this model? Are there any alternative models available in AWS to use for this chatbot?

onestopzz

Wow, this is something I wanted! Can you tell me which model in AWS is good for fine-tuning on documents and returning results quickly?
And could you please make a video on that?

techtomorrow

This is great, thanks for sharing. Looking for a chatbot with Amazon Bedrock Agents + Knowledge Base, Claude, LangChain, and Streamlit.

michaelwahl

Straight out of the gate those responses are not being parsed correctly; the model is continuing after the stop token, hence the extra Human turns and the Llama's back-and-forth.
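One workaround, sketched here rather than taken from the video, is to post-process the completion and cut it at the first invented dialogue turn before rendering it:

```python
# Hedged workaround: truncate the completion at the first hallucinated turn marker.
def trim_response(text: str, markers=("Human:", "\nAI:")) -> str:
    """Return the text up to the earliest occurrence of any turn marker."""
    cut = len(text)
    for marker in markers:
        idx = text.find(marker)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut].strip()
```

Wrapping whatever `predict()` returns with `trim_response()` before writing it to the chat keeps the extra `Human:`/`AI:` turns out of the UI.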

thehve

Can you share the Git repo for the session code?

khandetanvikram

Great video, thanks.
What if I want to integrate or build such a chatbot using Bedrock for my website (not Streamlit as the frontend)? Any suggestions or recommendations?

indrajitg

How do we add contextual info to this model so that the chatbot can answer contextual questions as well as generic ones?
Thanks

iterator

I am getting the below error:
The class `Bedrock` was deprecated in LangChain 0.0.34 and will be removed in 0.3. An updated version of the class exists in the langchain-aws package and should be used instead. To use it run `pip install -U langchain-aws` and import as `from langchain_aws import BedrockLLM`.
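The warning itself spells out the fix: install `langchain-aws` and switch the import. A sketch of the updated usage (the model ID and region are example values):

```python
# pip install -U langchain-aws
from langchain_aws import BedrockLLM   # replacement for the deprecated Bedrock class

llm = BedrockLLM(
    model_id="meta.llama2-13b-chat-v1",   # example model ID
    region_name="us-east-1",              # example region
    model_kwargs={"temperature": 0.5, "top_p": 0.9, "max_gen_len": 512},
)
```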

nehajaju

Having trouble with authentication: "Could not load credentials to authenticate with AWS client. Please check that credentials in the specified profile name are valid." Although I set the tokens properly - any solutions?
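One way to debug this, sketched here rather than taken from the video: create the boto3 session explicitly with the profile you configured, confirm it can resolve credentials, and hand the resulting client to the LangChain wrapper. The profile name below is an assumption.

```python
import boto3

# "default" is an assumed profile name - use whatever `aws configure --profile <name>` created.
session = boto3.Session(profile_name="default")
print(session.get_credentials())   # None means the keys/profile were not picked up

bedrock_client = session.client("bedrock-runtime", region_name="us-east-1")
# Pass this client to the LangChain Bedrock/BedrockLLM wrapper via its `client` argument
# instead of relying on `credentials_profile_name`.
```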

varun_tech

Is it mandatory to configure the AWS CLI with the access key and secret key? Can't we put the key credentials in code or in a .env file and pass them to the Bedrock function?
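Configuring the AWS CLI is not mandatory; boto3 also accepts credentials passed explicitly or via the standard environment variables, so a .env file works if it is loaded before the client is created. A hedged sketch using python-dotenv (the variable names are the standard AWS ones; everything else is an assumption about your setup):

```python
# pip install python-dotenv boto3
import os
import boto3
from dotenv import load_dotenv

# .env (kept out of version control) contains:
#   AWS_ACCESS_KEY_ID=...
#   AWS_SECRET_ACCESS_KEY=...
#   AWS_DEFAULT_REGION=us-east-1
load_dotenv()

bedrock_client = boto3.client(
    "bedrock-runtime",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    region_name=os.getenv("AWS_DEFAULT_REGION", "us-east-1"),
)
# Hand this client to the LangChain wrapper via its `client` argument.
```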

draaken

I don't see in the code where you added the credentials. But these are not free APIs, correct? So where do you do that? Even if you are accessing it from the OS, you still have to do that using code, correct?

cosmicfluke

Is there a way to trim the response, to avoid the unnecessary "Human:" and "AI:" parts of the answer?

varun_tech

Which model is good for parsing Word and PDF documents in bulk?

premymn

Thank you. What is the infrastructure requirement for the above demo?

VISHVANI

How can we use Bedrock to query data from a database without calling an API?

kunalpinyani

Is there a free trial/tier? For your demo, how much does it cost?

dangollayan

Sir, I am getting an error: "You don't have access to the model with the specified model ID."

saisampath

Why does the LLM's output contain questions and answers that we didn't ask for? Example: "Human: That's understandable, can you tell me a joke?" - we never asked that specific question.

maruthisrinivas