Llama3 via Groq API | Super Fast Inference | LangChain | Chainlit

In this video, I will show you how to use Llama3 via Groq, in the playground as well as through the API, and also demonstrate how you can create a simple ChatGPT-like UI locally on your computer. You can follow along with me by cloning the repo locally. The idea is that once you know this approach, you can quickly switch between different models. A minimal sketch of the app follows the stack list below.

Open Source in Action 🚀
- Llama3 as the Large Language Model.
- LangChain as the framework for working with the LLM.
- Groq to serve the LLM with fast inference.
- Chainlit to deploy the chat UI.
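
For reference, here is a minimal sketch of the kind of app the video builds: ChatGroq (Llama3) wired through a LangChain prompt into a Chainlit chat UI. The model id and the GROQ_API_KEY environment variable are assumptions on my part; adapt them to the repo you clone.

```python
# Minimal sketch: Llama3 on Groq + LangChain + Chainlit chat UI.
# Assumes GROQ_API_KEY is set in the environment and that the
# "llama3-8b-8192" model id is available on your Groq account.
import chainlit as cl
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_groq import ChatGroq


@cl.on_chat_start
async def on_chat_start():
    # Build the prompt -> LLM -> string-parser chain once per chat session.
    llm = ChatGroq(model="llama3-8b-8192", temperature=0)
    prompt = ChatPromptTemplate.from_messages(
        [
            ("system", "You're a very knowledgeable Machine Learning Engineer."),
            ("human", "{question}"),
        ]
    )
    chain = prompt | llm | StrOutputParser()
    cl.user_session.set("chain", chain)


@cl.on_message
async def on_message(message: cl.Message):
    # Stream the model's answer back into the Chainlit UI token by token.
    chain = cl.user_session.get("chain")
    msg = cl.Message(content="")
    async for chunk in chain.astream({"question": message.content}):
        await msg.stream_token(chunk)
    await msg.send()
```

Run it with `chainlit run app.py -w` and the chat UI opens at http://localhost:8000. Switching to another Groq-hosted model is just a matter of changing the model id passed to ChatGroq.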

👉🏼 Links:

------------------------------------------------------------------------------------------

------------------------------------------------------------------------------------------
🤝 Connect with me:

#langchain #chainlit #groq #llama3 #chatui #chatgpt #datasciencebasics
Comments

Thanks for this.

When I run the code, the chatbot replies with two responses.

Each time it first shows 'Used ChatGroq' and the input and output as JSON.

Then, immediately after that and just like in your example, it shows 'ChatBot' with the actual response to the question/prompt.

Do you know why it does that? My UI looks different compared to yours as well.

martinsmuts

Bro, can you do more on agents please, for total beginners? Thanks again.

dmistclesgee

When I tried to change the template from "You're a very knowledgeable Machine Learning Engineer." to, for example, "You're a very knowledgeable AI assistant", the code threw errors. I have no idea how to make them go away, and such a prompt also distorts the responses to other questions I tried to throw at it. Please help. Thanks.

wcwong

Hello bro, I was going through your Databricks in 30 days series - very good videos. But I did not see the files on GitHub for practice. Can you please update those? Thanks

DataScienceGuy-zsip

Is there a JS / TS version of this code, pls?

patricktang