Unleash the Power of Falcon with LangChain: Step-by-Step Guide to Run Chat App using Chainlit

Discover the Power of Falcon LLM with LangChain! Learn how to run a chat UI using this advanced language model in our step-by-step tutorial. Follow along as we guide you through the setup process, including running Falcon with LangChain, configuring the chat app deployment, and launching the chat UI.
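For context, here is a minimal sketch of this kind of app, assuming the Chainlit releases of that period (which expose the `langchain_factory` decorator) and LangChain's `HuggingFaceHub` wrapper; the Falcon repo id, prompt text, and generation parameters below are illustrative choices, not necessarily the exact ones used in the video.

```python
# app.py -- minimal Falcon + LangChain + Chainlit sketch (older Chainlit API).
# Assumes: pip install chainlit langchain huggingface_hub
# and a Hugging Face token exported as HUGGINGFACEHUB_API_TOKEN.
import os

import chainlit as cl
from langchain import HuggingFaceHub, LLMChain, PromptTemplate

# Falcon is called remotely through the Hugging Face Inference API;
# nothing is downloaded to the local machine or codespace.
llm = HuggingFaceHub(
    repo_id="tiiuae/falcon-7b-instruct",  # or "tiiuae/falcon-40b-instruct"
    huggingfacehub_api_token=os.environ["HUGGINGFACEHUB_API_TOKEN"],
    model_kwargs={"temperature": 0.6, "max_new_tokens": 500},
)

template = """You are a helpful assistant.

Question: {question}

Answer:"""

@cl.langchain_factory(use_async=False)  # later 0.x releases require use_async explicitly
def factory():
    prompt = PromptTemplate(template=template, input_variables=["question"])
    return LLMChain(prompt=prompt, llm=llm, verbose=True)
```

Running `chainlit run app.py -w` then serves the chat UI locally; Chainlit feeds each user message through the returned chain and sends the result back to the page.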

Comments
Author

Grateful to have people like you on YouTube, sharing valuable information with the developer community :)

hassankamran
Author

You are our guide in the dark with your tutorials. We're thankful that you offer them for free, so we don't need to buy courses. :) Thanks!

tursunable
Author

Thank you, all the best, keep up the good work. "Bon courage et merci" (good luck and thank you)!

vhater
Author

I have watched two of your videos and they were both impressive. Good job on putting the app together and also clearly explaining it. Thanks. Liked and subscribed.

newcooldiscoveries
Author

Please do another video showing LangChain doing cooler stuff, like iterating over output from the LLM to create new prompts from templates.

pleabargain
Author

I'm skeptical that it's the 40B model, the biggest one; people complain it's very slow and demands very powerful hardware.
I'm waiting for Orca to become available.

fontenbleau
Author

Running the chat gives me "Error raised by inference API: Internal Server Error".

pushkar
Author

9:45 Please do a video on using agents with Chainlit.

pleabargain
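On the agents request above: a rough sketch of how a LangChain agent could be returned from the same Chainlit factory. This is not from the video; the `word_count` tool is a made-up example, and `llm` is assumed to be the `HuggingFaceHub` Falcon wrapper from the sketch under the description.

```python
# Return a LangChain agent instead of a plain LLMChain from the Chainlit factory.
import chainlit as cl
from langchain.agents import AgentType, Tool, initialize_agent

def word_count(text: str) -> str:
    """Toy tool: count the words in the given text."""
    return str(len(text.split()))

tools = [
    Tool(
        name="word_count",
        func=word_count,
        description="Counts the number of words in the input text.",
    )
]

@cl.langchain_factory(use_async=False)
def factory():
    # llm = the HuggingFaceHub Falcon wrapper defined in the earlier sketch
    return initialize_agent(
        tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True
    )
```

Note that smaller instruct models often struggle to follow the ReAct format, so expect mixed results with this setup.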
Author

I tried it exactly the same way you did in the video, but I encountered an error:
"ValueError: langchain_factory use_async parameter is required". Do you have a solution for it?

sjthtnz
Author

Hi, can I run this on Colab or Kaggle? And can I use PDFs or other docs to chat with my data?

QHawk
Author

I followed the steps you mentioned above, but I still got this error:

AttributeError: module 'chainlit' has no attribute 'langchain_factory'

tamanna-
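The AttributeError above usually means a newer Chainlit release is installed: the `langchain_factory` decorator was removed in later versions in favour of the `on_chat_start` / `on_message` callbacks. A rough sketch of that newer pattern, where `build_chain` is a placeholder for whatever builds the LLMChain and the exact callback signature depends on the Chainlit version:

```python
# Newer Chainlit versions drop langchain_factory; build the chain on chat start,
# stash it in the user session, and call it from the message handler.
import chainlit as cl

@cl.on_chat_start
async def start():
    cl.user_session.set("llm_chain", build_chain())  # build_chain(): your own helper

@cl.on_message
async def main(message: cl.Message):
    chain = cl.user_session.get("llm_chain")
    # Run the blocking LangChain call off the event loop.
    answer = await cl.make_async(chain.run)(message.content)
    await cl.Message(content=answer).send()
```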
Author

Hi, it does not perform well on follow-up questions (related to the last question). Can you share an implementation where we can include chat history, probably by using memory from LangChain? Thanks in advance.

sks_DS
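On the chat-history question above: one common approach, not shown in the video, is to give the chain a LangChain memory object so that earlier turns are fed back into the prompt. A sketch assuming the same `llm` object and older Chainlit decorator as in the first snippet:

```python
# Swap the plain LLMChain for a ConversationChain with buffer memory so
# follow-up questions can see the earlier turns of the conversation.
import chainlit as cl
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

@cl.langchain_factory(use_async=False)
def factory():
    return ConversationChain(
        llm=llm,                            # the HuggingFaceHub Falcon LLM from the first sketch
        memory=ConversationBufferMemory(),  # keeps the running chat history in the prompt
        verbose=True,
    )
```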
Author

Hey, great content. Question: is it possible to connect Chainlit with a Bubble app?

JoaquinTorroba
Author

I appreciate your video, but I need clarification regarding the process when running the code. Does it involve downloading the model onto your machine or codespace container, or does it utilize the Huggingface key to make an API call to the hosted model on the Huggingface hub? In other words, where is the model actually hosted?

IamalwaysOK
Author

Hi. You are great, but I don't understand; I have a couple of questions 😓.
I installed the chatbot, but it gives some "bad" responses to questions. Do we need to train the program?
And second, can I install this on my page? Thank you very much ❤

Clipclips
Author

Have you tried using Chainlit in a Docker container? Can we do it?

nikk
Author

Why, when you create the codespace, does it say "Codespace usage for this repository is paid for by misbahsy"? Is that a public Codespace you created, and are you paying for it so other users can use it? Can you explain what's happening?

mayorc
Author

I don't understand, what is this chat UI for?

logost
Author

How can we use it for a custom dataset or PDF file? And can it be multilingual?

ahmedewis
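On chatting with PDFs or other documents, asked a couple of times above: the video only covers the plain chat UI, but a common LangChain pattern is to index the files into a vector store and query them through a retrieval chain. A rough sketch, assuming `pypdf`, `sentence-transformers`, and `faiss-cpu` are installed and `llm` is the Falcon wrapper from the first snippet; the file name and chunk sizes are placeholders:

```python
# Index a local PDF and answer questions over it with a retrieval chain.
from langchain.chains import RetrievalQA
from langchain.document_loaders import PyPDFLoader
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.vectorstores import FAISS

docs = PyPDFLoader("my_document.pdf").load()  # placeholder file name
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(docs)

# Embeddings run locally; only the LLM call goes out to the Inference API.
db = FAISS.from_documents(chunks, HuggingFaceEmbeddings())

qa = RetrievalQA.from_chain_type(llm=llm, retriever=db.as_retriever())
print(qa.run("What is this document about?"))
```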
Author

I'm sorry, but could you please let me know why we need LangChain here?

riyaz