LangChain + HuggingFace's Inference API (no OpenAI credits required!)

We combine LangChain with GPT-2 and HuggingFace, a platform hosting cutting-edge LLMs and other deep learning AI models. This has the added benefit of not incurring any charges, since we're not using OpenAI (might as well, seeing we're in a recession right about now) but instead the GPT-2 model hosted on HuggingFace.

LangChain is a fantastic tool for developers looking to build AI systems on top of a variety of LLMs (large language models, like GPT-4, Alpaca, LLaMA, etc.), as it helps unify and standardize the developer experience across text embeddings, vector stores / databases (like Chroma), and chaining them together for downstream applications through agents.
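In rough code terms, this is what the combination looks like (a minimal sketch, not the exact notebook from the video, which lives in the GitHub repo mentioned below; the prompt and parameter values here are illustrative):

```python
# Minimal sketch: querying the GPT-2 model hosted on the HuggingFace Hub through
# LangChain's HuggingFaceHub wrapper. Requires only a free HuggingFace API token,
# no OpenAI key or credits. Prompt and parameter values are illustrative.
import os

from langchain import LLMChain, PromptTemplate
from langchain.llms import HuggingFaceHub

os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_..."  # your free HuggingFace token

llm = HuggingFaceHub(repo_id="gpt2", model_kwargs={"temperature": 0.7})

prompt = PromptTemplate(
    input_variables=["question"],
    template="Question: {question}\nAnswer:",
)

chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run("What is a large language model?"))
```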

Mentioned in the video:

- Watch PART 2 of the LangChain / LLM series:

- HuggingFace's T5 base finetuned WikiSQL model (see the sketch after this list)

- HuggingFace GPT2 model
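
For the T5 WikiSQL model listed above, the call shape is the same as with GPT-2; a sketch, assuming the mrm8488/t5-base-finetuned-wikiSQL repo id and its "translate English to SQL:" prompt prefix (verify both against the model card):

```python
# Sketch: pointing the same LangChain wrapper at a WikiSQL-finetuned T5 model
# instead of GPT-2. The repo id and the "translate English to SQL:" prefix are
# assumptions based on the mrm8488/t5-base-finetuned-wikiSQL model card.
from langchain import LLMChain, PromptTemplate
from langchain.llms import HuggingFaceHub

llm = HuggingFaceHub(
    repo_id="mrm8488/t5-base-finetuned-wikiSQL",
    model_kwargs={"temperature": 0.1, "max_length": 64},
)

prompt = PromptTemplate(
    input_variables=["question"],
    template="translate English to SQL: {question}",
)

chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run("How many users signed up last month?"))
```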

All the code for the LLM (large language models) series featuring GPT-3, ChatGPT, LangChain, LlamaIndex and more is on my GitHub repository, so go and ⭐ star or 🍴 fork it. Happy Coding!
Comments

Such an informative video. Thanks for sharing this, Samuel. Is there a way to download the Hugging Face models locally and run them without the internet?

dapoint
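
One possible way to do this (a sketch, not the approach shown in the video): download the model once with transformers, then wrap the local pipeline with LangChain's HuggingFacePipeline so no Inference API call is needed.

```python
# Sketch (not the method used in the video): run GPT-2 fully locally by wrapping
# a transformers pipeline instead of calling the hosted Inference API. The model
# is downloaded once to the local HuggingFace cache and then works offline.
from transformers import pipeline
from langchain.llms import HuggingFacePipeline

generator = pipeline("text-generation", model="gpt2", max_new_tokens=50)
llm = HuggingFacePipeline(pipeline=generator)

print(llm("LangChain is a framework for"))
```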

Definitely informative for an introduction, I agree. BUT it seems that something is wrong with "requirements.txt": the components 'distutils', 'torch', 'triton' don't have the right versions, and last but not least there is no version of 'uvloop' for Windows. Need some help.

jkiobiq

How do you explicitly define the length of the text being generated with this HuggingFaceHub?

cm-a-jivheshchoudhari
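
On the length questions in this thread: generation parameters can usually be passed through model_kwargs on the wrapper; a sketch, with max_new_tokens as an assumption to verify against the Inference API docs for your model's task.

```python
# Sketch: passing generation parameters via model_kwargs so the hosted model
# returns longer completions. "max_new_tokens" is an assumption here; the
# accepted parameter names depend on the model's task on the Inference API.
from langchain.llms import HuggingFaceHub

llm = HuggingFaceHub(
    repo_id="gpt2",
    model_kwargs={"temperature": 0.7, "max_new_tokens": 200},
)

print(llm("Explain what a vector database is."))
```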

Great content and to the point. Thank you!

GEORGEBELG

Thanks for this video! Quick question: is there any risk of PII leakage in this implementation? Asking this because we cannot see the model getting downloaded locally.

SudhakarVyas

How do you get the full output without worrying about max_length?

karthikj

Definitely informative for an introduction. I learned something very important about HF. But I did not hear a lot of 'langchain' discussion. I'll check out your other vids to see what you got.

Dr_Tripper

Hi, good video. I have a question: with HuggingFace LLMs, do I need to download them to my PC? I ask because I have an old laptop with little RAM. Any advice please?

Freeguy_

Thank you for the video. Do you have an example using T5 with a postgres db?

chelciedealmeida

Does Hugging Face also have credits? So would it stop working if the credits expire? (for image generation)

Swaggerdawg

Hi, I have a quick question. First of all, thank you for the great content, instantly subscribed. Second, where did you pass in the database in this video? Thank you!

jorgefelipegaviriafierro

Hi, I have to query from a database. Can you suggest a way to do transfer learning on a pre-trained model?

sanjayjs

Hey, when running the same code you gave, my code generates a short/single line and then repeats that same line until it reaches the word limit (100 here).

justinjoseph

How does its performance compare to a paid OpenAI key?

izainonline

Hey Samuel, thanks a lot for the video. I didn't know before that HF has a free API.

But a quick question: I'm building a project that requires an LLM and I can't pay for the OpenAI credits, so can the HF free API handle it if the project has multiple requests and acts as a chatbot? Or is there another way I can get a free LLM API for building applications?

hussienhamza

This code is not working now.
Can you please make a new video explaining how to use Hugging Face and LangChain?

jhhh

Hi
Thank you so much for your great video.
I have a question: can I use GPT-2 the way you explained in the video for question answering?

moslemsamiee

Can we use an image-to-text model with this method? How can we do this using only an API request?

unexpectedworld

Great video! Any thoughts on which of the open-source LLMs are comparable to GPT-3.5 or GPT-4 in terms of performance? GPT-2 and T5 seem too simplistic to be useful.

moreshk

Hello, I wanted to tell you that I have trouble understanding everything you explain in this video, since first of all my English is poor,

and above all I don't know anything about code, so I have trouble following.

But I followed a tutorial that lets you use and automate ChatGPT in a Google Sheet without touching the code,

and I was wondering whether, with this method, I could ask ChatGPT to describe the images that are in my Google Sheet?

Thank you, Cyril

SD-rgmj