100% Offline ChatGPT Alternative?

In this video I show how I was able to install an open-source Large Language Model (LLM) called h2oGPT on my local computer for 100% private, 100% local chat with a GPT.


Timeline:
00:00 100% Local Private GPT
01:01 Try h2oGPT Now
02:03 h2oGPT Github and Paper
03:11 Model Parameters
04:18 Falcon Foundational Models
06:34 Cloning the h2oGPT Repo
07:30 Installing Requirements
09:48 Running CLI
11:13 Running h2oGPT UI
12:20 Linking to Local Files
14:14 Why Open Source LLMs?
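The install-and-run steps from the timeline can be sketched roughly as follows. This is a sketch, not the video's exact commands: the Falcon model name and the `generate.py` flags are assumptions based on the video's demo, so check the h2oGPT README for current instructions.

```shell
# Clone the h2oGPT repository (06:34)
git clone https://github.com/h2oai/h2ogpt.git
cd h2ogpt

# Install the Python dependencies (07:30)
pip install -r requirements.txt

# Run the command-line chat client (09:48); the Falcon model name
# below is an assumption based on the video's Falcon demo
python generate.py --base_model=h2oai/h2ogpt-oasst1-512-falcon-7b --cli=True

# Or launch the web UI instead (11:13)
python generate.py --base_model=h2oai/h2ogpt-oasst1-512-falcon-7b
```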

Comments

Wow, I have a lot of saved documents, articles, and even e-books on my computer. The idea of my own local chatbot being able to reference all of this and carry on a conversation with me about it is almost like having a friend who shares all my interests and has read all the same things I've read! Amazing how the technology is advancing! I can't wait for this!

AmazingArends

I didn't know that you were working for h2o, but I am happy for you all. You're doing great work making open-source LLMs more accessible and friendly!

LaHoraMaker

This is amazing and very well put together! You have one of my favorite channels on all of YouTube and I’ve been able to follow what you teach. It’s an honor to learn from you! Thank you!!

chillcopyrightfreemusic

Great video. I hope this gets a lot of views because it is relevant to so many different use cases that need to protect source data. Love the demo of how easy it is to load and vectorize your own local docs with LangChain.

mb

Awesome!!! Here I was losing hope about AI/GPT ever being transparent about the biases getting "baked" into popular chatbots, and about the lack of real privacy users have in what they discuss. And blammo, you guys already had this out in just a few months. Super cool!! Thanks to all who went in on this!

WeTheLittlePeople

I've been using chatbots to write a tabletop RPG campaign for my friends, but having the main story in separate files has been a problem. If I can use the material I already have as my own training material, it might be way easier! This chatbot might be exactly what I need! Cool, I will give it a go!

jannekallio

YES PLEASE make another video where you set up all of this in a cloud environment instead of locally. Excellent video, thank you very much.

johnzacharakis

The best explanation so far. I have experience using GPT4All, self-hosted Whisper, Wav2Lip, and Stable Diffusion, and I also tried a few others that I failed to run successfully. The AI community is growing so fast and it's fascinating. I'm using an RTX 3060 12GB and the speed is okay for a chatbot use case, but for real-time AI game-engine character responses it is slow. I recently got hold of an RTX 3080 10GB, and in this video I see you are using an RTX 3080 Ti, which has 10240 CUDA cores vs. my 8960. It's the first time I've seen that you can use additional cards, in your case a GTX 1080 vs. my GTX 1060, to run the LLM. Very informative video!

AnimousArchangel

I would also like to see another video from you about setting it all up in a cloud environment. Thanks for sharing your knowledge.

onandjvjxiou

Hello Rob,

I liked your video very much. I wanted to suggest that you consider making a video on how translation or voice-to-text could become a tool for everyone, based on an open language model. It would be an interesting topic to explore and could benefit many people. Thank you for your great content!

TylerMacClane

Could you do a video on the “fine tuning” you talk about near the end? I like the privacy attribute of running open source locally and the fine tuning would be really beneficial.

kevinmctarsney

I love the content, even though almost no one knows about this. Very, very useful content. We are expecting a cloud-version demo as well. Thank you!

siva_geddada

This was great! I'm in the process of setting up LangChain locally with OpenLLM as a backend, but I think I'll try this as a next step. Thanks for sharing!

pancakeflux

This is one of the best discussions of building an AI locally I have seen. Bravo!! BTW, the tutorial is excellent: it's clearly enunciated, the print is very big and readable for old fogies like me, and he goes slowly enough to follow, describing and showing what he is doing so noobs like me can keep up. Also, don't forget the transcription button gives details about every minute and a half. Very well done; anybody who is patient will like this. Thank you, Rob Mulla.

surfernorm

This is exactly what I was looking for to develop a model for internal use at my company. Thank you!

BryanEnsign

5:30 Yes, Rob, yes. Please. It would be a well-rounded approach if you started teaching Python in a cloud environment. Much awaited, and thanks for everything your channel has offered me to date.
I explicitly love your YT Shorts.

irfanshaikh

Dude, you are not even being biased: THIS IS THE BEST INVENTION EVER!!!
Open source??? AND it runs locally???? Even without the file-access feature, this would've been the coolest piece of software I've ever encountered!

Mark_Rober

Rob, the video is awesome! Great content as usual 🤩
Would love to watch a version utilizing a spun-up instance from a cloud provider too (for those of us without a GPU 😊).

dimitriskapetanios

These language models are improving so fast that by the time you have one installed and working, there are three better ones.

thefrub

THIS IS A GAME CHANGER!! FOSS FOR THE WIN!

electrobification