Crazy New AI 🤯 AI to Understand Your Documents | PrivateGPT One-Click Installer

ChatGPT is cool and all, but what about giving your OWN LOCAL, OFFLINE LLM access to your files so you can ask questions and understand them better? Well, you can with PrivateGPT. This guide includes a one-click installer for it, so you can get started as quickly as possible.
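
If you're curious what the one-click installer is actually automating, the manual workflow in the upstream privateGPT repo boils down to roughly the steps below. This is only a sketch: it assumes the repo's standard ingest.py and privateGPT.py scripts and its source_documents folder, the installer's folder names may differ, and my-notes.pdf is just a placeholder file name.

    # Run from the privateGPT folder in PowerShell
    Copy-Item .\my-notes.pdf .\source_documents\   # 1. put your files where the ingester looks
    python ingest.py                               # 2. build the local vector store from those files
    python privateGPT.py                           # 3. ask questions about them, fully offline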

Timestamps:
0:00 - Explanation
0:20 - What is PrivateGPT?
1:02 - Open PowerShell
1:30 - One-click install for PrivateGPT
2:40 - Choosing a model
3:58 - Breakdown of the folder structure
4:40 - Creating desktop shortcuts
5:24 - Ingesting documents (to use them with the AI)
6:38 - Asking privateGPT about my documents
7:25 - Vicuna 13b with privateGPT (WOW!)

#PrivateGPT #AI #ChatGPT
-----------------------------
🖥️ My Current Hardware (Links here are affiliate links. If you click one, I'll receive a small commission at no extra cost to you):
🎙️ My Current Mic/Recording Gear:

Everything in this video is my personal opinion and experience and should not be considered professional advice. Always do your own research and ensure what you're doing is safe.
Comments

Subscribed because you were cool enough to make a 1-click installer. Please continue doing so for future projects!

julienarpin

Note: while this is CPU-bound, it also has a processor core limit of 63, so if you have a beefy server or Threadripper you might run into issues. You will have to go into your BIOS and shut some cores off. Even with ~60 cores, this is currently SUPER SLOW, and it gets slower the larger your document database gets. I ingested a directory of ~500 MB of PDFs, and it took nearly 30 minutes per query.

OtherTNSEE
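
A possible software-side alternative to disabling cores in the BIOS is to pin the process to a subset of logical processors with an affinity mask. This is only a sketch, not something verified against the core-limit issue above, and on machines with more than 64 logical processors the mask only applies within one Windows processor group:

    # Launch privateGPT pinned to the first 16 logical processors (mask 0xFFFF)
    $p = Start-Process python -ArgumentList "privateGPT.py" -PassThru
    $p.ProcessorAffinity = 0xFFFF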

Amazing project, I'm sure it will help a lot of people like me.

Aldrinstori

Personally, I see a lot of value in this project, but I think the priority should be on scaling the infrastructure to speed up ingestion and inference times, making it easy to rapidly prototype and test a bunch of LLMs, and, last but not least, giving it a nice-looking UI.

aneeqrehman

Thank you for the tutorial. Is there any way to use a GUI like "Text Generation Web UI"?

behroozhussaini

Hi there :) Another question: how do I install more models, and how do I switch between them to compare the results as you did? Thank you very much; I highly appreciate the video and your responses in advance.

dabo
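
For what it's worth, in the upstream privateGPT the usual way to switch models was to drop another GGML .bin into the models folder and point the .env file at it. The sketch below assumes that layout; the one-click installer's paths and variable names may differ, and the download URL and file name are placeholders:

    # Download another GGML model into the models folder (placeholder URL and name)
    Invoke-WebRequest -Uri "https://example.com/ggml-vic13b-q5_1.bin" -OutFile ".\models\ggml-vic13b-q5_1.bin"
    # Then edit .env before the next run so it loads that file, e.g.:
    #   MODEL_TYPE=LlamaCpp
    #   MODEL_PATH=models/ggml-vic13b-q5_1.bin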

Hi, thanks a lot for your tutorial. Can Vicuna 13B generate multilingual responses? I need French answers.

Vl-zghc

Best YouTuber and voice :3 ever, in my opinion ^^ Keep it up!

blk

I was looking for something like this to use with AutoGPT. Does anyone know of a fork out there for a local GPT with an AutoGPT-like feedback loop?

matthallett

Hey, I'm receiving a "can't open file: [Errno 2] No such file or directory" error because there are no .py files. Is anyone else getting this issue?

joshuabennett

Is there a way to decrease the time it takes for a model to provide an answer? It takes 2 to 3.5 minutes for Vicuna 7B, for example, to provide a response. Is there anything that can be done about that?

dabo

Hi, thank you very much for the tutorial. Can I receive answers in other languages as well? How do I add support for another language? Thank you.

qnkjffo

1-click to install, 30 minutes per query, sounds legit ;P

ybwang

Everyone talks about PrivateGPT being able to talk to your docs, but I've already read my docs. Can you make it write statements of claim, or some written submissions, or something like that?

MrVincentTremblay

Are there already models available with a main language other than English that could be used with it?

fixelheimer

I hope you make an updated video on this if it ever gets GPU-accelerated support.

swiftypopty

What directory does this install everything in? I noticed a variable called $THCT or something like that (which I know is inherited from a different script). I don't use conda, but I want to use venv to activate a virtual environment for whatever directory this script is going to be working from, so that all dependencies aren't installed globally. How can I go about that?

MisterK-YT
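
If you'd rather skip conda, the standard venv tooling works; a minimal sketch, assuming you run it from whatever folder the installer put privateGPT in (the path below is hypothetical) and that the repo's requirements.txt is present:

    cd C:\path\to\privateGPT          # hypothetical install location
    python -m venv .venv              # create an isolated environment in .venv
    .\.venv\Scripts\Activate.ps1      # activate it in this PowerShell session
    pip install -r requirements.txt   # dependencies land in .venv, not globally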

I tried installing this with the Koala model (because I haven't tried that one before) and it had a problem: it kept asking me if it was a GGML model, then just dropped out of the installation, so it wouldn't start. I eventually reinstalled it with Vicuna 13B and it works. Can't wait for a faster version, though, as even with 34 GB of RAM it is a little slow.

amkire

How do I use Vicuna 7B on the GPU for training/ingesting my own PDF files?

skyhr

"Python was not found; run without arguments to install from the Microsoft Store, or disable this shortcut from Settings > Manage App Execution Aliases."

How can I solve this, guys?

officialgombowom
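
That message usually means the python on PATH is the Microsoft Store stub rather than a real interpreter. Either install Python from python.org and disable the python entries under the App Execution Aliases setting the error mentions, or, as a hedged sketch assuming winget is available and the Python.Python.3.11 package ID is still current:

    winget install -e --id Python.Python.3.11   # install a real CPython build
    python --version                            # should now report the installed version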