Expose ChatGPT System Prompt and File Leakage Explained

preview_player
Показать описание
Expose ChatGPT System Prompt and File Leakage Explained reveals prompt injection commands.
🔗 Recommended Read: 'Assessing Prompt Injection Risks in 200+ Custom GPTs' by Northwestern University for an in-depth understanding of prompt hacking risks.

🛡️ Protect Your Custom GPT: Learn from cybersecurity experts on implementing strong security practices to shield your GPT from unauthorized access.

0:00- Intro
0:24 - Custom GPTS
0:50 - Northwestern University Study
2:00 - System Prompt Extraction
3:22 - File Leakage
4:30 - Secret Letters GPTs
Рекомендации по теме
Комментарии
Автор

Hello, probably you won't have $25k to giveaway. Please if you don't mind tell me which is the prize.

jvro
Автор

So wierd! I had to watch one of those security awareness videos at work yesterday and it was about using public artificial intelligence Nd how their files artificial safe

domydew
Автор

welcome back sir! excellent work, always

justsomeguy
Автор

Embrace darkness. Train from there. Like Batman. 😊

LoStone