TikTok’s AI Voice Trend Exposes A Nightmare

Build your dream setup with FlexiSpot and use my code "FSYTB50" to get an extra $50 off orders over $500 🌟 You can get a combo discount on their ergonomic chair with a desk purchase!
FlexiSpot E7L standing desk:
FlexiSpot C7 Ergonomic chair:

Here are the chapters for this video. Feel free to jump around:

0:00 Introduction
1:14 Sponsor
2:51 The Rise and Fall of Prank Calls
5:24 Nicki Minaj AI Prank Call
7:00 Jojo Siwa AI Prank Call
9:40 The Surge in AI Voice Apps
10:12 TikTokers Pranking with AI Voices
12:01 The Dark Side of AI Prank Trends
12:50 Other Applications
13:26 How It Gets Even Worse
15:12 Conclusion

Note:
I deliberately leave the channel icons and video titles on each video for reference, in case anyone wants to see the full videos featured in these documentaries. If anything is unclear, or if you need additional references, please reach out and I will happily provide a link to any content discussed, featured, or used as b-roll in the video.

Copyright Note:
Copyright Disclaimer: Under Section 107 of the Copyright Act 1976, allowance is made for "fair use" for purposes such as criticism, comment, news reporting, teaching, scholarship, education, and research.

Music:
All songs featured in the video are from:
- YouTube's Audio Library
- Ian Taylor's RuneScape music library, owned by Jagex Ltd. Please check out their music on Spotify if you are into that style. I cannot emphasize enough how good the music is.

=-=-=-=

Creator's note:
Hello, my name is Zackary Smigel. I've been creating YouTube videos since I was a kid, and I started doing YouTube full-time about five years ago. You might have seen me on Real Estate License Wizard, where I gained 60,000 subscribers and over 5 million channel views. I have since left that channel for creative pursuits, and this is my first crack at starting something new. I'd appreciate a like and a subscribe if this content interests you or if I captured your attention for even just a few moments.

Thanks so much for watching,

Zackary Smigel

#youtube #videoessay #zackarysmigel

=-=-=-=

Don’t forget to like and subscribe for more engaging video commentary!

=-=-=-=
Comments

Build your dream setup with FlexiSpot and use my code "FSYTB50" to get an extra $50 off orders over $500 🌟 You can get a combo discount on their ergonomic chair with a desk purchase!
FlexiSpot E7L standing desk:
FlexiSpot C7 Ergonomic chair:

Thanks so much for watching!

ZackarySmigel

do people not have humanity anymore?? seriously, that poor girl getting harassed by "bakugou" sounded close to crying, my heart broke for her. what the hell are we doing here

pantheo

That stuff isn't funny. These people's jobs suck already. Y'all should use this for good, like messing with scammers and stuff like that.

FacePlant

Isn't it AWESOME that they get paid three times my monthly wage for posting a video harassing me for 5 minutes while I try to do my shitty job? :D Isn't it AWESOME?!!! :D

JulesWees

Service workers get enough shit from crappy customers, and now they have to deal with some loser who doesn't even have the confidence to use their own voice for a prank call.

jasons

Why don't they call scam centres instead? Waste scammers' time.

norso

“Yes, Jojo Siwa, I can go ahead and do that for you” the adult man told the computer demon voice

TheAK

Why did JoJo sound like a 60-year-old chainsmoking dude?

collinvickers

AI voices always read long series of numbers like phone numbers and addresses as a single number, which is not something any human being would ever do. If you live at 1420 Main St, a person will say "fourteen twenty" or "one four two oh," but an AI will say "one thousand four hundred and twenty," because it doesn't understand context or how humans actually communicate.

strippinheat

My 97-year-old grandmother got an AI scam call using my mom's voice. She picked up and decided to mess with the scammer after seeing some of those scam-bait videos. They said I was in the hospital and needed money. What sick human beings... Fortunately, my grandmother had been scammed before, so she knew to hang up and call us directly, but how is anybody supposed to explain the complexities of AI to their grandmother? She can barely understand using the phone. It's absolutely sickening.

SolaireIntensifies

the poor lady being shredded by bakugo AI in the beginning sounded like she was about to cry. whoever was recording should have stepped in (or better yet, don't do this at all??)

justsomeguy

Harassment content should be banned. People should not be able to profit from those videos.

pc

I can't believe it. This is worse than the NPC TikTok trend. No workers were harassed while pretending to be NPCs... for the most part.
Great video as always

one_night_gaming

When you get a spam robocall, don't engage with the call; the less you say, the safer you are.

tablechair

My best friend of 15+ years sent me an audio file of “him” reading out a paragraph of text. It was an AI replica. I could pinpoint a few nuances that didn’t feel quite right, but for most of it I was floored by how much it sounded just like him. It was spooky. He sent it to me to illustrate how creepy this is.

Chucanelli

Another chapter in "How bad will things get before regulations make their biggest comeback yet"

dylnpickl

"we have a lot of famous people on here"
*scrolls past femboy roommate in recent chats*

kiteofdark

Literally couldn't tell you what Nicki Minaj or JoJo Siwa sound like, so I can't tell if these are accurate.

evan

Monetized bullying. I never thought such a dystopian phenomenon would exist but here we are. To be clear, if you watch AI prank videos on any platform then you should stop. It should not be necessary to explain why using the suffering of other real people as entertainment is a bad thing (and why it makes you a bad person if you participate in it).

Kgm

Honestly, AI voice cloning should be banned for mass consumer use, and celebrities should sue these AI companies to their grave. If you wanna clone your own voice for whatever use, go for it, but it should be illegal to input someone else's.

pagedmaj