We're building a dystopia just to make people click on ads | Zeynep Tufekci

We're building an artificial intelligence-powered dystopia, one click at a time, says technosociologist Zeynep Tufekci. In an eye-opening talk, she details how the same algorithms companies like Facebook, Google and Amazon use to get you to click on ads are also used to organize your access to political and social information. And the machines aren't even the real threat. What we need to understand is how the powerful might use AI to control us -- and what we can do in response.

The TED Talks channel features the best talks and performances from the TED Conference, where the world's leading thinkers and doers give the talk of their lives in 18 minutes (or less). Look for talks on Technology, Entertainment and Design -- plus science, business, global issues, the arts and more.

Comments

The irony is, I finally gave in to watching this talk after YouTube suggested it to me for weeks.

kaielvin

Whelp, on to the next recommended video!

kylec

A great talk. Unfortunately, most people simply aren't capable of seeing the issue. It's a large part of the reason it works so well.

tech

The main issue she addresses is less about ads and tracking than the fact that AI systems are correlating this information in ways that even the human programmers/engineers cannot fully grasp. It's truly a type of intelligence that is foreign to our thinking.

homewall

People need to understand that the problem here isn't just some Silicon Valley political bias, it's algorithms that are optimized to generate revenue through interaction and viewership regardless of context. Unfortunately that optimization is greatly rewarded by political polarization, which is inherently dependent on engagement.

paskowitz

I'm a disabled veteran. I'm reasonably intelligent, from what I can tell. I'm observant almost all the times I want to be, and often find myself noticing things I didn't mean to, so I'm guessing my cognition is subject to distraction, but ultimately functional enough to determine if something is happening in front of me. Being disabled, I have a lot of time on my hands, and an inability to do anything with that time but consume. I hate it, but it's my life now. So I watch YouTube. A lot. And as paranoid as it sounds, I've been watching YouTube watch me right back. Here's what I've noticed...

First off, I loathe advertising. I don't watch cable television at all, unless the content finds its way to YouTube, and at that point the ads have been stripped away. If the ads are still present I'll immediately stop the video and find another. I pay for YouTube Red and Google Music as a package, so I can keep advertisements out of both my music and video watching experience. Not completely, of course. That's impossible. But I've done what I can to eliminate ads from my life. I see the methods they use to manipulate, I know I'm vulnerable to them, so I stay away as much as possible. This means my observed (by YouTube) experience with YouTube is based almost entirely on viewing habits of specific subject matter, not what products are likely to sell best on the videos I watch and a calculation of which combination will earn a click. I'm sure YouTube would love that info, but it's never had a chance to gather it on me.

I like science shows. YouTube knows this really well, so it recommends all kinds of scientific content to me. It's a cornerstone of my viewing habits. These are almost always down at the bottom of the recommendation lists (the part you have to click to see) no matter what else I've been watching. A science show covering anything to do with genetics, the Mars rovers, the elimination of Smallpox, and so many more can pull me out of whatever string of shows I'm currently watching. Remember this, because it comes into play later.

I'm a liberal. No, I'm not gonna get political, but I need you to understand the reference point. I like shows that speak to my viewpoint as much as anyone, though not as much as some. I'm always open to hear a logical side to any discussion, so my political watching habits are left of center. Well, I try to keep them there. But YouTube knows that on some subjects I hold very deep views, so when I see something about that topic, I want to hear how "atrocious" the other side is being while getting my info. I have a viewing bias, and YouTube knows how to exploit it. On those issues, once I click on the first video, the entire first page of recommendations is about that subject, but in a more extreme format, from more radical sources. And I click, because I'm outraged, and I want to commiserate with like-minded people. This is where I first noticed YouTube watching me.

I would get to a point where I would be sick of the subject. I didn't want to hear about it anymore. I just wanted to watch something pleasant. At those times I would open the bottom of the recommendations, pick one of the ever present science videos, and away I would go. I'd done this cycle repeatedly whenever I'd get on a political viewing kick. Watch politics, get outraged, turn to other content in disgust. After a while I noticed science shows would start to show up at the bottom of the first page of recommendations, usually about the point I was tired of the subject. YouTube had learned my temperament when it came to specific content, and was able to (mostly) accurately predict when I would need a break. But the predictions always seemed to take a few videos after my anger had subsided to kick in, even if I kept watching the shows. I thought it was a delay, but I also noticed that the videos by that point were even beyond my own political views, taking extreme stances just as radical as anything found on the right. I realized it was a last ditch effort to get me even more riled up, so I would continue watching that subject, that content.

It wasn't malicious. YouTube isn't a sentient entity, so IT can't have malice. What it's doing is what it's programmed to do. Make accurate predictions to keep people watching content, and it does that very well. But wait! There's more...

In addition to physical disabilities I have mental health issues. I won't go into detail, but depression is a big aspect, and when mixed with other things it can make life challenging. My YouTube consumption reflects these personality aspects as well, and YouTube has learned my patterns. But with the pattern the YouTube algorithm has shown me so far, this is actually problematic. When depressed I watch depressing shows, dark content, things that reflect how I feel about the world. But that's what YouTube recommends more of, and in more extreme forms. Unlike with political content, though, with depressing content it takes me days, sometimes weeks, to even think about watching anything that doesn't reflect my misery. So for days or weeks at a time, YouTube won't recommend any science shows, even below the click. It's ALL dark, and I rarely have enough self control in my darkest moments to turn away and watch anything else. I would always have to make an effort to watch different content, then watch about a day of videos, before YouTube would stop at least occasionally tossing a depressing one up at the top.

At first the trend of nothing but depressing content would happen for a few days, and then a science video would pop up. The next time was longer. Then longer. And my life reflected this. My periods of depression would stretch longer, be made deeper, encouraged to be extended. I would get recommendations for sad videos on the very first day I started a downward slide, before I had watched anything at all that day. They would be waiting for me as soon as I logged in to YouTube. Then, because YouTube seemed to have found content that I didn't tire of for up to weeks at a time, it would start recommending them a day or two before I would get depressed.

...BEFORE I WOULD GET DEPRESSED!

Sorry, that needed some extra attention. YouTube, by that point, was more than predicting my viewing habits. It could tell from my data when I was likely to get depressed days ahead of time. But a reverse effect began. I would be drawn into the dark, depressing videos almost as much as the science ones even before I was in the depths of depression, but once I began watching, my mood would default to depression to match the content. At that point, was YouTube making predictions, or was it manipulating my emotional state? I don't know. But it scared me enough to make me notice.

Now I'm aware of these things. I make a concerted effort not to let my emotional state be dictated by anyone but me or the people I love. But just because you know something has a certain amount of control over you doesn't mean you're in a position to do something about it. Awareness helps, though, and I'm in a situation unique enough that I can see the effects plainly. Most other people don't have the time to notice things like this, so they're being openly manipulated, completely unaware of how much, or of where the manipulation comes from.

So... what have YOU noticed?

wintergray

For those who like tl;dr: 17:15
It's a very good explanation that, in the grand scheme of things, even using ad blockers will do nothing to fix what these systems are capable of.
Ever wondered why on TED you don't see the negative comments everyone is talking about? In the near future, it could be YouTube sorting out which comments you see to keep you on the site, or to incentivise you to write more and more [like I do now] so it can keep gathering data on you, sending you videos where you're likely to share your thoughts and opinions rather than what you might actually want to see.

RNorthex

She’s right on all points. A few weeks after the 2016 elections, I deleted my Facebook profile. They don’t make that easy for you either. Searching within Facebook for “delete account” will only show you the link to “deactivate” which is not the same action. Ironically, searching Google is the only thing that brought me to Facebook’s “Delete Account” page. During the process, I was faced many times with pop ups asking me if I am sure that I want to delete my account and suggesting that I consider deactivating instead so that I may come back later without having lost all of my likes, photos, posts, and friends. That made me even more determined to choose deletion. I recently have done the same with Twitter as well. Hmmm.... what’s next?

a.garcia

This is the most important talk I've ever heard.

RedIria

I am certain that I've been led to this video by the YouTube algorithm because it knows me too well. Yet I find myself having a love/hate relationship with this algorithm. It helps me save time, it helps me find useful content, it helped me find this beautiful presentation! So... is it my friend? Is it an artificial helping hand leading me to my personal growth? Or is it a dark puppet master, shaping who I am without me knowing?! I guess I'll never know. Of course, this is just on the personal level, but even in this limited case study (the relationship between me and YouTube) it's impossible to know if it's harmless, let alone on a vast scope such as society as a whole.

elaheh

Bravo!!! Very good description of very disturbing problems. Kinda vague (understandably) about what to do and how to proceed. Like a minnow fighting upstream rapids.

LJK

"It's like you're never hardcore enough for Youtube" I love it

kata

The ads won't work on me. I'm too poor to buy things hahahaha

holdmybeer

This woman is brilliant and she has hit the nail right on the head.

nickwilliams

Too bad nobody will actually listen. They'll watch the video, say that's horrible, then go on Facebook and it's business as usual. The problem is that the people watching this video don't or won't understand the stakes involved.

Hume

"Do you ever go on YouTube on want to watch one video and an hour later you've watched 27?"

That's exactly where I am right now. It's three in the morning right now on a work day...

dddd

Are there any coding geniuses out there that could create a program that generates personal data noise? Randomly messing with the algorithms sending them disinformation?? Something you could install on your phone or computer???
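Tools in this spirit actually exist: TrackMeNot and AdNauseam are browser extensions built on exactly this idea of flooding trackers with decoy activity. As a rough illustration of the approach, here is a minimal Python sketch that builds randomized decoy search URLs from a small topic pool. The topic list, function name, and search endpoint are illustrative assumptions; a real tool would also randomize timing and pull topics from live sources so the noise looks organic.

```python
import random
import urllib.parse

# A tiny pool of unrelated topics. A real tool would draw from
# RSS feeds or a large dictionary so the decoys look organic.
TOPICS = [
    "sourdough starter", "medieval falconry", "tide tables",
    "kayak repair", "orchid care", "vintage synthesizers",
    "trail running shoes", "soldering basics",
]

def noise_queries(n, seed=None):
    """Return n randomized search URLs intended as decoy traffic."""
    rng = random.Random(seed)
    urls = []
    for _ in range(n):
        # Combine two random topics into one nonsense query.
        q = " ".join(rng.sample(TOPICS, 2))
        urls.append("https://www.google.com/search?" +
                    urllib.parse.urlencode({"q": q}))
    return urls

# Print a few decoy URLs; a scheduler could open these periodically.
for url in noise_queries(3, seed=42):
    print(url)
```

Whether this meaningfully defeats modern profiling is debatable (decoy traffic has statistical signatures of its own), but it shows how little code the basic idea requires.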

trpill

People wonder why the political environment has been so polarized recently. It's just a result of our online business models. Business is now the enemy of mankind, for it may bring upon our demise.

SirMikeys

It's rather funny that I browsed the boots she's wearing a couple of weeks ago. Now I wonder if YouTube suggested this TED talk because of the theme or because of the boots :D

adiasartes

Do what I do. Every now and then, like, share and comment on things you DISAGREE WITH so Facebook keeps sending it to you, so you can keep your mind open, and informed about both sides of an issue and what your opponents are thinking. Not flawless but it works.

canaryimpulse