How news feed algorithms supercharge confirmation bias | Eli Pariser | Big Think

----------------------------------------------------------------------------------
ELI PARISER:

Eli Pariser has dedicated his career to figuring out how technology can elevate important topics in the world. He is the co-founder of Upworthy and bestselling author of The Filter Bubble: What the Internet Is Hiding from You.
----------------------------------------------------------------------------------
TRANSCRIPT:

ELI PARISER: A filter bubble is your own personal universe of information that's been generated by algorithms that are trying to guess what you're interested in. And increasingly online we live in these bubbles. They follow us around. They form part of the fabric of most websites that we visit and I think we're starting to see how they're creating some challenges for democracy.

We've always chosen media that conforms to our views and read newspapers or magazines that in some way reflect what we're interested in and who we want to be. But the age of algorithmically mediated media is really different in a couple of ways. One way is that it's not something we know we're choosing. We don't know on what basis, or who, an algorithm thinks we are, and therefore we don't know how it's deciding what to show us or not show us. And it's often that not-showing-us part that's the most important: we don't know what piece of the picture we're missing, because by definition it's out of view. So that's increasingly, I think, part of what we're seeing online: it's getting harder and harder even to imagine how someone else might come to the views that they have, might see the world the way they do, because that information is literally not part of what we're seeing or consuming. Another feature of the filter bubble landscape is that it's automatic; it's not something that we're choosing. When you pick up a left-wing magazine or a right-wing magazine, we know what the bias is, what to expect.

A deeper problem with algorithms choosing what we see and what we don't see is that the data they have to base those decisions on is really not representative of the whole of who we are as human beings. So Facebook is basically taking a handful of signals, what we click on and what we don't click on, maybe how much time we spend with different things, and trying to extract from that some general truth about what we're interested in or what we care about. And that clicking self, who in fractions of a second is trying to decide "am I interested in this article or am I not," just isn't a very full representation of the whole of our human self. You can do this experiment where you look back at your web history for the last month, and obviously there are going to be some things there that really gave you a lot of value, that represent your true self or your innermost self. But there's a lot of other stuff: you know, I click on cell phone reviews even though I'll always have an iPhone. I'm never not going to have an iPhone. It's just some kind of compulsion that I have. And I don't particularly need or want algorithms amping up my desire to read useless technology reviews.
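The dynamic Pariser describes here — a ranker that sees only clicks and therefore amplifies the compulsive "clicking self" — can be sketched as a toy simulation. Everything below is illustrative (the topics, the click rates, the greedy-with-exploration ranking rule); it is not Facebook's or YouTube's actual system, just a minimal model of a click-optimized feed:

```python
import random

random.seed(42)

# Illustrative click probabilities: how often this user clicks each topic.
# "gadget reviews" is the compulsive click Pariser describes: a high click
# rate but low real value to the user.
click_prob = {"politics": 0.30, "science": 0.25, "gadget reviews": 0.60}

# The ranker only sees clicks. It keeps a running click-through estimate
# per topic and mostly shows whichever topic has the highest estimate.
shown = {t: 1 for t in click_prob}    # impressions (start at 1 to avoid /0)
clicked = {t: 1 for t in click_prob}  # clicks (optimistic starting prior)

def pick_topic():
    # Greedy choice by estimated click-through rate, with 10% exploration.
    if random.random() < 0.1:
        return random.choice(list(click_prob))
    return max(click_prob, key=lambda t: clicked[t] / shown[t])

for _ in range(5000):
    topic = pick_topic()
    shown[topic] += 1
    if random.random() < click_prob[topic]:
        clicked[topic] += 1

total = sum(shown.values())
for topic in click_prob:
    print(f"{topic:15s} feed share: {shown[topic] / total:.0%}")
```

Running this, the feed converges to showing mostly gadget reviews: the ranker has no way to distinguish "clicks because it matters to me" from "clicks out of compulsion," so it amplifies whichever behavior produces the most clicks.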

The people who create these algorithms like to say that they're neutral: "We don't want to take an editorial point of view." And I think there's something to that that's important, you know. We don't want Mark Zuckerberg to impose his political views on all of us, and I don't think he is. But it's also kind of a weird dodge, because every time that you create a list, and that's essentially all that Facebook...

COMMENTS:

The problem is once people make up their minds about something, it's very difficult for them to see reality and accept that they were wrong when they find new information that is correct but contradicts their belief. Because of cognitive dissonance and confirmation bias, they continue to seek more information to reinforce their beliefs and reject the truth.

footiemate

The moment you ignore one side of the argument you are surrendering your own ability to critically think for yourself.

aaronsmith

Private browsing with a VPN helps prevent this. Check your security/privacy settings for everything from your browser to your phone to individual apps, etc. The default settings are usually the ones that give you the least security/privacy and create more of a confirmation bias.

zigarettenbruch

I’ll probably like this video and agree with the point of view, but won’t that just reinforce my confirmation bias?

brettjohnson

I watched some Jordan Peterson videos and then Google Chrome was offering me right-wing subjects on its search page. Then YouTube assumed this meant I wanted to see a rant by neo-Nazis about immigration. Thanks for thinking of me with an algorithm, but these assumptions are totally inaccurate. I don't want to live in an echo chamber of personal prejudice and bias that keeps getting offered to me; the big algorithm in the sky is trying to pigeonhole me, but I'm not fitting into a profile it recognises.

andynixon

Yep, that's why the opposing sides in politics rarely negotiate :(

rodigoduterte

I was wondering how some people are oblivious to this; it's like subtle digital-age propaganda.

Pguz

Eli is a real techie and an educator of the public... much respect, Eli

dertythegrower

If what you're thinking is true, you can challenge it... it will withstand the arguments against it. If it doesn't withstand them, you've come a little closer to the truth.

patrickmaur

this is great. thanks so much for all the big think content!

muskduh

*You can't be brave if you've only had wonderful things happen to you. :)*

ChessMasteryOfficial

Algorithm creators seem to believe that the use of a single word provides a window into a person's soul. The movie "Shadrach" illustrates my point.

larrybutler

My YouTube algorithm told me I should watch this. And guess what? I didn't learn anything!

mendonesiac

Corbett Report was talking about this about a year ago. And the reason I subscribe to Big Think is exactly because they are outside my bubble :)

fourorthree

So glad someone made this video; even if I had made it, I wouldn't have the reach of Big Think.

arande

I think the algorithm is neutral in the sense that what it shows you depends on your own actions/choices. If you click on and view a lot of climate change videos then it will show you more climate change videos; if you click on gaming videos it will show you more gaming videos. So the choice is still in the hands of the user. Personally, I actively click on videos from all kinds of perspectives, so my list shows videos from differing perspectives. (Like, just as an example, a John Oliver video, next to a Stephen Colbert video, next to a Jordan Peterson video, next to a CNN video, next to a Jonathan Haidt video, next to a Vlogbrothers video, next to a Piers Morgan video.) If you are more open to differing points of view then it will still give you differing points of view. What it does is perhaps only amplify the effects of your choices.

spacekettle

Isn't this a problem with the human mind? As said in the video, people have always done this. I would say the bigger problem than left/right is up/down; that division is never really pushed or explained in the media.

MrTweedyDocumentaries

Why did they select a person to speak authoritatively on feed algorithms who doesn't understand what drives them?

There is ultimately one and only one criterion these algorithms look at: *attention*
They feed you things you are likely to view in order to get you to view them. They select against things you are not likely to view because you are not likely to view them. What makes them so insidious is what makes them so successful.

Gulgathydra

My head's a bubble, and this internal dialog is my bias

SocertesGudas

I reject your algorithms, and seek to educate myself on all sides of any debate.

kirawelty