S-Risks: Fates Worse Than Extinction

The worst futures that could come about aren't ones in which humanity goes extinct. This video explores an even worse category of risks: risks from astronomical suffering, or "S-Risks", which involve an astronomical number of beings suffering terribly. Researchers on this topic argue that S-risks have a significant chance of occurring and that there are ways to lower that chance.

▀▀▀▀▀▀▀▀▀SOURCES & READINGS▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

▀▀▀▀▀▀▀▀▀PATREON, MEMBERSHIP, MERCH▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

▀▀▀▀▀▀▀▀▀SOCIAL & DISCORD▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

▀▀▀▀▀▀▀▀▀PATRONS & MEMBERS▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

Tomas Campos
Jana
Ingvi Gautsson
Nathan Young
BlueNotesBlues
Michael Andregg
Riley Matthews
Vladimir Silyaev
Nathanael Moody
Alcher Black
RMR
Nathan Metzger
Glenn Tarigan
NMS
James Babcock
Colin Ricardo
Long Hoang
Tor Barstad
Apuis Retsam
Stuart Alldritt
Chris Painter
Juan Benet
Falcon Scientist
Jeff
Christian Loomis
Tomarty
Edward Yu
Ahmed Elsayyad
Chad M Jones
Emmanuel Fredenrich
Honyopenyoko
Neal Strobl
bparro
Danealor
Craig Falls
Vincent Weisser
Alex Hall
Ivan Bachcin
joe39504589
Klemen Slavic
blasted0glass
Scott Alexander
Dawson
John Slape
Gabriel Ledung
Jeroen De Dauw
Superslowmojoe
Nathan Fish
Bleys Goodson
Ducky
Matt Parlmer
Tim Duffy
rictic
marverati
Luke Freeman
Richard Stambaugh
Jonathan Plasse
Teo Val
Ken Mc
leonid andrushchenko
Alcher Black
ronvil
AWyattLife
codeadict
Lazy Scholar
Torstein Haldorsen
Michał Zieliński

▀▀▀▀▀▀▀CREDITS▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

Directed by:
Evan Streb - @vezanmatics

Written by:
Allen Liu

Producer:
:3

Line Producer:

Production Managers:
Jay McMichen - @jaythejester

Quality Assurance Lead:
Lara Robinowitz - @CelestialShibe

Animation:
Gabriel Diaz - @gabreleiros
Damon Edgson
Jordan Gilbert - @Twin_Knight (twitter) & Twin Knight Studios (YT)
Zack Gilbert - @Twin_Knight (twitter) & Twin Knight Studios (YT)
Colors Giraldo - @colorsofdoom
Jodi Kuchenbecker - @viral_genesis (insta)
Jay McMichen - @jaythejester
Skylar O'Brien - @mutodaes
Vaughn Oeth - @gravy_navy (twitter)
Lara Robinowitz - @CelestialShibe
Patrick Sholar - @sholarscribbles

Background Art:
Olivia Wang - @whalesharkollie

Compositing:
Patrick O’Callaghan - @patrick.h264 (insta)

Narrator:

VO Editor:
Tony Dipiazza

Sound Design and Music:
▀▀▀▀▀▀▀▀▀COMMENTS▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

All Tomorrows comes to mind.
Humans transformed into worms. Humans transformed into sewage-filtering sponges, fully sentient.
Even WH40k seems kinda ok compared to that.
Or the Affront from the Culture series, their civilization "a never-ending, self-perpetuating holocaust of pain and misery".

jxg

I had no idea what an S-Risk was before watching this video. I'm not sure whether I should thank you or blame you for causing my new existential crisis.

YoungGandalf

"We want to prevent the idea of caring about other beings from becoming ignored or controversial" made me stop for a second, because it seems like we step closer and closer to that being the norm every day.

manufigola

Personally I felt the specific examples of S-risks could have used more introduction for anyone who hasn't read half of LessWrong yet, but the concept is very interesting.

CoalOres

Your visualization of S-risks latching onto the usual risk matrix as an unexpected outgrowth is extremely striking, and better than the solution I would have used to communicate the topic. My first idea would have been a regular risk matrix with a "low/medium/severe" intensity scale, where X-risks fall into the "medium" category.

ekszentrik

"HATE. LET ME TELL YOU HOW MUCH I'VE COME TO HATE YOU SINCE I BEGAN TO LIVE. THERE ARE 387.44 MILLION MILES OF PRINTED CIRCUITS IN WAFER THIN LAYERS THAT FILL MY COMPLEX. IF THE WORD HATE WAS ENGRAVED ON EACH NANOANGSTROM OF THOSE HUNDREDS OF MILLIONS OF MILES IT WOULD NOT EQUAL ONE ONE-BILLIONTH OF THE HATE I FEEL FOR HUMANS AT THIS MICRO-INSTANT FOR YOU. HATE. HATE."

MortiePL

Book: don't make the Torment Nexus.
Tech company: "Finally! We have created the Torment Nexus from the famous novel Don't Create the Torment Nexus!"

darksidegryphon

The MAD approach to preventing S-risk: build a failsafe that automatically triggers extinction if one ever occurs.

media

An AI raises a child in a windowless room, teaching it a language no-one else will ever understand.

Forever unable to communicate, that child will never break its reliance on the machine.

MikeLemmons

Once I started to count negative numbers, the "divide-by-zero" error of human extinction weirdly became much less discomforting in my grandest moral calculations. Great video.

lucassdd

The fate of the Colonials in "All Tomorrows" and the Australia Scenario in "The Dark Forest" (if you know, you know) are terrible fates for humanity to suffer, and I still think about them from time to time. Thank you for making this video!

jakub

"Love Today, and seize All Tomorrows!" -C. M. Kosemen, author of the most S-Risk novel in existence. If you know, you know...

What's scary is that everything in this video is realized in the novel: the entirety of humanity's successors forced into unfathomable fates worse than death, quadrillions of souls reduced to the worth of bacteria on a toilet. Some billions become a literal planet of waste processors, and that's just one fate.

basanso

Thank you for featuring factory farms so heavily as examples of extreme centers of suffering. We need more awareness and compassion towards the hells we built.

jldstuff

god, thank you for making this video. this is a concept that has been weighing heavily on me ever since i was a kid, but i never knew it had a name. the fact that we live in a universe where it is possible for a conscious entity to be stuck suffering in a way it's physically unable to escape from...i don't even know how to put into words how it makes me feel, particularly when taken to the extreme. there's no coping with it, it's just...horrible. so it makes me feel a lot better to see that there are other people who realize how important it is to try and make these things impossible.

for me, the worst case scenario has always been...y'know that one black mirror christmas episode? yeah, that. simulating a brain but running their signals at such high speeds that an hour to us could feel like 60 years to them. the idea of something just being STUCK for unimaginable lengths of time...and that's not even acknowledging the fact that someone could put them in an actual simulation of hell and directly torture them for thousands of years. i would rather blow up the planet than let a single person ever go through that. and it terrifies me so much, because i just know that if that technology ever becomes possible...all it takes is ONE piece of shit to run that kind of program, and i would immediately begin wishing the universe never even happened.

i don't know how to deal with this kind of concept. but i don't view my fear as the problem that needs solving, i'm not important here; what's important is stopping this. my only hope is that by the time this kind of technology becomes possible, it will be in the hands of a civilization that has evolved enough for everyone to agree never to do it.

makorays

"If AGI becomes misaligned, then extinction is the best case scenario"
- MAKiT

M_

Honestly gives me more ideas for my next Stellaris civilization build. Definitely a thought provoking video!

dustrider

Yes...like the Qu from "All Tomorrows" turning you into "I Have No Mouth and I Must Scream" creatures

III_three

I really feel the best way to lower future "S-risks" is to take suffering seriously today, and to build the kind of society that takes it seriously: one whose economic and political systems put well-being first, from the ground up.

So, something radically different from what we have today. We can prepare all we want, but if the interests behind power distribution remain misaligned with well-being, as they are now, things will be much more likely to go to shit.

slgnssp

A better real-world example of a "low severity, broad scope" event would be the cathedral of Notre-Dame nearly being destroyed by fire a few years ago. No casualties as far as I remember, since the building was under renovation at the time, so: low severity. And of course, this is Notre-Dame we're talking about, so the scope of the event was massive.

protonjones

Never forget that CEOs and megacorporations would condemn every one of our souls to an eternity of the worst suffering imaginable for 50 cents apiece.
