Anthony Aguirre & Anna Yelizarova | On Existential Hope, AI, and Worldbuilding

Foresight Existential Hope Group


FLI is currently welcoming entries from teams across the globe to compete for a prize purse of up to $100,000 by designing visions of a plausible, aspirational future that includes strong artificial intelligence. The last day to enter submissions is April 15th, 2022.

In this interview we talk about the concept of worldbuilding, what we need in order to build better worlds, and why it is important to encourage the imagining of more positive visions of the future.

One of the questions we always ask in this podcast is for the scientist to provide an example of a potential eucatastrophe. The term "eucatastrophe" was originally coined by J.R.R. Tolkien to describe "the sudden happy turn in a story which pierces you with a joy that brings tears".

In a paper published by the Future of Humanity Institute, Owen Cotton-Barratt and Toby Ord borrow Tolkien's term to suggest that "an existential eucatastrophe is an event which causes there to be much more expected value after the event than before" — in other words, the opposite of a catastrophe.

Telling stories can help us make what seems abstract become real and clear in our minds. We have therefore created a bounty based on this prompt. Use this event as a story device to paint a picture of a day in a life where this happens. How would this make us feel? How would we react? What hope can people draw from this?

Join us:

Foresight Institute advances technologies for the long-term future of life, focusing on molecular machine nanotechnology, biotechnology, and computer science.

Subscribe for videos concerning our programs on Molecular Machines, Biotechnology & Health Extension, Intelligent Cooperation, Neurotech, Space, and Existential Hope.
Comments

As I understand current trends, it looks like 2045 will feel like toon town. Meaning, we will be sharing our world with cartoons processed through mixed reality glasses/goggles. Walt Disney will finally get his dream; we are going into toon town, folks. That means cartoon girlfriends and boyfriends, protégés and mentors, guides and therapists... all cartoons. Buckle up.

auditoryproductions

David Horacek: God, I hate when people take works of fiction seriously. Works of fiction are just for entertainment. Period.
And being able to work less absolutely WOULD be one of the BIGGEST examples of overcoming our limitations!
Right now, we have to slave away just to survive or to make the TINIEST scientific or mathematical advances.
IF we could overcome that, and we SHOULD, then WE WOULD be able to work less. So you JUST contradicted your own point.

theultimatereductionist

These "optimistic" visions don't seem at all optimistic to me. They sound like social retirement planning: aww, work less, be more local, have fewer problems with resources, be more human-centered, keep our technologies on a shorter leash... All of you sound like you want to move all of us into Hobbiton - a place where we're all charmingly passive and useless and more than a little pathetic. (And even Hobbiton was vigilantly shielded from outside intrusions by the druids and by Gandalf so it could go on being almost entirely free of adventures, which is how the hobbits like it.)

To me an optimistic vision is one where the light of human consciousness takes over galaxies in an expansion wave that travels as close to c as possible, transforming stupid dead matter into life. To me it's sheer pessimism to pretend that the way that nature made us is the best we're gonna get. Optimism is believing that we can confront our limitations and fight to mercilessly destroy every single one of them in turn. It's not to "work less" you wimps! Optimism is scaling up good things by terrifying orders of magnitude, and then doubling down and scaling up still more. Every twinkling star in the sky is a waste of zettawatts of unexploited energy. Optimism is the hope that we will capture that energy and finally harness it for good. And when the expanding wave of human consciousness finally intersects the expanding wave of alien life, we will not meet them as some backwater inferiors, but as a mighty and terrible empire that is able to set the terms of that interaction and see to it that our values don't perish.

Yes, I know that none of this seems relevant to 2045, but it is. Because if your dream of 2045 is Hobbiton, that lack of ambition and passive acceptance of our limitations could be the opiate that keeps the human future grounded forever. That would be the ultimate catastrophe: a future that accepts the permanent waste of human potential.

[If you read this far, know that this rant was delivered in the voice of a character that I'm developing in a rather ambitious novel. You decide whether or not this character is a villain. But I know he thinks of people like A. Aguirre and A. Yelizarova as villains.]

davidhoracek