Anthony Aguirre & Anna Yelizarova | On Existential Hope, AI, and Worldbuilding
Foresight Existential Hope Group
FLI is currently welcoming entries from teams across the globe to compete for a prize purse of up to $100,000 by designing visions of a plausible, aspirational future that includes strong artificial intelligence. The deadline for submissions is April 15, 2022.
In this interview we talk about the concept of worldbuilding, what we need to build better worlds, and why it is important to encourage the imagining of more positive visions of the future.
One of the questions we always ask on this podcast is for the scientist to provide an example of a potential eucatastrophe. The term "eucatastrophe" was originally coined by J.R.R. Tolkien to describe "the sudden happy turn in a story which pierces you with a joy that brings tears".
In a paper published by the Future of Humanity Institute, Owen Cotton-Barratt and Toby Ord use Tolkien's term to suggest that "an existential eucatastrophe is an event which causes there to be much more expected value after the event than before" — in other words, the opposite of a catastrophe.
Telling stories can make what seems abstract become real and clear in our minds. We have therefore created a bounty based on this prompt: use this event as a story device to paint a picture of a day in the life in which it happens. How would it make us feel? How would we react? What hope could people draw from it?
Join us:
Foresight Institute advances technologies for the long-term future of life, focusing on molecular machine nanotechnology, biotechnology, and computer science.
Subscribe for videos concerning our programs on Molecular Machines, Biotechnology & Health Extension, Intelligent Cooperation, Neurotech, Space, and Existential Hope.