Anthony Bourdain and the Ethics of Synthetic Media



How would you feel if you found out that someone had used AI to generate your voice saying something you'd never said aloud?

Me on Other Parts of the Internet:

Sources:

Corredoira, Loreto, Ignacio Bel Mallen, and Rodrigo Cetina Presuel. 2021. The Handbook of Communication Rights, Law, and Ethics. John Wiley & Sons.

Comments

I can imagine this becoming an interesting new thing lawyers have to deal with: deepfake voice recordings that have to be proven fake, or they could have disastrous consequences.

brandonmtb

The biggest issue is that no one can know how Anthony Bourdain would have read those emails. The producers of the documentary created their own interpretation of the words by deciding how they would sound. And then, they failed to disclose that they had done so. They should have just had another person read those emails. Instead, the integrity of the entire documentary has been lost.

slugbiker

Probably don't want to abbreviate Content Warning at the beginning.

Great video! You've inspired me to dive into the ethics and implications of technology applications!

fyzxnerd

Maybe it's just me, but I can't see how this is a "weird grey zone," especially since, as you yourself pointed out, most audiences are at least a little bit uncomfortable with it.

I'm personally of the opinion that unless you get express written/verbal permission from someone before their passing, you should not be allowed to put a fake of their voice or likeness into a project. And no, getting permission from their spouse, child(ren), or estate doesn't count. Otherwise you're just puppeting around a corpse for your own entertainment, which, to me, is reprehensible.

PengyRoll

The choice not to disclose the deepfake audio, and the subsequent outrage, might have generated a lot of extra publicity for the show.

Micetticat

I'm horrified by a future where it will be impossible to know whether a video is real or fake! And by how far we'll go to 'prove' a video is real, like those Black Mirror cameras in your head!

calabiyau

This is something I've been thinking about for a while. I was especially disturbed when folks were excited about the Black History deepfakes of Tubman, Douglass, et al… I thought they were trying to acculturate us to our image and likeness being distorted in that way.

bobo

What a cool video and channel! I wish you all the success in growing it, and the benefits that might come from it! All the best :)

leptir

While it is definitely an ethical question, I think our perception of content needs to change: what is displayed may not be real. Misuse will always happen, and at some point you won't be able to identify deepfakes anymore, if we aren't already there.

It may even be necessary to start signing our media so that content authenticity can be verified. What that would look like remains to be seen (a rough sketch of one possibility follows this comment).

Thanks for the video. - The channel "2 minute papers" is also talking about that a lot lately.

tcmsurfer
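
A minimal sketch of the media-signing idea raised in the comment above, assuming Python, the `cryptography` package, and an Ed25519 keypair; the function names, sample bytes, and key handling are illustrative choices rather than any existing standard (real content-authenticity efforts such as C2PA attach signed provenance metadata instead of a bare detached signature like this):

```python
# Illustrative sketch only: signing media bytes so their authenticity can
# later be verified. Assumes the Python "cryptography" package; the helper
# names and sample data are hypothetical choices for this example.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)


def sign_media(media: bytes, private_key: Ed25519PrivateKey) -> bytes:
    """Hash the media and sign the digest with the creator's private key."""
    digest = hashlib.sha256(media).digest()
    return private_key.sign(digest)


def verify_media(media: bytes, signature: bytes,
                 public_key: Ed25519PublicKey) -> bool:
    """Recompute the digest and check it against the published signature."""
    digest = hashlib.sha256(media).digest()
    try:
        public_key.verify(signature, digest)
        return True
    except InvalidSignature:
        return False


# Usage: the creator signs once; anyone holding the public key can verify.
clip = b"raw bytes of an audio or video file"  # stand-in for real media
creator_key = Ed25519PrivateKey.generate()
signature = sign_media(clip, creator_key)

print(verify_media(clip, signature, creator_key.public_key()))                 # True
print(verify_media(clip + b"tampered", signature, creator_key.public_key()))   # False
```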

Interesting how people are uncomfortable with a deepfake voice clone reading what the very subject had written; personally I feel zero discomfort towards this, positive emotions if anything. I can understand discomfort with blatant misrepresentations (I think everyone would be extremely upset if someone spread a deepfake of them saying derogatory and inflammatory things, whatever that would be for the subject at hand), but imo this instance is incredibly reasonable. I can somewhat understand discomfort towards "misleading content" that is advertised as "real" (a word which imo will simultaneously mean less and less and more and more in the coming years), but this particular case seems odd to object to when the thing at hand is a literal quote written by the subject themselves. Idk, even extreme examples (whole books read by vocal deepfakes of their authors) still have me excited without much personal moral discomfort, so I'm probably an outlier here.

jdave

Maybe there's a strange cut in the first minute!? Anyway, an interesting topic indeed 🎬🎬🎬🎬🎬🤖🤯🗽🌈☮️💟

pingnick

I'm very scared of the future of AI ethics.

terme-nator_

I don't believe that any wrong was done, but if one takes into consideration the psychology of the public right now, there is an underlying feeling of distrust, so any information that comes to light after the fact gets treated as a secret and frowned upon automatically. I have been looking forward to the tech being used more often, especially in multi-language audio content, where actors' and actresses' voices sound true to them in different languages. That said, bad actors will use any tool for malicious purposes. Great to see u b 🖤

TrxShad

Forget the ethics, it's just straight-up ghoulish.

moonverine

Sis, the information that you deliver is super interesting, but the way u deliver it is boring; it feels like I'm listening to my teacher. Can u pls make it a bit more interesting? This is not a hate comment, just a suggestion.

WhyShubham.

Imagining people's voices being deepfaked in the future to read what they wrote, including maybe thirsty or cringy and embarrassing messages, is just really unsettling.

DIOsNotDead

Where’s the difficulty? Dead people can’t consent.

CannaCJ

Click-bait and cancel culture! There's no gray zone here. If the person wrote it, which is the case here, the filmmaker can definitely use it if he chooses to do so.

dxem