The ethical dilemma of self-driving cars - Patrick Lin

Self-driving cars are already cruising the streets today. And while these cars will ultimately be safer and cleaner than their manual counterparts, they can't avoid accidents altogether. How should a car be programmed if it encounters an unavoidable accident? Patrick Lin navigates the murky ethics of self-driving cars.

Lesson by Patrick Lin, animation by Yukai Du.

Comments

Allow self-driving vehicles to communicate with one another. That way, in such a scenario, multiple vehicles could work in unison to allow a safe pathway out for the car in danger. This would, of course, require very rapid communication between the vehicles. It would significantly lower the chances of this happening, but not prevent it in every instance.

Jacob-Vivimord

This video brought up a lot of points I hadn't considered about self-driving vehicles. Very interesting topic; I'm curious to see where it will lead.

eyespelegode

At first I would have crashed into the motorcyclist with the helmet because of the lower risk, but after realizing that I'd be hitting that rider just because they were following the rules and being cautious, it really hit me.

rileyj.s.

The system should deploy its hidden rocket launchers to blow the boxes up.

RBsRealm

Now here's another ethical dilemma: When we get to the point that most cars are self-driving and are 10 times less likely to have an accident, should humans even be allowed to drive on public roads at all?

henrilinnainmaa

I just want to say I think this is a beautifully realized piece of communication. The pace of information flow is accessible without being patronizing, the text is very well structured and delivered, and the graphics and music support it perfectly. Good job, people!

Rufusdos

That is, if you only see 3 options. I see more.

If the other cars on the road are also self-driving, they can all instantly start to make way for you to minimize damage, and there is a good chance everyone can walk away.

Otherwise, you can still swing towards the car, but not smash into it. Just get mostly out of the way of the falling thing.



And blame goes to whoever tied down that load.

kalebbruwer

The future where cars dish out street justice haha

icisne

I see that most comments missed the main point of this video. The given example is designed to tackle the ethical dilemma by assuming the choice between different victims is inevitable, and you guys are just trying to run away from it by reversing that assumption and saying it can somehow be prevented.

BTW, the assumption is fair and almost certain to occur, especially in the early stages of self-driving vehicles.

islamshatta

Answer: make trucks with loosely bound cargo illegal

kebabremoveth

I've gone over this very topic several times in the last few years and have come to the same conclusion each time. The only ethical way of handling accidents with self-driving cars is for each of them to prioritize the lives of people *outside* of the car. That means they will always choose to put passengers in danger before putting outsiders in danger. This way, it's the passenger's conscious decision to trust their life to a car the moment they enter it.

Lutranereis

This is very interesting to think about. It also rests on an assumption: in this scenario, if all cars were self-driving, could they sense everything around them or not? If so, the other cars might be able to 'see' this event happening and account for it, moving out of the way to make a gap for the oncoming car that would otherwise hit the object. However, that seems to assume a perfect overseeing system, probably too difficult to create at this stage. Awesome to think about, though.

jameshansen

In a world of self driving cars, why are people still riding motorcycles?

JM-usfr

If the car is intelligent enough to identify different scenarios at the point of the incident, wouldn't it be intelligent enough not to drive too close behind a flatbed truck with an insecure load?

sntkore

The real question is why that self driving car is tailgating the semi.

Agreedtodisagree

Unfortunately, in the end the decision will probably be financial: who do I hit to minimize the expected liability?

ThePeterDislikeShow

It's a bad system that would allow itself to be boxed in in the first place. If there are no contingencies, create contingencies. This is especially true once all cars are automated and are all just part of a larger system. All should be able to compensate for any other under any circumstance.

Abraxis

Self-driving cars would automatically keep a safe distance from other cars so they could brake in time.

tlowry

You can put a constraint on the self-driving car that it is not allowed to follow so close to a vehicle that it cannot stop in time should something fall off the back of it. That would avoid the posited decision entirely.

CrispyDruid

I predict that future cars will continually be made structurally stronger and safer for occupants during a crash, instead of having programs that tell them to swerve.

boy