Are We Holding Autonomous Vehicles To Impossibly High Standards?

Tesla -- and (so we're told) other automakers -- have the technology needed to bring either partial or full autonomous vehicle operation to the marketplace.

In Tesla's case, it says it will be 'feature complete' for full autonomous driving by the end of this year, with features ready to be rolled out to customers' cars by the end of next.  

Yet the last hurdle -- regulatory approval -- could put a spanner in the works... not because the technology might be flawed, but because society expects autonomous vehicles to operate one hundred percent perfectly at all times.

Does that mean that we're expecting too much? 

Watch the video above to find out, like, comment and subscribe, and support us using the links below.

Comments

The auto insurance industry is a 150-billion-dollar industry. They have actuaries. When they decide autonomous mode is cheaper and start charging a premium for human driving, that is when it will change en masse.

davidgoodwin

I drive for a living and am always trying to be careful and, overall, a good driver.
On a daily basis I see people using phones while driving, "sleeping" at traffic lights so fewer cars get through, merging too aggressively, braking suddenly out of nowhere when there is no reason to, not entering the intersection when turning left, etc.

Let's face it: most people are bad drivers. We need autonomy.

goran

No. We shouldn't expect perfection.
But if they can save a significant number of lives, they should be utilized.

skydivekrazy

I'd like to frame this discussion in a different way. When a human being makes a poor decision (texting, driving drunk, applying makeup, etc) and causes an accident, that decision is self-contained to that human being, along with all of the potential ramifications.

However, if self-driving car software developers introduce a mistake and it's published, that mistake is not self-contained to one vehicle. Instead, via OTA update it could reach hundreds of thousands of vehicles in the US, if not millions around the world.

In the US in 2017 there were 3.2 trillion miles driven nationally, and roughly 6 million recorded car accidents. That's roughly one accident per 500,000 cumulative miles driven, or an accident somewhere in the United States every 4-5 seconds. Keep in mind that most accidents are not fatal; we're talking about fender benders and non-fatal collisions with other cars, pedestrians, cyclists, etc.
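As a sanity check, the per-mile and per-second figures above follow directly from the two quoted totals; a minimal sketch using only the comment's own numbers:

```python
# Rough check of the accident-rate arithmetic quoted above.
# Inputs are the comment's own figures for the US in 2017.
miles_driven = 3.2e12   # total vehicle miles traveled
accidents = 6e6         # recorded accidents

miles_per_accident = miles_driven / accidents
seconds_per_accident = (365 * 24 * 3600) / accidents

print(f"~{miles_per_accident:,.0f} miles per accident")      # ~533,333
print(f"one accident every ~{seconds_per_accident:.1f} s")   # ~5.3 s
```

That works out to one accident per ~533,000 miles and one every ~5.3 seconds, matching the rounded figures in the comment.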

In order for self-driving cars to take off, they have to be significantly better, probably an order of magnitude, than human driving. Why? Because human beings will want the sense of "control".

"I wouldn't have gotten into that accident if I were driving. It was the software's fault!".

If the accident rate is even relatively close, many will buy into that argument. It would take a chasm between self-driving and human-caused accident rates to change the perception and give the public a reason to give up that control en masse.

So do self-driving cars need to be perfect? No. But they do need to be, imo, at least an order of magnitude better on the road than their human counterparts.

And on top of that, especially in light of recent news like the software defects introduced into the Boeing 737 MAX flight-control system, the concern with OTA updates to the self-driving software powering potentially millions of cars will need to be addressed as well.

MGPCycling

Humans are afraid of full autopilot in cars 😱 but not afraid of a distracted, careless, reckless driver. 🤢💀

nelsonpagan

The standard has to be high as we all consider ourselves way above average drivers even if we’re not.

MartFish

Legislators tough to crack?!? HA! Best joke I've heard in a long while. You can get a couple politicians for the price of a nice used car.

hellcat

I can see the UAE allowing autonomous vehicles on their roads, which could be the test ground for the world.

brbarlow

It's not that we have too high expectations for autonomous vehicles so much as we have unrealistic ideas about the abilities of human drivers. We do have to expect a learning curve for autonomous vehicles, and we have to expect and accept a few failures initially, or there will be no chance of reaping the maximum benefit. I think you have a good take on it.

That (illogical, unreasonable) fear that is preventing legislation to allow it is the problem. "We have nothing to fear but fear itself", indeed.

awofman

Turn the framework on its head. Instead of asking "How much safer than a human driver does autonomous driving have to be to be acceptable?", ask "How many casualties are we willing to accept?" Then look at current numbers with human drivers vs. proven numbers with autonomous driving.
Then it becomes: how much longer will we accept human drivers?

CSHarvey

I was just discussing this stuff recently with someone. He was convinced that FSD wouldn't be possible (at least, any time soon) because of the Trolley problem, and because the software wouldn't be able to avoid accidents 100% of the time. I pointed out that people also aren't perfect drivers and don't necessarily have an immediate answer to the Trolley problem, and yet we're still allowed to drive.

TwileD

100% perfection is impossible to achieve over an unbounded time period. That does not mean we don't try, and it does not mean we can't set levels of acceptable risk. We would never venture out of our beds if we didn't accept the risk that we might get injured or die today. For that matter, why risk going to bed? One may die there as well.

Injuries and death are a part of life; trying to make a machine that will never fail is a fool's errand. Knowing our limitations and working to minimize failure is what allows us to grow and improve our abilities.

If autonomous vehicles prevent one death, is it worth it? How about if they prevent 100, or 1,000 deaths? Nearly 1.25 million people die in road crashes each year, on average 3,287 deaths a day. If using autonomous vehicles halves that, is it worth it?

Each person who drives a car, or is a passenger in a car accepts the risks of using a car. When you get on a bus, or taxi, you accept that risk. How is that different from using an autonomous vehicle? One difference may be who do you blame when a machine fails?

100% is impossible; the question is how many nines are acceptable: 99.9%? 99.99%? It's like NASA (or SpaceX) figuring out how many digits of pi are needed to reach their destination.
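To make "how many nines" concrete, here's a minimal sketch; it assumes we count one event per mile driven and borrows the ~1-accident-per-533,000-miles US figure quoted in an earlier comment:

```python
import math

# Human baseline: ~1 accident per 533,000 miles (figure from an
# earlier comment), expressed as "nines" of per-mile reliability.
human_failure_rate = 1 / 533_000
human_nines = -math.log10(human_failure_rate)
print(f"human drivers: ~{human_nines:.1f} nines per mile")   # ~5.7

# What each level of "nines" would mean in accidents per mile.
for nines in (3, 4, 5, 6, 7):
    rate = 10.0 ** -nines
    print(f"{nines} nines -> 1 accident per {1 / rate:,.0f} miles")
```

On this framing, being "an order of magnitude better" than human drivers, as another comment puts it, would mean roughly one extra nine over that ~5.7-nine baseline.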

robertgamble

The question I can't answer is this: suppose in 100 tricky traffic situations, the human and the autonomous car both avoid harm in 85; in four, both have an accident; in ten, the human crashes but the autonomous car doesn't; and in one, the autonomous car crashes but the human doesn't. Who is liable in that last case? It's awesome that the robo-car is ten times safer than a human, but who is responsible for its actions?

keeperMLT

For me the claim of "full self driving" is the issue. I can see substantial self-driving in the not-too-distant future including significant commercial opportunities but I'd argue it's not full self driving until it can handle all road driving - including full snow cover of road markings etc. Call it autonomous taxi service, fine. Don't call it full self driving.


Also worth remembering that "safer than humans" is a vague concept. Accident rates in the US are very high compared to the UK, even allowing for the greater miles driven. For an international audience you have to be better than any national average, not just one of the worse averages in the developed world.

asharak

Elon Musk said they are programming the cars to be OK with some risk of a crash. How is that an expectation of computers being perfect drivers? During Autonomy Day, Elon said it is impossible to drive in LA traffic without the risk of fender benders, so there will be a setting trading risk of accidents against never being able to change lanes: you choose how aggressive your car is. He also said that if there is a crash, Tesla would probably be the one that is liable.

Tesla being liable is why Tesla car insurance makes so much sense. If your car is driving itself and gets in an accident, is your insurance company or the auto manufacturer liable? It won't matter if they are both the same company.

I could see Tesla charging you more for your insurance, though, if you set your car to the maximum aggressive driving setting.

ecospider

Many self-driving vehicles drive better than many humans. What I have never seen one do (we have many prototypes on the local roads here) is the crazy, unpredictable stuff humans do with cars.
The absurd thing about demanding a human as a required standby driver is the attention and switchover (taking control) problem. When I am driving, I am always up to date with the situation. When the car hands off because it can't handle something, the human is delayed: 1) actually getting hands and feet onto the controls, and 2) taking in the entire situation and formulating a reaction. That is not realistic in an emergency. On the other hand, a planned changeover (the human takes over when departing a known route) would be very doable: a voice alert, and the driver taps a button (similar to Cancel on cruise control) when ready. Failing a takeover, the vehicle simply parks.

steveurbach

Airplanes are not "fail-safe" but "fail-functional" - and that's what autonomous vehicles need to be.

gerhardkonighofer

Great video. That's a question we should answer. I'll just say that, in the case of Tesla, they should keep the safest distance possible at the least-fatal speed, so if something happens the people inside the car would be less likely to be killed. If the owner wants to set the speed higher, then he or she should sign an agreement that any liability falls on them and that they will pay attention at all times.

gremy

I don't think perfection is required, but for autonomous vehicles to be accepted they will have to be as good as or better than the best drivers, not average drivers.

allansmith

It may depend on the drivers' experience on sub-optimal terrain. There shall come a time when insurance companies will discourage drivers from taking direct control of their vehicles.

pauljmeyer