Investigation reveals Tesla in self-driving mode in crash that killed motorcyclist

Investigators disclosed on Wednesday, July 31, that the driver of a Tesla involved in a deadly crash in the Seattle area had been using the company's "Full Self-Driving" (FSD) mode at the time of the April collision that killed a 28-year-old motorcyclist. Authorities said they made the discovery after downloading and reviewing data from the event-data recorder on the Tesla Model S.

Following the crash, the driver initially told a state trooper that he had been using Tesla's Autopilot system and had looked down at his phone before hitting the motorcyclist. Police said the driver admitted to placing total trust in the Autopilot system.

Authorities said that the investigation is ongoing and that prosecutors have yet to decide if any charges will be filed.

If the investigation's findings hold, this would be the second reported fatal crash involving a Tesla in Full Self-Driving mode.

Tesla has two partially automated systems: Full Self-Driving, which can take on many driving tasks, and Autopilot, which keeps the vehicle within lanes and away from objects in front of it.

For its part, Tesla has told drivers that its Full Self-Driving and Autopilot modes still require them to be aware and ready to take control of the vehicle at any time.

On July 23, Tesla CEO Elon Musk expressed optimism that the Full Self-Driving system would be ready to operate without human supervision by the end of the year.

Musk also said that the company will unveil a robotaxi on Oct. 10, adding that he didn't think approval from government regulators would be a problem in deploying the taxis.

Comments

It's the driver's fault. It's always been the driver's fault.

nathansuss

Is it the car's fault or the driver's fault?

dmr

It's the driver's fault for using his phone, and it's the company's fault for making a product that will predictably cause most humans to fail. Humans aren't designed to play backup. Right now it's relatively doable, because the technology still makes enough mistakes to keep most people attentive. But imagine we have a car that doesn't make a mistake for a full month. A full year. How many of us won't get caught by surprise if we aren't needed for that length of time? And how many people who don't drive for a year will be in practice enough to take over in an emergency, such as when the vehicle gets caught on ice approaching an intersection? Unlike a technology backup, our skills degrade over time when we don't use them. And the better FSD gets, the more we can expect that the only times humans will be needed are when a situation suddenly becomes critical and difficult to recover from. Situational awareness and practice are key when handling an emergency, and it's these that go away when they aren't required regularly.

communityband

So he was looking at his phone while operating a vehicle 🤔... driving while distracted? You're behind the wheel, you're responsible.

dstylez