Will Tesla’s Autopilot accident stall autonomous progress?

News | July 5th, 2016

If you haven’t already heard the tragic news of Joshua Brown’s fatal accident, we’ll fill you in. Brown was driving his Tesla Model S on May 7th using the car’s Autopilot system, a system we’ve spoken about at length. Reports allege that while the car drove itself, Brown was watching a movie on the large infotainment screen. At some point, a tractor-trailer turned left in front of his vehicle. Brown wasn’t paying enough attention to notice the truck, and the Autopilot system couldn’t differentiate the white trailer from the bright sky, so it didn’t notice it either. The two collided and Brown lost his life. It’s the first reported fatality during the use of Tesla’s Autopilot and, naturally, the world is in a frenzy about it.

It’s genuinely tragic that Brown lost his life, but the sad truth is that an accident like this was bound to happen eventually. Tesla’s Autopilot system is still in its beta phase, which in layman’s terms means it isn’t quite finished yet. There are going to be bugs, features that don’t work and moments when the system malfunctions. That’s simply what a beta is, and the Autopilot system itself isn’t to blame for Brown’s death. According to most insurance companies, that blame lies with Tesla.

We’ve recently learned that, according to the Insurance Institute of Canada’s report “Automated Vehicles: Implications for the Insurance Industry in Canada”, the blame in an autonomous vehicle accident lies with the automaker. The report states that if a human isn’t operating the car, the human isn’t at fault for a collision. Tesla will likely counter that it tells drivers to pay attention, but that argument will probably fail.

“Current insurance coverages and practices, however, are simply not designed for a world where human drivers are replaced by vehicles that can drive themselves,” says Paul Kovacs, the author of the Canadian report.

Most other governments and insurance companies seem to agree with this report and hold the car companies, and their systems, responsible. If a car company advertises a product as safe and someone dies while using it as it’s allowed to be used, the fault is the company’s. Tesla does say that the Autopilot system is still in beta and does warn drivers to keep paying attention, but the system doesn’t actually require them to do so. There’s no warning to place your hands back on the wheel unless the system encounters something it isn’t equipped to handle, and by then it could be too late. The systems from BMW, Audi, Mercedes-Benz and Volvo all require the driver to touch the steering wheel every ten seconds or so (the interval varies by automaker) to ensure that the driver doesn’t start watching a movie or jump into the back seat.

Tesla claims that its system does require drivers to keep their hands on the wheel and their attention on the road, but only when the software encounters a situation it doesn’t understand. We’ve seen countless videos of owners driving for miles without touching the wheel or the pedals. We’ve even seen drivers jump into the back seat and let the car do its thing. Yes, Autopilot tells you to keep your hands on the wheel, but it doesn’t actually require you to do so in order to function.

Should Brown have been paying attention, regardless of how capable his car was? Of course. In reality, the Autopilot system isn’t entirely to blame in this incident, as watching a movie while driving is irresponsible at best and completely idiotic at worst. However, he shouldn’t have been offered the opportunity to be so careless.

I’ve said it before and I’ll say it again: publicly beta testing an autonomous driving function on open roads among other cars is as careless as giving a child matches and leaving them unattended. Human beings, children and adults alike, push the boundaries of what we’re allowed to do and what we’re capable of. It’s simply human nature, and it must be accounted for. So Tesla might say “Hey, be careful with those matches,” but that isn’t going to stop someone from accidentally burning the house down.

It seems as if most insurance companies and government agencies feel the same way. Insurance companies see this as Tesla’s fault: the company advertised a product as safe, and a fault in that product took a man’s life. Because of this, you can be sure that automakers are going to be far more reluctant to release any sort of autonomous driving aids or systems. Most automakers were hoping that government regulation would provide blanket cover in situations like this, but that hasn’t happened yet. Until it does, automakers will want to avoid any liability for autonomous accidents, and it’s possible we’re going to see a massive slowdown in autonomous technology.

BMW’s Harald Krüger said a while back that the Bavarian brand would wait until its self-driving technology was absolutely safe before releasing it.

“In the app industry, you can launch products on the market that are 70 to 80 percent ready and then complete their development with the customer,” said Krüger. “That is absolutely impossible with safety features in a car.” Krüger’s reluctance to launch anything prematurely will likely be shared by the rest of the automotive world now that we know who’s responsible in the event of a crash.

If this does happen and the industry slows its approach to self-driving cars, it’s sad that it took the loss of someone’s life to instill some caution. BMW has been cautious and concerned about driver safety from the beginning; let’s hope that the rest of the industry, Tesla included, follows suit.

[Source: Motor Mouth]
