What Will Happen When Your Driverless Car Crashes?

Driverless cars are nearly here, at least if Google has its way. But what happens when we're all zipping around, hands-and-feet free, nary a care in the world, and BAM! we're in a terrible accident?

Who's responsible? And perhaps more importantly, will we make any attempts to stop it? When something goes awry and it looks like you're about to crash head-on with an 18-wheeler, what will you do?

Back in 1999, Dutch researchers studied how people might react during the threat of an accident on the automated highways of tomorrow. They were interested in the issue because back in the 1990s the driverless cars of the future were just around the corner (sound familiar?), thanks to the hundreds of millions of dollars being invested in the technology by the U.S. Congress.

The Dutch researchers found during their simulations that roughly half of the people they studied tried to take control of the vehicle. The other half seemed to trust that their robot chauffeur would prevail and lead them to safety, despite all evidence to the contrary.

From the 1999 study:

In the emergency situation, only half of the participants took over control, which supports the idea that [Automated Highway System] as any automation, is susceptible to complacency.

Of course, driving a simulator is vastly different from driving on the real open road. But will we find any difference on the highways of tomorrow? Honestly, the way most people drive on the 405, we're probably better off letting Google take the wheel even if it means a few bruised fenders.

Image: scanned from the 1981 book The Future World of Transportation by Valerie Moolman

DISCUSSION

SkilletHead

I've been curious about this for quite some time, but not about how a person would react. Rather, I've been curious about the liability issues.

Even now, we have standard production cars that have some automated driving systems built in. Some fancy cars stop themselves if they think they'll be in a wreck. I think one attempts to park itself.

It's only a matter of time until a self-driving car is in a fatal accident. (I see this as a necessary evil: self-driving cars will probably be safer than human-driven cars, but some accidents will still happen.) Eventually a system will malfunction, make a bad decision, and cause a casualty.

A car manufacturer can certainly force a driver to sign a waiver when they buy the car, but there are no provisions to protect against third-party liability. If a car that can automatically stop itself does so in error, causing a wreck, will the manufacturer be on the hook? What if a mandatory software update wasn't performed? What if the car has aftermarket software, or the ECU has been rooted for performance? What if the car has aftermarket hardware: either computer hardware ("chipping" an ECU) or different tires, wheel size, rims (unsprung weight), engine, brakes, etc.?

In a case like this, there will be a very interesting lawsuit between the injured third party and both the driver of the car and the manufacturer. If there had been a recall (hardware or software) that wasn't addressed at the time of sale, then the seller could be responsible, too.

I really think that liability, more than any technical challenge, is going to be the main hurdle to getting self-driving cars to the masses.