The Fatal Flaw of Semi-Autonomous Vehicles
Imagine that you own a semi-autonomous vehicle. You purchased it months ago, and each time you have put it into autopilot mode, it has driven safely. Its ability to drive safely and (perhaps) better than you has made you over-reliant on and overconfident in the technology, a phenomenon known as automation bias. Thus, today you decide that on your morning commute you will read a book while your vehicle makes the same drive on the highway. Today, however, while traveling at 65 miles per hour, your semi-autonomous vehicle encounters a firetruck parked on the highway. The vehicle fails to "see" the firetruck and smashes into the back of it. You may have to imagine this situation, but an unlucky owner of a Tesla Model S did not: his Tesla smashed into a firetruck last week in California.
In May 2016, Joshua Brown was killed when his Tesla Model S crashed into a tractor-trailer while in autopilot mode; the system could not "see" the tractor-trailer. Both of these accidents likely share a defining feature: the "driver" of the Tesla was not paying attention to the road while the vehicle was in "autopilot" mode.
After the recent accident, a spokesman for Tesla stated that autopilot is "intended for use only with a fully attentive driver." The autopilot system is considered "Level 2" to "Level 3" on the SAE's scale of driving automation.
The Audi A8’s “Traffic Jam Pilot” is the first Level 3 technology on the market. Level 2 and Level 3 vehicles require the driver to monitor the road or otherwise intervene when needed. Thus, these vehicles rely on us. A necessary question is whether human monitoring or (safe) intervention will, practically speaking, actually happen.
Because we are unreliable and bad stewards of semi-autonomous vehicles, these vehicles appear dangerously and fatally flawed. The fatal flaw of semi-autonomous technology is that it relies on us.
Our vehicles today cannot be safely operated without our full attention, and our traffic laws require us to pay attention at all times while driving. Yet we continuously risk our safety and the safety of others, and break those laws, by distracting ourselves or by driving in non-optimal conditions (such as drowsiness, intoxication, or inclement weather).
Intuitively, we are probably going to pay even less attention when our vehicles can safely operate without us. When we first purchase semi-autonomous vehicles, we may be cautious when they operate in semi-autonomous mode. But after weeks of seeing our vehicles operate safely in "autopilot" mode, we may develop automation bias and relax our guard, turning to activities more engaging than watching our semi-autonomous vehicles drive us to work. Query whether watching your Tesla safely navigate the roads is more enjoyable than watching an episode of Game of Thrones, texting your friends, or playing HQ Trivia. It is not hard to see that semi-autonomous vehicles' reliance on us is a bad idea.
Many empirical studies support this intuition. One study found that vehicles using adaptive cruise control with lane centering led drivers to engage in riskier activities, such as reading, eating, texting or emailing, and reaching for items in the rear compartment. The same study found that these drivers spent 33% more time looking away from the forward roadway than drivers in vehicles equipped with adaptive cruise control alone.
In its monthly report for October 2015, Google’s Self-Driving Car Project explained why Google ceased producing semi-autonomous vehicles. In fall 2012, Google employees tested its self-driving technology on the freeway during their commutes. The Google team instructed its volunteers to pay attention to the road at all times because of the early stage of the technology. The volunteers also signed forms promising to pay attention at all times and understood that they would be recorded by camera. After the test, the volunteers gave positive feedback about the technology. But when the Google team reviewed the footage, it found that the volunteers had failed to pay attention as they had promised; one volunteer even searched the backseat while his vehicle traveled 65 miles per hour on the freeway. “We saw human nature at work: people trust technology very quickly once they see it works. As a result, it’s difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax.” Our behavior makes it difficult for manufacturers to rely on us to pay attention or to intervene when needed, referred to in the industry as the Handoff Problem. As Google noted, the autonomous vehicle industry knows the Handoff Problem “is a big challenge.”
The two accidents described in the introduction are examples of accidents that occurred because the "driver" was not paying attention to the road. The National Transportation Safety Board, which issued an approximately 500-page report on Joshua Brown's accident, found that Joshua ignored seven warnings and did not have his hands on the steering wheel for 90% of the drive leading to his tragic death. The report also determined that Joshua had at least seven seconds to react prior to impact. It is likely that the driver of the Tesla Model S that crashed into the firetruck was also not paying attention to the road; otherwise, he may have been able to "slam the brakes" or otherwise avoid the firetruck.
If neither of the drivers paid attention to the road, it is important to ask whether Joshua Brown or the driver in the firetruck accident could have retaken control (the so-called handoff) of the driving function from the Tesla Model S in time to prevent the accident. Many proponents of autonomous technology suggest that if the vehicle starts to malfunction or otherwise becomes unable to drive, the vehicle could warn its operator to "retake" control of the driving functions and safely navigate the vehicle. But retaking control requires the operator to remain "situationally aware" of the road, of other drivers' behavior, and of whether the semi-autonomous vehicle is appropriately interacting with its surroundings. If the operator is not paying attention to the road, then he or she is not situationally aware and is thus incapable of safely retaking control from the autonomous technology, absent a warning and a sufficient period of time to prevent the accident.
Studies have shown that a blindfolded driver, a reasonable stand-in for a distracted driver, requires approximately five to six seconds to become situationally aware, and that the person's driving abilities remain worse than those of a comparable non-blindfolded driver for up to one minute. Another study showed that drivers make poor and dangerous decisions within the initial seconds of retaking control of the driving functions of autonomous vehicles. A study produced by the National Highway Traffic Safety Administration found that, in certain circumstances, some drivers averaged 17 seconds after a request to intervene before they retook control of their automated vehicles. At highway speeds, the vehicle would have traveled more than a quarter of a mile before the driver intervened.
Therefore, a system that relies on the operator of the semi-autonomous technology to intervene to prevent the accident relies on an unrealistic and flawed assumption.
As recent days have reminded us, an accident involving a vehicle in "autonomous" mode will not be a page 6 story in the local newspaper. Instead, each accident will be worldwide news and will shape public opinion about autonomous technology. Care should be taken to ensure that semi-autonomous technology built on this fatal flaw (us) does not cause the public to develop a negative opinion about the capabilities of autonomous technology.
The next blog post will discuss whether the flaws of semi-autonomous technology could lead to potential civil liability for semi-autonomous vehicle manufacturers when their products are “misused” by us not paying attention to the road.