Last week’s death of a pedestrian in Tempe, Arizona, exposes an outrageous flaw common to the autonomous vehicle manufacturers’ plans to rush their products onto the streets. They intend to use robotic cars to replace humans behind the wheel because humans make too many mistakes. Yet they acknowledge that their robots cannot now safely operate on the streets without a back-up driver, a human also known as an autonomous vehicle tester. In their corporate minds, this tester rides for six to eight hours, ever-ready to correct the robot.
In Tempe, the robot failed. It did not pick up the image of the woman walking her bicycle across the street in the dark. It did not activate the brakes, decrease speed, or make an evasive maneuver, all actions an attentive human driver likely would have taken. The tester also failed. The in-car camera captured the tester looking down, then looking up, horrified, the instant before the vehicle ran over the woman.
Are companies like Uber and GM ignoring the risks of having a bored or distracted tester behind the controls of an autonomous vehicle?
They can ship the robot to the computer lab for a corrective tweak. But what about fixing their back-up tester, the inherently untrustworthy human the developers pay $20–$23 an hour to correct the robot’s errors before one kills? Can they fix him and all the other testers who suffer from “vigilance decrement,” the scientific term for letting your mind wander? From time to time, we all experience it, but it is an occupational condition for those hired to wait and watch, like security guards and lifeguards.
In a 2015 study, “Vigilance Decrement and Passive Fatigue Caused by Monotony in Automated Driving,” published on ScienceDirect, scientists found that because of the low task demand, participants diverted their attention away from the task toward their own thoughts, resulting in increased mind wandering. Autonomous vehicle developers should have known about vigilance decrement, even if they have never experienced it themselves. A March 2018 article in Wired, “The Unavoidable Folly of Making Humans Train Self-Driving Cars,” traces the phenomenon’s discovery back to a World War II study of RAF cadets ordered to watch a radar screen for blips made by German submarines. Their attention wandered in less than 30 minutes.
You would think that the developers would recognize vigilance decrement as an impermissible impairment for their testers. You would think that they would flag it as a concern in their job postings, screen applicants for it, address it in their training, or compensate by installing a bells-and-whistles device in their high-tech cars to keep testers on task. But they do not.
What Are the Developers’ Desired Qualifications for a Self-Driving Vehicle Tester?
Here are representative job postings. Uber needs testers for its trucks. It seeks experienced truckers with computer savvy and a willingness to document. Its list of desired qualities includes this ominous phrase: “True love of driving and technology desired; success is often measured in terms of hours logged in the vehicle.” It makes no mention of safety or of the driver’s ability to concentrate while logging those long hours.
The staffing agency Adecco posted this job description for an unnamed client: responsible for operating and evaluating a self-driving vehicle in autonomous mode for six to eight hours per day, including collecting data and providing feedback. They want a good driver with computer proficiency who is willing to take a commercial driving class. Again, no mention of the paramount need for sustained concentration.
How Do Developers Train an Autonomous Vehicle Tester?
Either developers are ignorant of the danger of mind wandering or they are ignoring this deadly issue. A 2015 IEEE investigatory article revealed that training varied between manufacturers: Google provided five weeks, Audi about two hours. No training program mentioned any testing or preparation for episodes of mind wandering. There is no evidence of a new awareness by developers. Google’s current hiring brochure skips the subject, as does the current Uber training program detailed in the Wired article. In 2017, the tech website macrumors.com released the Apple driver training plan, which contained seven maneuvers the driver must perform to take control when the robot fails. Of course, each maneuver requires the driver to be paying attention at that moment. The Apple plan does not prepare for the wandering mind, just as the Uber plan did not prepare for the Tempe tragedy.
Do Any Laws Regulate Driverless Car Tester Qualifications or Training?
Not really. A proposed federal law is stalled in the Senate. The states that allow test vehicles require the testers to be licensed, and California requires developers to attach their test program to their application.
Unfettered by laws, the developers can test on public roads under the foolish assumption that their testers’ minds will never wander. Until the developers accept and solve this problem, more deadly accidents will occur, increasing public resistance to self-driving cars on their roads.