Self-Driving Car Crash Raises Tricky Legal Question of Blame

By Ethan Baron, March 7, 2018


General Motors is in a race to be the first company to mass-produce self-driving cars, but a recent crash involving a San Francisco motorcyclist has illustrated the challenge of assigning blame when an autonomous vehicle gets into an accident.

As self-driving cars take to the roads in increasing numbers, collisions with standard vehicles are inevitable, experts say, as are lawsuits.

San Francisco commercial photographer Oscar Nilsson sued GM after a Dec. 7 collision with a Chevrolet Bolt that aborted a lane change while driving autonomously.

The crash highlights an important issue raised by autonomous technology: Self-driving vehicles may not behave like those driven by humans, and that may complicate investigations into who’s at fault.

“That’s going to continue to be a huge area where we’re going to have problems,” said John Simpson, spokesman for nonprofit Consumer Watchdog, a frequent critic of speedy deployment of autonomous vehicles.

GM’s subsidiary, Cruise, has since August been testing a self-driving car service in San Francisco with human backup drivers behind the wheel, as required by the state.

Nilsson’s lawsuit claims he was riding on his motorcycle behind one of GM’s autonomous Bolts on Oak Street when the car, with its backup driver, changed lanes to the left. When he rode forward, the Bolt suddenly veered back into his lane and knocked him to the ground, according to the lawsuit, filed in U.S. District Court in San Francisco.

The San Francisco Police Department’s report on the incident blamed Nilsson for passing a vehicle on the right when it wasn’t safe, but Nilsson’s lawyer, Sergei Lemberg, disputed that finding.

“I don’t know what a police officer can tell, after the fact,” Lemberg said. “I don’t know that it’s fair to blame this act on the completely innocent person who’s just driving down the road and gets hit.”

The police report, said Lemberg, supported holding GM responsible. It noted that after the Bolt determined it couldn’t complete the lane change and began moving back while Nilsson was passing on the right, the Bolt’s backup driver tried to grab the wheel and steer away, but the collision occurred at the same moment.

“Why don’t these folks just take some responsibility?” Lemberg said.

A crash report filed with the California Department of Motor Vehicles by GM provided a much different view of the accident. The company acknowledged that the car, in autonomous-driving mode in heavy traffic, had aborted a lane change. But GM said that as its car was “re-centering itself” in the lane, Nilsson, who had been riding between two lanes in a legal-in-California practice known as lane-splitting, “moved into the center lane, glanced the side of the Cruise … wobbled, and fell over.”

In an email statement, GM noted that the police report concluded Nilsson was responsible for the accident.

“Safety is our primary focus when it comes to developing and testing our self-driving technology,” GM said.

The company has been running a “Cruise Anywhere” program for employees since August, allowing them to hail automated Cruise vehicles and be driven anywhere in San Francisco. It was unclear whether the vehicle involved in the accident was part of this program.

It was also unclear if the Bolt in question was one of the “third-generation” automated vehicles described last fall by Cruise CEO Kyle Vogt as “the world’s first mass-producible car designed to operate without a driver.” Those vehicles were intended to be used in the “Cruise Anywhere” program, Vogt wrote in a Medium post.

Companies operating autonomous vehicles are likely to settle quickly in crash-related lawsuits when the technology appears to be at fault, and fight mightily when they believe the driver of the ordinary vehicle to be responsible, said Stanford researcher and University of South Carolina School of Law professor Bryant Walker Smith.

“There might be data that might tend to show fault or no fault,” Smith said.

Such data — which may include video and other driving information from the autonomous vehicle — could help investigators. Simpson of Consumer Watchdog said it should be publicly disclosed whenever a self-driving car crashes.

GM’s testing in San Francisco highlights the firm’s progress in the race to mass-produce autonomous cars. While Google took an early lead in autonomous driving with a program now spun off into its own company, called Waymo, GM’s manufacturing capabilities and other advantages have allowed it to catch up, according to a report by market-research firm Navigant Research in January.

Autonomous-vehicle firms competing for success in a new market are bound to face legal thickets when accidents happen.

In the San Francisco accident, GM’s crash report said the Bolt was traveling at 12 mph, while Nilsson had been riding at 17 mph. After the collision, Nilsson “got up and walked his vehicle to the side of the road” and “reported shoulder pain and was taken to receive medical care.”

Nilsson claimed in his lawsuit that he suffered neck and shoulder injuries, which will require “lengthy treatment,” and that he had to go on disability leave. He’s seeking unspecified damages.
