GM in autonomous prang, motorcyclist pushing for precedent

Motorcyclist Oscar Nilsson is claiming that he was struck by one of GM’s autonomous Cruise test cars, and is suing the company. This case could act as an important legal benchmark for future autonomous vehicle collisions – although police think Nilsson intentionally crashed into the Cruise. GM will be treading carefully.

A backup human driver was present in the vehicle but couldn’t avoid the crash. The accident happened in San Francisco in heavy traffic. The lawsuit argues that the Cruise test vehicle ‘suddenly veered back into Nilsson’s lane striking him off his bike to the ground.’ The incident occurred at relatively low speeds, with the Cruise test vehicle driving at 12 mph and the motorcycle traveling at 17 mph – both travelling in the same direction.

Nilsson said that he was trailing the Cruise when the car began a left lane change. He then pulled forward, only for the self-driving car to pull back into his lane and hit him. The car was in self-driving mode, but its supervisor intervened when he saw the crash coming.

However, GM is presenting a different version of events. The company acknowledged that the car was in autonomous mode and had aborted a lane change, but said that its car was ‘re-centering itself’ in the lane. Nilsson, who had been riding between the two lanes, moved into the center lane, glanced off the side of the Cruise, and fell to the ground.

Nilsson said the crash caused injuries to his neck and shoulder that will require lengthy treatment. The plaintiff was forced to take disability leave from work, and says the physical and mental injuries he sustained have incurred expenses for medical care and attendance exceeding $75,000.

The San Francisco Police Department (SFPD) found Nilsson culpable for the accident, saying the motorcyclist was at fault for attempting to overtake a vehicle under conditions that were not safe. The police also reported that the Cruise test car did attempt to abort its lane-change maneuver, and noted that the GM driver tried to steer away from Nilsson.

Undoubtedly, GM will have plenty of sensor data to support its case, from the vehicle’s cameras, LiDAR, and radar. The large amount of sensor data autonomous vehicles produce should make it particularly difficult for false claims to be made against them – as long as courts can trust that data. It could also be a boon for many drivers, as law enforcement won’t have to rely solely on anecdotal evidence and witness statements.

One of the larger questions surrounding this case is the issue of liability, and whether this case might be used to set a precedent. Determining which party is at fault in a collision is already difficult, but autonomous driving systems may present a new challenge.

Liability will become a more prevalent issue as more autonomous car services are launched. The various levels of autonomy and control given to passengers will in part define the liability, as a basic autopilot system will not allow a driver to stop concentrating. In such a system, liability should stay with the driver, as they are responsible for monitoring the autonomous driving – whereas a fully autonomous system with no driver would place liability back with the manufacturer.

The National Highway Traffic Safety Administration (NHTSA) has released a policy report on autonomous driving technology. The report placed responsibility for determining liability on the individual states. California has already placed liability in the direction of the manufacturer, finding in 2012 that “vehicles originally manufactured by a third party shall control issues of liability arising from the operation of the autonomous vehicle.”

However, this position is currently under reconsideration, as limiting liability to manufacturers alone could negatively affect the deployment and use of autonomous vehicles, even though the technology could be hugely beneficial. Some estimates suggest autonomous driving and ADAS could eliminate up to 90% of accidents in which driver error is the main cause.

The lawsuit could be particularly important for testing whether the existing legislation is robust enough to handle this as yet uncharted territory. Given that Nilsson looks to be at fault, it is unlikely the court will set any new precedent by making GM, the vehicle operator, foot the bill for the collision.

Autonomous driving systems have come under scrutiny in recent months, after a Tesla Model S traveling at 65 mph crashed into the back of a fire truck while in Autopilot mode. After an investigation, responsibility for the crash was ruled as lying with the driver, as the Autopilot system is meant to be operated while the driver is paying attention. It has proven difficult to create systems that can successfully hand control back to distracted human drivers, however, and Tesla has since adapted its approach – with more frequent warnings from the system.
