Over the course of 10 months, nearly 400 car accidents in the United States involved advanced driver assistance technologies, the federal government’s top auto safety regulator revealed Wednesday, in its first release of large-scale data on these burgeoning systems.
In 392 incidents cataloged by the National Highway Traffic Safety Administration from July 1 of last year to May 15, six people were killed and five were seriously injured. Teslas operating with Autopilot, the more ambitious Full Self-Driving mode, or any of their associated component features were involved in 273 of the accidents.
The disclosures are part of a broad effort by the federal agency to determine the safety of advanced driving systems as they become more common. Beyond the futuristic appeal of driverless cars, dozens of automakers have released automated components in recent years, including features that let drivers take their hands off the wheel under certain conditions and help them parallel park.
In Wednesday’s statement, the NHTSA revealed that Honda vehicles were involved in 90 incidents and Subarus in 10. Ford Motor, General Motors, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.
“These technologies hold great promise for improving safety, but we need to understand how these vehicles perform in real-world situations,” said Steven Cliff, agency administrator. “This will help our researchers quickly identify potential defect trends as they emerge.”
Speaking to reporters ahead of Wednesday’s release, Dr. Cliff also cautioned against drawing any conclusions from the data collected so far, noting that it does not take into account factors such as how many cars from each manufacturer are on the road and equipped with these types of technologies.
“The data can raise more questions than they answer,” he said.
Nearly 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver-assist technologies, offering one explanation for why Tesla vehicles accounted for nearly 70 percent of reported accidents.
Ford, GM, BMW and others have similar advanced systems that allow hands-free driving under certain road conditions, but far fewer of those models have been sold. However, these companies have sold millions of cars in the last two decades that are equipped with individual components of driver assistance systems. Components include so-called lane keeping, which helps drivers stay in their lanes, and adaptive cruise control, which maintains a car’s speed and brakes automatically when traffic ahead slows.
Dr. Cliff said NHTSA would continue to collect data on crashes involving these types of features and technologies, noting that the agency would use it as a guide to establish rules or requirements for how they should be designed and used.
The data was collected under an order issued by the NHTSA a year ago that required automakers to report crashes involving cars equipped with advanced driver assistance systems, also known as ADAS, or Level 2 automated driving systems.
The order was prompted in part by accidents and fatalities over the past six years involving Teslas operating with Autopilot. Last week, the NHTSA expanded an investigation into whether Autopilot has design and technology flaws that pose safety risks. The agency has been investigating 35 crashes that occurred while Autopilot was engaged, including nine that resulted in the deaths of 14 people since 2014. It also opened a preliminary investigation into 16 incidents in which Teslas under Autopilot control collided with emergency vehicles that had stopped with their lights flashing.
Under the order issued last year, NHTSA also collected data on crashes or incidents involving fully automated vehicles that are mostly still in development but are being tested on public roads. Manufacturers of these vehicles include GM, Ford and other traditional automakers, as well as technology companies like Waymo, which is owned by Google’s parent company.
These types of vehicles were involved in 130 incidents, the NHTSA found. One resulted in a serious injury, 15 in minor or moderate injuries, and 108 in no injuries. Many of the crashes involving automated vehicles were minor fender or bumper strikes because the vehicles operate primarily at low speeds and in city driving.
Waymo, which operates a fleet of driverless taxis in Arizona, was involved in 62 incidents. GM’s Cruise division, which just started offering self-driving taxi rides in San Francisco, was involved in 23. A minor accident involving an automated test vehicle made by Pony.ai, a start-up, resulted in the recall of three of the company’s test vehicles to fix the software.
The NHTSA order was an unusually bold step for the regulator, which has come under fire in recent years for not being more assertive with automakers.
“The agency is gathering data to determine whether, in the field, these systems pose an unreasonable safety risk,” said J. Christian Gerdes, professor of mechanical engineering and director of the Stanford University Center for Automotive Research.
An advanced driver assistance system can steer, brake and accelerate vehicles on its own, although drivers must remain alert and ready to take control of the vehicle at a moment’s notice.
Safety experts are concerned that these systems allow drivers to relinquish active control of the car and could lead them to believe that their cars are driving themselves. When the technology malfunctions or cannot handle a particular situation, drivers may not be prepared to take control quickly.
The NHTSA order required companies to provide crash data when advanced driver assistance systems and automated technologies were in use within 30 seconds of impact. Although this data provides a broader picture than ever before of the behavior of these systems, it is still difficult to determine whether they reduce accidents or improve safety.
The agency has not collected data that would allow investigators to easily determine whether using these systems is safer than turning them off in the same situations.
“The question is, what is the baseline against which we are comparing this data?” said Dr. Gerdes, a Stanford professor who from 2016 to 2017 was the first chief innovation officer for the Department of Transportation, of which the NHTSA is a part.
But some experts say comparing these systems to human driving shouldn’t be the goal.
“When a Boeing 737 falls out of the sky, we don’t ask, ‘Is it falling out of the sky more or less than other planes?’” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies.
“Accidents on our roads are equivalent to several plane crashes every week,” he added. “Comparison is not necessarily what we want. If there are accidents that these driving systems contribute to, accidents that otherwise would not have happened, that is a potentially solvable problem that we need to be aware of.”