
Self-Driving and Driver-Assist Technology Linked to Hundreds of Car Crashes

Approximately 400 crashes in the United States over a 10-month period involved vehicles using advanced driver-assistance technology, the top federal vehicle-safety regulator said Wednesday.

The findings are part of a sweeping effort by the National Highway Traffic Safety Administration to assess the safety of advanced driving systems as they become increasingly commonplace.

Six people were killed and five were seriously injured in the 392 incidents the authorities cataloged from July 1 of last year to May 15. Teslas operating with Autopilot, the more ambitious Full Self-Driving mode, or any of their associated component features were involved in 273 of the crashes. Five of those Tesla crashes were fatal.

The data was collected under an NHTSA order issued last year requiring automakers to report crashes involving vehicles equipped with advanced driver-assistance systems. Many manufacturers have deployed such systems in recent years, including features that allow drivers to take their hands off the steering wheel under certain conditions and that help with parallel parking.

The NHTSA order was an unusually bold step for the regulator, which has been criticized in recent years for not being more assertive with automakers.

“Until last year, NHTSA’s response to automated vehicles and driver assistance was frankly reluctant,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies. “This is the first time the federal government has directly collected crash data on these technologies.”

Speaking with reporters ahead of Wednesday’s announcement, Steven Cliff, the NHTSA administrator, said the data the agency is continuing to collect would help investigators quickly identify emerging trends in defects.

Dr. Cliff also said NHTSA would use the data as a guide in creating rules and requirements for the design and use of these systems. “These technologies have great potential to improve safety, but we need to understand how these vehicles perform in real-world situations,” he said.

But he cautioned against drawing conclusions from the data collected so far, noting that it does not take into account factors such as the number of each manufacturer’s vehicles on the road that are equipped with these types of technologies.

Advanced driver-assistance systems can steer, brake, and accelerate vehicles on their own, but drivers must stay alert and be ready to take control of the vehicle at any time.

Safety experts are concerned because these systems allow drivers to relinquish active control of the car and could lull them into thinking their cars are driving themselves. When the technology malfunctions or cannot handle a particular situation, drivers may be unprepared to take control quickly.

Approximately 830,000 Tesla vehicles in the United States are equipped with Autopilot or the company’s other driver-assistance technologies. That is one reason Tesla vehicles accounted for about 70% of the crashes reported in the data released Wednesday.

Ford Motor, General Motors, BMW and others offer similar advanced systems that allow hands-free driving under certain conditions on highways, but far fewer of those models have been sold. Over the last two decades, however, these companies have sold millions of vehicles with individual components of driver-assistance systems. The components include so-called lane keeping, which helps drivers stay in their lane, and adaptive cruise control, which maintains a car’s speed and brakes automatically when traffic ahead slows.

In Wednesday’s release, NHTSA disclosed that Honda vehicles were involved in 90 incidents and Subarus in 10. Ford, GM, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported fewer than five.

The data reported separately on vehicles equipped with systems designed to operate with little or no intervention from the driver, and on vehicles with systems that can simultaneously control steering and speed but require the driver’s constant attention.

Automated cars, most of them still in development but being tested on public roads, were involved in 130 incidents, NHTSA found. One resulted in a serious injury, 15 in minor or moderate injuries, and 108 in no injuries. Many of the crashes involving automated cars were fender benders or bumper taps, largely because they are operated mainly at low speeds and in city driving.

In more than a third of the 130 accidents involving automated systems, the vehicle was stopped and was hit by another vehicle. In 11 crashes, the data show, a car with such technology enabled was proceeding straight and collided with another vehicle that was changing lanes.

Most of the incidents involving automated systems occurred in San Francisco or the Bay Area, where companies such as Waymo, Argo AI and Cruise are testing and refining their technology.

Waymo, which is owned by Google’s parent company and operates a fleet of driverless taxis in Arizona, was involved in 62 incidents. Cruise, a division of GM, was involved in 23. Cruise just started offering driverless taxi rides in San Francisco, and this month it received permission from California authorities to begin charging passengers.

None of the vehicles using automated systems were involved in fatal accidents, and only one crash resulted in a serious injury: in March, a cyclist hit a vehicle operated by Cruise from behind while both were traveling downhill on a San Francisco street.

NHTSA’s order requiring automakers to submit the data was prompted in part by crashes and deaths over the last six years that involved Teslas operating on Autopilot. Last week, NHTSA widened an investigation into whether Autopilot has technological and design flaws that pose safety risks.

The agency has been investigating 35 crashes that occurred while Autopilot was engaged, including nine that resulted in the deaths of 14 people since 2014. It has also opened a preliminary investigation into 16 incidents in which Teslas under Autopilot control crashed into emergency vehicles that had stopped and had their lights flashing.

In November, Tesla recalled approximately 12,000 vehicles that were part of the beta test of Full Self-Driving, a version of Autopilot designed for use on city streets, after deploying a software update that could cause crashes through unexpected activation of the cars’ emergency braking system.

The NHTSA order required companies to provide data on crashes in which advanced driver-assistance systems and automated technologies were in use within 30 seconds of impact. Though the data gives an unprecedented picture of how these systems behave, it remains difficult to determine whether they reduce crashes or otherwise improve safety.

The agency did not collect data that would allow researchers to easily determine whether using these systems is safer than driving without them in the same situations. Automakers were also allowed to redact descriptions of what happened during the accidents, an option that Tesla, Ford and others used routinely, making the data harder to interpret.

Several independent studies have explored these technologies, but none have yet shown whether they reduce crashes or otherwise improve safety.

J. Christian Gerdes, a professor of mechanical engineering and a director of Stanford University’s Center for Automotive Research, said the data released Wednesday was useful, to a point. “Can we learn more from this data? Yes,” he said. “Is it an absolute gold mine for researchers? I don’t know.”

Because of the redactions, he said, it is difficult to gauge the ultimate usefulness of the findings. NHTSA, he noted, understands the data far better than the general public can from what was released.

Dr. Cliff, the NHTSA administrator, was also wary of acting on the results. “The data may raise more questions than they answer,” he said.

However, some experts said the newly available information should push regulators to be more assertive.

“These data may also encourage further voluntary and involuntary disclosures,” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies. “Some companies may be more willing to provide additional context, especially about miles traveled, crashes ‘prevented’ and other indicators of good performance. And attorneys will look for patterns, and even individual cases, in these data.”

Overall, he said, “This is a good start.”

Jason Kao, Asmaa Elkeurti and Vivian Li contributed research and reporting.
