How Safe Are Systems Like Tesla’s Autopilot? No One Knows.

Every three months, Tesla publishes a safety report that provides the number of miles between crashes when drivers use the company’s driver-assistance system, Autopilot, and the number of miles between crashes when they do not.

These figures always show that crashes are less frequent with Autopilot, a collection of technologies that can steer, brake and accelerate Tesla vehicles on their own.

But the numbers are misleading. Autopilot is used mainly for highway driving, which is generally twice as safe as driving on city streets, according to the Department of Transportation. Fewer crashes may occur with Autopilot simply because it is typically used in safer situations.

Tesla has not provided data that would allow a comparison of Autopilot’s safety on the same kinds of roads. Neither have other carmakers that offer similar systems.

Autopilot has been on public roads since 2015. General Motors introduced Super Cruise in 2017, and Ford Motor brought out BlueCruise last year. But publicly available data that reliably measures the safety of these technologies is scant. American drivers, whether using these systems or sharing the road with them, are effectively guinea pigs in an experiment whose results have not yet been revealed.

Automakers and tech companies are adding vehicle features that they claim improve safety, but these claims are difficult to verify. Meanwhile, fatalities on the country’s highways and streets have been climbing in recent years, reaching a 16-year high in 2021. Any additional safety provided by technological advances does not appear to be offsetting poor decisions by drivers behind the wheel.

That concern is shared by J. Christian Gerdes, a professor of mechanical engineering and co-director of Stanford University’s Center for Automotive Research, who was the first chief innovation officer of the Department of Transportation.

GM worked with the University of Michigan to explore the potential safety benefits of Super Cruise but concluded that there was not enough data to understand whether the system reduced crashes.

A year ago, the government’s auto safety regulator, the National Highway Traffic Safety Administration, ordered companies to report potentially serious crashes involving advanced driver-assistance systems like Autopilot within a day of learning about them. The order said the agency would make the reports public, but it has not yet done so.

The safety agency declined to comment on what information it had collected so far but said in a statement that the data would be released “in the near future.”

Tesla and its chief executive, Elon Musk, did not respond to requests for comment. GM said it had reported two incidents involving Super Cruise to NHTSA: one in 2018 and one in 2020. Ford declined to comment.

The agency’s data is unlikely to provide a complete picture of the situation, but it could encourage lawmakers and drivers to take a much closer look at these technologies and ultimately change the way they are marketed and regulated.

“To solve a problem, you first have to understand it,” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies. “This is a way of getting more ground truth as the basis for research, regulation and other actions.”

Despite its capabilities, Autopilot does not remove responsibility from the driver. Tesla tells drivers to stay alert and be ready to take control of the car at all times. The same is true of BlueCruise and Super Cruise.

But many experts worry that these systems, because they enable drivers to relinquish active control of the car, may lull them into thinking that their cars are driving themselves. Then, when the technology malfunctions or cannot handle a situation on its own, drivers may be unprepared to take control as quickly as needed.

Older technologies, such as automatic emergency braking and lane departure warning, have long provided safety nets for drivers by slowing or stopping the car or warning drivers when they drift out of their lane. But newer driver-assistance systems flip that arrangement by making the driver the safety net for the technology.

Safety experts are particularly concerned about how Autopilot is marketed. For years, Mr. Musk has said the company’s cars are on the verge of true autonomy, able to drive themselves in practically any situation. The system’s name also implies a degree of automation that the technology has not yet achieved.

This can breed complacency among drivers. Autopilot has played a role in many fatal crashes, in some cases because drivers were not ready to take control of the car.

Mr. Musk has long promoted Autopilot as a way of improving safety, and Tesla’s quarterly safety reports seem to back him up. But a recent study by the Virginia Transportation Research Council, an arm of the Virginia Department of Transportation, shows that these reports are not what they seem.

“We know that cars using Autopilot crash less often than those without it,” said Noah Goodall, a researcher at the council who studies safety and operational issues surrounding self-driving cars. “But are they being driven in the same way, on the same roads, at the same time of day, by the same drivers?”

Analyzing police and insurance data, the Insurance Institute for Highway Safety, a nonprofit research organization funded by the insurance industry, has found that older technologies like automatic emergency braking and lane departure warning have improved safety. But the organization says studies have not yet shown that driver-assistance systems provide similar benefits.

Part of the problem is that police and insurance data do not always indicate whether these systems were in use at the time of the crash.

The federal auto safety agency has ordered companies to provide data on crashes in which driver-assistance technologies were in use within 30 seconds of impact. This could provide a broader picture of how these systems are performing.

But even with that data, safety experts said, it will be difficult to determine whether using these systems is safer than turning them off in the same situations.

The Alliance for Automotive Innovation, a trade group for car companies, has warned that the federal safety agency’s data could be misconstrued or misrepresented. Some independent experts have expressed similar concerns.

“My big concern is that we will have detailed data on crashes involving these technologies without comparable data on crashes involving conventional cars,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies and was previously general counsel at a self-driving-car start-up called nuTonomy. “It could potentially look like these systems are much less safe than they really are.”

For this and other reasons, carmakers may be reluctant to share some data with the agency. Under its order, companies can ask to withhold certain data by claiming it would reveal business secrets.

The agency is also collecting crash data on automated driving systems, the more advanced technologies that aim to remove the driver from the car entirely. These systems are often referred to as “self-driving cars.”

For the most part, this technology is still being tested in a relatively small number of cars with a driver behind the wheel as a backup. Waymo, a company owned by Google’s parent, Alphabet, operates a fleet of vehicles without drivers in the suburbs of Phoenix, and similar services are planned in cities like San Francisco and Miami.

Companies are already required to report crashes involving automated driving systems in some states. The federal safety agency’s data should provide additional insight in this area as well.

But the more immediate concern is the safety of Autopilot and other driver-assistance systems, which are installed on hundreds of thousands of vehicles.

“There is an open question: Is Autopilot increasing crash frequency or decreasing it?” Mr. Wansley said. “We might not get a complete answer, but we will get some useful information.”
