One Step Closer to Recall After Tesla’s Crash With Autopilot

Teslas with a partially automated driving system are one step closer to a recall after the U.S. stepped up its investigation into a series of crashes with parked emergency vehicles or trucks with warning signs.

The National Highway Traffic Safety Administration said Thursday that it is upgrading its Tesla probe to an engineering analysis, a sign of increased scrutiny of the electric vehicle maker and of automated systems that perform at least some driving tasks.

Documents posted Thursday by the agency raise serious questions about Tesla’s Autopilot system. The agency found that it is being used in areas where its capabilities are limited, and that many drivers do not take action to avoid crashes despite warnings from the vehicle.

The investigation now covers 830,000 vehicles, nearly everything the Austin, Texas, carmaker has sold in the US since the start of the 2014 model year.

NHTSA said it had found 16 crashes involving emergency vehicles and trucks with warning signs, causing 15 injuries and one death.

Investigators will collect additional data, evaluate vehicle performance and “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks, undermining the effectiveness of the driver’s supervision,” the agency said.

A message was left on Thursday seeking comment from Tesla.


FILE - This photo provided by the Laguna Beach Police Department shows a Tesla sedan, at left, in Autopilot mode, which crashed into a parked police cruiser in Laguna Beach, Calif., May 29, 2018.

An engineering analysis is the final stage of an investigation; in most cases, NHTSA decides within a year whether to seek a recall or close the probe.

In most of the 16 crashes, the Teslas issued a forward collision alert to drivers just before impact. Automatic emergency braking intervened to at least slow the cars in about half the cases. In documents detailing the investigation, NHTSA said that on average, Autopilot gave up control of the cars less than a second before the first impact.

NHTSA also said it is looking at accidents with similar patterns that did not involve emergency vehicles or trucks with warning signs.

The agency found that in many cases, drivers had their hands on the steering wheel as Tesla requires, yet failed to take action to avoid a crash. This suggests that drivers were complying with Tesla’s monitoring system, but that the system does not ensure they are paying attention.

In the crashes with video available, drivers should have seen the first responder vehicles an average of eight seconds before impact, the agency wrote.

The agency must determine whether Autopilot has a safety defect before it can seek a recall.

Investigators also wrote that a driver’s use or misuse of the driver monitoring system, “or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”

The agency’s documents all but say that Tesla’s method of making sure drivers pay attention is not good enough, that it is defective and should be recalled, said Bryant Walker Smith, a law professor at the University of South Carolina who studies automated vehicles.

“It’s really easy to have a hand on the wheel and be completely disengaged from driving,” he said. Monitoring a driver’s hand position is not effective because it measures only a physical position. “It’s not related to their mental capacity, their engagement or their ability to respond,” he said.

Similar systems from other companies, such as General Motors’ Super Cruise, use infrared cameras to observe the driver’s eyes or face to ensure they are looking ahead. But these can still allow a driver to zone out, Walker Smith said.

In total, the agency looked at 191 crashes but removed 85 of them because other drivers were involved or there was not enough information to make a definitive assessment. Of the remaining 106, nearly one-quarter appear to have occurred while Autopilot was operating in areas where it has limitations, or in conditions that can interfere with its operation.

Other automakers limit the use of their systems to limited-access divided highways.

NHTSA said in a statement that there are no self-driving vehicles available for purchase today.

“Every available vehicle requires a human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” the agency said.

The agency said driver-assist systems can help avoid crashes, but they must be used correctly and responsibly.

Tesla rolled out an over-the-air update of its Autopilot software to improve camera detection of emergency vehicle lights in low-light conditions. NHTSA has asked why the company did not issue a recall for that update.

NHTSA launched its investigation last August after a string of crashes since 2018 in which Teslas using the company’s Autopilot or Traffic Aware Cruise Control systems hit vehicles at scenes where first responders used flashing lights, flares, an illuminated arrow board, or cones warning of hazards.

This article is republished from VOA News.