PHOENIX (AZFamily/AP) — An Arizona woman was killed by a driver using Tesla’s “Full Self-Driving” feature as the automaker faces an investigation by the National Highway Traffic Safety Administration.

Tesla reported four crashes to NHTSA under an order from the agency that covers all automakers. One of those crashes happened in northern Arizona and resulted in the death of a Valley woman.

An agency database says the pedestrian was killed in November 2023 in Rimrock, Arizona, just north of Camp Verde and about 100 miles from Phoenix, after being hit by a 2021 Tesla Model Y.

The Arizona Department of Public Safety (DPS) said in a statement that the crash happened just after 5 p.m. last November on Interstate 17. Two vehicles collided on the freeway, blocking the left lane. A Toyota 4Runner stopped, and two people got out to help with traffic control.

A red Tesla Model Y then hit the 4Runner and one of the people who exited from it. A 71-year-old woman from Mesa was pronounced dead at the scene, officials reported.

The sun was in the Tesla driver’s eyes, so the driver was not charged, said Raul Garcia, a DPS public information officer. Sun glare also contributed to the first collision, he added.

Investigators will examine whether the system can “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

The investigation covers roughly 2.4 million Teslas from the 2016 through 2024 model years.

A message was left Friday seeking comment from Tesla, which has repeatedly said the system cannot drive itself and human drivers must always be ready to intervene.

Last week, Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi that does not have a steering wheel or pedals.

Tesla CEO Elon Musk, who has promised autonomous vehicles before, said the company plans to have autonomous Models Y and 3 running without human drivers next year. Robotaxis without steering wheels would be available in 2026, starting in California and Texas, he said.

The investigation’s impact on Tesla’s self-driving ambitions isn’t clear. NHTSA would have to approve any robotaxi without pedals or a steering wheel, and it’s unlikely that would happen while the investigation is in progress. But if the company tries to deploy autonomous vehicles in its existing models, that likely would fall to state regulations.

There are no federal regulations specifically focused on autonomous vehicles, although they must meet broader safety rules.

NHTSA also said it would investigate whether any other similar crashes involving “Full Self-Driving” have occurred in low-visibility conditions and seek information from the company on whether any updates affected the system’s performance in those conditions.

“In particular, this review will assess the timing, purpose and capabilities of any such updates, as well as Tesla’s assessment of their safety impact,” the documents said.

Tesla has twice recalled “Full Self-Driving” under pressure from NHTSA, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.

The recalls were issued because the system was programmed to run stop signs at slow speeds and because the system disobeyed other traffic laws. Both problems were to be fixed with online software updates.

Critics have said that Tesla’s system, which uses only cameras to spot hazards, lacks the proper sensors to be fully self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in the dark or poor visibility conditions.

Musk has said that humans drive with only eyesight, so cars should be able to drive with just cameras. He has called lidar (light detection and ranging), which uses lasers to detect objects, a “fool’s errand.”

The “Full Self-Driving” recalls arrived after a three-year investigation into Tesla’s less-sophisticated Autopilot system crashing into emergency and other vehicles parked on highways, many with warning lights flashing.

That investigation was closed last April after the agency pressured Tesla into recalling its vehicles to bolster a weak system that ensured drivers were paying attention. A few weeks after the recall, NHTSA began investigating whether the fix was working.

NHTSA began its Autopilot crash investigation in 2021 after receiving 11 reports that Teslas using Autopilot had struck parked emergency vehicles. In documents explaining why that investigation was ended, NHTSA said it ultimately found 467 crashes involving Autopilot, resulting in 54 injuries and 14 deaths.

Autopilot is essentially a fancy version of cruise control, while “Full Self-Driving” has been billed by Musk as capable of driving without human intervention.

The investigation that was opened Thursday enters new territory for NHTSA, which previously had viewed Tesla’s systems as assisting drivers rather than driving themselves. With the new probe, the agency is focusing on the capabilities of “Full Self-Driving” rather than simply making sure drivers are paying attention.

Michael Brooks, executive director of the nonprofit Center for Auto Safety, said the previous investigation of Autopilot didn’t look at why the Teslas weren’t seeing and stopping for emergency vehicles.

“Before they were kind of putting the onus on the driver rather than the car,” he said. “Here they’re saying these systems are not capable of appropriately detecting safety hazards whether the drivers are paying attention or not.”
