A federal safety regulator just took one of the rarest steps in automotive oversight: an escalation to Engineering Analysis on a system that three million people trust to do the hardest thing a machine can — drive itself. NHTSA isn’t just asking questions about Tesla’s Full Self-Driving anymore. It’s testing it, measuring it, and preparing for what might be the largest software recall of the autonomous era.
The reason? Tesla’s cameras go blind in everyday conditions, and the car either doesn’t notice or tells the driver a fraction of a second before impact.
The Upgrade That Changes Everything
On March 19, 2026, NHTSA’s Office of Defects Investigation upgraded its preliminary evaluation into Tesla Vision’s degradation detection system to a formal Engineering Analysis, designated EA26002. The coverage expanded from roughly 2.4 million vehicles to 3,203,754. The scope widened from four crashes to nine, including one that was fatal. And NHTSA flagged six additional potentially related incidents.
This is not routine. NHTSA investigations move through three phases: Preliminary Evaluation, Engineering Analysis, and Recall. Roughly 80% of cases that reach the Engineering Analysis stage end in a recall. Tesla is now on that runway.
The agency’s concern, stated bluntly in its filing: Tesla’s degradation detection system “fails to detect and/or warn the driver appropriately under degraded visibility conditions such as glare and airborne obscurants.”
Translation: the car goes blind, and it doesn’t tell you.
How Tesla Vision Lost Its Sight
The story of Tesla’s vulnerability to visibility degradation goes back to mid-2021, when the company removed radar sensors from its vehicles — against the advice of its own engineers, according to internal documents — and committed entirely to a camera-only system called Tesla Vision.
Cameras, unlike radar, are easily compromised by sunlight, fog, rain, dust, and a problem unique to Tesla’s design: internal condensation that forms between the lens and housing cover, particularly in cold and humid weather. NHTSA’s investigation centers on whether Tesla’s software detects these conditions and warns the driver before safety becomes compromised.
The answer, increasingly, appears to be no.
According to NHTSA, in the crashes it reviewed, the system “did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.” In multiple incidents, FSD also “lost track of or never detected a lead vehicle in its path.”
In plain language: a vehicle was ahead of the Tesla, the Tesla’s cameras couldn’t see it, the system didn’t realize it couldn’t see, the driver wasn’t warned, and the crash followed.
The Timeline That’s Hard to Explain
Here’s where the story shifts from technical concern to something that looks uncomfortably like corporate triage.
A fatal crash involving FSD in reduced visibility occurred on November 28, 2023. Tesla submitted its required safety report to NHTSA on June 27, 2024 — nearly seven months later.
The very next day, June 28, 2024, Tesla began developing an update to the degradation detection system.
NHTSA’s filing notes that it still does not know when that update was actually deployed or which vehicles have received it. Even if every vehicle got the fix, Tesla’s own analysis conceded to regulators that the update “may have affected” only three of the nine identified crashes.
To put that in plain English: even Tesla admits its fix wouldn’t have helped in six of the nine crashes under investigation. The fix addresses a problem the company reportedly discovered only after someone died, and Tesla took seven months to file the crash report that set the engineering response in motion.
Three Investigations, One Company
This is not Tesla’s only federal headache. NHTSA is currently running three concurrent investigations into Tesla’s autonomous systems:
- EA26002 — The visibility degradation probe, now covering 3.2 million vehicles with one fatality.
- PE25012 — An investigation into more than 50 incidents of FSD-induced traffic violations, including running red lights and traveling in the wrong lane, covering approximately 2.9 million vehicles. NHTSA states that FSD has “induced vehicle behavior that violated traffic safety laws.”
- A crash reporting probe — Opened in August 2025, this investigation examines whether Tesla correctly reported crashes involving both Autopilot and FSD. NHTSA flagged “data and labeling limitations” at Tesla that “could have led to under-reporting of subject crashes.”
The throughline is consistent: NHTSA keeps finding that Tesla cannot or will not provide clear, reliable data about FSD-related failures. And in the traffic violations probe, Tesla has received multiple deadline extensions for handing over data — a pattern that would worry any regulator.
The Fatal Crash
The incident that triggered the original preliminary evaluation in October 2024 involved a Tesla operating on FSD fatally striking a pedestrian in reduced visibility conditions. The details are still emerging under NHTSA review, but the pattern matches what regulators are now seeing across all nine incidents: a visibility impairment the system failed to detect, a lead object the system failed to track, and a warning the driver never received until it was too late.
The fact that a pedestrian died — not in a crash between two vehicles, but in a scenario where the system apparently failed to see a person on or near the road — is the kind of harm that makes regulatory recall all but inevitable.
The Bigger Picture: “Full Self-Driving” vs. the Laws of Physics
Tesla’s camera-only approach to autonomous driving has always been a bet against the consensus. Every major competitor — Waymo, Cruise, Mobileye, and even Tesla’s own engineers before Elon Musk changed course — uses a sensor fusion approach combining cameras with radar or lidar. The reasoning is straightforward: cameras can’t see through fog; radar can. Lidar isn’t bothered by sunlight glare; cameras are.
Tesla’s counterargument has always been that its neural networks are smart enough to compensate for camera limitations. The NHTSA investigation suggests that confidence was misplaced.
An Engineering Analysis is the final step before a recall. It involves physical testing, benchmarking against competitors, and deep technical scrutiny. When Tesla’s last NHTSA probe reached this stage — the 2022 Autopilot investigation into failures to detect stationary emergency vehicles — it resulted in a software recall impacting “nearly every vehicle Tesla had ever sold in the U.S.” up to that point.
If EA26002 follows the same arc, Tesla could face another recall of near-universal scope. And this time, the issue isn’t a patchable software bug about detecting fire trucks. It’s a fundamental hardware limitation: cameras that fog up, cameras blinded by sunlight, compounded by a detection system that was supposed to catch these failures and didn’t.
The Disaster Dossier: Tesla FSD Visibility Failure
- What: NHTSA upgrades to Engineering Analysis EA26002 for Tesla Vision degradation detection failures
- When: Phase 1 opened October 2024; upgraded March 19, 2026
- Who: NHTSA Office of Defects Investigation investigating Tesla, Inc.
- How many: ~3.2 million vehicles; 9 crashes; 1 fatality; 2 injury crashes; 6 additional incidents under review
- Root cause: Camera-only FSD system fails to detect or warn about visibility degradation (glare, fog, dust, internal condensation); system lost track of or never detected lead vehicles before impact
- Regulatory status: Engineering Analysis, typically the final step before a recall; ~80% of EAs result in a recall
- Tesla’s response: Began developing a fix on June 28, 2024, one day after reporting a fatal crash seven months late. By Tesla’s own analysis, the fix “may have affected” only 3 of the 9 crashes. Deployment timeline unknown to NHTSA
- Concurrent probes: Traffic violations probe (PE25012, ~2.9 million vehicles); crash reporting investigation
Why This Matters Beyond Tesla
You don’t have to own a Tesla to be affected by this story. Three implications ripple outward:
One: the name. “Full Self-Driving” remains a supervised system; NHTSA’s own filing describes it as “an assistance system” in which “drivers pay attention and intervene if needed.” But consumer research consistently shows that the name itself misleads drivers into trusting the system more than they should. Tesla has lost legal battles over this exact point, and the regulatory pressure is intensifying.
Two: the hardware decision. Tesla’s camera-only architecture was always a controversial call. NHTSA’s investigation is the strongest regulatory evidence yet that the decision introduced a systemic vulnerability that software alone cannot solve. If the fix requires adding sensors back, the cost implications for a deployed fleet of over three million vehicles are staggering.
Three: the data gap. The concurrent crash reporting investigation is perhaps the most consequential long-term. If Tesla cannot reliably identify FSD-enabled crashes in its own data — as NHTSA has flagged — then regulators, researchers, and the public cannot accurately assess the safety of these systems at scale. And that’s the foundation every future autonomous driving policy rests on.
The Bottom Line
Tesla’s Full Self-Driving system has a problem it can’t reliably see. The system is supposed to notice when its cameras are compromised — by fog, by glare, by dust, by rain, by condensation — and warn the driver to take back control. That system failed repeatedly. In at least one case, the failure was fatal. Tesla began developing a fix the day after it finally filed its crash report for that fatality, seven months after the fact. And even that fix wouldn’t have prevented most of the crashes.
NHTSA’s Engineering Analysis is the kind of regulatory move that ends in a recall. If one comes, it will likely touch nearly every Tesla vehicle sold in the United States, making it one of the largest automotive software recalls in history and the first major recall to target the core sensory architecture of an autonomous driving system.
The gap between what Tesla promises and what its systems deliver has never been wider. And three million drivers are sitting in a car that can’t tell them when it’s blind.
Sources
- NHTSA Engineering Analysis EA26002 — NHTSA filing and full document (PDF)
- Reuters: “US agency upgrades probe into 3.2 million Tesla vehicles over FSD crashes”
- Electrek: “Tesla is one step away from having to recall FSD in NHTSA visibility crash probe”
- The Verge: “Tesla’s Full Self-Driving is on the cusp of a recall”
- Forbes: “Vehicle AI Has A Blind Spot — Tesla FSD And GM Super Cruise In Focus”