The promise of self-driving cars has always sounded futuristic and exciting. But beneath the glossy marketing lies a disturbing reality: we may be handing control to machines that aren’t ready for the road.
In November 2023, this vision turned deadly in Arizona. What happened that day—and what’s happening behind the scenes at Tesla—should concern us all.
A Fatal Glitch: Tesla Self-Driving Car Fails to Detect a Human in Broad Daylight
Seventy-one-year-old Johna Story was doing something deeply human: helping others. She had stopped with her daughter and a coworker to redirect traffic around the site of an earlier accident, and was standing along an Arizona highway.
At that moment, a Tesla Model Y, operating in Full Self-Driving (FSD) mode, approached. The vehicle reportedly failed to process intense sun glare—and instead of slowing or stopping, it veered straight into Story’s path, killing her on the spot.
Video footage obtained later showed the Tesla reacting to a man waving it down but failing to decelerate in time. The scene raises chilling questions: if the car can’t handle something as basic as glare, what else is it missing?

Who’s Responsible When the Car Drives Itself?
Though a human driver was inside the Tesla, it remains unclear whether they will be held accountable. The vehicle's FSD software, despite its name, is not fully autonomous and requires constant human supervision—something critics argue Tesla does not emphasize enough.
The crash triggered a new wave of scrutiny from federal regulators, who are now asking whether Tesla is pushing dangerous technology onto public roads.
Federal Investigation: Tesla’s Vision-Only System Under Fire
The National Highway Traffic Safety Administration (NHTSA) has opened an expanded probe into Tesla’s self-driving system. Key concerns include:
- How does FSD handle low-visibility conditions such as glare, fog, and dust?
- Are Tesla’s latest software updates genuinely addressing safety issues—or simply shifting legal responsibility?
- Is relying solely on vision-based sensors (no radar or LiDAR) an unacceptable risk?

The video of the fatal crash suggests that current FSD capabilities may be dangerously inadequate in everyday conditions.
Despite this tragedy, Tesla is charging ahead with its vision of a driverless future. The company plans to launch a commercial robotaxi service in Austin this month—a move many critics call reckless given the unresolved safety questions.
Meanwhile, other countries are taking note. In response to fatal incidents like this one, China and EU nations are drafting stricter safety standards for autonomous vehicles.
Johna Story’s death has become a flashpoint in a global debate: should we trust self-driving cars when the technology is still flawed?
