
NHTSA Launches Investigation | Tesla FSD Faces Criticism Over Traffic Violations

By

Raj Patel

Oct 9, 2025, 09:43 PM

Updated

Oct 10, 2025, 07:54 AM

2 min read

A Tesla car showing warning lights while crossing into oncoming traffic, illustrating concerns over its Full Self-Driving feature.

The National Highway Traffic Safety Administration has opened an investigation into Tesla's Full Self-Driving (FSD) system after multiple reports of dangerous driving behavior, including crossing into oncoming traffic and running red lights. The action comes amid declining confidence in the technology's safety as complaints continue to pile up.

Alarming Driver Experiences

User comments reveal a worrying trend regarding FSD's driving behaviors. Many drivers report instances where the system attempted illegal maneuvers, like making left turns on red lights. One user noted, "I've had to break out of FSD a few times at a light. It wants to take the unprotected left illegally." This indicates a troubling potential for the AI to mimic human mistakes, as users express concern that the car is reflecting the poor driving habits they observe around them.

A Mixed Bag of Performance

Comments highlight a split perception among drivers: some praise FSD's capabilities, while many others voice frustration at its lapses. One user recalled earlier complaints that FSD lingered too long at stop signs: "I remember back in the day, people were complaining it takes forever to go through a stop sign rather than rolling it like humans do." Frustrating experiences at intersections fuel doubts, with drivers speculating that the system is learning from the flawed real-world driving behavior around it.

"I think thatโ€™s what is happening. The car sees the red as the light fixture highlighted in blue," another driver commented, illustrating the blurred lines between human behavior and AI learning.

The Debate Continues: Human vs. AI

Discussions also touch on the broader implications of reliance on AI versus human drivers. Some comments argue that humans often struggle to handle unpredictable road conditions, suggesting that automated systems could eventually enhance safety, despite current shortcomings. The ongoing conversation reveals a sentiment that a balance must be struck between technological advancement and strict adherence to traffic laws.

Key Points from the Investigation:

  • 🚦 Reports of FSD disregarding traffic signals prompt NHTSA inquiry.

  • 📉 Users highlight fluctuating reliability, with some stating the system mirrors poor human habits.

  • 📊 Increasing discussion of AI's potential for safe driving versus human performance issues.

As the NHTSA probe progresses, Tesla faces intensified scrutiny to improve FSD's reliability. The stakes are high as user trust dwindles and calls for accountability grow. Meanwhile, an increasing number of drivers say they prefer to intervene manually because of the recurring errors they observe in FSD's operation.

Future Implications for FSD

Experts suggest that if the investigation concludes with necessary changes, it may lead to significant overhauls of the FSD system. With increasing user complaints and a growing demand for accountability, stricter regulations may come into play. As one user aptly put it, "Seems like every FSD update swings between progress and setback." The timeline for achieving a fully autonomous driving solution is likely to be extended, causing potential delays in Tesla's future projects.

Lessons from History

Just as prior industries faced challenges with automation, Tesla finds itself tasked with balancing innovation and safety. The scrutiny surrounding FSD echoes historical debates on machine reliability, reminding us that with great advancement comes a substantial responsibility to ensure safety on the roads.