The National Highway Traffic Safety Administration (NHTSA) is investigating Tesla’s Full Self-Driving (FSD) system in connection with four crashes. Each crash occurred in limited-visibility conditions with the Beta or Supervised version of FSD enabled. As TechCrunch points out, in November 2023 a Model Y struck and killed a pedestrian in Arizona. The three other crashes, one of which resulted in an injury, involved Model 3 electric vehicles and occurred between March and May of this year.
NHTSA said visibility was reduced during these incidents by conditions such as sun glare, fog, and airborne dust. The agency’s Office of Defects Investigation (ODI) is examining FSD’s ability to “detect and respond appropriately to conditions of reduced road visibility.” It will also try to determine whether other crashes have occurred in similar conditions with FSD engaged, and whether Tesla has issued updates to the system that “may affect FSD performance in conditions of reduced road visibility.” In particular, the review will assess the timing, purpose, and capabilities of any such updates, as well as Tesla’s evaluation of their safety impact.
In April, NHTSA concluded an investigation into hundreds of crashes involving Tesla’s Autopilot system, 13 of which were fatal. The agency determined that drivers in many of the crashes were “not sufficiently engaged” and that the warnings Autopilot provides when Autosteer is activated were insufficient to ensure drivers stayed attentive to the driving task.
Tesla CEO Elon Musk claimed just last week that Model 3 and Model Y vehicles would be capable of unsupervised driving in California and Texas starting next year. At the same event, Musk unveiled the Cybercab, a two-seater robotaxi with no steering wheel or pedals that the company plans to begin producing by 2027.
Tesla does not have a media relations department that can be reached for comment.
