Feds investigating the safety of Tesla’s ‘Full Self-Driving’ feature

The National Highway Traffic Safety Administration (NHTSA) on Friday said it is investigating 2.4 million Tesla vehicles with the Full Self-Driving (FSD) software after four collisions were reported, including a fatal crash.

The auto safety regulator opened the preliminary evaluation after receiving four reports of crashes in which Tesla’s FSD software was engaged at times when there was reduced roadway visibility – such as sun glare, fog or airborne dust.

The NHTSA said that in one crash “the Tesla vehicle fatally struck a pedestrian. One additional crash in these conditions involved a reported injury.”

The Tesla vehicles that are the focus of NHTSA’s probe include the 2016-2024 Model S and X vehicles with the optional system as well as 2017-2024 Model 3, 2020-2024 Model Y and 2023-2024 Cybertruck vehicles.


NHTSA’s preliminary evaluation is the first step in a process that could see the agency seek to recall the vehicles if it believes they pose an unreasonable risk to safety.

The agency’s review of whether FSD’s engineering controls can “detect and respond appropriately to reduced roadway visibility conditions” will include a look into whether similar FSD crashes have occurred in such conditions, and whether any updates or modifications Tesla has made to FSD have affected its performance when visibility is reduced.

NHTSA said the “review will assess the timing, purpose, and capabilities of any such updates, as well as Tesla’s assessment of their safety impact.”


Tesla CEO Elon Musk has sought to sharpen the electric vehicle (EV) maker’s focus on self-driving technology and robotaxis as it faces tough competition and weak consumer demand in the EV market.

Tesla’s FSD technology has been in development for years and eventually aims to reach a high level of automation capability, where the vehicle can do most driving tasks without human intervention. However, it has faced legal scrutiny stemming from at least two fatal accidents, including an incident in April when a Model S in FSD mode hit and killed a motorcyclist in the Seattle area.

Tesla explains on its website that FSD and its Autopilot feature are intended to be used by an attentive driver who can intervene and take control as needed.


“Autopilot and Full Self-Driving (Supervised) are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous,” Tesla wrote.

Reuters contributed to this report.
