The National Highway Traffic Safety Administration (NHTSA) is investigating Tesla's semi-autonomous self-driving software over concerns that it cannot reliably detect road hazards in low-visibility conditions.
Roughly 2.4 million Tesla vehicles currently run the automaker's "Full Self-Driving" (FSD) software. Despite what the name might imply, FSD is not fully autonomous and still requires a human driver to actively supervise the road. Instead, it guides vehicles onto and off highway ramps, changes lanes automatically when the driver signals, and slows the vehicle at stop signs and traffic lights, among other functions.
According to the NHTSA, at least four collisions linked to FSD have been reported to the agency, including one in which a Tesla struck and killed a pedestrian. In each case, the FSD system was engaged in conditions of limited visibility, such as fog or glare from the sun.
The NHTSA's investigation will examine the FSD system's ability to "detect and respond appropriately to reduced roadway visibility conditions," as well as whether other, similar incidents have occurred in which a Tesla using FSD crashed in low visibility. If the agency finds that the system poses a sufficient safety risk, it could order a recall of every vehicle running the software, which would cover all of Tesla's passenger models dating back to 2016.