A safe-technology advocacy group claimed Aug. 9 that Tesla’s full self-driving software represents a potentially lethal threat to child pedestrians, the latest in a series of claims and investigations into the technology to hit the world’s leading electric carmaker, reports The Guardian.
According to a safety test conducted by the Dawn Project, the latest version of Tesla’s Full Self-Driving (FSD) Beta software repeatedly hit a stationary, child-sized mannequin in its path. The claim that the technology has trouble recognizing children forms part of an ad campaign urging the public to pressure Congress to ban Tesla’s self-driving technology.
In several tests, a professional test driver found that the software, released in June, failed to detect the child-sized figure while traveling at an average speed of 25 mph, and the car then struck the mannequin.
The Dawn Project’s founder, Dan O’Dowd, called the results “deeply disturbing”.
After a fiery crash in Texas in 2021 that killed two people, Tesla chief executive Elon Musk tweeted that Autopilot, a less sophisticated version of FSD, was not switched on at the moment of collision.
At the company’s shareholder meeting earlier this month, Musk said that Full Self-Driving had greatly improved and that he expected to make the software available by the end of the year to all owners who request it. But questions about its safety continue to mount.