
"Costas Lakafossis presented NHTSA with a lengthy white paper that tries to prove that human error isn't the problem but that there are instead very specific patterns that repeat themselves in almost every one of these SUA accidents, all pointing to the same cause of possible confusion and the same lack of appropriate pre-emptive measures in the programming of the Human-System Interface of modern self-driving cars."
"Lakafossis claimed that because a driver doesn't need to hold down the brake pedal when starting a Tesla, it might lead to mistakenly pressing the accelerator, explaining about 200 incidents where Teslas crashed into garage walls or parked cars."
"NHTSA expanded a 'preliminary analysis' into an 'engineering analysis' regarding Tesla's vision-only 'FSD' system, expressing concerns that the system fails to detect and/or warn the driver appropriately under degraded visibility conditions such as glare and airborne obscurants."
A petition to NHTSA argues that Tesla's Human-System Interface design may lead to unintended acceleration. Because starting a Tesla does not require holding down the brake pedal, drivers may mistakenly press the accelerator instead. NHTSA has previously attributed such incidents to driver error and does not plan to require additional safety measures. Separately, NHTSA has expanded its investigation of Tesla's vision-only Full Self-Driving system, citing concerns about its ability to detect hazards and warn drivers under degraded visibility conditions.
Read at Ars Technica