The future of truly driverless, automated cars appears even further away after nonprofit safety watchdog The Dawn Project released safety test videos showing failures of Tesla Inc.’s (NASDAQ: TSLA) driver monitoring system.
Released on Friday, the videos show Tesla’s driver monitoring system failing to detect when a driver texts, reads, watches movies or even falls asleep at the wheel. The car also fails to recognise when the driver’s seat is occupied by a teddy bear or a unicorn, or when it is empty altogether.
Tesla has issued warnings about its self-driving software, acknowledging that it may make mistakes at critical moments and may cause sudden swerving even under normal driving conditions. The automaker advises drivers to keep their hands on the wheel and remain vigilant while using the system.
Regulators have allowed Tesla’s self-driving software to be sold to the public with a requirement that a driver must be present in the car and paying full attention to the road with both hands on the steering wheel, ready to take control if needed.
Monitoring system in over 4 million Teslas
Research performed by the American Automobile Association (AAA) suggests that an effective way to ensure driver attentiveness is a camera-based driver monitoring system.
According to Tesla, the cabin camera installed in its cars is designed to detect driver inattentiveness and issue audible alerts as a reminder to focus on the road when Autopilot is engaged.
However, The Dawn Project conducted tests on Tesla’s driver monitoring system and found that the internal camera fails to recognize certain actions commonly associated with an inattentive driver.
These actions include staring out of the side window for an extended period, eating a meal while neglecting the road, turning around to look at the back seat, and even placing a weight on the steering wheel to simulate hands on the wheel.
Tesla has faced criticism for marketing its system as “Full Self-Driving” while regulators classify it as a Level 2 Advanced Driver Assistance System.
“This ineffective driver monitoring system is in over 4 million Tesla vehicles made in the last five years. We tested it on two cars, and achieved the exact same results,” said software specialist Dan O’Dowd, CEO and founder of the Dawn Project.
“Pedestrians, cyclists and drivers have no way of knowing whether the person ‘supervising’ an ineffective self-driving Tesla is actually paying attention to the road, or is asleep at the wheel.”
After a fatal 2016 collision between a Tesla vehicle and a tractor-trailer, the National Transportation Safety Board attributed the accident to driver inattention. Following this incident, the National Highway Traffic Safety Administration (NHTSA) required Tesla to include a driver monitoring system.
Objective was to show system’s effectiveness
Recently, tests of Tesla’s self-driving software were conducted on public roads in Santa Barbara, with a person in the passenger seat ready to take over if necessary.
The objective was to assess how well the system recognizes potentially dangerous situations, such as distracted driving or instances when no human driver is present to take control. The tests aimed to provide insights into the system’s capabilities and its potential to improve driver attentiveness and overall road safety.