
It looks like self-driving systems and driver monitoring still need a lot of improvement – Photo: Dawn Project
A video posted on the social network X (formerly Twitter) shows how easy it is to fool Tesla's driver monitoring system.
The system is designed to ensure that the driver is awake and aware of hazards around the vehicle while it is moving, a safeguard added after several accidents involving self-driving cars.
Dan O'Dowd, founder of the Dawn Project, collaborated with the YouTube channel AI Addict to test the system.
The group hung a small weight on the steering wheel of the electric car to trick the system into believing the driver's hands were on the wheel. They then placed various objects in the driver's seat to stand in for the driver – a teddy bear, a large beer bottle, even a balloon – activated the self-driving mode, and watched what happened.
In the first test, they placed a large teddy bear in the driver's seat. Although it was obvious that no one was behind the wheel, the monitoring system issued no warning, and the Tesla kept driving itself.
Test video showing the Tesla self-driving system being "tricked" – Video: Dawn Project/Twitter
The testers then dragged a small doll across the road to see how the automatic emergency braking system would respond. The system took too long to recognize that a "person" was crossing the road, so the car did not brake in time.
In the second test, the teddy bear was replaced with a unicorn toy. Once again the driver monitoring system proved useless. The car again hit the doll crossing the road and, worse, did not stop at all.
In the final test, the team placed nothing at all in the driver's seat. The Tesla still kept moving, collided with the doll crossing the street, stopped briefly, and then continued on its way.
Meanwhile, in a separate test conducted by YouTuber Whole Mars Catalogue, the driver monitoring system performed reasonably well. The driver repeatedly looked at his phone and received warnings to keep his eyes on the road. After the warning was ignored four times, the self-driving system was disabled.
It is unclear whether the failures stemmed from a fault in the driver monitoring system of the particular vehicle the Dawn Project and AI Addict tested. Either way, the system is clearly not yet reliable, and both self-driving and driver monitoring still have a long way to go.