People think their cars can do all sorts of amazing stuff. Blame it on Hollywood magic like the Fast and Furious movies, unrealistic video games, or some sort of psychological phenomenon. This problem carries over to newer driver assistance systems. I hate to break it to everyone, but your forward collision warning doesn’t mean you can take a nap behind the wheel. Sorry.
The Detroit Free Press reported on a recent study conducted by the AAA Foundation for Traffic Safety. It concluded that people think their blind-spot monitoring, adaptive cruise control, and automatic emergency braking systems are far more capable than they really are.
Basically, the study shows what some in the auto industry have said for years: partially self-driving cars might be more dangerous than cars with no self-driving capabilities. The real problem is that drivers place too much confidence in technologies that aren’t even designed to do what they think they do.
One example is blind-spot monitoring. Drivers overwhelmingly believe these systems can detect cyclists, pedestrians, and fast-approaching vehicles from behind far better than current tech allows. The result is that about 25 percent of drivers don’t even check the next lane themselves before changing lanes. That poor decision could cause a serious accident, and the driver wouldn’t realize it until after the fact.
The solution is good old-fashioned education. That would require dealers, automakers, and even rental car services to explain thoroughly how the safety tech works. Hopefully drivers listen, because Tesla keeps telling owners that Autopilot requires an attentive driver, yet people keep finding ways to let the system do the driving without even keeping their hands on the steering wheel. Yes, the problem is humans, not technology.