Are we too trusting of GPS automation?

A GPS certainly makes life easier. While many of us might consider what would happen if we were without one, or if it could not determine where we were, we less often consider how it might lead us astray.

One of our early posts on the Human Factors Blog was about a bus driver who followed GPS directions under a too-low bridge. His case was complicated by the fact that he had selected the “bus” setting on the GPS and assumed that any route it produced was therefore safe for buses. In fact, the bus setting only added routes restricted to buses, such as HOV exits; it did not filter out routes a bus could not safely take.

NPR recently posted stories of people in Death Valley who got lost following GPS directions down roads that no longer existed. In one case, a family’s car was stuck for five days, and a child died. After hearing numerous stories about inaccurate GPS directions from lost drivers, a ranger investigated the maps the GPS systems were using and found roads in them that had been closed for years. How accurate and up to date do GPS maps need to be before we consider them safe? And how can these systems address over-trust in potentially dangerous situations (e.g., Death Valley)?


2 thoughts on “Are we too trusting of GPS automation?”

  1. Going back to your earlier article about this & the anecdote about your parents, Anne, do you know if anyone’s looked at instances when the system was recalculating (typically because the driver didn’t follow the GPS) to evaluate why the user didn’t follow? There might be some implicit cues described in these incidents that maps databases could use to make better recommendations.
