Trust in Automation

I’ve heard a great deal about trust and automation over the years, but this has to be my favorite new example of over-reliance on a system.

GPS routed bus under bridge, company says
“The driver of the bus carrying the Garfield High School girls softball team that hit a brick and concrete footbridge was using a GPS navigation system that routed the tall bus under the 9-foot bridge, the charter company’s president said Thursday.

Steve Abegg, president of Journey Lines in Lynnwood, said the off-the-shelf navigation unit had settings for car, motorcycle, bus or truck. Although the unit was set for a bus, it chose a route through the Washington Park Arboretum that did not provide enough clearance for the nearly 12-foot-high vehicle, Abegg said. The driver told police he did not see the flashing lights or yellow sign posting the bridge height.

“We haven’t really had serious problems with anything, but here it’s presented a problem that we didn’t consider,” Abegg said of the GPS unit. “We just thought it would be a safe route because, why else would they have a selection for a bus?”

Link to original story (with pictures of sheared bus and bridge)

Indeed, why WOULD “they” have a selection for a bus? Here is an excerpt from the manual (Disclosure: I am assuming it’s the same model):

“Calculate Routes for – Lets you take full advantage of the routing information built in the City Navigator maps. Some roads have vehicle-based restrictions. For example, a street or gate may be accessible by emergency vehicles only, or a residential street may not allow commercial trucking traffic. By specifying which vehicle type you are driving, you can avoid being routed through an area that is prohibited for your type of vehicle. Likewise, the ******** III may give you access to roads or turns that wouldn’t be available to normal traffic. The following options are available:

  • Car/Motorcycle
  • Truck (large semi-tractor/trailer)
  • Bus
  • Emergency (ambulance, fire department, police, etc.)
  • Taxi
  • Delivery (delivery vehicles)
  • Bicycle (avoids routing through interstates and major highways)
  • Pedestrian”

[Image: the GPS unit’s vehicle-selection screen]
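The excerpt suggests why the bus setting is no guarantee: a vehicle-type filter can only honor restrictions that are actually encoded in the map data. Here is a minimal sketch of that logic (my own illustration in Python; the class and field names are hypothetical, not the vendor’s), showing how a low bridge whose clearance is absent from the database sails right through the filter:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical data model; not taken from any real navigation product.
@dataclass
class Segment:
    name: str
    commercial_allowed: bool = True      # legal restriction, e.g. "no trucks"
    clearance_m: Optional[float] = None  # None = height not in the map data

@dataclass
class VehicleProfile:
    kind: str        # "car", "bus", "truck", ...
    height_m: float

def segment_permitted(seg: Segment, vehicle: VehicleProfile) -> bool:
    """Decide whether the router may send this vehicle down this segment."""
    # Vehicle-type settings can only enforce restrictions that are
    # actually encoded on the segment.
    if vehicle.kind in ("bus", "truck", "delivery") and not seg.commercial_allowed:
        return False
    # Physical clearance is checked only when the database has the number;
    # a missing value silently passes, which is the likely failure mode here.
    if seg.clearance_m is not None and vehicle.height_m >= seg.clearance_m:
        return False
    return True

bus = VehicleProfile(kind="bus", height_m=3.6)         # nearly 12 ft (~3.6 m)
underpass = Segment("Arboretum footbridge underpass")  # 9-ft clearance missing from the map
print(segment_permitted(underpass, bus))               # True: no data, so no warning
```

On this reading, selecting “Bus” changes which *encoded* restrictions apply; it says nothing about hazards the map never recorded.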

If we can assume no automation can be 100% reliable, at what point do people put too much trust in the system? At what point do they ignore the system in favor of more difficult methods, such as a paper map? At what point is a system so misleading that it should not be offered at all? Sanchez (2006) addressed this question, relating the type and timing of automation errors to the amount of trust placed in the automation. Trust declined sharply (for a time) after an error, so we may assume the Seattle driver might have re-checked the route manually had other (less catastrophic) errors occurred in the past.*
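That dynamic can be pictured with a toy model (my own sketch, not Sanchez’s actual formulation, and the parameter values are invented): trust takes a sharp hit when the automation errs, then creeps back with error-free use:

```python
def update_trust(trust: float, error_occurred: bool,
                 drop: float = 0.5, recovery: float = 0.05) -> float:
    """One trip's worth of trust adjustment; parameters are invented."""
    if error_occurred:
        return trust * (1.0 - drop)    # sharp decline right after an error
    return min(1.0, trust + recovery)  # slow recovery over error-free trips

trust = 1.0  # like the Seattle driver pre-accident: no prior errors, full trust
for trip, error in enumerate([False, False, True, False, False, False]):
    trust = update_trust(trust, error)
    print(f"trip {trip}: trust = {trust:.2f}")
```

A driver whose unit had erred before (trip 2 above) would be operating at reduced trust and would plausibly double-check the route; a driver with a spotless history has no such prompt.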

The spokesman for the GPS company is quoted in the above article as stating:

“Stoplights aren’t in our databases, either, but you’re still expected to stop for stoplights.”

I didn’t read the whole manual, but I’m pretty sure it doesn’t say the GPS would warn you of stoplights; if it did, that would be a closer analogy to the actual feature that contributed to the accident. This is a time when an apology and a promise of re-design might serve the company better than blaming their users.

*Not a good strategy for preventing accidents!

Other sources for information on trust and reliability of automated systems:

Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46, 50-80.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230-253.

Wiegmann, D. A., Rich, A., & Zhang, H. (2001). Automated diagnostic aids: The effects of aid reliability on users’ trust and reliance. Theoretical Issues in Ergonomics Science, 2(4), 352-367.

The Cognitive Engineering Laboratory



About Anne McLaughlin

Associate Professor, Department of Psychology, North Carolina State University, Raleigh, NC


6 Responses to Trust in Automation

  1. Jeff Garbers April 22, 2008 at 3:51 pm #

    I think the key phrase is this:

    The driver told police he did not see the flashing lights or yellow sign posting the bridge height.

    Maybe the bigger problem is with the signage and/or driver’s inattention. It’s hard to imagine any responsible driver saying “well, that looks like it’s close to the height of my bus, but if the GPS says we’ll clear, then I’ll just go for it…”

  2. Anne McLaughlin April 22, 2008 at 7:42 pm #

    That is a good point! Personal responsibility can’t go out the window.

    It sounds like the difference between a prescriptive model and a descriptive one… people SHOULD still pay close attention to signs, know the height of their bus, etc. But if we know that they do not (or won’t), then how can we keep them safe?

    I’d be interested in ways to alter a person’s trust to a level appropriate for the reliability of the system. Perhaps this information could be embedded in the system itself, and be vivid enough to provide an “experience” of system failure without them actually having to experience an automation error.

  3. ryan k. April 22, 2008 at 8:56 pm #

    Fascinating!! I am very interested to see the information which should be surfacing right about now regarding the system/automation/user errors in typical GPS usage in common road automobiles (buses, trucks, large trucks, cars). Since the systems have been increasing in mainstream use for the past 4 or so years (I think?) and have come down to a semi-justifiable purchase price, I can’t wait to see all the goodies of infobits that come out…

    One thing that sparked a big golden “?” over my head here was the idea that having the GPS, while a great tool, could increase user error due to the stress level of a multi-passenger driving scenario, like a school bus, with a lot of activity… is it possible that adding the extra bit of direction from a source that should be authoritative creates an automation error due to information overload? Or would one think experienced bus drivers would be somewhat in tune with the noise/commotion of their surroundings, and maybe the GPS adds that slight bit of technological environmental change and discomfort?

    Such fun! I love this blog :)

  4. Richard Pak April 22, 2008 at 9:34 pm #

    I’m embarrassed to say that I’ve been lulled into errors due to too much trust in GPS, though only ones involving taking a turn when I shouldn’t have :o

  5. Anne McLaughlin April 23, 2008 at 7:04 am #

    I’ll out my parents in this comment.

    They are 61 and 80 and spent a month last fall driving up and down the east coast. For this, they bought their first talking GPS, which they named “Lulu.”

    Quotes from both of them include:
    “We love her! She’s so patient, even though we make her recalculate all the time.”
    “I have no idea where we are. We just do what Lulu tells us to do.”

    Having said that, they did make it through a month of driving every day, even with 100% trust in Lulu’s “knowledge.”
