Category Archives: errors

On Reality TV, Less Sleep Means More Drama : NPR

Research shows that sleep deprivation makes people emotionally volatile and temperamental — a fact that hasn’t escaped the notice of some reality TV producers. In fact, though it’s not always obvious to the audience, many reality shows feature contestants who could use a little more sleep.

This is not so different from what actual sleep researchers observe in the lab. Mary Carskadon at Brown University says sleep-deprived people tend to be emotionally volatile.

“You have the little girls on their sleepovers giggling themselves silly. But you also have people who have short tempers or easily cry,” says Carskadon. “I guess all things that do make for high drama.”


Trust in Automation

I’ve heard a great deal about trust and automation over the years, but this has to be my favorite new example of over-reliance on a system.

GPS routed bus under bridge, company says
“The driver of the bus carrying the Garfield High School girls softball team that hit a brick and concrete footbridge was using a GPS navigation system that routed the tall bus under the 9-foot bridge, the charter company’s president said Thursday. Steve Abegg, president of Journey Lines in Lynnwood, said the off-the-shelf navigation unit had settings for car, motorcycle, bus or truck. Although the unit was set for a bus, it chose a route through the Washington Park Arboretum that did not provide enough clearance for the nearly 12-foot-high vehicle, Abegg said. The driver told police he did not see the flashing lights or yellow sign posting the bridge height.

“We haven’t really had serious problems with anything, but here it’s presented a problem that we didn’t consider,” Abegg said of the GPS unit. “We just thought it would be a safe route because, why else would they have a selection for a bus?””

Link to original story (with pictures of sheared bus and bridge)

Indeed, why WOULD “they” have a selection for a bus? Here is an excerpt from the manual (Disclosure: I am assuming it’s the same model):

Calculate Routes for – Lets you take full advantage of the routing information built in the City Navigator maps. Some roads have vehicle-based restrictions. For example, a street or gate may be accessible by emergency vehicles only, or a residential street may not allow commercial trucking traffic. By specifying which vehicle type you are driving, you can avoid being routed through an area that is prohibited for your type of vehicle. Likewise, the ******** III may give you access to roads or turns that wouldn’t be available to normal traffic. The following options are available:

  • Car/Motorcycle
  • Truck (large semi-tractor/trailer)
  • Bus
  • Emergency (ambulance, fire department, police, etc.)
  • Taxi
  • Delivery (delivery vehicles)
  • Bicycle (avoids routing through interstates and major highways)
  • Pedestrian

[Image: gps-screen.gif]
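Out of curiosity, here is a sketch of what a vehicle-profile feature like this implies under the hood. This is a toy example in Python, not the vendor’s actual algorithm: the road network, the clearance_ft attribute, and the clearances themselves are all invented. The point it makes is that a “Bus” setting can only help if the map data actually encodes the physical restriction.

    # Illustrative sketch only: a toy router that respects vehicle height.
    # Road data and attribute names are invented for this example.
    import heapq

    ROADS = {
        # (from, to): miles and overhead clearance (None = no overhead structure)
        ("depot", "arboretum"): {"miles": 1.0, "clearance_ft": 9.0},  # low footbridge
        ("arboretum", "field"): {"miles": 1.0, "clearance_ft": None},
        ("depot", "highway"):   {"miles": 2.5, "clearance_ft": None},
        ("highway", "field"):   {"miles": 2.0, "clearance_ft": None},
    }

    def shortest_route(start, goal, vehicle_height_ft):
        """Dijkstra over only those segments the vehicle can physically clear."""
        graph = {}
        for (a, b), attrs in ROADS.items():  # one-way segments, for simplicity
            clearance = attrs["clearance_ft"]
            if clearance is not None and clearance < vehicle_height_ft:
                continue  # prune segments the vehicle cannot pass under
            graph.setdefault(a, []).append((b, attrs["miles"]))
        queue = [(0.0, start, [start])]
        seen = set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == goal:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nxt, miles in graph.get(node, []):
                heapq.heappush(queue, (cost + miles, nxt, path + [nxt]))
        return None  # no passable route at all

    # A 12-foot bus is kept away from the 9-foot bridge; a car is not.
    print(shortest_route("depot", "field", vehicle_height_ft=11.75))
    print(shortest_route("depot", "field", vehicle_height_ft=5.0))

If the bridge’s clearance is simply missing from the database, the pruning step above never fires, and the “Bus” profile silently behaves like the car profile.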

If we can assume no automation can be 100% reliable, at what point do people put too much trust in the system? At what point do they ignore the system in favor of more difficult methods, such as a paper map? At what point is a system so misleading that it should not be offered at all? Sanchez (2006) addressed this question and related the type and timing of errors to the amount of trust placed in the automation. Trust declined sharply (for a time) after an error, so we may assume the Seattle driver might have re-checked the route manually had other (less catastrophic) errors occurred in the past.*
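As a rough illustration of that finding, consider a toy model in which trust rebuilds slowly with error-free use and drops sharply the moment the automation errs. The functional form and constants below are invented for illustration; this is not Sanchez’s (2006) model:

    # Toy model of operator trust in automation. The constants are
    # illustrative only, not taken from Sanchez (2006).
    def update_trust(trust, automation_erred,
                     recovery_rate=0.02, error_penalty=0.4):
        if automation_erred:
            trust -= error_penalty                   # sharp drop after an error
        else:
            trust += recovery_rate * (1.0 - trust)   # slow recovery toward full trust
        return max(0.0, min(1.0, trust))

    trust = 0.9
    for step in range(1, 61):
        trust = update_trust(trust, automation_erred=(step == 10))
        if step in (9, 10, 20, 60):
            print(f"interaction {step:2d}: trust = {trust:.2f}")

A single error at interaction 10 knocks trust down for dozens of subsequent interactions, which is the window in which an operator is most likely to cross-check the automation by hand.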

The spokesman for the GPS company is quoted in the above article as stating:

“Stoplights aren’t in our databases, either, but you’re still expected to stop for stoplights.”

I didn’t read the whole manual, but I’m pretty sure it doesn’t claim the GPS will warn you of stoplights. It does, however, claim to route appropriately by vehicle type, the feature that contributed to this accident. This is a case where an apology and a promise of redesign might serve the company better than blaming its users.

*Not a good strategy for preventing accidents!

Other sources for information on trust and reliability of automated systems:

Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46(1), 50-80.

Parasuraman, R., & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39(2), 230-253.

Wiegmann, D. A., Rich, A., & Zhang, H. (2001). Automated diagnostic aids: The effects of aid reliability on users’ trust and reliance. Theoretical Issues in Ergonomics Science, 2(4), 352-367.

The Cognitive Engineering Laboratory

John Wayne, US Airways, and Human Factors

Almost everyone has probably heard about the gun accidentally fired in a passenger plane cockpit last week.

But did you hear about the design issues that led to this human error?

I had to do some detective work (and quiz some gun owners) to find the following pictures:

Here is the gun in question (or similar enough) showing the safety and the spaces in front of and behind the trigger.

[Image: pilotgun.jpg]

Pilots keep the gun in a holster (see below).

Users report some difficulty ascertaining whether the gun is “locked” into the holster. If it is not, the trigger can sit in an unexpected place (namely, higher in the holster than the molded shape seems to indicate).

The TSA requires pilots who have been issued these guns to padlock the trigger for every takeoff and landing. Reports are that pilots do this about 10 times per shift. Therefore, let’s assume there are 10 chances per shift for error in using the holster and in using the padlock.
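To put a number on “10 chances per shift”: if each padlock-and-holster cycle carried some small independent probability of error p (the value below is purely hypothetical), the chance of at least one error accumulates quickly across shifts:

    # Chance of at least one error across repeated opportunities, assuming
    # independent attempts: P(>=1 error) = 1 - (1 - p)**n.
    p = 0.001          # hypothetical per-cycle error probability (invented)
    per_shift = 10     # padlock/holster cycles per shift, per the post

    for shifts in (1, 100, 250):          # 250 shifts is roughly a working year
        n = per_shift * shifts
        print(f"{n:5d} opportunities: P(at least one error) = {1 - (1 - p) ** n:.1%}")

Even at one error per thousand cycles, a year of flying makes at least one mis-locked gun more likely than not.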

[Image: tsaholster.jpg]

The padlock goes through the trigger guard. It should sit behind the trigger, to keep anyone from pulling it. If the gun is 100% in the holster, this is the case. If it is not… then the padlock can end up in FRONT of the trigger. The opaque holster prevents a visual check of the trigger.

The holster also prevents a visual check of the safety.

All of this might be forgiven, or addressed with training, if it weren’t for the fact that there are numerous other ways to prevent a gun from firing rather than locking something through the trigger. Remember, we should be on the “Guard” step of “Design out, Guard, then Train.”

I’m not even going to discuss whether pilots should have guns.

Boyd said he supports the program to arm pilots, saying, “if somebody who has the ability to fly a 747 across the Pacific wants a gun, you give it to them.”

For an amusing take, see “Trust is not Transitive.”

Usability and Signing up for Campus Safety Alerts

After recent tragic events in the United States, many university campuses have been under pressure to install emergency alert systems. These systems notify students, faculty, and employees of emergency events via email or mobile text messages.

A few months ago, I signed up for the one offered at my university. Today, I received the following note:

You recently signed up to receive Safe alerts on your cell phone. There is some confusion about the sign-up process and you are among a group of users who did not complete the steps that will enable you to receive emergency messages on your phone. [emphasis added]

Your safety is our paramount concern, so please go to [website] to see instructions to complete the process. You will need to find the checkbox labeled “text message” to receive the CU safe alerts on your phone.

We apologize for this confusion and hope to make the sign-up process simpler in the future.

I thought this was unusual because when I initially signed up, the process did not seem overly complicated. To be sure, it was not intuitive, but not complex either. I was certain that I configured the system to send email and text alerts. I guess I was wrong (along with a few other people).

One thing that makes the system so complex is that it is meant to be a general-purpose notification system, not just an emergency one. When I log in, I see all of the classes I’ve taught, research groups I belong to, etc., organized into “Channels.” Why can’t the system be just for emergency alerts? Then the sign-up process would simply involve entering my email address and mobile phone number and opting in. Instead, it looks like this:

[Image: page-image.png]

I suppose it has to do with some kind of cost-benefit analysis. Why pay for a system that only handles emergencies when we can extend it to general purpose messaging?
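For contrast, the entire data model for the emergency-only sign-up described above could be this small. This is a hypothetical sketch; the class and field names are mine, not the university system’s:

    # Hypothetical sketch of a minimal, emergency-only sign-up.
    # Class and field names are invented for illustration.
    from dataclasses import dataclass
    from typing import List, Optional, Tuple

    @dataclass
    class AlertSubscription:
        email: str
        mobile: Optional[str] = None   # phone number, if text alerts are wanted
        send_text: bool = False        # the single opt-in decision that matters

        def destinations(self) -> List[Tuple[str, str]]:
            """Where an emergency alert actually goes for this subscriber."""
            targets = [("email", self.email)]
            if self.send_text and self.mobile:
                targets.append(("sms", self.mobile))
            return targets

    sub = AlertSubscription("student@example.edu", mobile="555-0100", send_text=True)
    print(sub.destinations())   # [('email', ...), ('sms', ...)] -- no channels to configure

Two fields and one checkbox; there is simply no step left to “fail to complete.”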

For a future post, I should talk about our new warning sirens (which I cannot hear from my office, unfortunately).

Death from Branding

If you’re Apple, you want people to see the similarities between your iPod and your iPhone. If you are a drug manufacturer, however, you do not want similarities between adult and pediatric medicine.

Above are bottles of Heparin, a blood thinner manufactured by Baxter Healthcare. One of these vials is 1,000 times more concentrated than the other. Confusion between the two bottles killed infants at an Indiana hospital back in 2002. This article provides a good overview of past cases.

I actually remember reading about this back then, and thought “Wow, there’s a good human factors lesson. How awful that children had to die to bring it into the spotlight.”

Unfortunately, this lesson stayed unlearned, as two more children were administered the adult drug this week. Because these were the newborn twins of Dennis and Kimberly Quaid, who have already spoken out on 60 Minutes about medication mistakes, we may see the problem addressed more thoroughly in the drug industry.

On a final note, these cases touch on the human desire to blame other humans rather than the systems they interact with. In the Indiana case, a news article described a mother who lost her child as one…

… who blames the nurses, not drug labeling, for her daughter’s death. “I don’t think it was from the label,” she said. “They are both blue, but one is lighter than the other. How could they mistake those?”

Change blindness, automaticity, expectation, fatigue, and time pressure are but a few of the factors that might have caused the error. Sometimes it isn’t a case of someone simply not “being careful.” This is actually a good thing: we can understand and solve human factors problems. We can’t make someone care.

Unusually quiet morning radio show

What if a Radio DJ hosted a morning show and no one heard?

Lesson learned! I will try to make certain to hit ‘publish’ at the end of this post.

From the article:

“I’ve been doing the show three days a week for 10 months and always pressed the button at the right moment. Goodness knows why I forgot this time.”

Mr Dixon, the station’s only employee, will not fire his “excellent” breakfast show DJ, who is one of 35 volunteers who have learnt their radio skills from scratch.

The Double-Bubble Ballot

U.S. news agencies are reporting on the California ballots that ‘may have lost Obama the California primary.’ The argument is that he would have pulled in the ‘declined to state’ voters (those who have not registered as either Democrat or Republican), but that because of a human factors error with the ballot, those votes may not have been counted. (The inference is that these voters would have supported Obama.)

Succinctly, declined-to-state voters have to ask for a Democratic ballot. Then they must fill in a bubble at the top of the ballot, saying that they wanted to vote in the Democratic primary. Predictably, many voters might not do this, as it seems a redundant code… the ballot they are holding is the Democratic ballot, so why indicate again that it was the ballot they requested? If you look at the ballot below, it says at the top to “select party in the box below.” Of course, there is only one option, which makes it not much of a selection.

[Image: ballot.jpg]

It’s likely this area of the ballot was inserted to produce some interesting statistical information (rather than a pure count of who received the most votes). If only declined-to-state voters filled in the bubble, you could get a count of how many of those voters turned out compared to other years, how many chose to vote Democratic, and which candidate received most of their support. While interesting (I would like to know all of those things), it complicates the purpose of primary voting: to count the number of Americans who support a particular candidate.

Why I am not a conspiracy theorist: people with the best of intentions make critical human factors design errors, even errors that cost people their lives (see “Set Phasers on Stun”). Sometimes these errors are created by specific good intentions, as in the Florida hanging-chad fiasco.

[Image: ballot2.jpg]

The reason the choices were staggered on each side of the ballot was to increase the font size, supposedly making the ballot clearer for older voters. This perceptual aid was trumped by the resulting cognitive confusion. These ballot designs may suffer from a lack of user testing, but not from an intentional ploy to keep declined-to-state voters from being counted or to get Pat Buchanan more votes.

Thus, let’s tackle the problem rather than saving ‘double bubble’ for a slow news day. Why don’t we demand that all ballots and voting machines be user tested? (Security is another issue, for another blog.) If you have an idea of what action to take, please comment so a future post may provide detailed instructions.

NPR covers a good bit of the HF field in one conversation with two doctors

All Things Considered interviewed Dr. Peter Pronovost this weekend about the checklist he developed for doctors and nurses in busy hospitals. On a topical level, this illuminated the working memory demands of hospital work and statistics on how easy it is to err.

As an example, a task analysis revealed nearly two hundred steps that medical professionals perform per day to keep the typical patient alive and well. On average, there was a 1% error rate, which equates to about two errors per day, per patient.
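The arithmetic behind those figures is worth making explicit, because it shows that an error-free day is actually the rare outcome:

    # The figures from the interview: ~200 steps/day at a 1% error rate.
    steps_per_day = 200
    error_rate = 0.01

    expected_errors = steps_per_day * error_rate       # about 2 errors per patient-day
    p_perfect_day = (1 - error_rate) ** steps_per_day  # every step performed correctly

    print(f"expected errors per patient-day: {expected_errors:.1f}")   # 2.0
    print(f"chance of a zero-error day:      {p_perfect_day:.1%}")     # about 13.4%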

Pronovost introduced checklists for each type of interaction, which resulted in Michigan hospitals going from a 30% chance of infection (typical across the US) to almost 0% for a particular procedure.

Could something as simple as a checklist be the answer? No, because this intervention wasn’t “just” a checklist.

Whether formally trained in these areas or not, the medical staff involved had to understand:

Team training: Nurses are trained not to question doctors, even if they are making a mistake. Solution: Pronovost brought both groups together and told them to expect the nurses to correct the doctors. (Author note: I’d be interested to see how long that works.)

Social interaction: In an ambiguous situation, people are less likely to interfere (e.g., the doctor didn’t wash his or her hands, but the nurse saw them washed for the previous patient and thinks, “It’s probably still ok.”). Checklist solution: eliminate ambiguity through the list.

Effects of expertise: As people become familiar with a task, they may skip steps, especially steps that haven’t shown their usefulness (e.g., if skipping a certain step never seems to have resulted in an infection, it seems harmless to skip it). Checklist solution: enforce steps for all levels of experience (see the sketch after this list).

Decision making: People tend to use heuristics when in a time-sensitive or fatigued state. Checklist solution: remove the “cookbook” memory demands of medicine, leaving resources free for the creative and important decisions.
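That “enforce steps for all levels of experience” idea maps naturally onto software. Here is a minimal sketch of a checklist that cannot be silently skipped, no matter how experienced the user; the step names are illustrative examples, not Pronovost’s exact list:

    # Minimal sketch of a checklist that refuses to complete until every
    # step is explicitly confirmed. Step names are examples only.
    class Checklist:
        def __init__(self, steps):
            self.steps = list(steps)
            self.done = set()

        def confirm(self, step):
            if step not in self.steps:
                raise ValueError(f"unknown step: {step}")
            self.done.add(step)

        def remaining(self):
            # Preserve the prescribed order; no step is optional.
            return [s for s in self.steps if s not in self.done]

        def complete(self):
            missing = self.remaining()
            if missing:
                raise RuntimeError(f"cannot sign off, steps skipped: {missing}")
            return True

    line_insertion = Checklist([
        "wash hands",
        "clean insertion site",
        "drape patient fully",
        "wear sterile mask, gown, and gloves",
        "apply sterile dressing",
    ])
    line_insertion.confirm("wash hands")
    print(line_insertion.remaining())   # the list makes skipped steps explicit

The design choice worth noticing: the expert and the novice face exactly the same gate, which is precisely what removes the “it seems harmless to skip it” judgment call.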

Intuition vs Experience with Roundabouts

Some people might say a traffic circle is obvious. There is only one way to go… deciding who yields might be more difficult, but at least we are all driving in the same direction.

Not so.

The following two articles come down on the side of experience for the usability of roundabouts.

New Traffic Circle Causes Confusion

Death-crash car launches off the road and into a first floor flat

I am sure the designers believed that if millions of people in London and hundreds of thousands in New Orleans can handle a roundabout, the citizens of a town so small the article doesn’t even mention where it is would do fine.

Why Human Factors is more than providing safety equipment

The new math and physics building is going up outside my window at North Carolina State. I see the workers out there each day, and as the building gets higher they are obviously required to don different safety gear.

The fuzzy picture below shows two workers on the top level (7th floor); the green highlight is my outline of the full-body harness and safety cord one man is wearing. The harness certainly seemed necessary, as whatever tool he is using (some sort of nail gun?) seems to push him off balance with every use.

[Image: workers.jpg]

However, unlike the man behind him, this worker has not attached his safety cord to anything. It merely drags along behind him as he walks around the platform and crawls in and out of the scaffolding. In fact, it seems to get in his way when the clasp on the cord catches on the corrugated surface of the platform.