Research shows that sleep deprivation makes people emotionally volatile and temperamental — a fact that hasn’t escaped the notice of some reality TV producers. In fact, though it’s not always obvious to the audience, many reality shows feature contestants who could use a little more sleep.
This is not so different from what actual sleep researchers observe in the lab. Mary Carskadon at Brown University says sleep-deprived people tend to be emotionally volatile.
“You have the little girls on their sleepovers giggling themselves silly. But you also have people who have short tempers or easily cry,” says Carskadon. “I guess all things that do make for high drama.”
I’ve heard a great deal about trust and automation over the years, but this has to be my favorite new example of over-reliance on a system.
GPS routed bus under bridge, company says
“The driver of the bus carrying the Garfield High School girls softball team that hit a brick and concrete footbridge was using a GPS navigation system that routed the tall bus under the 9-foot bridge, the charter company’s president said Thursday. Steve Abegg, president of Journey Lines in Lynnwood, said the off-the-shelf navigation unit had settings for car, motorcycle, bus or truck. Although the unit was set for a bus, it chose a route through the Washington Park Arboretum that did not provide enough clearance for the nearly 12-foot-high vehicle, Abegg said. The driver told police he did not see the flashing lights or yellow sign posting the bridge height.
“We haven’t really had serious problems with anything, but here it’s presented a problem that we didn’t consider,” Abegg said of the GPS unit. “We just thought it would be a safe route because, why else would they have a selection for a bus?””
Indeed, why WOULD “they” have a selection for a bus? Here is an excerpt from the manual (Disclosure: I am assuming it’s the same model):
“Calculate Routes for – Lets you take full advantage of the routing information built in the City Navigator maps. Some roads have vehicle-based restrictions. For example, a street or gate may be accessible by emergency vehicles only, or a residential street may not allow commercial trucking traffic. By specifying which vehicle type you are driving, you can avoid being routed through an area that is prohibited for your type of vehicle. Likewise, the ******** III may give you access to roads or turns that wouldn’t be available to normal traffic. The following options are available:
Truck (large semi-tractor/trailer)
Emergency (ambulance, fire department, police, etc.)
Delivery (delivery vehicles)
Bicycle (avoids routing through interstates and major highways)”
If we can assume no automation can be 100% reliable, at what point do people put too much trust in the system? At what point do they ignore the system in favor of more difficult methods, such as a paper map? At what point is a system so misleading that it should not be offered at all? Sanchez (2006) addressed this question and related the type and timing of errors to the amount of trust placed in the automation. Trust declined sharply (for a time) after an error, so we may assume the Seattle driver might have re-checked the route manually had other (less catastrophic) errors occurred in the past.*
The spokesman for the GPS company is quoted in the above article as stating:
“Stoplights aren’t in our databases, either, but you’re still expected to stop for stoplights.”
I didn’t read the whole manual, but I’m pretty sure it doesn’t claim the GPS will warn you about stoplights, which would be a closer analogy to the feature that actually contributed to the accident. This is a case where an apology and a promise of redesign might serve the company better than blaming its users.
*Not a good strategy for preventing accidents!
Other sources for information on trust and reliability of automated systems:
But did you hear about the designs that led to this human error?
I had to do some detective work (and quizzing gun owners) to find the following pictures:
Here is the gun in question (or similar enough) showing the safety and the spaces in front of and behind the trigger.
Pilots keep the gun in a holster (see below).
Users report some difficulty ascertaining whether the gun is “locked” into the holster. If it is not, then the trigger can be in an unexpected place (namely, higher in the holster than the shaped holster seems to indicate).
The TSA requires pilots who have been issued these guns to padlock the trigger for every takeoff and landing. Reports are that pilots do this about 10 times per shift. Therefore, let’s assume we have 10 chances for error in using the holster and in using the padlock.
The padlock goes through the trigger guard. It should go behind the trigger, to keep anyone from pulling it. If the gun is 100% in the holster, this is the case. If it is not… then the padlock can end up in FRONT of the trigger. The opaque holster prevents a visual check of the trigger.
The holster also prevents a visual check of the safety.
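Those 10 padlocking operations per shift compound whatever small per-use chance of error exists. As a rough sketch (the actual per-use error rate is unknown, so the rates below are purely hypothetical):

```python
def p_at_least_one_error(p_single: float, attempts: int = 10) -> float:
    """Probability of at least one error across independent attempts."""
    return 1 - (1 - p_single) ** attempts

# Hypothetical per-use error rates; the true rate is not known.
for p in (0.01, 0.05):
    print(f"per-use error {p:.0%} -> chance of at least one error "
          f"per shift: {p_at_least_one_error(p):.1%}")
```

Even a 1% chance of mis-seating the gun or misplacing the padlock on any single use implies roughly a 10% chance of at least one error somewhere in a shift.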
With recent tragic events in the United States, there has been pressure for many University campuses to install emergency alert systems. These systems notify students, faculty, and employees of emergency events via email or mobile text messages.
A few months ago, I signed up to the one offered at my University. Today, I received the following note:
You recently signed up to receive Safe alerts on your cell phone. There is some confusion about the sign-up process and you are among a group of users who did not complete the steps that will enable you to receive emergency messages on your phone. [emphasis added]
Your safety is our paramount concern, so please go to [website] to see instructions to complete the process. You will need to find the checkbox labeled “text message” to receive the CU safe alerts on your phone.
We apologize for this confusion and hope to make the sign-up process simpler in the future.
I thought this was unusual because when I initially signed up, the process did not seem overly complicated. To be sure, it was not intuitive, but not complex either. I was certain that I configured the system to send email and text alerts. I guess I was wrong (along with a few other people).
One thing that makes the system seem so complex is that it is meant to be a general-purpose notification system, not just for emergencies. When I log in, I see all of the classes I’ve taught, research groups I belong to, etc., organized into “Channels.” Why can’t the system be just for emergency alerts? Then the sign-up process would simply involve entering my email and mobile phone number and opting in. Instead, it looks like this:
I suppose it has to do with some kind of cost-benefit analysis. Why pay for a system that only handles emergencies when we can extend it to general purpose messaging?
For a future post, I should talk about our new warning sirens (which I cannot hear from my office, unfortunately).
If you’re Apple, you want people to see the similarities between their iPod and their iPhone. However, if you are a drug manufacturer, you do not want similarities between adult and pediatric medicine.
I actually remember reading about this back then, and thought “Wow, there’s a good human factors lesson. How awful that children had to die to bring it into the spotlight.”
Unfortunately, this lesson stayed unlearned, as two more children were administered the adult drug this week. Because these were the newborn twins of Dennis and Kimberly Quaid, who have already spoken out on 60 Minutes about medication mistakes, we may see the problem addressed more thoroughly in the drug industry.
On a final note, these cases touch on the human desire to blame other humans rather than the systems they interact with. In the Indiana case, a mother who lost her child was quoted as saying:
“… who blames the nurses, not drug labeling, for her daughter’s death. ‘I don’t think it was from the label,’ she said. ‘They are both blue, but one is lighter than the other. How could they mistake those?’”
Change blindness, automaticity, expectation, fatigue, and time pressure are but a few of the factors that might have caused the error. Sometimes it isn’t just a case of someone not “being careful.” This is actually a good thing: we can understand and solve human factors problems. We can’t make someone care.
U.S. news agencies are reporting on the California ballots that ‘may have lost Obama the California primary.’ The argument is that he would have pulled in the ‘declined to state’ voters (those who have not registered as either Democrat or Republican), but that because of a human factors error with the ballot, those votes may not have been counted. (The inference is that these voters would have supported Obama.)
Succinctly, declined-to-state voters have to ask for a Democratic ballot. Then they must fill in a bubble at the top of the ballot, saying that they wanted to vote in the Democratic primary. Obviously, many users might not do this, as it seems a redundant code… the ballot they are holding is the Democratic ballot, so why indicate again that it was the ballot they requested? If you look at the ballot below, it says at the top to “select party in the box below.” Of course, there is only one option, which makes it not much of a selection.
It’s likely this area of the ballot was inserted to produce some interesting statistical information (rather than a pure answer of who received the most votes). If only declined-to-state voters filled in the bubble, you could get a count of how many of those voters came out to vote compared to other years, how many chose to vote Democrat, and which candidate received most of their support. While interesting (I would like to know all of those things), it complicates the purpose of primary voting: to count the number of Americans who support a particular candidate.
Why I am not a conspiracy theorist: People with the best of intentions make critical human factors design errors, even errors that cost people their lives (see “Set Phasers on Stun.”) Sometimes, these errors are created by specific good intentions, as in the Florida hanging-chad fiasco.
The reason the choices were staggered on each side of the ballot was to increase the font size, supposedly making the ballot clearer for older voters. This perceptual aid was trumped by the resulting cognitive confusion. These ballot designs may suffer from a lack of user testing, but not from an intentional ploy to keep declined-to-state voters from being counted or to get Pat Buchanan more votes.
Thus, let’s tackle the problem rather than using ‘double bubble’ for a slow news day. Why don’t we demand all ballots and voting machines be user tested? (Security is another issue, for another blog.) If you have an idea of what action to take, please comment so a future post may provide detailed instructions.
As an example, a task analysis revealed almost two hundred steps medical professionals perform per day to keep the typical patient alive and well. On average, there was a 1% error rate, which equates to about two errors per day, per patient.
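A back-of-the-envelope check of those numbers (treating each step as independent, which is a simplification):

```python
steps_per_day = 200  # approximate steps per patient per day
error_rate = 0.01    # 1% chance of error on any given step

# Expected number of errors per patient per day
expected_errors = steps_per_day * error_rate
print(f"Expected errors per patient per day: {expected_errors:.1f}")

# Chance a patient gets through a whole day with zero errors,
# assuming the steps are independent
p_clean_day = (1 - error_rate) ** steps_per_day
print(f"Chance of an error-free day: {p_clean_day:.1%}")
```

The two-errors-per-day figure checks out, and under the same assumptions a patient has well under a one-in-five chance of an entirely error-free day.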
Pronovost introduced checklists for each type of interaction, which resulted in Michigan hospitals going from 30% chance of infection (typical across the US) to almost 0% for a particular procedure.
Could something as simple as a checklist be the answer? No, because this intervention wasn’t “just” a checklist.
Whether trained in these areas or not, the doctors interviewed had to understand:
Team training: Nurses are trained not to question doctors, even if they are making a mistake. Solution: Pronovost brought both groups together and told them to expect the nurses to correct the doctors. (Author note: I’d be interested to see how long that works.)
Social interaction: In an ambiguous situation, people are less likely to interfere (e.g., the doctor didn’t wash his or her hands, but the nurse saw them washed for the previous patient and thinks “It’s probably still ok”). Checklist solution: eliminate ambiguity through the list.
Effects of expertise: As people become familiar with a task, they may skip steps, especially steps that haven’t shown their usefulness. (e.g., if skipping a certain step never seems to have resulted in an infection, it seems harmless to skip it). Checklist solution: enforce steps for all levels of experience.
Decision making: People tend to use heuristics when in a time-sensitive or fatigued state. Checklist solution: remove the “cookbook” memory demands of medicine, leaving resources free for the creative and important decisions.
I am sure the designers believed that if millions of people in London and hundreds of thousands in New Orleans can handle a roundabout, the citizens of a town so small the article doesn’t even bother to mention where it is would do fine.