In July, builders broke ground on a new hospital in Rwanda’s Burera district, near the Uganda border. The design relies on simple features to reduce the spread of airborne disease: outdoor walkways instead of enclosed halls, waiting rooms alfresco and large windows staggered at different levels on opposing walls to keep air circulating.
The plural of anecdote may not be data, but this is a good start if anyone wants to look at whether regulation (via unions) contributes to safety, which safety rules are “annoying,” and construction workers’ locus of control.
“What’s one safety rule you would initiate at your workplace? What rules are unnecessary?
On union jobs the safety rules tend to be comprehensive, and effectively enforced. On non-union jobs — haha.
Many non-union jobs are criminally negligent about safety. And after years of Republican rule of federal government there is little realistic enforcement. In other countries when workers are killed in, say, a building collapse, somebody goes to prison when negligence is proven. Here they might be fined a paltry sum.
I have yet to encounter a safety rule that was unnecessary. Although some are annoying — like wearing masks.”
“What’s one safety rule you would initiate at your workplace? What rules are unnecessary?
I can’t think of any specific “rule” I would initiate … 98 percent of safety is just paying attention to what you are doing and to your surroundings.
You can’t mandate good judgment. Although many of the rules are good and grounded in common sense (they do create a general “culture” of safety), sometimes the letter of the law, so to speak, is enforced too much.
Many times you stand there and say, “I understand why this rule exists, but when applied blindly in this situation, it just doesn’t make sense.”
Research shows that sleep deprivation makes people emotionally volatile and temperamental — a fact that hasn’t escaped the notice of some reality TV producers. In fact, though it’s not always obvious to the audience, many reality shows feature contestants who could use a little more sleep.
This is not so different from what actual sleep researchers observe in the lab. Mary Carskadon at Brown University says sleep-deprived people tend to be emotionally volatile.
“You have the little girls on their sleepovers giggling themselves silly. But you also have people who have short tempers or easily cry,” says Carskadon. “I guess all things that do make for high drama.”
Most everyone has probably heard about the gun accidentally fired in a passenger plane cockpit last week.
But did you hear about the designs that led to this human error?
I had to do some detective work (and quizzing gun owners) to find the following pictures:
Here is the gun in question (or similar enough) showing the safety and the spaces in front of and behind the trigger.
Pilots keep the gun in a holster (see below).
Users report some difficulty ascertaining whether the gun is “locked” into the holster. If it is not, then the trigger can be in an unexpected place (namely, higher in the holster than the shaped holster seems to indicate.)
The TSA requires pilots who have been issued these guns to padlock the trigger for every takeoff and landing. Reports are that pilots do this about 10 times per shift. Therefore, let’s assume we have 10 chances for error in using the holster and in using the padlock.
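That “10 chances per shift” framing invites a quick back-of-the-envelope calculation. Here is a minimal Python sketch of how small per-use slips compound over a shift; the per-use error rates are purely illustrative assumptions, since the post gives no such figures:

```python
# Cumulative chance of at least one mistake across repeated
# holster/padlock operations. The 10 uses per shift comes from the
# post; the per-use error probabilities below are made up for
# illustration only.
def p_at_least_one_error(p_per_use, n_uses=10):
    """Probability of one or more errors in n independent uses."""
    return 1 - (1 - p_per_use) ** n_uses

for p in (0.001, 0.01, 0.05):
    print(f"per-use error {p:.3f} -> per-shift risk {p_at_least_one_error(p):.3f}")
```

Even a 1-in-100 slip per use becomes roughly a 1-in-10 risk per shift, which is why design and guarding matter more than exhortations to be careful.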
The padlock goes through the trigger guard. It should sit behind the trigger, to keep anyone from pulling it. If the gun is 100% in the holster, this is the case. If it is not… then the padlock can end up in FRONT of the trigger. The opaque holster prevents a visual check of the trigger.
The holster also prevents a visual check of the safety.
All of this might be forgiven, or addressed with training, if it weren’t for the fact that there are numerous other ways to prevent a gun from firing rather than locking something through the trigger. Remember, we should be on the “Guard” step of “Design out, Guard, then Train.”
I’m not even going to discuss whether pilots should have guns.
For an amusing take, see “Trust is not Transitive.”
With recent tragic events in the United States, there has been pressure on many university campuses to install emergency alert systems. These systems notify students, faculty, and employees of emergency events via email or mobile text messages.
A few months ago, I signed up for the one offered at my university. Today, I received the following note:
You recently signed up to receive Safe alerts on your cell phone. There is some confusion about the sign-up process and you are among a group of users who did not complete the steps that will enable you to receive emergency messages on your phone. [emphasis added]
Your safety is our paramount concern, so please go to [website] to see instructions to complete the process. You will need to find the checkbox labeled “text message” to receive the CU safe alerts on your phone.
We apologize for this confusion and hope to make the sign-up process simpler in the future.
I thought this was unusual because when I initially signed up, the process did not seem overly complicated. To be sure, it was not intuitive, but not complex either. I was certain that I configured the system to send email and text alerts. I guess I was wrong (along with a few other people).
One thing that makes the system seem so apparently complex is that it is meant to be a general-purpose notification system, not just an emergency one. When I log in, I see all of the classes I’ve taught, research groups I belong to, etc. organized into “Channels.” Why can’t the system be just for emergency alerts? Then the sign-up process would simply involve entering my email and mobile phone number and opting in. Instead, it looks like this:
I suppose it has to do with some kind of cost-benefit analysis. Why pay for a system that only handles emergencies when we can extend it to general purpose messaging?
For a future post, I should talk about our new warning sirens (which I cannot hear from my office, unfortunately).
If you’re Apple, you want people to see the similarities between the iPod and the iPhone. However, if you are a drug manufacturer, you do not want similarities between adult and pediatric medicine.
Above are bottles of Heparin, manufactured by Baxter Healthcare. Both are blood thinners, but one of these vials is 1,000 times more concentrated than the other. Confusion between these two bottles killed infants at an Indiana hospital back in 2002. This article provides a good overview of past cases.
I actually remember reading about this back then, and thought “Wow, there’s a good human factors lesson. How awful that children had to die to bring it into the spotlight.”
Unfortunately, this lesson stayed unlearned, as two more children were administered the adult drug this week. Because these were the newborn twins of Dennis and Kimberly Quaid, who have already spoken out on 60 Minutes about medication mistakes, we may see the problem addressed more thoroughly in the drug industry.
On a final note, these cases touch on the human desire to blame other humans rather than the systems they interact with. The article on the Indiana case quotes a mother who lost her child, and who blames the nurses, not drug labeling, for her daughter’s death:
“I don’t think it was from the label,” she said. “They are both blue, but one is lighter than the other. How could they mistake those?”
Change blindness, automaticity, expectation, fatigue, and time pressure are but a few of the factors that might have caused the error. Sometimes it isn’t simply a case of someone not “being careful.” This is actually a good thing: we can understand and solve human factors problems. We can’t make someone care.
NPR ran a story earlier this week on an intriguing new human factors problem: fire-safe elevators.
The fall of the World Trade Center made it painfully obvious that stairs in skyscrapers do not function adequately in emergencies. We’ve always been warned away from elevators in case of fire, and I would go so far as to say it is part of our collective knowledge from a young age. With the advent of elevators you should use in a fire comes a host of difficulties.
1. Training the zeitgeist: Not all elevators will be replaced, though new tall buildings will all have fireproof elevators. There may be new rules requiring older buildings over a certain size to retrofit at least one elevator as fire-safe.
- This still makes fireproof elevators the exception instead of the rule. A great research question: how do you train people for a small-percentage case? You want the public, of all ages and experience levels, to know “In case of fire, use the stairs, unless there is a fireproof elevator around, which you may or may not have noticed while you were in the building.”
2. Warnings and Information: The symbol in this post is probably familiar to all of you. I’ve occasionally seen it in Spanish, but not often. How will we indicate the difference between fire-safe elevators and other elevators?
- Decals, signs and other indicators will not only have to indicate which elevators are safe and their purpose, but whether other elevators in the building are safe or unsafe. My building is square, with elevators on mirrored sides. If one were safe and the other not, I am not sure I could remember which was safe, especially under the cognitive demands of an emergency.
3. Wayfinding and luck: Use of the elevator may depend on the location of the fire.
- One of the original problems was that elevators opened onto smoke-filled or fire-filled floors. The story did not specify how the new elevators would avoid this. If there is a sensor that prevents them from opening onto such a floor, what if there are people desperately waiting for the elevator on that floor (as they have been re-trained to do)?
- Should the system be even more complex, with people gathering on certain floors to await the elevator rescue? And then, what if those floors are on fire?
In short, researchers, start your engines! We have some training, warning, design, and wayfinding work to do.
A “smart” dashboard that reduces the amount of information displayed to drivers during stressful periods on the road could be available in just five years, say German engineers.
A team from the Technical University of Berlin found they could improve reaction times in real driving conditions by monitoring drivers’ brains and reducing distractions during periods of high brain activity.
They were able to speed up drivers’ reactions by as much as 100 milliseconds. It might not sound like much, but this is enough to reduce braking distance by nearly 3 metres when travelling at 100 kilometres per hour, says team leader Klaus-Robert Müller.
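The “nearly 3 metres” figure is easy to verify: it is just the distance a car covers during the 100 milliseconds saved. A quick sanity check in Python:

```python
# Distance covered during the saved reaction time at highway speed.
# Figures (100 km/h, 100 ms) are from the article; the conversion is ours.
speed_kmh = 100
reaction_gain_s = 0.100  # 100 milliseconds

speed_ms = speed_kmh * 1000 / 3600       # approx. 27.8 m/s
distance_saved = speed_ms * reaction_gain_s
print(f"{distance_saved:.2f} m")         # approx. 2.78 m, i.e. "nearly 3 metres"
```

Strictly speaking this shortens total stopping distance (reaction distance plus braking distance), since the brakes are applied sooner, but the magnitude matches the claim.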
All Things Considered interviewed Dr. Peter Pronovost this weekend about the checklist he developed for doctors and nurses in busy hospitals. On a topical level, this illuminated the working memory demands of hospital work and statistics on how easy it is to err.
As an example, a task analysis revealed almost two hundred steps that medical professionals perform per day to keep the typical patient alive and well. On average, there was a 1% error rate, which equates to about two errors per day, per patient.
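Those interview figures check out with a one-line sanity check (the 200 steps and 1% rate are the numbers reported in the story):

```python
# Expected errors per patient per day, using the figures from the interview.
steps_per_day = 200   # care steps per patient per day (from the story)
error_rate = 0.01     # 1% error rate per step (from the story)

expected_errors = steps_per_day * error_rate
print(expected_errors)  # about two errors per patient, per day
```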
Pronovost introduced checklists for each type of interaction, which resulted in Michigan hospitals going from 30% chance of infection (typical across the US) to almost 0% for a particular procedure.
Could something as simple as a checklist be the answer? No, because this intervention wasn’t “just” a checklist.
Whether trained in these areas or not, the doctors interviewed had to understand:
Team training: Nurses are trained not to question doctors, even if they are making a mistake. Solution: Pronovost brought both groups together and told them to expect the nurses to correct the doctors. (Author note: I’d be interested to see how long that works.)
Social interaction: In an ambiguous situation, people are less likely to interfere (e.g., the doctor didn’t wash his or her hands, but the nurse saw them washed for the previous patient and thinks “It’s probably still ok.”). Checklist solution: eliminate ambiguity through the list.
Effects of expertise: As people become familiar with a task, they may skip steps, especially steps that haven’t shown their usefulness. (e.g., if skipping a certain step never seems to have resulted in an infection, it seems harmless to skip it). Checklist solution: enforce steps for all levels of experience.
Decision making: People tend to use heuristics when in a time-sensitive or fatigued state. Checklist solution: remove the “cookbook” memory demands of medicine, leaving resources free for the creative and important decisions.
Check out this fascinating solution to protecting users from the blade of a table saw.
The way it works is that the saw blade registers electrical contact with human skin and immediately stops. I can’t imagine not having this safety system in place, now that it is available. However, I still have some questions that commenters might want to weigh in on:
1. Unless the system is more redundant than an airplane’s, it must be able to fail. How do you get users to remain vigilant when 99.999% of the time there is no penalty for carelessness?
2. To answer my own question, is the fear of a spinning blade strong enough to do that on its own? I know I’m not going to intentionally test the SawStop.
3. Can we use natural fears such as this in other areas of automation?
4. For great insight into human decision making, read this thread on a woodworking site. What would it take to change the mind of this first poster?
“When do we as adult woodworkers take responsibility and understand the dangers of woodworking. Most accidents happen due to not paying attention to what we’re doing. If we stay focused while we’re using power tools, or even hand tools, we eliminate accidents.”