Category Archives: aviation

Institutional Memory, Culture, & Disaster

I admit a fascination with reading about disasters. I suppose I’m hoping for the antidote. The little detail that will somehow protect me the next time I get into a plane, train, or automobile. A gris-gris for the next time I tie into a climbing rope. Treating my bike helmet as a talisman for my commute. So far, so good.

As human factors psychologists and engineers, we often analyze large-scale accidents and look for the reasons (pun intended) that run deeper than a single operator’s error. You can see some of my previous posts on Wiener’s Laws, Ground Proximity Warnings, and the Deepwater Horizon oil spill.

So, I invite you to read this wonderfully detailed blog post by Ron Rapp about how a safety culture can slowly derail by “normalizing deviance.”

Bedford and the Normalization of Deviance

He tells the story of a 2014 chartered plane crash in Bedford, Massachusetts: a takeoff with so many skipped safety steps and errors that it seemed destined to end in a crash. There was plenty of time for the pilot to stop before the crash, leading Rapp to say, “It’s the most inexplicable thing I’ve yet seen a professional pilot do, and I’ve seen a lot of crazy things. If locked flight controls don’t prompt a takeoff abort, nothing will.” He sums up the reasons for the pilots’ “deviant” performance via Diane Vaughan’s factors of normalization (with some interpretation on my part):

  • If rules and checklists and regulations are difficult, tedious, unusable, or interfere with the goal of the job at hand, they will be misused or ignored.
  • We can’t treat top-down training or continuing education as the only source of information. People pass on shortcuts, tricks, and attitudes to each other.
  • Reward the behaviors you want. But we tend to punish safety behaviors when they delay secondary (but important) goals, such as keeping passengers happy.
  • We can’t ignore the social world of the pilots and crew. Speaking out against “probably” unsafe behaviors is at least as hard as calling out a boss or coworker who makes “probably” racist or sexist comments. The higher the ambiguity, the less likely people are to take action (“I’m sure he didn’t mean it that way” or “Well, we skipped that list, but it’s been fine the last ten times”).
  • The cure? An interdisciplinary solution from human factors psychologists, designers, engineers, and policy makers. That last group might be the most important, in that they must recognize that a focus on safety does not necessarily mean more rules and harsher punishments. It means checking that each piece of the system is efficient, valued, and usable, and that those pieces work together in an integrated way.

    Thanks to Travis Bowles for the heads-up on this article.
    Feature photo from the NTSB report, photo credit to the Massachusetts Police.

    Wiener’s Laws

    The article “The Human Factor” in Vanity Fair is two years old, but I can’t believe I missed posting it — here it is! It’s a riveting read, with details of the Air France Flight 447 accident and an intelligent discussion of the impact automation has on human performance. Dr. Nadine Sarter is interviewed, and I learned of a list of flight-specific “laws” developed by Dr. Earl Wiener, a past president of HFES.

    “Wiener’s Laws,” from the article and from Aviation Week:

    • Every device creates its own opportunity for human error.
    • Exotic devices create exotic problems.
    • Digital devices tune out small errors while creating opportunities for large errors.
    • Invention is the mother of necessity.
    • Some problems have no solution.
    • It takes an airplane to bring out the worst in a pilot.
    • Whenever you solve a problem, you usually create one. You can only hope that the one you created is less critical than the one you eliminated.
    • You can never be too rich or too thin (Duchess of Windsor) or too careful about what you put into a digital flight-guidance system (Wiener).
    • Complacency? Don’t worry about it.
    • In aviation, there is no problem so great or so complex that it cannot be blamed on the pilot.
    • There is no simple solution out there waiting to be discovered, so don’t waste your time searching for it.
    • If at first you don’t succeed… try a new system or a different approach.
    • In God we trust. Everything else must be brought into your scan.
    • Any pilot who can be replaced by a computer should be.
    • Today’s nifty, voluntary system is tomorrow’s F.A.R.

    Kudos to the author, William Langewiesche, for a well-researched and well-written piece.

    Collection of Aviation Safety Articles & Student Activity Ideas

    I recently came across an impressive collection of human factors–related safety stories, mostly concerning aviation, from the System Safety Services group in Canada. The summaries are written in an accessible way, so I recommend the site as a source of good classroom examples. I was already thinking of a classroom activity, perhaps for an undergraduate course:

    In class:
    Please read the following excerpt (abridged) from Aviation Human Factors Industry News, Volume VII, Issue 17. Provide a list of the pros and cons of allowing air traffic controllers to take scheduled naps during their shifts. Put an * by each pro or con that is safety-related. The full article is available via the link above.

    …the FAA and the controllers union — with assistance from NASA and the Mitre Corp., among others — have come up with 12 recommendations for tackling sleep-inducing fatigue among controllers. Among those recommendations is that the FAA change its policies to give controllers on midnight shifts as much as two hours to sleep plus a half-hour to wake up. That would mark a profound change from current regulations that can make sleeping controllers subject to suspension or dismissal. Yet, at most air traffic facilities, it’s common for two controllers working together at night to engage in unsanctioned sleeping swaps, whereby one controller works two jobs while the other naps, and then they switch off…

    More than two decades ago, NASA scientists concluded that airline pilots were more alert and performed better during landings when they were allowed to take turns napping during the cruise phase of flights. The FAA chose to ignore recommendations that U.S. pilots be allowed “controlled napping.” But other countries, using NASA’s research, have adopted such policies for their pilots. Several countries — including France, Germany, Canada and Australia — also permit napping by controllers during breaks in their work shifts, said Peter Gimbrere, who heads the controllers association’s fatigue mitigation effort. Germany even provides controllers sleep rooms with cots, he said. …fatigue affects human behavior much like alcohol, slowing reaction times and eroding judgment. People suffering from fatigue sometimes focus on a single task while ignoring other, more urgent needs.

    One of the working group’s findings was that the level of fatigue created by several of the shift schedules worked by 70 percent of the FAA’s 15,700 controllers can have an impact on behavior equivalent to a blood-alcohol level of .04, Gimbrere said. That’s half the legal driving limit of .08. “There is a lot of acute fatigue in the controller work force,” he said. Controllers are often scheduled for a week of midnight shifts followed by a week of morning shifts and then a week of swing shifts, a pattern that sleep scientists say interrupts the body’s natural sleep cycles.

    At home:
    Your homework assignment is to identify another work domain with similar characteristics where you believe fatigue is a safety concern. Write an argument for requiring rest during work hours or other solutions for fatigue. Again, specifically call out the pros and cons of your solution.

    A list of all articles, in newsletter form, can be found here.

    Photo credit mrmuskrat @ Flickr

    Pilots forget to lower landing gear after cell phone distraction

    This is back from May, but it’s worth noting. A news story chock-full of the little events that can add up to disaster!

    From the article:

    Confused Jetstar pilots forgot to lower the wheels and had to abort a landing in Singapore just 150 metres above the ground, after the captain became distracted by his mobile phone, an investigation has found.

    Major points:

    • The pilot forgets to turn off his cell phone and receives distracting messages prior to landing.
    • The co-pilot is fatigued.
    • They do not communicate with each other before taking action.
    • Another distracting error occurs, involving the flap settings on the wings.
    • They do not use the landing checklist.

    I was most surprised by that last point – I didn’t know that was optional! Any pilots out there want to weigh in on how frequently checklists are skipped entirely?

    Photo credit slasher-fun @ Flickr

    Human Factors discussed on the “Big Picture Science” podcast

    First off, I highly recommend the Big Picture Science podcast. It’s right up there with RadioLab. I’m sure my friends and family are getting tired of me starting conversations with “So I learned today…”

    That said, I was listening to “Humans Need Not Apply” on the way to work yesterday, a discussion of the jobs and tasks that machines can perform instead of humans. Basically, an overview of the future of function allocation.

    Right in the middle of the podcast was an interview with Kathy Abbott, Chief Scientific and Technical Advisor for Flight Deck Human Factors at the FAA. (They even said “human factors” several times during the interview.)

    One of the interesting points Dr. Abbott raised was that function allocation is not just a question of function and capability – it is a social question. The host asked her whether airplanes would eventually be flown without a human pilot and her answer was that even if the machine were capable, human passengers would likely not accept a plane without a pilot in the cockpit.

    If you’re impatient, the interview with Dr. Abbott begins at 27:32.

    Photo credit U.S. Army @ Flickr

    Development of the ground proximity warning system for aviation

    This article tells the story of the inspiration for, and creation of, a “ground proximity warning” system for pilots, as well as multiple other types of cockpit warnings. Don’t miss the video embedded as a picture in the article! It has the best details!

    Some choice excerpts:

    About 3.5 miles out from the snow-covered rock face, a red light flashed on the instrument panel and a recorded voice squawked loudly from a speaker.

    “Caution — Terrain. Caution — Terrain.”

    The pilot ignored it. Just a minute away from hitting the peaks, he held a steady course.

    Ten seconds later, the system erupted again, repeating the warning in a more urgent voice.

    The pilot still flew on. Snow and rock loomed straight ahead.

    Suddenly the loud command became insistent.

    “Terrain. Pull up! Pull up! Pull up! Pull up! Pull up!”

    Accidents involving controlled flight into terrain still happen, particularly in smaller turboprop aircraft. During the past five years, there have been 50 such accidents, according to Flight Safety Foundation data.

    But since the 1990s, the foundation has logged just two in aircraft equipped with Bateman’s enhanced system — one in a British Aerospace BAe-146 cargo plane in Indonesia in 2009; one in an Airbus A321 passenger jet in Pakistan in 2010.

    In both cases, the cockpit voice recorder showed the system gave the pilots more than 30 seconds of repeated warnings of the impending collisions, but for some reason the pilots ignored them until too late.

    After a Turkish Airlines 737 crashed into the ground heading into Amsterdam in 2009, investigators discovered the pilots were unaware until too late that their air speed was dangerously low on approach. Honeywell added a “low-airspeed” warning to its system, now basic on new 737s.

    For the past decade, Bateman has worked on ways of avoiding runway accidents by compiling precise location data on virtually every runway in the world.
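
    The escalation in the excerpt above (a caution, then an urgent repetition, then a pull-up command) is essentially a set of thresholds on how soon the aircraft would reach the terrain ahead. Here is a minimal sketch of that staging in Python; the thresholds, messages, and function name are my own assumptions for illustration, not Honeywell’s actual EGPWS logic:

        # Illustrative staging of terrain alerts, patterned on the excerpt above.
        # Thresholds, messages, and names are assumptions, not Honeywell's EGPWS logic.
        from typing import Optional

        def terrain_alert(distance_m: float, closure_rate_ms: float) -> Optional[str]:
            """Pick an alert level from the estimated time until reaching terrain."""
            if closure_rate_ms <= 0:
                return None  # not closing on the terrain ahead
            seconds_to_terrain = distance_m / closure_rate_ms
            if seconds_to_terrain < 20:
                return "Terrain. Pull up! Pull up!"  # command an immediate escape
            if seconds_to_terrain < 40:
                return "Terrain. Terrain."  # urgent repetition
            if seconds_to_terrain < 60:
                return "Caution -- Terrain."  # first, gentler caution
            return None

        # Roughly the scenario above: 3.5 miles (~5,600 m) from the rock face,
        # closing at 130 m/s (about 250 knots), gives ~43 seconds to impact.
        print(terrain_alert(5600, 130))  # prints "Caution -- Terrain."

    The human factors point is the staging itself: as the time available to respond shrinks, the alert changes form, from a caution to a command.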

    Excerpts from the NASA ASRS

    One of my students last semester (thanks, Ronney!) turned me on to the “Callback” publication from the NASA Aviation Safety Reporting System. These are almost all first-person stories written as case studies of errors and accidents or near-accidents. There aren’t enough of them for it to fall under my list of neat databases, but they certainly make for interesting reading.

    I’ve collected a few below to give a taste of the stories that are included. These are just the top-level descriptions – click through to read the first-person accounts.

    From Issue 381, Upside Down and Backwards

    1. “An aircraft Mode Selector Panel that ‘looks the same’ whether right side up or upside down, and that can be readily installed either way, is a good example of a problematic design. Confronted with an inverted panel, this Cessna 560 Captain found out what happens when the wrong button is in the right place.”
    2. “Without detailed instructions and clear notation, nearly symmetrical parts can be installed incorrectly. Faced with the replacement of such a part, this CRJ 700 Maintenance Technician wound up with a case of component ‘misorientation.’”

    From Issue 383, When Practice Emergencies Go Bad

    1. “…a C182 pilot performed a simulated engine failure while undergoing a practical examination. It appears that both the examiner and the examinee were so engrossed in the simulated emergency that they both tuned BEEEEP out BEEEEP the BEEEEP gear BEEEEP warning BEEEEP horn.”
    2. “When faced with a real engine failure, performing the Engine Secure Checklist reduces the chance of a fire on landing. However, actually performing the steps in the Engine Secure Checklist when the engine failure is not real can lead to a real problem.”

    From Issue 382, Fly the Airplane!

    1. “A review of recent ASRS reports indicates that failure to follow one of the most basic tenets of flight continues to be a concern when pilots are faced with distractions or abnormal situations.”

    From Issue 376, The Fixation Factor

    1. “The ability to maintain the “big picture” while completing individual, discrete tasks is one of the most critical aspects of working in the aviation environment. Preoccupation with one particular task can degrade the ability to detect other important information. This month’s CALLBACK looks at examples of how fixation adversely affects overall task management.”
    2. “Advanced navigation equipment can provide a wealth of readily available information, but as this Cirrus SR20 pilot learned, sometimes too much information can be a distraction.”

    From Issue 375, Motor Skills: Getting Off to a Good Start

    1. “The Captain of an air carrier jet experienced a very hot start when distractions and failure to follow normal flow patterns altered the engine start sequence.”
    2. “This pilot was familiar with the proper procedures for hand-propping, but despite a conscientious effort, one critical assumption led to a nose-to-nose encounter.”

    Photo credit smartjunco @ Flickr

    What values are pilots allowed to enter for the weight of the plane?

    I’d have assumed that when pilots enter a weight estimate for the plane prior to takeoff, there would be a decision aid to prevent gross miscalculation. It certainly seems like an undue load (no pun intended) on the pilot to require entering multiple weight components correctly. After reading the article linked below, I am no longer sure how much automation is involved. Apparently, the pilot forgot to account for the weight of the fuel. Doesn’t it seem as though that would be the easiest weight to enter automatically?

    From the article:

    Pilot Miscalculates Plane Weight, Avoids Disaster

    “The weight of the plane dictates the speed required to take off, and too little speed could have caused pilots to lose control of the aircraft. Luckily, the captain realized something was wrong and compensated before the plane ran off the runway.

    According to the report there have been “a significant number of reported incidents and several accidents resulting from errors in take-off performance calculations around the world in recent years.”

    On a side note, I’ve been on small planes where we all had to be weighed, along with our luggage, prior to boarding. If the margins are that thin, I sure hope no one made any data entry mistakes!
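
    For what it’s worth, the decision aid I have in mind could be very simple: recompute the total weight from its components and flag large disagreements. Below is a minimal sketch; the field names, weights, and tolerance are invented for illustration, and real flight management systems are far more involved:

        # Hypothetical sanity check on a takeoff-weight entry. All names, numbers,
        # and the tolerance are invented; this is not any real avionics API.
        def check_takeoff_weight(empty_kg: float, payload_kg: float, fuel_kg: float,
                                 entered_total_kg: float,
                                 tolerance_kg: float = 500.0) -> float:
            """Compare the entered total against the sum of its components."""
            computed = empty_kg + payload_kg + fuel_kg
            if abs(entered_total_kg - computed) > tolerance_kg:
                raise ValueError(
                    f"Entered weight {entered_total_kg:.0f} kg disagrees with the "
                    f"computed {computed:.0f} kg. Missing a component (fuel?)")
            return computed

        # Example: omitting ~15,000 kg of fuel from the entered total trips the check.
        try:
            check_takeoff_weight(42000, 18000, 15000, 60000)
        except ValueError as err:
            print(err)

    Even a crude cross-check like this turns a silent omission into an active prompt, which is exactly the kind of aid the fuel-weight error seems to call for.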

     

    Photo credit martinhartland @ Flickr

    Rudder knob in cockpit mistaken for door latch

    Any aviation experts want to chime in about a knob turning a plane upside down? Also, please note this was characterized as “pilot error.”

    Pilot error causes airliner to flip, fly upside down

    From the article:

    According to the safety board, an analysis of the aircraft’s digital flight recorder indicated the co-pilot, alone in the cockpit while the captain used a restroom, mistakenly turned the rudder trim knob twice to the left for a total of 10 seconds.

    The co-pilot apparently mistook the knob for the cockpit door-lock switch as he tried to let the captain back in. The mistake is believed to have caused the airplane to tilt leftward and descend rapidly.

    Human Factors in the News: Next Generation Aviation

    I don’t know how I missed this back in March! They even use the words “human factor” in the title! The article is an interesting overview of the “NextGen” systems coming to aviation and explains our field to the general public.

    Air traffic overhaul hinges on ‘human factor’

    From the article:

    Human factors engineering
    Even amid the amazing technological achievements and wondrous capabilities of the 21st century, the most critical connection in the airline industry remains the same as it was at the birth of aviation: the human touch.

    That’s where Domino comes in. Armed with a degree from George Mason University in human factors engineering, Domino studies the way humans interact with machines.

    A classic task of the human factors engineer, Domino says, “is to ensure that information is being presented at the right time to a pilot and in the right form so that the human cognitive capabilities are not simply overwhelmed.”

    The question is, says Domino, “What should you put in front of a pilot and in what form should that information be?”

    Nice job, CNN.

    Photo Credit: AviaFilms on Flickr