Category Archives: aviation

Verdict Reached for Air France Rio Crash

The BBC has reported that the incident analysis of the Air France crash that killed 228 people attributes the accident to a lack of pilot skill in dealing with a high-altitude stall.

Here is a link to the BEA report from the Bureau d’Enquêtes et d’Analyses. It’s a frightening read, as it gives a moment-by-moment account of the last minutes in the cockpit. The crew never declared an emergency, and there did not appear to be any mechanical failures. The flight crew seemed to believe events were under control the entire time (despite the alarms).

Photo credit Vin Crosbie at Flickr.

Automation Issues Hit the Big Time on NPR

NPR brings home the safety issues of too much cockpit automation.

From the NPR story:

“It was a fairly busy time of the day. A lot of other airliners were arriving at the same time, which means that air traffic control needed each of us on very specific routes at specific altitudes, very specific air speeds, in order to maintain this smooth flow of traffic,” he says.

So, air traffic control told him to fly a particular course. He and the other pilot flying the jet set the flight automation system to do it.

“What I anticipated the aircraft to do was to continue this descent,” he says. “Well instead, the aircraft immediately pitched up, very abruptly and much to my surprise. And both of us reached for the yoke, going, ‘What’s it doing?’ and there’s that shock of, ‘Why did it do that, what’s it going to do next?’”

We’ve posted on this topic before, when we discussed Dr. Sethumadhavan’s work and Dr. Sanchez’s work. For more cutting-edge automation failure research, watch these labs:

If your lab should be listed and isn’t, send me an email!

Accidental Activation During Seat Adjustment on Plane

CNN posted this story, in which a co-pilot accidentally bumped a control while adjusting his seat, sending the plane into a 26-degree dive. Disaster was averted only because the pilot returned from the restroom; the co-pilot apparently lacked the training to correct the error. From the article:

The aviation agency report concluded that the 25-year-old co-pilot had not been trained in the specific scenario the jet encountered and “probably had no clue to tackle this kind of emergency.”

It is fortunate that disaster was averted, because this story seems to have all the elements Reason’s Swiss Cheese model of accidents requires (a toy sketch of how the layers combine appears below):

  • Organizational influences – training, and perhaps design of controls
  • Unsafe supervision – temporary absence of supervision
  • Preconditions for unsafe acts – inadequate experience for the complexity of the situation
  • Unsafe acts – a slip, the mis-activation of a control

(These are, of course, guesses based on a short news article — I don’t pretend to know everything about this accident.)
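For intuition, here is a toy reading of the Swiss Cheese model in Python: treat each layer as an independent defense with some probability of a “hole,” so an accident gets through only when every hole lines up. The layer probabilities are made up for illustration; nothing here comes from Reason’s writing or from the accident report.

```python
# Toy illustration of Reason's Swiss Cheese model (made-up numbers, not
# from the accident report): an accident occurs only when a hole in
# every defensive layer lines up at once.
from math import prod

# Hypothetical per-flight probabilities that each layer fails to stop the error.
layers = {
    "organizational influences (training, control design)": 0.05,
    "unsafe supervision (captain out of the cockpit)": 0.10,
    "preconditions (inexperienced co-pilot)": 0.20,
    "unsafe act (slip while adjusting the seat)": 0.01,
}

# Treating the layers as independent is a simplifying assumption.
p_accident = prod(layers.values())
print(f"P(all holes align) = {p_accident:.0e}")  # 1e-05 with these numbers

# Driving any one layer's failure probability toward zero blocks the whole
# accident path, which is why fixing *any* slice (better training, better
# control design) helps.
```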

Photo credit grey_um

Impending Disasters Announced Via Computer

I had no idea that there were automated disaster warnings on planes, such as one telling passengers to prepare to crash.

  1. Apparently they exist.
  2. Apparently it’s not too difficult to mistakenly broadcast them.

From one article on the event:

“This is an emergency announcement. We may shortly need to make an emergency landing on water.”

Passenger Michelle Lord said: “People were terrified, we all thought we were going to die. They said the pilot hit the wrong button because they were so close together.”

I certainly see the point of an automated message, since in the event of an upcoming crash the crew is almost certainly busy. But the heart attack I might have upon hearing the message in error would render a crash moot.

Blogging APA Division 21: The Cost of Automation Failure

Arathi Sethumadhavan, currently of Medtronic and recently of Texas Tech, won this year’s George E. Briggs Dissertation Award for the best dissertation in the field of applied experimental psychology. Her advisor was Frank Durso.

Her work was inspired by our need to increase automation in aviation, due to increases in air traffic. However, automation does not come without costs — what happens to the performance of air traffic controllers and pilots when the automation someday fails? At what point is the operator so “out of the loop” that recovery is impossible?

Sethumadhavan addressed this question by giving people different levels of automation and observing their performance after failures of the automated system. The more automated the system, the more errors occurred when that system failed.

She also measured the situation awareness of her participants in the different levels of automation — results were similar. Those who had more of their task automated had less situation awareness, and even after a system failure their awareness continued to be lower. In other words, they weren’t shocked out of complacency, as one might predict.

Sethumadhavan’s work directly contributes to understanding the human in the loop of an automated system, so that we can predict operator behavior and explore design options that prevent errors caused by putting the controller out of the loop.

You can read more on Dr. Sethumadhavan’s work here. Congratulations to her on this award!

Photo credit isafmedia under a Creative Commons license.

Blogging APA Division 21: You’re Looking Harmless Today

I’m on a plane writing this post and I look harmless, or at least not threatening.

According to work presented by Poornima Madhavan from Old Dominion University, being a female in the screening line means I am less likely to be hassled by a false alarm from a screener seeing a threat in my bag.*

In work done with her graduate student Jeremy Brown, Dr. Madhavan found that participants in their studies consistently reported more false alarms (detecting a threat that was not there) when the passenger was male. Both genders showed this bias.

Because a social factor (the passenger’s gender) biases a perceptual task (detecting a knife in a baggage x-ray), this effect is called a “Social Cognitive Bias.”
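For readers curious how researchers tease apart “can’t see the threat” from “won’t report the threat,” here is a minimal signal detection sketch. The hit and false-alarm rates are invented for illustration; they are not numbers from Madhavan and Brown’s studies.

```python
# Minimal signal-detection sketch separating sensitivity (d') from response
# bias (criterion c). Hit and false-alarm rates are made up for illustration,
# not Madhavan & Brown's actual data.
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse of the standard normal CDF

def dprime_and_criterion(hit_rate, false_alarm_rate):
    d_prime = z(hit_rate) - z(false_alarm_rate)             # ability to see the threat
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))  # willingness to report it
    return d_prime, criterion

# Hypothetical screeners judging bags belonging to male vs. female passengers.
rates = {"male": (0.85, 0.30), "female": (0.75, 0.20)}
for passenger, (hits, false_alarms) in rates.items():
    d, c = dprime_and_criterion(hits, false_alarms)
    print(f"{passenger}: d' = {d:.2f}, c = {c:.2f}")

# Roughly equal d' but a lower (more liberal) criterion for male passengers
# would locate the bias in the willingness to report a threat, not in the
# ability to detect one.
```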

This project is a wonderful example of an applied experiment that gives us information on the effects social and cultural structures can have on cognitive performance.

Photo credit Wayan Vota under a Creative Commons license.

*No matter what gender you are, carrying climbing gear guarantees a search!

“Sully” Sullenberger to Speak at the HFES 2010 Conference

I received word today that Captain Chesley “Sully” Sullenberger will give the keynote address at the 2010 Human Factors and Ergonomics Society conference in San Francisco this October.

Not only am I excited to hear him speak, he is also the perfect choice for a Human Factors audience: he has spoken publicly on interface and instruction issues in aviation and should have an interesting take on responding to an emergency in the midst of a complex task (albeit one he was well trained for).

Check out this clip of him on The Daily Show, where he talks about the disconnect between design for everyday use and design for emergency use, referring to the tabs on emergency manuals. The HF content starts right at the 3-minute mark.

Here is a 3-D recreation of the flight complete with audio tape.

I often use this clip in presentations when discussing expert performance.

Photo credit Jim Davidson

Visual Search and Airport Security Screening

Funny that I should have mentioned conjunction search the other day, since this post is all about new research by Jeremy Wolfe, who has long contributed, and continues to contribute, to the visual search literature.

In this new work, already mentioned on io9, Wolfe and his former research assistant Michael Van Wert investigated complex visual search as it applies to baggage scanning at airport security. When the target being searched for (e.g., weapons) appears infrequently, detection rates go way down. Even when a target is detected, people have a hard time inhibiting the habitual motor response of saying “no, I didn’t see anything.”¹

Of course, human difficulty in searching for rare events is nothing new. The big contribution of this work was to determine that we go through two decision criteria when searching, each of which affects our response time and our accuracy.

¹I’m liberally translating; these aren’t the specifics of the study method.
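For a rough feel for why rare targets get missed, here is a small simulation of a generic signal-detection observer. This is my own sketch, not the model from the paper; the criterion shift at low prevalence is imposed by hand, and the numbers are illustrative only.

```python
# Rough simulation of the prevalence effect in visual search: a generic
# signal-detection observer, not Wolfe & Van Wert's actual model.
import random

def miss_rate(prevalence, criterion, d_prime=2.0, trials=200_000):
    """Observer reports a target whenever noisy evidence exceeds the criterion."""
    misses = targets = 0
    for _ in range(trials):
        target_present = random.random() < prevalence
        evidence = random.gauss(d_prime if target_present else 0.0, 1.0)
        if target_present:
            targets += 1
            if evidence < criterion:  # target present but not reported
                misses += 1
    return misses / targets

# When targets are rare, "no" is almost always correct, so observers drift
# toward a stricter criterion (imposed by hand here). Misses then climb even
# though perceptual sensitivity (d') never changed.
for prevalence, criterion in [(0.50, 1.0), (0.02, 1.8)]:
    print(f"prevalence {prevalence:.0%}: miss rate ~ {miss_rate(prevalence, criterion):.0%}")
```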

The primary sources mentioned in this post can be found below:

Wolfe, J. M., & Van Wert, M. J. (2010). Varying target prevalence reveals two dissociable decision criteria in visual search. Current Biology, 20(2), 121–124.

Another good article with implications for the TSA:

Warm, J. S., Parasuraman, R., & Matthews, G. (2008). Vigilance requires hard mental work and is stressful. Human Factors, 50(3), 433–441.

Emergency Checklists and Aviation

The recent water landing on the Hudson is still being investigated. This AP article focuses on whether flight attendants were trained not to open the back door of the plane during a water landing, but the most interesting bit comes at the end:

Another concern is whether the FAA and airlines need to revise emergency procedures for pilots in the event both engines fail. Those procedures usually involve a sequence of many steps called a checklist. There are different checklists depending upon the problem, but most are based on the expectation that the problem will occur while the plane is flying at a high altitude — airliners typically cruise above 20,000 feet, giving pilots time to identify and correct the problem.

Flight 1549’s first officer, Jeffrey Skiles, told a congressional panel in February that he only had time to make it part of the way through a checklist for restarting the engines when Sullenberger sent the plane into the river.

Sumwalt suggested it would be better for airlines to train pilots to remember one procedure for a low-altitude dual engine failure, rather than go through a long checklist of items while altitude rapidly diminishes.