Category Archives: training

Lion Air Crash from October 2018

From CNN:

The passengers on the Lion Air 610 flight were on board one of Boeing’s newest, most advanced planes. The pilot and co-pilot of the 737 MAX 8 were more than experienced, with around 11,000 flying hours between them. The weather conditions were not an issue and the flight was routine. So what caused that plane to crash into the Java Sea just 13 minutes after takeoff?

I’ve been waiting for updated information on the Lion Air crash before posting details. When I first read about the accident, it struck me as a collection of human factors design violations. I’ve pulled together some of the news reports on the crash below, organized by the types of problems experienced on the airplane.

1. “a cacophony of warnings”
Fortune Magazine reported on the number of warnings and alarms that began to sound as soon as the plane took flight. The same alarms had occurred on the aircraft’s previous flight, and there is a whiff of blaming the victims in the question, “If a previous crew was able to handle it, why not this one?”

The alerts included a so-called stick shaker — a loud device that makes a thumping noise and vibrates the control column to warn pilots they’re in danger of losing lift on the wings — and instruments that registered different readings for the captain and copilot, according to data presented to a panel of lawmakers in Jakarta Thursday.

2. New automation features, no training
The plane included new “anti-stall” technology that airlines say was neither well explained nor included in Boeing’s training materials.

In the past week, Boeing has stepped up its response by pushing back on suggestions that the company could have better alerted its customers to the jet’s new anti-stall feature. The three largest U.S. pilot unions and Lion Air’s operations director, Zwingly Silalahi, have expressed concern over what they said was a lack of information.

As was previously revealed by investigators, the plane’s angle-of-attack sensor on the captain’s side was providing dramatically different readings than the same device feeding the copilot’s instruments.

Angle of attack registers whether the plane’s nose is pointed above or below the oncoming air flow. A reading showing the nose is too high could signal a dangerous stall and the captain’s sensor was indicating more than 20 degrees higher than its counterpart. The stick shaker was activated on the captain’s side of the plane, but not the copilot’s, according to the data.

And more from CNN:

“Generally speaking, when there is a new delivery of aircraft — even though they are the same family — airline operators are required to send their pilots for training,” Bijan Vasigh, professor of economics and finance at Embry-Riddle Aeronautical University, told CNN.

Those training sessions generally take only a few days, but they give the pilots time to familiarize themselves with any new features or changes to the system, Vasigh said.

One of the MAX 8’s new features is an anti-stalling device, the maneuvering characteristics augmentation system (MCAS). If the MCAS detects that the plane is flying too slowly or steeply, and at risk of stalling, it can automatically lower the airplane’s nose.

It’s meant to be a safety mechanism. But the problem, according to Lion Air and a growing chorus of international pilots, was that no one knew about that system. Zwingli Silalahi, Lion Air’s operational director, said that Boeing did not suggest additional training for pilots operating the 737 MAX 8. “We didn’t receive any information from Boeing or from regulator about that additional training for our pilots,” Zwingli told CNN Wednesday.

“We don’t have that in the manual of the Boeing 737 MAX 8. That’s why we don’t have the special training for that specific situation,” he said.
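To make the sensor-disagreement problem concrete, here is a minimal Python sketch of the logic involved. It is purely illustrative: the function names, thresholds, and cross-check behavior are my own assumptions, not Boeing’s actual MCAS implementation.

```python
# Purely illustrative sketch (not Boeing's MCAS code): why automation that
# acts on a single angle-of-attack (AoA) sensor is fragile, and why
# cross-checking redundant sensors matters. All thresholds are invented.

STALL_AOA_DEG = 15.0      # hypothetical stall-warning threshold
DISAGREE_LIMIT_DEG = 5.5  # hypothetical allowable sensor disagreement

def single_sensor_nose_down(captain_aoa: float) -> bool:
    """Trusts one sensor: a faulty reading drives the command."""
    return captain_aoa > STALL_AOA_DEG

def cross_checked_nose_down(captain_aoa: float, copilot_aoa: float) -> bool:
    """Refuses to act when the two sensors disagree beyond a limit."""
    if abs(captain_aoa - copilot_aoa) > DISAGREE_LIMIT_DEG:
        return False  # leave control with the pilots and annunciate the fault
    return min(captain_aoa, copilot_aoa) > STALL_AOA_DEG

# The accident flight reportedly had a split of more than 20 degrees
# between the captain's and copilot's sensors.
print(single_sensor_nose_down(22.0))       # True  -> commands nose down
print(cross_checked_nose_down(22.0, 1.0))  # False -> defers to the crew
```

The design point is simply that automation acting on a single sensor inherits that sensor’s faults; requiring agreement between redundant sensors, and telling the crew when they disagree, keeps one bad reading from becoming a bad command.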

Did a User Interface Kill 10 Navy Sailors?

I chose a provocative title for this post after reading the report on what caused the collision of the USS John McCain in August of 2017. In summary, the USS John McCain was in high-traffic waters when the crew believed they had lost control of the ship’s steering. Despite attempts to slow or maneuver, the ship was hit by another large vessel. The bodies of 10 sailors were eventually recovered and five others were injured.

Today the Navy released its final report on the accident. After reading it, it seems to me the report blames the crew. Here are some quotes from the official Navy report:

  • Loss of situational awareness in response to mistakes in the operation of the JOHN S MCCAIN’s steering and propulsion system, while in the presence of a high density of maritime traffic
  • Failure to follow the International Nautical Rules of the Road, a system of rules to govern the maneuvering of vessels when risk of collision is present
  • Watchstanders operating the JOHN S MCCAIN’s steering and propulsion systems had insufficient proficiency and knowledge of the systems

And a rather devastating passage:

In the Navy, the responsibility of the Commanding Officer for his or her ship is absolute. Many of the decisions made that led to this incident were the result of poor judgment and decision making of the Commanding Officer. That said, no single person bears full responsibility for this incident. The crew was unprepared for the situation in which they found themselves through a lack of preparation, ineffective command and control and deficiencies in training and preparations for navigation.

Ouch.

Ars Technica called my attention to an important cause of the accident that the report does not specifically call out: the poor feedback design of the control system. I think it is a problem that the report focuses on “failures” of the people involved rather than the design of the machines and systems they used. After my reading, I would summarize the reason for the accident as follows: “The ship could be controlled from many locations. This control was transferred using a computer interface. That interface did not give sufficient information about its current state or feedback about which station controlled which functions of the ship. This made the crew think they had lost steering control when that control had actually just been moved to another location.” I base this on information from the report, including:

Steering was never physically lost. Rather, it had been shifted to a different control station and watchstanders failed to recognize this configuration. Complicating this, the steering control transfer to the Lee Helm caused the rudder to go amidships (centerline). Since the Helmsman had been steering 1-4 degrees of right rudder to maintain course before the transfer, the amidships rudder deviated the ship’s course to the left.

Even this section calls out the “failure to recognize this configuration.” If the system is designed well, one shouldn’t have to expend any cognitive or physical resources to know from where the ship is being controlled.
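To put that design principle in concrete terms, here is a minimal sketch of a control-transfer interface that keeps the “who controls what” state explicit and visible at every station. The structure and behavior are my own illustration, not the McCain’s actual steering and propulsion system; only the station names echo the report.

```python
# Hypothetical sketch of the feedback principle: make the current control
# configuration continuously visible and announce every transfer everywhere.

class ControlSystem:
    def __init__(self):
        # Each controllable function maps to the station that currently owns it.
        self.owner = {"steering": "Helm", "throttle": "Helm"}

    def transfer(self, function: str, new_station: str) -> None:
        old_station = self.owner[function]
        self.owner[function] = new_station
        # Explicit feedback: announce the change at every station,
        # not just the one that requested the transfer.
        print(f"{function.upper()} control transferred: {old_station} -> {new_station}")

    def status_banner(self) -> str:
        # Always-visible summary so no one has to remember the configuration.
        return " | ".join(f"{fn}: {st}" for fn, st in self.owner.items())

system = ControlSystem()
system.transfer("steering", "Lee Helm")
print(system.status_banner())  # steering: Lee Helm | throttle: Helm
```

The point is that the configuration is displayed continuously and every transfer is announced everywhere, so no watchstander has to infer or remember where control currently lives.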

Overall, I was surprised at the tone of this report regarding crew performance. Perhaps some of it is deserved, but without a hard look at the systems the crew used, I don’t have much faith that we can avoid future accidents. Fitts and Jones helped launch the human factors field in 1947 when they insisted that the design of the cockpit created accident-prone situations. This went against the belief of the time, which was that “pilot error” was the main factor. Their work ushered in a new era, one in which we try to improve the systems people must use as well as their training and decision making. The picture below shows the helm of the USS John S McCain, commissioned in 1994. I would be very interested to see how it appears in action.

US Navy (USN) Boatswain’s Mate Seaman (BMSN) Charles Holmes mans the helm aboard the USN Arleigh Burke Class Guided Missile Destroyer USS JOHN S. MCCAIN (DDG 56) as the ship gets underway for a Friends and Family Day cruise. The MCCAIN is getting underway for a Friends and Family Day cruise from its homeport at Commander Fleet Activities (CFA) Yokosuka Naval Base (NB), Japan (JPN). Source: Wikimedia Commons

Wiener’s Laws

The article “The Human Factor” in Vanity Fair is two years old, but I can’t believe I missed posting it, so here it is! It’s a riveting read, with details of the Air France Flight 447 accident and an intelligent discussion of the impact automation has on human performance. Dr. Nadine Sarter is interviewed, and I learned of a list of flight-specific “laws” developed by Dr. Earl Wiener, a past president of HFES.

“Wiener’s Laws,” from the article and from Aviation Week:

  • Every device creates its own opportunity for human error.
  • Exotic devices create exotic problems.
  • Digital devices tune out small errors while creating opportunities for large errors.
  • Invention is the mother of necessity.
  • Some problems have no solution.
  • It takes an airplane to bring out the worst in a pilot.
  • Whenever you solve a problem, you usually create one. You can only hope that the one you created is less critical than the one you eliminated.
  • You can never be too rich or too thin (Duchess of Windsor) or too careful about what you put into a digital flight-guidance system (Wiener).
  • Complacency? Don’t worry about it.
  • In aviation, there is no problem so great or so complex that it cannot be blamed on the pilot.
  • There is no simple solution out there waiting to be discovered, so don’t waste your time searching for it.
  • If at first you don’t succeed… try a new system or a different approach.
  • In God we trust. Everything else must be brought into your scan.
  • Any pilot who can be replaced by a computer should be.
  • Today’s nifty, voluntary system is tomorrow’s F.A.R.

Kudos to the author, William Langewiesche, for a well-researched and well-written piece.

Anne & Rich Interviewed about Human Factors

Anne and I are big proponents of making sure the world knows what human factors is all about (hence the blog).  Both of us were recently interviewed separately about human factors in general as well as our research areas.

The tone is very general and may give lay people a good sense of the breadth of human factors.  Plus, you can hear how we sound!

First, Anne was just interviewed for the radio show “Radio In Vivo”.

[Audio clip]

Late last year, I was interviewed about human factors and my research on the local public radio program Your Day:

[Audio clip]

Excerpts from the NASA ASRS

One of my students last semester (thanks, Ronney!) turned me on to the “Callback” publication from the NASA Aviation Safety Reporting System. These are almost all first-person stories written as case studies of errors and accidents or near-accidents. There aren’t enough of them to qualify for my list of neat databases, but they certainly make for interesting reading.

I’ve collected a few below to give a taste of the stories that are included. These are just the top-level descriptions; click through to read the first-person accounts.

From Issue 381, “Upside Down and Backwards”

  1. “An aircraft Mode Selector Panel that “looks the same” whether right side up or upside down, and that can be readily installed either way, is a good example of a problematic design. Confronted with an inverted panel, this Cessna 560 Captain found out what happens when the wrong button is in the right place.”
  2. “Without detailed instructions and clear notation, nearly symmetrical parts can be installed incorrectly. Faced with the replacement of such a part, this CRJ 700 Maintenance Technician wound up with a case of component “misorientation.””

From Issue 383, “When Practice Emergencies Go Bad”

  1. “…a C182 pilot performed a simulated engine failure while undergoing a practical examination. It appears that both the examiner and the examinee were so engrossed in the simulated emergency that they both tuned BEEEEP out BEEEEP the BEEEEP gear BEEEEP warning BEEEEP horn.”
  2. “When faced with a real engine failure, performing the Engine Secure Checklist reduces the chance of a fire on landing. However, actually performing the steps in the Engine Secure Checklist when the engine failure is not real can lead to a real problem.”

From Issue 382, “Fly the Airplane!”

  1. “A review of recent ASRS reports indicates that failure to follow one of the most basic tenets of flight continues to be a concern when pilots are faced with distractions or abnormal situations.”

From Issue 376, “The Fixation Factor”

  1. “The ability to maintain the “big picture” while completing individual, discrete tasks is one of the most critical aspects of working in the aviation environment. Preoccupation with one particular task can degrade the ability to detect other important information. This month’s CALLBACK looks at examples of how fixation adversely affects overall task management.”
  2. “Advanced navigation equipment can provide a wealth of readily available information, but as this Cirrus SR20 pilot learned, sometimes too much information can be a distraction.”

From Issue 375, “Motor Skills: Getting Off to a Good Start”

  1. “The Captain of an air carrier jet experienced a very hot start when distractions and failure to follow normal flow patterns altered the engine start sequence.”
  2. “This pilot was familiar with the proper procedures for hand-propping, but despite a conscientious effort, one critical assumption led to a nose-to-nose encounter.”

Photo credit smartjunco @ Flickr

Humans and Automation on the Colbert Report

Look! A human factors colleague on the Colbert Report! Does this mean we’re cool?

Dr. Missy Cummings, Associate Professor at MIT
Director of the Humans and Automation Lab

Verdict Reached for Air France Rio Crash

The BBC has reported that the incident analysis attributes the Air France crash that killed 228 people to a lack of pilot skill in dealing with a high-altitude stall.

Here is a link to the BEA Report from the Bureau d’Enquetes et d’Analyses. It’s a frightening read, as it gives a moment-by-moment account of the last minutes in the cockpit. No emergency was ever declared and there did not appear to be any mechanical failures. It appeared that the flight crew thought events were under control the entire time (despite the alarms).

Photo credit Vin Crosbie at Flickr.

For projectors, new technology means new training (and new errors!)

Mode errors! Coming soon to a theater near you? Have you ever forgotten to set your camera back to Auto from Portrait? How about not understanding what those modes mean? Apparently a similar phenomenon occurs in the professional world of movie theaters. There is a special lens used for 3-D movies, and when it is not removed for normal (2-D) movies, the brightness of the picture suffers. See the story below for details.

A movie lover’s plea: Let there be light: Many theaters misuse 3-D lenses to show 2-D films, squandering brightness, color

So why aren’t theater personnel simply removing the 3-D lenses? The answer is that it takes time, it costs money, and it requires technical know-how above the level of the average multiplex employee. James Bond, a Chicago-based projection guru who serves as technical expert for Roger Ebert’s Ebertfest, said issues with the Sonys are more than mechanical. Opening the projector alone involves security clearances and Internet passwords, “and if you don’t do it right, the machine will shut down on you.’’ The result, in his view, is that often the lens change isn’t made and “audiences are getting shortchanged.’’

I think “and if you don’t do it right, the machine will shut down on you” summed it up nicely!

Photo credit PatrickBeeson at Flickr.

Radiation: The Difficulty of Monitoring the Invisible – Post 2 of 2

This post continues the list of articles on HF-related errors in radiation-delivering healthcare devices.

As Technology Surges, Radiation Safeguards Lag

But the technology introduces its own risks: it has created new avenues for error in software and operation, and those mistakes can be more difficult to detect. As a result, a single error that becomes embedded in a treatment plan can be repeated in multiple radiation sessions.

A new linear accelerator had been set up incorrectly, and the hospital’s routine checks could not detect the error because they merely confirmed that the output had not changed from the first day.

In another case, an unnamed medical facility told federal officials in 2008 that Philips Healthcare made treatment planning software with an obscure, automatic default setting, causing a patient with tonsil cancer to be mistakenly irradiated 31 times in the optic nerve. “The default occurred without the knowledge of the physician or techs,” the facility said, according to F.D.A. records.

In a statement, Peter Reimer of Philips Healthcare said its software functioned as intended and that operator error caused the mistake.
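One of the excerpts above notes that the hospital’s routine checks “merely confirmed that the output had not changed from the first day.” A check like that can never catch an error that was present on day one. Here is a minimal, hypothetical sketch of the difference between checking against a baseline and checking against the prescription itself; the numbers, names, and tolerance are invented for illustration.

```python
# Hypothetical illustration of why a consistency check against a day-one
# baseline cannot catch a setup error that existed on day one.

PRESCRIBED_OUTPUT = 2.0      # what the treatment plan actually calls for (invented units)
TOLERANCE = 0.05

day_one_output = 2.6         # machine was set up incorrectly from the start
baseline = day_one_output    # the baseline silently inherits the error

def consistency_check(measured: float) -> bool:
    """Passes as long as output matches the (possibly wrong) baseline."""
    return abs(measured - baseline) <= TOLERANCE

def independent_check(measured: float) -> bool:
    """Passes only if output matches the prescription itself."""
    return abs(measured - PRESCRIBED_OUTPUT) <= TOLERANCE

print(consistency_check(2.6))   # True  -> the setup error goes undetected
print(independent_check(2.6))   # False -> the setup error is caught
```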

Radiation Offers New Cures, and Ways to Do Harm

The Times found that while this new technology allows doctors to more accurately attack tumors and reduce certain mistakes, its complexity has created new avenues for error — through software flaws, faulty programming, poor safety procedures or inadequate staffing and training.

X-Rays and Unshielded Infants

Asked about the case, Dr. David Keys, a board member of the American College of Medical Physics, said, “It takes less than 15 seconds to collimate [cover non-scanned portions of the body – AM] a baby,” adding: “It could be that the techs at Downstate were too busy. It could be that they were just sloppy or maybe they forgot their training.”

Other problems, according to Dr. Amodio’s e-mail, included using the wrong setting on a radiological device, which caused some premature babies to be “significantly overirradiated.”

Accidental Activation During Seat Adjustment on Plane

CNN posted this story about a co-pilot who accidentally bumped a control while adjusting his seat, sending the plane into a 26-degree dive. Disaster was averted only when the pilot returned from the restroom; the co-pilot apparently lacked the training to correct the error. From the article:

The aviation agency report concluded that the 25-year-old co-pilot had not been trained in the specific scenario the jet encountered and “probably had no clue to tackle this kind of emergency.”

Fortunately, disaster was averted, even though this story seems to have all the elements Reason’s Swiss Cheese model of accidents requires:

  • Organizational influences – training, and perhaps design of controls
  • Unsafe supervision – temporary absence of supervision
  • Preconditions for unsafe acts – inadequate experience for the complexity of the situation
  • Unsafe acts – Slip, mis-activation of a control

(These are, of course, guesses based on a short news article — I don’t pretend to know everything about this accident.)

Photo Credit grey_um