The article “The Human Factor” in Vanity Fair is two years old, but since I can’t believe I missed posting it — here it is! It’s a riveting read with details of the Air France Flight 447 accident and intelligent discussion of the impact automation has on human performance. Dr. Nadine Sarter is interviewed and I learned of a list of flight-specific “laws” developed by Dr. Earl Wiener, a past-president of HFES.
Anne and I are big proponents of making sure the world knows what human factors is all about (hence the blog). Both of us were recently interviewed separately about human factors in general as well as our research areas.
The tone is very general and may give lay people a good sense of the breadth of human factors. Plus, you can hear how we sound!
First, Anne was just interviewed for the radio show “Radio In Vivo”.
Late last year, I was interviewed about human factors and my research on the local public radio program Your Day:
One of my students last semester (thanks, Ronney!) turned me on to the “Callback” publication from the NASA Aviation Safety Reporting System. These are almost all first-person stories written as case studies of errors and accidents or near-accidents. There aren’t enough of them to fall under my list of neat databases, but they certainly make for interesting reading.
I’ve collected a few below to give a taste of the stories that are included. These are just the top-level descriptions – click through to read the first-person accounts.
“An aircraft Mode Selector Panel that “looks the same” whether right side up or upside down, and that can be readily installed either way, is a good example of a problematic design. Confronted with an inverted panel, this Cessna 560 Captain found out what happens when the wrong button is in the right place.”
“Without detailed instructions and clear notation, nearly symmetrical parts can be installed incorrectly. Faced with the replacement of such a part, this CRJ 700 Maintenance Technician wound up with a case of component “misorientation.””
“…a C182 pilot performed a simulated engine failure while undergoing a practical examination. It appears that both the examiner and the examinee were so engrossed in the simulated emergency that they both tuned BEEEEP out BEEEEP the BEEEEP gear BEEEEP warning BEEEEP horn.”
“When faced with a real engine failure, performing the Engine Secure Checklist reduces the chance of a fire on landing. However, actually performing the steps in the Engine Secure Checklist when the engine failure is not real can lead to a real problem.”
“The ability to maintain the “big picture” while completing individual, discrete tasks is one of the most critical aspects of working in the aviation environment. Preoccupation with one particular task can degrade the ability to detect other important information. This month’s CALLBACK looks at examples of how fixation adversely affects overall task management.”
“Advanced navigation equipment can provide a wealth of readily available information, but as this Cirrus SR20 pilot learned, sometimes too much information can be a distraction.”
From Issue 375, “Motor Skills: Getting Off to a Good Start”:
“The Captain of an air carrier jet experienced a very hot start when distractions and failure to follow normal flow patterns altered the engine start sequence.”
“This pilot was familiar with the proper procedures for hand-propping, but despite a conscientious effort, one critical assumption led to a nose-to-nose encounter.”
The BBC has reported that the incident analysis of the Air France crash that killed 228 people attributes the accident to a lack of pilot skill in dealing with a high-altitude stall.
Here is a link to the BEA Report from the Bureau d’Enquêtes et d’Analyses. It’s a frightening read, as they give a moment-by-moment analysis of the last minutes in the cockpit. No emergency was ever noted and there did not appear to be any mechanical failures. It appeared that the flight crew thought events were under control the entire time (despite the alarms).
Mode errors! Coming soon to a theater near you? Have you ever forgotten to set your camera back to Auto from Portrait? How about not understanding what those modes mean? Apparently a similar phenomenon occurs in the professional world of movie theaters. There is a special lens filter used for 3-D movies, and when it is not removed for normal movies, the brightness of the film suffers. See the story below for details.
So why aren’t theater personnel simply removing the 3-D lenses? The answer is that it takes time, it costs money, and it requires technical know-how above the level of the average multiplex employee. James Bond, a Chicago-based projection guru who serves as technical expert for Roger Ebert’s Ebertfest, said issues with the Sonys are more than mechanical. Opening the projector alone involves security clearances and Internet passwords, “and if you don’t do it right, the machine will shut down on you.” The result, in his view, is that often the lens change isn’t made and “audiences are getting shortchanged.”
I think “and if you don’t do it right, the machine will shut down on you” summed it up nicely!
But the technology introduces its own risks: it has created new avenues for error in software and operation, and those mistakes can be more difficult to detect. As a result, a single error that becomes embedded in a treatment plan can be repeated in multiple radiation sessions.
A new linear accelerator had been set up incorrectly, and the hospital’s routine checks could not detect the error because they merely confirmed that the output had not changed from the first day.
In another case, an unnamed medical facility told federal officials in 2008 that Philips Healthcare made treatment planning software with an obscure, automatic default setting, causing a patient with tonsil cancer to be mistakenly irradiated 31 times in the optic nerve. “The default occurred without the knowledge of the physician or techs,” the facility said, according to F.D.A. records.
In a statement, Peter Reimer of Philips Healthcare said its software functioned as intended and that operator error caused the mistake.
The Times found that while this new technology allows doctors to more accurately attack tumors and reduce certain mistakes, its complexity has created new avenues for error — through software flaws, faulty programming, poor safety procedures or inadequate staffing and training.
Asked about the case, Dr. David Keys, a board member of the American College of Medical Physics, said, “It takes less than 15 seconds to collimate [cover non-scanned portions of the body – AM] a baby,” adding: “It could be that the techs at Downstate were too busy. It could be that they were just sloppy or maybe they forgot their training.”
Other problems, according to Dr. Amodio’s e-mail, included using the wrong setting on a radiological device, which caused some premature babies to be “significantly overirradiated.”
CNN posted this story where a co-pilot accidentally bumped a control while adjusting his seat, sending the plane into a 26-degree dive. Disaster was averted only when the pilot returned from the restroom, as apparently the co-pilot lacked the training to correct the error. From the article:
The aviation agency report concluded that the 25-year-old co-pilot had not been trained in the specific scenario the jet encountered and “probably had no clue to tackle this kind of emergency.”
I don’t feel that I am in control of the information I share on Facebook, and of the information my friends share… FB has total control of (some of) my information, and I don’t like that.
It’s not that Yohann didn’t like Facebook–he did. He liked being able to see his friends’ latest photos and keep up with status updates. The problem was that Yohann (who is, by the way, a very smart, tech-savvy guy) felt unable to use the Facebook user interface to effectively maintain control of his information.
The root of this problem could be one of two things. It could be that Facebook has adopted the “evil interface” strategy (discussed by Rich previously on the human factors blog), where an interface is not designed to help users accomplish their goals easily (a key tenet of human factors) but is instead designed to encourage (or trick) users into behaving the way the interface designer wants (even if it’s not what the user really wants). Clearly, this strategy is problematic for a number of reasons, not the least of which, from Facebook’s perspective, is that users will stop using Facebook altogether if they feel tricked or not in control.
A more optimistic perspective is that the problem of privacy on Facebook is a human factors one: the privacy settings on Facebook need to be redesigned because they are currently not easy to use. Here are a few human factors issues I’ve noticed.
Lack of Feedback
In general, there is very little feedback provided to users about the privacy level of different pieces of information on their Facebook profile. For example, by default, Facebook now considers your name, profile picture, gender, current city, networks, friend list, and Pages to all be public information. However, no feedback is given to users as they enter or change this information to indicate that this is considered public information.
While Facebook did introduce a preview function which shows a preview of what information a Facebook friend would see should they visit your profile (which is a great idea!), the preview function does not provide feedback to a user about what information they are sharing publicly or with apps. For example, you can’t type “Yelp” into the preview window to see what information Facebook would share with Yelp through Facebook connect.
No Training (Instructions)
Finally, Facebook provides no training and only minimal instructions for users on how to manage their privacy settings.
Fortunately, there are some relatively simple human factors solutions that could help users manage their privacy without writing their own Dear John letter to Facebook.
In terms of interface changes to increase feedback to users, Facebook could, for example, notify users when they are entering information that Facebook considers public by placing an icon beside the text box. That way, users would get immediate feedback about which information would be shared publicly.
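To make that concrete, here is a minimal sketch of what such feedback could look like in a web page. This is purely illustrative and not Facebook’s actual code; the list of public field names and the markup are hypothetical assumptions:

```typescript
// Hypothetical sketch only: the field names and markup below are invented
// for illustration and are not Facebook's actual implementation.
const PUBLIC_FIELDS = new Set(["name", "current-city", "gender"]);

function attachPublicIndicator(input: HTMLInputElement): void {
  if (!PUBLIC_FIELDS.has(input.name)) return;

  // Place a small "public" icon immediately after the text box.
  const icon = document.createElement("span");
  icon.textContent = "🌐"; // globe: visible to everyone
  icon.title = "This information is public and visible to everyone.";
  icon.style.display = "none";
  input.insertAdjacentElement("afterend", icon);

  // Reveal the icon as soon as the user types, so the feedback
  // arrives while the information is being entered.
  input.addEventListener("input", () => {
    icon.style.display = input.value ? "inline" : "none";
  });
}

document
  .querySelectorAll<HTMLInputElement>("input")
  .forEach(attachPublicIndicator);
```

The point is less the specific widget than the human factors principle: feedback about visibility should appear at the moment the information is entered, not buried in a separate settings page.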
Perhaps if Facebook decides to take a human factors approach to privacy in the future, Yohann will re-friend Facebook.
Anne was invited to be a panelist at SXSW on Friday, March 12 at 5:00 PM. SXSW is a yearly music, film, and interactive media festival held in Austin, TX. The title of the interactive panel is With Great Power Comes Great Responsibility: The Future of Video Games. Here is a description:
Video games are more popular than ever, and new games are delivering all kinds of social benefits, from video-game therapy for treating PTSD, to sims for training surgeons, to alternate-reality games that actually bring people together in real life. Will video games be a positive force for people and society in the future (as they arguably are today)? This panel is co-sponsored by Discover Magazine and the National Science Foundation.
Take a look at the event page for more information on the other panelists. If you happen to be there, drop by and say hello.
She promises to document as many HF-relevant aspects of the conference as possible. Here are just some of the talks she’s planning on attending:
History of the button
Long distance UX
Is the brain the ultimate computer interface?
mind control: psychology for the web
what guys are doing to get more girls in tech
social gaming: lessons from the pioneers