This story in the LA Times illustrates several important HF/usability issues. First, the importance of knowing what users already know before introducing new, seemingly “simple” technology or changing the way they currently do things (in this case, what people know about ignition systems and how they start their cars). Second, like the story about the alarms, it clearly illustrates that using a device under normal circumstances is very different from using it under stress.
The sleek Infiniti G37 Cindy Marsh bought last August was the car of her dreams, equipped with the latest keyless electronics technology that allows her to start the engine with the touch of a button.
But right away, the system gave her trouble. To get the engine started, she would sometimes have to tap the power button repeatedly. Sometimes it wouldn’t start unless she opened and closed the car doors, Marsh recalled.
She eventually adapted to the system’s quirks but said that even now she isn’t sure how to shut off the engine in an emergency.
In complaints to federal regulators, motorists have reported that they were unable to shut down engines during highway emergencies, including sudden acceleration events. In other cases, parked vehicles accidentally rolled away and engines were left running for hours without their owners realizing it.
And although traditional keys all work the same way and are universally understood by consumers, automakers have adopted different procedures for using the keyless ignition systems. As a result, owners may not know how to operate their own cars in an emergency, let alone a rented or borrowed car.
(Post image by http://www.flickr.com/photos/7755055@N04/2881681649)
This is the first post in our 2-part look at some HF programs. Anne’s post about North Carolina State University’s program can be found here.
Did you know that Human Factors is not only a fun blog, but something you could get a graduate degree in? The field is known by many names, but they all refer to more or less the same thing¹ (for example, Anne and I received our degrees in “engineering psychology”).
The degree is fairly generic and is defined further by specialization (for example, human-computer interaction and usability are closely associated with HF but by no means limited to it). Human factors graduates work in industry (evaluating and designing software and hardware), government, and research.
Unfortunately, we probably should have done these posts months ago, when students were researching and applying to programs, but better late than never! Still deciding whether to do the M.S. or the PhD? See this article (PDF link) provided by HFES. It’s old but still has great information.
Clemson University is located in Clemson, South Carolina which is situated in the foothills of the Blue Ridge Mountains (in the upper left corner of the state). The area is known as the “upstate” of South Carolina and is adjacent to one of the largest metropolitan areas of the state (Greenville-Spartanburg area).
The Department of Psychology at Clemson University offers both master’s and PhD degrees in Human Factors. Clemson’s program is newer than most (established in 1988) but has already graduated several PhD students who now work in academia and industry. The faculty have a wide variety of research interests. My own interests are pretty well covered by my posts on this blog.
We do not have rolling admissions; instead, applications are accepted yearly and acceptances are made in mid-to-late spring. It is probably a very good idea to identify faculty whose research sounds interesting to you and then ask them whether they are taking students that year.
Feel free to ask me questions about the program but the best person to ask is our graduate coordinator:
Dr. Robert Sinclair
Department of Psychology
418 Brackett Hall
Clemson, SC 29634
(864) 656-0358 (fax) email@example.com
¹similar terms to human factors: applied cognitive psychology, applied experimental psychology, engineering psychology
Sadly, more and more products seem set to suffer the same fate, as many of the objects we use daily are “replaced” by digital touch screens. Think of the iPhone, which fulfills the functions of a watch, phone, camera, clock, DVD and CD player, barometer, and so on. The skills of their UI designers will be just as important in determining how pleasurable (or otherwise) these devices are to use as old-fashioned considerations like how they look. And it’s those same designers we’re counting on to save us from the curse of over-complicated design.
It seems that every few years 3D technology is in the zeitgeist (currently with 3D movies), and user interfaces are not immune to the frenzy. However, there is quite a bit of past research on 3D interfaces (I won’t even scratch the surface, but see this simple Google Scholar search to start). Much, though not all, of it relates to navigation in virtual environments, while other research concerns the use of depth and perspective in displays. There are still many outstanding issues in the use of 3D in user interfaces, among them: interaction (input and output), effects on workload, and effects on learning.
In general, 3-dimensional displays (like a perspective view) are perceived as more natural and may require less mental integration than 2-dimensional displays (see this very well-researched U.S. FAA report on multifunction displays; warning: PDF). Some of the logic goes like this: when I view a 2D map, I usually turn it into a 3D representation in my head. Showing a 3D representation removes this step (in addition to showing more information). Compare the two types of information displays:
These images come from a user study examining user preferences in map presentation (2D or 3D). The research showed that it depends. The preference data were complex (see the paper): preferences were roughly evenly split overall, but respondents aged 26-40 preferred 3D maps, and males preferred 2D maps while females preferred 3D maps (which seems surprising).
Personally, I switch between the 2D and 3D views when I can, because each offers information the other does not. I like to examine hikes after the fact (collecting and mapping GPS data). See below; each view gives you different information:
The 2D view gives a good general overview and shows the intricacies of the trail, but no elevation information; the 3D view shows terrain but obscures the path (part of it is hidden behind the terrain).
A more subtle use of 3D on websites is parallax, which creates an illusion of depth. This website showcases some creative uses of the effect. Most websites use it purely for aesthetics; however, I noticed that the new Google Nexus One phone uses it in a subtle but useful way to indicate that you are on a different screen (a type of low-level feedback). See the video below. When the user slides horizontally to another screen, the animation of the galaxy changes perspective:
Embedded video (skip to 61 seconds in):
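The parallax effect is simple at its core: for the same scroll (or swipe) distance, layers meant to look far away are translated less than layers meant to look close. Here is a minimal, hypothetical sketch of that idea (the function name and the depth convention are mine, not taken from any of the sites or the phone above):

```typescript
// Hypothetical sketch of the parallax principle.
// A layer's apparent distance is encoded as depth in [0, 1]:
// depth = 0 scrolls with the page; depth = 1 never moves at all.
function parallaxOffset(scroll: number, depth: number): number {
  // Deeper layers are translated by a smaller fraction of the scroll,
  // which the eye reads as distance.
  return scroll * (1 - depth);
}

// Three layers responding to the same 100px scroll:
const layers = [
  { name: "foreground", depth: 0.0 }, // moves the full 100px
  { name: "midground", depth: 0.5 },  // moves 50px
  { name: "background", depth: 0.9 }, // moves only ~10px
];
for (const layer of layers) {
  console.log(layer.name, parallaxOffset(100, layer.depth));
}
```

A real implementation would hook this into scroll or touch events and apply the offset as a CSS transform on each layer, but the depth-scaled translation above is the whole trick.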
In some cases, when used appropriately, learning can be enhanced by 3D. Researchers Avi Parush and Dafna Berman (in a 2004 paper in the International Journal of Human-Computer Studies) were interested in the use of 3D interfaces for navigation and orientation in a virtual environment. The virtual environment contained the objects one would normally have on a computer desktop (e.g., files, applications). Does a 3D environment enhance learning and performance? They manipulated which of two kinds of aids subjects had to help them: landmarks or a route list. They found that both types of aid helped in the learning process, but a key point was that landmark placement (in either 2D or 3D) mattered a great deal.
One commercially available tool that gives users this kind of view of their computer is the BumpTop desktop:
The BumpTop desktop introduces a further complication (on top of 3D): the nature of the interaction. You are using a 2D surface (the touchpad or mouse) to navigate a 3D environment, and in some cases multi-touch gestures (more than one finger). Very cool… but useful? See for yourself:
I rode in a colleague’s new Volvo the other day and I love the environmental controls. The button lights up when active, showing where the airflow is going. Notice how the fan speed control is integrated into the air direction display, so each adds information to the other.
Compare them to the older Volvo buttons, which had a similar theme but were not quite as pleasing; they also lacked the integration of fan speed.
Last, the controls I just had on a rented Toyota Corolla in Las Vegas:
In this Toyota, you press the “mode” button and it cycles through an LCD to show you where the air will flow. I found it hard to see and I had to infer that “mode” was the right button. I was never able to figure out these controls while driving and had to ask my passenger to work them for me.*
In sum, I like that the Volvo buttons allow for all combinations (feet/torso, feet/defroster, torso/defroster). This is something I have always wanted in my previous cars. I also rate them high on the “emotional” side of design: they look great!
*As a side note, the entire frame for this interface glows light blue, ruining my night vision, and it was tiring to have in the periphery while driving. I prefer the all-red dashboard of my 2003.
Darin Ellis sends along this radio story about a woman whose robotic heart has a malfunction warning system that breaks the textbook HF rules of alarm design. I’ll let Darin explain the unfortunate issue:
This woman, who is living thanks to a robotic heart, related a story of the “heart” malfunctioning. Apparently, although it is not prone to malfunction, there is a very particular way to recover from the malfunctioning state [it warns you via an alarm].
She was (luckily) at home. The alarms went off blaring like crazy. Her young kids react to the alarm and start screaming and crying… Then she had to figure out what was wrong and try to remember how to fix it in the right order. With the kids AND the alarm still blaring. Anyone see what is wrong here, or is it just me?
I am sure she is very grateful for this “heart,” but the story made me cringe. I am sure that when your heart literally stops, you don’t need alarms blaring to tell you something is wrong.
This post on Smashing Magazine about vertical navigation had me thinking about the book Anne and I are writing (manuscript due this Friday; panicking…I’m a 10 on the Wong-Baker scale). In one of the chapters I discuss tab navigation. When I was looking for a particularly bad example of the use of tabs I remembered Amazon’s website circa 2000. Fortunately, the Wayback Machine had preserved the travesty of UI navigation for posterity:
There are a grand total of 15 tabs, and they are not really in alphabetical order (they seem to be grouped). Amazon can’t be blamed–we probably didn’t know as much then as we know now (I can’t believe it was a decade ago!). But browsing the Wayback entries for Amazon’s homepage through the years certainly shows evolution and an iterative process toward the current Amazon navigation scheme, which eschews tabs almost entirely in favor of a cascading, vertical navigation:
Do you have any examples of particularly good or bad examples of tab navigation?
I don’t visit the doctor frequently (less than once a year), but during a visit last year, as part of the paperwork, I encountered a question about how much pain I felt (shown above).
This is the Wong-Baker FACES Pain Rating Scale (which seems to be available online). I thought this was a great way to ask about general pain severity that would be useful for children or people whose primary language isn’t English–precisely why it was developed. We humans have a finely tuned sense for faces, and probably for the emotions they express; this scale takes advantage of both with caricatured faces and somewhat exaggerated emotional expressions.
James Rubinstein sends along this post about a 32-inch LCD TV presumably designed for older users. It has features such as a dramatically simplified remote control, fewer wires, and a shut-off timer. [Engadget]
In the “why didn’t they do this sooner” category is an ethnography application for the iPhone called Everyday Lives (warning: link opens iTunes). It lets you record audio, video, images, and other data in the field (via UXforward).