Health Care Comes Home reviews the state of current knowledge and practice about many aspects of health care in residential settings and explores the short- and long-term effects of emerging trends and technologies. By evaluating existing systems, the book identifies design problems and imbalances between technological system demands and the capabilities of users. Health Care Comes Home recommends critical steps to improve health care in the home. The book’s recommendations cover the regulation of health care technologies, proper training and preparation for people who provide in-home care, and how existing housing can be modified and new accessible housing can be better designed for residential health care. The book also identifies knowledge gaps in the field and how these can be addressed through research and development initiatives.
Consumer Health Information Technology in the Home introduces designers and developers to the practical realities and complexities of managing health at home. It provides guidance and human factors design considerations that will help designers and developers create consumer health IT applications that are useful resources to achieve better health.
A capacitive button has no place on a phone, and the people pushing it into the marketplace are over-fetishizing visual design to the detriment of the overall experience, which is a bit of a pet peeve of mine.
Mode errors! Coming soon to a theater near you? Have you ever forgotten to set your camera back to Auto from Portrait? How about not understanding what those modes mean? Apparently a similar phenomenon occurs in the professional world of movie theaters. A special lens filter is used for 3-D movies, and when it is not removed for normal movies, the brightness of the film suffers. See the story below for details.
So why aren’t theater personnel simply removing the 3-D lenses? The answer is that it takes time, it costs money, and it requires technical know-how above the level of the average multiplex employee. James Bond, a Chicago-based projection guru who serves as technical expert for Roger Ebert’s Ebertfest, said issues with the Sonys are more than mechanical. Opening the projector alone involves security clearances and Internet passwords, “and if you don’t do it right, the machine will shut down on you.” The result, in his view, is that often the lens change isn’t made and “audiences are getting shortchanged.”
I think “and if you don’t do it right, the machine will shut down on you” summed it up nicely!
This editorial from MSN Autos nicely summarizes a topic we’ve covered many times: in-car technology interfering with driving. The central problem appears to be that in-car interfaces are designed in isolation, devoid of the context in which they will actually be used (while driving). As a result, the designs demand a great deal of attention and concentration.
Expert on human-automation interaction Dr. John D. Lee is quoted in the article.
But most automotive experts agree that screen and voice-control systems are here to stay. There are guidelines for good interactive system design; the Alliance of Automobile Manufacturers published a 90-page document outlining the best practices for the industry in 2006. It’s long-winded and a bit dated, but Lee of the University of Wisconsin-Madison summarizes the basic wisdom of the document in a few points:
Complex displays that require the driver to search for information using glances longer than two seconds should be avoided.
The interaction should not “time out” or force the driver to attend continually to the task. The driver should be able to interrupt the task easily and return attention to the road.
Visual information should be placed near the driver’s line of sight.
The display should be easily readable with text and icons that can be seen at a glance.
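Guidelines like these lend themselves to simple automated screening of usability-test data. As a toy sketch (the data format, function name, and sample numbers are invented here, not taken from the Alliance document), the two-second glance rule could be checked like this:

```python
# A toy illustration of screening eye-tracking data against the
# two-second glance guideline. The data format and function are
# invented for illustration; they are not from the Alliance document.

GLANCE_LIMIT_S = 2.0

def long_glances(durations_s: list[float]) -> list[float]:
    """Return the glance durations that exceed the two-second guideline."""
    return [d for d in durations_s if d > GLANCE_LIMIT_S]

# Glance durations (seconds) recorded while a driver searched a menu screen:
print(long_glances([0.8, 1.4, 2.6, 1.1, 3.2]))  # prints [2.6, 3.2]
```

In practice, of course, the hard part is collecting the glance data, not filtering it.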
When we interact with a touch screen, we expect a certain “directness”; that is, if I grab something and push up, I expect that thing to move up. Like dragging a web page up or down. However, did you ever notice that on a track pad (like on a laptop), the direction is reversed?
Trackpad: fingers move DOWN, position indicator goes DOWN, web page goes UP
Touchscreen: fingers move DOWN on surface, position indicator (on far right) goes UP, web page goes DOWN
It’s so subtle that perhaps you’ve never noticed it, so I made a video:
I sometimes use this inconsistency (position indicator goes down, fingers go down, but screen moves up) in my class as an example of a violation of an old display design guideline called the principle of the moving part. It suggests that when you have an indicator on a display, it should move in the same direction as the thing it’s indicating. The touchscreen/trackpad issue is more complicated because you also have an input incompatibility (fingers and display moving in opposition).
The difference between the touchscreen and trackpad is in what the fingers are “controlling”: the screen or the position indicator?
Am I obsessing over a trivial issue? Probably; this is something you just get used to. But I seem not to be alone in noticing it. Apple, in the next version of its operating system, will make trackpad navigation consistent with touchscreen navigation (fingers move DOWN on surface, position indicator (on far right) goes UP, web page goes DOWN). Fortunately for some users, it is a user-selectable feature:
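For readers who think in code, the two mappings can be captured in a few lines. This is a hypothetical sketch, not how any operating system actually implements scrolling:

```python
# A hypothetical sketch of the two scroll mappings. finger_dy > 0 means
# the fingers moved DOWN on the input surface; a positive return value
# means the page content moves DOWN on screen.

def content_delta(finger_dy: int, natural: bool) -> int:
    """Map finger movement to page-content movement."""
    if natural:
        # Touchscreen-style ("natural") scrolling: the content tracks
        # the fingers, so the page moves down with a downward swipe.
        return finger_dy
    # Classic trackpad scrolling: the position indicator tracks the
    # fingers, so the content moves the opposite way.
    return -finger_dy

assert content_delta(10, natural=True) == 10    # fingers down, page down
assert content_delta(10, natural=False) == -10  # fingers down, page up
```

The single boolean is the whole difference: what the fingers are "controlling" is either the content itself or the position indicator.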
Peter Hancock, writing in the January issue of The Ergonomist, describes the hidden dangers imposed by rapidly advancing automotive technology (noise and vibration suppression, keyless ignition). Engine noise, vibration, and the mechanical key all provide useful information that the car is still on. Removing these cues could result in mode errors:
In previous generations of vehicles, leaving the car ‘on’ as you exit tends also to provide a series of visual, auditory and even tactile kinesthetic cues as to its status. Old-time vehicles tended to make a considerable noise, their exhaust was often visible and the whole vehicle tended to vibrate noticeably while the engine was on. Over the immediate past decades, designers and engineers have sought ways to reduce these sources of disturbance since they were perceived as being perhaps unpleasant.
However, these nominally adverse effects contained problematic yet important informational content. Modern vehicles now rarely belch smoke from the exhaust. Efforts have also been very successful at reducing both noise and vibration such that modern vehicles have now indeed become whisper quiet.
It might initially seem that leaving your engine running is more of an inconvenience than a significant threat. This is simply incorrect. The cases in the United States which have so far accrued from this form of design-induced error have been fatal.
A vehicle ‘running’ in an enclosed space with direct access for the exhaust to the airflow into your house is indeed a deadly trap. Sadly, a number of individuals now appear to have fallen into that trap. This example may be one of these adverse but unintentional design outcomes.
There does not appear to be an online copy so I’m attaching the PDF here (thanks Rick!)
(post image from flickr user IceNineJon)
Below is an excerpt of Chapter 3 from our book. You can read an excerpt of Chapter 1 here. You can also enter to win one of two copies. The book is available where fine books are sold or directly from our publisher, CRC Press. Until January 31, 2011, you can get 20% off the cover price when you purchase directly from CRC Press using this link and this code: 810DE.
Chapter Contents (excerpt is section 3.8)
3.1 How Hearing Changes With Age
3.1.1 Pitch Perception
3.1.3 Sound Localization
3.1.4 Sound Compression
3.1.5 Mp3s, Cell Phones and Other Compressed Audio
3.1.6 Background Noise
3.2 Interim Summary
3.3 Accessibility Aids
3.3.1 Hearing Aids
3.3.2 Telephony Services
3.4 Interim Summary
3.5 Human Language
3.5.2 Speech Rate
3.5.3 Environmental Support
3.6 Interim Summary
3.7 Designing Audio Displays
3.7.3 Passive Voice
3.7.5 Number and Order of Options
3.7.6 Alerts
3.8 In Practice: The Auditory Interface
3.9 General Design Guidelines
3.10 Suggested Readings
3.8 In Practice: The Auditory Interface
The textual representation of the menu shown in Figure 3.7 appears very simple, certainly simpler than the nine-option menus some companies offer. In an audio format, however, this deceptively simple menu becomes complex. Remember, the listener cannot glance back to any part of the menu that he or she missed, and must hold each option in memory while comparing every new option to find the “best” selection for completing the task.
In this menu the user is greeted and offered a positive message. What should follow is either an instruction on how to proceed in the system or the most common choice. Instead, the user is directed, for a very particular activity (a loan advance, probably not the most commonly chosen option), to visit a website. The wording of this information is lengthy and confusing, and there is little information on how to access the website or what someone without internet access should do. This first option sets up confusion and delays the understanding of subsequent options and commands. However, an audio menu cannot be paused to let the user mentally catch up.
Next comes a command to choose an option; however, this is not directly followed by the options. Instead, the listener is informed about their privacy rights, another interruption of user expectancies for the system. This is followed by a very typical menu of choices organized in a way that is useful to the bank.
However, how a bank organizes choices (by departments or by its computer system) is probably not how a user organizes them. The general categories defined by the bank are: User Account, Salary Advance, Loans, Mortgages, and Other. Because some users think of their mortgage as a kind of loan, it would make sense to list the part (Mortgages) before the whole (Loans), so that those users do not choose “Loans” before they ever hear the mortgage option.
A more useful order would be to group the portions of this menu into categories: rhetorical information, instruction, and responsive information. All rhetorical information (welcome, thanks, privacy, etc.) belongs up front. Be cautious, however, as lengthy rhetorical information can produce inattention: the user may tune out before the instruction and options arrive.
The following steps constitute one example of a re-design and testing plan.
Step 1: Make a list of all options currently offered or desired in the phone system
Step 2: Examine previous phone system data and select the 4 most commonly chosen options
Step 3: Create representative tasks for most common options and for least common options
Step 4: Recruit older users and perform a card sort with all options. Have users write the expected functions under each option. What kind of functions and information do they expect to find under “Account Options?”
Step 5: Compare the number of groups and options within each group to the 4 most commonly chosen options
Step 6: Create new interface with top 4 options, with user-defined functions under each option. Include other top level options under “Other”
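Step 2 is straightforward to automate when call logs are available. A minimal sketch, assuming an invented log format with one selected option per call record:

```python
# A minimal sketch of Step 2: tally previous phone-system logs to find
# the four most commonly chosen options. The log format (one selected
# option per call record) and the option names are invented for
# illustration.
from collections import Counter

call_log = [
    "account balance", "loans", "account balance", "mortgages",
    "account balance", "salary advance", "loans", "account balance",
    "mortgages", "loans", "salary advance", "account balance",
]

top_four = [option for option, _ in Counter(call_log).most_common(4)]
print(top_four)  # most frequently chosen options, most common first
```

The resulting list would then be compared against the groups users form in the card sort (Step 5).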
Another design recommendation is to include natural language triggered by user responses. For example, if a user presses 3 or says “Loans,” the response from the system could be “Ok, you said loans, right? Let me get that.” (The system should listen for a “no” at this time). This allows the user time to think and provides environmental support by reminding the user of the next step. This is desirable despite the time it adds.
The re-designed menu in Figure 3.8 shows significant improvements over the first system. This menu offers more options (7), but they are presented in a manageable way. First, the menu offers voice response and monitors for responses during presentation of the options. If the system thought the user said “loans,” it replies with “That was loans, right?” If the user then says “no,” the system repeats the original menu with a natural language introduction: “Ok, let me say the options again. Insurance,….” The system offers an explanation for its actions that prepares the user for a response (and prepares them for the result of their response), such as “I’ll need to ask you a few questions so I can transfer your call.”
Second, notice that the menu changes based on non-response. Rather than repeating the same options that produced no response from the user, the interface tries different tactics. If no voice responses occur, the system offers button press options, but does not clutter the initial interface with these less natural inputs. Last, notice how the options with button presses change as they progress down the line: the first two options include extra information: “You can say ‘new account’ or press 1. Quotes press 2.” Then the reminders to say or press disappear, as the user is only interested in the options. This is a nice implementation of menu simplification via natural language and a good example of how to move from overall context to list format.
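The confirm-and-fall-back behavior described above can be sketched as a small decision function. This is a hypothetical illustration; the option names and prompt wording are invented, and a real IVR would sit on top of a speech recognizer and a telephony stack:

```python
# A hypothetical sketch of the confirmation and non-response fallback
# behavior. Option names and prompts are invented for illustration.
from typing import Optional

OPTIONS = ["insurance", "new account", "quotes", "loans"]

def next_prompt(heard: Optional[str], failed_attempts: int) -> str:
    """Return the system's next utterance given what (if anything) was heard."""
    if heard in OPTIONS:
        # Natural-language confirmation; the system should listen for a "no".
        return f"That was {heard}, right? Let me get that."
    if heard is None and failed_attempts == 0:
        # First non-response: repeat with a natural-language introduction.
        return "Ok, let me say the options again. " + ", ".join(OPTIONS) + "."
    # Repeated non-response: change tactics and offer button presses,
    # tapering the "you can say ... or press" reminder after the first two.
    return ("You can say 'insurance' or press 1. "
            "You can say 'new account' or press 2. "
            "Quotes press 3. Loans press 4.")
```

The key design point is that the menu changes on non-response rather than simply repeating itself, and the button-press reminders taper off once the pattern is established.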
The benefits of such a menu are many and extend beyond the hearing chapter of this book. Such improvements are helpful for working memory, language comprehension, and decision making as discussed in Chapter 4.
CNN posted this story, in which a co-pilot accidentally bumped a control while adjusting his seat, sending the plane into a 26-degree dive. Disaster was averted only by the pilot returning from the restroom, as apparently the co-pilot lacked the training to correct the error. From the article:
The aviation agency report concluded that the 25-year-old co-pilot had not been trained in the specific scenario the jet encountered and “probably had no clue to tackle this kind of emergency.”