Category Archives: usability

Apple Watch Human Factors

The big news in tech last week was the unveiling of the Apple Watch. I think it is a good moment to discuss a range of human factors topics. (This topic may elicit strong feelings for or against Apple or the idea of a smartwatch, but let’s keep it about the science.)

The first is technology adoption/acceptance. Lots of people were probably scratching their heads asking, “who wears a watch nowadays?” But you do see lots of people wearing fitness bands. Superficially, that contrast seems to demonstrate the Technology Acceptance Model (TAM) in action. TAM is a model for understanding when people will adopt new technology. It boils the essential factors down to perceived ease of use (does it seem easy to use?) and perceived usefulness (does it seem like it will help my work or life?).

Fitness bands check both of the above boxes: since they are essentially single-function devices, they are relatively easy to use, and tracking fitness is perceived as useful by many people.

Back to the Watch: it may also check both of the above boxes. It certainly appears easy to use (though we do not know yet), and because it has fitness-tracking functions plus many others via apps, it may well be perceived as useful by the same crowd that buys fitness bands.

The next topic that got me excited was the discussion of the so-called digital crown (shown below). Anne and I have previously studied the contrasts between touch screens and rotary knobs for a variety of computing tasks. Having both choices allows the user to select the best input device for the task: touch for pushing big on-screen buttons and large-scale movement, and the knob for precise, linear movement without obscuring the screen. Using a knob is certainly easier than a touch screen if you have shaky hands or are riding in a bumpy cab.


One small item of note in the Watch was the use of the two-finger gesture on the watch face to send a heartbeat to another user, the same gesture many people intuitively use when they want to feel their own heartbeat.

Finally, the Watch has the ability to send animated emoji to other users. What was noteworthy is the ability to manipulate both the eyes and mouth of the emoji characters. I couldn’t find the literature, but I recall reading that there are cross-cultural differences in how people use and interpret emoji: Western users tend to focus on the mouth while Eastern users tend to focus on the eyes (if you know what reference I’m talking about, or if I’m mis-remembering, feel free to comment).



There’s so much I haven’t brought up (haptic and multi-modal feedback, user interface design, automation, voice input and of course privacy)!



Wearable Fitness Trackers: A Comparative Usability Evaluation

This guest post is from graduate students Haley Vaigneur and Bliss Altenhoff. Haley and Bliss compared the usability of two fitness trackers as part of a graduate course in health informatics taught by Kelly Caine.


Wearable fitness trackers allow users to track and monitor their health. While these devices originated as a way for doctors to monitor chronically ill patients’ vitals, they have recently been developed and marketed to a more general, health-conscious market. Equipped with sensors such as accelerometers, the devices automatically track users’ activity and sleep, which can then be compared with their logged fitness goals and daily diet. Users can then use these statistics to help create or maintain a healthier lifestyle. Two examples of such devices are the Jawbone Up and Fitbit Flex, shown above.

Wearable technology is popular and has the potential to dramatically impact health (e.g., long-term health and activity data tracking, immediate syncing with Electronic Health Records (EHRs)). But these benefits can only be realized if the user is able to effectively use and understand these devices. This was the motivation for focusing on two of the most popular fitness trackers, the Jawbone Up and the Fitbit Flex, and their accompanying smartphone apps.

This study examined the usability of these two devices and their accompanying smartphone apps by having 14 participants (7 for the Jawbone Up, 7 for the Fitbit Flex) perform a think-aloud test on five key features: Setup, Setting Goals, Tracking Diet, Tracking Activity, and Setting an Alarm. Participants then kept the wearable for three days and were encouraged to incorporate it into their normal routine. On the third day, participants completed the System Usability Scale (SUS) survey and an informal interview regarding their experiences with the wearable.
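As background for readers unfamiliar with the SUS: its ten 1–5 Likert responses are conventionally converted to a single 0–100 score. A minimal sketch of the standard scoring procedure (this is the generic SUS formula, not code from the study):

```python
def sus_score(responses):
    """Convert ten SUS item responses (each 1-5) to a 0-100 score.

    Odd-numbered items are positively worded (score = response - 1);
    even-numbered items are negatively worded (score = 5 - response).
    The summed item scores are multiplied by 2.5 to span 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses are on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A fully neutral respondent (all 3s) lands at the midpoint.
print(sus_score([3] * 10))  # → 50.0
```

Mean SUS scores across participants are what usually gets compared between devices, as in the Jawbone Up vs. Fitbit Flex comparison here.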

Some of the key Jawbone UP findings were:

  1. Adding food or drink items was somewhat difficult due to unintuitive organization and unpredictable bugs. For example, one participant attempted to add a food item by scanning the bar code of a Lunchable, but the app added a Dr. Pepper to the log.
  2. Participants struggled to find the alarm settings, with one conducting a general web search for help to understand the Smart Sleep Window settings and how to save alarm settings.
  3. None of the participants were able to figure out how to communicate to the band or app that they would like to begin a workout. They didn’t realize that the Stopwatch menu option was intended to time the workout.

Some of the key findings of the FitBit Flex were:

  1. Participants felt that the wristband (when using the appropriately sized band) was not uncomfortable or revealing, and they were proud to wear it because it made them feel healthy.
  2. At first, users had a difficult time figuring out where in the app to set their health goals. Their instinct was to look on the app homepage (the Dashboard), but the settings were under the Account tab.
  3. Some users had difficulty putting on the wristband, and several noted that it fell off unexpectedly. Users were also confused about where to “tap” the wristband to activate it, based on the instructions given in the app. The picture can appear to instruct the user to tap below the black screen, when the user actually needs to tap the screen directly, and firmly.
  4. Users did not realize that after turning Bluetooth on their phone, they needed to return to the app to tell the phone and wristband to begin syncing. They also noted that leaving Bluetooth on all day drained their phone battery.


Based on time per task and number of errors, the Fitbit Flex performed better than the Jawbone Up on the five tasks. Users’ ultimate trust in the data, willingness to continue using the wearable, and general satisfaction were heavily influenced by their initial (first-day) experiences. The positive initial think-aloud results for the Fitbit Flex were also consistent with a more positive later experience and stronger acceptance of the wearable.

This study found that there is still much room for improvement in the usability of the accompanying smartphone apps. A major concern for these kinds of devices is keeping user interest and motivation, which can easily be lost through confusing or cumbersome designs. By improving the human factors of the apps alongside the capabilities of the wearables themselves, there is great potential for greater user satisfaction, and thus more long-term use.

While activity-tracking wearables are currently most popular with tech-savvy, active people, these devices should be designed to be usable by people of all ages and experience levels. These devices could change health monitoring drastically and give people the power to make better choices and live healthier lifestyles.

Haley Vaigneur is a graduate student in Industrial Engineering at Clemson University. Her concentration is Human Factors and Ergonomics, with an emphasis on research in the healthcare field.

Bliss Altenhoff is a Doctoral Candidate studying Human Factors Psychology at Clemson University, where she received her M.S. in Applied Psychology in 2012. She is a member of the Perception and Action (PAC) lab, where her research is concentrated on enhancing human perception and performance by enriching perceptual display technologies for laparoscopic surgeons.

This material is based upon work supported by the National Science Foundation under Grant No. 1314342. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Worst Mobile Interface Ever

I was reading articles the other day and came across a site that, as many do, reformatted for my phone. Almost all reformatted-for-mobile sites are terrible, but this one is my favorite.
You cannot scroll through the 21-page article by moving your finger up and down, as you would on a typical website. The only way to change pages is via the horizontal slider at the bottom. Good luck trying to move it so slightly that it advances only one page! And yes, moving the slider left and right moves the page up and down.

Usability process not used for ACA website

A recently released report, done in March 2013, reveals the process of creating the ACA website. Hindsight is always 20/20, but we’ve also worked hard to establish best practices for considering both engineering and the user in software development. These contributions need to be valued, especially for large-scale projects. After looking through the slides, one thing I note is that even this improved approach barely mentions the end users of the website. There is one slide that states “Identify consumer paths; review and modify vignettes.” The two examples of this are users who have more or less complex needs when signing up for insurance. I don’t see any mention of involving actual users prior to release.

The NPR write-up states:

Consultants noted there was no clear leader in charge of this project, which we now know contributed to its disastrous release. And there was no “end-to-end testing” of its full implementation, something we now know never happened.

Some of this may fall on us, for not being convincing enough that human factors methods are worth the investment. How much would the public be willing to pay for a solid usability team to work with the website developers?

Potpourri–Lazy Summer Edition

It’s summer and we (along with some of you) are taking a break.  But here’s a list of interesting usability/HF-related things that have crossed my path:

  • After much complaining, Ford is bringing back physical knobs in their MyFord Touch in-car controls. Anne and I worked on some research (PDF) in our past lives as graduate students that directly compared touch-only interfaces to knob-based interfaces, so it’s nice to see it is still a major issue; if only Ford had read our 9-year-old paper 🙂
  • Trucks driving under very low bridges are such a large problem in Australia that a really novel and clever warning system is being deployed: a warning sign projected onto a curtain of falling water, which is hard to miss!
  • Apple will introduce their next version of OS X in the fall. One of the features I’m most excited about is system-level tag support. Tags allow users to organize their files regardless of location or type. I’m particularly interested in personal, single-user tagging (compared to collaborative tagging like that used in Flickr) as it appears to benefit older adults’ information organization and retrieval (PDF). This pleases me.


Another edition of potpourri where I surface some of the more interesting HF/usability links that have crossed my path.

Usability of a Glass Dashboard?


I had heard that the Tesla Model S (the luxury electric car) had a giant touch screen as one of the main interfaces for secondary car functions and always wondered what that might be like from a human factors/usability perspective. Physical knobs and switches, unlike interface widgets, give a tactile sensation and do not change location on the dashboard.

This post is an interesting examination of the unique dashboard:

Think about a car’s dashboard for a second. It’s populated with analog controls: dials, knobs, and levers, all of which control some car subsystem such as temperature, audio, or navigation. These analog dials, while old, have two features: tactility and physical analogy. Respectively, this means you can feel for a control, and you have an intuition for how the control’s mechanical action affects your car (e.g., counterclockwise on AC increases temperature). These small functions provide a very, very important feature: they allow the driver to keep his or her eyes on the road.

Except for the privileged few that have an extraordinary kinesthetic sense of where their hands are, the Model S’s control scheme is an accident waiting to happen. Hell, most of us can barely type with two hands on an iPhone. Now a Model S driver has to manage all car subsystems on a touchscreen with one hand while driving.

The solution, however, may not be heads-up displays or augmented reality, as the author suggests (citing the HUD in the BMW).


While those displays allow the eyes to remain on the road, the HUD is always in the way: a persistent distraction. Also, paying attention to the HUD means your attention will not be on the road, and what doesn’t get paid attention to doesn’t exist.

Begging robots, overly familiar websites, and the power of the unconscious?

Hello readers, and sorry for the unintentional hiatus on the blog. Anne and I have been recovering from the just-completed semester only to be thrown back into another busy semester.  As we adjust, feast on this potpourri post of interesting HF-related items from the past week.

In today’s HF potpourri we have three very interesting and loosely related stories:

  • There seems to be a bit of a resurgence in the study of anthropomorphism in HF/computer science, primarily because…ROBOTS. It’s a topic I’ve written about [PDF] in the context of human-automation interaction. The topic has reached mainstream awareness because NPR just released a story on it.
  • The BBC looks at the rise of websites that seem to talk to us in a very informal, casual way.  Clearly, the effect on the user is not what was intended:

The difference is the use of my name. I also have a problem with people excessively using my name. I feel it gives them some power over me and overuse implies disingenuousness. Like when you ring a call centre where they seem obsessed with saying your name.

Paper prototyping made easier

Paper prototyping is a common usability technique to quickly test out an interaction before expending too much effort on programming or designing.  The value in paper prototyping is that with extremely low effort, you can test the interaction rather than the appearance of an interface.

I just came across a great iOS app that lets you add some real interactivity to your paper prototypes:  POP Prototyping on Paper.  You simply sketch out your screen, take a picture using your iPhone camera, and then add interactivity.  It’s a brilliantly simple idea.


Principles of Animation in UI Design

Smashing Magazine posts a great article on principles for including animation in mobile UIs. I think the use of animation is underestimated by some HF people because it’s hard to quantify the “performance benefit” (e.g., animations may not increase the speed at which a user completes a task).

Some notable examples of animation are Apple’s infamous page bounce-back, the page curl in e-book apps, and the bouncing Dock icons on a Mac (or expanding/minimizing windows on Mac/PC). Difficulty in quantifying objective benefits may lead some to dismiss animations as superfluous and unnecessarily ornate. I wholeheartedly disagree. Animations provide a fluidity that makes interfaces feel responsive, even delightful.
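Much of that fluidity comes from easing curves: instead of moving linearly, animated elements decelerate (or bounce), which reads as natural motion to the eye. A minimal sketch of one common timing function (the specific curve and pixel values here are illustrative, not taken from the article):

```python
def ease_out_cubic(t):
    """Map linear progress t in [0, 1] to eased progress in [0, 1].

    Motion starts fast and decelerates toward the end, which is the
    feel behind effects like slide-ins and window minimizing.
    """
    return 1 - (1 - t) ** 3

# Sample frame positions for a hypothetical 300 px slide animation.
for step in range(6):
    t = step / 5
    print(f"t={t:.1f} -> x={300 * ease_out_cubic(t):.0f}px")
```

Note how successive positions bunch up near the end of the travel: that deceleration, not raw speed, is what makes the interface feel responsive rather than mechanical.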

The article provides a lot of reasons for the benefits and most appropriate use of animation. It’s typical Smashing Magazine (i.e., LOOONG) so save it to read later!
