Square Cash is a great service – it allows you to send money via an email with no service charge if you’re using your debit card. You can receive money without entering a PIN. I use it all the time to divide up restaurant bills among my friends. That said, I found a usability issue yesterday that I wanted to share.
I needed to link my debit card to the app, so I followed their very simple instructions for entry.
The first screen asks for the card number. The number pad is in telephone order rather than calculator-keypad order. This is on a phone, so that makes sense, even though I’m much more used to entering these numbers on a keyboard.
Next, the expiration date. On my card, the expiration date is 09/16/2016*, so I start to enter it.
Here is the screen as you start to enter the date:
I then proceeded to enter 09/16 as I read it off my card, then the CVV, and got an error message about an incorrect card number. Tried again. Same. I did this four times before I realized that the expiration date was month/year. It isn’t as though I’d never seen this before, or never been asked to enter just the month and year from a card, so I thought hard about what tricked me.
I concluded it was the difference between the second and third screens – the guidance is there before you start typing, but as soon as you enter any digit for the date, the guidance disappears. Since I was looking down at my card, I just entered what I saw and didn’t think to check – especially since the field called for ##/##, which matched the month and day on my card, not ##/####, which could only have been a month and year.
You are welcome to blame the user for this one, but it would be a small fix to keep the background guide visible during entry.
*No, I’m not dumb enough to put my real card number or expiration date in the pictures for this post. 🙂
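To make the small fix concrete: beyond keeping the placeholder visible during entry, the app could validate the format and say what went wrong rather than reporting a generic incorrect-card error. Here is a rough sketch in Python (the function name, messages, and hard-coded current year are all hypothetical – this is not Square’s actual logic):

```python
import re

def expiry_hint(entry, current_yy=14):
    """Return '' if `entry` looks like a valid MM/YY expiration,
    or a specific hint about what went wrong."""
    m = re.fullmatch(r"(\d{1,2})/(\d{2})", entry)
    if m is None:
        return "Enter the expiration as MM/YY, e.g. 09/16."
    month, yy = int(m.group(1)), int(m.group(2))
    if not 1 <= month <= 12:
        return "The first number should be the month (01-12)."
    if yy < current_yy:
        # A 'year' in the past often means the user typed the
        # day from the card instead of the two-digit year.
        return "That year is in the past - did you enter the day instead?"
    return ""
```

Paired with a format guide that stays visible while you type, the user gets both the expected format up front and a specific reason when entry fails.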
Gretchen Addi, an associate partner at IDEO, hired Beskind. Addi says when Beskind is in a room, young designers do think differently. For example, Addi says IDEO is working with a Japanese company on glasses to replace bifocals. With a simple hand gesture, the glasses will turn from the farsighted prescription to the nearsighted one.
Initially, the designers wanted to put small changeable batteries in the new glasses. Beskind pointed out to them that old fingers are not that nimble.
“It really caused the design team to reflect,” Addi says. They realized they could design the glasses in a way that avoided the battery problem. “Maybe it’s just a USB connection. Are there ways that we can think about this differently?”
There are several wonderful take-home messages:
Creative and fulfilling work can extend late into the lifetime
Aging does not just bring limitations, it also extends perspective and wisdom
Designing for aging doesn’t detract from a product but can enhance it for people of all ages
Having a person with such perspective on a design team changes the perspective and thinking of the rest of the team – the core tenet of participatory design
The big news in tech last week was the unveiling of the Apple Watch. I think it is a nice moment to discuss a range of human factors topics. (This topic may elicit strong feelings for or against Apple or the idea of a smartwatch, but let’s keep it about the science.)
The first is technology adoption/acceptance. Lots of people were probably scratching their heads asking, “who wears a watch, nowadays?” But you do see lots of people wearing fitness bands. Superficially, that contrast seems to demonstrate the Technology Acceptance Model (TAM) in action. TAM is a way to try to understand when people will adopt new technology. It boils down the essential factors to usability (does it seem easy to use?) and usefulness (does it seem like it will help my work or life?).
Fitness bands check both of the above boxes: since they are essentially single-function devices, they are relatively easy to use, and tracking fitness is perceived as useful by many people.
Back to the Watch: it may also check off both boxes. It certainly appears easy to use (though we do not know yet), and because it has fitness-tracking functions plus many others via apps, it may well be perceived as useful by the same crowd that buys fitness bands.
The next topic that got me excited was the discussion of the so-called digital crown (shown below). Anne and I have previously studied the contrasts between touch screens and rotary knobs for a variety of computing tasks. Having both choices allows the user to select the best input device for the task: touch for pushing big on-screen buttons and large-scale movement, and the knob for precise, linear movement without obscuring the screen. Using a knob is certainly easier than a touch screen if you have shaky hands or are riding in a bumpy cab.
Two small items of note were also included in the Watch. The first is the two-finger gesture on the watch face used to send a heartbeat to another user – the same gesture many people intuitively use when they want to feel their own heartbeat.
Finally, the Watch can send animated emoji to other users. What was noteworthy is the ability to manipulate both the eyes and the mouth of the emoji characters. I couldn’t find the literature, but I recall that there are cross-cultural differences in how people use and interpret emoji: Western users tend to focus on the mouth while Eastern users tend to focus on the eyes (if you know what reference I’m talking about, or if I’m misremembering, feel free to comment).
There’s so much I haven’t brought up (haptic and multi-modal feedback, user interface design, automation, voice input and of course privacy)!
This guest post is from graduate students Haley Vaigneur and Bliss Altenhoff. Haley and Bliss compared the usability of two fitness trackers as part of a graduate course in health informatics taught by Kelly Caine.
Wearable fitness trackers allow users to track and monitor their health. While these devices originated as a way for doctors to monitor chronically ill patients’ vitals, they have recently been developed and marketed to a more general, health-conscious market. Equipped with advanced sensors such as accelerometers, these devices automatically track users’ activity and sleep, which can then be compared with users’ logged fitness goals and daily diet. Users can then use these statistics to help create or maintain a healthier lifestyle. Two examples of such devices are the Jawbone Up and Fitbit Flex, shown above.
Wearable technology is popular and has the potential to dramatically impact health (e.g., long-term health and activity data tracking, immediate syncing with Electronic Health Records (EHRs)). But these benefits can only be realized if users are able to effectively use and understand these devices. This was the motivation for examining two of the most popular fitness trackers, the Jawbone Up and the Fitbit Flex, and their accompanying smartphone apps.
This study examined the usability of these two devices and their accompanying smartphone apps by having 14 participants (7 for the Jawbone Up, 7 for the Fitbit Flex) perform a think-aloud test on five key features: Setup, Setting Goals, Tracking Diet, Tracking Activity, and Setting an Alarm. Participants then kept the wearable for three days and were encouraged to incorporate it into their normal routine. On the third day, participants completed the System Usability Scale (SUS) survey and an informal interview about their experiences using the wearable.
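For readers unfamiliar with it, the SUS is a ten-item questionnaire with five response options per item, scored to a single 0–100 number. A minimal sketch of the standard scoring procedure (odd-numbered items are positively worded, even-numbered items negatively worded):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten
    item responses, each on a 1-5 agreement scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each 1-5")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items contribute (score - 1); even items (5 - score).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to 0-100

# Strong agreement on positive items, strong disagreement on
# negative items, gives a perfect score:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

Because the raw sum is multiplied by 2.5, SUS scores are not percentages; they are usually interpreted against published benchmarks (scores around 68 are often treated as average).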
Some of the key Jawbone UP findings were:
Adding food or drink items was somewhat difficult due to unintuitive organization and unpredictable bugs. For example, one participant attempted to add a food item by scanning the bar code of a Lunchable, but the app added a Dr. Pepper to the log.
Participants struggled to find the alarm settings, with one conducting a general web search for help to understand the Smart Sleep Window settings and how to save alarm settings.
None of the participants were able to figure out how to communicate to the band or app that they would like to begin a workout. They didn’t realize that the Stopwatch menu option was intended to time the workout.
Some of the key findings of the FitBit Flex were:
Participants felt that the wristband (when using the appropriately sized band) was neither uncomfortable nor revealing, and they were proud to wear it because it made them feel healthy.
At first, users had a difficult time figuring out where in the app to set their health goals. Their instinct was to look on the app homepage, or Dashboard, but the settings were under the Account tab.
Some users had difficulty putting on the wristband, and several noted that it fell off unexpectedly. Users were also confused about where to “tap” the wristband to activate it, based on the instructions in the app. The picture can appear to instruct the user to tap below the black screen, when the user actually needs to tap the screen directly and firmly.
Users did not realize that after turning on Bluetooth on their phones, they needed to return to the app to tell the phone and wristband to begin syncing. They also noted that leaving Bluetooth on all day drained their phone battery.
Based on time per task and number of errors, the Fitbit Flex performed better than the Jawbone Up on the five tasks. Users’ ultimate trust in the data, willingness to continue using the wearable, and general satisfaction with each wearable were heavily influenced by their initial (first-day) experiences. The positive initial think-aloud results for the Fitbit Flex were also consistent with a more positive later experience and stronger acceptance of the wearable.
This study found that there is still much room for improvement in the usability of the accompanying smartphone apps. A major concern for these kinds of devices is keeping users interested and motivated, which is easily undermined by confusing or cumbersome designs. Improving the human factors of the apps alongside the capabilities of the wearables themselves holds great potential for greater user satisfaction, and thus more long-term use.
While activity-tracking wearables are currently most popular with more tech-savvy, active people, these devices should be designed for users of all ages and experience levels. These devices could change health monitoring drastically and give people the power and ability to make better choices and live healthier lifestyles.
Haley Vaigneur is a graduate student in Industrial Engineering at Clemson University. Her concentration is Human Factors and Ergonomics, emphasizing on research in the healthcare field.
Bliss Altenhoff is a Doctoral Candidate studying Human Factors Psychology at Clemson University, where she received her M.S. in Applied Psychology in 2012. She is a member of the Perception and Action (PAC) lab, where her research is concentrated on enhancing human perception and performance by enriching perceptual display technologies for laparoscopic surgeons.
ACKNOWLEDGMENTS This material is based upon work supported by the National Science Foundation under Grant No. 1314342. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.
I was reading articles the other day and came across a site that, as many do, reformatted for my phone. Almost all reformatted-for-mobile sites are terrible, but this one is my favorite.
You cannot scroll through the 21-page article by moving your finger up and down, as you would on a normal website. The only way to change pages is via the horizontal slider at the bottom. Good luck trying to move it precisely enough to advance a single page! And yes, moving the slider left and right does move the page up and down.
A recently released report, completed in March 2013, reveals the process behind the creation of Healthcare.gov. Hindsight is always 20/20, but we have also worked hard to establish best practices for considering both engineering and the user in software development, and those contributions need to be valued, especially in large-scale projects. Looking through the slides, one thing I note is that even this improved approach barely mentions the end users of the website. One slide states “Identify consumer paths; review and modify vignettes.” The two examples given are users with more or less complex needs when signing up for insurance. I see no mention of involving actual users prior to release.
Consultants noted there was no clear leader in charge of this project, which we now know contributed to its disastrous release. The report also called for “end-to-end testing” of the full implementation – something we now know never happened.
Some of this may fall on us, for not being convincing enough that human factors methods are worth the investment. How much would the public be willing to pay for a solid usability team to work with the website developers?
It’s summer and we (along with some of you) are taking a break. But here’s a list of interesting usability/HF-related things that have crossed my path:
After much complaining, Ford is bringing back physical knobs to its MyFord Touch in-car controls. Anne and I worked on some research (PDF) in our past lives as graduate students that directly compared touch-only interfaces to knob-based interfaces, so it’s nice to see it is still a major issue; if only Ford had read our 9-year-old paper 🙂
Trucks driving under very low bridges is such a large problem in Australia that they are deploying a really novel and clever warning system. A waterfall that projects a sign that’s hard to miss!
I had heard that the Tesla Model S (the luxury electric car) had a giant touch screen as one of the main interfaces for secondary car functions and always wondered what that might be like from a human factors/usability perspective. Physical knobs and switches, unlike interface widgets, give a tactile sensation and do not change location on the dashboard.
This post is an interesting examination of the unique dashboard:
Think about a car’s dashboard for a second. It’s populated with analog controls: dials, knobs, and levers, all of which control some car subsystem such as temperature, audio, or navigation. These analog dials, while old, have two features: tactility and physical analogy. Respectively, this means you can feel for a control, and you have an intuition for how the control’s mechanical action affects your car (e.g., counterclockwise on AC increases temperature). These small functions provide a very, very important feature: they allow the driver to keep his or her eyes on the road.
Except for the privileged few who have an extraordinary kinesthetic sense of where their hands are, the Model S’s control scheme is an accident waiting to happen. Hell, most of us can barely type with two hands on an iPhone. Now a Model S driver has to manage all car subsystems on a touchscreen with one hand while driving.
The solution, however, may not be heads-up displays or augmented reality, as the author suggests (citing the HUD in the BMW).
While those displays allow the eyes to remain on the road, the display itself is always in the way – a persistent distraction. Also, paying attention to the HUD means your attention will not be on the road, and what doesn’t get paid attention to doesn’t exist.
Hello readers, and sorry for the unintentional hiatus on the blog. Anne and I have been recovering from the just-completed semester only to be thrown back into another busy semester. As we adjust, feast on this potpourri post of interesting HF-related items from the past week.
In today’s HF potpourri we have three very interesting and loosely related stories:
The BBC looks at the rise of websites that seem to talk to us in a very informal, casual way. Clearly, the effect on the user is not what was intended:
The difference is the use of my name. I also have a problem with people excessively using my name. I feel it gives them some power over me and overuse implies disingenuousness. Like when you ring a call centre where they seem obsessed with saying your name.