Category Archives: hci

Intuitive Interfaces for Software Developers

Here is a link to some neat new research being done by my colleagues at NCSU. It’s about a tool that restructures (“refactors”) software code as it’s being developed, reorganizing how the code is written so developers can investigate bugs and add features without changing what the code does or introducing new errors. Dr. Emerson Murphy-Hill developed the interface for invoking these refactoring tools and published on it this past semester.

From the article:

Making Refactoring Tools More Attractive For Programmers

“The researchers designed the marking menus so that the refactoring tools are laid out in a way that makes sense to programmers. For example, tools that have opposite functions appear opposite each other in the marking menu. And tools that have similar functions in different contexts will appear in the same place on their respective marking menus.

Early testing shows that programmers were able to grasp the marking menu process quickly, and the layout of the tools within the menus was intuitive.”
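
For readers who don’t write code, here is a minimal sketch of the kind of transformation these refactoring tools automate, an “extract method” refactoring. The example and names are my own illustration, not taken from the NCSU tool; the point is that the structure of the code changes while its behavior stays exactly the same.

```python
# Hypothetical before/after of an "extract method" refactoring.
# Only the structure changes; both versions print the same total.

# Before: the total-price calculation is buried inside a larger function.
def print_receipt_before(items):
    total = 0.0
    for price, quantity in items:
        total += price * quantity
    total *= 1.07  # illustrative 7% sales tax
    print(f"Total due: ${total:.2f}")

# After: the same calculation extracted into a named, reusable helper.
def calculate_total(items, tax_rate=0.07):
    subtotal = sum(price * quantity for price, quantity in items)
    return subtotal * (1 + tax_rate)

def print_receipt_after(items):
    print(f"Total due: ${calculate_total(items):.2f}")

if __name__ == "__main__":
    cart = [(2.50, 4), (1.25, 2)]
    print_receipt_before(cart)  # Total due: $13.38
    print_receipt_after(cart)   # Total due: $13.38
```

Because a refactoring tool performs transformations like this one mechanically, it can reshape the code without the slips a programmer might make doing the same edit by hand, which is exactly why the researchers want invoking the tools to feel effortless.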


“I wasn’t trying to make a computer interface, I was just trying to make a drum” – NPR interviews Bill Buxton

NPR interviews Bill Buxton about the technology and Sherry Turkle about its social impact.

The Touchy-Feely Future Of Technology

Excerpts:

“I wasn’t trying to make a computer interface, I was just trying to make a drum,” Buxton tells NPR’s Robert Siegel. “Did I envision what was going to happen today, that it would be in everybody’s pocket — in their smartphone? Absolutely not. Did we realize that things were going to be different, that you could do things that we never imagined? … Absolutely.”

Today, Buxton is known as a pioneer in human-computer interaction, a field of computer science that has seen a spike in consumer demand thanks to a new, seemingly ubiquitous technology: Touch.”

“Turkle says that’s because touch-screen devices appeal to a sentiment that pretty much everyone can relate to: the desire to be a kid again.

“[The] fantasy of using your body to control the virtual is a child’s fantasy of their body being connected to the world,” Turkle says. “That’s the child’s earliest experience of the world and it kind of gets broken up by the reality that you’re separate from the world. And what these phones do is bring back that fantasy in the most primitive way.”

And Turkle warns that living in that fantasy world could mean missing out on the real world around you.”


Photo credit Bejadin.info at Flickr.

Beyond Touch: the future of interaction

Follow the link to read “A Brief Rant on the Future of Interaction Design” by Bret Victor. The briefest summary: we overuse simple touch in our visions of the future, when we could be drawing on many other cues, such as weight and balance.

From the post:

If you’re with me so far, maybe I can nudge you one step further. Look down at your hands. Are they attached to anything? Yes — you’ve got arms! And shoulders, and a torso, and legs, and feet! And they all move!

Any dancer or doctor knows full well what an incredibly expressive device your body is. 300 joints! 600 muscles! Hundreds of degrees of freedom!

The next time you make breakfast, pay attention to the exquisitely intricate choreography of opening cupboards and pouring the milk — notice how your limbs move in space, how effortlessly you use your weight and balance. The only reason your mind doesn’t explode every morning from the sheer awesomeness of your balletic achievement is that everyone else in the world can do this as well.

With an entire body at your command, do you seriously think the Future Of Interaction should be a single finger?


Photo credit jstarpl @ Flickr

Coming to APA 2011: A Conversation Hour on Use of Electronic Health Records in Clinical Practice

Drs. Kelly Caine (of guest post fame) and Dennis Morrison will be presenting on human factors considerations for the design and use of electronic health records. Audience participation is welcome as they discuss this important topic. See the abstract below.

In this conversation hour we will discuss the use of electronic health records in clinical practice. Specifically, we will focus on how, when designed using human factors methods, electronic health records may be used to support evidence based practice in clinical settings. We will begin by giving a brief overview of the current state of electronic health records in use in behavioral health settings, as well as outline the potential future uses of such records. Next, we will provide an opportunity for the audience members to ask questions, thus allowing members to guide the discussion to the issues most relevant to them. At the conclusion of the session, participants will have a broader understanding of the role of electronic health records in clinical practice as well as a deeper understanding of the specific issues they face in their practice. In addition, we hope to use this conversation hour as a starting point to generate additional discussions and collaborations on the use of electronic health records in clinical practice, potentially resulting in an agenda for future research in the area of electronic health records in clinical behavioral health practice.

Kelly Caine is the Principal Research Scientist in the Center for Law, Ethics, and Applied Research in Health Information (CLEAR).

Dennis Morrison is the CEO of the non-profit Centerstone Research Institute.

Check out the full Division 21 program.

Coming to APA in August: Information Foraging in the Social Web

Peter Pirolli (currently a Research Fellow at Xerox PARC) will be presenting on Information Foraging Theory. See below for an abstract of his upcoming talk.

Information Foraging Theory is a theory of human-information interaction that aims to explain and predict how people will best shape themselves to their information environments, and how information environments can best be shaped to people.  The approach involves a kind of reverse engineering in which the analyst asks (a) what is the nature of the task and information environments, (b) why is a given system a good solution to the problem, and (c) how is that “ideal” solution realized (approximated) by mechanism.

Typically, the key steps in developing a model of information foraging involve: (a) a rational analysis of the task and information environment (often drawing on optimal foraging theory from biology) and (b) a computational production system model of the cognitive structure of the task. I will briefly review work on individual information seeking, and then focus on how this work is being expanded to studies of information production and sense-making in technology-mediated social systems such as wikis, social tagging, social network sites, and twitter.

In recent years, we have been extending our studies to deal with social interactions on the Web (e.g., wikis, tagging systems, twitter). This has led to studies of how people assess source credibility (expertise, trustworthiness, bias) and how user interfaces might affect such judgments.
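
For the curious, here is a minimal numerical sketch of the classic “patch model” that Information Foraging Theory borrows from optimal foraging theory: an information seeker should leave the current patch (say, a web site) once the payoff of staying falls below what could be gained by moving on. The gain curve and parameter values below are illustrative assumptions of mine, not Pirolli’s models.

```python
import math

def gain(t, g_max=10.0, rate=0.5):
    """Information gained after t minutes in a patch: diminishing returns.
    (An illustrative curve; real models are fit to observed behavior.)"""
    return g_max * (1.0 - math.exp(-rate * t))

def overall_rate(t_in_patch, t_between=2.0):
    """Average rate of gain when each patch visit lasts t_in_patch minutes
    and moving between patches (e.g., following links) costs t_between."""
    return gain(t_in_patch) / (t_between + t_in_patch)

# Charnov's marginal value theorem says to leave a patch once the marginal
# rate of gain drops to the best achievable overall rate; here we simply
# search for the patch-residence time that maximizes the overall rate.
best_t = max((t / 100.0 for t in range(1, 3001)), key=overall_rate)
print(f"Leave each patch after ~{best_t:.2f} minutes "
      f"(overall rate: {overall_rate(best_t):.2f} units/minute)")
```

Pirolli’s actual models are far richer than this, building computational cognitive models around “information scent,” but the rate-of-gain trade-off above is the core intuition.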

Check out the full Division 21 program.

Resources: Human Factors Design Considerations in Home Health Technology

The National Academies and the Agency for Healthcare Research and Quality have just released two publications.

The first, Health Care Comes Home, is a 200-page report:

Health Care Comes Home reviews the state of current knowledge and practice about many aspects of health care in residential settings and explores the short- and long-term effects of emerging trends and technologies. By evaluating existing systems, the book identifies design problems and imbalances between technological system demands and the capabilities of users. Health Care Comes Home recommends critical steps to improve health care in the home. The book’s recommendations cover the regulation of health care technologies, proper training and preparation for people who provide in-home care, and how existing housing can be modified and new accessible housing can be better designed for residential health care. The book also identifies knowledge gaps in the field and how these can be addressed through research and development initiatives.

The second, Consumer Health Information Technology in the Home: A Guide for Human Factors Design Considerations, is a free designer’s guide:

Consumer Health Information Technology in the Home introduces designers and developers to the practical realities and complexities of managing health at home. It provides guidance and human factors design considerations that will help designers and developers create consumer health IT applications that are useful resources to achieve better health.

When poor usability costs you your job?

You may have heard that an employee who managed “social media” for Chrysler accidentally posted on Chrysler’s Twitter account about *ahem* poor driving in Chrysler’s home city of Detroit. Click here for the original story.

The guy who sent the tweet blames the program he used to manage multiple Twitter accounts. The article calls it a “glitch,” which would not necessarily be a usability issue, but it seems more likely to be a problem with understanding which account a tweet will be sent from when multiple accounts are accessible.

From the article on WXYZ:

Scott is convinced a software glitch on a program called Tweetdeck led to the tweet being sent out on the wrong account. He says he deleted the Chrysler account from the program, but somehow it still went out.

His attorney, Michael Dezsi, says Scott has a case.

“A simple web search shows a number of other users have encountered the same issues,” Dezsi said.

Action News made contact with a Tweetdeck spokesman via email about the claim.

“We are not familiar with the error you describe–tweets sent from a deleted account–but we normally would try to replicate it to make sure there is no problem on our end (although it sounds very unlikely that this is a TweetDeck issue). If you know the type of hardware, platform and TweetDeck version we could check further,” said Sam Mandel, Tweetdeck executive vice president of business operations.

People like TweetDeck, though many admit the interface is complex.

This looks like another case where people feel more justified when the error comes from a software bug or engineering glitch than when it comes from a usability problem.
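
To make that point concrete, here is a hypothetical sketch of how this kind of mode error can arise in a multi-account client. It is not TweetDeck’s actual code, just an illustration of why “which account will this post from?” is a design problem rather than simple user carelessness.

```python
# Hypothetical multi-account posting client (not TweetDeck's real design).
# If "deleting" an account only hides it while the composer quietly falls
# back to a default sender, a post can still go out under the wrong identity.

class MultiAccountClient:
    def __init__(self, accounts):
        self.accounts = list(accounts)   # accounts that are authorized
        self.hidden = set()              # accounts the user "deleted" from view
        self.default = self.accounts[0]  # silent default sender

    def delete_account(self, account):
        # The trap: the account disappears from the UI but stays authorized.
        self.hidden.add(account)

    def send(self, text, account=None):
        sender = account or self.default  # no visible confirmation of the sender
        print(f"{sender}: {text}")

client = MultiAccountClient(["@BrandAccount", "@personal"])
client.delete_account("@BrandAccount")
client.send("venting about traffic...")  # still posts as @BrandAccount
```

A design that forced a visible, explicit choice of sender before every post would turn this silent failure into something the user could catch.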

Apple, UCD, and Innovation – A Guest Post by Travis Bowles

This guest post is in response to the article User-Led Innovation Can’t Create Breakthroughs: Just ask Apple and IKEA at fastcodesign.com.

From the article:

One evening, well into the night, we asked some of our friends on the Apple design team about their view of user-centric design. Their answer? “It’s all bullshit and hot air created to sell consulting projects and to give insecure managers a false sense of security. At Apple, we don’t waste our time asking users, we build our brand through creating great products we believe people will love.”

I’d argue that someone at Apple noticed how Microsoft had been building tablets since 2002 and hadn’t quite gained traction. Apple tends to step in and refine the work of others, often picking the perfect moment in time when the capabilities of a technology and users’ willingness to accept a technology intersect. The question of how they choose those moments is hotly debated – if it were more apparent to the world, their competition wouldn’t always be following them into the market, often AFTER pioneering the first generation of the market Apple now dominates (see Microsoft with tablets, Creative Labs with MP3 players before the iPod, Xerox with GUIs).

I believe there is a common misunderstanding that User Centered Design (UCD) is asking users what they need and building it. If that were UCD, then we’d just let marketing and sales departments design products with the feature lists provided by customers and, in many cases, that would be a sufficient source of information to drive an evolutionary product design process. However, I would argue that proper full-spectrum user centered design *leads* to revolutionary product designs. The problem lies in the assumption that user centered design is building what the user thinks he/she wants.

Jonathan Ive is fond of a quote from Henry Ford that I use in explaining the differences between customer feedback and user experience research – “If I had asked people what they needed, they would have said faster horses.” I think this sums up the Apple philosophy that they are creating things so new and cool that future users wouldn’t even know what technologies were available, let alone be able to assemble them into a new category of device. The mistake here is believing that the only tool available to the UCD practitioner is asking users “what should we build for you?”

1910 Model T Ford, Salt Lake City, Utah

What Ive ignores is that, although Henry Ford didn’t rely on potential customers to define his product, he did learn about their needs and try to accommodate them. The original Model Ts were designed to run on ethanol for the benefit of farmers, who could make their own fuel from the land (as they did for their horses), and to be easily serviced by owners in the field (as they serviced their other farm equipment), in contrast to some more expensive competitors. He didn’t ask his users to design his product, but he informed his designs by learning about their environment, goals, and needs.

On a smaller scale, I’ve seen failures of this sort during user testing, when some participants will offer direct design advice, proposing that you place this button here or add that feature there. A lot of researchers get frustrated and dismiss this sort of input, correctly asserting that the participant is not there to redesign the UI. I do, however, find that follow-up questions on these design suggestions often produce interesting data points concerning user expectations, needs, and even mental models of the system. I wonder sometimes if some designers and researchers overreact because they feel their value is undermined when they acknowledge any value in the ideas of potential users.

One last thought I have is that the new crop of development-centric, massively networked products presents new challenges to the value of UCD. Startups have always moved quickly, and they’ve always run the risk of losing a race to release a product if they spend too much time “polishing” their product before an initial release. As a result, user experiences and feature depth were usually poor to start with and improved over time as the user base increased. The major changes in user experience were made while the number of users forced to adjust was still small, and by the time wide-scale adoption was realized, changes generally settled into enhancements and logical upgrades (I’m speaking largely of software here, but consumer electronics also fit).

However, recently, to be successful a product needs to become ubiquitous almost upon release. Between social networks and newly established cycles of technology obsolescence,* there is little time to build up a base of users to try the early versions of your product before widespread acceptance. One might assume this would motivate companies to work harder to use UCD to create good designs before that initial release, but this has not been the strategy applied by the biggest winners. Instead, I believe successful companies are setting out to provide one or a handful of killer features, often wrapped in a barely serviceable user experience, to as many people as quickly as possible. Rather than risk missing out on a key moment, they skip the needs gathering and early stage user research and take their best shot instead. If they are successful and widely adopted, the reasoning goes, they can go back and improve the experience later with direct user feedback.

Of course, this approach runs into a lot of practical issues. For instance, there is an installed user base who may rebel when confronted with change (although if you provide an irreplaceable device/service, people will complain but still remain your customers). Additionally, once the company is successful, it has the dual role of providing an improved future experience and maintaining the current experience, splitting resources and attention. For this reason, companies often find it hard to actually follow through on step two of the plan, where step one is “get customers” and step two is “make the product better for customers.” In this phase, iterative refinements of the product design get bogged down in new features, and there is no time for conducting full-spectrum user research.

Based on these factors, I do wonder whether, outside of giant corporations or products with decade-spanning development (such as aircraft, medical technologies, or anything the government watches over), we are likely to see a rapid decline in user research for innovative product designs, and in early product development for most products. My intuition is that we will see an increase in demand for practitioners capable of research, design, and implementation, but with less specialized training in user research and user centered design. The only “concrete” evidence I can offer is my anecdotal observation that the majority of interesting user research opportunities I’ve found have specifically requested a developer/engineer able to conduct research or complete designs in addition to implementing them.

* Products such as netbooks, iPads, iPods and smartphones are as expensive as appliances we used to expect 10+ years of service from. The average washing machine costs less than an iPad, but you can expect the iPad to be out of date in about two years. People would be up in arms if their washing machines (or even microwaves, at a quarter of the price) stopped performing after two years.

Travis Bowles, M.S., is a usability consultant in San Francisco specializing in enterprise software, novel consumer electronics, and web interfaces.

(post photo credit: flickr user raneko)

Designer of movie UIs to design real UIs

We’ve discussed Mark Coleran before and his fantastical work on the fake user interfaces you see in movies (see the reel below). According to this Fast Company blog post, he will now have a hand in designing real interfaces.

But Coleran doesn’t just throw out the rule books on user experience and “human interface guidelines.” In fact, because many of his clients know his movie work, he spends a lot of time talking them out of doing something like Children of Men or The Bourne Ultimatum. “One of my biggest frustrations is when people will say, ‘We have these specifications and requirements, now execute it just like we saw in the movie,'” he says. “What they don’t realize is that the requirements for those movie FUIs were completely unlike the ones that they’re dealing with. In a movie, you see an interface for at most a couple of seconds. In real life, every design decision has a consequence, and it doesn’t go away. It’s there day in and day out. Those human interface guidelines are there for very good reasons.”

Coleran Reel 2008.06 HD from Mark Coleran on Vimeo.