Category Archives: websites

Calling the Media out for Misleading InfoViz

I was reading an article from my local news site today and saw this graphic, apparently made for the article.

[Screenshot: the article's graphic]
Being from Alabama, and just a pattern-recognition machine in general, I immediately noticed that Alabama was an anomaly. The lightest pink surrounded on all sides by the darkest red? Unlikely. The writer helpfully provided a source, though, from the FBI, so I could look at the data myself.

[Screenshot: the FBI data table]

There, right at the start, is a footnote for Alabama. It says “3 Limited supplemental homicide data were received.” Illinois is the only other state with a footnote, but because it’s not so different from its neighbors, it didn’t stand out enough for me to notice.

Florida was not included in the FBI table and thus is grey – a good choice to show there were no data for that state. But for Alabama and Illinois, it is misleading to include known bad data in a graph that carries no explanation. They should also be grey, rather than implying that the limited information is the truth.
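For anyone curious what that fix looks like in practice, here is a minimal sketch; the numbers and column names are made up, and it assumes the FBI table has been read into a pandas DataFrame. It masks the footnoted states before mapping, so a charting tool renders them as "no data" grey just like Florida:

    import pandas as pd
    import numpy as np

    # Hypothetical firearm-death rates per state, as they might be read from the FBI table
    rates = pd.DataFrame({
        "state": ["Alabama", "Georgia", "Illinois", "Mississippi", "Tennessee"],
        "rate":  [1.2, 9.1, 5.0, 9.8, 8.7],
    })

    # States whose rows carry a data-quality footnote ("limited supplemental data received")
    footnoted = {"Alabama", "Illinois"}

    # Mask the questionable values so the choropleth shows them as missing (grey),
    # just like Florida, instead of as an implausibly light pink
    rates.loc[rates["state"].isin(footnoted), "rate"] = np.nan

    print(rates)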

I looked up similar data from other sources to check how misleading the graphic was. Because wouldn't it be nice if my home state had figured out some magic formula for preventing firearm deaths? Unfortunately, the Centers for Disease Control and Prevention (CDC) statistics on gun deaths put Alabama in the top four states for the most gun deaths. That's quite the opposite of the optimism-inducing light pink in the first graphic. The graph below is for 2014 while the first graphic is for 2013, but in case you might be thinking there was some change, I also looked up 2012 (the CDC appears to publish these data every two years). The CDC put firearm deaths per capita in Alabama even higher that year than in 2014.
[Screenshot: the CDC 2014 firearm deaths graph]

In closing, I don't think this graphic was intentionally misleading. Sure, there are plenty of examples where I would be happy to blame malice instead of bad design, but most of the time it's probably just people working under a deadline or with software tools that don't allow custom corrections. We do have to be careful, though – I'd hate to see Alabama not receive aid to curb its firearm death rate because of a poor information visualization.

Big Data and A/B Testing

I became interested in using "big data" for A/B testing after a speaker from RedHat gave a talk in our area about it a couple of years ago. It's a tantalizing idea: come up with a change, send it out to some small percentage of your users, and pull it back immediately if it doesn't work or isn't better than the original. It's even more amazing when you consider that a "small percentage" can be thousands and thousands of people – a dream for any researcher. Certainly, this connects to last year's news on the controversy over Facebook's A/B testing adventures.
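To make the mechanics concrete, here is a minimal sketch of that rollout loop; every name and number is invented for illustration. Users are bucketed deterministically so a small, stable slice sees the change, and a simple two-proportion comparison decides whether to keep it or pull it back:

    import hashlib
    import math

    def in_treatment(user_id: str, percent: float = 1.0) -> bool:
        """Deterministically assign a small, stable percentage of users to the new variant."""
        bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 10000
        return bucket < percent * 100  # e.g. 1.0% -> buckets 0..99

    def keep_change(successes_a, total_a, successes_b, total_b, z_threshold=1.96):
        """Two-proportion z-test sketch: keep the change only if variant B is clearly better."""
        p_a, p_b = successes_a / total_a, successes_b / total_b
        pooled = (successes_a + successes_b) / (total_a + total_b)
        se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
        return (p_b - p_a) / se > z_threshold

    # Hypothetical numbers: 100,000 control users vs. a 1% treatment slice
    print(keep_change(successes_a=5000, total_a=100000, successes_b=62, total_b=1000))

The deterministic hashing is the part that makes "pull it back immediately" painless: the same user always lands in the same bucket, so the experience stays consistent while the test runs and the slice can be shrunk to zero at any time.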

The only con I can think of is that if something works or doesn't work, you may not know why. We are always fumbling toward success, but maybe it's not good to encourage fumbling at the expense of developing theory.

NPR's Planet Money recently did a great show on A/B testing their own podcast, and on the surprising results. They were also willing to think about how far it could be taken: audience-testing every segment of the show. Certainly worth a listen.

Warning against overgeneralizing in UX

I enjoyed this article by Matt Gallivan, Experience Research Manager at AirBnB, about the tendency of experts to overgeneralize their knowledge. I try to watch out for it in my own life: When you’re an expert at one thing, it’s so easy to think you know more than you do about other areas.

Excerpt:

Because if you’re a UX researcher, you do yourself and your field no favors when you claim to have all of the answers. In the current digital product landscape, UX research’s real value is in helping to reduce uncertainty. And while that’s not as sexy as knowing everything about everything, there’s great value in it. In fact, it’s critical. It also has the added bonus of being honest.

Somewhat related, here is a fun page analyzing where and why AirBnB succeeds at usability.

Psychology Podcasts

Those who know me know I am a fiend for podcasts. Since I'm also a fiend for psychology, I can't help but notice when it pops up in a podcast, even one not focused on psychology. I use many of them in my courses: for example, the This American Life episode on what having schizophrenia sounds like is a must-listen when I hit the Abnormal Psychology chapter in Intro. The Radiolab episode "Memory and Forgetting" is a staple in my Cognitive class, and I take advantage of the multidisciplinarity of Human Factors to play clips from every area. Startup had a good one that illustrates what human factors looks like to a client.

Over the years I've compiled a list of my favorites relating to psychology. Some are clips from longer podcasts, while others are dedicated to psychology (e.g., Invisibilia). Each one has a general area of psychology noted (although some hit two or more areas), and if it's a clip, I note the start and end times of the most relevant audio.

I hope you enjoy the resource; I will keep updating it as I find more. If you know of any I don't have listed, please link to it in the comments and I'll add it to the spreadsheet.

Usability process not used for ACA website

A recently released report, completed in March 2013, reveals the process used to create Healthcare.gov. Hindsight is always 20/20, but we have also worked hard to establish best practices for considering both engineering and the user in software development. These contributions need to be valued, especially for large-scale projects. Looking through the slides, one thing I note is that even this improved approach barely mentions the end users of the website. There is one slide that states "Identify consumer paths; review and modify vignettes." The two examples of this are users with more or less complex needs when signing up for insurance. I don't see any mention of involving actual users prior to release.

The NPR write-up states:

Consultants noted there was no clear leader in charge of this project, which we now know contributed to its disastrous release. And there was no “end-to-end testing” of its full implementation, something we now know never happened.

Some of this may fall on us, for not being convincing enough that human factors methods are worth the investment. How much would the public be willing to pay for a solid usability team to work with the website developers?

HFES on LinkedIn

The Human Factors and Ergonomics Society has a lively discussion board hosted on LinkedIn. If you like talking about issues with other professionals, have questions or need resources, or just want to see "what do these people like to talk about?", then I suggest a visit!

Some sample topics:

  • What are the top three classical or seminal papers in HFE that you think every graduate should know?
  • What is a safe number of characters to read on a screen while driving?
  • What is the best statistical method for comparing Modified Cooper Harper ratings for two different designs?
  • Does visual appeal, i.e., aesthetics, enhance usability?
  • What is the best place to place OK button on the computer form?
  • Trends in Function Allocation among cognitive agents (human & machine) – a new era for joint cognitive systems?

Two New HF Blogs

Just a short note about two new HF-oriented blogs. First, Arathi Sethumadhavan, Ph.D., a Human Factors Scientist at Medtronic's Cardiac Rhythm and Disease Management, has started a new blog. She received her PhD in Experimental Psychology (Human Factors) from Texas Tech University. Second, Ergonomics in Design, a publication of the Human Factors and Ergonomics Society, has started a blog as well. Check ’em out!

Coming to APA in August: Information Foraging in the Social Web

Peter Pirolli (currently a Research Fellow at Xerox/PARC) will be presenting on Information Foraging Theory. See below for an abstract of his upcoming talk.

Information Foraging Theory is a theory of human-information interaction that aims to explain and predict how people will best shape themselves to their information environments, and how information environments can best be shaped to people.  The approach involves a kind of reverse engineering in which the analyst asks (a) what is the nature of the task and information environments, (b) why is a given system a good solution to the problem, and (c) how is that “ideal” solution realized (approximated) by mechanism.

Typically, the key steps in developing a model of information foraging involve: (a) a rational analysis of the task and information environment (often drawing on optimal foraging theory from biology) and (b) a computational production system model of the cognitive structure of the task. I will briefly review work on individual information seeking, and then focus on how this work is being expanded to studies of information production and sense-making in technology-mediated social systems such as wikis, social tagging, social network sites, and twitter.

In recent years, we have been extending our studies to deal with social interactions on the Web (e.g., wikis, tagging systems, twitter). This has led to studies of how people assess source credibility (expertise, trustworthiness, bias) and how user interfaces might affect such judgments.
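For readers who want a feel for the "optimal foraging theory from biology" step, here is a minimal sketch of Charnov's marginal value theorem applied to a single information "patch." The gain curve and travel time are made-up illustrations, not figures from Pirolli's work; the idea is simply that a forager should leave a patch once staying longer no longer beats the overall rate of moving on:

    import numpy as np

    # Made-up diminishing-returns gain curve for one information patch:
    # useful items found after t seconds spent in the patch.
    def gain(t, g_max=20.0, tau=30.0):
        return g_max * (1.0 - np.exp(-t / tau))

    travel_time = 45.0                  # hypothetical time to reach the next patch
    t = np.linspace(0.1, 300.0, 3000)   # candidate within-patch residence times

    # Overall rate of gain if you stay t seconds in each patch, then travel to the next
    overall_rate = gain(t) / (travel_time + t)

    best = t[np.argmax(overall_rate)]
    print(f"Leave the patch after about {best:.0f} s "
          f"(overall rate {overall_rate.max():.3f} items/s)")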

Check out the full Division 21 program.

Website Usability Success Story – Bethel University

The Chronicle of Higher Education has posted a great “interactive graphic” about Bethel’s redesign of its admissions page. It includes their metrics of success, an important but often difficult-to-quantify validation of usability.

Special problems they faced:

  • A large number of specialized programs with different application methods
  • Including financial aid information appropriately and early in the process
  • Managing multiple accounts within a single system
  • Tracking time to complete and number of students who started an application but did not complete it

Hopefully such attention to usable university sites will become more common and this cartoon will no longer be funny.


Usability vs. Providing an Experience

Some humor for 2011: a site collecting things people have never said about a restaurant website.

My favorite excerpts:

“I really like the way their cheesy elevator jazz interacts with the music I was listening to in iTunes.”

“I hope the phone number and address are actually images so I can’t copy and paste them!”

“I go to restaurant websites for the ambiance.”

“Who needs the phone number of a restaurant when you could be enjoying stock photos of food?”

A quick search turned up a few more rants about restaurant sites. Looks like an epidemic!

  • Restaurant websites: the great and the terrible
  • “A couple of days ago, a friend was asking me for a restaurant recommendation. Easy task, I thought. I had some restaurants in mind and just needed to check and see if they were open and send her the websites. What should have been a 5-minute email turned into a half-hour nightmare as I slogged through websites that are more intent on impressing me with movies, music, and other annoyances than on giving me direct information.”
  • Why ARE restaurant sites so bad?
  • “Who thinks it’s good idea to blast annoying music at people going to your site? Why do they so often rely on Flash, which doesn’t really add anything to the experience, when half the time people are looking up the site on mobile devices to get basic information? Why this bizarre preference for menus in PDF format?”
  • Restaurant websites: casting the net
  • “… has a notoriously ludicrous website which – granted – may well appeal to the sort of ‘zany’ people who eat there. As for everyone else, it will probably just make you want to smash your fist through your monitor.”

Perhaps I’m still unhappy about spending an hour looking for a place to eat in Little Rock last weekend. Flash websites and PDF menus on a 2007 Sprint Treo are not for the faint of heart.