Recent developments in in-vehicle distractions: Voice input no better than manual input

A man uses a cell phone while driving in Burbank, California June 25, 2008. Credit: Reuters/Fred Prouser
Earlier this week the United States Department of Transportation released guidelines for automakers designed to reduce the distraction posed by in-vehicle technologies (e.g., navigation systems):

The guidelines include recommendations to limit the time a driver must take his eyes off the road to perform any task to two seconds at a time and twelve seconds total.

The recommendations outlined in the guidelines are consistent with the findings of a new NHTSA naturalistic driving study, The Impact of Hand-Held and Hands-Free Cell Phone Use on Driving Performance and Safety Critical Event Risk. The study showed that visual-manual tasks associated with hand-held phones and other portable devices increased the risk of getting into a crash by three times. [emphasis added]
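To make the guideline's criterion concrete, here is a minimal sketch (my illustration, not anything from the NHTSA documents) that checks a log of off-road glance durations against the two limits. The function name and the sample glance data are hypothetical.

```python
# Minimal sketch of the NHTSA glance criterion described above:
# each off-road glance must be at most 2 seconds, and the total
# off-road time for the task must be at most 12 seconds.

def meets_glance_criterion(glance_durations_s):
    """glance_durations_s: duration in seconds of each off-road glance
    a driver made while performing one in-vehicle task."""
    return (all(g <= 2.0 for g in glance_durations_s)
            and sum(glance_durations_s) <= 12.0)

# Hypothetical glance logs for two tasks:
print(meets_glance_criterion([1.2, 1.8, 0.9]))  # True: passes both limits
print(meets_glance_criterion([2.5, 1.0]))       # False: one glance exceeds 2 s
```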

But a new study (I have not read the paper yet) seems to show that even when you take away the “manual” aspect through voice input, the danger is not mitigated:

The study by the Texas Transportation Institute at Texas A&M University was the first to compare voice-to-text and traditional texting on a handheld device in an actual driving environment.

“In each case, drivers took about twice as long to react as they did when they weren’t texting,” Christine Yager, who headed the study, told Reuters. “Eye contact to the roadway also decreased, no matter which texting method was used.”

Potpourri

Another edition of potpourri where I surface some of the more interesting HF/usability links that have crossed my path.

Usability of a Glass Dashboard?


I had heard that the Tesla Model S (the luxury electric car) had a giant touch screen as one of the main interfaces for secondary car functions and always wondered what that might be like from a human factors/usability perspective. Physical knobs and switches, unlike interface widgets, give a tactile sensation and do not change location on the dashboard.

This post is an interesting examination of the unique dashboard:

Think about a car’s dashboard for a second. It’s populated with analog controls: dials, knobs, and levers, all of which control some car subsystem such as temperature, audio, or navigation. These analog dials, while old, have two features: tactility and physical analogy. Respectively, this means you can feel for a control, and you have an intuition for how the control’s mechanical action affects your car (eg: counterclockwise on AC increases temperature). These small functions provide a very, very important feature: they allow the driver to keep his or her eyes on the road.

Except for the privileged few who have an extraordinary kinesthetic sense of where their hands are, the Model S’s control scheme is an accident waiting to happen. Hell, most of us can barely type with two hands on an iPhone. Now a Model S driver has to manage all car subsystems on a touchscreen with one hand while driving.

The solution, however, may not be the heads-up displays or augmented reality that the author suggests (citing BMW's HUD).


While those displays allow the eyes to remain on the road, the display itself is always in view, a persistent distraction. Also, paying attention to the HUD means your attention will not be on the road, and what doesn't get paid attention to doesn't exist: