Category Archives: automobiles

Tesla counterpoint: “40% reduction in crashes” with introduction of Autosteer

I posted yesterday about the challenges of fully autonomous cars and cars that approach autonomy. Today I bring you a story about the successes of semi-autonomous features in automobiles.

Tesla has a feature called Autopilot that assists the driver without being completely autonomous. Autopilot includes car-controlled actions such as collision warnings, automatic emergency braking, and automatic lane keeping. Tesla classifies the Autopilot features as Level 2 automation (Level 5 is considered fully autonomous). Rich has already shared our thoughts about calling this “Autopilot” in a previous post. One particular feature, Autosteer, is described in the NHTSA report as follows:

The Tesla Autosteer system uses information from the forward-looking camera, the radar sensor, and the ultrasonic sensors, to detect lane markings and the presence of vehicles and objects to provide automated lane-centering steering control based on the lane markings and the vehicle directly in front of the Tesla, if present. The Tesla owner’s manual contains the following warnings: 1) “Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present. Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause serious property damage, injury or death;” and 2) “Many unforeseen circumstances can impair the operation of Autosteer. Always keep this in mind and remember that as a result, Autosteer may not steer Model S appropriately. Always drive attentively and be prepared to take immediate action.” The system does not prevent operation on any road types.
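
To make the “automated lane-centering steering control” idea concrete, here is a minimal sketch of how such a controller might combine a lateral-offset estimate from the lane markings with the lead vehicle’s position. This is purely illustrative Python, not Tesla’s implementation; every function name, gain, and sign convention below is an assumption.

```python
# Hypothetical sketch of lane-centering steering control; NOT Tesla's code.
# Assumes a perception layer already reports the car's lateral offset from
# the lane center (meters, + = left) and its heading error (radians).

def steering_command(lateral_offset_m, heading_error_rad,
                     lead_lateral_m=None,
                     k_lat=0.1, k_head=0.5, lead_weight=0.2, max_cmd=1.0):
    """Return a normalized steering command in [-1.0, 1.0]."""
    # Aim for the lane center, blended slightly toward the lead vehicle's
    # lateral position when one is detected (the NHTSA description says the
    # system uses both the lane markings and the vehicle directly in front).
    target = lead_weight * lead_lateral_m if lead_lateral_m is not None else 0.0
    error = lateral_offset_m - target
    cmd = -k_lat * error - k_head * heading_error_rad
    return max(-max_cmd, min(max_cmd, cmd))

# Example: car is 0.5 m left of center and angled slightly left; no lead car.
print(steering_command(0.5, 0.02))  # small negative command: steer right
```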

The same NHTSA report, which looked into a fatal Tesla crash, also noted that the introduction of Autosteer corresponded to a 40% reduction in automobile crashes. That’s a lot, considering Dr. Gill Pratt from Toyota said he might be happy with a 1% improvement.

Autosteer was enabled in October 2015, so a good period of time has passed for post-Autosteer crash data to be generated.

Toyota Gets It: Self-driving cars depend more on people than on engineering

I recommend reading this interview with Toyota’s Dr. Gill Pratt in its entirety. He discusses, point by point, the challenges of self-driving cars that we consider in human factors but don’t hear much about in the media. For example:

  • Definitions of autonomy vary. True autonomy is far away. He gives the example of a car performing well on an interstate or in light traffic compared to driving through the center of Rome during rush hour.
  • Automation will fail. And the less it fails, the less prepared the driver is to assume control.
  • Emotionally, we cannot accept autonomous cars that kill people, even if they reduce overall crash rates and save lives in the long run.
  • It is difficult to run simulations of autonomous cars that capture the extreme variability of the human drivers in other cars.

I’ll leave you with the last paragraph in the interview as a summary:

So to sum this thing up, I think there’s a general desire from the technical people in this field to have both the press and particularly the public better educated about what’s really going on. It’s very easy to get misunderstandings based on words like or phrases like “full autonomy.” What does full actually mean? This actually matters a lot: The idea that only the chauffeur mode of autonomy, where the car drives for you, that that’s the only way to make the car safer and to save lives, that’s just false. And it’s important to not say, “We want to save lives therefore we have to have driverless cars.” In particular, there are tremendous numbers of ways to support a human driver and to give them a kind of blunder prevention device which sits there, inactive most of the time, and every once in a while, will first warn and then, if necessary, intervene and take control. The system doesn’t need to be competent at everything all of the time. It needs to only handle the worst cases.

The Patient Writes the Prescription


I took the photo above in my brother-in-law’s 2015 Jeep Grand Cherokee EcoDiesel. It says, “Exhaust Filter Nearing Full Safely Drive at Highway Speeds to Remedy.”

I’d never seen anything like that before, and neither had he; it seemed like a terrible idea at first. What if the person couldn’t drive at highway speeds right then? Spending an unknown amount of time driving at highway speeds, wasting gas, also seemed unpleasant. My brother-in-law said he had been having issues with the car before, but it wasn’t until the Jeep downloaded a software update that it displayed this message on the dashboard.

My own car will be 14 years old this year (nearing an age where it can get its own learner’s permit?), so I had to adjust to the idea of a car that updated itself. I was intrigued by the issue and looked around to see what other Jeep owners had to say.

I found another unhappy customer at the diesel Jeep forum:

At the dealer a very knowledgeable certified technician explained to me that the problem is that we had been making lots of short trips in town, idling at red lights, with the result that the oil viscosity was now out of spec and that the particulate exhaust filter was nearly full and needed an hour of 75 mph driving to get the temperature high enough to burn off the accumulated particulates. No person and no manual had ever ever mentioned that there is a big problem associated with city driving.

And further down the rabbit hole, I found it wasn’t just the diesel Jeep. This is from a Dodge Ram forum:

I have 10,000K on 2014 Dodge Ram Ecodiesel. Warning came on that exhaust filter 90% full. Safely drive at highway speeds to remedy. Took truck on highway & warning changed to exhaust system regeneration in process. Exhaust filter 90% full.
All warnings went away after 20 miles. What is this all about?

It looks like Jeep added a supplement to their owner’s manual in 2015 to explain the problem:

Exhaust Filter XX% Full Safely Drive at Highway Speeds to Remedy — This message will be displayed on the Driver Information Display (DID) if the exhaust particulate filter reaches 80% of its maximum storage capacity. Under conditions of exclusive short duration and low speed driving cycles, your diesel engine and exhaust after-treatment system may never reach the conditions required to cleanse the filter to remove the trapped PM. If this occurs, the “Exhaust Filter XX% Full Safely Drive at Highway Speeds to Remedy” message will be displayed in the DID. If this message is displayed, you will hear one chime to assist in alerting you of this condition. By simply driving your vehicle at highway speeds for up to 20 minutes, you can remedy the condition in the particulate filter system and allow your diesel engine and exhaust after-treatment system to cleanse the filter to remove the trapped PM and restore the system to normal operating condition.
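
Reading the manual’s description as a simple threshold rule, the logic seems to be roughly the following. This is a hypothetical Python sketch, not the manufacturer’s firmware; the 80% threshold and the message texts come from the quotes above, but the temperature cutoff and all names are invented.

```python
# Hypothetical sketch of the exhaust-filter (DPF) warning logic. The 80%
# threshold and message wording come from the manual supplement and the
# Dodge Ram forum post quoted above; everything else is assumed.

WARN_THRESHOLD = 0.80   # manual: message appears at 80% of capacity
REGEN_TEMP_C = 600      # assumed exhaust temperature needed to burn off soot

def did_message(filter_fill, exhaust_temp_c):
    """Return the Driver Information Display (DID) message, if any."""
    # Sustained highway driving raises exhaust temperature enough to
    # oxidize the trapped particulate matter (regeneration).
    if filter_fill >= WARN_THRESHOLD and exhaust_temp_c >= REGEN_TEMP_C:
        return "Exhaust System Regeneration In Process"
    if filter_fill >= WARN_THRESHOLD:
        pct = int(filter_fill * 100)
        return (f"Exhaust Filter {pct}% Full "
                "Safely Drive at Highway Speeds to Remedy")
    return None

# City driving with a nearly full filter triggers the warning...
print(did_message(0.90, 250))  # "Exhaust Filter 90% Full Safely Drive..."
# ...and highway driving switches it to the regeneration message.
print(did_message(0.90, 650))  # "Exhaust System Regeneration In Process"
```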

But now that I’ve had time to think about it, I agree with the remedy. After all, my own car just has a ‘check engine’ light no matter what the issue is. Twenty minutes on the highway is a lot easier than scheduling a trip to a mechanic.

What could be done better is the communication of the warning. It tells you what to do, and sort of why, but not how long you have to execute the action or the consequences of not acting. The manual contains a better explanation of why (although its 20-minute estimate does not match the 60-minute estimate of at least one expert), but not many people read the manual. The manual also doesn’t match the message: the manual promises a percentage full, while the message just said “nearing.” The dash display should direct the driver to more information in the manual. Or, with such a modern display, it could scroll to reveal more information (showing partial text so the driver knows to scroll). Knowing the time available to act is most critical, and a percentage might accomplish that, since the driver can probably assume he or she can drive until closer to 100% before taking action. As written, it looks as though the driver needs to find a way to drive at highway speeds right now; hopefully that is not the case. I can’t say for sure, though, since neither the manual nor the display told me the answer.

Tesla is wrong to use “autopilot” term

Self-driving cars are a hot topic! See this Wikipedia page on Autonomous cars for a short primer. This post is mainly an exploration of how the technology is presented to the user.

Tesla markets its self-driving technology using the term “Autopilot.” The German government is apparently unhappy with the use of that term because it could be misleading (LA Times):

Germany’s transport minister told Tesla to cease using the Autopilot name to market its cars in that country, under the theory that the name suggests the cars can drive themselves without driver attention, the news agency Reuters reported Sunday.

Tesla wants to be perceived as first to market with a fully autonomous car (hence the term Autopilot), yet they stress that it is only a driver-assistance system and that the driver is meant to stay vigilant. But I do not think the term Autopilot is perceived that way by most lay people. It encourages unrealistic expectations and may lead to uncritical usage and acceptance of the technology, or complacency.

Complacency can manifest as:

  • too much trust in the automation (more than warranted)
  • allocation of attention to other things and not monitoring the proper functioning of automation
  • over-reliance on the automation (letting it carry out too much of the task)
  • reduced awareness of one’s surroundings (situation awareness)

Complacency is especially dangerous when unexpected situations occur and the driver must resume manual control.  The non-profit Consumer Reports says:

“By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security,” says Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports. “In the long run, advanced active safety technologies in vehicles could make our roads safer. But today, we’re deeply concerned that consumers are being sold a pile of promises about unproven technology. ‘Autopilot’ can’t actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver’s hands are on the wheel.”

Companies must commit immediately to name automated features with descriptive—not exaggerated—titles, MacCleery adds, noting that automakers should roll out new features only when they’re certain they are safe.

Tesla responded:

“We have great faith in our German customers and are not aware of any who have misunderstood the meaning, but would be happy to conduct a survey to assess this.”

But Tesla is doing a disservice by marketing their system using the term Autopilot and by selectively releasing video of the system performing flawlessly.

Using terms such as Autopilot and releasing videos of near-perfect instances of the technology will only increase the likelihood of driver complacency.

But no matter how they are marketed, these systems are just machines that rely on high-quality sensor input (radar, cameras, etc.). Sensors can fail, GPS data can be stale, and situations can change quickly and dramatically (particularly on the road). The system WILL make a mistake, and on the road the cost of that single mistake can be deadly.

Parasuraman and colleagues have extensively researched how humans behave when exposed to highly reliable automation in the context of flight automation/autopilot systems. In a classic study, they first induced a sense of complacency by exposing participants to highly reliable automation. Later, when the automation failed, the more complacent participants were much worse at detecting the failure (Parasuraman, Molloy, & Singh, 1993).

Interestingly, when researchers examined highly autonomous autopilot systems in aircraft, they found that pilots were often confused by or distrustful of the automation’s decisions (e.g., initiating course corrections without any pilot input), suggesting LOW complacency. But it is important to note that pilots are highly trained and have probably not been subjected to the same degree of effusively positive marketing about the benefits of self-driving technology that the public is receiving. Tesla, in essence, tells drivers to “trust us,” further increasing the likelihood of driver complacency:

We are excited to announce that, as of today, all Tesla vehicles produced in our factory – including Model 3 – will have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver. Eight surround cameras provide 360 degree visibility around the car at up to 250 meters of range. Twelve updated ultrasonic sensors complement this vision, allowing for detection of both hard and soft objects at nearly twice the distance of the prior system. A forward-facing radar with enhanced processing provides additional data about the world on a redundant wavelength, capable of seeing through heavy rain, fog, dust and even the car ahead.

To make sense of all of this data, a new onboard computer with more than 40 times the computing power of the previous generation runs the new Tesla-developed neural net for vision, sonar and radar processing software. Together, this system provides a view of the world that a driver alone cannot access, seeing in every direction simultaneously and on wavelengths that go far beyond the human senses.

References

Parasuraman, R., Molloy, R., & Singh, I. L. (1993). Performance consequences of automation-induced “complacency.” The International Journal of Aviation Psychology, 3(1), 1–23.

Some other key readings on complacency:

Parasuraman, R. (2000). Designing automation for human use: empirical studies and quantitative models. Ergonomics, 43(7), 931–951. http://doi.org/10.1080/001401300409125

Parasuraman, R., & Wickens, C. D. (2008). Humans: Still vital after all these years of automation. Human Factors, 50(3), 511–520. http://doi.org/10.1518/001872008X312198

Parasuraman, R., & Manzey, D. H. (2010). Complacency and bias in human use of automation: An attentional integration. Human Factors, 52(3), 381–410. http://doi.org/10.1177/0018720810376055


Interesting control/display

Anne sent me an example of “why haven’t they thought of this before?”: an air vent with the temperature display and control knob all in one.

In this article describing the new Audi TT and its glass dashboard, they describe the novel control/display/air vent seen in the image above. One potential problem is whether it is accessible only to the driver or centrally located.


The dashboard (shown in the linked article), however, is another story. It looks futuristic, but it seems like a distraction nightmare!

Recent developments in in-vehicle distractions: Voice input no better than manual input

A man uses a cell phone while driving in Burbank, California June 25, 2008. Credit: Reuters/Fred Prouser
Earlier this week the United States Department of Transportation released guidelines for automakers designed to reduce the distraction caused by in-vehicle technologies (e.g., navigation systems):

The guidelines include recommendations to limit the time a driver must take his eyes off the road to perform any task to two seconds at a time and twelve seconds total.

The recommendations outlined in the guidelines are consistent with the findings of a new NHTSA naturalistic driving study, The Impact of Hand-Held and Hands-Free Cell Phone Use on Driving Performance and Safety Critical Event Risk. The study showed that visual-manual tasks associated with hand-held phones and other portable devices increased the risk of getting into a crash by three times. [emphasis added]
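
For concreteness, the two-seconds-at-a-time, twelve-seconds-total recommendation amounts to a simple check over the log of a driver’s off-road glances during a task. Here is a minimal Python sketch; the data format and function name are mine, not NHTSA’s.

```python
# Checks a task's off-road glances against the DOT/NHTSA guideline quoted
# above: no single glance away from the road longer than 2 s, and no more
# than 12 s of total off-road glance time for the task.

MAX_SINGLE_GLANCE_S = 2.0
MAX_TOTAL_GLANCE_S = 12.0

def meets_glance_guideline(glances_s):
    """glances_s: duration in seconds of each off-road glance for one task."""
    return (all(g <= MAX_SINGLE_GLANCE_S for g in glances_s)
            and sum(glances_s) <= MAX_TOTAL_GLANCE_S)

print(meets_glance_guideline([1.5, 2.4, 1.0]))  # False: one glance > 2 s
print(meets_glance_guideline([1.5] * 9))        # False: 13.5 s total
print(meets_glance_guideline([1.2, 1.8, 0.9]))  # True
```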

But a new study (I have not read the paper yet) seems to show that even when you take away the “manual” aspect through voice input, the danger is not mitigated:

The study by the Texas Transportation Institute at Texas A&M University was the first to compare voice-to-text and traditional texting on a handheld device in an actual driving environment.

“In each case, drivers took about twice as long to react as they did when they weren’t texting,” Christine Yager, who headed the study, told Reuters. “Eye contact to the roadway also decreased, no matter which texting method was used.”

The Automobile Fuel Gauge – a ‘failed’ pictogram

Nice post over on Humans in Design about the semi-universal icon that tells you which side of the car to fill with gasoline. It’s a little triangle that can go on either side of the fuel-pump icon, and the gas tank opens on that side of the car.

The post is called Lessons from a Failed Pictogram, and it covers the more common icon used on dashboards, which is simply a picture of an old-timey gas pump with no triangle. That icon merely indicates that the gauge is for fuel; it doesn’t help the user know how to drive up to the pump.

The post addresses the myths that grew up around the fuel pump icon, such as the claim that the pump handle obliquely indicated that the tank was on the opposite side. Of course, this would be a terrible indicator, but the take-home message was that if users come up with imaginary meanings for a pictogram, designers should take notice. The users are begging for that message. From the post:

If a myth exists it’s often a search for meaning that can be used to identify a design problem, which is the first step to a solution.

Indeed, most of the pictures I found in an image search were just the pump with no indicator of the fuel-tank side. The one below stood out since it uses TWO icons.

On a personal note, I was almost 30 before anyone told me about the fuel indicator arrow.


Photo credit gmanviz @ Flickr

Photo credit Strupy @ Flickr

Usability Follies in the News

It’s election season, which means more opportunities to point, laugh, and cry at the state of voting usability. The first item was sent in by Kim W. As part of an NPR story, the reporter dug up a sample ballot. Pretty overwhelming and confusing (“vote for not more than one”??); it makes me long for electronic voting.

Next, Ford is sending out a software update to their popular MyFord Touch car telematics system. The following NYT article is excellent in highlighting that basic usability and overall “user experience” are just as important as technical capabilities and specs. The article lists a variety of usability quirks that should have been caught in user testing (e.g., “a touch-sensitive area under the touch screen that activates the hazard lights has been replaced with a mechanical button, because Ford learned that drivers were inadvertently turning on the hazard lights as they rested their hand while waiting for the system to respond.”).

MyFord Touch (photo: NYT)

I am being facetious when I point and laugh, but seriously, many of these issues could have been caught early with basic, relatively cheap, simple user testing.

“I think they were too willing to rush something out because of the flashiness of it rather than the functionality,” said Michael Hiner, a former stock-car racing crew chief in Akron, Ohio, who bought a Ford Edge Limited last year largely because he and his wife were intrigued by MyFord Touch.

Now Ford has issued a major upgrade that redesigns much of what customers see on the screen and tries to resolve complaints about the system crashing or rebooting while the vehicle is being driven. Ford said on Monday that the upgrade made the touch screens respond to commands more quickly, improved voice recognition capabilities and simplified a design that some say had the potential to create more distractions for drivers who tried to use it on the road. Fonts and buttons on the screen have been enlarged, and the layouts of more than 1,000 screens have been revamped.

New automation will warn drivers of lane changes

Ford is introducing a system that first warns the driver of an unintended lane departure, then actually changes the direction of the car if the warning is ignored. From the USA Today article:

When the system detects the car is approaching the edge of the lane without a turn signal activated, the lane marker in the icon turns yellow and the steering wheel vibrates to simulate driving over rumble strips. If the driver doesn’t respond and continues to drift, the lane icon turns red and EPAS will nudge the steering and the vehicle back toward the center of the lane. If the car continues to drift, the vibration is added again along with the nudge. The driver can overcome assistance and vibration at any time by turning the steering wheel, accelerating or braking.
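
The quoted behavior is essentially a three-stage escalation, with driver input always taking priority. Here is a minimal sketch of that state machine as I read the article; the stage numbers, names, and structure are my own, not Ford’s.

```python
# Hypothetical sketch of the escalation logic in the USA Today quote.
# Stage 0: no warning; 1: yellow icon + wheel vibration; 2: red icon +
# steering nudge; 3: nudge with vibration added. Steering, accelerating,
# or braking (driver_override) resets the system at any time.

def lane_keeping_step(drifting, turn_signal_on, driver_override, stage):
    """Advance the warning stage and return (new_stage, actions)."""
    if driver_override or turn_signal_on or not drifting:
        return 0, []
    stage = min(stage + 1, 3)
    actions = {
        1: ["icon_yellow", "vibrate_wheel"],
        2: ["icon_red", "nudge_toward_lane_center"],
        3: ["icon_red", "nudge_toward_lane_center", "vibrate_wheel"],
    }[stage]
    return stage, actions

# A car that keeps drifting walks up the escalation ladder:
stage = 0
for _ in range(3):
    stage, actions = lane_keeping_step(True, False, False, stage)
    print(stage, actions)
```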

Is this going to be as annoying as having Rich Pak’s phone beep every time I go over the speed limit (which is A LOT)? Just kidding; stopping a drifting car could be pretty great.


LOLcat photo credit to ClintCJL at Flickr.