Category Archives: health/healthcare

Resources: Human Factors Design Considerations in Home Health Technology

The National Academies and the Agency for Healthcare Research and Quality have just released two publications.

The first, Health Care Comes Home, is a 200-page report:

Health Care Comes Home reviews the state of current knowledge and practice about many aspects of health care in residential settings and explores the short- and long-term effects of emerging trends and technologies. By evaluating existing systems, the book identifies design problems and imbalances between technological system demands and the capabilities of users. Health Care Comes Home recommends critical steps to improve health care in the home. The book’s recommendations cover the regulation of health care technologies, proper training and preparation for people who provide in-home care, and how existing housing can be modified and new accessible housing can be better designed for residential health care. The book also identifies knowledge gaps in the field and how these can be addressed through research and development initiatives.

The second, Consumer Health Information Technology in the Home: A Guide for Human Factors Design Considerations, is a free designers' guide:

Consumer Health Information Technology in the Home introduces designers and developers to the practical realities and complexities of managing health at home. It provides guidance and human factors design considerations that will help designers and developers create consumer health IT applications that are useful resources to achieve better health.

Radiation: The Difficulty of Monitoring the Invisible – Post 2 of 2

This post continues the list of articles on HF-related errors in radiation-delivering healthcare devices.

As Technology Surges, Radiation Safeguards Lag

But the technology introduces its own risks: it has created new avenues for error in software and operation, and those mistakes can be more difficult to detect. As a result, a single error that becomes embedded in a treatment plan can be repeated in multiple radiation sessions.

A new linear accelerator had been set up incorrectly, and the hospital’s routine checks could not detect the error because they merely confirmed that the output had not changed from the first day.
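Note the logic of that failure: a routine check that only compares today's output against a day-one baseline can never catch a mistake made on day one. Here is a minimal sketch of the difference between that relative check and an absolute one (the numbers are invented for illustration):

```python
def baseline_check(today: float, day_one: float, tol: float = 0.01) -> bool:
    """The routine QA check described above: flag only drift from day one."""
    return abs(today - day_one) <= tol

def absolute_check(today: float, prescribed: float, tol: float = 0.01) -> bool:
    """An independent check against the prescribed output itself."""
    return abs(today - prescribed) <= tol

prescribed = 1.00   # intended output (arbitrary units)
day_one = 1.50      # machine set up incorrectly on day one...
today = 1.50        # ...and perfectly stable ever since

print(baseline_check(today, day_one))     # True:  "no change", error invisible
print(absolute_check(today, prescribed))  # False: miscalibration caught
```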

In another case, an unnamed medical facility told federal officials in 2008 that Philips Healthcare made treatment planning software with an obscure, automatic default setting, causing a patient with tonsil cancer to be mistakenly irradiated 31 times in the optic nerve. “The default occurred without the knowledge of the physician or techs,” the facility said, according to F.D.A. records.

In a statement, Peter Reimer of Philips Healthcare said its software functioned as intended and that operator error caused the mistake.

Radiation Offers New Cures, and Ways to Do Harm

The Times found that while this new technology allows doctors to more accurately attack tumors and reduce certain mistakes, its complexity has created new avenues for error — through software flaws, faulty programming, poor safety procedures or inadequate staffing and training.

X-Rays and Unshielded Infants

Asked about the case, Dr. David Keys, a board member of the American College of Medical Physics, said, “It takes less than 15 seconds to collimate [cover non-scanned portions of the body – AM] a baby,” adding: “It could be that the techs at Downstate were too busy. It could be that they were just sloppy or maybe they forgot their training.”

Other problems, according to Dr. Amodio’s e-mail, included using the wrong setting on a radiological device, which caused some premature babies to be “significantly overirradiated.”

Radiation: The Difficulty of Monitoring the Invisible – Post 1 of 2

Lately, I have noticed a plethora of stories on human factors mistakes with medical equipment that delivers radiation. I have collected them here for those who are interested in this problem. At times a computer bug was at fault, but often radiation overdoses came from:

  • inadequate training (and perhaps a poor display, but those are not available for me to examine)
  • non-transferable mental models between pieces of equipment
  • team communication issues
  • trust in automation

I provide excerpts from each article calling out the HF-related errors.

A Pinpoint Beam Strays Invisibly, Harming Instead of Healing

In Missouri, for example, 76 patients were overradiated because a medical physicist did not realize that the smaller radiation beam used in radiosurgery had to be calibrated differently than the larger beam used for more traditional radiation therapy.

Linear accelerators can be adapted to perform stereotactic radiosurgery in two ways: with small computer-controlled metal leaves that shape the beam, or with a cone attached to the machine’s opening through which radiation is delivered. That opening is made smaller or larger by moving four heavy metal “jaws” that shape the beam into a square. When a cone attachment is used, the square beam must fit entirely within the circumference of the cone. If the square is slightly larger than the cone, radiation will leak out through the four corners of the jaws and irradiate healthy tissue. In the Evanston accidents, records show, the beam was four times too large. Operators could not see this incorrect setting directly because a metal tray on which the cone is mounted hides the jaws, though the settings should have been displayed on a computer screen, according to people who have worked with this device.
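The containment rule buried in that paragraph is simple geometry: a square beam of side s fits inside a circular cone aperture of diameter d only when its diagonal, s·√2, is at most d. A minimal sketch of that check, with invented millimeter values:

```python
import math

def jaws_leak(jaw_side_mm: float, cone_diameter_mm: float) -> bool:
    """True if a square jaw opening extends past a circular cone aperture.

    A square of side s fits inside a circle of diameter d only if its
    diagonal (s * sqrt(2)) is at most d; otherwise radiation escapes
    through the four corners of the jaws into healthy tissue.
    """
    return jaw_side_mm * math.sqrt(2) > cone_diameter_mm

# Hypothetical values: a 10 mm cone can contain at most a ~7.07 mm square.
print(jaws_leak(7.0, 10.0))   # False: beam contained
print(jaws_leak(10.0, 10.0))  # True:  corners leak into healthy tissue
```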

That system is supposed to work this way: A treatment plan is developed on one computer, then transferred into another software system that, among other things, verifies that the treatment plan matches the doctor’s prescription. The data is then sent to a third computer that controls the linear accelerator. Several months after the Evanston accidents, Brainlab reminded customers to verify the correct jaw setting, specifically citing the possibility that treatment information could be altered as it passed “through a chain of devices.”

Evanston Hospital had earlier encountered its own communication glitches after upgrading Varian software in December 2008. As a result, medical personnel had to load patient information onto a USB flash drive and walk it from one computer to another. Then, three months ago, concerned that radiation might leak outside the cone, Varian warned customers that its software did not recognize cone attachments on the type of linear accelerator involved in the Evanston accidents. To work around that problem hospitals needed to, as one medical physicist put it, essentially trick the machine into thinking it was using a different attachment, which it did recognize. To do that, users had to enter additional data into the SRS system.

After Stroke Scans, Patients Face Serious Health Risks

While in some cases technicians did not know how to properly administer the test, interviews with hospital officials and a review of public records raise new questions about the role of manufacturers, including how well they design their software and equipment and train those who use them.

…a feature that technicians thought would lower radiation levels actually raised them. Cedars-Sinai gave a similar explanation. “There was a lot of trust in the manufacturers and trust in the technology that this type of equipment in this day and age would not allow you to get more radiation than was absolutely necessary,” said Robert Marchuck, the Glendale hospital’s vice president of ancillary services.

At Cedars-Sinai and Glendale Adventist, technicians used the automatic feature — rather than a fixed, predetermined radiation level — for their brain perfusion scans. But a surprise awaited them: when used with certain machine settings that govern image clarity, the automatic feature did not reduce the dose — it raised it. … GE says the hospitals should have known how to safely use the automatic feature… GE further faulted hospital technologists for failing to notice dosing levels on their treatment screens.

Of course, this isn’t the first time we have posted on this issue at The Human Factors Blog, and I don’t imagine it will be the last.
Previous posts:
Error Leads to Radiation Overdose
“Set Phasers on Stun” still relevant in healthcare industry

 

Photo credit microwavedboy on Flickr

The Human Factors Prize

The Human Factors and Ergonomics Society is announcing the Human Factors Prize, a $10,000 award recognizing excellent human factors research. The prize will be presented at the annual meeting in Las Vegas this fall.

The Human Factors and Ergonomics Society is proud to announce the Human Factors Prize, established in 2010 by Editor-in-Chief William S. Marras. The prize, which will be presented for the first time in 2011, recognizes excellence in HF/E research through an annual competition in which authors are invited to submit papers on a specific topic for that year. The topic is selected by the editor in chief in consultation with a Board of Referees chaired by Immediate Past Human Factors Editor Nancy J. Cooke. See below for the current year’s topic.

The prize carries a $10,000 cash award and publication of the winning paper in the Society’s flagship journal, Human Factors. The award will be formally conferred at a special session at the HFES Annual Meeting, where the recipient will present his or her work.

This year’s topic is health care ergonomics:

2011 Topic
The topic for the inaugural-year Prize is health care ergonomics. Health care ergonomics is broadly defined to include research at the intersection of health care and human factors. Suitable sample topics include human factors aspects of home health care, the ergonomics of laparoscopic equipment and procedures, patient care coordination, usability of electronic health records and informatics, macroergonomics of health care facilities, and use of simulation for health care training.

False Alarms in the Hospital

NPR pointed me to a two-part series in the Boston Globe examining the incessant din of patient alarms.

The monitor repeatedly sounded an alarm — a low-pitched beep. But on that January night two years ago, the nurses at St. Elizabeth’s Medical Center in Brighton didn’t hear the alarm, they later said. They didn’t discover the patient had stopped breathing until it was too late.

These were just two of more than 200 hospital patients nationwide whose deaths between January 2005 and June 2010 were linked to problems with alarms on patient monitors that track heart function, breathing, and other vital signs, according to an investigation by The Boston Globe. As in these two instances, the problem typically wasn’t a broken device. In many cases it was because medical personnel didn’t react with urgency or didn’t notice the alarm.

They call it “alarm fatigue.” Monitors help save lives, by alerting doctors and nurses that a patient is — or soon could be — in trouble. But with the use of monitors rising, their beeps can become so relentless, and false alarms so numerous, that nurses become desensitized — sometimes leaving patients to die without anyone rushing to their bedside.

This is a very well-studied topic in human-automation interaction research. We can understand why false alarms are so prevalent in healthcare settings: if you were hooked up to a patient monitoring device, would you rather have (a) a machine that misses some important changes but rarely beeps (low false alarm rate + high miss rate), or (b) one that constantly beeps to flag the possibility that something is wrong but is itself frequently wrong (high false alarm rate + low miss rate)? You’d probably pick option (b), because of the inherent risk in missing a life-threatening critical event.
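A toy signal detection simulation makes the trade-off concrete (the distributions and thresholds below are invented, not taken from any real monitor): because "healthy" and "critical" readings overlap, any alarm threshold that reduces misses necessarily produces more false alarms.

```python
import random

def alarm_rates(threshold: float, n: int = 100_000) -> tuple[float, float]:
    """Simulate a vital-sign alarm; return (false_alarm_rate, miss_rate).

    Healthy readings are drawn from a distribution that overlaps the
    critical-event distribution, so no threshold separates them
    perfectly: lowering the threshold trades misses for false alarms.
    """
    random.seed(42)
    healthy = [random.gauss(0.0, 1.0) for _ in range(n)]   # no event
    critical = [random.gauss(2.0, 1.0) for _ in range(n)]  # true event
    false_alarms = sum(x > threshold for x in healthy) / n
    misses = sum(x <= threshold for x in critical) / n
    return false_alarms, misses

for t in (0.5, 1.0, 1.5, 2.0):
    fa, miss = alarm_rates(t)
    print(f"threshold={t:.1f}  false alarms={fa:.1%}  misses={miss:.1%}")
```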

But, as past research has shown (and the linked articles demonstrate), a high false alarm rate can have very detrimental effects on the person monitoring the alarm. Keep in mind: the nurses in the story DO NOT WANT to ignore the alarms! The article walks a fine line, coming close to blaming the user without quite doing so. The sheer number of alarms makes it difficult for nurses and other healthcare workers to differentiate true critical events from false alarms.

Automation in healthcare is a topic I’ve recently dipped my toes into, and it’s fascinating and complex. Here are some papers on false alarms and how operators/users are affected.

Dixon, S., Wickens, C. D., & McCarley, J. S.  (2007).  On the independence of compliance and reliance: Are automation false alarms worse than misses?  Human Factors, 49(4), 564-572.

Meyer, J.  (2001).  Effects of warning validity and proximity on responses to warnings.  Human Factors, 43(4), 563-572.

(photo: flickr user moon_child; CC by-NC 2.0)

Profiles in Human Factors: Dr. Julian Sanchez, Medtronic

This post is the first in our new series of human factors career profiles. Dr. Julian Sanchez was kind enough to answer my questions about his job and the journey he took to get there. Dr. Sanchez received his Ph.D. in psychology from the Georgia Institute of Technology and has worked in a variety of settings, from agricultural technology at Deere & Co. to aviation at the MITRE Corporation; he is currently with Medtronic in Minneapolis.

Anne: Hi Julian, let’s start with “Would you briefly describe your job and what you enjoy most about it?”

Julian: I work for a medical device company called Medtronic, within their Cardiac Disease Management division. I am part of the R&D group so I work alongside scientists of all disciplines on product ideas that are at least 5 years from making it to market. I help ensure that Human Factors and UX issues are considered early in the design process.

Implanted pacemakers and defibrillators have the capability of wireless communication with a receiver that then transmits all of the data from the patient’s heart to the doctor’s office. I mean, how can anyone think that working in this field is not the coolest thing?

Anne: Sounds like you like it!  How did you get interested in Human Factors as a career path?

Julian: To be honest, I wasn’t sure that I was really going to love HF as a career path until I did an internship at John Deere. This was only two years before getting my PhD, so thank god for that! I guess the internship really hit home that all of the theoretical principles that I had learned in grad school could be applied, AND there was a real thirst for it.

Anne: So, what skills from graduate school have you used the most?

Julian: During grad school I taught myself Flash, a prototyping tool. Besides HF knowledge, this has been the skill that has best served me. Being able to mock up a prototype gives you the ability to pitch ideas to other engineers and designers.

Anne: Neat. Ok, if you could tell your first-year graduate student self a single sentence, what would it be?

Julian: Great question. “Don’t rush.”

Children and Medication Errors – “Thanks, Mom and Dad!”

NPR had a story this morning about the high number of medication errors children experience and some ideas as to why.

In short summary:

  1. Kitchen spoons are inaccurate for giving “teaspoons” of medicine, and it doesn’t take much to give the little ones an overdose.
  2. Dose instructions are in teaspoons, but sometimes the cups that come with the bottle are in milliliters!
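Some quick arithmetic shows how far wrong both of these can go (the conversions are standard US values; the kitchen-spoon volumes are made-up but plausible):

```python
ML_PER_TSP = 4.93    # 1 US teaspoon in milliliters
ML_PER_TBSP = 14.79  # 1 US tablespoon in milliliters

dose_tsp = 1.0  # label says "1 teaspoon"

# Misreading "tsp" as a tablespoon roughly triples the dose.
print(f"intended dose: {dose_tsp * ML_PER_TSP:.1f} mL")
print(f"tablespoon by mistake: {dose_tsp * ML_PER_TBSP:.1f} mL "
      f"({ML_PER_TBSP / ML_PER_TSP:.1f}x the intended dose)")

# Kitchen spoons vary widely, so even "one spoonful" is unreliable.
for name, ml in {"small kitchen spoon": 2.5, "large kitchen spoon": 7.3}.items():
    print(f"{name}: {ml:.1f} mL ({ml / ML_PER_TSP:.2f} tsp)")
```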

I also remember hearing about surprising literacy problems when parents read labels: I've seen a video of a woman reading a label that abbreviated teaspoon as "tsp." and saying "tablespoons."

Primary source link for the curious:
Yin, H. S., Wolf, M. S., Dreyer, B. P., Sanders, L. M., & Parker, R. M. (2010). Evaluation of consistency in dosing directions and measuring devices for pediatric nonprescription liquid medications. JAMA. Published online November 30, 2010. doi:10.1001/jama.2010.1797

Photo credit Rakka

“Clomiphene” vs. “Clomipramine”

Found at the Consumerist blog:

The words “Clomiphene” and “Clomipramine” might look similar, but if you work in a pharmacy, you should know that they stand for very different things. Clomiphene is the generic version of the fertility drug Clomid. Clomipramine is a tricyclic antidepressant. A woman in Pittsburgh says that the pharmacy at a Giant Eagle grocery store gave her the antidepressant when she was prescribed the fertility drug. She had a severe allergic reaction and ended up in the emergency room.

The original article may be found here.

Too standardized? – The problem of tube identification in hospitals

When it comes to efficiency, standard sizes and connections save money and production effort, and they make it easy to substitute one part for another when something runs out. For example, I was delighted that the lid from one brand of pot perfectly fit my new frying pan. Unfortunately, there are times when we do not want parts of one object to fit another, because such compatibility can encourage dangerous errors.

The New York Times ran a recent article on the similarity between the many tubes used in hospitals for different purposes. These purposes include:

  • Intravenous use
  • Inflating blood pressure cuffs
  • Feeding tubes

Obviously, very different materials pass through each of these tubes, and mixing one with another can be deadly. The article discusses several cases where food or air was passed into patients' veins. The tubes entering the patient are often compatible with multiple sources of input, with no guard besides notoriously fallible human attention to prevent a mistake.
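The fix the article goes on to describe is essentially connector shape-coding: make the wrong connection impossible to assemble rather than merely inadvisable. A software analogy of the idea (hypothetical types, not any real device API) is to give each "tube" its own type, so a static type checker such as mypy rejects the misconnection outright:

```python
from dataclasses import dataclass

@dataclass
class IVLine:        # connector for medication going to a vein
    contents: str

@dataclass
class FeedingTube:   # connector for food going to the stomach
    contents: str

def infuse_vein(line: IVLine) -> None:
    """Accepts only an IVLine; the 'connector' is the type itself."""
    print(f"infusing {line.contents} intravenously")

infuse_vein(IVLine("saline"))           # fits: runs fine
# infuse_vein(FeedingTube("formula"))   # doesn't fit: a static type
#                                       # checker (e.g., mypy) rejects it
```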

From the NYT article:

…the F.D.A. has issued three alerts to hospitals and manufacturers warning about tube mix-ups, the most recent of which was sent out last month after The Times began asking about the issue. Ms. Pratt said she persuaded one manufacturer, Viasys, to produce neonatal feeding tubes that are incompatible with other tubing. Viasys’s tubing is now used in Sharp’s neonatal intensive-care units, but they are expensive — $13 compared with $1.50 for regular tubes.

“The regulators have been waiting for the manufacturers to come up with a solution,” Ms. Pratt said, “and the manufacturers won’t spend the money to design and produce something different until the regulators force them to. And now the international standards organization is taking forever to get the whole world onto the same page.”

Nancy Foster, vice president for quality and patient safety policy at the American Hospital Association, agreed: “These things are hard to change when you have to get so many different organizations to act in concert.”

Code Chartreuse – “Too many codes”

Enjoy memorizing this hospital sign!

How about just announcing the issue rather than matching it first with a color? For example: “Attention, tornado!” seems like it would be effective.

Elopement, by the way, means a patient (often one with Alzheimer's) has wandered off and needs to be located. That makes "purple" a code within a code (and makes me want to watch Inception again). This is also one of the few I could understand wanting to disguise with a color.

“Shooter” is another candidate for obfuscation, although I imagine the shooter would quickly figure out that any announcements were about them, while hospital denizens look around and say “Huh, we’ve never heard code silver before. Sounds like something to do with Alzheimer’s.”

Photo credit Jason Boyles.