In the future, medical devices will need to gather and process a deluge of data and present what’s most important for healthcare professionals to address—without overloading or misdirecting them. This was the prediction made by a panel of human factors engineering (HFE) experts who presented at AAMI’s Future Direction of HFE in Medical Device Development webinar on May 5.
“My biggest fear is information overload,” said speaker Jeff Hersh, PhD, MD, chief medical officer for GE Healthcare. “As the devices get more and more connected, and we have more devices, and we can monitor more and more things, where do we draw the line? I think there’s already an overload today, and I foresee that getting worse. I think there will need to be some deep machine learning and other ways to address it. Otherwise, everyone sees the trees and not the forest.”
Hersh and the rest of the panel, which included John Battista, principal human factors engineer at Medtronic; Terry Fairbanks, director of the National Center for Human Factors in Healthcare at MedStar Health; Julian Goldman, director of the medical device interoperability program at Massachusetts General Hospital; and Stuart Karten, founder and president of Karten Design, were asked to envision how people will interact with medical devices in the future. During the discussion, moderated by Jonathan Kendler, design director at UL, the panel forecast advances in technology that could change the way clinicians interact with medical devices and their patients. Examples included holograms that could visualize medical imaging in three dimensions, passive sensors that would automatically collect and enter patient data in real time, and audio command interfaces similar to consumer products like Siri or Cortana.
Goldman predicted that device manufacturers will have much more control over their devices through usability feedback and the ability to apply more frequent software updates. He called for the implementation of a “black box,” similar to flight data recorders on airplanes, to log device data, and for a national system to report design, user interface, and equipment issues.
“For human factors, I say that there will be a need for much larger data sets and big data to look at big trends and relationships…We don’t have a way to peer inside of a hospital during the day and look at the interaction with hundreds of devices. We can’t determine whether people are consistently hitting the wrong button when they’re trying to accomplish something. Or overshoot a number and then adjust it back down,” Goldman said. “We just don’t have a way to monitor that today. We need vast amounts of data to better understand these things so we can improve equipment design and usability.”
Ultimately, that information can help guide the design process and reduce use errors. But human factors must be incorporated earlier in a device’s design phase, Fairbanks said, and into the underlying culture of reporting adverse events. “The answer to better design as complexity increases is better early human factors work and understanding environmental constraints and how human performance is going to be in that environment,” Fairbanks said. “The culture of healthcare is when a use error occurs, healthcare providers blame themselves or others blame them—rather than looking at the contributing factor from the design.”
To reduce potential use errors, the Food and Drug Administration released a series of recommendations earlier this year that encouraged manufacturers to incorporate human factors testing and usability engineering processes into device development. A revision of ANSI/AAMI HE75:2009(R)2013, Human Factors Engineering—Design of Medical Devices, is also underway to bring the standard up to date with current human factors practices, including the addition of chapters on combination products and integrated systems. AAMI expects the revision to be completed in 2018.