By Sue Barnes, RN, CIC, FAPIC
Infection surveillance, once the primary task of infection preventionists (IPs), has over time assumed a more limited place within a massively expanded scope of IP responsibilities. Infection surveillance data are used to measure the success of infection prevention and control programs, to identify areas for improvement, and to meet public reporting mandates and pay-for-performance goals.
There has never been more federal and state legislative focus and funding dedicated to the prevention of HAIs, though none is earmarked to facilitate or innovate infection surveillance itself. The Department of Health and Human Services (HHS) National Action Plan to Prevent Healthcare-Associated Infections set national-level goals, beginning in 2009 and again in 2014, for reducing the most common and serious HAIs among the more than 5,000 federal acute-care and community hospitals in the U.S. The only action in the plan that supports surveillance is a requirement for validation surveys.1
Public Reporting and Validation of HAI Data
Views on the value of mandatory public reporting of HAIs vary. The most positive reports indicate that public reporting has helped to increase IP department resourcing, transparency, and accountability.1 Others argue that it has diverted limited IP department resources and, in some instances, led to gaming of rates to ensure that Medicare reimbursement and local merit award/compensation goals are attained.2
Federal and state requirements for public reporting of HAIs dictate that infection rate data be submitted to the National Healthcare Safety Network (NHSN). NHSN was designed for research rather than performance improvement, and consequently the resource burden on IP departments that submit infection data to NHSN is tremendous. In addition, the complex and frequently changing NHSN infection definitions create further challenges for those collecting data, finding cases, and reporting rates.3
When surveillance is performed manually, the data can suffer from poor inter-rater reliability and subjectivity.4 This is due in part to the complex definitions, but also to simple human error. “Institutional variability of infection preventionist rates relative to a computer algorithm suggests that there is significant variation in the application of standard surveillance definitions across medical centers. Variation in surveillance practice may complicate inter-institutional comparisons of publicly reported rates.”5 Presumably as a consequence of this variability, validation of NHSN HAI data is now required, which imposes a further burden on already resource-constrained IP departments.6
Pay for Performance (P4P)
The pay-for-performance programs that HAI rates affect are the Hospital Readmissions Reduction Program, Value-Based Purchasing (VBP), and the Hospital-Acquired Condition (HAC) Reduction Program. The number of healthcare quality measures used by P4P programs has increased dramatically in the last few years. According to the Wall Street Journal, the government now collects data on 1,675 quality measures through 33 different federal programs. These programs penalize facilities for select preventable adverse events by withholding a percentage of Medicare reimbursement.
For the VBP program, the at-risk amount for hospitals will increase to 2 percent of the base operating diagnosis-related group (DRG) payment in 2017. Also this year, the program will add a safety domain representing 20 percent of a hospital's total performance score, which will include the MRSA infection rate and the Clostridium difficile infection rate. Beginning in 2019, the safety domain will also encompass complication rates following elective primary total hip and total knee arthroplasty.
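The scale of these dollars is easy to illustrate. The sketch below is a simplified, hypothetical calculation of the VBP at-risk amount; the dollar figure and function name are illustrative, and actual CMS scoring and payback are far more complex than a flat percentage.

```python
# Illustrative only: the VBP at-risk withhold as a flat percentage of
# base operating DRG payments. Figures are hypothetical; real CMS
# scoring involves achievement/improvement points across domains.

AT_RISK_RATE = 0.02          # 2 percent of base operating DRG payments (2017)
SAFETY_DOMAIN_WEIGHT = 0.20  # safety domain share of the total performance score

def vbp_at_risk(base_drg_payments: float) -> float:
    """Dollars withheld up front under the 2 percent VBP at-risk rate."""
    return base_drg_payments * AT_RISK_RATE

# A hypothetical hospital with $50 million in base operating DRG payments
# has $1 million at risk; HAI measures (MRSA, C. difficile) feed the
# safety domain, which is 20 percent of the total performance score.
print(vbp_at_risk(50_000_000))  # 1000000.0
```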
The Hospital-Acquired Condition (HAC) Reduction Program is based on infection data published on Hospital Compare (https://www.medicare.gov/hospitalcompare/search.html). Hospitals that rank in the highest quartile for HACs are now subject to a 1 percent Medicare payment reduction. HACs associated with orthopedic surgery include surgical site infection following spine, neck, shoulder, and elbow procedures.
For the Hospital Readmissions Reduction Program, the maximum penalty is now 3 percent of a hospital's base operating DRG payment amount. The program imposes penalties for readmissions following pneumonia, elective total hip arthroplasty (THA), and total knee arthroplasty (TKA).
Although a number of studies have shown no appreciable improvement in quality of care from P4P programs, and instead a trend toward "up-coding," the programs are not expected to change any time soon, so IP departments will continue to be held accountable by hospital executives for reporting low HAI rates.
Technology: Automated vs. Manual Surveillance
Manual infection detection/case finding has been shown to be compromised by subjectivity and suboptimal inter-rater reliability.4 Automated infection detection, using algorithms within the electronic medical record (EMR) or specialized add-on datamining software, has been shown to improve data quality by eliminating these human errors, as well as to reduce the time spent on case finding.7-8 In light of the heavy burden imposed by the data elements NHSN requires, automating at least the import of denominator data to NHSN, whether with datamining software or directly from the EMR, is reported to help protect IP resources.9 However, this requires a level of medical informatics/information technology (IT) support that is not always accessible to infection prevention departments. With sufficient medical informatics/IT support, it is possible, even with incomplete EMR documentation, to accurately detect infections in an automated fashion without purchasing datamining software. One peer-reviewed journal describes the use of information technology tools for fully automated HAI surveillance: “This system employs knowledge bases that allow for the inherent un-sharpness of clinical terms and uncertainty of clinical conclusions. HAIs can be detected according to NHSN HAI surveillance definitions, and can bridge the gap between retrospective surveillance of HAIs and ongoing prospective clinical decision-oriented HAI support.”10
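To make the idea of algorithmic case finding concrete, the sketch below shows one hypothetical rule: flag blood cultures drawn while a central line has been in place for at least two days as possible central line-associated bloodstream infections (CLABSI). The record structures, field names, and two-day threshold are illustrative assumptions, not NHSN criteria, and any flagged case would still require IP review.

```python
# Hypothetical sketch of rule-based electronic case finding: join positive
# blood cultures to central-line records extracted from the EMR and flag
# possible CLABSI. Field names and threshold are illustrative only.

from dataclasses import dataclass
from datetime import date
from typing import List, Optional

@dataclass
class BloodCulture:
    patient_id: str
    collected: date
    organism: str

@dataclass
class CentralLine:
    patient_id: str
    inserted: date
    removed: Optional[date]  # None if the line is still in place

def flag_possible_clabsi(cultures: List[BloodCulture],
                         lines: List[CentralLine],
                         min_line_days: int = 2) -> List[BloodCulture]:
    """Return cultures drawn while a line had been in place >= min_line_days."""
    flagged = []
    for c in cultures:
        for ln in lines:
            if ln.patient_id != c.patient_id:
                continue
            in_place = ln.removed is None or ln.removed >= c.collected
            if in_place and (c.collected - ln.inserted).days >= min_line_days:
                flagged.append(c)  # candidate only; IP review confirms
                break
    return flagged
```

A system like this narrows the chart-review workload to flagged candidates rather than eliminating review entirely, which matches the partial automation described above.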
However, adoption of automation has been slow, possibly owing to the high cost of software and/or the limited medical informatics/IT resources available to IP departments.11 In 2014, the not-for-profit health IT researcher HIMSS Analytics estimated that only 45 percent of U.S. hospitals used electronic infection surveillance systems.12 And among those, the systems do not use the same infection detection algorithms, nor do they completely eliminate the need for manual record review to confirm case finding. Kenrick Cato, a researcher at the Columbia University School of Nursing, says a big problem is that competing electronic surveillance systems often can't work together to standardize algorithms, share patient data, and improve results. "If monitoring systems could be fully automated," he says, "it would revolutionize the way infection surveillance is done."12 One might question whether the manpower used for government-required validation of manual HAI surveillance would be better spent on a national effort to transition from manual to automated (and standardized) infection detection and upload of infection data.
Detection of Outbreaks
Traditionally, infection prevention and control departments have used laboratory culture reports to identify outbreaks of infections and infectious diseases. In more recent years, the EMR has been used to detect trends and outbreaks more promptly. There is no standard technology or algorithm, but multiple approaches are described in the literature. In one study, a cough-based algorithm using audio recordings is proposed for detection of pertussis outbreaks in developing countries.8 In another, an automated EMR algorithm is described for detection of multiple communicable diseases in Germany.13 And during the recent Ebola epidemic, Twitter was used as a real-time method of outbreak surveillance to monitor information spread and epidemic progress, and to examine the content of public knowledge and attitudes.14
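The common thread across these approaches is comparing current counts against an expected baseline. A minimal sketch of that idea, under the assumption of a simple mean-plus-two-standard-deviations threshold on weekly positive isolate counts, is shown below; production systems such as the German one cited above use considerably more sophisticated statistical models.

```python
# Illustrative threshold-based outbreak detection: flag any recent week
# whose count of positive isolates exceeds the baseline mean by more than
# z standard deviations. The threshold rule and sample counts are
# illustrative assumptions, not a published algorithm.

from statistics import mean, stdev
from typing import List

def flag_outbreak_weeks(baseline_counts: List[int],
                        recent_counts: List[int],
                        z: float = 2.0) -> List[int]:
    """Return indices of recent weeks whose counts exceed mean + z*SD of baseline."""
    threshold = mean(baseline_counts) + z * stdev(baseline_counts)
    return [i for i, n in enumerate(recent_counts) if n > threshold]

baseline = [2, 3, 1, 2, 4, 3, 2, 3]   # weekly positive isolates, prior weeks
recent = [3, 2, 9]                    # current weeks under surveillance
print(flag_outbreak_weeks(baseline, recent))  # [2] -- the 9-isolate week
```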
HAI surveillance remains central to infection prevention and control efforts. However, data collection, case finding, and reporting to NHSN are resource intensive, and the data are often inaccurate. To ensure that HAI data optimally support IP programs and meet federal and state reporting requirements, surveillance systems must keep pace with available technology to improve efficiency and accuracy. The path to this end point would be expedited by directing government dollars toward (standardized) surveillance automation and automated NHSN import rather than validation of manual HAI surveillance. In the short term, this will require hospital executives to partner with IP departments to leverage the rapidly evolving fields of medical informatics, datamining/surveillance software, and potentially social media as well, to ensure the best use of IP department resources for prevention and control of HAIs.
Sue Barnes, RN, CIC, FAPIC, is an independent clinical consultant. She is board-certified in infection prevention and control, and was granted the designation of Fellow of APIC in 2015 (FAPIC). She has been in the field of Infection Prevention since 1989. She has participated in the development of a number of APIC guides, and served as a speaker for organizations including AORN and APIC. In addition, Barnes has been published in journals including AORN Journal, American Journal of Infection Control, The Joint Commission Source for Compliance Strategies and the Permanente Journal. She served on the National APIC board of directors from 2010 to 2012, and the San Francisco chapter board of directors for the past 10 years.
1. Pegues DA. Translating and scaling the HHS Action Plan to Prevent Healthcare-associated Infections to the local level: experience of a Los Angeles Health System. Med Care. 2014 Feb;52(2 Suppl 1):S60-5.
2. Sjoding MW, Dimick JB, Cooke CR. Gaming hospital-level pneumonia 30-day mortality and readmission measures by legitimate changes to diagnostic coding. Crit Care Med. 2015 May;43(5):989-95.
3. See I, Soe MM, Epstein L, Edwards JR, Magill SS, Thompson ND. Impact of removing mucosal barrier injury laboratory-confirmed bloodstream infections from central line-associated bloodstream infection rates in the National Healthcare Safety Network, 2014. Am J Infect Control. 2016 Nov 14. pii: S0196-6553(16)30968-3.
4. Nuttall J, Evaniew N, Thornley P, Griffin A, Deheshi B, O'Shea T, Wunder J, Ferguson P, Randall RL, Turcotte R, Schneider P, McKay P, Bhandari M, Ghert M. The inter-rater reliability of the diagnosis of surgical site infection in the context of a clinical trial. Bone Joint Res. 2016 Aug;5(8):347-52.
5. Lin MY, Hota B, Khan YM, Woeltje KF, Borlawsky TB, Doherty JA, Stevenson KB, Weinstein RA, Trick WE; CDC Prevention Epicenter Program. Quality of traditional surveillance for public reporting of nosocomial bloodstream infection rates. JAMA. 2010 Nov 10;304(18):2035-41.
6. Lempp JM, Cummings MJ, Birnbaum DW. Distribution of Central Line-Associated Bloodstream Infections Determined From Washington State's Annual Reporting Validation Program, 2009-2013. Infect Control Hosp Epidemiol 2016:1-4.
7. Branch-Elliman W, Strymish J, Gupta K. Development and validation of a simple and easy-to-employ electronic algorithm for identifying clinical methicillin-resistant Staphylococcus aureus infection. Infect Control Hosp Epidemiol. 2014 Jun;35(6):692-8.
8. Pramono RX, Imtiaz SA, Rodriguez-Villegas E. A Cough-Based Algorithm for Automatic Diagnosis of Pertussis. PLoS One. 2016 Sep 1;11(9):e0162128.
9. Vostok J, Lapsley W, McElroy N, Onofrey S, McHale E, Johnson N, DeMaria A. Assessment of the burden of mandatory reporting of health care-associated infection using the National Healthcare Safety Network in Massachusetts. Am J Infect Control. 2013 May;41(5):466-8. doi: 10.1016/j.ajic.2012.05.021. Epub 2012 Oct 24.
10. Koller W, de Bruin JS, Rappelsberger A, Adlassnig KP. Advances In Infection Surveillance and Clinical Decision Support With Fuzzy Sets and Fuzzy Logic. Stud Health Technol Inform. 2015;216:295-9.
11. Hebden JN. Slow adoption of automated infection prevention surveillance: are human factors contributing? Am J Infect Control. 2015 Jun;43(6):559-62.
12. Levingston S. Software That Helps Spot Sneaky Infections. Bloomberg Terminal online. Dec. 18, 2014. https://www.bloomberg.com/news/articles/2014-12-18/software-that-fights-infections-in-humans-at-the-hospital
13. Salmon M, Schumacher D, Burmann H, Frank C, Claus H, Höhle M. A system for automated outbreak detection of communicable diseases in Germany. Euro Surveill. 2016;21(13).
14. Odlum M, Yoon S. What can we learn about the Ebola outbreak from tweets? Am J Infect Control. 2015 Jun;43(6):563-71.