Lucy A. Savitz, Susan L. Moore, Walter Biffl, Connie Price, Heather Gilmartin
Abstract
Healthcare-associated infections (HAIs) have a significant impact on health care quality and cost. Four health care delivery systems collaborated in a participatory research approach to enhance surgical site infection (SSI) detection and surveillance for selected procedures through use of an automated tool. A planned mixed methods approach was used to model risk factor data available from electronic medical records, conduct algorithm testing and validation, and incorporate structured feedback from surgeons and infection control nurses with regard to tool acceptance and implementation. A version of the tool was tested on a retrospective cohort to assess performance versus surveillance through manual chart review. The tool achieved 100 percent sensitivity in detecting SSIs previously identified and 72 percent specificity in detecting SSIs meeting National Healthcare Safety Network guidelines. At an estimated manual review burden of 20 minutes per chart, a time savings of 456 hours through algorithmic surveillance was calculated. Although these results are promising, infection preventionists (IPs) were reluctant to depend solely on such a tool for surveillance; however, tool use for real-time detection was deemed desirable. Strategies such as the algorithm developed and tested in this study show potential to both improve efficiency and reduce cost without compromising quality.
Introduction
Healthcare-associated infections (HAIs) are a persistent problem that has plagued hospitals in the United States and around the world. Numerous attempts to improve the quality and safety of care over the years have yielded only temporary gains with minimal lasting impact. For this project, we used a participatory research approach to yield a clinically relevant toolkit that could be implemented in a variety of delivery system settings.1
Prevention of adverse events is a longstanding challenge for hospitals and a priority for the Center for Medicare & Medicaid Innovation (CMMI) through its Partnership for Patients initiative (http://innovation.cms.gov/initiatives/Partnership-for-Patients/). Under this initiative, U.S. hospitals were expected to reduce hospital-acquired conditions (HACs) by 40 percent and readmissions by 20 percent by the end of 2013. Hospital Engagement Networks (HENs)a are charged with reducing four HAIs that fall within CMMI's targeted HACs, including surgical site infections (SSIs). Intermountain Healthcare leads one of the HENs, and a Denver Health (DH) subject matter expert, Dr. Connie Price, has been working through this initiative with hospitals across the United States to reduce SSIs. Some of the biggest challenges facing partner HEN hospitals in reducing SSIs include a resource-constrained environment and multiple, competing quality and safety initiatives. This paper describes our efforts to provide meaningful research results with high operational utility in reducing SSIs.
The purpose of this project was to explore opportunities for enhancing the detection and surveillance of inpatient-acquired SSIs for four target procedures—herniorrhaphy, coronary artery bypass graft (CABG) surgery, and hip and knee arthroplasty (including primary total arthroplasty, primary hemiarthroplasty, and revision procedures). The objective was to create and implement an algorithmic process for predicting and/or identifying those patients at risk for an SSI, using electronic risk factor data newly available through enhanced data repositories and accessible by hospital systems with electronic medical records (EMRs). Specific details of the quantitative modeling work are reported elsewhere.2 The project was based on a longstanding HAI detection trigger system built and administered by investigators at Intermountain Healthcare since the mid-1980s.3 The project presented an opportunity to (1) update this work, (2) directly engage clinical perspectives from the fields of surgery and infection prevention on currently excluded risk factors important to such models, and (3) assess the impact of such a tool when implemented in the work environment of infection prevention staff.
Investigators from multiple delivery systems came together to provide the most representative results and generalizable tools. Collaborating delivery systems were DH (a safety net hospital located in Denver, CO), Intermountain Healthcare (a large, nonprofit, integrated delivery system based in Salt Lake City, UT), and the Salt Lake City Veterans Affairs (VA) Medical Center (a VA hospital located in Salt Lake City, UT). Representativeness was further extended by including the Vail Valley Medical Center (a rural community hospital located in Vail, CO), a DH partner.
a Hospital engagement networks (HENs) are one component of the Partnership for Patients, a Centers for Medicare & Medicaid Services (CMS) initiative. Over 3,700 hospitals are participating in 26 HENs around the country.
Methods
This participatory research effort was intentionally designed as a multiphase, sequential mixed methods study with iterative tasks, whereby one level of inquiry informed the next. Active involvement of delivery systems in research is intended to accelerate the uptake of research results and build trusting relationships that support a foundation for longer term studies.4 Further, the objectivity of outside observation, balanced with the richness of end-user knowledge, has been shown to enhance validity and foster credibility from both a rigorous research perspective and a clinical relevance perspective. Foundational work in establishing priorities for comparative effectiveness research has further shown the need to actively engage end users in the full spectrum of such research.5
Our mixed methods approach embraced the participatory research paradigm—answering relevant questions that address problems and priorities experienced in these settings.6 Participation was considered at every stage of the research process.7,8 The research team comprised health services researchers, infectious disease physicians, a surgeon, and an infection preventionist (IP). This interdisciplinary team identified relevant research questions, designed the study, and established an analytic plan in collaboration with the AHRQ Task Order Officer, Dr. Kendall Hall, together with a team of subject matter experts from the Centers for Disease Control and Prevention (CDC): Dr. Sandra Berrios-Torres, Dr. Teresa Horan, and Dr. Jonathan Edwards. Throughout the project, broader end-user perspectives were engaged via three focus groups, each lasting 1 to 2 hours, with six surgeons (one group) and 13 infection control nurses (two groups, of eight and five participants). A manual, open, heuristic coding process was used to identify topics and themes from the focus group data. Results from the surgeons' focus group provided insight into how tool dissemination and adoption might be promoted among surgeons and informed the selection of common risk factors for analysis. This detailed input was codified by data analysts in testing the developed tools at each of the four participating health care delivery systems, with adjustments made to accommodate different data structures where necessary.2 Finally, feedback obtained through the two nurses' focus groups was used to produce a supporting user guide and implementation manual.
We used a planned mixed methods approach for this project. The Office of Behavioral and Social Sciences Research provides a review of best practices in mixed methods studies.9 More recent work by Zhang and Creswell10 focuses on mixing procedures, noting that "Mixing in mixed methods is more than just the combination of two independent components of quantitative and qualitative data." Among the alternative mixing procedures the authors delineate, ours was a connected approach: the qualitative and quantitative portions of the project were linked such that one approach built on the findings of the other.
We began by employing a modified Delphi process with six surgeons attending the 5th annual Academic Surgical Congress in San Antonio, TX, to identify risk factors for improved identification of SSIs. In particular, we were interested in expanding the risk factors currently under consideration by hospital surveillance systems. This expert consensus exploration of risk factors was coupled with an in-depth focus group discussion with the same surgeons, who were recruited through professional networking; both activities took place in person at the conference, adjacent to a session on SSIs. Results from both the rank ordering (i.e., Delphi) exercise and the in-depth discussion were used to inform quantitative modeling. A union set of 33 common risk factors, identified and ranked through the surgeons' discussion and confirmed as electronically available in participating institutions' data systems, was used in modeling. Markov chain Monte Carlo multiple imputation was used to account for missing data values. Independent association between potential risk factors and SSI was determined through univariate regression. Binary logistic regression was used to evaluate each variable's relationship with SSI occurrence. The final model included risk factors with a p-value < 0.05 or that contributed to the predictive value of the model.1 Electronic algorithms were created to detect both deep and organ-space SSIs, using both recursive partitioning and simplified methods based on abnormal laboratory values or the presence of postoperative microbiology or antimicrobial data. Figure 1 depicts the classification tree algorithm structure, showing the identification of positive values through compact logic.1 Algorithm development, training, and testing are discussed in detail in a separate paper in this report.2
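The univariate screening step can be sketched in code. This is an illustrative sketch only, not the study's implementation: the risk factor names and counts are hypothetical, and a Pearson chi-square test on 2×2 exposure-by-outcome tables stands in for the univariate regression described above.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic and p-value (1 df) for the 2x2 table
    [[a, b], [c, d]]: rows = factor present/absent, columns = SSI/no SSI."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    # Chi-square tail probability with 1 df: P(X > stat) = erfc(sqrt(stat/2))
    return stat, math.erfc(math.sqrt(stat / 2))

# Hypothetical counts: (SSI with factor, no-SSI with factor,
#                       SSI without factor, no-SSI without factor)
factors = {
    "diabetes": (30, 170, 20, 380),
    "smoking": (12, 188, 14, 386),
}
retained = [name for name, counts in factors.items()
            if chi2_2x2(*counts)[1] < 0.05]
```

Factors passing the screen would then enter the multivariable logistic regression; with these hypothetical counts, only `diabetes` survives the p < 0.05 cutoff.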
Figure 1 Classification Tree Algorithm Identification
Source. Price CS, Savitz LA. Improving the Measurement of surgical site infection risk stratification/outcome detection. Final Report (Prepared by Denver Health and its partners under contract 290-2006-00-20). AHRQ Publication No. 12-0046. Rockville, MD: Agency for Healthcare Research and Quality; March 2012. Available at http://psnet.ahrq.gov/resource.aspx?resourceID=25139. Accessed March 26, 2014.
Note: A, B, and C indicate conditions. Operators are represented as follows: NOT by the ~ symbol, INTERSECTION by the ∩ symbol, and UNION by the U symbol.
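The compact logic in Figure 1 amounts to Boolean expressions evaluated over per-patient trigger conditions. A minimal sketch, assuming hypothetical conditions A, B, and C and the illustrative rule (A ∩ ~B) ∪ C:

```python
def flag_for_review(record):
    """Illustrative trigger rule (A AND NOT B) OR C, i.e., (A ∩ ~B) ∪ C."""
    return (record["A"] and not record["B"]) or record["C"]

# Hypothetical patient records with precomputed trigger conditions
patients = [
    {"id": 1, "A": True, "B": False, "C": False},   # flagged via A AND NOT B
    {"id": 2, "A": True, "B": True, "C": False},    # suppressed by B
    {"id": 3, "A": False, "B": False, "C": True},   # flagged via C
]
to_review = [p["id"] for p in patients if flag_for_review(p)]  # [1, 3]
```

Only charts whose records satisfy the rule are queued for manual review; the rest are screened out automatically.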
Following conclusion of the quantitative data analysis, including updated risk factor modeling at the surgical procedure level, we conducted two focus groups with infection control nurses to learn about the e-detection tool developed in this study, talk about SSI surveillance and the challenges that individual IPs face in their institution, and discuss implementation and changes in the standard surveillance process that could be facilitated through the use of the electronic tool. The first nursing focus group was conducted in Denver, CO, with five participants recruited from among members of the Mile High Chapter of the Association for Professionals in Infection Control, and the second nursing focus group was conducted in Salt Lake City, UT, with eight participants recruited from hospitals in the Intermountain Healthcare system. The feedback we received from the focus groups provided an understanding of the challenges institutions would confront in implementing an automated surveillance tool in their care delivery system and informed development of a manual for implementation. Testing of the tool in multiple delivery system settings identified a shared perspective that the tool would reduce the work burden associated with chart abstraction, allowing providers to focus their work effort on high-risk cases for SSI prevention.
Challenges were addressed throughout the testing, and the tool was re-worked for maximum applicability for the diverse settings and EMRs. Data were pooled across the four health care system settings to conduct quantitative analyses. Interviews with programmers were used to document effort time and the source of data. These findings, along with the data from the focus groups, were documented and used in developing a user toolkit to support adoption outside the delivery system settings in which the study was conducted.b
b This toolkit is available on the Intermountain-led Hospital Engagement Network (HEN) Web site at http://www.henlearner.org, in the Resources section under the Surgical Site Infection subsection, as an AHRQ report.
Results
The DH infection prevention team sought to further adapt, tailor, and validate the electronic detection algorithm created for use in everyday surveillance of SSIs at DH, in order to reduce the burden of chart review while identifying a high percentage of SSIs. Prior to implementing the electronic tool, infection prevention personnel manually reviewed all charts for SSI surveillance purposes based on culture results. DH tested the tool in its integrated system to determine whether it would be possible to reduce cost associated with chart review hours by staff, while maintaining and/or improving the quality of their SSI surveillance efforts. The mandate for the algorithm's application was to maximize sensitivity at the expense of specificity, while realizing a meaningful reduction in chart review burden.
To test the sensitivity of the tool, DH's Infection Prevention Data Manager generated a retrospective cohort of procedures, including associated SSIs as defined by National Healthcare Safety Network (NHSN) guidelines, using DH surveillance data from 2007-2010. The procedures reviewed included hip and knee arthroplasty, abdominal and vaginal hysterectomy, spinal fusion, craniotomy, and herniorrhaphy. The modified algorithm identified 804 procedures (37 percent of total charts for that time period) for review. The percentage of total procedures identified for review varied by procedure type, from 15 percent of herniorrhaphy to 62 percent of craniotomy. After manual review by infection control staff, the modified algorithm was determined to have achieved 100 percent sensitivity in detecting SSIs previously identified through traditional surveillance and 72 percent specificity in detecting SSIs meeting NHSN definitions, validated on 4 years of DH's manual SSI surveillance data using NHSN methodology.
Over this 4-year period, 1,375 unnecessary chart reviews could have been avoided without sacrificing detection of a single SSI using the modified surveillance algorithm. Assuming 20 minutes per chart for traditional manual review, a time savings of 456 hours, or 57 full (8-hour) days of chart review, could have been realized using the modified algorithm for surveillance of SSI in hip and knee arthroplasty, abdominal and vaginal hysterectomy, spinal fusion, craniotomy, and herniorrhaphy at DH.
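The burden reduction is simple arithmetic; a sketch using the 20-minutes-per-chart assumption above (the function name is ours):

```python
def review_time_saved(charts_avoided, minutes_per_chart=20):
    """Convert avoided chart reviews into hours and 8-hour workdays."""
    hours = charts_avoided * minutes_per_chart / 60
    return hours, hours / 8

hours, days = review_time_saved(1375)  # about 57 eight-hour days avoided
```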
The major constraint to this additional project involved the DH programmer's time, which was required to pull the SSI data into a standardized format. Pooled data were analyzed by current staff whose time was covered by the AHRQ project budget. Making the business case for uptake of the automated surveillance program was a key feature of this work. Although the automated module produced promising quantitative results in identifying SSIs, the perceptions of infection prevention professionals from our focus groups have been instrumental in providing additional support for the decision to implement and institutionalize this automated surveillance system. In summary, the IPs present at the Denver focus group expressed a desire for a free electronic surveillance tool that could enhance current surveillance methods and support the validity of current findings, while providing an opportunity for real-time notification of at-risk patient events if the department were able to respond in a timely manner. The IPs were hesitant to consider the electronic tool as the sole method for HAI surveillance, mostly due to the high risk of false positives and the challenges of interpreting HAI definitions in diverse populations. The sustainability of an institutional surveillance program was noted to be person-specific. An electronic tool that could push data to the IP department, enabling real-time surveillance at all times, was deemed an added benefit.
Discussion
One of the strengths of our research approach was to rigorously differentiate between risk factors for and manifestations of SSI, using a mixed methods approach with engaged delivery system investigator participation. Risk factor data can supply additional information that improves detection performance, but embedding such data in the algorithm could also constrain the risk analyses that surveillance systems using it are able to perform. We anticipated that the main characteristics facilitating the tool's acceptability would be high sensitivity and a low number of charts requiring review per identified SSI.
Our approach sought to capitalize on the superior specificity of human reviewers, the growing wealth of electronic data, and the speed of automated systems. If charts are reviewed in roughly 20 minutes,11 and the fraction of SSI among procedures is roughly one percent,12 then 33 hours of review could be anticipated for every SSI found. If electronic tools could effectively remove 80 percent of charts, then only 6.6 hours would be spent for every SSI found. The impact of such savings may be large. The Virginia requirement for statewide detection/reporting would require 160 IPs at a cost of $11.5 million. More than 50 percent of IP time is spent at the desk13—time that could be applied to implementation, education, and other effective activities. A noted limitation of the tool is the need for an integrated EMR system that ideally would include postoperative visits and outpatient pharmacy and laboratory data. In addition, the algorithm requires the time of information technology specialists to build and maintain it. This could be a challenge for institutions that do not have strong advocates for the infection prevention program.
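The per-SSI review burden above follows from straightforward arithmetic; a sketch (function and parameter names are ours):

```python
def hours_per_ssi_found(ssi_rate, minutes_per_chart=20, fraction_filtered=0.0):
    """Expected chart-review hours per SSI detected, given the SSI rate
    and the fraction of charts removed by electronic screening."""
    charts_per_ssi = 1 / ssi_rate                      # 1% rate -> 100 charts per SSI
    reviewed = charts_per_ssi * (1 - fraction_filtered)
    return reviewed * minutes_per_chart / 60

baseline = hours_per_ssi_found(0.01)                          # about 33 hours
filtered = hours_per_ssi_found(0.01, fraction_filtered=0.80)  # under 7 hours
```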
Conclusion
Our surveillance tool has the potential to improve the work environment of infection prevention staff, moving them from their desks to the units where they can focus on the activities that prevent infections. Further, the surveillance system provides cognitive support for the human elements of traditional practice (i.e., chart review and "shoe leather" epidemiology), augmented by available electronic data. Automated surveillance programs can:
- Provide quality assurance for current practice.
- Reduce the burden of chart review.
- Identify patterns of infection that might suggest opportunities for process improvement/reengineering to enhance quality and safety.
- Enhance the work environment for infection prevention staff.
- Meet mandatory, hospital-wide reporting of SSI for value-based payments.
- Offer a publicly available electronic surveillance tool in place of an expensive, proprietary data mining surveillance tool such as TheraDoc™ or MedMined®, which can cost up to $150,000, require a separate server, and carry continuing maintenance/upgrade fees.
Patient safety improvement strategies, such as the surveillance algorithm developed and tested in this study, leverage electronic data, freeing up clinical resources (e.g., the reduced need for chart review and abstraction in this study). Such approaches provide critical tools for simultaneously reducing cost and improving quality.
Acknowledgments
This project was supported by the Agency for Healthcare Research and Quality (AHRQ) through the DH ACTION Contract No. 290-2006-00-20, Task Order 8. The authors wish to acknowledge the support and guidance received from Kendall Hall, MD the AHRQ Task Order Officer; and Sandra Berrios-Torres, Teresa Horan, and Jonathan R. Edwards representing the CDC as Technical Experts on the project. The findings and conclusions in this document are those of the authors, who are responsible for its content, and do not necessarily represent the views of AHRQ. No statement in this report should be construed as an official position of AHRQ or the U.S. Department of Health and Human Services.
Authors' Affiliations
Intermountain Healthcare Institute for Health Care Delivery Research, Salt Lake City, UT (LS). Denver Health, Denver, CO (SLM, WB, CP). University of Colorado, College of Nursing, Denver, CO (HG).
References
1. Price CS, Savitz LA. Improving the measurement of surgical site infection risk stratification/outcome detection. Final report (Prepared by Denver Health and partners under Contract No. 290-2006-00-20). AHRQ Publication No. 12-0046-EF. Rockville, MD: Agency for Healthcare Research and Quality; August 2012. Available at http://psnet.ahrq.gov/resource.aspx?resourceID=25139. Accessed March 24, 2014.
2. Rubin MA, Jones M, Huntington JL, et al. Screening for surgical site infections by applying classification trees to electronic data. In Advances in the Prevention and Control of HAIs (AHRQ Publication No. 14-0000). Rockville, MD: Agency for Healthcare Research and Quality; May 2014.
3. Evans RS, Larsen RA, Burke JP, et al. Computerized surveillance of hospital-acquired infection. JAMA 1986 Aug; 256(8):1007-11. PMID: 3735626.
4. Opportunities for advancing delivery system research. In: AHRQ Expert Meeting on the Challenge and Promise of Delivery System Research: Meeting summary. February 16-17, 2011; Sterling, VA. Rockville, MD: Agency for Healthcare Research and Quality; 2012. Available at http://www.ahrq.gov/professionals/systems/system/delivery-system-initiative/index.html. Accessed March 25, 2014.
5. Institute of Medicine. Initial National Priorities for Comparative Effectiveness Research. Washington, DC: National Academies Press; 2009.
6. Bayley KB, Moore SL, Savitz LA. Informing mixed methods research with a participatory perspective: Practical examples in delivery systems. Health Serv Res; in press.
7. Israel BA, Eng E, Schulz AJ, et al (eds). Methods in community-based participatory research for health. San Francisco, CA: Jossey-Bass Publishers; 2005.
8. AHRQ activities using community-based participatory research to address health care disparities. AHRQ Publication No. 09-P012. Rockville, MD: Agency for Healthcare Research and Quality; September 2009.
9. Creswell JW, Klassen AC, Plano VL, et al. Best practices for mixed methods research in the health sciences. Rockville, MD: National Institutes of Health, Office of Behavioral and Social Sciences Research; August 2011. Available at http://obssr.od.nih.gov/mixed_methods_research/. Accessed March 26, 2014.
10. Zhang W, Creswell J. The use of 'mixing' procedure of mixed methods in health services research. Med Care 2013 Aug;51(8):e51-7. PMID: 23860333.
11. Brossette SE, Hacek DM, Gavin PJ, et al. A laboratory-based, hospital-wide, electronic marker for nosocomial infection: the future of infection control surveillance? Am J Clin Pathol 2006;125(1):34-9. PMID: 16482989.
12. Gaynes RP, Culver DH, Horan TC, et al. Surgical site infection (SSI) rates in the United States, 1992-1998: the National Nosocomial Infections Surveillance System basic SSI risk index. Clin Infect Dis 2001 Sep;33(Suppl 2):S69-77. PMID: 11486302.
13. Edmond MB, White-Russell MB, Ober J, et al. A statewide survey of nosocomial infection surveillance in acute care hospitals. Am J Infect Control 2005 Oct;33(8):480-2. PMID: 16216664.