Automated System Better Identifies Patients at Risk for VAP

An automated system for identifying patients at risk for complications associated with the use of mechanical ventilators provided significantly more accurate results than traditional surveillance methods, which rely on manual recording and interpretation of individual patient data. In their paper published in Infection Control & Hospital Epidemiology, a Massachusetts General Hospital (MGH) research team reports that their system, which uses an algorithm developed through a collaboration among the hospital's Division of Infectious Diseases, Infection Control Unit, and Clinical Data Animation Center (CDAC), was 100 percent accurate in identifying at-risk patients when provided with the necessary data.

"Ventilator-associated pneumonia is a very serious problem that is estimated to develop in up to half the patients receiving mechanical ventilator support," says Brandon Westover, MD, PhD, of the MGH Department of Neurology, director of CDAC and co-senior author of the report. "Many patients die each year from ventilator-associated pneumonia, which can be prevented by following good patient care practices, such as keeping the head of the bed elevated and taking measures to prevent the growth of harmful bacteria in patients' airways."

Traditional surveillance of patients receiving mechanical ventilation involves manually recording ventilator settings every 12 hours, usually by a respiratory therapist; those settings are adjusted throughout the day to accommodate the patient's needs. The recorded values, which reflect the pressure required to keep a patient's lungs open at the end of a breath (positive end-expiratory pressure, or PEEP) and the percentage of oxygen being delivered to the patient (FiO2), are reviewed by an infection control practitioner for signs of possible ventilator-associated pneumonia.

Lead and corresponding author Erica Shenoy, MD, PhD, of the MGH Division of Infectious Diseases and the Infection Control Unit, and hospital epidemiology lead for CDAC, says, "In our study, manual surveillance made many more errors than automated surveillance, including false positives, reporting cases that, on review, did not meet criteria for what are called ventilator-associated events; misclassifications, reporting an event as more or less serious than it really was; and failures to detect and report cases that, on closer inspection, actually met criteria. In contrast, so long as the necessary electronic data were available, the automated method performed perfectly."

Updated surveillance standards issued in 2013 by the National Healthcare Safety Network of the U.S. Centers for Disease Control and Prevention (CDC) specified three levels of ventilator-associated events, which can be thought of as corresponding to yellow, orange and red alerts to the risk or presence of ventilator-associated pneumonia:

Ventilator-associated condition (VAC) - an increase in a patient's need for oxygen without evidence of infection,
Infection-related ventilator-associated complication (IVAC) - increased oxygen need accompanied by signs of infection, such as fever, elevated white blood cell count or an antibiotic prescription,
Possible ventilator-associated pneumonia (PVAP) - evidence of bacterial growth in the respiratory system, along with the factors listed above.

The CDC specifications were designed to enable large-scale, automated surveillance for ventilator-associated pneumonia, allowing efficient monitoring of infection rates throughout a hospital or a hospital system. To reduce the time required to manually record and review ventilator settings and medical charts, along with the possibility of human error, members of the MGH research team developed an algorithm to provide automated, real-time monitoring of both ventilator settings and information from the electronic health record. Based on that data, the algorithm determined whether criteria were met for a ventilator-associated event and, if so, which level of event: VAC, IVAC, or PVAP.
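
The paper's own algorithm is not reproduced here, but the tiered definitions above lend themselves to a rule-based check over daily ventilator and chart data. The Python sketch below is only a simplified illustration under assumed thresholds (a sustained rise of at least 3 cm H2O in daily minimum PEEP or 0.20 in daily minimum FiO2 for VAC, fever or an abnormal white cell count plus a new antimicrobial for IVAC, and a positive respiratory culture for PVAP); the field names, event window, and baseline handling are assumptions, not the MGH implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class VentDay:
    """Daily minimum ventilator settings and chart findings for one calendar day."""
    min_peep: float          # daily minimum PEEP, cm H2O
    min_fio2: float          # daily minimum FiO2, as a fraction (0.21 to 1.0)
    temp_abnormal: bool      # temperature above 38 C or below 36 C
    wbc_abnormal: bool       # white cell count elevated or depressed
    new_antimicrobial: bool  # new antimicrobial started and continued
    positive_culture: bool   # qualifying respiratory culture (simplified)

def classify_vae(days: List[VentDay]) -> Optional[str]:
    """Return 'VAC', 'IVAC', 'PVAP', or None for a run of ventilated calendar days.

    Simplified reading of the tiered CDC criteria: a two-day baseline followed
    by a two-day sustained rise in daily minimum PEEP (>= 3 cm H2O) or in
    daily minimum FiO2 (>= 0.20).
    """
    for onset in range(2, len(days) - 1):
        base_peep = max(d.min_peep for d in days[onset - 2:onset])
        base_fio2 = max(d.min_fio2 for d in days[onset - 2:onset])
        worsened = days[onset:onset + 2]
        vac = (all(d.min_peep >= base_peep + 3 for d in worsened) or
               all(d.min_fio2 >= base_fio2 + 0.20 for d in worsened))
        if not vac:
            continue
        # Look for infection signs and new antimicrobials around the event onset.
        window = days[max(0, onset - 2):onset + 3]
        ivac = (any(d.temp_abnormal or d.wbc_abnormal for d in window) and
                any(d.new_antimicrobial for d in window))
        if ivac and any(d.positive_culture for d in window):
            return "PVAP"
        return "IVAC" if ivac else "VAC"
    return None
```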

Initial testing and debugging of the automated system were carried out from January through March of 2015 in four MGH intensive care units. During that time, 1,325 patients were admitted to the units, 479 of whom received ventilator support. A retrospective analysis comparing manual with automated surveillance of data gathered from patients cared for during this development period found that the automated system was 100 percent accurate in detecting ventilator-associated events, distinguishing patients with such events from those without, and predicting the development of ventilator-associated pneumonia. In contrast, the accuracy of manual surveillance for those same measures was 40 percent, 89 percent and 70 percent, respectively.

A validation study to further test the algorithm was conducted using data from a similar three-month period the following year, during which 1,234 patients were admitted to the ICUs, 431 of whom received ventilator support. During that period, manual surveillance achieved accuracies of 71 percent, 98 percent and 87 percent on the same measures, while the automated system achieved 85 percent, 99 percent and 100 percent. The drop in the automated system's accuracy during the validation period reflects a temporary interruption in data availability while software was being upgraded; the team subsequently developed a monitoring system to alert staff to any future interruptions.
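
The article does not describe how that monitoring system works. As a purely hypothetical sketch, one simple approach is to scan the timestamped feed of ventilator records for gaps longer than the expected recording interval (assumed here to be 12 hours) and raise an alert when one is found.

```python
from datetime import datetime, timedelta
from typing import List, Tuple

def find_data_gaps(timestamps: List[datetime],
                   max_gap: timedelta = timedelta(hours=12)) -> List[Tuple[datetime, datetime]]:
    """Return (start, end) pairs where consecutive records are farther apart
    than max_gap, so staff can be alerted to an interrupted data feed.
    Hypothetical sketch; the MGH monitoring system is not described in the article."""
    ordered = sorted(timestamps)
    return [(a, b) for a, b in zip(ordered, ordered[1:]) if b - a > max_gap]
```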

Westover says, "An automated surveillance system could relieve the manual effort of large-scale surveillance, freeing up more time for clinicians to focus on infection prevention. Automated surveillance is also much faster than manual surveillance and can be programmed to run as often as desired, which opens the way to using it for clinical monitoring, not just retrospective surveillance. Real-time, automated surveillance could help us design interventions to prevent, halt or shorten the course of an infection, something we hope to explore as we continue developing this project."

Westover is an assistant professor of Neurology, and Shenoy is an assistant professor of Medicine at Harvard Medical School. The co-senior author of the Infection Control & Hospital Epidemiology paper is David Hooper, MD, chief of the MGH Infection Control Unit. Additional co-authors are Eric Rosenthal, MD, Yu-Ping Shao, MS, Manohar Ghanta, MS, and Valdery Moura Junior, MS, MBA, of MGH Neurology and CDAC; Erin Ryan, MPH, CCRP, Dolores Suslak, MSN, CIC, and Nancy Swanson, RN, CIC, of the MGH Infection Control Unit; and Siddharth Biswal, MS, of the Georgia Institute of Technology.

Support for the study includes National Institute of Allergy and Infectious Diseases grant K01 AI110524, National Institute of Neurological Disorders and Stroke grant 1K23 NS090900, and grants from the Andrew David Heitman Neuroendovascular Research Fund and The Rappaport Foundation.

Source: Massachusetts General Hospital
