Background: A comprehensive understanding of antimicrobial prescribing practices requires antimicrobial stewardship (AMS) clinicians to assess both the quantity and quality of antimicrobial prescribing. In Australia, two national programs collect and analyse such data in the hospital setting: the National Antimicrobial Utilisation Surveillance Program (NAUSP), a continuous, volume-based surveillance program that monitors antimicrobial usage trends, and the Hospital National Antimicrobial Prescribing Survey (Hospital NAPS), a standardised auditing program that assesses antimicrobial prescribing appropriateness. This study aims to analyse the 2023 NAUSP and Hospital NAPS data to compare the volume and appropriateness of inpatient antimicrobial use in Australian hospitals. Methods: Data were extracted from hospitals that participated in both programs in 2023, including systemically administered antimicrobials for adult patients. NAUSP data were aggregated, and acute inpatient usage rates relative to patient activity were calculated as Defined Daily Doses (DDD) per 1,000 Occupied Bed Days (OBD) for individual antimicrobials.
Hospital NAPS data on appropriateness of prescribing, assessed by local auditors as a point prevalence survey using a standardised assessment matrix, were aggregated and calculated for each antimicrobial. Antimicrobials with high-volume use (NAUSP data), high rates of antimicrobial prescribing (Hospital NAPS data), or classified as medium to high risk for antimicrobial resistance potential according to the World Health Organization AWaRe classification were further investigated. Results: In 2023, 192 acute care hospitals contributed to both NAUSP (representing 12,619,253 OBDs) and Hospital NAPS (representing 21,017 antimicrobial prescriptions). Figure 1 summarizes the aggregate usage rates and appropriateness for antimicrobials of interest. Antimicrobials with the highest usage rates, including amoxicillin-clavulanic acid and cefazolin (92.2 and 78.8 DDD/1,000 OBD respectively), had moderate levels of appropriateness (67.8% and 69.9% respectively). Similar ‘Access’ antimicrobials such as doxycycline, amoxicillin and metronidazole, which tend to be unrestricted in hospital formularies, had moderate levels of appropriateness. Cefalexin had the lowest rate of appropriateness (52.2%). In comparison, ‘Watch’ antimicrobials, including meropenem and vancomycin, which tend to be restricted, had lower usage rates (14.3 and 19.8 DDD/1,000 OBD respectively) but higher rates of appropriateness (83.5% and 82.8% respectively). Conclusion: This analysis highlights the importance of assessing and analyzing antimicrobial quantity and quality concurrently, providing a holistic view of prescribing practices. Furthermore, AMS efforts should include all antimicrobials, regardless of restriction category, as commonly used, unrestricted antimicrobials may have substantial rates of inappropriate prescribing.
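The NAUSP usage-rate calculation described above can be sketched as follows; the function name and the example figures are illustrative only (in practice, WHO-assigned DDD values come from the ATC/DDD index and the OBD denominator from hospital activity data):

```python
# Sketch of a NAUSP-style usage-rate calculation (illustrative only).

def usage_rate(total_ddd: float, occupied_bed_days: float) -> float:
    """Antimicrobial usage as DDD per 1,000 occupied bed days (OBD)."""
    return total_ddd / occupied_bed_days * 1_000

# e.g., a hospital dispensing 4,610 DDDs of cefazolin over 58,500 OBDs
# would report roughly the aggregate cefazolin rate cited above:
print(round(usage_rate(4_610, 58_500), 1))  # 78.8
```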
Background: The 2019 American Urological Association (AUA) best practice statement for Urologic Procedures and Antimicrobial Prophylaxis recommends that single-dose peri-operative antimicrobial prophylaxis (e.g. cefazolin or trimethoprim/sulfamethoxazole) not be continued past closure of the incision for Class I and II genitourinary (GU) procedures, even in the presence of asymptomatic bacteriuria (ASB). Class I and II GU procedures encompass the majority of urologic procedures, which are clean procedures in low-risk patients and clean-contaminated procedures, respectively. The objective of this study is to assess current urologic antimicrobial prophylaxis practices at a large academic medical center. Methods: This retrospective observational study included adults who underwent a urologic procedure from September-October 2024 to assess AUA guideline adherence. Patients with a history of renal transplant, documented concern for symptoms consistent with urinary tract infection prior to the procedure, or receiving antibiotics for another condition were excluded. Both inpatient and outpatient preprocedural, intra-operative and postprocedural antibiotics were evaluated. Pre-procedural urine culture results, attending of record, and type of procedure were correlated with prophylaxis practices using a one-way ANOVA. Results: Of the 80 patients reviewed, 41.3% received only single-dose pre-operative prophylaxis and 57.5% received
Discussion:
Nearly half of patients who underwent urologic procedures had a prophylaxis duration of <24 hours, in concordance with the AUA best practice recommendations. Opportunities exist for education on optimizing agent selection. Length of prophylaxis did not differ between the types of procedures performed. The presence of pre-operative ASB and the ordering attending were found to correlate with an increased duration of prophylaxis. A future institutional practice guideline and order set for urologic procedure antimicrobial prophylaxis may be necessary to optimize agent selection and duration for these GU procedures.
Background: A Quality Improvement (QI) initiative to reduce invasive Staphylococcus aureus (SA) infections in a level IV neonatal intensive care unit (NICU) successfully eliminated methicillin-resistant SA (MRSA) but not methicillin-susceptible SA (MSSA) infections. A combination of SA whole genome sequencing (WGS) and environmental culturing helped to better understand the epidemiology of MSSA colonization and infection in the NICU and drive new infection prevention interventions. Methods: Environmental surveillance of high-touchpoint surfaces for SA was performed using Dey-Engley neutralizing agar. Selected isolates were confirmed as SA using Columbia sheep’s blood agar and Staphaurex testing. Statistical analyses examined correlations between monthly effective cleaning, hand hygiene compliance, and colonization rates. To better understand MSSA spread in the NICU, WGS was performed on a convenience sample of 42 MSSA isolates, sampled one month before and after an invasive MSSA infection. Data extracted from electronic health records were used for retrospective room tracing of colonized patients with related isolates to determine modes of transmission. Results: WGS analysis of MSSA isolates revealed four MSSA strains shared among 29 patients, suggesting within-unit transmission, while 13 patients were colonized with unique MSSA isolates, suggesting external sources. Retrospective room tracing of colonized patients identified three transmission patterns: subsequent room occupant transmission, intra-pod spread, and inter-pod transmission without patient transfer, with evidence that these strains were endemic within the unit for at least 3-12 months. Statistical analyses showed no significant correlation between environmental cleaning or hand hygiene compliance and colonization rates. Conclusions: Persistent MSSA colonization and invasive infections in the NICU result from both within-unit transmission and the introduction of unique isolates.
These findings are being used to inform the development of new interventions, including updated below-the-elbow hand hygiene protocols, revised environmental cleaning plans, nurse-parent communication training, and a virtual reality hand hygiene training program for parents and staff. WGS of pathogenic organisms is a useful tool to drive QI initiatives aimed at reducing hospital-acquired infections.
Introduction: The estimated annual incidence of tuberculosis (TB) in the United States amongst health care personnel (HCP) is low, at 2/100,000 persons. Current TB post-exposure testing practices may result in many HCP being contacted and tested, with very low yield, thus leading to increased health care resource utilization and HCP anxiety. Based on CDC criteria, Mayo Clinic, Rochester is a medium-risk facility. Given that the only transmission we have seen in the last decade is from smear-positive, symptomatic patients, we present an alternative, risk-based approach to defining exposure risk to guide follow-up testing for health care personnel exposed to TB patients. Our goal was to account for the most common exposure follow-up (EFU) scenarios and not the rarest situations, which would require case-by-case discussion. We present a novel risk stratification definition for EFU testing at Mayo Clinic, Rochester, along with 12 months of data pre- and post-initiative. Methods: Prior to July 2023, the case exposure definition for screening was broad, without clarity on duration of exposure or risk for acquisition of the disease. After the new definition was proposed in collaboration with Infection Prevention and Control (IPAC), Occupational Safety and Health, and the Minnesota Department of Health, each case was reviewed to determine appropriateness of HCP exposure testing. Results: From July 2022 through June 2023, a total of 5 EFUs were conducted and 70 health care personnel were exposed (14 per EFU); none developed TB infection. After implementation of the new protocol, from July 2023 through June 2024, there were 11 EFUs and 102 health care personnel were identified as exposed (9 per EFU); none developed TB. Of note, the low number of exposure investigations prior to July 2023 coincides with the universal masking policy related to the COVID-19 pandemic.
Conclusion: Existing public health guidelines do not establish a minimum exposure time warranting follow-up testing for tuberculosis amongst HCP. However, not all cases need extensive case management, as this may lead to excessive costs and resources for testing and conducting EFUs, and to anxiety amongst HCP. With our proposed exposure risk stratification, we aim not only to reduce the resources and time needed to conduct EFUs, but also to decrease the number of incorrectly identified HCP and ensure the correct ones are tested. We will continue to audit and review our data at regular intervals, with continued feedback and discussion with stakeholders, to adopt a more data-driven approach to TB exposure follow-up.
Voluntary assisted dying (VAD) is an end-of-life care option available to eligible Australians living with a terminal condition, though people living with dementia are typically ineligible to choose VAD as part of their end-of-life care. In order to develop equitable research-informed policy and practice, it is crucial to include the perspectives of all key stakeholders, including living experience experts whose voices are currently excluded from Australian VAD research. This study aims to capture the perspectives of people living with dementia by exploring their VAD-related needs and preferences. The study is grounded in a critical and phenomenological conceptual framework that prioritizes inclusive research design. Thirty-six people living with dementia in Australia self-selected to participate in an online survey. It found that the vast majority of participants wanted the option to access VAD themselves, and most wanted provisions for accessing VAD through advance care directives. Through open text responses, the participants expressed many concerns about potential end-of-life suffering and loss of dignity, with their VAD preferences often aligned with their wish to maintain autonomy and human rights. This is the first known Australian study to explore VAD from the perspective of people living with dementia, providing critical insights into their experiences as stakeholders in a highly contested policy and practice environment that is dominated by medico-legal voices. Centring on people living with dementia challenges misconceptions about their capacity to contribute to VAD research, demonstrating their importance as living experience experts and key stakeholders with clear needs and preferences for their end-of-life care.
Background: Patients with cirrhosis often experience coagulopathy, which can result in profound bleeding at intravenous insertion sites. This makes maintaining dry, intact central venous catheter (CVC) dressings particularly challenging. At our 2,247-bed acute care tertiary referral hospital, the Medical-Surgical Intensive Care Unit (MSICU) specializes in hepatic care.
Upon completing a root cause analysis of our elevated central line-associated bloodstream infections (CLABSIs), we found patients with coagulopathy had poor CVC dressing adherence. Our goal was to reduce CLABSIs by improving dressing integrity through innovative strategies aimed at mitigating bleeding and enhancing adhesion. Method: In November 2022, a review was completed of hemostatic and adhesion products to address bleeding at CVC insertion sites and improve dressing adherence to skin. In December 2022, we developed a tiered intervention program using three products tailored to the severity of bleeding at CVC insertion sites. We then selected an adhesive product to bond the perimeter of the dressing to the skin. We disseminated education to the nursing team on product use according to patient CVC dressing condition and manufacturer instructions for use. The tiered intervention program was evaluated by comparing pre-intervention (January 2021 to November 2022) CLABSI rates and standardized infection ratios (SIRs) to post-intervention (December 2022 to October 2024) outcomes. Data were obtained from the National Healthcare Safety Network (NHSN), and analysis was completed using the NHSN statistics calculator. Result: Following implementation of the tiered intervention program, the CLABSI rate decreased from 1.67 to 0.62, a 62.9 percent decrease. The CLABSI SIR decreased from 1.481 to 0.553, a 62.7 percent decrease. The CLABSI SIR reduction was statistically significant (p = 0.0318; one-tailed Z-test). Conclusion: Patients with coagulopathy pose unique challenges in infection prevention. Their abnormal clotting factors increase the risk of bleeding, making it difficult to maintain an intact and occlusive CVC dressing. Hemostatic and adhesive products are effective strategies for maintaining CVC dressing integrity and facilitating CLABSI reduction.
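The pre/post comparison reported above reduces to a simple relative-reduction calculation; the helper below is a sketch using the aggregate figures from the abstract (the helper name is ours, not part of the NHSN statistics calculator):

```python
# Relative reduction from pre- to post-intervention values
# (illustrative sketch; inputs are the figures reported above).

def percent_decrease(pre: float, post: float) -> float:
    """Percent decrease from the pre-intervention to post-intervention value."""
    return (pre - post) / pre * 100

rate_drop = percent_decrease(1.67, 0.62)   # CLABSI rate, units as reported
sir_drop = percent_decrease(1.481, 0.553)  # standardized infection ratio

print(f"rate decrease: {rate_drop:.1f}%")  # 62.9%
print(f"SIR decrease: {sir_drop:.1f}%")    # 62.7%
```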
Introduction: Enterococci are the third most common healthcare-associated pathogen, with 30% of isolates resistant to vancomycin (VRE). Resistance is often conferred by the vanA gene cluster on transposon Tn1546 and is frequently plasmid-borne. The suspected role of person-to-person transmission prompted the recommendation for VRE isolation precautions in 1995. However, quasi-experimental studies in hospitals discontinuing these precautions found no significant increases in VRE infections or bacterial clone spread using short-read whole genome sequencing (WGS). We used long-read WGS to analyze vanA plasmid transmission dynamics after discontinuing isolation precautions for VRE at Stanford University Hospital. Methods: This study was conducted at Stanford University Hospital, an 800-bed quaternary referral and transplant center. Routine contact precautions for VRE were discontinued on October 1, 2021. Blood culture Enterococcus faecalis and E. faecium isolates collected during 2021 (prior to and following discontinuation) were included, along with additional isolates retrieved from January–October 2023. Bacterial whole genome sequencing with long-read nanopore technology was performed, and custom analyses were performed on the assembled genomes. Patient data were collected retrospectively. Results: We retrieved 105 blood culture isolates (36.2%) from the isolation period (January–October 2021) and 185 isolates (63.8%) from the no-isolation period (October–December 2021 and January–October 2023), representing 202 unique patients. Patient characteristics and microbiological findings are shown in Table 1. Only 4.3% (7/171) of E. faecalis and 70.5% (84/119) of E. faecium isolates were vancomycin-resistant. Long-read WGS revealed no clustering between the isolation and no-isolation periods (Figure 1A and 1B); however, a dominant E. faecium ST117 cluster was seen, while E. faecalis showed greater diversity (Figure 1C).
There were only four pairs of putative transmissions. Conclusion: The discontinuation of contact isolation precautions at Stanford Hospital did not result in an increase in genetically related enterococci or genetically related vanA plasmids among patients with enterococcal bacteremia.
Background: The Agency for Healthcare Research and Quality (AHRQ) Safety Program for MRSA Prevention Surgical Services cohort aimed to reduce surgical site infections (SSIs) and prevent methicillin-resistant Staphylococcus aureus (MRSA) infection in teams performing surgeries at high risk for MRSA infection and associated morbidity (cardiac, knee or hip replacement, and spinal fusion), using evidence-based infection prevention interventions and the Comprehensive Unit-based Safety Program (CUSP) framework. We report process and outcome measures associated with program participation. Methods: The Surgical Services Safety Program for MRSA Prevention was implemented from January 2023 to June 2024. The aim was to increase teamwork and collaboration, reinforce safety culture, implement evidence-based infection prevention practices, and decrease SSIs and MRSA. The project team provided 22 live webinars, supporting materials, and other tools to assist surgical teams (Table 1). Teams were also assigned an implementation advisor who provided support through monthly coaching calls.
Teams submitted baseline and endline information on patient safety culture and on infrastructure at the team- and hospital-level, as well as monthly data regarding process measures and SSIs. Teams submitted SSI data from 12 months prior to the start of the program and for 18 months after program implementation. Changes were assessed using pre-post comparisons with Chi-squared test and linear mixed effect models with random intercept. Results: 104 surgical teams (18 cardiac, 19 neurosurgical spinal fusion, 16 orthopedic spinal fusion, 51 knee/hip replacement) from 63 hospitals completed the program. Significant improvements in team-based process measures of surgical team infrastructure (Figure 1) and in teams’ reporting that patients received evidence-based practices (Figure 2) were observed across several areas from baseline to endline, including preoperative decolonization, appropriate antibiotic prophylaxis, and intraoperative infection prevention procedures. While SSI rates did not significantly change, the observed 23% decrease in overall deep or organ space SSI rates approached statistical significance (95% CI -0.46, 0.01) (Table 2 and Table 3). Conclusions: The AHRQ Safety Program for MRSA Prevention supported implementation of evidence-based infection prevention practices to prevent MRSA and SSIs in high-risk surgeries. Participating teams showed improvements in team-based process measures and observed a reduction in deep or organ space SSI rates.
Introduction: Athletes in contact sports have a higher rate of Staphylococcus aureus nasal carriage than the general population, leading to an increased risk of skin and soft tissue infections (SSTIs). These infections can have a significant impact on individual players and teams. This study aimed to assess the effectiveness of adding a nasal decolonization protocol to chlorhexidine gluconate body wash in reducing S. aureus colonization among a Division I (D1) college football team. Methods: A total of 113 athletes were screened for S. aureus nasal carriage at two time points during intensive summer training. At the first screening, athletes were universally prescribed intranasal mupirocin twice daily, applied using clean Q-tips, for five consecutive days. Players were also educated on proper hygiene and adherence to the decolonization protocol. Four weeks later, all players were screened again for S. aureus nasal carriage. Protocol success was defined as either detection of S. aureus in the first round of screening but not in the second (elimination) or a persistently negative result (lack of acquisition). Protocol failure was defined as either the isolation of the same organism in the first and second rounds (lack of elimination) or a positive second-round culture following a negative first-round culture (acquisition). Select S. aureus isolates were submitted for multilocus sequence typing (MLST). Results: At the initial screening, 2 players (1.8%) were colonized with methicillin-resistant Staphylococcus aureus (MRSA), 23 players (20.4%) with methicillin-susceptible Staphylococcus aureus (MSSA), and 4 players (3.5%) with both MRSA and MSSA. After decolonization, follow-up screening identified 0 players with MRSA and 12 players (10.6%) with MSSA, representing a 58.6% reduction in overall S. aureus nasal carriage. Based on study definitions, the decolonization protocol was successful in 101 (89%) players (Figure 1).
MLST was performed on 11 of the 27 initial MSSA-positive isolates and 6 of the 12 second-round MSSA-positive isolates. Based on limited molecular typing data, at least 1 player may have acquired MSSA from another team member within the athletic environment. Discussion: Our findings suggest that implementing a nasal decolonization protocol in a D1 college football team is feasible and effective, resulting in a significant reduction of S. aureus nasal carriage. While initial screening effectively identified carriers, a small subset of athletes acquired MSSA colonization, indicating potential re-exposure or incomplete protocol adherence. Further research should explore decolonization adherence strategies and expand decolonization efforts across contact sports programs to reduce S. aureus-related SSTIs among athletes.
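As a rough arithmetic check on the carriage figures above (counts taken from the abstract; the variable names are ours):

```python
# Back-of-envelope check of the reported screening results.

screened = 113
initial_carriers = 2 + 23 + 4   # MRSA only + MSSA only + both
followup_carriers = 12          # MSSA only; no MRSA detected at follow-up

initial_prev = initial_carriers / screened * 100
followup_prev = followup_carriers / screened * 100
reduction = (initial_carriers - followup_carriers) / initial_carriers * 100

print(f"initial prevalence: {initial_prev:.1f}%")     # 25.7%
print(f"follow-up prevalence: {followup_prev:.1f}%")  # 10.6%
print(f"relative reduction: {reduction:.1f}%")        # 58.6%
```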
Background: Candida species are increasingly causing infections and are considered high-priority fungal pathogens. Despite this, published data describing the clinical importance of Candida growth in the biliary tract are limited to case reports and small cohorts. Our goal is to characterize treatment outcomes of patients who had Candida spp isolated from bile cultures obtained via Endoscopic Retrograde Cholangiopancreatography (ERCP) to determine the value of antifungal use in these cases. Methods: We performed a single-center retrospective cohort study of patients with bile cultures positive for Candida spp collected during ERCPs from January 2010 to December 2023. Patients were identified by cross-matching databases of patients who underwent ERCP and patients with Candida-positive bile cultures. The treatment cohort included patients who received antifungals within seven days of the first ERCP with Candida growth in bile cultures (principal ERCP), compared to a control cohort who did not. Patients with candidemia or deep-seated Candida infection prior to the principal ERCP, or with insufficient chart data, were excluded. The primary outcome was a composite of death and/or development of invasive Candida infection within one year of the principal ERCP date. Kaplan-Meier plots and log-rank tests were used to analyze the primary outcome. Results: A total of 266 patients were included out of 285, with 8 excluded for insufficient records and 11 excluded for invasive candidiasis within the prior year. The included patient population was 60.2% male, 79.7% white, 7.9% black, and 12.4% other/unknown race, with a mean age of 63.6 ± 15.8 years. The most common species of Candida identified were C. albicans (65.9%), C. glabrata (17.4%), and C. tropicalis (7.2%), with 27 patients (9.2%) having 2 isolates in their culture. There were 52 patients (19.5%) who received antifungals (46 fluconazole and 6 micafungin).
At one year, the primary endpoint occurred in 23 of 53 patients (43.3%) in the antifungal group and 93 of 213 patients (43.6%) in the control group. The primary outcome was plotted on a Kaplan-Meier curve. The hazard ratio was 1.14 (95% CI 0.71-1.85; p=0.574), which did not reach statistical significance. Additionally, antifungal initiation had no statistically significant impact on rehospitalization rates (p=0.602), relapse of bacterial cholangitis (p=0.230), or recurrent Candida-positive bile cultures (p=0.441) within one year. Conclusions: This retrospective study of antifungal use in patients with Candida growing from bile cultures following ERCP found no benefit to starting antifungal treatment.
Background: Following the COVID-19 pandemic, various infection prevention strategies were implemented globally. Due to the challenges of asymptomatic transmission during the presymptomatic period, healthcare facilities in Japan adopted a strict universal masking policy for hospital staff and visitors. However, with the widespread availability of vaccines and therapeutic agents, as well as the reduced pathogenicity of the Omicron variant, particularly in younger populations, the universal masking policy was lifted for patients and visitors at our institution. This study evaluates the incidence of hospital-onset COVID-19 cases in a children’s hospital before and after this policy change. Methods: A retrospective analysis was conducted to assess hospital-onset COVID-19 cases among hospitalized patients at Tokyo Metropolitan Children’s Medical Center, Japan, between March 2020 and December 2024. Hospital-onset infection was defined as symptomatic COVID-19 diagnosed on or after the fifth day of hospitalization with a positive SARS-CoV-2 PCR result. Cases with a Ct value of ≥30, indicative of past infection, were excluded. The infection rate was calculated as the number of hospital-onset COVID-19 cases per 10,000 patient-days. During the universal masking policy period, all adults and children capable of wearing masks were required to wear them at all times in public spaces. From July 2023, patients and visitors were no longer required to wear masks unless otherwise indicated, while hospital staff continued universal masking during patient care but were allowed to remove masks for non-patient-care activities. The study compared two periods: the universal masking policy period (March 2020–June 2023) and the post-universal masking policy period (July 2023–December 2024). Results: During the universal masking policy period, there were 8 hospital-onset COVID-19 cases, compared to 10 cases in the post-universal masking policy period. 
The median ages of the patients were 106 months (IQR: 48-132) and 94 months (IQR: 56-152), respectively. The hospital-onset COVID-19 infection rates were 0.19 and 0.52 per 10,000 patient-days, respectively (p=0.07). None of the hospital-onset COVID-19 cases progressed to severe disease. Conclusion: The universal masking policy may have contributed to reducing the incidence of hospital-onset COVID-19 among patients and visitors in our children’s hospital to some extent. Further studies are needed to evaluate the long-term impact of masking policies on infection prevention in pediatric settings.
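The infection rate used above is cases per 10,000 patient-days; a minimal sketch follows, with an illustrative patient-day denominator (the study's actual denominators are not given in the abstract):

```python
# Hospital-onset infection rate per 10,000 patient-days (sketch).

def rate_per_10k(cases: int, patient_days: float) -> float:
    """Cases per 10,000 patient-days."""
    return cases / patient_days * 10_000

# e.g., 8 cases over a hypothetical 400,000 patient-days:
print(f"{rate_per_10k(8, 400_000):.2f} per 10,000 patient-days")  # 0.20
```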
Background: Distinguishing outbreaks from pseudo-outbreaks is essential in healthcare settings. Pseudo-outbreaks are defined by an increase in identified organisms without clinical evidence of infection. Here we report two cases involving Pseudomonas aeruginosa identified in clinical specimens, later determined to represent a pseudo-outbreak. Methods: Patient #1 had vertebral osteomyelitis and epidural abscess; intraoperative and blood cultures grew Streptococcus mitis/oralis. Four days post-surgery, one colony of P. aeruginosa grew from one of three intraoperative aerobic cultures. Patient #2 developed a fracture-related infection of the ankle and underwent debridement and hardware removal; all intraoperative cultures grew methicillin-susceptible Staphylococcus aureus. Four days later, two colonies of P. aeruginosa were detected in one of three intraoperative aerobic cultures. Both these findings were deemed unusual, leading to an outbreak investigation, including chart review and laboratory investigations, to identify a source of contamination. Results: The two cultures were received and set up one day apart by different staff. Subsequently, the WASPLab incubation system’s photographic record of the plates demonstrated no P. aeruginosa within the expected first 48 hours, suggesting contamination during culture collection or processing was unlikely. Further review revealed a heavily inoculated culture of P. aeruginosa was processed by the same laboratory technician on an open bench immediately before handling plates for patients #1 and #2. P. aeruginosa typically grows within 24 hours of incubation, and the colony morphology of the contaminated plates matched those of the heavily inoculated culture. Furthermore, both patients had monomicrobial growth of a likely pathogen causing their infection. Therefore, we concluded that this was cross-contamination, likely via aerosolization or improper plate handling. 
For patient #1, cefepime was discontinued on post-operative day six and switched to ceftriaxone, completed for six weeks, followed by suppressive therapy with amoxicillin, with no recurrence at three months. Patient #2 completed six weeks of cefazolin without anti-pseudomonal coverage, also without recurrence at three months. Conclusions: The pseudo-outbreak likely stemmed from cross-contamination caused by aerosolization or handling heavily inoculated P. aeruginosa cultures near the time and location of the two patients’ plates on an open bench. Awareness of such rare contamination pathways is critical for microbiology labs and clinicians, especially when handling hazardous isolates such as Brucella spp. Careful record keeping and digital storage of serial plate images can narrow the source of contamination, and active surveillance by trained epidemiology personnel is essential to detecting pseudo-outbreaks. Clinical and microbiological correlation should guide treatment to avoid unnecessary antibiotic treatment.
Background: Antimicrobial resistance (AMR) is a growing threat to global health security and economic development. Due to multidrug resistance, bloodstream infections (BSI) are a growing public health concern and a common cause of morbidity and mortality, especially among non-malarial febrile children. Method: This study assessed laboratory blood culture (BC) process outcomes among non-malarial febrile children below five years of age at five AMR surveillance sites in Uganda between 2017 and 2018. Secondary BC testing data were reviewed against established standards. Result: Overall, 959 BC specimens were processed. Of these, 91% were from female patients, neonates, infants, and young children (1-48 months). A total of 37 AMR priority pathogens were identified; Staphylococcus aureus was predominant (54%), followed by Escherichia coli (19%). The diagnostic yield was low (4.9%). Only 6.3% of isolates were identified. Antimicrobial susceptibility testing (AST) was performed on 70% (18/26) of identified AMR priority isolates, and only 40% of these tests adhered to recommended standards. Conclusion: Interventions are needed to improve laboratory BC practices for effective patient management through targeted antimicrobial therapy and AMR surveillance in Uganda. Further research on process documentation, diagnostic yield, and a review of patient outcomes for all hospitalized febrile patients is needed.
Accurate mortality forecasting is crucial for actuarial pricing, reserving, and capital planning, yet the traditional Lee-Carter model struggles with non-linear age and cohort patterns, coherent multi-population forecasting, and quantifying prediction uncertainties. Recent advances in deep learning provide a range of tools that can address these limitations, but actuarial surveys have not kept pace. This paper provides the first concise overview of deep learning in mortality forecasting. We cover six deep network architectures, namely Recurrent Neural Networks, Convolutional Neural Networks, Transformers, Autoencoders, Locally Connected Networks, and Multi-Task Feed-Forward Networks. We discuss how these architectures tackle cohort effects, population coherence, interpretability, and uncertainty in mortality forecasting. Evidence from the literature shows that carefully calibrated deep learning models can consistently outperform Lee-Carter baselines; however, no single architecture resolves every challenge, and open issues remain with data scarcity, interpretability, uncertainty quantification, and keeping pace with advances in deep learning. This review is also intended to provide actuaries with a practical roadmap for adopting deep learning models in mortality forecasting.
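For context, the Lee-Carter baseline referenced above models the log central death rate m(x,t) at age x in year t as a bilinear form, with the period index forecast by a random walk with drift:

```latex
\log m_{x,t} = a_x + b_x\,k_t + \varepsilon_{x,t},
\qquad \sum_x b_x = 1,\quad \sum_t k_t = 0,
\qquad k_t = k_{t-1} + d + e_t
```

Here a_x is the average age profile of log mortality, b_x the age-specific sensitivity to the mortality index k_t, and d the drift term; the deep architectures surveyed relax or replace this rigid bilinear structure, which is the source of the limitations noted above.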
Background: The gold standard for surgical site infection (SSI) surveillance is 100% chart review, a practice neither efficient nor pragmatic for most large hospital systems.
Modern infection surveillance software uses indicators (specific data elements within the patient medical record) to report possible SSIs for confirmation by infection preventionists (IPs). Using all available indicators has been shown to increase identification of SSIs and may approximate the gold standard, but the approach has been called “noisy” for flagging many patients with no SSI and consuming precious surveillance time. Here, we describe our experience evaluating the performance of our surveillance system. Methods: The setting for this study was the 21-hospital Cleveland Clinic health system, which has a uniform SSI surveillance plan and a shared medical record. Our software, Bugsy (Epic Systems Corporation), employs six indicators for possible SSI: hospital readmission, return to surgery, positive microbiology tests, and chief complaint, physician diagnoses (billing codes), or administration of post-prophylaxis antibiotics suggestive of SSI. We extracted all possible SSIs, indicators, and confirmed infections for seven NHSN procedure code categories. We calculated the sensitivity and specificity of each indicator individually using OpenEpi v3.01 and estimated the cost associated with IP time spent on indicators that did not result in a confirmed SSI. Results: From January to December 2023, 12,739 possible SSIs were reported with any indicator out of 26,276 inpatient procedures (48%). The frequency of procedures flagged for review ranged from 25% for CSEC to 78% for HYST. The number of procedures, possible SSIs, and confirmed SSIs with rates are shown in Table 1. The sensitivity and specificity of each indicator are shown in Table 2. Infection preventionists spent an average of 2 minutes reviewing each of 12,027 patient charts (401 hours) in which they determined there was no SSI, costing an estimated $18,602 annually. Conclusion: Nearly 50% of surgical patients were flagged for review for possible SSI by at least one indicator.
Post-prophylaxis antibiotic administration was the most sensitive (97%) but least specific (65%) indicator. Indicator performance varied between procedure types. Readmission to the hospital was more sensitive for procedures with implants, e.g., KPRO and HPRO, than for procedures without, such as COLO and HYST. Evaluating the performance of possible SSI indicators enables IP programs to make data-driven and pragmatic decisions related to SSI case-finding practices. Tuning the indicator criteria within the software build may be necessary for optimization and presents an opportunity for IP time and cost savings.
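The per-indicator statistics and review-cost estimate described above reduce to simple arithmetic. The sketch below uses hypothetical 2×2 counts (the study computed its values in OpenEpi v3.01 from its own data), and the hourly rate shown is back-calculated so the example reproduces the abstract's 401-hour/$18,602 workload figures; it is not a rate the study reports.

```python
# Sketch of the per-indicator statistics and review-cost arithmetic described
# above. The 2x2 counts are hypothetical; the hourly rate is back-calculated
# from the abstract's figures and is not stated in the study.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Proportion of confirmed SSIs that the indicator flagged."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Proportion of non-SSI procedures that the indicator did not flag."""
    return true_neg / (true_neg + false_pos)

def review_cost(charts: int, minutes_per_chart: float, hourly_rate: float):
    """Hours and dollar cost of IP time spent on reviews that found no SSI."""
    hours = charts * minutes_per_chart / 60
    return hours, hours * hourly_rate

# Hypothetical counts for one indicator
print(f"sensitivity {sensitivity(95, 5):.0%}, specificity {specificity(650, 350):.0%}")

# Workload stated in the abstract: 12,027 no-SSI charts at ~2 minutes each
hours, cost = review_cost(12_027, 2, 46.40)
print(f"{hours:.0f} hours, ${cost:,.0f}")  # reproduces ~401 hours, ~$18,602
```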
Background: Blood culture volume is crucial to accurate diagnosis of bloodstream infection. Underfilling blood culture bottles decreases test sensitivity and has been associated with contaminants. Pediatric blood culture volume recommendations are patient weight-based and difficult to audit. Our objectives were to assess healthcare worker knowledge of pediatric blood culture volumes and to measure culture volumes during the pre-implementation phase of a blood culture quality improvement program. Methods: Data were collected May 2024-November 2024. Surveys were administered to healthcare personnel who regularly obtained blood cultures. To estimate the collected blood volume, blood culture bottles in the laboratory were weighed, and the averaged weight of pre-filled bottles was subtracted from each measured weight. Bottles that were <90% of the recommended weight-based volume were classified as underfilled. Blood culture results were compared between bottle characteristics using chi-squared and Wilcoxon rank tests as appropriate. Results were presented to stakeholders to facilitate discussions on blood culture collection. Results: 65 surveys were completed. 59 (90.8%) of respondents reported receiving blood culture training. Of those who received training, 51 (78.5%) reported that they had received weight-based blood culture training. A convenience sample of 1,076 bottles was weighed, representing 38.8% of blood cultures collected. Of those, 816 (75.8%) were underfilled (median percentage of recommended volume -57.8% (interquartile range (IQR) -80.3%, -12.3%)). Only 574 (54.3%) cultures were appropriately inoculated into a pediatric bottle based on patient weight. 83 bottles (7.7%) grew bacteria or fungi, of which 61 (73.5%) were non-commensals. There was no association between underfilling and positivity (p = 0.47). The median percentage of recommended volume did not differ between positive and negative cultures (-60.6% and -57.3%, respectively; p = 0.92).
The median percentage of recommended volume for cultures growing non-commensals and commensals was -55.5% and -69.1%, respectively (p = 0.01). Stakeholder groups reported that barriers to appropriate volume included uncertainty regarding blood culture protocols, technical difficulty obtaining blood, and total blood draw limits. Conclusions: In a large, regional children’s center, most weighed pediatric blood culture bottles were underfilled despite most respondents reporting blood culture volume training. Fill volume was not associated with positivity, which may be due to the large proportion of underfilled bottles in this sample. Cultures growing non-commensals did have a higher median percentage of recommended volume than cultures growing commensals, consistent with prior publications. Future quality improvement programs will focus on dissemination of policy and on addressing systems and technical barriers.
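The weighing method described in the Methods above can be sketched as follows. The tare weight, blood density, and recommended volume in this example are illustrative assumptions, not the study's values; only the <90% underfill cutoff comes from the abstract.

```python
# Sketch of the fill-volume estimate and underfill classification described
# above. Tare weight, blood density, and the recommended volume here are
# illustrative assumptions; the <90% cutoff is the study's definition.

BLOOD_DENSITY_G_PER_ML = 1.05  # approximate density of whole blood

def estimated_volume_ml(filled_bottle_g: float, mean_prefilled_bottle_g: float) -> float:
    """Collected volume inferred from weight gained over an unused bottle."""
    return (filled_bottle_g - mean_prefilled_bottle_g) / BLOOD_DENSITY_G_PER_ML

def is_underfilled(volume_ml: float, recommended_ml: float, threshold_pct: float = 90) -> bool:
    """Bottles below 90% of the weight-based recommendation count as underfilled."""
    return 100 * volume_ml / recommended_ml < threshold_pct

# Example: a bottle weighing 93.2 g filled vs a 92.0 g mean pre-filled weight,
# judged against a hypothetical 2 mL weight-based recommendation
vol = estimated_volume_ml(93.2, 92.0)
print(f"{vol:.2f} mL collected, underfilled: {is_underfilled(vol, 2.0)}")
```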
Background: Antimicrobial resistance (AMR) poses a critical threat to global health, with healthcare-associated infections (HAIs) such as Clostridioides difficile and multidrug-resistant organisms (MDROs) exacerbated by antibiotic misuse. Up to 50% of inpatients receive antibiotics during their hospital stay, and nearly one-third of this use is inappropriate. The Standardized Antimicrobial Administration Ratio (SAAR) at the Michael E. DeBakey VA Medical Center (MEDVAMC) exceeded 1 from 2023–2024, signaling higher-than-expected antibiotic use. Nurses, as pivotal frontline healthcare providers, are often underutilized in antimicrobial stewardship program (ASP) efforts due to a lack of formal ASP education. Addressing this gap aligns with The Joint Commission standards, CDC guidelines, and ANA recommendations for improving ASP engagement and reducing HAIs. Methods: This quality improvement project used the Plan-Do-Study-Act (PDSA) framework to develop, implement, and refine an educational intervention aimed at enhancing RN knowledge of and engagement in ASP. Baseline data, including a survey assessing RN ASP knowledge, informed the creation of a tailored training program. The program emphasized the 5D approach (Diagnosis, Drug, Dose, Duration, De-escalation), the role of nurses in ASP, and interdisciplinary collaboration. The initiative was endorsed by leadership and delivered through interactive workshops and case-based learning. Post-intervention surveys and infection rate analyses were conducted to evaluate outcomes. Results: The intervention led to a 92% increase in RN knowledge, with a mean post-intervention score of 92 out of 100 among 67 participating nurses, compared with a mean pre-intervention score of 48 out of 100. Improved RN competency in ASP facilitated stronger interdisciplinary communication and adherence to stewardship protocols, such as performing antibiotic time-outs.
Feedback from participants highlighted increased confidence in ASP roles and improved patient safety practices. Examples of improved patient safety practices included more consistent documentation of allergy checks, antibiotic indications, and treatment plans within the electronic health record. Post-intervention, nurses felt more comfortable providing patient education on the importance of completing antibiotics, recognizing side effects, and infection prevention. Conclusions: Empowering nurses through targeted ASP education not only bridges critical knowledge gaps but also fosters a culture of safety and accountability in antibiotic use. Sustaining these outcomes requires integrating ASP education into routine RN training, continuous monitoring of infection rates, and leveraging interdisciplinary collaboration to maintain compliance with evidence-based stewardship practices. These findings underscore the transformative potential of nurse-led initiatives in combating AMR and improving healthcare outcomes.
Background: The risk of bacterial transmission through gastrointestinal endoscopes remains a critical concern in healthcare-associated infections, driven by the complex design of endoscopes and potential lapses in reprocessing protocols. Contaminated endoscopes can serve as vectors for multidrug-resistant organisms, posing significant threats to patient safety. Current U.S. guidelines for endoscope reprocessing and infection control do not mandate routine surveillance sampling; however, select facilities have successfully adopted routine surveillance cultures to monitor reprocessing efficacy. The Centers for Disease Control and Prevention (CDC) has published protocols to support facilities choosing to implement such practices, emphasizing the importance of identifying persistent transmission risks. Methods: This retrospective study was conducted at the University of Kentucky Healthcare (UKHC), a 1,086-bed academic medical center, from January 1, 2019, to June 30, 2024. UKHC implemented a surveillance program in July 2016 targeting endoscopic retrograde cholangiopancreatography (ERCP), esophagogastroduodenoscopy (EGD), endoscopic ultrasound (EUS), and colonoscopy endoscopes. Weekly cultures were performed on ERCP scopes, while targeted cultures were conducted on all four scope types used for patients colonized with carbapenem-resistant organisms (CRO). Scopes were reprocessed following manufacturer instructions for use (IFU), post-reprocessing cultures were performed, and recovered pathogens were categorized as concerning or non-concerning based on CDC protocols. Manual chart reviews identified CRO-colonized patients, and match rates were calculated by comparing endoscope culture results with patient isolates. Results: A total of 163 ERCP scopes were cultured, comprising 94 from weekly surveillance and 69 from CRO-targeted surveillance (Figure 1). Weekly surveillance yielded a 9.6% (9/94) positivity rate, while CRO-targeted surveillance showed an 8.7% (6/69) positivity rate.
Among the six positive CRO-targeted ERCP samples, no matching CRO was identified. Among 189 EGD scopes subjected to CRO-targeted surveillance, the positivity rate was 25.9% (49/189), with a 4.1% (2/49) match rate to patient isolates. For 27 EUS scopes, the positivity rate was 11.1% (3/27), with a 33.3% (1/3) match rate. Among 59 colonoscopy scopes, the positivity rate was 5.1% (3/59), with no matches to patient isolates. Conclusions: The UKHC surveillance program highlights ongoing risks of bacterial transmission despite adherence to manufacturer-recommended reprocessing protocols. Scopes yielding concerning organisms underwent additional reprocessing to mitigate patient risk. All scopes with positive organisms after the second reprocessing were returned to the manufacturer. Surveillance programs provide valuable insights into disinfection efficacy, helping to identify gaps and guide infection prevention strategies. Further refinement and standardization of surveillance protocols are essential to mitigate transmission risks associated with gastrointestinal endoscopy and improve patient safety on a broader scale.
Background: Variability in outpatient parenteral antimicrobial therapy (OPAT) management and challenges to providing recommended OPAT care can compromise patient safety and care quality. Little is known about how OPAT is currently delivered by healthcare systems across the United States (US), including within the Veterans Health Administration (VHA). We sought to understand and compare OPAT delivery at selected Veterans Affairs medical centers. Method: Using a qualitative methodology, we conducted semi-structured interviews with key informants involved in OPAT delivery at 6 VHA medical centers of different complexity levels in the Midwestern US. Facility complexity is determined by patient volume and patient complexity, along with the amount of teaching and research conducted at the facility. Interviews occurred between February and December 2024 with healthcare personnel (n=30), including primary care and infectious diseases physicians, pharmacists, nursing staff, care coordinators, and vascular access providers. Data collection focused on understanding OPAT processes within the key domains of decision-making, patient education, care coordination, and post-discharge management. We used rapid analysis and a summary matrix to compare practices across sites within each domain. Result: Our findings highlight significant variability among VHA medical centers that provide OPAT to Veteran patients. Three of the 6 medical centers had dedicated OPAT programs, as evidenced by a multidisciplinary team with clearly delineated roles and responsibilities and by processes that may help mitigate adverse outcomes and improve communication between providers at all OPAT care points. These processes map to the key elements outlined in the Infectious Diseases Society of America (IDSA) practice guidelines for OPAT programs, including determination of appropriate therapy, patient education, lab monitoring, and discontinuation of treatment.
(Figure 1) Conversely, at the three VHA sites without evidence of a multidisciplinary OPAT team or program, most participants described poor communication and coordination, lack of support, and uncertainty among providers about who is responsible for OPAT care; this confusion extended to follow-up and discontinuation of treatment. OPAT key elements were lacking or poorly defined. A process map helps visualize the contrasts in care between sites with and without defined OPAT programs (Figure 1). Conclusion: Despite the VHA's centralized healthcare system, its medical centers demonstrate highly variable processes with respect to OPAT care. In the absence of a clear OPAT policy or program, uncertainty among providers about roles and responsibilities may be greater. The presence of a dedicated multidisciplinary OPAT team may help improve communication and care coordination, thereby minimizing quality and safety concerns.
Background: Healthcare-associated infections (HAIs) are an important area of concern, as they increase length of hospital stay, increase hospital costs, and carry high morbidity and mortality. For instance, central line-associated bloodstream infections (CLABSIs) increase length of stay by approximately 13.4 days and hospital costs by $43,975. Studies also suggest a 1.5- to 2.5-fold increase in mortality among patients who develop CLABSIs. In 2013, the REDUCE MRSA trial compared universal MRSA decolonization with targeted MRSA decolonization in the ICU and found universal decolonization superior in reducing positive MRSA cultures and all-cause bloodstream infections. We aim to decrease CLABSIs at our institution by adopting the REDUCE MRSA trial protocol. Methods and Outcomes: All patients admitted to the medical/surgical ICU and cardiac ICU at St Francis Hospital starting in December of 2023 received daily intranasal mupirocin and chlorhexidine bathing regardless of their MRSA status. The primary outcome assessed was the CLABSI rate per month. The secondary outcomes were the standardized infection ratio and CLABSIs per central line day. We compared data from 2020-2023 with data after initiation of the protocol in 2024. We used an unpaired t-test to assess the CLABSI rate per month and a negative binomial regression model to calculate the standardized infection ratio against the NHSN 2015 national baseline. Results and Discussion: We had a total of 6 CLABSIs in the ICU this year after initiating universal MRSA decolonization. The number of CLABSIs per month decreased from 0.65 during 2020-2023 to 0.50 in 2024. These results, while not statistically significant, are limited by the small sample size, since the protocol was initiated only this year.
One interesting finding was that 5 of the 6 CLABSIs occurred during January through March, raising the question of whether the new protocol required time for nursing education and compliance to improve. Conclusions: Our results suggest that universal MRSA decolonization in the ICU may decrease the number of CLABSIs. We will continue to collect data in the coming years to assess for statistical significance. We recommend further research to assess the potential benefits of universal MRSA decolonization in other areas of the hospital with high MRSA infection rates, such as step-down units.
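The outcome measures named above are simple ratios. In the sketch below, the 6 CLABSIs and the 0.50-per-month 2024 rate come from the abstract, while the central line days and NHSN-predicted infection count are hypothetical placeholders (the study's actual SIR came from a negative binomial model against the NHSN 2015 baseline).

```python
# Sketch of the CLABSI outcome measures described above. The 6 CLABSIs and
# 0.50-per-month 2024 rate come from the abstract; central line days and the
# NHSN-predicted infection count are hypothetical placeholders.

def clabsi_rate_per_1000_line_days(infections: int, central_line_days: int) -> float:
    """CLABSIs per 1,000 central line days."""
    return 1000 * infections / central_line_days

def standardized_infection_ratio(observed: float, predicted: float) -> float:
    """SIR = observed infections / infections predicted by the NHSN baseline."""
    return observed / predicted

# From the abstract: 6 CLABSIs over the 12 months of 2024
assert 6 / 12 == 0.50  # the reported 0.50 CLABSIs per month

# Hypothetical denominator of 4,800 central line days in 2024
print(f"{clabsi_rate_per_1000_line_days(6, 4_800):.2f} per 1,000 line days")

# Hypothetical NHSN-predicted count of 8.0 infections; an SIR below 1 means
# fewer infections were observed than expected for the patient mix
print(f"SIR {standardized_infection_ratio(6, 8.0):.2f}")
```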