Whole-genome sequencing (WGS) is increasingly used to characterize hospital outbreaks of carbapenemase-producing Enterobacterales (CPE). However, access to WGS is variable and testing is often centralized, leading to delays in reporting of results.
Objective:
We describe the utility of a local sequencing service to promptly respond to facility needs over an 8-year period.
Methods:
The study was conducted at Royal Prince Alfred Hospital in Sydney, Australia. All CPE isolated from patient (screening and clinical) and environmental samples from 2015 onward underwent prospective WGS. Results were notified to the infection control unit in real time. When outbreaks were identified, WGS reports were also provided to senior clinicians and the hospital executive administration. Enhanced infection control interventions were refined based on the genomic data.
Results:
In total, 141 CPE isolates were detected from 123 patients and 5 environmental samples. We identified 9 outbreaks, 4 of which occurred in high-risk wards (intensive care unit and/or solid-organ transplant ward). The largest outbreak involved Enterobacterales containing an NDM gene. WGS detected unexpected links among patients, which led to further investigation of epidemiological data that uncovered the outpatient setting and contaminated equipment as reservoirs for ongoing transmission. Targeted interventions as part of outbreak management halted further transmission.
Conclusions:
WGS has transitioned from an emerging technology to an integral part of local CPE control strategies. Our results show the value of embedding this technology in routine surveillance, with timely reports generated in clinically relevant timeframes to inform and optimize local control measures for greatest impact.
To evaluate the utility of selective reactive whole-genome sequencing (WGS) in aiding healthcare-associated cluster investigations.
Design:
Mixed-methods quality-improvement study.
Setting:
The study was conducted across 8 acute-care facilities in an integrated health system.
Methods:
We analyzed healthcare-associated coronavirus disease 2019 (COVID-19) clusters between May 2020 and July 2022 for which facility infection prevention and control (IPC) teams selectively requested reactive WGS to aid the epidemiologic investigation. WGS was performed with real-time results provided to IPC teams, including genetic relatedness of sequenced isolates. We conducted structured interviews with IPC teams on the informativeness of WGS for transmission investigation and prevention.
Results:
In total, 8 IPC teams requested WGS to aid the investigation of 17 COVID-19 clusters comprising 226 cases and 116 (51%) sequenced isolates. Of these, 16 (94%) clusters had at least 1 WGS-defined transmission event. IPC teams hypothesized transmission pathways in 14 (82%) of 17 clusters and used data visualizations to characterize these pathways in 11 clusters (65%). The teams reported that in 15 clusters (88%), WGS identified a transmission pathway; the WGS-defined pathway was not one that was predicted by epidemiologic investigation in 7 clusters (41%). WGS changed the understanding of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) transmission in 8 clusters (47%) and altered infection prevention interventions in 8 clusters (47%).
Conclusions:
Selectively utilizing reactive WGS helped identify cryptic SARS-CoV-2 transmission pathways and frequently changed the understanding and response to SARS-CoV-2 outbreaks. Until WGS is widely adopted, a selective reactive WGS approach may be highly impactful in response to healthcare-associated cluster investigations.
We investigated genetic, epidemiologic, and environmental factors contributing to positive Staphylococcus epidermidis joint cultures.
Design:
Retrospective cohort study with whole-genome sequencing (WGS).
Patients:
We identified S. epidermidis isolates from hip or knee cultures in patients with 1 or more prior corresponding intra-articular procedure at our hospital.
Methods:
WGS and single-nucleotide polymorphism–based clonality analyses were performed, including species identification, in silico multilocus sequence typing (MLST), phylogenomic analysis, and genotypic assessment of the prevalence of specific antibiotic resistance and virulence genes. Epidemiologic review was performed to compare cluster and noncluster cases.
Results:
In total, 60 phenotypically distinct S. epidermidis isolates were identified. After removal of duplicates and impure samples, 48 isolates were used for the phylogenomic analysis, and 45 (93.7%) isolates were included in the clonality analysis. Notably, 5 S. epidermidis strains (10.4%) showed phenotypic susceptibility to oxacillin yet harbored mecA, and 3 (6.2%) strains showed phenotypic resistance despite not having mecA. The smr gene was found in all isolates, and mupA was not detected. We also identified 6 clonal clusters from the clonality analysis, which accounted for 14 (31.1%) of the 45 S. epidermidis isolates. Our epidemiologic investigation revealed ties to common aspirations or operative procedures, although no specific common source was identified.
Conclusions:
Most S. epidermidis isolates from clinical joint samples are diverse in origin, but we identified an important subset of 31.1% that belonged to subclinical healthcare-associated clusters. Clusters appeared to resolve spontaneously over time, suggesting the benefit of routine hospital infection control and disinfection practices.
Studies evaluating the incidence, source, and preventability of hospital-onset bacteremia and fungemia (HOB), defined as any positive blood culture obtained after 3 calendar days of hospital admission, are lacking in low- and middle-income countries (LMICs).
Design, setting, and participants:
All consecutive blood cultures performed for 6 months during 2020–2021 in 2 hospitals in India were reviewed to assess HOB and National Healthcare Safety Network (NHSN) reportable central-line–associated bloodstream infection (CLABSI) events. Medical records of a convenience sample of 300 consecutive HOB events were retrospectively reviewed to determine source and preventability. Univariate and multivariable logistic regression analyses were performed to identify factors associated with HOB preventability.
Results:
Among 6,733 blood cultures obtained from 3,558 hospitalized patients, there were 409 and 59 unique HOB and NHSN-reportable CLABSI events, respectively. CLABSIs accounted for 59 (14%) of 409 HOB events. There was a moderate but non-significant correlation (r = 0.51; P = .070) between HOB and CLABSI rates. Among 300 reviewed HOB cases, CLABSIs were identified as source in only 38 (13%). Although 157 (52%) of all 300 HOB cases were potentially preventable, CLABSIs accounted for only 22 (14%) of these 157 preventable HOB events. In multivariable analysis, neutropenia and sepsis as an indication for blood culture were associated with decreased odds of HOB preventability, whereas hospital stay ≥7 days and presence of a urinary catheter were associated with increased likelihood of preventability.
Conclusions:
HOB may have utility as a healthcare-associated infection metric in LMIC settings because it captures preventable bloodstream infections beyond NHSN-reportable CLABSIs.
National validation of claims-based surveillance for surgical-site infections (SSIs) following colon surgery and abdominal hysterectomy.
Design:
Retrospective cohort study.
Setting:
US hospitals selected for data validation by Centers for Medicare & Medicaid Services (CMS).
Participants:
The study included 550 hospitals performing colon surgery and 458 hospitals performing abdominal hysterectomy in federal fiscal year 2013.
Methods:
We requested 1,200 medical records from hospitals selected for validation as part of the CMS Hospital Inpatient Quality Reporting program. For colon surgery, we sampled 60% with a billing code suggestive of SSI during their index admission and/or readmission within 30 days and 40% who were readmitted without one of these codes. For abdominal hysterectomy, we included all patients with an SSI code during their index admission, all patients readmitted within 30 days, and a sample of those with a prolonged surgical admission (length of stay > 7 days). We calculated sensitivity and positive predictive value for the different groups.
Results:
We identified 142 colon-surgery SSIs (46 superficial SSIs and 96 deep and organ-space SSIs) and 127 abdominal-hysterectomy SSIs (58 superficial SSIs and 69 deep and organ-space SSIs). Extrapolating to the full CMS data validation cohort, we estimated an SSI rate of 8.3% for colon surgery and 3.0% for abdominal hysterectomy. Our colon-surgery surveillance codes identified 93% of SSIs, with 1 SSI identified for every 2.6 patients reviewed. Our abdominal-hysterectomy surveillance codes identified 73% of SSIs, with 1 SSI identified for every 1.6 patients reviewed.
Conclusions:
Using claims to target record review for SSI validation performed well in a national sample.
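The sensitivity and positive predictive value computed in the methods above reduce to simple count arithmetic. A minimal sketch follows; the counts passed in are illustrative stand-ins, not figures from the study, though the reported "1 SSI identified for every 2.6 patients reviewed" corresponds to a PPV of roughly 1/2.6:

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of confirmed SSIs that the surveillance codes flagged."""
    return tp / (tp + fn)

def ppv(tp: int, fp: int) -> float:
    """Fraction of code-flagged records that were confirmed SSIs on review."""
    return tp / (tp + fp)

# Hypothetical counts for illustration only:
# 93 confirmed SSIs flagged, 7 confirmed SSIs missed,
# 149 flagged records with no SSI on chart review.
print(round(sensitivity(93, 7), 2))   # 0.93
print(round(ppv(93, 149), 2))         # 0.38, i.e. about 1 SSI per 2.6 reviews
```

A PPV near 0.38 is what "1 SSI per 2.6 patients reviewed" implies, since 1/2.6 ≈ 0.385.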
To systematically review the methodology, performance, and generalizability of diagnostic models for predicting the risk of healthcare-facility–onset (HO) Clostridioides difficile infection (CDI) in adult hospital inpatients (aged ≥18 years).
Background:
CDI is the most common cause of healthcare-associated diarrhea. Prediction models that identify inpatients at risk of HO-CDI have been published; however, the quality and utility of these models remain uncertain.
Methods:
Two independent reviewers evaluated articles describing the development and/or validation of multivariable HO-CDI diagnostic models in an inpatient setting. All publication dates, languages, and study designs were considered. Model details (eg, sample size and source, outcome, and performance) were extracted from the selected studies based on the CHARMS checklist. The risk of bias was further assessed using PROBAST.
Results:
Of the 3,030 records evaluated, 11 were eligible for final analysis, which described 12 diagnostic models. Most studies clearly identified the predictors and outcomes but did not report how missing data were handled. The most frequent predictors across all models were advanced age, receipt of high-risk antibiotics, history of hospitalization, and history of CDI. All studies reported the area under the receiver operating characteristic curve (AUROC) as a measure of discriminatory ability. However, only 3 studies reported the model calibration results, and only 2 studies were externally validated. All of the studies had a high risk of bias.
Conclusion:
The studies varied in their ability to predict the risk of HO-CDI. Future models will benefit from the validation on a prospective external cohort to maximize external validity.
Incidence and risk factors for recurrent Clostridioides difficile infection (rCDI) are well established in adults, though data are lacking in pediatrics. We aimed to determine incidence of and risk factors for rCDI in pediatrics.
Methods:
This retrospective cohort study included pediatric patients with laboratory-confirmed CDI at 3 tertiary-care hospitals in Canada between April 1, 2012, and March 31, 2017. rCDI was defined as an episode of CDI occurring 8 weeks or less from the diagnostic test date of the primary episode. We used logistic regression to determine and quantify risk factors significantly associated with rCDI.
Results:
In total, 286 patients were included in this study. The incidence proportion for rCDI was 12.9%. Among hospitalized patients, the incidence rate was estimated at 2.6 cases of rCDI per 1,000 hospital days at risk (95% confidence interval [CI], 1.7–3.9). Immunocompromised patients had higher incidence of rCDI (17.5%; P = .03) and higher odds of developing rCDI independently of antibiotic treatment given for the primary episode (odds ratio [OR], 2.31; 95% CI, 1.12–5.09). Treatment with vancomycin monotherapy did not show statistically significant protection from rCDI, independently of immunocompromised status (OR, 0.33; 95% CI, 0.05–1.15).
Conclusions:
The identification of increased risk of rCDI in immunocompromised pediatric patients warrants further research into alternative therapies, prophylaxis, and prevention strategies to prevent recurrent disease burden within these groups. Treatment of the initial episode with vancomycin did not show statistically significant protection from rCDI.
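The two incidence measures reported above (an incidence proportion per cohort and an incidence rate per 1,000 hospital days at risk) can be sketched as plain arithmetic. The event count of 37 and the denominator of 10,000 patient days below are assumptions chosen only to reproduce the reported magnitudes; they are not taken from the study:

```python
def incidence_proportion(events: int, patients: int) -> float:
    """Percentage of the cohort experiencing the event."""
    return 100 * events / patients

def incidence_rate_per_1000(cases: int, days_at_risk: int) -> float:
    """Cases per 1,000 patient days at risk."""
    return 1000 * cases / days_at_risk

# 37 recurrences among 286 patients is a hypothetical count consistent
# with the reported 12.9% incidence proportion.
print(round(incidence_proportion(37, 286), 1))        # 12.9
# A hypothetical 26 cases over 10,000 days at risk gives the reported 2.6.
print(round(incidence_rate_per_1000(26, 10_000), 1))  # 2.6
```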
Despite the increasing rates of carbapenem-resistant Acinetobacter baumannii (CRAB) carriage among hospitalized patients in endemic settings, the role of active surveillance cultures and cohorting is still debated. We sought to determine the long-term effect of a multifaceted infection-control intervention on the incidence of CRAB in an endemic setting.
Methods:
A prospective, quasi-experimental study was performed at a 670-bed, acute-care hospital. The study consisted of 4 phases. In phase I, basic infection control measures were used. In phase II, CRAB carriers were cohorted in a single ward with dedicated nursing and enhanced environmental cleaning. In phase III, large-scale screening in high-risk units was implemented. Phase IV comprised a 15-month follow-up period.
Results:
During the baseline period, the mean incidence rate (IDR) of CRAB was 44 per 100,000 patient days (95% CI, 37.7–54.1). No significant decrease was observed during phase II (IDR, 40.8 per 100,000 patient days; 95% CI, 30.0–56.7; P = .97). During phase III, despite high compliance with control measures, ongoing transmission in several wards was observed and the mean IDR was 53.9 per 100,000 patient days (95% CI, 40.5–72.2; P = .55). In phase IV, following the implementation of large-scale screening, a significant decrease in the mean IDR was observed (25.8 per 100,000 patient days; 95% CI, 19.9–33.5; P = .03). An overall reduction of CRAB rate was observed between phase I and phase IV (rate ratio, 0.6; 95% CI, 0.4–0.9; P < .001).
Conclusions:
The comprehensive intervention that included intensified control measures with routine active screening cultures was effective in reducing the incidence of CRAB in an endemic hospital setting.
Vancomycin is often initiated in hospitalized patients; however, it may be unnecessary or continued for longer durations than needed. Oversight of all vancomycin orders may not be feasible with widespread prescribing, and strategies to enlist other clinicians to serve as stewards of vancomycin use are needed. We implemented 2 sequential interventions: a protocol in which the pharmacist orders an MRSA nasal swab, followed by a protocol requiring pharmacist approval to continue vancomycin for >72 hours.
Methods:
In this single-center, retrospective, quasi-experimental study, we evaluated vancomycin use after implementation of a pharmacy-driven MRSA nasal-swab ordering protocol and a vancomycin 72-hour restriction protocol. The primary outcome was the change in the standardized antibiotic administration ratio (SAAR) for antibacterial agents for resistant gram-positive infections. We also evaluated the impact on antibiotic utilization.
Results:
Following the MRSA swab protocol, the SAAR decreased from 1.26 to 1.13 (P < .001; 95% confidence interval [CI], 1.16–1.25). After the 72-hour approval process, the SAAR was 0.96 (P < .001; 95% CI, 1.0–1.12). Vancomycin utilization decreased from 138.9 to 125.3 days of therapy per 1,000 patient days following the MRSA swab protocol (P < .001) and to 112.7 (P < .001) following the 72-hour approval protocol. Interrupted time-series analysis identified a similar rate of decline in utilization following the 2 interventions (−0.3 and −0.5; P = .16). Both interventions combined resulted in a significant reduction (−1.5; P < .001).
Conclusion:
Implementation of a pharmacist-driven MRSA nasal-swab ordering protocol, followed by a 72-hour approval protocol, was associated with a significant reduction in the SAAR for antibiotics used in the treatment of resistant gram-positive infections and a reduction in vancomycin utilization. Leveraging the oversight of primary service clinical pharmacists through these protocols proved to be an effective strategy.
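The primary outcome above, the SAAR, is an observed-to-predicted ratio: observed antimicrobial days of therapy divided by the days of therapy predicted by the NHSN risk-adjustment model. A minimal sketch of that ratio follows; the days-of-therapy totals are hypothetical values chosen to reproduce the pre-intervention SAAR of 1.26, not data from the study:

```python
def saar(observed_dot: float, predicted_dot: float) -> float:
    """Standardized antimicrobial administration ratio:
    observed days of therapy / NHSN-model-predicted days of therapy.
    Values above 1.0 indicate more use than predicted."""
    return observed_dot / predicted_dot

# Hypothetical totals: 1,260 observed vs 1,000 predicted days of therapy.
print(round(saar(1260, 1000), 2))  # 1.26
```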
We evaluated the adequacy of microbiological tests in patients undergoing withholding or withdrawal of life-sustaining treatment (WLST) at the end stage of life.
Setting:
The study was conducted at 2 tertiary-care referral hospitals in Daegu, Republic of Korea.
Design:
Retrospective cross-sectional study.
Methods:
Demographic findings, clinical and epidemiological characteristics, statistics of microbiological tests, and microbial species isolated from patients within 2 weeks before death were collected in 2 tertiary-care referral hospitals from January to December 2018. We also reviewed the antimicrobial treatment that was given within 3 days of microbiological testing in patients on WLST.
Results:
Of the 1,187 hospitalized patients included, 905 patients (76.2%) had WLST. The number of tests per 1,000 patient days was higher after WLST than before WLST (242.0 vs 202.4). Among the categories of microbiological tests, blood cultures were performed most frequently, and their numbers per 1,000 patient days before and after WLST were 95.9 and 99.0, respectively. The positive rates of blood culture before and after WLST were 17.2% and 18.0%, respectively. Candida spp. were the most common microbiological species in sputum (17.4%) and urine (48.2%), and Acinetobacter spp. were the most common in blood culture (17.3%). After WLST determination, 70.5% of microbiological tests did not lead to a change in antibiotic use.
Conclusions:
Many unnecessary microbiological tests are being performed in patients with WLST within 2 weeks of death. Microbiological testing should be performed carefully and in accordance with the patient’s treatment goals.
To measure the impact of an automated hand hygiene monitoring system (AHHMS) and an intervention program of complementary strategies on hand hygiene (HH) performance in both acute-care and long-term care (LTC) units.
Setting:
Single Veterans Affairs Medical Center (VAMC), with 2 acute-care units and 6 LTC units.
Methods:
An AHHMS that provides group HH performance rates was implemented on 8 units at a VAMC from March 2021 through April 2022. After a 4-week baseline period and 2.5-week washout period, the 52-week intervention period included multiple evidence-based components designed to improve HH compliance. Unit HH performance rates were expressed as the number of dispenses (events) divided by the number of patient room entries and exits (opportunities) × 100. Statistical analysis was performed with a Poisson general additive mixed model.
Results:
During the 4-week baseline period, the median HH performance rate was 18.6 (95% CI, 16.5–21.0) for all 8 units. During the intervention period, the median HH rate increased to 21.6 (95% CI, 19.1–24.4; P < .0001), and during the last 4 weeks of the intervention period (exactly 1 year after baseline), the 8 units exhibited a median HH rate of 25.1 (95% CI, 22.2–28.4; P < .0001). The median HH rate increased from 17.5 to 20.0 (P < .0001) in LTC units and from 22.9 to 27.2 (P < .0001) in acute-care units.
Conclusions:
The intervention was associated with increased HH performance rates for all units. The performance of acute-care units was consistently higher than that of LTC units, which have more visitors and more mobile veterans.
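The unit-level performance rate defined in the methods (dispensing events divided by room entries/exits, × 100) is straightforward to compute. A minimal sketch follows; the weekly counts are hypothetical values chosen only to land near the reported baseline median of 18.6, not data from the study:

```python
def hh_rate(dispenses: int, opportunities: int) -> float:
    """Group hand hygiene performance: dispensing events per 100
    patient-room entries and exits (opportunities)."""
    return 100 * dispenses / opportunities

# Hypothetical week on one unit: 930 dispenses over 5,000 entries/exits.
print(round(hh_rate(930, 5000), 1))  # 18.6
```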
We investigated gender differences in psychosocial determinants that affect hand hygiene (HH) performance among physicians.
Design:
The survey included a structured questionnaire with 7 parts: self-assessment of HH execution rate; knowledge, attitude, and behavior regarding HH; internal and emotional motivation for better HH; barriers to HH; need for external reminders; preference for alcohol gel; and embarrassment due to supervision.
Setting:
The study was conducted across 4 academic referral hospitals in Korea.
Participants:
Physicians who worked at these hospitals were surveyed.
Methods:
The survey questionnaire was sent to 994 physicians of the hospitals in July 2018 via email or paper. Differences in psychosocial determinants of HH among physicians were analyzed by gender using an independent t test or the Fisher exact test.
Results:
Of the 994 physicians, 201 (20.2%) responded to the survey. Among them, 129 (63.5%) were men. Male physicians identified 4 barriers as significant: time wasted on HH (P = .034); HH is not a habit (P = .004); often forgetting about HH situations (P = .002); and no disadvantage when I do not perform HH (P = .005). Female physicians identified pain and dryness of the hands as a significant obstacle (P = .010), and they had a higher tendency to feel uncomfortable when a fellow employee performed inadequate HH (P = .098). Among the respondents, 26.6% identified diversifying the types of hand sanitizers as their first choice for overcoming barriers to improving HH, followed by providing reminders (15.6%) and soap and paper towels in each hospital room (13.0%).
Conclusion:
A significant difference in the barriers to HH existed between male and female physicians. Promoting HH activities could help increase HH compliance.
Patients diagnosed with coronavirus disease 2019 (COVID-19) aerosolize severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) via respiratory efforts, exposing and possibly infecting healthcare personnel (HCP). To prevent transmission of SARS-CoV-2, HCP have been required to wear personal protective equipment (PPE) during patient care. Early in the COVID-19 pandemic, face shields, which also provide eye protection, were used as an approach to control HCP exposure to SARS-CoV-2.
Methods:
An MS2 bacteriophage was used as a surrogate for SARS-CoV-2 and was aerosolized using a coughing machine. A simulated HCP wearing a disposable plastic face shield was placed 0.41 m (16 inches) away from the coughing machine. The aerosolized virus was sampled using SKC BioSamplers on the inside (near the mouth of the simulated HCP) and the outside of the face shield. The aerosolized virus collected by the SKC BioSamplers was analyzed using a viability assay. Optical particle counters (OPCs) were placed next to the biosamplers to measure the particle concentration.
Results:
There was a statistically significant reduction (P < .0006) in viable virus concentration on the inside of the face shield compared to the outside of the face shield. The particle concentration was significantly lower on the inside of the face shield compared to the outside of the face shield for 12 of the 16 particle sizes measured (P < .05).
Conclusions:
Reductions in virus and particle concentrations were observed on the inside of the face shield; however, viable virus was measured on the inside of the face shield, in the breathing zone of the HCP. Therefore, other exposure control methods need to be used to prevent transmission from virus aerosol.
We assessed hygiene with wet wipes in bedridden patients with urinary catheters for catheter-associated urinary tract infection (CAUTI) prevention. CAUTIs occurred in 16.5% of the control group compared to 5.9% of the intervention group (P = .035). Hygiene with wet wipes can substitute for conventional hygiene for preventing CAUTI.
In a pediatric hospital system over 2 years, 58,607 doses of antibiotic were wasted, an average of 80 doses per day, including drugs in shortage nationwide. Approximately 50% of waste occurred within the first 2 days of admission or the day of discharge, with ampicillin being the most wasted drug (N = 7,789 doses).
Of 731 restricted antimicrobial prescriptions subject to antimicrobial stewardship program (ASP) prospective audit and feedback (PAF) over a 3-year period, 598 PAF recommendations (82%) were fully accepted. Physician auditors had an increased odds of PAF recommendation acceptance, reinforcing the complementary role of the ASP physician in the multidisciplinary ASP team.
Infection prevention program leaders report frequent use of criteria to distinguish recently recovered coronavirus disease 2019 (COVID-19) cases from actively infectious cases when incidentally positive asymptomatic patients were identified on routine severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) polymerase chain reaction (PCR) testing. Guidance on appropriate interpretation of high-sensitivity molecular tests can prevent harm from unnecessary precautions that delay admission and impede medical care.
We used a strand-specific RT-qPCR to evaluate viral replication as a surrogate for infectiousness among 242 asymptomatic inpatients with a positive severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) admission test. Only 21 patients (9%) had detectable SARS-CoV-2 minus-strand RNA. Because most patients were found to be noninfectious, our findings support the suspension of asymptomatic admission testing.
Emergency departments are high-risk settings for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) surface contamination. Environmental surface samples were obtained in rooms with patients suspected of having coronavirus disease 2019 (COVID-19) who did or did not undergo aerosol-generating procedures (AGPs). SARS-CoV-2 RNA surface contamination was most frequent in rooms occupied by COVID-19 patients who received no AGPs.