We study the moduli space of constant scalar curvature Kähler (cscK) surfaces around toric surfaces. To this end, we introduce the class of foldable surfaces: smooth toric surfaces whose lattice automorphism group contains a non-trivial cyclic subgroup. We classify such surfaces and show that they all admit a cscK metric. We then study the moduli space of polarised cscK surfaces around a point given by a foldable surface, and show that it is locally modelled on a finite quotient of a toric affine variety with terminal singularities.
In Malaysia, three ethnic groups identify as “Indigenous Peoples”: the heterogeneous Orang Asli of Peninsular Malaysia, natives of Sabah, and natives of Sarawak. Malaysia’s hybrid legal system confers differing constitutional, statutory, and common law rights and privileges to Indigenous Peoples, which produce distinct yet shared experiences of their land rights. These Indigenous groups were granted differing levels of constitutional privileges during Malaysia’s constitutional formation, which resulted in divergent written laws for the protection and recognition of their customary lands and resources. These differing laws and histories have functioned to dispossess these communities of their traditional lands, territories, and resources in their own ways. The strategy of litigation has afforded Indigenous communities some recourse for gaps in the written law, but common law development of such rights and the court process have equally proven to be a barrier in some cases. Although international commitments to the sustainable management of resources have increased possibilities for the inclusion of Indigenous communities in matters concerning their lands and resources, constitutionally entrenched legal privileges have yet to translate to the effective protection and recognition of traditional Indigenous lands and resources in Malaysia.
Decentralized clinical trials (DCTs) are often hindered by challenges in remotely capturing biomarkers. To address this gap, we developed MyTrials, a mobile application integrated with REDCap, designed to facilitate the remote capture of biomarkers via Bluetooth-enabled remote patient monitoring (RPM) devices. The purpose of the present study was to evaluate the feasibility and acceptability of MyTrials among participants within a DCT design.
Methods:
In this four-arm randomized trial, 47 participants were allocated to receive zero, one, two, or three RPM devices. Participants were asked to use their devices once per week for a total of four weeks to remotely provide biomarkers via MyTrials. Feasibility was assessed using objective metrics of successful biomarker submission (i.e., valid device data accompanied by a video confirming participant identity) alongside the participant-reported Feasibility of Intervention Measure (FIM). Acceptability was evaluated via the Acceptability of Intervention Measure (AIM) and the System Usability Scale (SUS).
Results:
Among participants assigned at least one device, the successful biomarker submission rate was 74% across all study weeks. FIM and AIM scores exceeded prespecified feasibility benchmarks across all conditions except the zero-device condition. SUS scores consistently indicated high usability across all conditions (range: 77.29–94.29).
Conclusions:
The MyTrials platform is a feasible and acceptable solution for remote biomarker capture in DCTs. These findings support the potential of MyTrials to advance remote data collection in clinical research.
This chapter brings into conversation two powerful, imbricated forces in contemporary Nigeria: the dramatic rise in fundamentalist religious Christian and Islamic formations that place hope and prosperity in the afterlife, and the proliferation of community-based technology projects that offer ordinary victims and survivors the power of data as a way to make sense of past and future violence. The chapter argues that these trends are imbricated both with one another and with the history of colonialism from earlier periods to the contemporary moment. The chapter raises questions about the extent to which this Nigerian case study foreshadows a more global shift away from long established (western) authorities – in particular, the law and the nation-state – and toward futures where more and more people could turn toward a kind of moral and political vigilantism, taking the tools for creating hope and meaning (back) into their own hands.
This chapter expands on a series of recent interventions about the consequences of the unraveling of juristocracy at a more diffuse transnational level: consequences for critical scholarship (both disciplinary and interdisciplinary), for the state of (mostly Euro-American) progressive politics, and for the urgent project to imagine alternatives to rights-based frameworks for change and justice-seeking that guard against the use of violence, ethnocentrism, and other expressions of an exclusionary juristocratic reckoning. The chapter begins by summarizing the well-known intellectual historical narrative of notable developments in the wake of the “endtimes” (Hopgood 2013) of human rights and other categories of law that were invested with the weight of social, political, and, to a lesser extent, economic transformation. After focusing on and tracing the afterlives of existing human rights up to the present, the chapter then introduces an alternative vision for what is described as the “future lives” of human rights, a proposition that recognizes the force of the different critiques underlining the profound turn away from human rights in the present, but which nevertheless seeks to go beyond these critiques. Although the original argument for “reinventing human rights” (Goodale 2022) was meant to examine fairly comprehensively the ways in which a radically reformulated account of human rights was still possible, an account, moreover, that might yet prove capable of galvanizing new and more sustainable forms of translocal social and political action, the 2022 intervention nevertheless left certain key concepts rather underdeveloped. As a response, the chapter returns to these key concepts in order to thicken the presentation of a reinvented human rights as a framework for multiscalar social mobilization and justice-seeking. Yet as the chapter emphasizes, this framework does not return “human rights” to its grounding in law – national, regional, or international. In this sense, the proposition builds on the transformative potential of the turn away from certain kinds of law. As the chapter concludes, the case for detaching human rights – conceptually and institutionally – from law seems as compelling as ever, perhaps even more so in light of the violent impotence of the international system writ large in the face of recent crises such as the global COVID-19 pandemic and Russia’s invasion of Ukraine.
Despite their widespread use, purely data-driven methods often suffer from overfitting, lack of physical consistency, and high data dependency, particularly when physical constraints are not incorporated. This study introduces a novel data assimilation approach that integrates Graph Neural Networks (GNNs) with optimization techniques to enhance the accuracy of mean flow reconstruction, using Reynolds-averaged Navier–Stokes (RANS) equations as a baseline. The method leverages the adjoint approach, incorporating RANS-derived gradients as optimization terms during GNN training, ensuring that the learned model adheres to physical laws and maintains consistency. Additionally, the GNN framework is well-suited for handling unstructured data, which is common in the complex geometries encountered in computational fluid dynamics. The GNN is interfaced with the finite element method for numerical simulations, enabling accurate modeling in unstructured domains. We consider the reconstruction of mean flow past bluff bodies at low Reynolds numbers as a test case, addressing tasks such as sparse data recovery, denoising, and inpainting of missing flow data. The key strengths of the approach lie in its integration of physical constraints into the GNN training process, leading to accurate predictions with limited data, making it particularly valuable when data are scarce or corrupted. Results demonstrate significant improvements in the accuracy of mean flow reconstructions, even with limited training data, compared to analogous purely data-driven models.
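To make the physics-constrained training idea concrete, the minimal sketch below combines a data-misfit loss at sparse observation points with a weighted physics-consistency term, in the spirit of the approach described. It assumes PyTorch and PyTorch Geometric; the `MeanFlowGNN` model, the toy graph, and the `physics_residual` placeholder are illustrative stand-ins and do not reproduce the authors' adjoint-based RANS/finite element coupling.

```python
# Minimal sketch (assumptions: PyTorch + PyTorch Geometric are available; the
# "physics residual" below is a stand-in for the RANS/adjoint term described
# in the abstract, not the authors' actual FEM-coupled implementation).
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv
from torch_geometric.data import Data


class MeanFlowGNN(torch.nn.Module):
    """Two-layer graph network mapping nodal inputs to mean-flow fields."""

    def __init__(self, in_dim: int, hidden: int, out_dim: int):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden)
        self.conv2 = GCNConv(hidden, out_dim)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)


def physics_residual(pred: torch.Tensor) -> torch.Tensor:
    """Placeholder for a discretised RANS residual evaluated on the mesh graph.

    In the abstract's method this term would come from the adjoint of the
    finite element RANS solver; here it is only a dummy smoothness penalty
    so the sketch runs end to end.
    """
    return (pred[1:] - pred[:-1]).pow(2).mean()


# Toy unstructured "mesh": 4 nodes, bidirectional edges, 2 input features,
# 3 output fields per node (e.g. u, v, p).
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]])
data = Data(x=torch.randn(4, 2), edge_index=edge_index, y=torch.randn(4, 3))
mask = torch.tensor([True, False, True, False])  # sparse observation locations

model = MeanFlowGNN(in_dim=2, hidden=16, out_dim=3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
lambda_phys = 0.1  # weight of the physics-consistency term

for epoch in range(200):
    optimizer.zero_grad()
    pred = model(data.x, data.edge_index)
    loss_data = F.mse_loss(pred[mask], data.y[mask])  # misfit at sparse sensors
    loss = loss_data + lambda_phys * physics_residual(pred)
    loss.backward()
    optimizer.step()
```

In the actual method, the placeholder residual would be replaced by the RANS residual and its adjoint-derived gradient evaluated on the unstructured mesh underlying the graph.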
We prove existence of flips for log canonical foliated pairs of rank one on a ${\mathbb Q}$-factorial projective klt threefold. This, in particular, provides a proof of the existence of a minimal model for a rank one foliation on a threefold for a wider range of singularities, after McQuillan.
Background: There are numerous ways to measure social markers of health. One reliable method for predicting health outcomes is the social vulnerability index (SVI), which assesses multiple themes, including housing insecurity, socioeconomic status, and minority status. As part of the Multi-site Gram-Negative Surveillance Initiative (MuGSI), surveillance of Extended-Spectrum Beta-Lactamase (ESBL)-producing Enterobacterales was conducted in four Tennessee counties (Maury, Marshall, Wayne, and Lewis). This study examines the association between social vulnerability and infection rates for ESBL-producing Enterobacterales within the surveillance area. Methods: ESBL incident cases reported from July 2019 to December 2023 were analyzed. Cases were defined as the first isolation of Escherichia coli, Klebsiella pneumoniae, or Klebsiella oxytoca resistant to at least one extended-spectrum cephalosporin (ceftazidime, cefotaxime, or ceftriaxone) and non-resistant to all carbapenem antibiotics from urine or normally sterile sites in residents of the surveillance area within a 30-day period. Pearson correlation analysis was conducted to evaluate the association between SVI scores and ESBL infection rates per 1,000 residents at the census tract level, as well as between infection rates and the four SVI ranking variables (socioeconomic status, household characteristics, racial & ethnic minority status, and housing type & transportation). Analysis was conducted using SAS 9.4. Geospatial analysis in ArcGIS Pro v2.9.7 produced a bivariate choropleth map illustrating the interaction between SVI and ESBL infection rates. Results: From 2019–2023, 2,166 ESBL cases were reported. Cases were 21% male and 79% female, with a mean age of 66 years. Incidence rates ranged from 0.19 to 19.5 per 1,000 population. The analysis revealed a significant positive relationship between SVI and tract-level ESBL infection rates: higher vulnerability scores were associated with higher infection rates (r = 0.38427, p = 0.0272). Among the SVI themes, housing type & transportation also demonstrated a statistically significant positive correlation with ESBL infection rates (r = 0.431, p = 0.0121). Conclusion: Information from geocoding surveillance data can be used to identify social groups at increased risk of infections with drug-resistant pathogens. In this study, ESBL infection rate was significantly associated with SVI. Among the four themes, only housing type & transportation was found to be significantly associated with ESBL infection rates. Further research is needed to understand the role housing plays in the spread of ESBL infection, especially across both urban and rural populations. Using SVI scores as a risk assessment tool, infection preventionists and antibiotic stewards can prioritize high-risk areas for intervention.
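As a rough illustration of the tract-level analysis described above, the sketch below computes a Pearson correlation between SVI scores and ESBL infection rates with SciPy; the values are made-up placeholders, not the study's census-tract data (which were analyzed in SAS 9.4).

```python
# Illustrative sketch only: the SVI scores and ESBL rates below are made-up
# placeholders, not the study's census-tract data.
from scipy.stats import pearsonr

svi_scores = [0.12, 0.35, 0.48, 0.61, 0.74, 0.88]  # overall SVI percentile per tract
esbl_rates = [0.4, 1.1, 0.9, 2.3, 3.0, 4.2]        # ESBL cases per 1,000 residents per tract

r, p_value = pearsonr(svi_scores, esbl_rates)
print(f"Pearson r = {r:.3f}, p = {p_value:.4f}")
```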
Background: Infection is a common and highly morbid postoperative complication in victims of physical trauma. Current literature analyzing the infectious sequelae of physical trauma predominantly comes from military data, where blast trauma, rather than blunt or penetrating trauma, is most common. The epidemiology and management of infectious sequelae of civilian trauma are poorly understood, as is perioperative antimicrobial management of trauma laparotomy. Methods: We performed a single-center retrospective chart review using data from the University of Chicago’s electronic medical record (Epic) and the National Trauma Registry. Patients 16 years and older admitted for level 1-2 trauma who underwent laparotomy between 5/1/2018 and 3/18/2023 were included. Using informatics and manual chart review, we analyzed patient demographics, rates of infection, sites of infection, timing of infection from the initial trauma event, and causative organisms. We compared patients based on mechanism of injury (blunt versus penetrating) and whether patients underwent damage control laparotomy (DCL), in which the abdomen is left in discontinuity after the initial laparotomy, or single laparotomy (SL). Results: A total of 430 patients met criteria. The median age was 30. Patients were majority Black (80.9%) and male (80.9%). 80.5% of patients had penetrating trauma, of which 90% were gunshot wounds (GSW); 19.8% had blunt trauma, of which 89% were motor-vehicle crashes (MVC). 19 (4.4%) died during initial stabilization, 199 (46.3%) underwent single laparotomy, and 212 (49.3%) underwent DCL (Figure 1). Of patients who survived initial stabilization, 27 (6.6%) developed a bloodstream infection (BSI), of which 21 (77.8%) came from the DCL group (Figures 2, 3); 19% of BSIs in the DCL group were caused by yeast. 30.7% of patients developed a culture-positive surgical site infection (SSI) or intra-abdominal infection (IAI), with a rate of 40.6% in the DCL group (Table 2). Yeast was isolated in 40.5% of patients with positive cultures, 86.3% of which were isolated in the DCL group, for an overall incidence of 20.8% in the entire DCL group. Median time from arrival to infection diagnosis was 11 days. Patients generally received empiric piperacillin-tazobactam while the abdomen was in discontinuity. Conclusions: Infection in civilian trauma laparotomy often arises as SSI or IAI and is most pronounced in the DCL population. Yeast represents an unexpectedly high proportion of causative organisms. Further research is required to assess whether yeast burden can be mitigated by incorporating antifungal prophylaxis at the time of initial laparotomy or by shortening empiric post-laparotomy antibiotic courses.
Background: Contact precautions reduce nosocomial spread of Clostridioides difficile (C. difficile). However, they can decrease patient interactions with providers and delay discharges, so it is imperative that precautions be discontinued when appropriate. Patients with discordant C. difficile testing (PCR+/Toxin-) require clinical judgment to determine infection versus colonization. Our institution’s C. difficile isolation protocol reflects this by categorizing isolation duration based on C. difficile treatment and type of patient floor. We transitioned to a new electronic medical record (EMR) in June 2024, which included additional decision support for Contact precaution discontinuation. Prior to the new EMR implementation, we hypothesized that patients with discordant C. difficile testing were not being appropriately de-escalated from precautions despite meeting institutional criteria. Methods: This was a retrospective chart review of inpatients admitted to our hospital who had discordant C. difficile testing (PCR+/Toxin-) from July 1, 2023 to October 10, 2023. Patients were excluded if there was no indication of PCR+ (critical value) notification to providers or if they were on Contact precautions for an additional indication besides C. difficile. The primary outcome was the proportion of patients with discordant C. difficile testing who had Contact precautions appropriately discontinued based on internal criteria (Figure 1). Results: A total of 90 patients had discordant C. difficile testing during the study period; 10 were excluded. In the study cohort (n=80), 33.8% (27/80) did not have orders placed for Contact precautions at any time despite a positive PCR (Figure 2). Of the remaining 53 patients who were placed on Contact precautions, the median time to initiation of Contact precautions after PCR notification was one hour and 20 minutes.
Of patients who were placed on Contact precautions (n=53), 30.2% (16/53) were treated and deemed to have clinical infection, while 69.8% (37/53) were diagnosed with colonization and not treated for C. difficile infection. Overall, 84.9% (45/53) had appropriate de-escalation of Contact precautions; the remaining 8 (15.1%) had inappropriate de-escalation of Contact precautions, all in the not treated/colonized group. Conclusion: In our single-institution study, we found higher-than-expected rates of appropriate Contact precaution initiation and discontinuation; however, 15% still had inappropriate precaution durations. Coupled with the surprising number of patients never initiated on precautions after a positive PCR, our results highlight the need to build clear clinical decision support tools within our new EMR and to continually monitor provider adherence to isolation protocols post-implementation.
Background: Rapid identification of patients with carbapenem-resistant Enterobacterales (CRE) on admission to an acute care hospital is critical to prompt initiation of infection control measures. Clinical risk prediction tools can assist in identifying high-risk patients and allow facilities to perform targeted CRE screening. We aimed to prospectively validate a previously developed CRE risk prediction tool that incorporates data from current and prior hospital encounters and was incorporated into our electronic medical record (EMR). Methods: From 6/2024 to 12/2024, we used an automated daily EMR report to calculate a CRE risk score (probability of a CRE clinical culture within 3 days of hospitalization) for all admissions from the previous day at two hospitals in an academic healthcare network in Atlanta, GA. On select days of the week, we approached a convenience sample of approximately 10 patients with the highest risk scores and obtained a peri-rectal swab from consented patients. Swabs were broth enriched and tested on CHROMagar™ ESBL plates. We used MALDI-TOF and/or the Vitek®2 GN74 panel for species identification and antibiotic susceptibility testing. To evaluate testing accuracy, we defined individuals as high risk if they had a CRE risk prediction score in the top quartile of scores among patients approached. We calculated the sensitivity, specificity, and positive and negative predictive values of this threshold for predicting CRE peri-rectal carriage. Results: 9,422 admissions occurred on sampling days; we approached 720, of whom 282 (39%) consented and were tested (Figure 1). Ten patients (3.5%) were positive for CRE: 4 Klebsiella pneumoniae, 3 Escherichia coli, 2 Enterobacter cloacae, and 1 Pantoea species. Among tested individuals, patients with CRE had a higher median CRE risk score (0.19% vs 0.04%), more healthcare exposures, a higher Elixhauser score, and more antibiotic days of therapy (Figure 2). Of the 72 (25%) patients classified as high risk (CRE risk prediction score ≥0.1%), 6 (8.3%) were CRE positive; using this threshold, the sensitivity and specificity were 60% and 80%, respectively, and the positive and negative predictive values were 8% and 98%, respectively. Conclusion: Utilizing an EMR-based risk prediction tool can help identify patients at high risk for CRE colonization. In healthcare facilities with low CRE prevalence, identifying a high-risk subset of patients, even with an 8% probability of CRE, could be a clinically meaningful infection prevention measure. Individual healthcare facilities could adjust the testing threshold based on hospital and population needs.
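For readers unfamiliar with the threshold-based evaluation described in the Results, the sketch below shows how sensitivity, specificity, PPV, and NPV follow from the 2x2 counts once patients are dichotomized at a risk-score cutoff; the counts used are hypothetical, not the study's data.

```python
# Sketch of the threshold-based test-accuracy calculation (illustrative counts only,
# not the study data): patients are labeled "high risk" when their CRE risk score
# is at or above the chosen threshold, then compared against peri-rectal culture results.
def test_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Return sensitivity, specificity, PPV, and NPV from 2x2 table counts."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical example: 12 colonized patients flagged high risk (TP), 3 colonized
# patients missed (FN), 40 non-colonized flagged high risk (FP), 145 correctly low risk (TN).
metrics = test_accuracy(tp=12, fp=40, fn=3, tn=145)
print({k: f"{v:.0%}" for k, v in metrics.items()})
```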
Background: Blood culture contamination places a large burden on the health system, with substantial excess costs and antimicrobial use. In 2024, a national blood culture shortage required intensive conservation strategies for blood culture collection. We developed clinical guidance on blood culture utilization and embedded it in the electronic health record (EHR). Our goal was to evaluate its impact on blood culture utilization and anti-MRSA agent usage at our institution. Methods: The antimicrobial stewardship team provided educational communication and blood culture bottle conservation strategy (BCBCS) recommendations, which were embedded into the EHR in July 2024 (Figure 1). Patient charts with a laboratory-identified blood culture growing a contaminant in December 2023 (prior to BCBCS) and October 2024 (post-BCBCS) were reviewed. Patients were excluded if they had another clinically relevant pathogen in blood cultures, were discharged prior to the blood culture result, or died within 48 hours of the blood culture result. Information on anti-MRSA agent (vancomycin, linezolid, daptomycin, ceftaroline) days of therapy (DOT), total hospital blood culture volume, blood culture contamination rates, and ID consultation was collected. Results: 54 patients pre-BCBCS and 29 patients post-BCBCS were reviewed. Anti-MRSA DOT in reviewed patients with contaminated cultures was 161 pre-BCBCS and 56 post-BCBCS (Table 1). Overall blood culture volume and contamination rate were reduced after BCBCS implementation (Table 2). Total hospital anti-MRSA DOT was also lower after the EHR guidance (1529 pre-BCBCS and 1279 post-BCBCS). Conclusions: Reductions in both the volume of blood culture collection and the overall contamination rate contributed to a reduction in anti-MRSA therapy at our institution. These results highlight the impact that diagnostic stewardship may have on antimicrobial stewardship metrics.
Background: Isolation and cohorting are essential components of an effective infection prevention and control (IPC) program within healthcare settings to prevent the spread of infectious diseases. In Bangladesh, no prior study has explored knowledge or practice of isolation and cohorting. This study aims to investigate the perception and practice of isolation and cohorting among healthcare workers (HCWs) at tertiary care hospitals in Bangladesh. Methods: From September 2020 to January 2021, this hospital-based, multi-center cross-sectional study was conducted in seven tertiary hospitals across Bangladesh. Participants were HCWs (physicians, nurses, cleaning staff) who were involved in direct patient care and agreed to participate. A pre-tested structured questionnaire was administered through face-to-face interviews for data collection. Descriptive and multivariate analyses were performed using STATA Version 15. Compliance with isolation precautions was categorized into “good” and “poor” groups based on the mean response score. Results: A total of 1,511 HCWs were interviewed. Overall, 88.7% of HCWs were familiar with the terms ‘isolation’ and ‘cohorting’; still, 40.5% of them did not have a thorough understanding of when to implement these strategies. Only 18.0% of the HCWs reported good compliance with isolation and cohorting practice. Entry-level (under 31 years) HCWs exhibited better compliance with isolation and cohorting than mid-level (31 to 40 years) HCWs (59.6% vs 29.4%). The majority of HCWs (93.7%) who participated in infection control training demonstrated a high level of compliance with isolation. For good compliance with isolation precautions, physicians and nurses had higher odds (OR: 11.0) than cleaning staff. Moreover, HCWs with < 5 years of experience were more likely to demonstrate good adherence (OR: 1.56, 95% CI: 1.09-2.21) than those with over 10 years of experience. Conclusions: The study revealed that HCWs have a limited understanding and implementation of isolation and cohorting practices. To address this issue, policymakers and hospital leadership should adopt strategies that promote these practices by providing regular training and awareness programs for healthcare workers. Such initiatives are essential to help limit the spread of infections in healthcare settings.
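As an illustration of how adjusted odds ratios such as those above (e.g., OR 1.56, 95% CI 1.09-2.21) are typically derived, the sketch below fits a logistic regression on synthetic data and exponentiates the coefficients and their confidence limits; the study itself used STATA Version 15, and the variables here are hypothetical stand-ins.

```python
# Sketch (synthetic data; the study itself used STATA 15): odds ratios and 95% CIs
# for good compliance are obtained by exponentiating logistic-regression coefficients.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "experience_lt5yr": rng.integers(0, 2, n),  # <5 years of experience (vs >10 years)
    "ipc_training": rng.integers(0, 2, n),      # attended infection control training
})
logit = -1.5 + 0.45 * df["experience_lt5yr"] + 1.2 * df["ipc_training"]
df["good_compliance"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(df[["experience_lt5yr", "ipc_training"]])
model = sm.Logit(df["good_compliance"], X).fit(disp=0)

odds_ratios = np.exp(model.params)
ci = np.exp(model.conf_int())  # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios.rename("OR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```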
Background: Treprostinil, a prostacyclin analog, is used to manage pulmonary hypertension (PH) through continuous intravenous (IV) infusion via a central venous catheter (CVC) or continuous subcutaneous (SC) infusion via a small infusion pump connected to a catheter. This study compares the incidence and types of infections between IV and SC administration in a single-center cohort. Methods: We analyzed 49 PH patients receiving treprostinil at the Hartford Hospital PH Center, all managed under standardized hygiene protocols by the same healthcare team. Of these, 34 received IV administration and 15 SC, based on patient preference and PH specialist recommendations. The primary outcome was the incidence of infection in each group during the study period. The secondary outcome was the type of infection, including bacteremia, cellulitis, or other skin infections, associated with IV or SC administration. Results: The incidence of bacteremia was significantly higher in the IV group, with 7 cases (5 isolated bacteremia and 2 bacteremia with cellulitis), representing 20.6%. In contrast, there were no bacteremia cases in the SC group. Cellulitis was more common in the SC group (20%; 3 of 15 patients) than in the IV group (8.8%; 3 of 34 patients). Notably, 2 cases of cellulitis in the IV group were associated with bacteremia, while all 3 cases in the SC group were isolated, with 1 progressing to an abscess requiring incision and drainage. The overall infection rate (bacteremia and cellulitis combined) was higher in the IV group (29.4%) than in the SC group (20%). These findings emphasize the higher risk of bacteremia in the IV group and show that, while cellulitis occurred more frequently in the SC group, the overall infection burden was greater in the IV group. Conclusion: Previous studies show comparable efficacy between IV and SC treprostinil (Remodulin) when properly dosed. Our findings, despite a small sample size, reveal a higher overall risk of infections, particularly bloodstream infections (BSIs), with IV therapy due to CVC use. This aligns with existing literature identifying catheter-related infections as a key concern. These results support SC treprostinil as a safer option, especially for reducing BSI risk. We plan to incorporate these findings into our counseling protocol, acknowledging the need for further validation.
Background: Upper respiratory infections (URIs) are common causes of outpatient visits in children. While many URIs are viral, antimicrobial prescribing remains high. In preparation for action planning to address this issue within our multi-state health system, this study aimed to characterize current antimicrobial prescribing patterns for pediatric URIs in our outpatient setting. Methods: Retrospective analysis of pediatric (<18 years) antimicrobial prescribing for URI diagnosis codes in 639 outpatient sites (nine states), including clinics, urgent care centers, and emergency departments (EDs), between July 1, 2023 and June 30, 2024. The primary outcome was the overall antimicrobial prescribing rate for URIs and by individual URI diagnosis (sinusitis, bronchitis, pharyngitis, otitis media). A logistic regression machine learning model was used with SHapley Additive exPlanations (SHAP) analysis to show feature contributions to antimicrobial prescribing. Results: A total of 125,590 patient visits by children with URI were included. Antimicrobial prescribing rates varied by diagnosis (sinusitis: 53%, bronchitis: 18%, pharyngitis: 45%, otitis media: 40%, p<0.001). Overall prescribing ranged from 18% to 52% across states. Patients seen in the ED had the lowest use of antimicrobials (25%), while those seen in urgent care had the highest utilization (58%). Non-bronchitis diagnosis, non-ED encounters, ≤10 years of age, and specific states had the strongest positive associations with antimicrobial prescribing, while race and social vulnerability index (SVI) were not associated. Conclusions: Antimicrobials were most commonly prescribed for pediatric patients seen for sinusitis, pharyngitis, and otitis media. Factors most associated with increased prescribing included non-ED encounters, >10 years of age, and geography. These data support action to standardize practices and address clinical variation.
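The sketch below illustrates the kind of logistic regression plus SHAP workflow named in the Methods, using synthetic data and the `shap` package; the features are hypothetical stand-ins for the encounter-level variables in the abstract, not the study's actual model.

```python
# Sketch with synthetic data (assumes the shap package is installed); feature names
# are illustrative stand-ins for the encounter-level variables in the abstract.
import numpy as np
import pandas as pd
import shap
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
X = pd.DataFrame({
    "age_le_10": rng.integers(0, 2, n),          # patient aged 10 years or younger
    "urgent_care_visit": rng.integers(0, 2, n),  # seen in urgent care (vs clinic/ED)
    "bronchitis_dx": rng.integers(0, 2, n),      # bronchitis diagnosis code
})
logit = -0.8 + 0.6 * X["age_le_10"] + 1.0 * X["urgent_care_visit"] - 1.2 * X["bronchitis_dx"]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))    # antimicrobial prescribed (yes/no)

model = LogisticRegression().fit(X, y)

# SHAP values quantify each feature's contribution to each visit's predicted log-odds.
explainer = shap.Explainer(model, X)
shap_values = explainer(X)
mean_abs = np.abs(shap_values.values).mean(axis=0)  # average magnitude per feature
print(dict(zip(X.columns, mean_abs.round(3))))
```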
Respiratory viral infection (RVI) outbreaks pose a significant threat to health. They are associated with patient morbidity and mortality, staff absenteeism, and financial burden on the healthcare system. There is a need for strategies to reduce RVI transmission in hospitals. One proposal is implementation of continuous masking policies; however, the effectiveness of such policies in mitigating RVI spread is unclear. We conducted a systematic review of the literature to determine the effectiveness of continuous masking in reducing the incidence and transmission of RVIs among patients and healthcare workers (HCWs) in hospitals. We systematically searched for original articles published between 2000 and 2024 according to pre-determined search criteria. Studies were screened by two reviewers in Covidence. One reviewer extracted the data from eligible studies into a pre-determined data extraction form. For studies that reported only count data, results were summarized narratively. Meta-analysis to pool unadjusted or adjusted outcome measures for studies that report a statistical comparison between masking policies and transmission of infections will be considered if appropriate. Joanna Briggs Institute tools will be used for critical appraisal. A total of 3,691 studies were identified, of which 17 met eligibility criteria. Twelve studies were conducted in single-center adult hospitals; 4 were conducted in pediatric hospitals, including 2 in neonatal centers; and one was conducted across a hospital system. The studied infections were influenza A/B, parainfluenza 1-3, adenovirus, respiratory syncytial virus (RSV), traditional human coronavirus strains, human metapneumovirus, SARS-CoV-2, and rhinovirus/enterovirus. Eight studies assessed the impact of a masking policy on infection rates in patients, and all 8 reported that masking policies reduced RVI transmission in patients. Nine studies assessed the impact of a masking policy on infection rates in HCWs; 7 reported reductions in RVI transmission in HCWs, whereas 2 showed no statistically significant change. Overall, the studies identified in this systematic review associated continuous masking with a reduction in RVI transmission among patients. The evidence was less consistent for preventing RVI transmission among HCWs, with two studies reporting it was not effective. Our findings suggest that masking policies may play a role in RVI prevention, but there are significant limitations, including the reliance on observational designs and the use of masking in conjunction with other prevention measures. However, assessment of the quality of the papers is pending. Future directions will include assessing secondary outcomes such as masking adherence and assessing adjusted analyses that control for confounding, which are critically important.
Background: Overdiagnosis of C. difficile in hospitalized patients is common and contributes to misdiagnosis, unnecessary treatment, and overestimation of nosocomial infection rates. Many institutions, including ours, have implemented computerized clinical decision support (CCDS) with reductions in testing rates, but long-term data on the impact of such interventions are limited. Methods: A previously reported CCDS intervention, paired with an education campaign and a trainee financial incentive, was implemented in December 2016. A laxative alert was added in 2018, and testing changed from NAAT only to two-step testing in 2020. Hospital-onset C. difficile infection (HO-CDI) cases have been reviewed by members of the antimicrobial stewardship team in real time for diagnostic and antimicrobial prescribing opportunities for improvement (OFIs) since 2016, with a stable workflow for unit leadership notification and data entry in REDCap since June 2023. Diagnostic OFI categories are based on themes from early iterations of this case-based review process and include: clinical criteria not met, stool criteria not met, alternative explanation for diarrhea, smells like C. difficile, test of cure, duplicate test, delayed collection, delayed testing, and other. We analyzed reviews from 6/1/2023 to 12/31/2024 and further classified diagnostic OFI determinations as “No OFI”, “Inappropriate”, or “Appropriate with process OFI”. During the study period there was no ongoing financial incentive or concerted diagnostic stewardship educational campaign, though feedback continued to be provided to individuals and groups based on case reviews, and a single question regarding C. difficile testing was maintained in annual re-training. Results: There were 144 HO-CDI cases reviewed, with no diagnostic OFI in 98 (68%). Testing was inappropriate in 16 (11%) and appropriate with process OFIs in 30 (21%). The most common process OFIs were other-stool documentation (11), delayed testing (7), other-lack of discussion with preexisting ID consult (6), and delayed sample collection (5). In cases with delayed testing, earlier testing was not prevented by CCDS in any case. Conclusions: We found relatively low rates of inappropriate testing (11%) over a time period seven years out from the initial implementation of CCDS, without ongoing active house-wide diagnostic stewardship initiatives. Carefully designed and implemented CCDS can be a valuable tool that facilitates sustained improvement and allows resources to be allocated to new efforts. We additionally observed no cases of delayed diagnosis attributable to CCDS with the combination of established institutional criteria for testing and two-step testing.
Background: Candida auris (C. auris) and carbapenemase-producing organisms (CPOs) are rapidly emerging causes of healthcare-associated infections (HAIs) with high mortality. Early identification and isolation of colonized patients are crucial in preventing spread. Currently, both organism types are uncommonly encountered in Oregon, and local public health guidance therefore advises travel-related screening as an important component of regional prevention. In 2024, VA Portland Health Care System (VAPORHCS) implemented a C. auris/CPO travel screening program as a quality improvement project. Methods: Using the Plan-Do-Study-Act (PDSA) framework, starting 4/1/2024, patients admitted to acute care were asked, “Have you had an overnight stay in a hospital, nursing home, or other healthcare facility outside of Oregon or Washington in the last year?” If patients responded affirmatively, the admitting nurse educated the patient and collected swabs after verbal consent: axilla/groin swabs for C. auris and peri-rectal swabs for CPOs. Patients were placed on empiric contact precautions in a single-bed room while awaiting results. Infection prevention staff prospectively monitored the implementation, and medical records were reviewed retrospectively. Results: The PDSA framework informed the implementation and helped organize the approach to addressing barriers such as missed screenings, communication breakdowns, complex disinfection protocols, the need for staff re-education, and delayed C. auris results (see Figure).
Of 3199 acute care admissions between 4/1/24 and 11/30/24, 72 patients (2.3%) reported a qualifying travel-related risk factor. 64 patients reported overnight healthcare stays elsewhere in the United States and Territories (including 5 in Puerto Rico), while 8 patients had international exposure (Mexico n=6, Philippines n=2). Of the 72 patients with qualifying travel, 9 were not tested (patient refused n=2, staff deemed inappropriate n=3, readmission n=1, unknown/technical issues n=3). An additional 32 patients (1%) initially reported qualifying travel, but on chart review travel was not confirmed; of those, 16 had testing performed, all of which were negative. The average C. auris test turnaround time was 7.7 days, with a range of 3-18 days. One patient (2.4%) tested positive for CPO. Conclusion: The C. auris/CPO screening program was effectively implemented and identified one positive CPO case, preventing the need for an urgent outbreak investigation. The PDSA framework helped the organization methodically plan the implementation and address barriers. The long turnaround time for C. auris testing resulted in an undesirable duration of empiric contact precautions. Continued evaluation of program metrics and public health recommendations is critical to sustainment and refinement over time.
Background: We aimed to examine the impact of daily bathing with chlorhexidine gluconate (CHG) on central line-associated bloodstream infections (CLABSIs), catheter-associated urinary tract infections (CAUTIs), and bloodstream infections with methicillin-resistant Staphylococcus aureus (LabID MRSA) across a large, rural healthcare system. This healthcare system encompasses 8 large community hospitals, one academic hospital, and 11 hospitals with 50 or fewer beds. Starting in August 2023, all facilities were required to adopt daily CHG bathing for patients with central lines and/or in intensive care units. Some facilities also chose to adopt daily CHG bathing for patients with Foley catheters. Methods: We analyzed the hospital-wide monthly incidence of select healthcare-associated infections (HAIs) in the year before and after implementation of CHG bathing across a large, decentralized, rural healthcare system. We conducted negative binomial regressions to examine the difference in HAIs before and after implementation of CHG bathing, and we used the National Healthcare Safety Network’s (NHSN) predicted numbers of HAIs to adjust for confounding among hospitals. Results: After adjusting for each hospital’s predicted number of infections, we saw a 40.1% decrease in CLABSIs (p=0.008) and a 33.2% reduction in CAUTIs (p=0.018, Table 1); we also observed a 34.3% reduction in LabID MRSA, although this was not statistically significant (p=0.105). Conclusion: System-wide implementation of daily CHG bathing in a large, decentralized, rural healthcare system was associated with a significant reduction in CLABSIs and CAUTIs.
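As an illustration of the adjustment strategy described in the Methods, the sketch below fits a negative binomial model of monthly HAI counts with a pre/post indicator, using the log of NHSN-predicted infections as an offset; the data, variable names, and dispersion parameter are synthetic placeholders, not the system's surveillance data.

```python
# Illustrative sketch (synthetic monthly counts): a negative binomial model of HAI counts
# with a pre/post indicator and the log of NHSN-predicted infections as an offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_months, n_hospitals = 24, 20
df = pd.DataFrame({
    "post_chg": np.repeat([0, 1], n_months // 2 * n_hospitals),  # after CHG bathing rollout
    "nhsn_predicted": rng.uniform(0.5, 5.0, n_months * n_hospitals),
})
# Simulate counts whose expected value scales with the NHSN-predicted number of HAIs.
true_rate = np.exp(np.log(df["nhsn_predicted"]) - 0.5 * df["post_chg"])
df["observed_hais"] = rng.poisson(true_rate)

X = sm.add_constant(df[["post_chg"]])
model = sm.GLM(
    df["observed_hais"], X,
    family=sm.families.NegativeBinomial(alpha=0.5),
    offset=np.log(df["nhsn_predicted"]),
).fit()

pct_change = (1 - np.exp(model.params["post_chg"])) * 100
print(f"Estimated % reduction after implementation: {pct_change:.1f}%")
print(model.summary())
```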