Because pediatric anxiety disorders precede the onset of many other problems, successful prediction of response to the first-line treatment, cognitive-behavioral therapy (CBT), could have a major impact. This study evaluates whether structural and resting-state functional magnetic resonance imaging can predict post-CBT anxiety symptoms.
Methods
Two datasets were studied: (A) n = 54 subjects with an anxiety diagnosis who received 12 weeks of CBT, and (B) n = 15 subjects treated for 8 weeks. Connectome predictive modeling (CPM) was used to predict treatment response, as assessed with the Pediatric Anxiety Rating Scale (PARS). The main analysis included network edges positively correlated with treatment outcome, together with age, sex, and baseline anxiety severity, as predictors. Results from alternative models and analyses are also presented. Model assessments used 1000 bootstraps, yielding 95% CIs for R², r, and mean absolute error (MAE).
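As a rough illustration of the evaluation step only (not the authors' code or the CPM feature-selection procedure), the bootstrapped 95% CIs for R², r, and MAE could be computed along these lines, assuming observed and predicted post-treatment scores are available as arrays:

```python
import numpy as np
from scipy.stats import pearsonr

def bootstrap_metrics(y_true, y_pred, n_boot=1000, seed=0):
    """Bootstrap 95% CIs for R^2, Pearson r, and MAE of predicted outcomes."""
    rng = np.random.default_rng(seed)
    n = len(y_true)
    stats = {"r2": [], "r": [], "mae": []}
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)              # resample subjects with replacement
        yt, yp = y_true[idx], y_pred[idx]
        ss_res = np.sum((yt - yp) ** 2)
        ss_tot = np.sum((yt - yt.mean()) ** 2)
        stats["r2"].append(1 - ss_res / ss_tot)  # R^2 can be negative for poor models
        stats["r"].append(pearsonr(yt, yp)[0])
        stats["mae"].append(np.mean(np.abs(yt - yp)))
    return {k: (np.percentile(v, 2.5), np.percentile(v, 97.5)) for k, v in stats.items()}
```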
Results
The main model showed an MAE of approximately 3.5 (95% CI [3.1–3.8]) points, an R² of 0.08 [−0.14 to 0.26], and an r of 0.38 [0.24–0.511]. When this model was tested in the left-out sample (B), results were similar, with an MAE of 3.4 [2.8–4.7], an R² of −0.65 [−2.29 to 0.16], and an r of 0.4 [0.24–0.54]. The anatomical metrics showed a similar pattern, with models yielding overall low R² values.
Conclusions
The analysis showed that models based on earlier promising results failed to predict clinical outcomes. Despite the small sample size, this study does not support the extensive use of CPM to predict outcomes in pediatric anxiety.
The association between cannabis and psychosis is established, but the role of underlying genetics is unclear. We used data from the EU-GEI case-control study and UK Biobank to examine the independent and combined effect of heavy cannabis use and schizophrenia polygenic risk score (PRS) on risk for psychosis.
Methods
Genome-wide association study summary statistics from the Psychiatric Genomics Consortium and the Genomic Psychiatry Cohort were used to calculate schizophrenia and cannabis use disorder (CUD) PRS for 1,098 participants from the EU-GEI study and 143,600 from the UK Biobank. Both datasets had information on cannabis use.
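Polygenic risk scores of this kind are conventionally computed as a weighted sum of risk-allele dosages, with weights taken from GWAS summary statistics. A minimal sketch under that assumption (file names, column names, and clumping/thresholding steps are hypothetical, not the EU-GEI or UK Biobank pipelines):

```python
import pandas as pd

# Hypothetical inputs: per-variant effect sizes from GWAS summary statistics
# and per-participant risk-allele dosages (0, 1, or 2 per variant).
weights = pd.read_csv("sumstats_clumped.csv")          # columns: snp, beta (log-odds)
dosages = pd.read_csv("dosages.csv", index_col="id")   # rows: participants, columns: snp

shared = weights.loc[weights["snp"].isin(dosages.columns)]
# PRS = sum over variants of (effect size x risk-allele dosage)
prs = dosages[shared["snp"]].to_numpy() @ shared["beta"].to_numpy()
prs_z = (prs - prs.mean()) / prs.std()                 # standardise for regression models
```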
Results
In both samples, schizophrenia PRS and cannabis use independently increased the risk of psychosis. Schizophrenia PRS was not associated with patterns of cannabis use in the EU-GEI cases or controls or in UK Biobank cases. It was associated with lifetime and daily cannabis use among UK Biobank participants without psychosis, but the effect was substantially reduced when CUD PRS was included in the model. In the EU-GEI sample, regular users of high-potency cannabis had the highest odds of being a case independently of schizophrenia PRS (daily use of high-potency cannabis, adjusted for PRS: OR = 5.09, 95% CI 3.08–8.43, p = 3.21 × 10⁻¹⁰). We found no evidence of interaction between schizophrenia PRS and patterns of cannabis use.
Conclusions
Regular use of high-potency cannabis remains a strong predictor of psychotic disorder independently of schizophrenia PRS, which does not seem to be associated with heavy cannabis use. These are important findings at a time of increasing use and potency of cannabis worldwide.
NHS Scotland, one of the keystone healthcare providers in the UK, has recently set a wide variety of sustainability targets in an effort to mitigate waste and the intensive energy demands of healthcare. Medical garment production, management and design is an area that design researchers can explore and in which they can offer solutions. This paper presents a series of co-design explorations to examine design alternatives to single-use theatre caps, the majority of which are currently disposed of routinely. Using a series of probes, the paper presents major insights into how theatre cap design may be improved.
The origins and timing of inpatient room sink contamination with carbapenem-resistant organisms (CROs) are poorly understood.
Methods:
We performed a prospective observational study to describe the timing, rate, and frequency of CRO contamination of in-room handwashing sinks in 2 intensive care units (ICU) in a newly constructed hospital bed tower. Study units, A and B, were opened to patient care in succession. The patients in unit A were moved to a new unit in the same bed tower, unit B. Each unit was similarly designed with 26 rooms and in-room sinks. Microbiological samples were taken every 4 weeks from 3 locations from each study sink: the top of the bowl, the drain cover, and the p-trap. The primary outcome was sink conversion events (SCEs), defined as CRO contamination of a sink in which CRO had not previously been detected.
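As a rough illustration of the primary outcome, a sink conversion event can be read off longitudinal culture data as the first sampling round at which a previously CRO-negative sink yields a CRO. A minimal sketch assuming a simple long-format table (sink identifiers and column names are illustrative, not the study's data):

```python
import pandas as pd

# Hypothetical long-format data: one row per sink per sampling round.
cultures = pd.DataFrame({
    "sink":  ["A01", "A01", "A01", "B07", "B07"],
    "round": [1, 2, 3, 1, 2],
    "cro_positive": [False, True, True, False, False],
})

# Sink conversion event (SCE): first round at which a sink turns CRO positive.
first_positive = (cultures[cultures["cro_positive"]]
                  .groupby("sink")["round"].min()
                  .rename("sce_round"))
n_sce = first_positive.size   # number of sinks that ever converted
```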
Results:
Sink samples were obtained 22 times from September 2020 to June 2022, giving 1,638 total environmental cultures. In total, 2,814 patients were admitted to study units while sink sampling occurred. We observed 35 SCEs (73%) overall; 9 sinks (41%) in unit A became contaminated with CRO by month 10, and all 26 sinks became contaminated in unit B by month 7. Overall, 299 CRO isolates were recovered; the most common species were Enterobacter cloacae and Pseudomonas aeruginosa.
Conclusion:
CRO contamination of sinks in 2 newly constructed ICUs was rapid and cumulative. Our findings support in-room sinks as reservoirs of CRO and emphasize the need for prevention strategies to mitigate contamination of hands and surfaces from CRO-colonized sinks.
Various water-based heater-cooler devices (HCDs) have been implicated in nontuberculous mycobacteria (NTM) outbreaks. Ongoing rigorous surveillance for healthcare-associated M. abscessus (HA-Mab), put in place following a prior institutional outbreak of M. abscessus, alerted investigators to a cluster of 3 extrapulmonary M. abscessus infections among patients who had undergone cardiothoracic surgery.
Methods:
Investigators convened a multidisciplinary team and launched a comprehensive investigation to identify potential sources of M. abscessus in the healthcare setting. Adherence to tap water avoidance protocols during patient care and HCD cleaning, disinfection, and maintenance practices were reviewed. Relevant environmental samples were obtained. Patient and environmental M. abscessus isolates were compared using multilocus-sequence typing and pulsed-field gel electrophoresis. Smoke testing was performed to evaluate the potential for aerosol generation and dispersion during HCD use. The entire HCD fleet was replaced to mitigate continued transmission.
Results:
Clinical presentations of case patients and epidemiologic data supported intraoperative acquisition. M. abscessus was isolated from HCDs used on patients, and molecular comparison with patient isolates demonstrated clonality. Smoke testing simulated aerosolization of M. abscessus from HCDs during device operation. Since the HCD fleet was replaced, no additional extrapulmonary HA-Mab infections due to the unique clone identified in this cluster have been detected.
Conclusions:
Despite cleaning and disinfection practices that exceeded the manufacturer's instructions for use, HCDs became colonized with and ultimately transmitted M. abscessus to 3 patients. Design modifications to better contain aerosols or filter exhaust during device operation are needed to prevent NTM transmission events from water-based HCDs.
We compared the number of blood-culture events before and after the introduction of a blood-culture algorithm and provider feedback. Secondary objectives were to compare blood-culture positivity and negative safety signals before and after the intervention.
Design:
Prospective cohort design.
Setting:
Two surgical intensive care units (ICUs): general and trauma surgery, and cardiothoracic surgery.
Patients:
Patients aged ≥18 years and admitted to the ICU at the time of the blood-culture event.
Methods:
We used an interrupted time series to compare rates of blood-culture events (ie, blood-culture events per 1,000 patient days) before and after the algorithm implementation with weekly provider feedback.
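A minimal sketch of the kind of segmented Poisson model commonly used for interrupted time series analyses of event rates per 1,000 patient days (the variable names, weekly aggregation, and model form are assumptions for illustration, not the authors' exact specification):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical weekly counts of blood-culture events and patient-days for one ICU.
df = pd.read_csv("weekly_counts.csv")   # columns: week, events, patient_days, post
# 'post' = 1 for weeks after algorithm implementation, else 0.

model = smf.glm(
    "events ~ week + post + post:week",      # level and slope change at implementation
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["patient_days"]),       # models the rate per patient-day
).fit()
irr = np.exp(model.params["post"])           # incidence rate ratio for the level change
```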
Results:
The blood-culture event rate decreased from 100 to 55 blood-culture events per 1,000 patient days in the general surgery and trauma ICU (72% reduction; incidence rate ratio [IRR], 0.38; 95% confidence interval [CI], 0.32–0.46; P < .01) and from 102 to 77 blood-culture events per 1,000 patient days in the cardiothoracic surgery ICU (55% reduction; IRR, 0.45; 95% CI, 0.39–0.52; P < .01). We did not observe any differences in average monthly antibiotic days of therapy, mortality, or readmissions between the pre- and postintervention periods.
Conclusions:
We implemented a blood-culture algorithm with data feedback in 2 surgical ICUs, and we observed significant decreases in the rates of blood-culture events without an increase in negative safety signals, including ICU length of stay, mortality, antibiotic use, or readmissions.
Large research teams and consortia present challenges for authorship. The number of disciplines involved in the research can further complicate approaches to manuscript development and leadership. The CHARM team, representing a multi-disciplinary, multi-institutional genomics implementation study, participated in facilitated discussions inspired by team science methodologies. The discussions were centered on team members’ past experiences with authorship and perspectives on authorship in a large research team context. Team members identified challenges and opportunities that were used to create guidelines and administrative tools to support manuscript development. The guidelines were organized by the three values of equity, inclusion, and efficiency and included eight principles. A visual dashboard was created to allow all team members to see who was leading or involved in each paper. Additional tools to promote equity, inclusion, and efficiency included providing standardized project management for each manuscript and making “concept sheets” for each manuscript accessible to all team members. The process used in CHARM can be used by other large research teams and consortia to equitably distribute lead authorship opportunities, foster coauthor inclusion, and efficiently work with large authorship groups.
Teenagers often present in crisis with risk issues, mainly risk to self but sometimes risk to others. Adolescent violence is commonplace and is not just the remit of adolescent forensic psychiatry. Clinicians may lack confidence in assessing risk of violence and can neglect vital areas that are essential to reduce risk. Use of structured violence risk assessments enables the multi-agency professional network to formulate a young person's presentation and their violence in a holistic way and consequently develop targeted risk management plans addressing areas such as supervision, interventions and case management to reduce the risk of future violence. Of the several validated tools developed for young people, the Structured Assessment of Violence Risk in Youth (SAVRY™) is the one most used by UK-based forensic adolescent clinicians. This article outlines the epidemiology, causes and purposes of violence among adolescents; discusses types of risk assessment tool; explores and deconstructs the SAVRY; and presents a fictitious risk formulation.
Background: Blood cultures are commonly ordered for patients with a low risk of bacteremia. Liberal blood-culture ordering increases the risk of false-positive results, which can lead to increased length of stay, excess antibiotics, and unnecessary diagnostic procedures. We implemented a blood-culture indication algorithm with data feedback and assessed the impact on ordering volume and percent positivity. Methods: We performed a prospective cohort study from February 2022 to November 2022 using historical controls from February 2020 to January 2022. We introduced the blood-culture algorithm (Fig. 1) in 2 adult surgical intensive care units (ICUs). Clinicians reviewed charts of eligible patients with blood cultures weekly to determine whether the blood-culture algorithm was followed, and they provided feedback to the unit medical directors weekly. We defined a blood-culture event as ≥1 blood culture within 24 hours. We excluded patients aged <18 years, patients with an absolute neutrophil count <500, and heart and lung transplant recipients at the time of blood-culture review. Results: In total, 7,315 blood-culture events in the preintervention group and 2,506 blood-culture events in the postintervention group met eligibility criteria. The average monthly blood-culture rate decreased from 190 to 142 blood cultures per 1,000 patient days (P < .01) after the algorithm was implemented (Fig. 2). The average monthly blood-culture positivity increased from 11.7% to 14.2% (P = .13). Average monthly days of antibiotic therapy (DOT) was lower in the postintervention period than in the preintervention period (2,200 vs 1,940; P < .01) (Fig. 3). ICU length of stay did not change between the preintervention and postintervention periods: 10 days (IQR, 5–18) versus 10 days (IQR, 5–17); P = .63. The in-hospital mortality rate was lower during the postintervention period, but the difference was not statistically significant: 9.24% versus 8.34% (P = .17). All-cause 30-day mortality was significantly lower during the intervention period: 11.9% versus 9.7% (P < .01). The unplanned 30-day readmission percentage was also significantly lower during the intervention period (10.6% vs 7.6%; P < .01). Over the 9-month intervention, we reviewed 916 blood-culture events in 452 unique patients. Overall, 74.6% of blood cultures followed the algorithm. The most common reasons for ordering blood cultures were severe sepsis or septic shock (37%), isolated fever and/or leukocytosis (19%), and documenting clearance of bacteremia (15%) (Table 1). The most common indication for inappropriate blood cultures was isolated fever and/or leukocytosis (53%). Conclusions: We introduced a blood-culture algorithm with data feedback in 2 surgical ICUs and observed decreases in blood-culture volume without a negative impact on ICU LOS or mortality rate.
We assessed Oxivir Tb wipe disinfectant residue in a controlled laboratory setting to evaluate whether residual disinfectant explained low environmental contamination with SARS-CoV-2. The frequency of viral RNA detection was not statistically different between intervention and control arms on day 3 (P = .14). Viability of environmental contamination is low, and residual disinfectant did not significantly contribute to the low contamination.
This retrospective review of 4-year surveillance data revealed a higher central line-associated bloodstream infection (CLABSI) rate in non-Hispanic Black patients and higher catheter-associated urinary tract infection (CAUTI) rates in Asian and non-Hispanic Black patients compared with White patients despite similar catheter utilization between the groups.
A combination of morphological and molecular techniques was used to revise the genus Ellescus Dejean, 1821 (Coleoptera: Curculionidae: Ellescini) in North America. Four valid species of Ellescus are documented from the Nearctic Region. These are the widespread, hypervariable E. ephippiatus (Say, 1831); the Holarctic E. bipunctatus (Linnaeus, 1758), of which E. borealis (Carr, 1920) is found to be a new junior synonym; the west coast endemic E. californicus (Casey, 1885), resurrected from synonymy with E. ephippiatus (Say, 1831); and the temperately distributed E. michaeli, new species. A neotype is designated for E. bipunctatus. The European species, E. scanicus (Paykull, 1792), is determined to have been erroneously reported from North America. An illustrated identification key, distributional data, and DNA sequences (CO1, ITS2) are provided to facilitate identification of the Ellescus species in North America. Notably, CO1 failed to delineate E. ephippiatus and E. michaeli, but the faster-evolving ITS2 reliably separated these taxa, further supporting the use of multiple markers in taxonomic studies and the utility of ITS2 in weevil species delineation.
We present WALLABY pilot data release 1, the first public release of H i pilot survey data from the Wide-field ASKAP L-band Legacy All-sky Blind Survey (WALLABY) on the Australian Square Kilometre Array Pathfinder. Phase 1 of the WALLABY pilot survey targeted three $60\,\mathrm{deg}^{2}$ regions on the sky in the direction of the Hydra and Norma galaxy clusters and the NGC 4636 galaxy group, covering the redshift range of $z \lesssim 0.08$. The source catalogue, images and spectra of nearly 600 extragalactic H i detections and kinematic models for 109 spatially resolved galaxies are available. As the pilot survey targeted regions containing nearby group and cluster environments, the median redshift of the sample of $z \approx 0.014$ is relatively low compared to the full WALLABY survey. The median galaxy H i mass is $2.3 \times 10^{9}\,{\rm M}_{\odot}$. The target noise level of $1.6\,\mathrm{mJy}$ per 30′′ beam and $18.5\,\mathrm{kHz}$ channel translates into a $5\sigma$ H i mass sensitivity for point sources of about $5.2 \times 10^{8} \, (D_{\rm L} / \mathrm{100\,Mpc})^{2} \, {\rm M}_{\odot}$ across 50 spectral channels (${\approx} 200\,\mathrm{km\,s}^{-1}$) and a $5\sigma$ H i column density sensitivity of about $8.6 \times 10^{19} \, (1 + z)^{4}\,\mathrm{cm}^{-2}$ across 5 channels (${\approx} 20\,\mathrm{km\,s}^{-1}$) for emission filling the 30′′ beam. As expected for a pilot survey, several technical issues and artefacts are still affecting the data quality. Most notably, there are systematic flux errors of up to several 10% caused by uncertainties about the exact size and shape of each of the primary beams as well as the presence of sidelobes due to the finite deconvolution threshold. In addition, artefacts such as residual continuum emission and bandpass ripples have affected some of the data. The pilot survey has been highly successful in uncovering such technical problems, most of which are expected to be addressed and rectified before the start of the full WALLABY survey.
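As a consistency check (not stated explicitly in the abstract, but following from the standard H i mass relation and a channel width of ${\approx}3.9\,\mathrm{km\,s^{-1}}$ for 18.5 kHz at $z \approx 0$): with $M_{\rm HI} \approx 2.36 \times 10^{5}\,(D_{\rm L}/\mathrm{Mpc})^{2} \int S\,dv \;{\rm M}_{\odot}$ (integrated flux in $\mathrm{Jy\,km\,s^{-1}}$), the $5\sigma$ integrated flux limit over 50 channels is $5 \times 1.6\,\mathrm{mJy} \times 3.9\,\mathrm{km\,s^{-1}} \times \sqrt{50} \approx 0.22\,\mathrm{Jy\,km\,s^{-1}}$, giving $M_{\rm HI} \approx 2.36 \times 10^{5} \times 100^{2} \times 0.22 \approx 5.2 \times 10^{8}\,(D_{\rm L}/100\,\mathrm{Mpc})^{2}\,{\rm M}_{\odot}$, consistent with the quoted point-source sensitivity.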
To describe the epidemiology of complex surgical-site infections (SSIs) following colon surgical procedures (COLO), stratified by present-at-time-of-surgery (PATOS) SSIs and non-PATOS SSIs, and their impact on the epidemiology of colon-surgery SSIs.
Design:
Retrospective cohort study.
Methods:
SSI data were prospectively collected from patients undergoing colon surgical procedures (COLOs), as defined by the National Healthcare Safety Network (NHSN), at 34 community hospitals in the southeastern United States from January 2015 to June 2019. Logistic regression models were used to identify characteristics associated with complex COLO SSIs, complex non-PATOS COLO SSIs, and complex PATOS COLO SSIs.
Results:
Over the 4.5-year study period, we identified 720 complex COLO SSIs following 28,188 COLO surgeries (prevalence rate, 2.55 per 100 procedures). Overall, 544 complex COLO SSIs (76%) were complex non-PATOS COLO SSIs (prevalence rate [PR], 1.93 per 100 procedures) and 176 (24%) were complex PATOS COLO SSIs (PR, 0.62 per 100 procedures). Age >75 years and operation duration above the 75th percentile were independently associated with non-PATOS SSIs but not PATOS SSIs. Conversely, emergency surgery and hospital volume for COLO procedures were independently associated with PATOS SSIs but not non-PATOS SSIs. The proportion of polymicrobial SSIs was significantly higher for non-PATOS SSIs than for PATOS SSIs.
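For reference, the quoted prevalence rates follow directly from the counts above; a minimal check (hypothetical script, values taken from the abstract):

```python
colo_surgeries = 28_188
complex_ssi, non_patos_ssi, patos_ssi = 720, 544, 176

def rate_per_100(n_infections, n_procedures=colo_surgeries):
    """Prevalence rate expressed as infections per 100 procedures."""
    return round(n_infections / n_procedures * 100, 2)

print(rate_per_100(complex_ssi), rate_per_100(non_patos_ssi), rate_per_100(patos_ssi))
# -> 2.55 1.93 0.62
```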
Conclusions:
Complex PATOS COLO SSIs have distinct features from complex non-PATOS COLO SSIs. Removal of PATOS COLO SSIs from public reporting allows more accurate comparisons among hospitals that perform different case mixes of colon surgeries.
Sparse recent data are available on the epidemiology of surgical site infections (SSIs) in community hospitals. Our objective was to provide updated epidemiology data on complex SSIs in community hospitals and to characterize trends of SSI prevalence rates over time.
Design:
Retrospective cohort study.
Methods:
SSI data were collected from patients undergoing 26 commonly performed surgical procedures at 32 community hospitals in the southeastern United States from 2013 to 2018. SSI prevalence rates were calculated for each year and were stratified by procedure and causative pathogen.
Results:
Over the 6-year study period, 3,561 complex (deep incisional or organ-space) SSIs occurred following 669,467 total surgeries (prevalence rate, 0.53 infections per 100 procedures). The overall complex SSI prevalence rate did not change significantly during the study period: 0.58 of 100 procedures in 2013 versus 0.53 of 100 procedures in 2018 (prevalence rate ratio [PRR], 0.84; 95% CI, 0.66–1.08; P = .16). Methicillin-sensitive Staphylococcus aureus (MSSA) complex SSIs (n = 480, 13.5%) were more common than complex SSIs caused by methicillin-resistant S. aureus (MRSA; n = 363, 10.2%).
Conclusions:
The complex SSI rate did not decrease in our cohort of community hospitals from 2013 to 2018, which is a change from prior comparisons. The reason for this stagnation is unclear. Additional research is needed to determine the proportion of remaining SSIs that are preventable and which measures would be effective in further reducing SSI rates.
Background: Central-line–associated bloodstream infections (CLABSIs) arise from bacteria migrating from the skin along the catheter, from direct inoculation, or from pathogens that form biofilms on the interior surface of the catheter. However, given the oxygen-poor environments that obligate anaerobes require, these organisms are unlikely to survive long enough on the skin or on the catheter after direct inoculation to be the true cause of a CLABSI. Although some anaerobic CLABSIs may meet the definition for a mucosal barrier injury laboratory-confirmed bloodstream infection (MBI-LCBI), some may not. We sought to determine the proportion of CLABSIs attributed to obligate anaerobic bacteria and the pathophysiologic source of these infections. Methods: We performed a retrospective analysis of prospectively collected CLABSI data at 54 hospitals (academic and community) in the southeastern United States from January 2015 to December 2020. We performed chart reviews on a convenience sample for which medical records were available. We calculated the proportion of CLABSIs due to obligate anaerobes, and we describe a subset of anaerobic CLABSI cases. Results: We identified 60 anaerobic CLABSIs among 2,430 CLABSIs (2.5%). Of the 60 anaerobic CLABSIs, 7 were polymicrobial with nonanaerobic bacteria. The most common organisms identified were Bacteroides, Clostridium, and Lactobacillus (Table 1). The proportion of anaerobic CLABSIs per year varied from 1.2% to 3.7% (Fig. 1). Of the 60 anaerobic CLABSIs, 29 (48%) occurred in the only quaternary-care academic medical center in the database. In contrast, an average of 0.6 (SD, 0.6) anaerobic CLABSIs occurred in the 53 community hospitals over the 6-year study period. Of these 29 anaerobic CLABSIs, 23 (79%) were clinically consistent with secondary bloodstream infections (BSIs) due to a gastrointestinal or genitourinary source but lacked the documentation needed to meet NHSN criteria for secondary BSI or MBI-LCBI based on case reviews by infection prevention physicians. The other 6 anaerobic CLABSIs did not have a clear clinical etiology and did not meet MBI-LCBI criteria. In addition, 27 (93%) of 29 anaerobic CLABSIs occurred in patients who were solid-organ transplant recipients, stem-cell transplant recipients, or receiving chemotherapy. Lastly, 27 (93%) of 29 anaerobic CLABSIs were treated with antibiotics. Conclusions: Anaerobic CLABSIs are uncommon events, but they may disproportionately affect large academic hospitals caring for a high proportion of medically complex patients. Additional criteria could be added to the MBI-LCBI definition to better classify anaerobic BSIs.
Background: Racial and ethnic disparities in healthcare access, medical treatment, and outcomes have been extensively reported. However, the impact of racial and ethnic differences on patient safety, including healthcare-associated infections, has not been well described. Methods: We performed a retrospective review analyzing prospectively collected data on central-line–associated bloodstream infection (CLABSI) and catheter-associated urinary tract infection (CAUTI) rates per 1,000 device days. Data for adult patients admitted to an academic medical center between 2018 and 2021 were stratified by 7 racial and ethnic groups: non-Hispanic White, non-Hispanic Black, Hispanic/Latino, Asian, American Indian/Alaska Native, Native Hawaiian/Pacific Islander, and other. The “other” group comprised bi- or multiracial patients and those for whom no data were reported. We compared CLABSI and CAUTI rates between the racial and ethnic groups using Poisson regression. Results: Compared to non-Hispanic White patients, the rate of CLABSI was significantly higher in non-Hispanic Black patients (1.27; 95% CI, 1.02–1.58; P < .03) and in patients in the “other” race category (1.79; 95% CI, 1.39–2.30; P < .001), with a similar trend toward higher rates in Hispanic/Latino patients (Table 1). Similarly, non-Hispanic Black patients had higher rates of CAUTI (1.42; 95% CI, 1.05–1.92; P < .02), as did Asian patients (2.49; 95% CI, 1.16–5.36; P < .02) and patients in the “other” category (1.52; 95% CI, 1.06–2.18; P < .02) (Table 2). Conclusions: Racial and ethnic minorities may be vulnerable to a higher rate of patient safety events, including CLABSIs and CAUTIs. Additional analyses controlling for potential confounding factors are needed to better understand the relationship between race or ethnicity, clinical management, and healthcare-associated infections. This evaluation is essential to inform mitigation strategies and to provide optimum, equitable care for all.
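A minimal sketch of the kind of Poisson comparison of device-associated infection rates described above, assuming aggregate counts of infections and device days per racial/ethnic group (the group labels, counts, and data layout are illustrative, not the study's data or code):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical aggregated data: CLABSIs and central-line days per group.
df = pd.DataFrame({
    "group": ["non_hispanic_white", "non_hispanic_black", "asian", "other"],
    "clabsi": [120, 60, 10, 30],
    "line_days": [100_000, 40_000, 8_000, 18_000],
})

model = smf.glm(
    "clabsi ~ C(group, Treatment('non_hispanic_white'))",  # White group as referent
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["line_days"]),      # models the rate per central-line day
).fit()
rate_ratios = np.exp(model.params)       # incidence rate ratios vs the referent group
```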