Next generation high-power laser facilities are expected to generate hundreds-of-MeV proton beams and operate at multi-Hz repetition rates, presenting opportunities for medical, industrial and scientific applications requiring bright pulses of energetic ions. Characterizing the spectro-spatial profile of these ions at high repetition rates in the harsh radiation environments created by laser–plasma interactions remains challenging but is paramount for further source development. To address this, we present a compact scintillating fiber imaging spectrometer based on the tomographic reconstruction of proton energy deposition in a layered fiber array. Modeling indicates that spatial resolution of approximately 1 mm and energy resolution of less than 10% at proton energies of more than 20 MeV are readily achievable with existing 100 μm diameter fibers. Measurements with a prototype beam-profile monitor using 500 μm fibers demonstrate active readouts with invulnerability to electromagnetic pulses, and less than 100 Gy sensitivity. The performance of the full instrument concept is explored with Monte Carlo simulations, accurately reconstructing a proton beam with a multiple-component spectro-spatial profile.
The clay minerals formed in a deeply weathered boulder conglomerate of Middle Old Red Sandstone (Devonian) age in north-east Scotland have been studied by a variety of physical and chemical techniques. The granite and granulite boulders in this deposit are completely weathered. With the exception of microcline, all the feldspars in these rocks—orthoclase feldspar, orthoclase-microperthite, albite, and oligoclase—weather to a Cheto-type montmorillonite, poor in iron. Electron and optical microscopy indicate that the weathering transformation is a direct one, without the intervention of any intermediate crystalline or well-defined amorphous phase. Structural control of the primary mineral over the formation of the montmorillonite seems to have been a minimal factor, and the evidence suggests that the clay mineral crystallized from the soluble or colloidal products arising from the decomposed feldspars. Smaller amounts of kaolinite also formed during weathering, but largely from the weathering of muscovite. The environment in which these changes occurred seems to have been alkaline in a relatively closed system. Chemical analyses of related cores and weathered shells of granite and granulite boulders show only a slight decrease of silica and an increase in magnesia. Judging from the extent of alteration to secondary clay minerals, the order of resistance towards weathering of the primary minerals in these rocks is plagioclase = orthoclase < muscovite < biotite < microcline < quartz.
Understanding characteristics of healthcare personnel (HCP) with SARS-CoV-2 infection supports the development and prioritization of interventions to protect this important workforce. We report detailed characteristics of HCP who tested positive for SARS-CoV-2 from April 20, 2020 through December 31, 2021.
Methods:
CDC collaborated with Emerging Infections Program sites in 10 states to interview HCP with SARS-CoV-2 infection (case-HCP) about their demographics, underlying medical conditions, healthcare roles, exposures, personal protective equipment (PPE) use, and COVID-19 vaccination status. We grouped case-HCP by healthcare role. To describe residential social vulnerability, we merged geocoded HCP residential addresses with CDC/ATSDR Social Vulnerability Index (SVI) values at the census tract level. We defined highest and lowest SVI quartiles as high and low social vulnerability, respectively.
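The quartile-based exposure classification described above (highest SVI quartile = high vulnerability, lowest = low) can be sketched in a few lines. This is an illustrative assumption-laden sketch, not the study's actual pipeline: the tract identifiers, SVI values, and the simple index-based quartile cuts below are invented for demonstration.

```python
# Hypothetical sketch: label census tracts by SVI quartile.
# Tract IDs and SVI values are invented; the quartile cuts use a
# simple sorted-index approximation rather than any official method.

def svi_quartile_labels(svi_by_tract):
    """Label tracts at or above the upper-quartile cut 'high', at or
    below the lower-quartile cut 'low', and everything else 'middle'."""
    values = sorted(svi_by_tract.values())
    n = len(values)
    q1 = values[n // 4]          # lower-quartile cut (approximation)
    q3 = values[(3 * n) // 4]    # upper-quartile cut (approximation)
    labels = {}
    for tract, svi in svi_by_tract.items():
        if svi >= q3:
            labels[tract] = "high"
        elif svi <= q1:
            labels[tract] = "low"
        else:
            labels[tract] = "middle"
    return labels

# Example: eight hypothetical tracts with evenly spaced SVI values.
labels = svi_quartile_labels({"t%d" % i: i / 10 for i in range(1, 9)})
```

In the study itself, geocoded residential addresses were merged with tract-level SVI values before this kind of grouping; that linkage step is omitted here.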
Results:
Our analysis included 7,531 case-HCP. Most case-HCP with roles as certified nursing assistant (CNA) (444, 61.3%), medical assistant (252, 65.3%), or home healthcare worker (HHW) (225, 59.5%) reported their race and ethnicity as either non-Hispanic Black or Hispanic. More than one third of HHWs (166, 45.2%), CNAs (283, 41.7%), and medical assistants (138, 37.9%) reported a residential address in the high social vulnerability category. The proportion of case-HCP who reported using recommended PPE at all times when caring for patients with COVID-19 was lowest among HHWs compared with other roles.
Conclusions:
To mitigate SARS-CoV-2 infection risk in healthcare settings, infection prevention and control interventions should be specific to HCP roles and educational backgrounds. Additional interventions are needed to address high social vulnerability among HHWs, CNAs, and medical assistants.
Clinical outcomes of repetitive transcranial magnetic stimulation (rTMS) for treatment-resistant depression (TRD) vary widely, and there is no mood rating scale that is standard for assessing rTMS outcome. It remains unclear whether rTMS is as efficacious in older adults with late-life depression (LLD) as in younger adults with major depressive disorder (MDD). This study examined the effect of age on outcomes of rTMS treatment of adults with TRD. Self-report and observer mood ratings were measured weekly in 687 subjects aged 16–100 years undergoing rTMS treatment, using the Inventory of Depressive Symptomatology 30-item Self-Report (IDS-SR), Patient Health Questionnaire 9-item (PHQ), Profile of Mood States 30-item, and Hamilton Depression Rating Scale 17-item (HDRS). All rating scales detected significant improvement with treatment; response and remission rates varied by scale but not by age (response/remission ≥ 60 years: 38%–57%/25%–33%; < 60 years: 32%–49%/18%–25%). Proportional hazards models showed early improvement predicted later improvement across ages, though early improvements in PHQ and HDRS were more predictive of remission in those < 60 years (relative to those ≥ 60), and greater baseline IDS burden was more predictive of non-remission in those ≥ 60 years (relative to those < 60). These results indicate there is no significant effect of age on treatment outcomes in rTMS for TRD, though rating instruments may differ in assessment of symptom burden between younger and older adults during treatment.
In 2016, the National Center for Advancing Translational Science launched the Trial Innovation Network (TIN) to address barriers to efficient and informative multicenter trials. The TIN provides a national platform, working in partnership with 60+ Clinical and Translational Science Award (CTSA) hubs across the country to support the design and conduct of successful multicenter trials. A dedicated Hub Liaison Team (HLT) was established within each CTSA to facilitate connection between the hubs and the newly launched Trial and Recruitment Innovation Centers. Each HLT serves as an expert intermediary, connecting CTSA Hub investigators with TIN support, and connecting TIN research teams with potential multicenter trial site investigators. The cross-consortium Liaison Team network was developed during the first TIN funding cycle, and it is now a mature national network at the cutting edge of team science in clinical and translational research. The CTSA-based HLT structures and the external network structure have been developed in collaborative and iterative ways, with methods for shared learning and continuous process improvement. In this paper, we review the structure, function, and development of the Liaison Team network, discuss lessons learned during the first TIN funding cycle, and outline a path toward further network maturity.
The quenching of cluster satellite galaxies is inextricably linked to the suppression of their cold interstellar medium (ISM) by environmental mechanisms. While the removal of neutral atomic hydrogen (H i) at large radii is well studied, how the environment impacts the remaining gas in the centres of galaxies, which are dominated by molecular gas, is less clear. Using new observations from the Virgo Environment traced in CO survey (VERTICO) and archival H i data, we study the H i and molecular gas within the optical discs of Virgo cluster galaxies on 1.2-kpc scales with spatially resolved scaling relations between stellar ($\Sigma_{\star}$), H i ($\Sigma_{\text{H}\,{\small\text{I}}}$), and molecular gas ($\Sigma_{\text{mol}}$) surface densities. Adopting H i deficiency as a measure of environmental impact, we find evidence that, in addition to removing the H i at large radii, the cluster processes also lower the average $\Sigma_{\text{H}\,{\small\text{I}}}$ of the remaining gas even in the central $1.2\,$kpc. The impact on molecular gas is comparatively weaker than on the H i, and we show that the lower $\Sigma_{\text{mol}}$ gas is removed first. In the most H i-deficient galaxies, however, we find evidence that environmental processes reduce the typical $\Sigma_{\text{mol}}$ of the remaining gas by nearly a factor of 3. We find no evidence for environment-driven elevation of $\Sigma_{\text{H}\,{\small\text{I}}}$ or $\Sigma_{\text{mol}}$ in H i-deficient galaxies. Using the ratio of $\Sigma_{\text{mol}}$-to-$\Sigma_{\text{H}\,{\small\text{I}}}$ in individual regions, we show that changes in the ISM physical conditions, estimated using the total gas surface density and midplane hydrostatic pressure, cannot explain the observed reduction in molecular gas content. Instead, we suggest that direct stripping of the molecular gas is required to explain our results.
Global healthcare systems have been particularly impacted by the COVID-19 pandemic. Healthcare workers (HCWs) are widely reported to have experienced increased levels of baseline psychological distress relative to the general population, and the COVID-19 pandemic may have had an additive effect. However, previous studies are typically restricted to physicians and nurses with limited data available on hospital HCWs. We aimed to conduct a cross-sectional, psychological evaluation of Irish HCWs during COVID-19.
Methods:
HCWs across five adult acute level-4 Dublin-based hospitals completed an online survey of wellbeing and COVID-19 experience.
Results:
There were 1898 HCWs who commenced the survey, representing 10% of the total employee base. The sample comprised nurses (33%), doctors (21%), Health and Social Care Professionals (HSCPs) (24%) and ‘Other’ disciplines (22%), and 81% identified as female. Clinical levels of depression, anxiety and PTSD symptoms were endorsed by 31%, 34% and 28% of respondents, respectively. Professional grouping effects included: nurses reported significantly greater levels of COVID-19 exposure, infection, COVID-fear, moral injury, and post-traumatic distress, while HSCPs were significantly less likely to report mood dysfunction. In terms of gender, males were significantly less likely to report negative pandemic experiences and low resilience, and significantly more likely to endorse ‘minimal’ depression, anxiety, and traumatic distress. Logistic regression modelling revealed that mental health outcomes (depression, anxiety and PTSD symptoms) were associated with increased frontline exposure, fewer career years’ experience, elevated pre-pandemic stress, and female gender.
Discussion:
To our knowledge, this is the largest evaluation of psychological wellbeing amongst HCWs in acute hospitals in the Dublin region. Our findings have implications for healthcare workforce wellbeing and future service delivery.
Some physiological variables which could aid in assessing the welfare of beef cattle in feedlots were screened in this exploratory study. In two experiments, each of 42 days duration, the physiological responses of Bos taurus steers to three treatments were investigated: pasture (rotation between 1.5 hectare paddocks); a feedlot yard stocked at 12.0 m2 per head with a dry, firm pen surface; and a ‘high-density’ feedlot yard stocked at 6.0 m2 per head with a wet and muddy pen surface. Fourteen steers were used per group per experiment. Relative adrenal mass in both feedlot groups was 8–10% higher than in the pasture group, and this finding was supported by morphological measurements of the adrenal glands. Out of 17 immune variables examined, only serum IgA and the T-cell lymphocyte subpopulation WC1+ showed consistent differences between the feedlot and pasture groups. Interestingly, no differences were observed between the two feedlot treatments. It was concluded that although there may have been some disruption of epithelial/mucosal immunity, more support was required from other immune variables before it could be stated that the immune system was depressed and that pre-pathological states existed in the feedlot groups. However, measures of relative adrenal weight, adrenal index, serum IgA and WC1+ lymphocytes are good candidates for use in future welfare investigations of feedlot cattle.
Objective. The efficacy of individualized, community-based physical activity as an adjunctive smoking cessation treatment to enhance long-term smoking cessation rates was evaluated for the Lifestyle Enhancement Program (LEAP). Methods. The study was a two-arm, parallel-group, randomized controlled trial. All participants (n = 392) received cessation counseling and a nicotine patch and were randomized to physical activity (n = 199; YMCA membership and personalized exercise programming from a health coach) or an equal contact frequency wellness curriculum (n = 193). Physical activity treatment was individualized and flexible (with each participant selecting types of activities and intensity levels and being encouraged to exercise at the YMCA and at home, as well as to use “lifestyle” activity). The primary outcome (biochemically verified prolonged abstinence at 7 weeks (end of treatment) and 6 and 12 months postcessation) and secondary outcomes (7-day point prevalent tobacco abstinence (PPA), total minutes per week of leisure-time physical activity and strength training) were assessed at baseline, 7 weeks, 6 months, and 12 months. Results. Prolonged abstinence in the physical activity and wellness groups was 19.6% and 25.4%, respectively, at 7 weeks, 15.1% and 16.6% at 6 months, and 14.1% and 17.1% at 12 months (all between-group P values > 0.18). Similarly, PPA rates did not differ significantly between groups at any follow-up. Change from baseline leisure-time activity plus strength training increased significantly in the physical activity group at 7 weeks (P = 0.04). Across treatment groups, an increase in the number of minutes per week in strength training from baseline to 7 weeks predicted prolonged abstinence at 12 months (P ≤ 0.001). Further analyses revealed that social support, fewer years smoked, and less temptation to smoke were associated with prolonged abstinence over 12 months in both groups. Conclusions.
Community-based physical activity programming, delivered as adjunctive treatment with behavioral/pharmacological cessation treatment, did not improve long-term quit rates compared to adjunctive wellness counseling plus behavioral/pharmacological cessation treatment. This trial is registered with https://beta.clinicaltrials.gov/study/NCT00403312, registration no. NCT00403312.
Many decisions in everyday life involve a choice between exploring options that are currently unknown and exploiting options that are already known to be rewarding. Previous work has suggested that humans solve such “explore-exploit” dilemmas using a mixture of two strategies: directed exploration, in which information seeking drives exploration by choice, and random exploration, in which behavioral variability drives exploration by chance. One limitation of this previous work was that, like most studies on explore-exploit decision making, it focused exclusively on the domain of gains, where the goal was to maximize reward. In many real-world decisions, however, the goal is to minimize losses and it is well known from Prospect Theory that behavior can be quite different in this domain. In this study, we compared explore-exploit behavior of human subjects under conditions of gain and loss. We found that people use both directed and random exploration regardless of whether they are exploring to maximize gains or minimize losses and that there is quantitative agreement between the exploration parameters across domains. Our results also revealed an overall bias towards the more uncertain option in the domain of losses. While this bias towards uncertainty was qualitatively consistent with the predictions of Prospect Theory, quantitatively we found that the bias was better described by a Bayesian account, in which subjects had a prior that was optimistic for losses and pessimistic for gains. Taken together, our results suggest that explore-exploit decisions are driven by three independent processes: directed and random exploration, and a baseline uncertainty seeking that is driven by a prior.
Since the advent of direct-acting antiviral therapy, the elimination of hepatitis C virus (HCV) as a public health concern is now possible. However, identification of those who remain undiagnosed, and re-engagement of those who are diagnosed but remain untreated, will be essential to achieve this. We examined the extent of HCV infection among individuals undergoing liver function tests (LFT) in primary care. Residual biochemistry samples for 6007 patients, who had venous blood collected in primary care for LFT between July 2016 and January 2017, were tested for HCV antibody. Through data linkage to national and sentinel HCV surveillance databases, we also examined the extent of diagnosed infection, attendance at specialist services and HCV treatment for those found to be HCV positive. Overall HCV antibody prevalence was 4.0% and highest for males (5.0%), those aged 37–50 years (6.2%), and those with an ALT result of 70 or greater (7.1%). Of those testing positive, 68.9% had been diagnosed with HCV in the past, 84.9% of them before the study period. Most (92.5%) of those diagnosed with chronic infection had attended specialist liver services, and while 67.7% had ever been treated, only 38% had successfully cleared infection. More than half of HCV-positive people required assessment, and potentially treatment, for their HCV infection but were not engaged with services during the study period. LFT in primary care are a key opportunity to diagnose, re-diagnose and re-engage patients with HCV infection and highlight the importance of GPs in efforts to eliminate HCV as a public health concern.
To describe the genomic analysis and epidemiologic response related to a slow and prolonged methicillin-resistant Staphylococcus aureus (MRSA) outbreak.
Design:
Prospective observational study.
Setting:
Neonatal intensive care unit (NICU).
Methods:
We conducted an epidemiologic investigation of a NICU MRSA outbreak involving serial baby and staff screening to identify opportunities for decolonization. Whole-genome sequencing was performed on MRSA isolates.
Results:
A NICU with excellent hand hygiene compliance and longstanding minimal healthcare-associated infections experienced an MRSA outbreak involving 15 babies and 6 healthcare personnel (HCP). In total, 12 cases occurred slowly over a 1-year period (mean, 30.7 days apart) followed by 3 additional cases 7 months later. Multiple progressive infection prevention interventions were implemented, including contact precautions and cohorting of MRSA-positive babies, hand hygiene observers, enhanced environmental cleaning, screening of babies and staff, and decolonization of carriers. Only decolonization of HCP found to be persistent carriers of MRSA was successful in stopping transmission and ending the outbreak. Genomic analyses identified bidirectional transmission between babies and HCP during the outbreak.
Conclusions:
In comparison to fast outbreaks, outbreaks that are “slow and sustained” may be more common in units with strong existing infection prevention practices, such that a series of breaches has to align to result in a case. We identified a slow outbreak that persisted among staff and babies and was stopped only by identifying and decolonizing persistent MRSA carriage among staff. A repeated decolonization regimen was successful in allowing previously persistent carriers to safely continue work duties.
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
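The core mechanic of a multistate transition model is estimating day-to-day transition probabilities between clinical states from longitudinal assessments. The sketch below is only a toy illustration of that counting step under invented state names and trajectories; the paper's actual models are fitted with proper statistical machinery, not raw frequency counts.

```python
# Illustrative sketch: estimate a daily transition-probability matrix
# from per-patient state sequences. States and trajectories are
# hypothetical, not from the study.
from collections import defaultdict

def transition_matrix(trajectories):
    """Count day-to-day state transitions across all patients and
    normalise each origin-state row into a probability distribution."""
    counts = defaultdict(lambda: defaultdict(int))
    for traj in trajectories:
        for today, tomorrow in zip(traj, traj[1:]):
            counts[today][tomorrow] += 1
    matrix = {}
    for state, row in counts.items():
        total = sum(row.values())
        matrix[state] = {s: n / total for s, n in row.items()}
    return matrix

# Two hypothetical patients observed daily.
m = transition_matrix([
    ["ward", "icu", "icu", "discharged"],
    ["ward", "ward", "discharged"],
])
```

The advantage over cross-sectional or time-to-event summaries, as the abstract argues, is that every daily assessment contributes information about movement between states, not just the final outcome.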
To examine the association between adherence to plant-based diets and mortality.
Design:
Prospective study. We calculated a plant-based diet index (PDI) by assigning positive scores to plant foods and reverse scores to animal foods. We also created a healthful PDI (hPDI) and an unhealthful PDI (uPDI) by further separating the healthy plant foods from less-healthy plant foods.
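The index construction described above (positive scores for plant foods, reverse scores for animal foods) can be sketched as follows. The food groups, quintile cutoffs, and scoring range below are illustrative assumptions modelled on common PDI implementations, not the study's exact specification.

```python
# Hypothetical sketch of a plant-based diet index: each food group's
# intake is scored 1-5 by quintile, and animal-food scores are
# reverse-coded (6 - score). Food groups here are invented examples.

PLANT_FOODS = {"whole_grains", "fruits", "vegetables", "nuts", "legumes"}
ANIMAL_FOODS = {"meat", "dairy", "fish", "eggs"}

def quintile_score(intake, cutoffs):
    """Return a 1-5 score from intake against four ascending cutoffs."""
    score = 1
    for cut in cutoffs:
        if intake >= cut:
            score += 1
    return score

def plant_based_diet_index(intakes, cutoffs):
    """Sum quintile scores, reverse-scoring animal food groups."""
    total = 0
    for food, intake in intakes.items():
        s = quintile_score(intake, cutoffs[food])
        total += s if food in PLANT_FOODS else 6 - s
    return total
```

The hPDI and uPDI described above follow the same pattern, but additionally reverse-score the less-healthy (for hPDI) or healthy (for uPDI) plant food groups.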
Setting:
The VA Million Veteran Program.
Participants:
315 919 men and women aged 19–104 years who completed an FFQ at baseline.
Results:
We documented 31 136 deaths during the follow-up. A higher PDI was significantly associated with lower total mortality (hazard ratio (HR) comparing extreme deciles = 0·75, 95 % CI: 0·71, 0·79, Ptrend < 0·001). We observed an inverse association between hPDI and total mortality (HR comparing extreme deciles = 0·64, 95 % CI: 0·61, 0·68, Ptrend < 0·001), whereas uPDI was positively associated with total mortality (HR comparing extreme deciles = 1·41, 95 % CI: 1·33, 1·49, Ptrend < 0·001). Similar significant associations of PDI, hPDI and uPDI were also observed for CVD and cancer mortality. The associations between the PDI and total mortality were consistent among African and European American participants, participants free from CVD and cancer, and those who were diagnosed with major chronic disease at baseline.
Conclusions:
A greater adherence to a plant-based diet was associated with substantially lower total mortality in this large population of veterans. These findings support recommending plant-rich dietary patterns for the prevention of major chronic diseases.
We examined whether preadmission history of depression is associated with fewer delirium/coma-free (DCF) days, worse 1-year depression severity and cognitive impairment.
Design and measurements:
A health proxy reported history of depression. Separate models examined the effect of preadmission history of depression on: (a) intensive care unit (ICU) course, measured as DCF days; (b) depression symptom severity at 3 and 12 months, measured by the Beck Depression Inventory-II (BDI-II); and (c) cognitive performance at 3 and 12 months, measured by the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) global score.
Setting and participants:
Patients admitted to the medical/surgical ICU services were eligible.
Results:
Of 821 subjects eligible at enrollment, 261 (33%) had a preadmission history of depression. After adjusting for covariates, preadmission history of depression was not associated with fewer DCF days (OR 0.78, 95% CI 0.59–1.03, p = 0.077). A prior history of depression was associated with higher BDI-II scores at 3 and 12 months (3 months: OR 2.15, 95% CI 1.42–3.24, p < 0.001; 12 months: OR 1.89, 95% CI 1.24–2.87, p = 0.003). We did not observe an association between preadmission history of depression and cognitive performance at either 3 or 12 months (3 months: beta coefficient −0.04, 95% CI −2.70 to 2.62, p = 0.97; 12 months: beta coefficient 1.5, 95% CI −1.26 to 4.26, p = 0.28).
Conclusion:
Patients with a depression history prior to ICU stay exhibit a greater severity of depressive symptoms in the year after hospitalization.
Major depressive disorder (MDD) and chronic pain are highly comorbid, and pain symptoms are associated with a poorer response to antidepressant medication treatment. It is unclear whether comorbid pain also is associated with a poorer response to treatment with repetitive transcranial magnetic stimulation (rTMS).
Methods
162 MDD subjects received 30 sessions of 10 Hz rTMS treatment administered to the left dorsolateral prefrontal cortex (DLPFC) with depression and pain symptoms measured before and after treatment. For a subset of 96 patients, a resting-state electroencephalogram (EEG) was recorded at baseline. Clinical outcome was compared between subjects with and without comorbid pain, and the relationships among outcome, pain severity, individual peak alpha frequency (PAF), and PAF phase-coherence in the EEG were examined.
Results
In total, 64.8% of subjects reported pain, and both depressive and pain symptoms were significantly reduced after rTMS treatment, irrespective of age or gender. Patients with severe pain were 27% less likely to respond to MDD treatment than pain-free individuals. PAF was positively associated with pain severity. PAF phase-coherence in the somatosensory and default mode networks was significantly lower for MDD subjects with pain who failed to respond to MDD treatment.
Conclusions
Pain symptoms improved after rTMS to left DLPFC in MDD irrespective of age or gender, although the presence of chronic pain symptoms reduced the likelihood of treatment response. Individual PAF and baseline phase-coherence in the sensorimotor and midline regions may represent predictors of rTMS treatment outcome in comorbid pain and MDD.