The recommended first-line treatment for insomnia is cognitive behavioral therapy for insomnia (CBTi), but access is limited. Telehealth- or internet-delivered CBTi are alternative ways to increase access. To date, these intervention modalities have never been compared within a single study. Further, few studies have examined a) predictors of response to the different modalities, b) whether successfully treating insomnia can result in improvement of health-related biomarkers, and c) mechanisms of change in CBTi. This protocol was designed to compare the three CBTi modalities to each other and a waitlist control for adults aged 50–65 years (N = 100). Participants are randomly assigned to one of four study arms: in-person- (n = 30), telehealth- (n = 30), or internet-delivered (n = 30) CBTi, or a 12-week waitlist control (n = 10). Outcomes include self-reported insomnia symptom severity, polysomnography, circadian rhythms of activity and core body temperature, blood- and sweat-based biomarkers, cognitive functioning, and magnetic resonance imaging.
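As a concrete illustration of the unequal allocation above, a fixed 30/30/30/10 split can be generated by randomly permuting a prespecified list of arm labels. This is a minimal sketch, not the protocol's actual randomization procedure; the arm names and seed are invented.

```python
import random

# Hypothetical sketch of a 3:3:3:1 allocation: three CBTi arms of n = 30
# each plus a waitlist control of n = 10, shuffled into a random order.
def make_allocation_sequence(seed=None):
    arms = (["in_person"] * 30 + ["telehealth"] * 30
            + ["internet"] * 30 + ["waitlist"] * 10)
    rng = random.Random(seed)
    rng.shuffle(arms)  # random permutation keeps every arm size exact
    return arms

sequence = make_allocation_sequence(seed=42)
print(sequence[:5])  # arms assigned to the first five enrolled participants
```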
Hand, foot and mouth disease (HFMD) is a highly contagious disease with a high incidence in children aged under 10 years. It is mainly self-limiting but can in some cases cause serious neurological or cardiopulmonary complications, which can lead to death. Little is known about the burden of HFMD on primary care services in the UK. The aim of this work was to describe trends in general practitioner (GP) consultations for HFMD in England from January 2017 to December 2022 using a syndromic surveillance network of GPs. Daily GP consultations for HFMD in England were extracted from 1 January 2017 to 31 December 2022. Mean weekly consultation rates per 100,000 population and 95% confidence intervals (CI) were calculated. Consultation rates and rate ratios (RR) were calculated by age group and sex. During the study period, the mean weekly consultation rate for HFMD (per 100,000 registered GP patients) was 1.53 (range 0.27 to 2.47). In England, children aged 1–4 years accounted for the largest affected population, followed by children aged under 1 year. We observed a seasonal pattern of HFMD incidence during the non-COVID years, with a peak in mean weekly rates between September and December. HFMD is typically diagnosed clinically rather than through laboratory sampling; the ability to examine daily HFMD consultation rates therefore provides an excellent epidemiological overview of disease trends. The use of a novel GP-in-hours surveillance system allowed a unique epidemiological insight into recent trends in GP consultations for HFMD. We demonstrate a male predominance of cases, the impact of non-pharmaceutical interventions during the COVID-19 pandemic, and a post-pandemic shift in the week in which the peak number of cases occurs.
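For readers who want the arithmetic behind the reported rates, a weekly consultation rate per 100,000 with an approximate 95% CI can be obtained from a Poisson model of the weekly count, as in this sketch; the count and denominator below are invented, chosen only to land near the reported mean of 1.53.

```python
import math

# Weekly consultation rate per 100,000 registered patients with an
# approximate (Wald, Poisson-count) 95% confidence interval.
def weekly_rate_ci(cases, population, scale=100_000):
    rate = cases / population * scale
    se = math.sqrt(cases) / population * scale  # Poisson SE of the count
    return rate, rate - 1.96 * se, rate + 1.96 * se

# e.g. 920 HFMD consultations in one week among 60 million registered patients
rate, lo, hi = weekly_rate_ci(920, 60_000_000)
print(f"{rate:.2f} per 100,000 (95% CI {lo:.2f} to {hi:.2f})")
```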
In response to the COVID-19 pandemic, we implemented a plasma coordination center within two months to support transfusion for two outpatient randomized controlled trials. The center design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
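The blood-group logic such a system has to encode can be sketched as follows. For plasma (unlike red cells), AB donors are universal and group O recipients can receive any group; the inventory structure and unit identifiers here are invented, not the actual database schema.

```python
# Recipient blood group -> donor plasma groups that are compatible.
PLASMA_COMPATIBLE = {
    "O":  {"O", "A", "B", "AB"},
    "A":  {"A", "AB"},
    "B":  {"B", "AB"},
    "AB": {"AB"},
}

def pick_unit(recipient_group, inventory):
    """Return the first blood group-compatible unit in a site's inventory."""
    for unit in inventory:
        if unit["group"] in PLASMA_COMPATIBLE[recipient_group]:
            return unit
    return None  # no local match: request an overnight shipment

inventory = [{"id": "U001", "group": "B"}, {"id": "U002", "group": "AB"}]
print(pick_unit("A", inventory))  # -> {'id': 'U002', 'group': 'AB'}
```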
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitate qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusion into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain constraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Many preoperative urine cultures are of low value and may even lead to patient harms. This study sought to understand practices around ordering preoperative urine cultures and prescribing antibiotic treatment.
We interviewed participants using a qualitative semi-structured interview guide. Data were coded inductively and with the Dual Process Model (DPM) using MAXQDA software. Data under the “Testing Decision-Making” code were further reviewed using perceived risk as a sensitizing concept.
Results:
We identified themes relating to surgeons’ concerns about de-implementing preoperative urine cultures to detect asymptomatic bacteriuria (ASB) in patients undergoing non-urological procedures: (1) anxiety and uncertainty about missing signs of infection spanned surgical specialties; (2) omitting urine cultures and treatment before specific procedure sites and types carried perceived risks of negative consequences; and (3) participants suggested potential routes for recalibrating these perceived risks to facilitate acceptance of de-implementation. Notably, participants suggested that leadership support and peer engagement could help improve surgeon buy-in.
Conclusions:
Concerns about perceived risks sometimes outweigh the evidence against routine preoperative urine cultures to detect ASB. Evidence from trusted peers may improve openness to de-implementing preoperative urine cultures.
Cohort studies demonstrate that people who later develop schizophrenia, on average, present with mild cognitive deficits in childhood and experience a decline in adolescence and adulthood. Yet tremendous heterogeneity exists over the course of psychotic disorders, including the prodromal period. Individuals identified as being in this period (clinical high risk for psychosis; CHR-P) are at heightened risk of developing psychosis (~35%) and begin to exhibit cognitive deficits. Cognitive impairments in CHR-P individuals (as a singular group) appear to be relatively stable or to ameliorate over time, yet a sizeable proportion have been described as declining on measures related to processing speed or verbal learning. The purpose of this analysis is to use data-driven approaches to identify latent subgroups among CHR-P individuals based on cognitive trajectories. This will yield a clearer understanding of the timing and presentation of both general and domain-specific deficits.
Participants and Methods:
Participants included 684 young people at CHR-P (ages 12–35) from the second cohort of the North American Prodrome Longitudinal Study. Performance on the MATRICS Consensus Cognitive Battery (MCCB) and the Wechsler Abbreviated Scale of Intelligence (WASI-I) was assessed at baseline, 12 months, and 24 months. Tested MCCB domains included verbal learning, speed of processing, working memory, and reasoning and problem-solving. Sex- and age-based norms were utilized. The Oral Reading subtest of the Wide Range Achievement Test (WRAT4) indexed premorbid IQ at baseline. Latent class mixture models were used to identify distinct trajectories of cognitive performance across the two years. One- to five-class solutions were compared to determine the best-fitting solution, based on goodness-of-fit metrics, interpretability of the latent trajectories, and the proportion of subgroup membership (>5%).
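The study fit latent class mixture models; as a simplified, illustrative analog in Python, one can summarize each participant's trajectory by an OLS intercept and slope and compare one- to five-class Gaussian mixtures by BIC. The simulated scores below stand in for the real MCCB data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
times = np.array([0.0, 12.0, 24.0])            # baseline, 12-, 24-month visits
scores = rng.normal(50, 10, size=(684, 3))     # simulated domain T-scores

# Per-person [intercept, slope] from an OLS line through the three visits.
X = np.polynomial.polynomial.polyfit(times, scores.T, 1).T

# Compare one- to five-class solutions by BIC (lower is better).
fits = {k: GaussianMixture(n_components=k, random_state=0).fit(X)
        for k in range(1, 6)}
best_k = min(fits, key=lambda k: fits[k].bic(X))
print("best-fitting class solution by BIC:", best_k)
```

With pure-noise simulated data the one-class solution should win, mirroring the single-trajectory findings reported below for most domains.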
Results:
A one-class solution was found for WASI-I Full-Scale IQ, as people at CHR-P predominantly demonstrated an average IQ that increased gradually over time. For the individual domains, one-class solutions also best fit the trajectories for speed of processing, verbal learning, and working memory. Two distinct subgroups were identified on one of the executive functioning domains, reasoning and problem-solving (NAB Mazes): a subgroup with unimpaired performance and mild improvement over time (Class I, 74%) and a subgroup with persistent performance two standard deviations below average (Class II, 26%). Between these classes, no significant differences were found in biological sex, age, years of education, or likelihood of conversion to psychosis (OR = 1.68, 95% CI 0.86 to 3.14). Individuals assigned to Class II did, however, demonstrate a lower WASI-I IQ at baseline (96.3 vs. 106.3) and a lower premorbid IQ (100.8 vs. 106.2).
Conclusions:
Youth at CHR-P demonstrate relatively homogeneous trajectories over time in general cognition and most individual domains. In contrast, two distinct subgroups were observed on a measure of higher-order cognitive skills involving planning and foresight, and notably these subgroups exist independent of conversion outcome. Overall, these findings replicate and extend results from a recently published latent class analysis that examined 12-month trajectories among CHR-P individuals using a different cognitive battery (Allott et al., 2022). The findings indicate which individuals at CHR-P may be most likely to benefit from cognitive remediation and can shed light on the substrates of deficits by establishing meaningful subtypes.
In eastern North America, Indigenous peoples domesticated several crops that are now extinct. We present experimental data that alter our understanding of the domestication of one of these crops, goosefoot (Chenopodium berlandieri). Ancient domesticated goosefoot has been recognized on the basis of seed morphology, especially a decrease in the thickness of the seed coat (testa). Nondomesticated goosefoot also sometimes produces seeds that look similar or even identical to domesticated ones, but researchers have believed such seeds to be rare (1%–3%). We conducted a common garden experiment and a series of carbonization experiments to better understand the determinants of seed polymorphism in archaeobotanical assemblages. We found that goosefoot produces much higher percentages of thin-testa seeds (mean 50% in our experiment, 15%–34% in free-living parent populations) than previously reported. We also found that cultivated plants produce more thin-testa seeds than their free-living parents, demonstrating that this trait is plastic in response to a garden environment. The carbonization experiments suggest that, contrary to our expectations, thin-testa seeds preserve under a wider range of conditions than thick-testa seeds. These results suggest that (1) carbonized, phenotypically mixed assemblages should be interpreted cautiously, and (2) developmental plasticity and genetic assimilation played a role in the domestication of goosefoot.
Clinical implementation of risk calculator models in the clinical high-risk for psychosis (CHR-P) population has been hindered by heterogeneous risk distributions across study cohorts, which could be attributed to pre-ascertainment illness progression. To examine this, we tested whether the duration of attenuated psychotic symptom (APS) worsening prior to baseline moderated the performance of the North American Prodrome Longitudinal Study 2 (NAPLS2) risk calculator. We also examined whether rates of cortical thinning, another marker of illness progression, bolstered clinical prediction models.
Methods
Participants from both the NAPLS2 and NAPLS3 samples were classified as having either ‘long’ or ‘short’ symptom duration based on the time since APS increase prior to baseline. The NAPLS2 risk calculator model was applied to each of these groups. In a subset of NAPLS3 participants who completed follow-up magnetic resonance imaging scans, change in cortical thickness was combined with the individual risk score to predict conversion to psychosis.
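Structurally, the final prediction step amounts to a two-feature classifier scored by AUC. The sketch below is a generic stand-in with simulated data, not the actual NAPLS2 calculator.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 200
risk_score = rng.uniform(0, 1, n)   # individual risk calculator output
thinning = rng.normal(0, 1, n)      # annualized cortical thickness change
# Simulated conversion outcomes that depend on both features.
converted = rng.binomial(1, 1 / (1 + np.exp(-(3 * risk_score - thinning - 2))))

X = np.column_stack([risk_score, thinning])
model = LogisticRegression().fit(X, converted)
auc = roc_auc_score(converted, model.predict_proba(X)[:, 1])
print(f"in-sample AUC = {auc:.2f}")  # real studies would cross-validate
```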
Results
The risk calculator models achieved similar performance across the combined NAPLS2/NAPLS3 sample [area under the curve (AUC) = 0.69], the long duration group (AUC = 0.71), and the short duration group (AUC = 0.71). The shorter duration group was younger and had higher baseline APS than the longer duration group. The addition of cortical thinning improved the prediction of conversion significantly for the short duration group (AUC = 0.84), with a moderate improvement in prediction for the longer duration group (AUC = 0.78).
Conclusions
These results suggest that early illness progression differs among CHR-P patients, is detectable with both clinical and neuroimaging measures, and could play an essential role in the prediction of clinical outcomes.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints that weeds and invasive plants impose on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including the sustainability of agriculture and natural resources, economic resilience and reliability, and societal health and well-being.
Abnormal tau, a hallmark Alzheimer’s disease (AD) pathology, may appear in the locus coeruleus (LC) decades before AD symptom onset. Reports of subjective cognitive decline are also often present prior to formal diagnosis. Yet the relationship between LC structural integrity and subjective cognitive decline has remained unexplored. Here, we aimed to examine these potential associations.
Methods:
We examined 381 community-dwelling men (mean age = 67.58 years; SD = 2.62) from the Vietnam Era Twin Study of Aging who underwent LC-sensitive magnetic resonance imaging and who, along with their selected informants, completed the Everyday Cognition scale to measure subjective cognitive decline. Mixed models examined the associations between rostral-middle and caudal LC integrity and subjective cognitive decline after adjusting for depressive symptoms, physical morbidities, and family clustering. Models were additionally adjusted for current objective cognitive performance and objective cognitive decline to explore attenuation.
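A minimal sketch of that model structure, assuming statsmodels and entirely invented variable names and simulated data (one ECog domain, a random intercept for family to absorb twin clustering):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 380  # ~190 twin pairs
df = pd.DataFrame({
    "family_id": np.repeat(np.arange(n // 2), 2),   # twins share an ID
    "lccnr": rng.normal(0.10, 0.02, n),             # rostral-middle LCCNR
    "depressive_sx": rng.poisson(3, n),
    "morbidities": rng.poisson(2, n),
})
df["ecog_memory"] = (2.0 - 5.0 * df["lccnr"]
                     + 0.02 * df["depressive_sx"] + rng.normal(0, 0.3, n))

# Linear mixed model with a family-level random intercept.
fit = smf.mixedlm("ecog_memory ~ lccnr + depressive_sx + morbidities",
                  data=df, groups=df["family_id"]).fit()
print(fit.summary())
```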
Results:
For participant ratings, lower rostral-middle LC contrast to noise ratio (LCCNR) was associated with significantly greater subjective decline in memory, executive function, and visuospatial abilities. For informant ratings, lower rostral-middle LCCNR was associated with significantly greater subjective decline in memory only. Associations remained after adjusting for current objective cognition and objective cognitive decline in respective domains.
Conclusions:
Lower rostral-middle LC integrity is associated with greater subjective cognitive decline. Although not explained by objective cognitive performance, this relationship may help explain the increased AD risk in people with subjective cognitive decline, as the LC is a neural substrate important for higher-order cognitive processing, attention, and arousal, and one of the first sites of AD pathology.
We describe the association between job roles and coronavirus disease 2019 (COVID-19) among healthcare personnel. A wide range of hazard ratios were observed across job roles. Medical assistants had higher hazard ratios than nurses, while attending physicians, food service workers, laboratory technicians, pharmacists, residents and fellows, and temporary workers had lower hazard ratios.
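Hazard ratios of this kind typically come from a Cox proportional-hazards model. A toy setup, assuming the lifelines package and invented follow-up data with nurses as the reference role:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy person-level data: follow-up time, infection indicator, and a
# binary job-role covariate (1 = medical assistant, 0 = nurse).
df = pd.DataFrame({
    "days_to_event": [30, 120, 200, 45, 310, 90, 150, 60],
    "infected":      [1,   1,   0,   1,   0,   1,   0,  0],
    "is_medical_assistant": [1, 0, 1, 1, 0, 0, 1, 0],
})

cph = CoxPHFitter().fit(df, duration_col="days_to_event", event_col="infected")
print(cph.summary[["exp(coef)"]])  # exp(coef) is the hazard ratio vs. nurses
```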
We describe coronavirus disease 2019 (COVID-19) cases among nonphysician healthcare personnel (HCP) by work location. The proportion of HCP with COVID-19 was highest in the emergency department and lowest among those working remotely. COVID-19 and non-COVID-19 units had similar proportions of HCP with COVID-19 (13%). Cases decreased across all work locations following COVID-19 vaccination.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
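The core of such a multistate model is a matrix of daily transition probabilities estimated from each patient's sequence of states. A self-contained sketch with invented states and sequences:

```python
import numpy as np
import pandas as pd

STATES = ["ward", "icu", "discharged", "dead"]
sequences = [  # one list of daily states per (invented) patient
    ["ward", "ward", "icu", "icu", "ward", "discharged"],
    ["ward", "icu", "icu", "dead"],
    ["ward", "ward", "ward", "discharged"],
]

# Count day-to-day transitions, then row-normalize to probabilities.
counts = pd.DataFrame(0, index=STATES, columns=STATES)
for seq in sequences:
    for today, tomorrow in zip(seq, seq[1:]):
        counts.loc[today, tomorrow] += 1

probs = counts.div(counts.sum(axis=1).replace(0, np.nan), axis=0)
print(probs.round(2))  # rows for absorbing states stay undefined (NaN)
```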
Disruptive behavior disorders (DBD) are heterogeneous at both the clinical and the biological level. The aims of this study were therefore to dissect the heterogeneous neurodevelopmental deviations of the affective brain circuitry and to integrate these differences across modalities.
Methods
We combined two novel approaches. First, normative modeling was used to map individual-level deviations from the typical age-related pattern in (i) brain activity during emotion matching and (ii) anatomical images, derived from DBD cases (n = 77) and controls (n = 52) aged 8–18 years from the EU-funded Aggressotype and MATRICS consortia. Second, linked independent component analysis was used to integrate subject-specific deviations from both modalities.
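The normative-modeling step can be pictured as fitting the typical age trend in controls and expressing each case as a z-scored deviation from the age-predicted norm. A bare-bones sketch with simulated values (the real models are more elaborate):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
age_ctrl = rng.uniform(8, 18, 52).reshape(-1, 1)        # 52 controls
measure_ctrl = 2.0 + 0.05 * age_ctrl.ravel() + rng.normal(0, 0.2, 52)

# Fit the typical age-related pattern in controls.
norm = LinearRegression().fit(age_ctrl, measure_ctrl)
resid_sd = np.std(measure_ctrl - norm.predict(age_ctrl))

# Deviation of each case from what is expected for their age.
age_case = np.array([[12.0], [16.5]])
measure_case = np.array([2.95, 2.60])
z_dev = (measure_case - norm.predict(age_case)) / resid_sd
print(z_dev.round(2))  # positive = higher than expected for age
```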
Results
While cases, compared with controls, exhibited on average higher face-processing activity than would be expected for their age in regions such as the amygdala, these positive deviations were widespread at the individual level. A multimodal integration of all functional and anatomical deviations explained 23% of the variance in the clinical DBD phenotype. Most notably, the top marker, encompassing the default mode network (DMN) and subcortical regions such as the amygdala and the striatum, was related to aggression across the whole sample.
Conclusions
Overall, the increased age-related deviations in the amygdala in DBD suggest a maturational delay, which has to be further validated in future studies. Furthermore, the integration of individual deviation patterns from multiple imaging modalities allowed us to dissect some of the heterogeneity of DBD and identified the DMN, the striatum, and the amygdala as neural signatures associated with aggression.
Nearly one in five children with congenital heart disease (CHD) is born with white matter injury that can be recognised on postnatal MRI by the presence of T1 hyperintense lesions. This pattern of white matter injury is known to portend poor neurodevelopmental outcomes, but the exact aetiology and histologic characterisation of these lesions have never been described. A fetal sheep was cannulated at gestational age 110 days onto a pumpless extracorporeal oxygenator via the umbilical vessels and supported in a fluid environment for 14.5 days. The fetus was supported under hypoxic conditions (mean oxygen delivery 16 ml/kg/day) to simulate the in utero conditions of CHD. At necropsy, the brain was fixed, imaged with MRI, and then stained to histologically identify areas of injury. Under hypoxemic in utero conditions, the fetus developed a T1 hyperintense lesion in its right frontal lobe. Histologically, this lesion was characterised by microvascular proliferation and astrocytosis without gliosis. These findings may provide valuable insight into the aetiology of white matter injury in neonates with CHD.
We analyzed blood-culture practices to characterize the utilization of the Infectious Diseases Society of America (IDSA) recommendations related to catheter-related bloodstream infection (CRBSI) blood cultures. Most patients with a central line had only peripheral blood cultures. Increasing the utilization of CRBSI guidelines may improve clinical care, but may also affect other quality metrics.
Alzheimer’s disease (AD) is highly heritable, and AD polygenic risk scores (AD-PRSs) have been derived from genome-wide association studies. However, the nature of genetic influences very early in the disease process is still not well understood. Here we tested the hypothesis that AD-PRSs would be associated with changes in episodic memory and executive function across late midlife in men who were cognitively unimpaired at their baseline midlife assessment.
Method:
We examined 1,168 men in the Vietnam Era Twin Study of Aging (VETSA) who were cognitively normal (CN) at the first of up to three assessments across 12 years (mean ages 56, 62, and 68). Latent growth models of episodic memory and executive function were based on 6–7 tests/subtests. AD-PRSs were based on Kunkle et al. (Nature Genetics, 51, 414–430, 2019), using a p < 5 × 10⁻⁸ threshold.
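For readers unfamiliar with PRSs, the score itself is just a weighted sum of risk-allele dosages, with weights taken from GWAS effect sizes for variants passing the significance threshold. A schematic with made-up dosages and weights:

```python
import numpy as np

# Rows = individuals, columns = genome-wide significant SNPs (0/1/2 copies
# of the risk allele); weights are per-allele log odds ratios from GWAS.
dosages = np.array([
    [0, 1, 2, 1],
    [1, 1, 0, 2],
])
log_or = np.array([0.15, -0.08, 0.22, 0.05])

prs = dosages @ log_or                   # weighted sum per individual
prs_z = (prs - prs.mean()) / prs.std()   # standardize before correlating
print(prs_z)                             # e.g. with cognitive-decline slopes
```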
Results:
AD-PRSs were correlated with linear slopes of change for both cognitive abilities. Men with higher AD-PRSs had steeper declines in both memory (r = −.19, 95% CI [−.35, −.03]) and executive functioning (r = −.27, 95% CI [−.49, −.05]). Associations appeared driven by a combination of APOE and non-APOE genetic influences.
Conclusions:
Memory is the ability most characteristically impaired in AD, but executive functions are among the first cognitive abilities to decline in midlife in normal aging. This study is among the first to demonstrate that this early decline also relates to AD genetic influences, even in men who were CN at baseline.
Underrepresentation of Black biomedical researchers demonstrates continued racial inequity and lack of diversity in the field. The Black Voices in Research curriculum was designed to provide effective instructional materials that showcase inclusive excellence, facilitate the dialog about diversity and inclusion in biomedical research, enhance critical thinking and reflection, integrate diverse visions and worldviews, and ignite action. Instructional materials consist of short videos and discussion prompts featuring Black biomedical research faculty and professionals. Pilot evaluation of instructional content showed that individual stories promoted information relevance, increased knowledge, and created behavioral intention to promote diversity and inclusive excellence in biomedical research.