Peripheral inflammatory markers, including serum interleukin 6 (IL-6), are associated with depression, but less is known about how these markers associate with depression at different stages of the life course.
Methods
We examined the associations between serum IL-6 levels at baseline and subsequent depression symptom trajectories in two longitudinal cohorts: ALSPAC (age 10–28 years; N = 4,835) and UK Biobank (39–86 years; N = 39,613), using multilevel growth curve modeling. Models were adjusted for sex, BMI, and socioeconomic factors. Depressive symptoms were measured using the Short Mood and Feelings Questionnaire in ALSPAC (max time points = 11) and the Patient Health Questionnaire-2 in UK Biobank (max time points = 8).
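The multilevel growth curve approach described above can be sketched with statsmodels on simulated data; this is a minimal illustration only, and all variable names, effect sizes, and noise levels are hypothetical rather than values from ALSPAC or UK Biobank.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Illustrative longitudinal data: repeated depression scores per person with a
# person-level baseline IL-6 measure (all names and effect sizes hypothetical).
n_people, n_visits = 200, 5
ids = np.repeat(np.arange(n_people), n_visits)
age = np.tile(np.arange(n_visits, dtype=float), n_people)
il6 = np.repeat(rng.normal(0.0, 1.0, n_people), n_visits)
u0 = np.repeat(rng.normal(0.0, 1.0, n_people), n_visits)   # random intercepts
u1 = np.repeat(rng.normal(0.0, 0.3, n_people), n_visits)   # random slopes
dep = (5 + (0.3 + u1) * age + 0.5 * il6 + 0.1 * il6 * age
       + u0 + rng.normal(0.0, 1.0, len(ids)))

df = pd.DataFrame({"id": ids, "age": age, "il6": il6, "dep": dep})

# Growth curve model with random intercepts and slopes: baseline IL-6 can shift
# both the level (main effect) and the slope (age:il6 interaction) of the trajectory.
model = smf.mixedlm("dep ~ age * il6", df, groups=df["id"], re_formula="~age")
result = model.fit()
print(result.params[["age", "il6", "age:il6"]])
```

The `age:il6` interaction term is what carries the question of interest here: whether higher baseline IL-6 is associated with a worse symptom trajectory over time, not just a higher level.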
Results
Higher baseline IL-6 was associated with worse depression symptom trajectories in both cohorts (largest effect size: 0.046 [ALSPAC, age 16 years]). These associations were stronger in the younger ALSPAC cohort, where higher IL-6 levels at age 9 years were additionally associated with worse depression symptom trajectories in females compared with males. Weaker sex differences were observed in the older cohort, UK Biobank. However, statistically significant associations (pFDR < 0.05) were of smaller effect sizes, typical of large cohort studies.
Conclusions
These findings suggest that systemic inflammation may influence the severity and course of depressive symptoms across the life course, an influence apparent regardless of age and despite differences in measures and numbers of time points between these large, population-based cohorts.
This paper discusses the application of a class of Rasch models to situations where test items are grouped into subsets and the common attributes of items within these subsets bring into question the usual assumption of conditional independence. The models are all expressed as particular cases of the random coefficients multinomial logit model developed by Adams and Wilson. This formulation allows a very flexible approach to the specification of alternative models, and makes model testing particularly straightforward. The use of the models is illustrated using item bundles constructed in the framework of the SOLO taxonomy of Biggs and Collis.
Diagnostic criteria for major depressive disorder allow for heterogeneous symptom profiles but genetic analysis of major depressive symptoms has the potential to identify clinical and etiological subtypes. There are several challenges to integrating symptom data from genetically informative cohorts, such as sample size differences between clinical and community cohorts and various patterns of missing data.
Methods
We conducted genome-wide association studies of major depressive symptoms in three cohorts that were enriched for participants with a diagnosis of depression (Psychiatric Genomics Consortium, Australian Genetics of Depression Study, Generation Scotland) and three community cohorts that were not recruited on the basis of diagnosis (Avon Longitudinal Study of Parents and Children, Estonian Biobank, and UK Biobank). We fit a series of confirmatory factor models with factors that accounted for how symptom data were sampled and then compared alternative models with different symptom factors.
Results
The best fitting model had a distinct factor for Appetite/Weight symptoms and an additional measurement factor that accounted for the skip-structure in community cohorts (use of Depression and Anhedonia as gating symptoms).
Conclusion
The results show the importance of assessing the directionality of symptoms (such as hypersomnia versus insomnia) and of accounting for study and measurement design when meta-analyzing genetic association data.
The brain can be represented as a network, with nodes as brain regions and edges as region-to-region connections. Nodes with the most connections (hubs) are central to efficient brain function. Current findings on structural differences in Major Depressive Disorder (MDD) identified using network approaches remain inconsistent, potentially due to small sample sizes. It is still uncertain at what level of the connectome hierarchy differences may exist, and whether they are concentrated in hubs, disrupting fundamental brain connectivity.
Methods
We utilized two large cohorts, UK Biobank (UKB, N = 5104) and Generation Scotland (GS, N = 725), to investigate MDD case–control differences in brain network properties. Network analysis was done across four hierarchical levels: (1) global, (2) tier (nodes grouped into four tiers based on degree) and rich club (between-hub connections), (3) nodal, and (4) connection.
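The first three levels of the hierarchical network analysis described (global efficiency, degree-based tiers, hub identification) can be illustrated with networkx on a toy graph; the small-world network below is arbitrary and merely stands in for a structural connectome.

```python
import networkx as nx

# Toy "connectome": 20 nodes (regions), small-world wiring (arbitrary choice).
G = nx.watts_strogatz_graph(n=20, k=4, p=0.2, seed=1)

# (1) Global level: network efficiency (mean inverse shortest-path length).
eff = nx.global_efficiency(G)

# (2) Tier level: rank nodes by degree and split into four tiers of five nodes.
by_degree = sorted(G.nodes, key=G.degree, reverse=True)
tiers = [by_degree[i * 5:(i + 1) * 5] for i in range(4)]

# (3) Nodal level: hubs are the highest-degree tier.
hubs = tiers[0]

print(f"global efficiency = {eff:.3f}")
print(f"hub nodes = {hubs}")
```

Case-control comparison would then contrast these metrics between groups, with effect sizes and FDR-corrected p-values as reported in the Results.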
Results
In UKB, reductions in network efficiency were observed in MDD cases globally (d = −0.076, pFDR = 0.033), across all tiers (d = −0.069 to −0.079, pFDR = 0.020), and in hubs (d = −0.080 to −0.113, pFDR = 0.013–0.035). No differences in rich club organization and region-to-region connections were identified. The effect sizes and direction for these associations were generally consistent in GS, albeit not significant in our lower-N replication sample.
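The pFDR values above reflect false-discovery-rate correction across multiple contrasts; the Benjamini-Hochberg procedure typically used in this setting can be sketched as follows, with made-up p-values.

```python
from statsmodels.stats.multitest import multipletests

# Hypothetical raw p-values from several network-metric contrasts.
raw_p = [0.004, 0.011, 0.020, 0.041, 0.300]

# Benjamini-Hochberg FDR correction at alpha = 0.05.
reject, p_fdr, _, _ = multipletests(raw_p, alpha=0.05, method="fdr_bh")
for p, q, r in zip(raw_p, p_fdr, reject):
    print(f"p = {p:.3f} -> pFDR = {q:.3f}  significant: {r}")
```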
Conclusion
Our results suggest that the brain's fundamental rich club structure is similar in MDD cases and controls, but subtle topological differences exist across the brain. Consistent with recent large-scale neuroimaging work, our findings offer a connectomic perspective on a similar scale and support the idea that minimal differences exist between MDD cases and controls.
Helium or neopentane can be used as surrogate gas fills for deuterium (D2) or deuterium-tritium (DT) in laser-plasma interaction studies. Surrogates are convenient for avoiding flammability hazards or the integration of cryogenics in an experiment. To test the degree of equivalency between deuterium and helium, experiments were conducted in the Pecos target chamber at Sandia National Laboratories. Observables such as laser propagation and signatures of laser-plasma instabilities (LPI) were recorded for multiple laser and target configurations. It was found that some observables can differ significantly despite the apparent similarity of the gases with respect to molecular charge and weight. While the qualitative behaviour of the interaction may well be studied by finding a suitable compromise of laser absorption, electron density, and LPI cross sections, a quantitative investigation of expected values for deuterium fills at high laser intensities is not likely to succeed with surrogate gases.
The AD8 is a validated screening instrument for functional changes that may be caused by cognitive decline and dementia. It is frequently used in clinics and research studies because it is short and easy to administer, with a cut-off score of 2 out of 8 items recommended to maximize sensitivity and specificity. This cut-off assumes that all 8 items provide equivalent “information” about everyday functioning. In this study, we used item response theory (IRT) to test this assumption. To determine the relevance of this measure of everyday functioning in men and women, and across race, ethnicity, and education, we conducted differential item functioning (DIF) analysis to test for item bias.
Participants and Methods:
Data came from the 2021 follow-up of the High School & Beyond cohort (N=8,690; mean age 57.5 ± 1.2; 55% women), a nationally representative, longitudinal study of Americans who were first surveyed in 1980 when they were in the 10th or 12th grade. Participants were asked AD8 questions about their own functioning via phone or internet survey. First, we estimated a one-parameter IRT model (i.e., differing difficulty, equal discrimination across items) and a two-parameter IRT model (i.e., differing difficulty and differing discrimination across items). We compared model fit using a likelihood-ratio test. Second, we tested for uniform and non-uniform DIF on AD8 items by sex, race and ethnicity (non-Hispanic White, non-Hispanic Black, Hispanic), education level (high school or less, some college, BA degree or more), and survey mode (phone or internet). We examined DIF salience by comparing the difference between original and DIF-adjusted AD8 scores to the standard error of measurement of the original score.
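The likelihood-ratio comparison of the nested one- and two-parameter models reduces to a chi-square test on twice the log-likelihood difference; a minimal sketch with hypothetical log-likelihood values (with 8 AD8 items, the 2PL adds 7 free discrimination parameters over the 1PL):

```python
from scipy.stats import chi2

# Hypothetical log-likelihoods for the fitted 1PL and 2PL models (illustrative
# numbers, not values from the study).
loglik_1pl = -5200.0
loglik_2pl = -5170.0
extra_params = 7  # one extra discrimination parameter per item beyond a shared one

lr_stat = 2 * (loglik_2pl - loglik_1pl)
p_value = chi2.sf(lr_stat, df=extra_params)

print(f"LR = {lr_stat:.1f}, df = {extra_params}, p = {p_value:.2e}")
```

A small p-value here favours the 2PL, i.e., the conclusion that items differ in how much information they carry.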
Results:
The two-parameter IRT model fit the data significantly better than the one-parameter model, indicating that some items were more strongly related to underlying everyday functional ability than others. For example, the “problems with judgment” item had higher discrimination (more information) than the “less interest in hobbies/activities” item. There were significant differences in item endorsement by race/ethnicity, education, and survey mode. We found significant uniform and non-uniform DIF on several items across each of these groups. For example, for a given level of functional decline (theta), White participants were more likely to endorse “Daily problems with thinking/memory” than Black and Hispanic participants. The DIF was salient (i.e., caused AD8 scores to change by greater than the standard error of measurement for a large portion of respondents) for those with a college degree and phone respondents.
Conclusions:
In a population-representative sample of Americans aged ~57 years, the items on the AD8 contributed differing levels of discrimination along the range of everyday functioning that is impacted by later-life cognitive impairment. This suggests that a simple cut-off or summed score may not be appropriate, since some items yield more information about the underlying construct than others. Furthermore, because we observed significant and salient DIF on several items by education and survey mode, AD8 scores should not be compared across education groups and assessment modes without adjustment for this measurement bias.
In October 2010, the provincial government of Ontario, Canada enacted the Open for Business Act (OBA). A central component of the OBA is its provisions aiming to streamline the enforcement of Ontario’s Employment Standards Act (ESA). The OBA’s changes to the ESA are an attempt to manage a crisis of employment standards (ES) enforcement, arising from decades of ineffective regulation, by entrenching an individualised enforcement model. The Act aims to streamline enforcement by screening people assumed to be lacking definitive proof of violations out of the complaints process. The OBA therefore produces a new category of ‘illegitimate claimants’ and attributes administrative backlogs to these people. Instead of improving the protection of workers, the OBA embeds new racialised and gendered modes of exclusion in the ES enforcement process.
Major depressive disorder (MDD) was previously associated with negative affective biases. Evidence from larger population-based studies, however, is lacking, including whether biases normalise with remission. We investigated associations between affective bias measures and depressive symptom severity across a large community-based sample, followed by examining differences between remitted individuals and controls.
Methods
Participants from Generation Scotland (N = 1109) completed: (i) the Bristol Emotion Recognition Task (BERT), (ii) the Face Affective Go/No-go (FAGN), and (iii) the Cambridge Gambling Task (CGT). Individuals were classified as MDD-current (n = 43), MDD-remitted (n = 282), or controls (n = 784). Analyses included using affective bias summary measures (primary analyses), followed by detailed emotion/condition analyses of BERT and FAGN (secondary analyses).
Results
For summary measures, the only significant finding was an association between greater symptoms and lower risk adjustment for CGT across the sample (individuals with greater symptoms were less likely to bet more, despite increasingly favourable conditions). This was no longer significant when controlling for non-affective cognition. No differences were found for remitted-MDD v. controls. Detailed analysis of BERT and FAGN indicated subtle negative biases across multiple measures of affective cognition with increasing symptom severity that were independent of non-affective cognition [e.g. greater tendency to rate faces as angry (BERT), and lower accuracy for happy/neutral conditions (FAGN)]. Results for remitted-MDD were inconsistent.
Conclusions
This suggests the presence of subtle negative affective biases at the level of emotion/condition in association with depressive symptoms across the sample, over and above those accounted for by non-affective cognition, with no evidence for affective biases in remitted individuals.
Major depressive disorder (MDD) is a polygenic disorder associated with brain alterations but until recently, there have been no brain-based metrics to quantify individual-level variation in brain morphology. Here, we evaluated and compared the performance of a new brain-based ‘Regional Vulnerability Index’ (RVI) with polygenic risk scores (PRS), in the context of MDD. We assessed associations with syndromal MDD in an adult sample (N = 702, age = 59 ± 10) and with subclinical depressive symptoms in a longitudinal adolescent sample (baseline N = 3,825, age = 10 ± 1; 2-year follow-up N = 2,081, age = 12 ± 1).
Methods
MDD-RVIs quantify the correlation of the individual’s corresponding brain metric with the expected pattern for MDD derived in an independent sample. Using the same methodology across samples, subject-specific MDD-PRS and six MDD-RVIs based on different brain modalities (subcortical volume, cortical thickness, cortical surface area, mean diffusivity, fractional anisotropy, and multimodal) were computed.
Results
In adults, MDD-RVIs (based on white matter and multimodal measures) were more strongly associated with MDD (β = 0.099–0.281, PFDR = 0.001–0.043) than MDD-PRS (β = 0.056–0.152, PFDR = 0.140). In adolescents, depressive symptoms were associated with MDD-PRS at baseline and follow-up (β = 0.084–0.086, p = 1.38 × 10⁻⁴ to 4.77 × 10⁻⁴) but not with any MDD-RVIs (β < 0.05, p > 0.05).
Conclusions
Our results potentially indicate that brain-based risk scores capture a broader range of risk exposures than genetic risk scores in adults, and may also help us understand the temporal origins of depression-related brain features. Longitudinal data, specific to the developmental period and on white matter measures, will be useful in informing risk for subsequent psychiatric illness.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, greater individual weed plant biomass was associated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
Diagnosis of sinus venosus defects, which are not infrequently associated with complex anomalous pulmonary venous drainage, may be delayed and may require multimodality imaging.
Methods:
Retrospective review of all patients from February 2008 to January 2019.
Results:
Thirty-seven children were diagnosed at a median age of 4.2 years (range 0.5–15.5 years). In 32 of 37 (86%) patients, diagnosis was achieved on transthoracic echocardiography, but five patients (14%) had complex variants (four had high insertion of the anomalous vein into the superior caval vein and three had multiple anomalous veins draining to different sites, two of whom had drainage of one vein into the high superior caval vein). In these five patients, the final diagnosis was achieved by multimodality imaging and intra-operative findings. The median age at surgery was 5.2 years (range 1.6–15.8 years). Thirty-one patients underwent double-patch repair, four patients a Warden repair, and two patients a single-patch repair. Of the four Warden repairs, two patients had a high insertion of the right-sided anomalous pulmonary vein into the superior caval vein, one patient had bilateral superior caval veins, and one patient had right lower pulmonary vein insertion into the right atrium/superior caval vein junction. There was no post-operative mortality, reoperation, residual shunt or pulmonary venous obstruction. One patient developed superior caval vein obstruction and one patient developed atrial flutter.
Conclusion:
Complementary cardiac imaging modalities improve diagnosis of complex sinus venosus defects associated with a wide variation in the pattern of anomalous pulmonary venous connection. Nonetheless, surgical treatment is associated with excellent outcomes.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Ethnohistoric accounts indicate that the people of Australia's Channel Country engaged in activities rarely recorded elsewhere on the continent, including food storage, aquaculture and possible cultivation, yet there has been little archaeological fieldwork to verify these accounts. Here, the authors report on a collaborative research project initiated by the Mithaka people addressing this lack of archaeological investigation. The results show that Mithaka Country has a substantial and diverse archaeological record, including numerous large stone quarries, multiple ritual structures and substantial dwellings. Our archaeological research revealed unknown aspects, such as the scale of Mithaka quarrying, which could stimulate re-evaluation of Aboriginal socio-economic systems in parts of ancient Australia.
Colleges and universities around the world engaged diverse strategies during the COVID-19 pandemic. Baylor University, a community of ~22,700 individuals, was one of the institutions that resumed and sustained operations. The key strategy was the establishment of multidisciplinary teams to develop mitigation strategies and priority areas for action. This population-based team approach, along with implementation of a “Swiss Cheese” risk mitigation model, allowed small clusters to be rapidly addressed through testing, surveillance, tracing, isolation, and quarantine. These efforts were supported by health protocols including face coverings, social distancing, and compliance monitoring. As a result, activities were sustained from August 1 to December 8, 2020. There were 62,970 COVID-19 tests conducted, with 1,435 people testing positive, for a positivity rate of 2.28%. A total of 1,670 COVID-19 cases were identified, including 235 self-reports. The mean number of tests per week was 3,500, with approximately 80 of these positive (11/d). More than 60 student tracers were trained, with over 120 personnel available to contact trace, at a ratio of 1 per 400 university members. The successes and lessons learned provide a framework and pathway for similar institutions to mitigate the ongoing impacts of COVID-19 and sustain operations during a global pandemic.
We summarize some of the past year's most important findings within climate change-related research. New research has improved our understanding of Earth's sensitivity to carbon dioxide, finds that permafrost thaw could release more carbon emissions than expected and that the uptake of carbon in tropical ecosystems is weakening. Adverse impacts on human society include increasing water shortages and impacts on mental health. Options for solutions emerge from rethinking economic models, rights-based litigation, strengthened governance systems and a new social contract. The disruption caused by COVID-19 could be seized as an opportunity for positive change, directing economic stimulus towards sustainable investments.
Technical summary
A synthesis is made of ten fields within climate science where there have been significant advances since mid-2019, through an expert elicitation process with broad disciplinary scope. Findings include: (1) a better understanding of equilibrium climate sensitivity; (2) abrupt thaw as an accelerator of carbon release from permafrost; (3) changes to global and regional land carbon sinks; (4) impacts of climate change on water crises, including equity perspectives; (5) adverse effects on mental health from climate change; (6) immediate effects on climate of the COVID-19 pandemic and requirements for recovery packages to deliver on the Paris Agreement; (7) suggested long-term changes to governance and a social contract to address climate change, learning from the current pandemic; (8) updated positive cost–benefit ratio and new perspectives on the potential for green growth in the short- and long-term perspective; (9) urban electrification as a strategy to move towards low-carbon energy systems; and (10) rights-based litigation as an increasingly important method to address climate change, with recent clarifications on the legal standing and representation of future generations.
Social media summary
Stronger permafrost thaw, COVID-19 effects and growing mental health impacts among highlights of latest climate science.
Factors that facilitate transfer of training in paediatric echocardiography remain poorly understood. This study assessed whether high-variation training facilitated successful transfer in paediatric echocardiography.
Methods:
A mixed-methods study of transfer of technical and interpretive skill application amongst postgraduate trainees. Trainees were randomised to a low- or high-variation training group. After a period of 8 weeks of intensive echocardiography training, we video-recorded how trainees completed an echocardiogram in a complex cardiac lesion not previously encountered. Blinded quantitative analysis and scoring of trainee performance (echocardiogram performance, report, and technical proficiency) were performed by a blinded cardiologist and a senior cardiac physiologist using a validated assessment tool. Qualitative interviews of the trainees were recorded to ascertain trainee experiences during the training and transfer process.
Results:
Sixteen trainees were enrolled in the study. For the cumulative score across all three components tested (echocardiogram performance, report, and technical proficiency), the high-variation group outperformed the low-variation group (χ2 = 8.223, p = .016). Two common themes that assisted transfer emerged from the interviews: (1) use of strategies described in variation theory to describe abnormal hearts, and (2) the use of formative live feedback from trainers during hands-on training.
Conclusion:
Training strategies exposing trainees to high-variation training may aid transfer of paediatric echocardiography skills.
The COVID-19 pandemic and mitigation measures are likely to have a marked effect on mental health. It is important to use longitudinal data to improve inferences.
Aims
To quantify the prevalence of depression, anxiety and mental well-being before and during the COVID-19 pandemic. Also, to identify groups at risk of depression and/or anxiety during the pandemic.
Method
Data were from the Avon Longitudinal Study of Parents and Children (ALSPAC) index generation (n = 2850, mean age 28 years) and parent generation (n = 3720, mean age 59 years), and Generation Scotland (n = 4233, mean age 59 years). Depression was measured with the Short Mood and Feelings Questionnaire in ALSPAC and the Patient Health Questionnaire-9 in Generation Scotland. Anxiety and mental well-being were measured with the Generalised Anxiety Disorder Assessment-7 and the Short Warwick Edinburgh Mental Wellbeing Scale.
Results
Depression during the pandemic was similar to pre-pandemic levels in the ALSPAC index generation, but the proportion experiencing anxiety had almost doubled, at 24% (95% CI 23–26%) compared with a pre-pandemic level of 13% (95% CI 12–14%). In both studies, anxiety and depression during the pandemic were greater in younger members, women, those with pre-existing mental/physical health conditions, and individuals in socioeconomic adversity, even when controlling for pre-pandemic anxiety and depression.
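Prevalence estimates with 95% confidence intervals of the kind quoted above can be obtained from raw counts with a standard binomial interval; the counts in this sketch are illustrative, not the study's data.

```python
from statsmodels.stats.proportion import proportion_confint

# Hypothetical counts: 680 of 2,850 participants screening positive for anxiety.
n_pos, n_total = 680, 2850
prev = n_pos / n_total

# Wilson score interval for a binomial proportion at the 95% level.
lo, hi = proportion_confint(n_pos, n_total, alpha=0.05, method="wilson")
print(f"prevalence = {prev:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```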
Conclusions
These results provide evidence for increased anxiety in young people that is coincident with the pandemic. Specific groups are at elevated risk of depression and anxiety during the COVID-19 pandemic. This is important for planning current mental health provisions and for long-term impact beyond this pandemic.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.