Despite advances in antiretroviral treatment (ART), human immunodeficiency virus (HIV) can detrimentally affect everyday functioning. Neurocognitive impairment (NCI) and current depression are common in people with HIV (PWH) and can contribute to poor functional outcomes, but potential synergies between the two conditions are less understood. Thus, the present study aimed to compare the independent and combined effects of NCI and depression on everyday functioning in PWH. We predicted worse functional outcomes with comorbid NCI and depression than either condition alone.
Methods:
PWH enrolled at the UCSD HIV Neurobehavioral Research Program were assessed for neuropsychological performance, depression severity (≤minimal, mild, moderate, or severe; Beck Depression Inventory-II), and self-reported everyday functioning.
Results:
Participants were 1,973 PWH (79% male; 66% racial/ethnic minority; age: M = 48.6 years; education: M = 13.0 years; 66% with AIDS; 82% on ART; 42% with NCI; 35% with BDI-II > 13). ANCOVA models showed effects of both NCI and depression symptom severity on all functional outcomes (ps < .0001). When NCI and depression severity were included in the same model, both remained significant (ps < .0001), although the effect of each was attenuated, and the combined model yielded better fit (i.e., lower AIC values) than models with only NCI or only depression.
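The model comparison described above can be illustrated with a short sketch. This is a hypothetical reconstruction, not the study's code: the file, column names (func_score, nci, bdi_severity, age, education), and covariate set are assumptions.

```python
# Hypothetical sketch of the ANCOVA-style model comparison via AIC.
# Column names and covariates are assumptions, not the study's actual variables.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("pwh_cohort.csv")  # one row per participant (placeholder file)

m_nci  = smf.ols("func_score ~ C(nci) + age + education", data=df).fit()
m_dep  = smf.ols("func_score ~ C(bdi_severity) + age + education", data=df).fit()
m_both = smf.ols("func_score ~ C(nci) + C(bdi_severity) + age + education", data=df).fit()

# Lower AIC indicates better fit; the abstract reports the combined model wins.
for name, m in [("NCI only", m_nci), ("Depression only", m_dep), ("Combined", m_both)]:
    print(f"{name}: AIC = {m.aic:.1f}")
```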
Conclusions:
Consistent with prior literature, NCI and depression had independent effects on everyday functioning in PWH. There was also evidence for combined effects of NCI and depression, such that their comorbidity had a greater impact on functioning than either alone. Our results have implications for informing future interventions to target common, comorbid NCI and depressed mood in PWH and thus reduce HIV-related health disparities.
Objectives/Goals: Depression is common among people living with HIV (PLWH). This study explored the link between reduced metacognitive awareness and depression in PLWH. It utilized a positive emotion regulation task to compare brain activation during viewing versus upregulating positive emotions. Methods/Study Population: Depressed PLWH (N = 24; mean age = 53; HAM-D mean = 19) completed an emotion regulation task while blood oxygen-level-dependent (BOLD) responses were recorded. In the task, participants were shown a series of positive, negative, and neutral images from the International Affective Picture System (IAPS). Participants were asked to view these images and were instructed to reappraise either negative images (RN) or positive images (RP). In the RP condition, participants were no longer shown the image and were asked to upregulate the positive emotional responses associated with it. Ten onset times were included for each trial. Results/Anticipated Results: A one-sample t-test was conducted to analyze contrasts between reappraisal of positive images and viewing positive images (RP > VP). Results showed significantly greater activation in the posterior cingulate and angular gyrus during the RP condition (peak MNI: 18, -52, 34; p < 0.001, uncorrected, k > 10 voxels). Comparing reappraisal of negative images to viewing negative images (RN > VN), there was increased activation in the right supramarginal gyrus (peak MNI: 50, -28, 22; p < 0.001, uncorrected, k > 10 voxels). When contrasting reappraisal of positive to negative images (RP > RN), BOLD signals were higher in the left dorsolateral prefrontal cortex (peak MNI: 40, -38, 32; p < 0.001, uncorrected, k > 10 voxels). Discussion/Significance of Impact: Findings underscore that depressed PLWH demonstrate BOLD responses in brain regions linked to appetitive motivation and metacognitive awareness during the RP condition, which demands more executive resources among those with depression, highlighting the complexity of emotion regulation in this population.
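The group-level one-sample t-test over first-level contrast maps reported above can be sketched with nilearn. This is a minimal, hypothetical reconstruction: the file names are placeholders and the smoothing kernel is an assumption, not the study's pipeline.

```python
# Hedged sketch of the group-level RP > VP one-sample t-test (nilearn).
# File names and smoothing are placeholders, not the study's actual pipeline.
import pandas as pd
from nilearn.glm.second_level import SecondLevelModel
from nilearn.glm import threshold_stats_img

# One first-level RP > VP contrast image per participant (N = 24).
contrast_maps = [f"sub-{i:02d}_rp_gt_vp_con.nii.gz" for i in range(1, 25)]

# An intercept-only design matrix turns the group GLM into a one-sample t-test.
design = pd.DataFrame({"intercept": [1] * len(contrast_maps)})
model = SecondLevelModel(smoothing_fwhm=8.0).fit(contrast_maps, design_matrix=design)
z_map = model.compute_contrast("intercept", output_type="z_score")

# p < 0.001 uncorrected with a k > 10 voxel cluster extent, as in the abstract.
thresholded, threshold = threshold_stats_img(
    z_map, alpha=0.001, height_control="fpr", cluster_threshold=10
)
```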
Wild oat is a significant weed of cropping systems in the Canadian Prairies. Wild oat resistance to herbicides has increased interest in the use of nonchemical management strategies. Harvest weed seed control techniques such as impact mills or chaff collection have been of interest in Prairie crops, with wild oat identified as a key target. To evaluate the effects of crop rotation maturity, harvest management, and harvest weed seed control on wild oat, a study was conducted from 2016 to 2018 at four locations in the Canadian Prairies. Two-year crop rotations with either early-, normal-, or late-maturing crops were implemented before barley was seeded across all rotations in the final year. A second factor, harvest management (swathing or straight cutting), was also included in the study. Chaff collection was used to quantify wild oat seeds that were targetable by harvest weed seed control techniques. The hypothesis was that earlier-maturing crops would result in increased wild oat seed capture at harvest and, therefore, lower wild oat populations. Wild oat density and biomass were lowest in the early-maturing rotations. In addition, wild oat biomass was lower in swathed crops than in straight-cut crops. Wild oat seedbank levels followed a similar trend: densities were lowest in the early-maturing rotation, intermediate in the normal-maturity rotation, and highest in the late-maturing rotation. Wild oat densities increased in all crop rotations; however, only harvest weed seed control and crop rotation were implemented as control measures. Wild oat numbers in the chaff did not reflect the earliness of harvest. Crop yields suggest that competitive winter wheat stands contributed to the success of the early-maturing rotations compared with other treatments. Early-maturing rotations reduced wild oat populations, likely through a combination of crop competitiveness, rotational diversity, and the harvest weed seed control benefits of earlier-maturing crops.
Interprofessional teams in the pediatric cardiac ICU consolidate their management plans in pre-family meeting huddles, a process that affects the course of family meetings but often lacks optimal communication and teamwork.
Methods:
Cardiac ICU clinicians participated in an interprofessional intervention to improve how they prepared for and conducted family meetings. We conducted a pretest–posttest study with clinicians participating in huddles before family meetings. We assessed the feasibility of clinician enrollment, clinician-perceived acceptability of the intervention (via questionnaire and semi-structured interviews), and impact on team performance (using a validated tool). A Wilcoxon rank sum test assessed intervention impact on team performance at the meeting level, comparing pre- and post-intervention data.
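As a minimal sketch of the meeting-level comparison, the Wilcoxon rank sum test can be run as follows; the scores below are illustrative placeholders, not study data.

```python
# Minimal sketch of the pre/post team-performance comparison (Wilcoxon rank sum).
# Scores are made-up placeholders; the study scored huddles with a validated tool.
import numpy as np
from scipy.stats import mannwhitneyu  # equivalent to the Wilcoxon rank-sum test

pre  = np.array([3, 2, 3, 4, 3, 2, 3, 3])  # pre-intervention huddle scores (n = 28 in study)
post = np.array([5, 4, 5, 5, 4, 5, 5, 5])  # post-intervention huddle scores (n = 30 in study)

stat, p = mannwhitneyu(pre, post, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")
```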
Results:
In total, 24 clinicians enrolled in the intervention (92% retention), with 100% completing training. All participants would recommend Cardiac ICU Teams and Loved ones Communicating to others, and 96% believe it improved their participation in family meetings. We exceeded an acceptable level of protocol fidelity (>75%). Team performance was significantly higher (p < 0.001) in post-intervention huddles (n = 30) than in pre-intervention huddles (n = 28) in all domains. Median comparisons: Team Structure [2 vs. 5], Leadership [3 vs. 5], Situation Monitoring [3 vs. 5], Mutual Support [3 vs. 5], and Communication [3 vs. 5].
Conclusion:
Implementing an interprofessional team intervention to improve team performance in pre-family meeting huddles is feasible, acceptable, and improves team function. Future research should further assess impact on clinicians, patients, and families.
Differences in social behaviours are common in young people with neurodevelopmental conditions (NDCs). Recent research challenges the long-standing hypothesis that difficulties in social cognition explain social behaviour differences.
Aims
We examined how difficulties regulating one's behaviour, emotions and thoughts to adapt to environmental demands (i.e. dysregulation), alongside social cognition, explain social behaviours across neurodiverse young people.
Method
We analysed cross-sectional behavioural and cognitive data of 646 6- to 18-year-old typically developing young people and those with NDCs from the Province of Ontario Neurodevelopmental Network. Social behaviours and dysregulation were measured by the caregiver-reported Adaptive Behavior Assessment System Social domain and Child Behavior Checklist Dysregulation Profile, respectively. Social cognition was assessed by the Neuropsychological Assessment Affect-Recognition and Theory-of-Mind, Reading the Mind in the Eyes Test, and Sandbox continuous false-belief task scores. We split the sample into training (n = 324) and test (n = 322) sets. We investigated how social cognition and dysregulation explained social behaviours through principal component regression and hierarchical regression in the training set. We tested social cognition-by-dysregulation interactions, and whether dysregulation mediated the social cognition–social behaviours association. We assessed model fits in the test set.
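The principal component regression step on the training set might look like the following sketch; the file, variable names, and two-component choice are assumptions for illustration, not the study's code.

```python
# Hedged sketch of principal component regression for social behaviours.
# File and column names are hypothetical stand-ins for the POND measures.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

df = pd.read_csv("pond_measures.csv")  # placeholder extract, one row per child
cognition = df[["affect_recognition", "theory_of_mind", "rmet", "sandbox"]]
X_train, X_test, y_train, y_test = train_test_split(
    cognition, df["abas_social"], test_size=0.5, random_state=0
)

# Reduce the correlated social-cognition scores to components, then regress.
pca = PCA(n_components=2).fit(X_train)
reg = LinearRegression().fit(pca.transform(X_train), y_train)
print("Held-out R^2:", reg.score(pca.transform(X_test), y_test))
```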
Results
Two social cognition components adequately explained social behaviours (13.88% of variance). Lower dysregulation further explained better social behaviours (β = −0.163, 95% CI −0.191 to −0.134). The social cognition-by-dysregulation interaction was non-significant (β = −0.001, 95% CI −0.023 to 0.021). Dysregulation partially mediated the social cognition–social behaviours association (total effect: 0.544, 95% CI 0.370–0.695). Findings were replicated in the test set.
Conclusions
Self-regulation, beyond social cognition, substantially explains social behaviours across neurodiverse young people.
Studies show stimulant medications are effective for different ADHD presentations (predominantly inattentive [IA], predominantly hyperactive-impulsive [HI], or combined [C]); however, few studies have evaluated nonstimulant efficacy across ADHD presentations. Viloxazine ER [VLX ER] is a nonstimulant, FDA-approved medication for pediatric (≥6 yrs) and adult ADHD. This post-hoc analysis of 4 double-blind, Phase 3 clinical trials (2 in adolescents [NCT03247517 and NCT03247556], 2 in children [NCT03247530 and NCT03247543]) evaluated VLX ER efficacy by ADHD presentation, as derived from ADHD Rating Scale, 5th Edition (ADHD-RS-5) assessments at Baseline.
Methods
Children and adolescents with ADHD and an ADHD-RS-5 Total score ≥ 28 were eligible for enrollment. ADHD presentation was defined as a rating of ≥2 on at least 6 of 9 ADHD-RS-5 inattention items, or hyperactive-impulsive items or both. For each ADHD presentation, the change from Baseline (CFB) in ADHD-RS-5 Total score (primary outcome in each study) was assessed using mixed models for repeated measures (MMRM). Responder rate (secondary outcome), ≥50% reduction from baseline in ADHD-RS-5 Total score, was analyzed using generalized estimating equations (GEE).
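As an illustration of the two analyses named above, a simplified stand-in might look like this. The column names are assumptions, and a true MMRM with unstructured covariance is typically fit in SAS or R (e.g., the mmrm package), so MixedLM here is only a rough analogue.

```python
# Illustrative stand-ins for the MMRM and GEE analyses described above.
# Column names are assumptions; MixedLM only approximates a true MMRM.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("adhd_trials_pooled.csv")  # hypothetical pooled trial extract

# Change from baseline (CFB) in ADHD-RS-5 Total score over repeated visits.
mmrm_like = smf.mixedlm("cfb_total ~ week * treatment", df,
                        groups=df["subject_id"]).fit()

# Responder (>=50% reduction) analysis via GEE with a binomial family.
gee = smf.gee("responder ~ week * treatment", groups="subject_id", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(gee.summary())
```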
Results
Of 1354 subjects [placebo N = 452, VLX ER N = 902], ADHD presentation was assigned as 288 (21.3%) [IA], 1010 (74.5%) [C], 40 (3.0%) [HI], and 16 (1.2%) [none of these]. Due to the small sample size of the [HI] group, only the [IA] and [C] results are presented. At Week 6 (pooled data endpoint), ADHD-RS-5 Total scores were significantly improved for VLX ER relative to placebo for both the [IA] and [C] ADHD presentations. LS mean (SE) treatment differences and p-values were: [IA] -3.1 (1.35), p = 0.0219, and [C] -5.8 (0.97), p < 0.0001. Responder rates were also significantly higher for VLX ER: 43.0% [IA] and 42.7% [C] relative to placebo 29.5% [IA] and 25.5% [C] (p = 0.0311 and p < 0.0001, respectively).
Conclusions
Viloxazine ER significantly reduced ADHD symptoms in individuals meeting criteria for ADHD [IA] or [C] presentations at Baseline. Limitations include the post-hoc methodology, the smaller sample sizes of the [IA] and [HI] groups, and the ADHD-RS-5 ≥ 28 eligibility requirement, which may favor enrollment of individuals with ADHD [C] over ADHD [IA] or [HI] presentations. Consistency of response during long-term use should be evaluated.
Zuranolone is an investigational positive allosteric modulator of synaptic and extrasynaptic GABAA receptors and a neuroactive steroid in clinical development as a once-daily, oral, 14-day treatment course for adults with major depressive disorder or postpartum depression (PPD). The randomized, double-blind, placebo-controlled SKYLARK Study (NCT04442503) demonstrated that zuranolone 50 mg significantly improved depressive symptoms (as assessed by 17-item Hamilton Rating Scale for Depression total score) at Day 15 (primary endpoint; p<0.001) and was generally well tolerated in adults with PPD.
Methods
In the SKYLARK Study, patients were randomized 1:1 to receive zuranolone 50 mg or placebo for 14 days. Safety and tolerability were assessed by the incidence and severity of treatment-emergent adverse events (TEAEs), rates of dose reduction and treatment discontinuation, as well as weight gain and sexual dysfunction.
Results
The SKYLARK Study assessed safety data from 98 patients treated with zuranolone 50 mg and 98 patients treated with placebo. TEAEs were reported in 66.3% of zuranolone-treated patients and 53.1% of placebo-treated patients. Among patients who experienced TEAEs, most reported mild (zuranolone, 50.8%; placebo, 75.0%) or moderate (zuranolone, 44.6%; placebo, 23.1%) events. The most common (≥5%) TEAEs were somnolence (26.5%), dizziness (13.3%), sedation (11.2%), headache (9.2%), diarrhea (6.1%), nausea (5.1%), urinary tract infection (5.1%), and COVID-19 (5.1%) with zuranolone, and headache (13.3%), dizziness (10.2%), nausea (6.1%), and somnolence (5.1%) with placebo. The rate of dose reduction due to TEAEs was 16.3% in patients receiving zuranolone vs 1.0% in patients receiving placebo; the most common TEAEs (>1 patient) leading to zuranolone dose reduction were somnolence (7.1%), dizziness (6.1%), and sedation (3.1%). The rate of treatment discontinuation due to TEAEs was 4.1% in patients receiving zuranolone vs 2.0% in patients receiving placebo; the only TEAE leading to zuranolone discontinuation in >1 patient was somnolence (2.0%). Serious TEAEs were reported in 2.0% of zuranolone-treated and 0% of placebo-treated patients; these included upper abdominal pain (1.0% [1/98]), peripheral edema (1.0% [1/98]), perinatal depression (1.0% [1/98]), and hypertension (1.0% [1/98]). Per investigators, the serious TEAEs were not related to zuranolone. No signals for weight gain or sexual dysfunction were identified.
Conclusions
In adults with PPD, zuranolone 50 mg was generally well tolerated. Most TEAEs were mild or moderate in severity. Dose reduction due to TEAEs mainly resulted from somnolence, dizziness, and sedation, while treatment discontinuation due to TEAEs was low. No signals for weight gain or sexual dysfunction were identified.
We present a re-discovery of G278.94+1.35a as possibly one of the largest known Galactic supernova remnants (SNRs), which we name Diprotodon. While previously established as a Galactic SNR, Diprotodon is visible in our new Evolutionary Map of the Universe (EMU) and GaLactic and Extragalactic All-sky MWA (GLEAM) radio continuum images at an angular size of $3{.\!^\circ}33\times3{.\!^\circ}23$, much larger than previously measured. At the previously suggested distance of 2.7 kpc, this implies a diameter of 157$\times$152 pc. This size would qualify Diprotodon as the largest known SNR and pushes our estimates of SNR sizes to the upper limits. We investigate the environment in which the SNR is located and examine various scenarios that might explain such a large and relatively bright SNR appearance. We find that Diprotodon is most likely at a much closer distance of $\sim$1 kpc, implying a diameter of 58$\times$56 pc and that it is in the radiative evolutionary phase. We also present a new Fermi-LAT data analysis that confirms the angular extent of the SNR in gamma rays. The origin of the high-energy emission remains somewhat puzzling, and the scenarios we explore reveal new puzzles, given this unexpected and unique observation of a seemingly evolved SNR having a hard GeV spectrum with no breaks. We explore both leptonic and hadronic scenarios, as well as the possibility that the high-energy emission arises from the leftover particle population of a historic pulsar wind nebula.
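The quoted diameters follow from the small-angle relation $D \approx d\,\theta$ (with $\theta$ in radians); a quick sketch verifying the arithmetic:

```python
# Quick check of the angular-size-to-diameter conversion quoted above,
# using the small-angle approximation D = d * theta (theta in radians).
import math

def diameter_pc(distance_kpc: float, angle_deg: float) -> float:
    return distance_kpc * 1000 * math.radians(angle_deg)

print(diameter_pc(2.7, 3.33), diameter_pc(2.7, 3.23))  # ~157 x 152 pc at 2.7 kpc
print(diameter_pc(1.0, 3.33), diameter_pc(1.0, 3.23))  # ~58 x 56 pc at ~1 kpc
```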
Coronavirus disease 2019 (COVID-19) precipitated the rapid deployment of novel therapeutics, which led to operational and logistical challenges for healthcare organizations. Four health systems participated in a qualitative study to abstract lessons learned, challenges, and promising practices from implementing neutralizing monoclonal antibody (nMAb) treatment programs. Lessons are summarized under three themes that serve as critical building blocks for health systems to rapidly deploy novel therapeutics during a pandemic: (1) clinical workflows, (2) data infrastructure and platforms, and (3) governance and policy. Health systems must be sufficiently agile to quickly scale programs and resources in times of uncertainty. Real-time monitoring of programs, policies, and processes can help support better planning and improve program effectiveness. The lessons and promising practices shared in this study can be applied by health systems for distribution of novel therapeutics beyond nMAbs and toward future pandemics and public health emergencies.
To evaluate the design of I-Corps@NCATS as a translational scientist training program, we mapped specific elements of the program’s content and pedagogy to the characteristics of a translational scientist, as first defined by Gilliland et al.: systems thinker, process innovator, boundary spanner, team player, and skilled communicator. Using a mixed-methods evaluation, we examined how the I-Corps@NCATS training program, delivered across twenty-two Clinical and Translational Science Award Hubs, impacted the development of these key translational scientist characteristics.
Methods:
We developed survey items to assess the characteristics of systems thinker, process innovator, boundary spanner, team player, and skilled communicator. Data were collected from a national sample of 281 participants in the I-Corps@NCATS program. Using post-then-retrospective-pre survey items, participants self-reported their ability to perform skills associated with each of the translational scientist characteristics. Additionally, two open-ended survey questions explored how the program shifted participants’ translational orientation, generating 211 comments. These comments were coded through a team-based, iterative process.
Results:
Respondents reported the greatest increases in self-assessed abilities related to systems thinking and skilled communication. Participants indicated the highest levels of ability related to team player and boundary spanner. From the coding of open-ended comments, we identified two additional characteristics of translational scientists: intellectual humility and cognitive flexibility.
Conclusions:
Participation in I-Corps@NCATS accelerates translational science in two ways: 1) by teaching the process of scientific translation from research ideas to real-world solutions, and 2) by encouraging growth in the mindset and characteristics of a translational scientist.
Eating and drinking difficulties are highly prevalent in the intellectual disability population and include all aspects of the eating and drinking process, from stable positioning and pacing of the meal through to safe swallowing. Dysphagia, a subset of these wider eating and drinking difficulties, is often seen in the intellectual disability population and presents as difficulty with chewing and swallowing. It is often the underlying cause of malnutrition, dehydration, weight loss, choking, and aspiration pneumonia, with risks to mental health, social isolation, dignity, and enjoyment. A deterioration in eating and drinking skills is often a symptom of a broader physical or mental health diagnosis. People with eating and drinking difficulties can also experience a cyclical decline in health and an increased risk of malnutrition and dehydration. In addition to eating and drinking difficulties, this chapter covers surgical intervention requiring insertion of a gastric tube, the impact of medication on feeding, and strategies to manage eating and drinking difficulties.
The Society of Thoracic Surgeons Congenital Heart Surgery Database is the largest congenital heart surgery database worldwide but does not provide information beyond the primary episode of care. Linkage to hospital electronic health records would capture complications and comorbidities, along with long-term outcomes, for patients with CHD surgeries. The current study explores linkage success between the Society of Thoracic Surgeons Congenital Heart Surgery Database and electronic health record data in North Carolina and Georgia.
Methods:
The Society of Thoracic Surgeons Congenital Heart Surgery Database was linked to hospital electronic health records from four North Carolina congenital heart surgery centers using indirect identifiers such as date of birth, sex, and admission and discharge dates, from 2008 to 2013. Indirect linkage was performed at the admissions level and compared to two other linkages using a “direct identifier,” medical record number: (1) linkage between the Society of Thoracic Surgeons Congenital Heart Surgery Database and electronic health records from a subset of patients from one North Carolina institution and (2) linkage between Society of Thoracic Surgeons data from two Georgia facilities and Georgia’s CHD repository, which also uses direct identifiers for linkage.
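A deterministic admission-level linkage on those indirect identifiers might be sketched as below; the file and column names are hypothetical, and real linkages typically add cleaning, fuzzy matching, and duplicate handling.

```python
# Hedged sketch of admission-level linkage on indirect identifiers.
# File and column names are assumptions; real pipelines add fuzzy matching.
import pandas as pd

sts = pd.read_csv("sts_chsd_admissions.csv")      # hypothetical STS extract
ehr = pd.read_csv("hospital_ehr_admissions.csv")  # hypothetical EHR extract

keys = ["date_of_birth", "sex", "admission_date", "discharge_date"]
linked = sts.merge(ehr, on=keys, how="inner")

print(f"Linked {len(linked)}/{len(sts)} admissions "
      f"({100 * len(linked) / len(sts):.1f}%)")
```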
Results:
Indirect identifiers successfully linked 79% (3,692/4,685) of Society of Thoracic Surgeons Congenital Heart Surgery Database admissions across four North Carolina hospitals. Direct linkage techniques successfully matched the Society of Thoracic Surgeons Congenital Heart Surgery Database to 90.2% of electronic health records from the North Carolina subsample. Linkage between Society of Thoracic Surgeons data and Georgia’s CHD repository was 99.5% (7,544/7,585).
Conclusions:
Linkage methodology was successfully demonstrated between surgical data and hospital-based electronic health records in North Carolina and Georgia, uniting granular procedural details with clinical, developmental, and economic data. Indirect identifiers linked most patients, consistent with similar linkages in adult populations. Future directions include applying these linkage techniques with other data sources and exploring long-term outcomes in linked populations.
We analyzed invasive group A streptococcal puerperal sepsis cases in a large health zone in Alberta, Canada between 2013 and 2022. Of the 21 cases, 85.7% were adjudicated as hospital/delivery-acquired, with 2 clusters having identical isolates found through whole genome sequencing. We implemented policy interventions across Alberta aimed at preventing future infections.
Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine if women and men differ in their vulnerability to risk factors for PTSD.
Methods
As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri-, and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men.
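The z-score comparisons of association strength between sexes can be illustrated with a Fisher r-to-z sketch; the correlations and group sizes below are made-up placeholders, not AURORA estimates.

```python
# Sketch of comparing a risk factor's correlation with 3-month PTSD severity
# between women and men via Fisher r-to-z; all numbers are illustrative.
import math

def fisher_z_diff(r_women: float, n_women: int, r_men: float, n_men: int) -> float:
    z_w, z_m = math.atanh(r_women), math.atanh(r_men)
    se = math.sqrt(1 / (n_women - 3) + 1 / (n_men - 3))
    return (z_w - z_m) / se

# Negative z => the association is stronger in men (made-up example values).
print(fisher_z_diff(0.25, 1800, 0.40, 1100))
```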
Results
Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that, for five of the 16 examined risk factors, the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects.
Conclusions
Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. Analyses did not identify any risk factors to which women were more vulnerable than men, pointing toward further mechanisms to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
Changing practice patterns caused by the pandemic have created an urgent need for guidance in prescribing stimulants via telepsychiatry for attention-deficit hyperactivity disorder (ADHD). A notable spike in the prescribing of stimulants accompanied the suspension of the Ryan Haight Act, allowing stimulants to be prescribed without a face-to-face meeting. Competing forces both for and against prescribing ADHD stimulants by telepsychiatry have emerged, requiring guidelines to balance these factors. On the one hand, factors weighing in favor of increasing the availability of treatment for ADHD via telepsychiatry include enhanced access to care, reduction in the large number of untreated cases, and prevention of the known adverse outcomes of untreated ADHD. On the other hand, factors in favor of limiting telepsychiatry for ADHD include mitigating the possibility of exploiting telepsychiatry for profit or for misuse, abuse, and diversion of stimulants. This Expert Consensus Group has developed numerous specific guidelines and advocates for retaining some flexibility to allow telepsychiatry evaluation and treatment to continue without an in-person evaluation. These guidelines also recognize the need to give greater scrutiny to certain subpopulations, such as young adults without a prior diagnosis or treatment of ADHD who request immediate-release stimulants, which should increase the suspicion of possible medication diversion, misuse, or abuse. In such cases, nonstimulants, controlled-release stimulants, or psychosocial interventions should be prioritized. We encourage the use of outside informants to support the history, the use of rating scales, and access to a hybrid model of both in-person and remote treatment.
Clinical research requires a competent workforce of clinical research professionals (CRPs) who are well-trained to perform varied and complex tasks within their roles. The Joint Task Force for Clinical Trial Competency (JTF) framework established essential domains for conducting high-quality clinical research that can guide professional development of CRPs. The Research Professionals Network (RPN) Workshops were established in 2017 to focus on developing ongoing inter-institutional, peer-led, JTF-centric continuing education for CRPs. Four institutions and their affiliates are part of the collaboration.
Methods:
Workshop participant survey data and other metrics were collected over four academic years. Both quantitative and qualitative analyses were performed to assess participant experience and identify relevant themes.
Results:
Participants demonstrated overall high satisfaction with the workshops and placed significant value on the interpersonal, inter-institutional collaboration the workshops made possible.
Conclusions:
These inter-institutional RPN Workshops have evolved into a Community of Practice, which can be expanded into future opportunities.
Childhood obesity represents a significant global health concern and identifying its risk factors is crucial for developing intervention programs. Many “omics” factors associated with the risk of developing obesity have been identified, including genomic, microbiomic, and epigenomic factors. Here, using a sample of 48 infants, we investigated how the methylation profiles in cord blood and placenta at birth were associated with weight outcomes (specifically, conditional weight gain, body mass index, and weight-for-length ratio) at age six months. We characterized genome-wide DNA methylation profiles using the Illumina Infinium MethylationEPIC BeadChip, and incorporated information on child and maternal health, and various environmental factors into the analysis. We used regression analysis to identify genes with methylation profiles most predictive of infant weight outcomes, finding a total of 23 relevant genes in cord blood and 10 in placenta. Notably, in cord blood, the methylation profiles of three genes (PLIN4, UBE2F, and PPP1R16B) were associated with all three weight outcomes and were also associated with weight outcomes in an independent cohort, suggesting a strong relationship with weight trajectories in the first six months after birth. Additionally, we developed a Methylation Risk Score (MRS) that could be used to identify children most at risk for developing childhood obesity. While many of the genes identified by our analysis have been associated with weight-related traits (e.g., glucose metabolism, BMI, or hip-to-waist ratio) in previous genome-wide association and variant studies, our analysis implicated several others, whose involvement in the obesity phenotype should be evaluated in future functional investigations.
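A Methylation Risk Score of the kind described above is typically a weighted sum of CpG methylation levels; a minimal sketch follows, with placeholder CpG sites and weights rather than the study's coefficients.

```python
# Hedged sketch of a Methylation Risk Score (MRS) as a weighted sum of CpG
# beta values; CpG names and weights are placeholders, not study estimates.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
betas = pd.DataFrame(                     # rows: infants, cols: CpG beta values in [0, 1]
    rng.uniform(0, 1, size=(48, 3)),
    columns=["cg0001", "cg0002", "cg0003"],
)
weights = pd.Series({"cg0001": 0.8, "cg0002": -0.5, "cg0003": 0.3})  # e.g. regression betas

mrs = betas.mul(weights, axis=1).sum(axis=1)  # one score per infant
high_risk = mrs > mrs.quantile(0.75)          # e.g. flag the top quartile
```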
This study aimed to parse between-person heterogeneity in the growth of impulsivity across childhood and adolescence among participants enrolled in five childhood preventive intervention trials targeting conduct problems. In addition, we aimed to test profile membership in relation to adult psychopathologies. Measurement items representing impulsive behavior in grades 2, 4, 5, 7, 8, and 10, along with adult aggression, substance use, suicidal ideation/attempts, and anxiety/depression, were integrated from the five trials (N = 4,975). We applied latent class growth analysis to this sample, as well as to samples separated into nonintervention (n = 2,492) and intervention (n = 2,483) participants. Across all samples, profiles were characterized by high, moderate, low, and low-increasing impulsivity levels. Regarding adult outcomes, in all samples the high, moderate, and low profiles endorsed greater levels of aggression compared to the low-increasing profile. There were nuanced differences across samples and profiles on suicidal ideation/attempts and anxiety/depression. Across samples, there were no significant differences between profiles on substance use. Overall, our study helps to inform understanding of the developmental course and prognosis of impulsivity, as well as adding to collaborative efforts linking data across multiple studies to better inform understanding of developmental processes.
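Latent class growth analysis is usually fit in dedicated software such as Mplus or R's lcmm; as a rough Python analogue under stated assumptions (fake data, degree-1 growth curves, four classes mirroring the profiles above):

```python
# Rough analogue of latent class growth analysis: summarize each child's
# impulsivity trajectory, then cluster the growth parameters. Data are fake.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
grades = np.array([2, 4, 5, 7, 8, 10])
scores = rng.uniform(0, 3, size=(500, 6))  # placeholder impulsivity ratings

# Per-child slope and intercept from a degree-1 polynomial fit across grades.
growth = np.array([np.polyfit(grades, row, deg=1) for row in scores])

# Four classes, mirroring the high/moderate/low/low-increasing profiles.
labels = GaussianMixture(n_components=4, random_state=0).fit_predict(growth)
```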
The concept of a forest transition – a regional shift from deforestation to forest recovery – tends to equate forest area expansion with sustainability, assuming that more forest is good for people and the environment. To promote debate and more just and ecologically sustainable outcomes during this period of intense focus on forests (such as the United Nations’ Decade on Ecological Restoration, the Trillion Trees initiative and at the United Nations’ Climate Change Conferences), we synthesize recent nuanced and integrated research to inform forest management and restoration in the future. Our results reveal nine pitfalls to assuming forest transitions and sustainability are automatically linked. The pitfalls are as follows: (1) fixating on forest quantity instead of quality; (2) masking local diversity with large-scale trends; (3) expecting U-shaped temporal trends of forest change; (4) failing to account for irreversibility; (5) framing categories and concepts as universal/neutral; (6) diverting attention from the simplification of forestlands into single-purpose conservation forests or intensive production lands; (7) neglecting social power transitions and dispossessions; (8) neglecting productivism as the hidden driving force; and (9) ignoring local agency and sentiments. We develop and illustrate these pitfalls with local- and national-level evidence from Southeast Asia and outline forward-looking recommendations for research and policy to address them. Forest transition research that neglects these pitfalls risks legitimizing unsustainable and unjust policies and programmes of forest restoration or tree planting.