Functional impairment in daily activities, such as work and socializing, is part of the diagnostic criteria for major depressive disorder and most anxiety disorders. Despite evidence that symptom severity and functional impairment are partially distinct, functional impairment is often overlooked. To assess whether functional impairment captures diagnostically relevant genetic liability beyond that of symptoms, we aimed to estimate the heritability of, and genetic correlations between, key measures of current depression symptoms, anxiety symptoms, and functional impairment.
Methods
In 17,130 individuals with lifetime depression or anxiety from the Genetic Links to Anxiety and Depression (GLAD) Study, we analyzed total scores from the Patient Health Questionnaire-9 (depression symptoms), Generalized Anxiety Disorder-7 (anxiety symptoms), and Work and Social Adjustment Scale (functional impairment). Genome-wide association analyses were performed with REGENIE. Heritability was estimated using GCTA-GREML, and genetic correlations were estimated with bivariate GREML.
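As a rough illustration of how these GREML steps might be scripted (not the study's actual pipeline), the sketch below assumes a pre-computed genetic relationship matrix with prefix glad_grm, a phenotype file whose first three phenotype columns are PHQ-9, GAD-7 and WSAS totals, and the gcta64 binary on the PATH; all file names are invented.

```python
import subprocess
from itertools import combinations

# Hypothetical inputs -- the GRM prefix and phenotype file layout are assumptions,
# not taken from the GLAD analysis itself.
GRM_PREFIX = "glad_grm"             # pre-computed genetic relationship matrix
PHENO_FILE = "glad_phenotypes.txt"  # FID IID PHQ9 GAD7 WSAS
TRAITS = {1: "PHQ9", 2: "GAD7", 3: "WSAS"}  # --mpheno index -> trait label

def run(cmd):
    """Run a GCTA command and fail loudly if it errors."""
    print(" ".join(cmd))
    subprocess.run(cmd, check=True)

# Univariate GREML: SNP-based heritability for each scale.
for idx, name in TRAITS.items():
    run(["gcta64", "--reml",
         "--grm", GRM_PREFIX,
         "--pheno", PHENO_FILE, "--mpheno", str(idx),
         "--out", f"h2_{name}"])

# Bivariate GREML: genetic correlation for each pair of scales.
for (i, a), (j, b) in combinations(TRAITS.items(), 2):
    run(["gcta64", "--reml-bivar", str(i), str(j),
         "--grm", GRM_PREFIX,
         "--pheno", PHENO_FILE,
         "--out", f"rg_{a}_{b}"])
```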
Results
The phenotypic correlations were moderate across the three measures (Pearson’s r = 0.50–0.69). All three scales were found to be under low but significant genetic influence (single-nucleotide polymorphism-based heritability [h2SNP] = 0.11–0.19) with high genetic correlations between them (rg = 0.79–0.87).
Conclusions
Among individuals with lifetime depression or anxiety from the GLAD Study, the genetic variants that underlie symptom severity largely overlap with those influencing functional impairment. This suggests that self-reported functional impairment, while clinically relevant for diagnosis and treatment outcomes, does not reflect substantial additional genetic liability beyond that captured by symptom-based measures of depression or anxiety.
The transition from breastmilk to solid foods (weaning) is a critical stage in infant development and plays a decisive role in the maturation of the complex microbial community inhabiting the human colon. Diet is a major factor shaping the colonic microbiota, which ferments nutrients reaching the colon unabsorbed by the host to produce a variety of microbial metabolites influencing host physiology(1). Therefore, making adequate dietary choices during weaning can positively modulate the colonic microbiota, ultimately contributing to health in infancy and later life(2). However, our understanding of how complementary foods impact the colonic microbiota of weaning infants is limited. To address this knowledge gap, we employed a metagenome-scale modelling approach to simulate the impact of complementary foods, either combined with breastmilk or with breastmilk and other foods, on the production of organic acids by colonic microbes of weaning infants(3). Complementary foods and combinations of foods with the greatest impact on the in silico microbial production of organic acids were identified. These foods and food combinations were further tested in vitro, individually or in combination with infant formula. Fifty-three food samples were digested using a protocol adapted from INFOGEST to mimic infant digestion and then fermented with faecal inoculum from 6 New Zealand infants (5–11 months old). After 24 h of fermentation, the production of organic acids was measured by gas chromatography. Differences in organic acid production between samples were determined using the Tukey Honestly Significant Difference test to account for multiple comparisons. Microbial composition was characterised by amplicon sequencing of the V3–V4 regions of the bacterial 16S rRNA gene. Taxonomy was assigned using the DADA2 pipeline and the SILVA database (version 138.1). Bioinformatic and statistical analyses were conducted using the R packages phyloseq and ANCOM-BC2, with the Holm–Bonferroni adjustment to account for multiple comparisons in differential abundance testing. Blackcurrant and raspberries increased the production of acetate and propionate (Tukey’s test, p<0.05) and the relative abundance of the genus Parabacteroides (Dunnett’s test, adjusted p<0.05) compared with other foods. Raspberries also increased the abundance of the genus Eubacterium (Dunnett’s test, adjusted p<0.05). When combined with infant formula, black beans stood out for increasing the production of butyrate (Tukey’s test, p<0.05) and the relative abundance of the genus Clostridium (Dunnett’s test, adjusted p<0.05). In conclusion, this study provides new evidence on how complementary foods, both individually and in combination with other dietary components, influence the colonic microbiota of weaning infants in vitro. Insights generated by this research can help design future clinical trials, ultimately enhancing our understanding of the relationship between human nutrition and colonic microbiota composition and function in post-weaning life.
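The sequencing analyses described above rely on R packages (DADA2, phyloseq, ANCOM-BC2). As a language-neutral illustration of just the organic-acid comparison step, the sketch below runs a Tukey HSD test across foods using statsmodels; the input file and column names are invented for the example.

```python
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical input: one row per fermentation vessel, with the food tested
# and the acetate concentration measured after 24 h (column names are invented).
df = pd.read_csv("fermentation_organic_acids.csv")  # columns: food, acetate_mmol_l

# Tukey's HSD compares every pair of foods while controlling the
# family-wise error rate across all pairwise comparisons.
result = pairwise_tukeyhsd(endog=df["acetate_mmol_l"],
                           groups=df["food"],
                           alpha=0.05)
print(result.summary())
```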
Globally, food waste from school lunch programmes varies considerably, ranging from 33 to 116 g/student/day, with vegetables being the most wasted food category(2). In New Zealand, the Ka Ora, Ka Ako school lunch programme provides free healthy lunches to schools whose communities face greater socio-economic barriers. The programme has been criticised with claims that large quantities of food are wasted, although no data are available to support these claims. The aim of this study was to measure the quantity and destinations of food waste from the Ka Ora, Ka Ako school lunch programme in Dunedin schools. A total of eight primary schools in Dunedin participated. At each school, data were collected over four days: the first day was an observation day, and on the remaining three consecutive days food waste was measured. Equipment (e.g., measuring scales, buckets and containers) was used for direct weighing and to carry out the waste composition analysis (i.e., manually sorting waste by type). Data were recorded and analysed using Microsoft Excel software. School rolls ranged from 17 to 353 students. Across the seven schools, the total amount of food waste from leftovers was 5274 g/day, with a mean of 32 g/student/day. Destinations of food waste from leftovers varied, ranging from returning to the supplier to being disposed of in school rubbish bins (to landfill). Using the Target, Measure, Act approach recommended for food waste, the ‘Target’ is to halve per capita global food waste at the retail and consumer levels by 2030(2). This study contributes to the second step, which is to ‘Measure’ food waste. The findings from this study may be used for the third step, ‘Act’, to reduce food waste from the Ka Ora, Ka Ako school lunch programme, diverting this from landfill.
The nutrition workforce plays a vital role in disease prevention and health promotion, with expanding job opportunities shaped by factors like aging populations, climate change, global food systems, and advancing technologies(1,2). Preparing students for careers that require adaptability involves understanding the valuable skills they possess and identifying any gaps. This research aimed to identify the skills and knowledge valued by students who had recently completed work-based placements, and to explore recent graduates’ experiences, challenges, and preparedness for employment. At the end of their work-based placements, students give presentations sharing their experiences and learning. Permission was sought from ten students to analyse the recordings of these presentations. The presentations were selected to include a range of nutrition fields, including sports nutrition, public health, community nutrition, dietary counselling, the food industry, and nutrition communication. Additionally, a list of graduates (within four years of graduation) from various fields (as above) was compiled and they were invited to participate. Semi-structured interviews (n=10) were conducted online via Zoom and recorded. The interview guide included open-ended questions on employment experiences, challenges, preparedness, and required skills. The interviews, transcription and analyses were completed by two student researchers between November 2023 and February 2024. Thematic analysis using NVivo software was used to identify themes. The themes developed included the importance of skills relating to: i) communicating complex nutrition concepts to the public, ii) collaborating within diverse teams, and iii) identifying and filling personal knowledge gaps. In addition, graduates felt practical experience from their university study boosted their preparedness for the workforce, though many struggled to apply their skills in non-traditional roles and expand their career scope. In summary, ongoing focus on team-based projects, communication with non-science audiences, and strategies for continuous learning using evidence-based sources are crucial for both undergraduate and postgraduate education.
To combat the decline in North American grasslands and prairies, innovative strategies to establish new native grass and forb plantings must be considered. Integrated vegetation management entails the use of many practices to cultivate desirable vegetation along roadsides, including mowing, applying herbicides, burning, and replanting. Currently, only a limited selection of postemergence herbicides is available to improve native plant establishment along roadsides. A greenhouse herbicide screen that included four postemergence herbicides registered for use on Conservation Reserve Program (CRP) acres and rights-of-way was conducted to test their safety for use on four native grasses (big bluestem, buffalograss, sideoats grama, and switchgrass) and seven forb species (ashy sunflower, black-eyed Susan, butterfly milkweed, desert false indigo, Illinois bundleflower, Mexican hat plant, and purple coneflower). Clopyralid (689 g ae ha−1), metsulfuron (4.18 g ai ha−1), and quinclorac (418 g ai ha−1) applied at labeled rates caused no injury to the native grass species or butterfly milkweed. However, florpyrauxifen-benzyl (38.4 g ai ha−1) caused significant injury to buffalograss and switchgrass. None of the herbicides tested was universally safe to use on all forb species evaluated in this trial, with each herbicide causing unacceptable injury (≥25%) to one or more forb species. None of the herbicides studied here would be completely safe for use on mixed stands of native grasses and native forbs at the seedling growth stage, indicating that prairie establishment must rely on alternative chemistries, use plant mixes with fewer species, or avoid postemergence applications shortly after emergence of native forbs.
Background: While efgartigimod usage is expected to reduce immunoglobulin (IG) utilization, evidence in clinical practice is limited. Methods: In this retrospective cohort study, patients with generalized myasthenia gravis (gMG) treated with efgartigimod for ≥1 year were identified from US medical/pharmacy claims data (April 2016–January 2024) and data from the My VYVGART Path patient support program (PSP). The number of IG courses during the 1 year before and after efgartigimod initiation (index date) was evaluated. Patients with ≥6 annual IG courses were considered chronic IG users. Myasthenia Gravis Activities of Daily Living (MG-ADL) scores before and after index were obtained from the PSP where available. Descriptive statistics were used without adjustment for covariates. Results: 167 patients with ≥1 IG claim before index were included. Prior to efgartigimod initiation, the majority of patients (62%) received IG chronically. During the 1 year after index, the number of IG courses fell by 95% (pre: 1531, post: 75). 89% (n=149/167) of patients fully discontinued IG use. Mean (SD) best follow-up MG-ADL scores were significantly reduced after index (8.0 [4.1] to 2.8 [2.1], P<0.05, n=73/167, 44%). Conclusions: Based on US claims, IG utilization was substantially reduced among patients who continued efgartigimod for ≥1 year, with patients demonstrating a favorable MG-ADL response.
Background: Our prior six-year review (n=2165) revealed that 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted QI interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Out of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-op, showing a significant reduction in ED visits from 24% to 11.6%, and decreased overall ED utilization from 42.1% to 16.6% (when accounting for multiple visits by the same patient). Early interventions, including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were “highly satisfied” and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50% through pre-operative education, focused QI initiatives, and an individualized, proactive approach.
DSM-5 specifies bulimia nervosa (BN) severity based on specific thresholds of compensatory behavior frequency. There is limited empirical support for such severity groupings. Limited support could be because the DSM-5’s compensatory behavior frequency cutpoints are inaccurate or because compensatory behavior frequency does not capture true underlying differences in severity. In support of the latter possibility, some work has suggested shape/weight overvaluation or use of single versus multiple purging methods may be better severity indicators. We used structural equation modeling (SEM) Trees to empirically determine the ideal variables and cutpoints for differentiating BN severity, and compared the SEM Tree groupings to alternate severity classifiers: the DSM-5 indicators, single versus multiple purging methods, and a binary indicator of shape/weight overvaluation.
Methods
Treatment-seeking adolescents and adults with BN (N = 1017) completed self-report measures assessing BN and comorbid symptoms. SEM Trees specified an outcome model of BN severity and recursively partitioned this model into subgroups based on shape/weight overvaluation and compensatory behaviors. We then compared groups on clinical characteristics (eating disorder symptoms, depression, anxiety, and binge eating frequency).
Results
SEM Tree analyses resulted in five severity subgroups, all based on shape/weight overvaluation: overvaluation <1.25; overvaluation 1.25–3.74; overvaluation 3.75–4.74; overvaluation 4.75–5.74; and overvaluation ≥5.75. SEM Tree groups explained 1.63–6.41 times the variance explained by other severity schemes.
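To make the reported cutpoints concrete, the following is a small sketch of our own (not the SEM Tree code) assigning a respondent to one of the five overvaluation-based severity groups; the group labels and function name are illustrative.

```python
def overvaluation_severity_group(score: float) -> int:
    """Map a shape/weight overvaluation score onto the five SEM Tree-derived
    severity groups reported in the abstract (the score scale is inferred
    from the reported cutpoints). Labels 1 (lowest) to 5 (highest) are arbitrary."""
    cutpoints = [1.25, 3.75, 4.75, 5.75]  # boundaries between adjacent groups
    group = 1
    for c in cutpoints:
        if score >= c:
            group += 1
    return group

# Example: a score of 4.0 falls in the 3.75-4.74 band -> group 3.
assert overvaluation_severity_group(4.0) == 3
```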
Conclusions
Shape/weight overvaluation outperformed the DSM-5 severity scheme and single versus multiple purging methods, suggesting the DSM-5 severity scheme should be reevaluated. Future research should examine the predictive utility of this severity scheme.
Foods in squeeze pouches are widely available and are marketed as practical, convenient, and healthy food options for infants and children. However, these products do not provide adequate nutrition for growth(1) or align with their front-of-pack health claims. To develop effective strategies and guidance for squeeze pouch consumption, we need to understand which squeeze pouches are used, by whom, and why. A cross-sectional online survey of Tasmanian residents was conducted and included questions about the frequency and types of squeeze pouches consumed by infants and children (aged 0–18 years), the demographics of families who use squeeze pouches frequently, and an open-ended question to explore parental motivations for using these products. Data were analysed using descriptive statistics, and logistic regression was used to identify demographic predictors of frequent squeeze pouch use (weekly or more). Thematic analysis of qualitative survey responses explored parental experiences. Parents (n = 179; 78% female, 37% aged 35–45 years, 84% born in Australia; 73% university educated) reported on the squeeze pouch use of n = 248 children. Most infants (0–2 years; 71.4%) used squeeze pouches weekly (85.7% consumed in past year), favouring fruit-based (57%), dairy-based (57%), vegetable-based (50%), and meal-based (36%) pouches. Over half of children aged 2–5 years (62.5%) consumed pouches weekly (81.3% consumed in past year), preferring dairy-based (73%) and fruit-based (19%) pouches. Over a third of 6–12-year-olds (35.2%) consumed pouches weekly (69.3% consumed in past year), including dairy-based (66%) and fruit-based pouches (20%). A smaller proportion (13.1%) of teenagers (13–17 years) consumed pouches weekly (33.3% consumed in past year), primarily choosing dairy-based (26%) and fruit-based (6%) pouches. Younger parents were over 5 times more likely to be frequent users than parents aged over 46 years (18–34 years OR: 5.3, 95% CI 1.8–15.7; 35–45 years OR: 6.0, 95% CI 2.8–12.8). Speaking a language other than English (OR: 4.8; 95% CI 1.5–14.6) also significantly predicted frequent squeeze pouch use, while gender, education, employment status, income, and food security were not associated. Key themes from parents who identified as frequent squeeze pouch users centred around convenience, on-the-go feeding, and managing fussy eating or sensory needs. Parents discussed the societal paradox they experienced, expressing a dislike for squeeze pouches yet using them for behaviour modification as a food reward, or buying in bulk when discounted. An understanding of commercial food influences and greater environmental consciousness were the most common themes described by parents who identified as non-users. This study highlights the widespread use of squeeze pouches among children, particularly in younger age groups but also into middle childhood and adolescence. Comprehensive national data are needed to inform public health strategies that minimise the use of squeeze pouches in children of all ages.
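As a hedged illustration of the logistic regression step described above (a simplified model with an invented data file and column names, not the study's analysis code), frequent-use odds ratios and 95% CIs could be obtained along these lines:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per child, frequent_use coded 1/0,
# parent_age_group as a categorical string, language_other_than_english coded 1/0.
df = pd.read_csv("pouch_survey.csv")

# Treat the oldest parent age group as the reference category, mirroring the
# comparison reported in the abstract (the level label '46+' is invented).
model = smf.logit(
    "frequent_use ~ C(parent_age_group, Treatment(reference='46+'))"
    " + language_other_than_english",
    data=df,
).fit()

# Exponentiated coefficients give odds ratios; exponentiated conf_int() gives 95% CIs.
odds_ratios = np.exp(model.params).rename("OR")
conf_int = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([odds_ratios, conf_int], axis=1))
```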
Establishing early healthy eating behaviours in the complementary feeding period through to adolescence is fundamental, as it can affect the health trajectory of an infant’s life into adulthood and impact lifelong eating patterns(1). Consumer demand for commercial squeeze pouches is increasing and now expanding from infants to older children. Yet, emerging research suggests that these products are nutritionally poor(2), and are frequently marketed with misleading claims that oppose infant and child feeding guidelines(3). There is a paucity of information to inform public health strategies regarding the use of squeeze pouches throughout infancy and beyond. The aim of this scoping review was to determine the frequency and types of squeeze pouches consumed by children aged 0–18 years, the sociodemographic characteristics of users, and insights from parental experiences when using these products. The scoping review was conducted in accordance with the Joanna Briggs Institute guidelines for scoping reviews. Three online databases were searched (MEDLINE, Scopus, CINAHL) in addition to grey literature. Screening of papers was completed by two independent reviewers. The database and grey literature search identified 125 articles, of which 16 underwent full-text review and 11 studies across five countries were included. In eight studies, the prevalence of squeeze pouch consumption ranged from 23.5% to 82.8% of infants and children, with studies reporting daily (n = 4; 8.7–29.2%), weekly (n = 7; 20.9–75.2%), and monthly consumption (n = 7; 16.7–70.4%). The predominant types of squeeze pouches consumed were fruit-based and dairy-based pouches. Frequent squeeze pouch use was 2.5 times more likely among families residing in areas of high deprivation or if childcare was used. Household composition also impacted squeeze pouch use, as families with two or more children were more likely to use these products. Additionally, frequent pouch use was associated with early cessation of breastfeeding and early introduction of solids. Five studies on parental perceptions of squeeze pouches reported benefits including convenience, perceived health, and perceived low cost, while four studies expressed concerns relating to waste and environmental impacts, health and nutrition, and perceived high cost. This review highlights that the widespread use of squeeze pouches among infants and children, particularly fruit- and dairy-based pouches, is driven by convenience, changing family needs and effective marketing strategies. Public health policy is needed to address concerns regarding the nutritional quality of squeeze pouches and regulate marketing strategies to ensure parents are adequately informed about the health implications of these products. Further research should focus on identifying barriers to safe and nutritious complementary feeding practices and developing targeted education programs to promote optimal feeding practices that minimise the use of squeeze pouches in infants and children of all ages.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
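For readers unfamiliar with the pooling step, the following is a minimal sketch of an inverse-variance-weighted fixed-effect meta-analysis of per-cohort interaction coefficients; the numbers are invented for illustration, and the cohort-level models themselves are not shown.

```python
import numpy as np
from scipy import stats

# Hypothetical per-cohort estimates (beta, standard error) for the
# T1 Horvath residuals x new-onset PTSD interaction term; values are made up.
betas = np.array([0.20, 0.10, 0.25, 0.05, 0.18, 0.22, 0.12])
ses   = np.array([0.10, 0.12, 0.15, 0.09, 0.11, 0.14, 0.10])

weights = 1.0 / ses**2                      # inverse-variance weights
meta_beta = np.sum(weights * betas) / np.sum(weights)
meta_se = np.sqrt(1.0 / np.sum(weights))
z = meta_beta / meta_se
meta_p = 2 * stats.norm.sf(abs(z))          # two-sided p-value

print(f"meta beta = {meta_beta:.3f}, SE = {meta_se:.3f}, p = {meta_p:.3g}")
```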
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
Meaningful medical data are crucial for response teams in the aftermath of disaster. Electronic Medical Record (EMR) systems have revolutionized healthcare by facilitating real-time data collection, storage, and analysis. These capabilities are particularly relevant for post-disaster and austere environments. fEMR, an EMR system designed for such settings, enables rapid documentation of patient information, treatments, and outcomes, ensuring critical data capture.
Objectives:
Data collected through fEMR can be leveraged to perform comprehensive monitoring and evaluation (M&E) of emergency medical services, assess operational needs and efficiency, and support public health syndromic surveillance.
Method/Description:
Analyzing these data identifies patterns and trends or assesses treatment effectiveness. This insight facilitates data-driven decision-making and the optimization of medical protocols. fEMR’s real-time reports enhance situational awareness and operational coordination among response units. The aggregated data can detect trends, classify case-mix, and facilitate after-action reviews, contributing to continuous improvement in emergency preparedness and response strategies. The system also supports fulfilling reporting requirements for health agencies and funding organizations, ensuring accountability and transparency.
Results/Outcomes:
EMRs like fEMR are vital for emergency response teams, supporting immediate patient care and ongoing M&E of disaster response efforts. fEMR’s robust data management capabilities support evidence-based practices and strategic planning, improving the effectiveness of emergency medical services in disaster scenarios.
Conclusion:
The effective use of fEMR in disaster response scenarios highlights its significance in enhancing operational efficiency, ensuring accountability, and improving the overall effectiveness of emergency medical services through comprehensive data management and real-time reporting.
Objectives/Goals: To evaluate equity in utilization of free initial health evaluation (IHE) services among members of a limited health care program, the World Trade Center (WTC) Health Program (Program), to inform intervention development and provide insights for similar healthcare programs. Methods/Study Population: We included Program members who newly enrolled during 2012–2022, and who had an IHE or were alive for ≥ 1 year after enrollment. Program administrative and surveillance data collected from January 2012 to February 2024 were used. We evaluated two outcomes: timely IHE utilization (proportion of members completing an IHE within 6 months of enrollment) and any IHE utilization (proportion completing an IHE by February 2024). We described IHE utilization by enrollment year and member characteristics, and fitted multivariable logistic regression models to estimate adjusted odds ratios for IHE utilization and identify factors related to potential inequities for the two member types: Responders, who performed support services, vs. Survivors, who did not respond but were present in the New York disaster area. Results/Anticipated Results: A total of 27,379 Responders and 30,679 Survivors were included. Responders were 89% male, 70% 45–64 years old at enrollment and 76% White. Survivors were 46% female, 54% 45–64 years old at enrollment, and 57% White. Timely IHE utilization remained relatively stable (~65%) among Responders across time and increased from 16% among Survivors who enrolled in 2017 to 68% among Survivors who enrolled in 2021. Timely IHE utilization was lower for younger members (enrolled Discussion/Significance of Impact: This study highlights Program achievements and gaps in providing equitable IHE services. Strategies to improve members’ equitable IHE utilization can include adopting/expanding flexible scheduling; increasing non-English language capacity and cultural competency; and facilitating transportation/assistance for members with accessibility barriers.
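A minimal sketch of how the two utilization outcomes described above might be derived from member-level dates; the file, column names, and the 183-day operationalisation of the 6-month window are assumptions for illustration, not the Program's actual code.

```python
import pandas as pd

# Hypothetical member-level file; column names are invented for this sketch.
# member_type is assumed to hold "Responder" or "Survivor"; ihe_date is missing
# for members with no IHE on record.
members = pd.read_csv("wtc_members.csv",
                      parse_dates=["enrollment_date", "ihe_date"])

# Timely IHE utilization: an IHE completed within 6 months (~183 days) of enrollment.
days_to_ihe = (members["ihe_date"] - members["enrollment_date"]).dt.days
members["timely_ihe"] = days_to_ihe.notna() & (days_to_ihe <= 183)

# Any IHE utilization: an IHE completed at any point before the data cut-off.
cutoff = pd.Timestamp("2024-02-29")
members["any_ihe"] = members["ihe_date"].notna() & (members["ihe_date"] <= cutoff)

# Proportions by member type and enrollment year.
summary = (members
           .assign(enroll_year=members["enrollment_date"].dt.year)
           .groupby(["member_type", "enroll_year"])[["timely_ihe", "any_ihe"]]
           .mean())
print(summary)
```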
For decades, in most states with a party registration option, the percentage of voters registering as unaffiliated with a major political party has steadily increased. But who are these registered voters in these polarized partisan times, and why might they register without a major party? We address these questions by drawing on parallel large-N original surveys of registered voters in two southeastern states experiencing a notable rise in registered independents but with different electoral rules for unaffiliated registrants. Florida’s closed primary rule is associated with a much greater share of major-party registrants than in North Carolina, which has a semi-closed primary rule. Nevertheless, even with these different primary laws, in both states we find that the decision not to register with a major party strongly covaries with identity as a political independent. Hence, registration rules may alter registration patterns, but individuals claiming to be less attached to a major party are markedly more likely to manifest this position by registering unaffiliated.
Preliminary evidence suggests that a ketogenic diet may be effective for bipolar disorder.
Aims
To assess the impact of a ketogenic diet in bipolar disorder on clinical, metabolic and magnetic resonance spectroscopy outcomes.
Method
Euthymic individuals with bipolar disorder (N = 27) were recruited to a 6- to 8-week single-arm open pilot study of a modified ketogenic diet. Clinical, metabolic and MRS measures were assessed before and after the intervention.
Results
Of 27 recruited participants, 26 began and 20 completed the ketogenic diet. For participants completing the intervention, mean body weight fell by 4.2 kg (P < 0.001), mean body mass index fell by 1.5 kg/m2 (P < 0.001) and mean systolic blood pressure fell by 7.4 mmHg (P < 0.041). Participants’ mean baseline and follow-up assessments remained within the euthymic range, with no statistically significant changes on the Affective Lability Scale-18, Beck Depression Inventory or Young Mania Rating Scale. In participants providing reliable daily ecological momentary assessment data (n = 14), there was a positive correlation between daily ketone levels and both self-rated mood (r = 0.21, P < 0.001) and energy (r = 0.19, P < 0.001), and an inverse correlation between ketone levels and both impulsivity (r = −0.30, P < 0.001) and anxiety (r = −0.19, P < 0.001). On MRS, brain glutamate plus glutamine concentration decreased by 11.6% in the anterior cingulate cortex (P = 0.025) and by 13.6% in the posterior cingulate cortex (P < 0.001).
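As an illustration of the ecological momentary assessment correlation step only, the sketch below computes pooled Pearson correlations with invented file and column names; the study's handling of repeated measures within participants may differ.

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical long-format EMA file: one row per participant-day, with the
# daily ketone reading and self-ratings (column names are invented).
ema = pd.read_csv("ema_daily.csv")  # columns: participant, ketone_mmol_l, mood, energy, impulsivity, anxiety

for rating in ["mood", "energy", "impulsivity", "anxiety"]:
    paired = ema[["ketone_mmol_l", rating]].dropna()
    r, p = pearsonr(paired["ketone_mmol_l"], paired[rating])
    print(f"ketones vs {rating}: r = {r:.2f}, p = {p:.3g}")
```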
Conclusions
These findings suggest that a ketogenic diet may be clinically useful in bipolar disorder, for both mental health and metabolic outcomes. Replication and randomised controlled trials are now warranted.
The prevalence of youth anxiety and depression has increased globally, with limited causal explanations. Long-term physical health conditions (LTCs) affect 20–40% of youth, with rates also rising. LTCs are associated with higher rates of youth depression and anxiety; however, it is uncertain whether observed associations are causal or explained by unmeasured confounding or reverse causation.
Methods
Using data from the Norwegian Mother, Father, and Child Cohort Study (MoBa) and the Norwegian National Patient Registry, we investigated phenotypic associations between childhood LTCs and depression and anxiety diagnoses in youth (<19 years), defined using ICD-10 diagnoses and self-rated measures. We then conducted two-sample Mendelian randomization (MR) analyses using SNPs associated with childhood LTCs from existing genome-wide association studies (GWAS) as instrumental variables. Outcomes were: (i) diagnoses of major depressive disorder (MDD) and anxiety disorders or elevated symptoms in MoBa, and (ii) youth-onset MDD using summary statistics from a GWAS in the iPSYCH2015 cohort.
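A minimal sketch of the inverse-variance-weighted (IVW) two-sample MR estimator under these assumptions: the summary statistics are invented, and a real analysis would also apply pleiotropy-robust methods and check instrument validity.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical per-SNP summary statistics for instruments associated with a
# childhood LTC: effects on the exposure and on youth-onset MDD (log-odds),
# with standard errors for the outcome effects. Values are invented.
beta_exposure = np.array([0.08, 0.05, 0.11, 0.07, 0.09])
beta_outcome  = np.array([0.020, 0.015, 0.030, 0.010, 0.022])
se_outcome    = np.array([0.010, 0.012, 0.011, 0.009, 0.010])

# IVW estimate: regression of outcome effects on exposure effects through the
# origin, weighting each SNP by the inverse variance of its outcome effect.
weights = 1.0 / se_outcome**2
model = sm.WLS(beta_outcome, beta_exposure, weights=weights).fit()
beta_ivw = model.params[0]

print(f"IVW causal estimate (log-odds per unit exposure): {beta_ivw:.3f}")
print(f"IVW odds ratio: {np.exp(beta_ivw):.2f}")
```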
Results
Having any childhood LTC phenotype was associated with elevated youth MDD (OR = 1.48 [95% CIs 1.19, 1.85], p = 4.2×10−4) and anxiety disorder risk (OR = 1.44 [1.20, 1.73], p = 7.9×10−5). Observational and MR analyses in MoBa were consistent with a causal relationship between migraine and depression (IVW OR = 1.38 [1.19, 1.60], pFDR = 1.8×10−4). MR analyses using iPSYCH2015 did not support a causal link between LTC genetic liabilities and youth-onset depression in either direction.
Conclusions
Childhood LTCs are associated with depression and anxiety in youth; however, MR analyses identified little evidence of causation between LTC genetic liability and youth depression/anxiety, with the exception of migraine.
The excavation of a stratified sequence of deposits spanning the Initial Late Formative period (250 BC–AD 120) at Iruhito, in the upper Desaguadero Valley of Bolivia, provides insight into this previously unrecognized, four-century period separating the well-documented Middle Formative (800–250 BC) from the Late Formative (~AD 120–590) period. By tracking subtle shifts in ceramic, architectural, lithic, and faunal data, we can explore tempos of change in social life during this dynamic time. These data lead us to suggest that, rather than being a “transitional” period or a “hiatus” in regional occupation, the Initial Late Formative period was a distinct mode of sociality characterized by the realignment and expansion of interaction networks, on the one hand, and rejection of the decorative aesthetics, monumentality, and public-oriented performances of earlier periods, on the other. We argue that the Late Formative period centers emerging after ~AD 120 intentionally cited architecture and aesthetics that were distant in time and space, constituting a sophisticated political strategy. Finally, these data suggest that the chronological schemata we use to build regional histories often obscure social variability.