Here, we present a first assessment of the US Department of Agriculture’s (USDA) “Grass-Cast Southwest,” which is a forecasting tool for rangeland aboveground net primary productivity (ANPP) for the southwest region of the United States. Our results show that ANPP forecasts in early April were relatively close to the observation-based ANPP estimates in late May for all years evaluated (R = 0.6–0.9). The relatively high predictability of spring rangeland productivity in this region is likely because it is strongly driven by antecedent winter/early spring precipitation. Conversely, the first summer forecasts produced in June did not consistently predict the final observation-based ANPP estimates in late August (R = −0.5–0.7), likely because summer rangeland productivity in this region is highly dependent on variable, less predictable precipitation from the North American Monsoon (NAM). Antecedent El Niño Southern Oscillation (ENSO) indices could be used to improve Grass-Cast Southwest performance in both the spring and summer. The ENSOJFM (January–March) index was significantly positively correlated with rangeland productivity during the spring season, whereas ENSOMAM (March–May) was significantly negatively correlated with rangeland productivity during the summer season.
The first year of life is a critical period when nutrient intakes can affect long-term health outcomes. Although household food insecurity may result in inadequate nutrient intakes or a higher risk of obesity, no studies have comprehensively assessed nutrient intakes of infants from food insecure households. This study aimed to investigate how infant nutrient intakes and BMI differ by household food security.
Design:
Cross-sectional analysis of the First Foods New Zealand study of infants aged 7–10 months. Two 24-h diet recalls assessed nutrient intakes. ‘Usual’ intakes were calculated using the multiple source method. BMI z-scores were calculated using WHO Child Growth Standards.
Setting:
Dunedin and Auckland, New Zealand.
Participants:
Households with infants (n 604) classified as: severely food insecure, moderately food insecure or food secure.
Results:
Nutrient intakes of food insecure and food secure infants were similar, aside from slightly higher free and added sugars intakes in food insecure infants. Energy intakes were adequate, and intakes of most nutrients investigated were likely to be adequate. Severely food insecure infants had a higher mean BMI z-score than food secure infants, although no significant differences in weight categories (underweight, healthy weight and overweight) were observed between groups.
Conclusions:
Household food insecurity, in the short term, does not appear to adversely impact the nutrient intakes and weight status of infants, possibly because mothers protect their infants from the potential nutritional impacts of food insecurity. Future research should investigate how food insecurity affects nutrient intakes of the entire household.
This study aimed to compare appropriateness of restricted antimicrobial prescriptions, as assessed by antimicrobial stewardship program (ASP) prospective audit and feedback (PAF), between those ordered by medical trainees versus staff. Secondary objectives were to determine whether certain timing factors and other independent variables impacted prescription appropriateness.
Design:
Single center, retrospective cohort study.
Setting:
The University of Alberta Hospital, a 700-bed tertiary care hospital in Edmonton, Canada.
Participants:
Prescriptions of six health-authority restricted antibiotics subject to ASP PAF between 2018 and 2023. Cases were excluded if prescriber role or prescription dates or times were unavailable.
Methods:
Data from a local ASP quality improvement database were extracted. Multiple logistic regression analysis was completed, with adjusted odds ratios (aOR) reported.
Results:
A total of 3,687 restricted antibiotic prescriptions subjected to PAF were included in this study, of which 1,163 (31.5%) were assessed as not appropriately prescribed. Prescriptions written by medical trainees did not have higher odds of appropriateness compared to staff (aOR 1.09 [95% CI 0.94–1.28], P = .25). Weekend prescriptions had reduced odds of being appropriate (aOR 0.71 [0.60–0.84], P < .0001). Over the course of the Coronavirus Disease 2019 (COVID-19) pandemic, appropriateness improved from 56.2% (prepandemic) to 71.5% (peri-pandemic) and 76.9% (postpandemic).
Conclusions:
No differences were noted in restricted antibiotic prescription appropriateness between medical trainees and staff. Weekend prescriptions were less likely to be appropriate. Improved appropriateness over time may be multifactorial, including implementation of ASP preceding the pandemic. Further studies examining timing factors associated with appropriateness are needed.
Adverse prenatal conditions can induce intrauterine growth restriction (IUGR) and increase the risk of adulthood metabolic disease. Mechanisms underlying developmentally programmed metabolic disease remain unclear but may involve disrupted postnatal circadian rhythms and kisspeptin signalling. We investigated the impact of maternal hypoxia-induced IUGR on hypothalamic and hepatic expression of clock genes (Bmal1, Per2 and Reverbα), metabolic genes (Pparα, Pparγ and Pgc1α) and kisspeptin genes (Kiss1 and Kiss1r) in adult offspring. Pregnant BALB/c mice were housed in hypoxic conditions (10.5% oxygen) from gestational day 11 to 17.5 and then returned to normoxic conditions until term (gestational day ∼ 21). Control animals were housed in normoxic conditions throughout pregnancy. Offspring were weighed at birth. At 8 weeks of age, body, liver and brain tissues were collected and weighed. Relative clock gene, metabolic gene and kisspeptin signalling gene expression were measured using qPCR. The IUGR offspring were lighter at birth and remained lighter at 8 weeks but with higher brain relative to body weight. The IUGR offspring had decreased hypothalamic Bmal1 and Reverbα expression, but unchanged hepatic clock gene expression and no change in hypothalamic or hepatic Per2 expression, compared with Control offspring. This tissue-specific change in clock gene expression suggests circadian dysregulation. There were no IUGR-related changes to metabolic gene expression in the hypothalamus or liver, but IUGR offspring had increased hypothalamic Kiss1r expression. These results demonstrate that IUGR offspring from hypoxic pregnancies show central circadian misalignment and potentially disrupted hypothalamic Kiss1/Kiss1r signalling, which may contribute to developmentally programmed metabolic disease.
Paleontology provides insights into the history of the planet, from the origins of life billions of years ago to the biotic changes of the Recent. The scope of paleontological research is as vast as it is varied, and the field is constantly evolving. In an effort to identify “Big Questions” in paleontology, experts from around the world came together to build a list of priority questions the field can address in the years ahead. The 89 questions presented herein (grouped within 11 themes) represent contributions from nearly 200 international scientists. These questions touch on common themes including biodiversity drivers and patterns, integrating data types across spatiotemporal scales, applying paleontological data to contemporary biodiversity and climate issues, and effectively utilizing innovative methods and technology for new paleontological insights. In addition to these theoretical questions, discussions touch upon structural concerns within the field, advocating for an increased valuation of specimen-based research, protection of natural heritage sites, and the importance of collections infrastructure, along with a stronger emphasis on human diversity, equity, and inclusion. These questions offer a starting point—an initial nucleus of consensus that paleontologists can expand on—for engaging in discussions, securing funding, advocating for museums, and fostering continued growth in shared research directions.
To describe the hemodynamic implications of anaesthetic choice among children with heart disease undergoing cardiac catheterisation.
Methods:
Study 1 was a secondary analysis of data obtained during catheterisation-based hemodynamic assessment of infants with hypoplastic left heart syndrome following Stage 1 palliation, randomised in the Single Ventricle Reconstruction trial. Measured and calculated hemodynamics, including pulmonary and systemic vascular resistance indexed to body surface area (PVRi and SVRi, respectively) and pulmonary/systemic blood flow (Qp/Qs), were analysed with respect to anaesthetic employed during catheterisation, classified as moderate sedation or general anaesthesia. Study 2 consisted of a single centre, prospective analysis of patients requiring percutaneous closure of a patent ductus arteriosus or endomyocardial biopsy after orthotopic heart transplant. Participants underwent hemodynamic assessment first using inhaled volatile anaesthesia (IA), and then transitioned to total intravenous anaesthesia, comparing hemodynamic measures with respect to anaesthetic approach.
Results:
In Study 1, independent of shunt type, PVRi, and patient size, moderate sedation was associated with a greater than two-fold odds of a Qp/Qs >1 (OR 2.12, 95%CI 1.18–3.87, p = 0.013). In Study 2, while PVRi was similar, SVRi was significantly higher using total intravenous anaesthesia. Among the patent ductus arteriosus subgroup, Qp/Qs increased significantly with total intravenous anaesthesia relative to IA (p = 0.003); additionally, among the orthotopic heart transplant subgroup, left ventricular end diastolic pressure increased following a transition to total intravenous anaesthesia (p = 0.002).
Conclusions:
Analyses of hemodynamics during catheterisation support a significant impact of anaesthetic type on hemodynamic values including SVRi, left ventricular end diastolic pressure, and Qp/Qs. Anaesthesia choice and intraprocedural management of SVRi are important considerations when making clinical decisions based on hemodynamic data.
Functional impairment in daily activities, such as work and socializing, is part of the diagnostic criteria for major depressive disorder and most anxiety disorders. Despite evidence that symptom severity and functional impairment are partially distinct, functional impairment is often overlooked. To assess whether functional impairment captures diagnostically relevant genetic liability beyond that of symptoms, we aimed to estimate the heritability of, and genetic correlations between, key measures of current depression symptoms, anxiety symptoms, and functional impairment.
Methods
In 17,130 individuals with lifetime depression or anxiety from the Genetic Links to Anxiety and Depression (GLAD) Study, we analyzed total scores from the Patient Health Questionnaire-9 (depression symptoms), Generalized Anxiety Disorder-7 (anxiety symptoms), and Work and Social Adjustment Scale (functional impairment). Genome-wide association analyses were performed with REGENIE. Heritability was estimated using GCTA-GREML and genetic correlations with bivariate-GREML.
Results
The phenotypic correlations were moderate across the three measures (Pearson’s r = 0.50–0.69). All three scales were found to be under low but significant genetic influence (single-nucleotide polymorphism-based heritability [h2SNP] = 0.11–0.19) with high genetic correlations between them (rg = 0.79–0.87).
Conclusions
Among individuals with lifetime depression or anxiety from the GLAD Study, the genetic variants that underlie symptom severity largely overlap with those influencing functional impairment. This suggests that self-reported functional impairment, while clinically relevant for diagnosis and treatment outcomes, does not reflect substantial additional genetic liability beyond that captured by symptom-based measures of depression or anxiety.
Background: Our prior six-year review (n=2165) revealed 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted QI interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Out of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-op, showing a significant reduction in ED visits from 24% to 11.6%, and decreased overall ED utilization from 42.1% to 16.6% (when accounting for multiple visits by the same patient). Early interventions, including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were “highly satisfied” and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50% through pre-operative education, focused QI initiatives, and an individualized, proactive approach.
To estimate the cost-effectiveness of methicillin-resistant Staphylococcus aureus (MRSA) nares polymerase chain reaction (PCR) use in pediatric pneumonia and tracheitis.
Methods:
We built a cost-effectiveness model based on MRSA prevalence and probability of empiric treatment for MRSA pneumonia or tracheitis, with all parameters varied in sensitivity analyses. The hypothetical patient cohort was <18 years of age and hospitalized in the pediatric intensive care unit for community-acquired pneumonia (CAP) or tracheitis. Two strategies were compared: MRSA nares PCR-guided antibiotic therapy versus usual care. The primary measure was cost per incorrect treatment course avoided. Length of stay and hospital costs unrelated to antibiotic costs were assumed to be the same regardless of PCR use. Both literature data and expert estimates informed sensitivity analysis ranges.
Results:
With a health care system willingness-to-pay threshold for PCR testing of $140 per incorrect treatment course avoided (varied in sensitivity analyses), reflecting the estimated additional cost of MRSA-targeted antibiotics, and a true MRSA nares PCR cost of $64, PCR testing was generally favored if the likelihood of empiric MRSA treatment was >52%. PCR was not favored in some scenarios when MRSA infection prevalence and the likelihood of empiric MRSA treatment were varied simultaneously. Screening became less favorable as MRSA PCR cost increased to the highest value in the parameter range ($88). Individually varying MRSA colonization rates over wide ranges (0%–30%) had lesser effects on results.
Conclusions:
MRSA nares PCR use in hospitalized pediatric patients with CAP or tracheitis was generally favored when empiric MRSA treatment rates were moderate or high.
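The decision logic summarized above can be sketched as a simplified back-of-envelope calculation. This is a hypothetical illustration, not the study's actual decision model: the negative predictive value, per-test cost, and willingness-to-pay figures below are assumed values chosen only to show how the comparison works.

```python
# Hypothetical sketch of a PCR-guided vs. usual-care cost comparison.
# All parameter values are illustrative assumptions, not the study's inputs.

def expected_cost_per_course_avoided(p_empiric, npv=0.99, pcr_cost=64.0):
    """Cost per incorrect MRSA treatment course avoided by PCR-guided therapy.

    p_empiric -- probability of empiric MRSA treatment under usual care
    npv       -- assumed negative predictive value of the nares PCR
    pcr_cost  -- assumed per-test PCR cost (USD)

    Under usual care, empirically treated patients without MRSA receive an
    unnecessary course; a negative PCR averts those courses.
    """
    courses_avoided = p_empiric * npv  # unnecessary courses averted per patient
    if courses_avoided == 0:
        return float("inf")
    return pcr_cost / courses_avoided

# PCR testing is favored when the cost per avoided course falls below the
# willingness-to-pay threshold ($140 per incorrect course in the base case).
wtp = 140.0
for p in (0.3, 0.52, 0.8):
    cost = expected_cost_per_course_avoided(p)
    print(f"empiric likelihood {p:.0%}: ${cost:.0f} per course avoided, "
          f"{'favored' if cost <= wtp else 'not favored'}")
```

With these assumed parameters, testing flips from unfavorable to favorable as the empiric treatment likelihood rises; the study's full model varies additional parameters (prevalence, colonization rates) that shift the exact breakeven point.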
This study evaluated Medicaid claims (MC) data as a valid source for outpatient antimicrobial stewardship programs (ASPs) by comparing it to electronic medical record (EMR) data from a single academic center.
Methods:
This retrospective study compared pediatric patients’ MC data with EMR data from the Marshall Health Network (MHN). Claims were matched to EMR records based on patient Medicaid ID, service date, and provider NPI number. Demographics, antibiotic choice, diagnosis appropriateness, and guideline concordance were assessed across both data sources.
Setting:
The study was conducted within the MHN, involving multiple pediatric and family medicine outpatient practices in West Virginia, USA.
Patients:
Pediatric patients receiving care within MHN with Medicaid coverage.
Results:
MC and EMR data showed >90% agreement in antibiotic choice, gender, and date of service. Discrepancies were observed in diagnoses, especially for visits with multiple infectious diagnoses. MC data demonstrated similar accuracy to EMR data in identifying inappropriate prescriptions and assessing guideline concordance. Additionally, MC data provided timely information, enhancing the feasibility of impactful outpatient ASP interventions.
Conclusion:
MC data is a valid and timely resource for outpatient ASP interventions. Insurance providers should be leveraged as key partners to support large-scale outpatient stewardship efforts.
In 2010, USAID catalyzed the formation of One Health University Networks as part of a holistic response designed to promote the One Health approach for addressing complex health challenges. This globally connected One Health University network now includes the African One Health University Network (AFROHUN) and the Southeast Asia University Network (SEAOHUN) and has representation from over 120 universities in 17 countries across Africa and Southeast Asia. Over more than 15 years of USAID investment, these networks have trained more than 85,000 students, in-service professionals and faculty around the world in One Health principles and collaborative problem solving, grounded in One Health core competencies. These One Health practitioners have gone on to contribute to improved global health security in their communities and countries. The evolution and maturation of these networks is a testament to a strong vision and dedication to the task by leadership and donors. As the global academic community continues to refine and adapt training methodologies for ‘future ready’ individuals, resources and examples from One Health University Networks stand as a legacy to build upon.
Posttraumatic stress disorder (PTSD) has been associated with advanced epigenetic age cross-sectionally, but the association between these variables over time is unclear. This study conducted meta-analyses to test whether new-onset PTSD diagnosis and changes in PTSD symptom severity over time were associated with changes in two metrics of epigenetic aging over two time points.
Methods
We conducted meta-analyses of the association between change in PTSD diagnosis and symptom severity and change in epigenetic age acceleration/deceleration (age-adjusted DNA methylation age residuals as per the Horvath and GrimAge metrics) using data from 7 military and civilian cohorts participating in the Psychiatric Genomics Consortium PTSD Epigenetics Workgroup (total N = 1,367).
Results
Meta-analysis revealed that the interaction between Time 1 (T1) Horvath age residuals and new-onset PTSD over time was significantly associated with Horvath age residuals at T2 (meta β = 0.16, meta p = 0.02, p-adj = 0.03). The interaction between T1 Horvath age residuals and changes in PTSD symptom severity over time was significantly related to Horvath age residuals at T2 (meta β = 0.24, meta p = 0.05). No associations were observed for GrimAge residuals.
Conclusions
Results indicated that individuals who developed new-onset PTSD or showed increased PTSD symptom severity over time evidenced greater epigenetic age acceleration at follow-up than would be expected based on baseline age acceleration. This suggests that PTSD may accelerate biological aging over time and highlights the need for intervention studies to determine if PTSD treatment has a beneficial effect on the aging methylome.
Objectives/Goals: We describe the prevalence of individuals with household exposure to SARS-CoV-2, who subsequently report symptoms consistent with COVID-19, while having PCR results persistently negative for SARS-CoV-2 (S[+]/P[-]). We assess whether paired serology can assist in identifying the true infection status of such individuals. Methods/Study Population: In a multicenter household transmission study, index patients with SARS-CoV-2 were identified and enrolled together with their household contacts within 1 week of the index’s illness onset. For 10 consecutive days, enrolled individuals provided daily symptom diaries and nasal specimens for polymerase chain reaction (PCR). Contacts were categorized into 4 groups based on presence of symptoms (S[+/-]) and PCR positivity (P[+/-]). Acute and convalescent blood specimens from these individuals (30 days apart) were subjected to quantitative serologic analysis for SARS-CoV-2 anti-nucleocapsid, spike, and receptor-binding domain antibodies. The antibody change in S[+]/P[-] individuals was assessed by thresholds derived from receiver operating characteristic (ROC) analysis of S[+]/P[+] (infected) versus S[-]/P[-] (uninfected). Results/Anticipated Results: Among 1,433 contacts, 67% had ≥1 SARS-CoV-2 PCR[+] result, while 33% remained PCR[-]. Among the latter, 55% (n = 263) reported symptoms for at least 1 day, most commonly congestion (63%), fatigue (63%), headache (62%), cough (59%), and sore throat (50%). A history of both previous infection and vaccination was present in 37% of S[+]/P[-] individuals, 38% of S[-]/P[-], and 21% of S[+]/P[+] (P<0.05). Vaccination alone was present in 37%, 41%, and 52%, respectively. ROC analyses of paired serologic testing of S[+]/P[+] (n = 354) vs. S[-]/P[-] (n = 103) individuals found anti-nucleocapsid data had the highest area under the curve (0.87).
Based on the 30-day antibody change, 6.9% of S[+]/P[-] individuals demonstrated an increased convalescent antibody signal, although a similar seroresponse was observed in 7.8% of the S[-]/P[-] group. Discussion/Significance of Impact: Reporting respiratory symptoms was common among household contacts with persistent PCR[-] results. Paired serology analyses found similar seroresponses between S[+]/P[-] and S[-]/P[-] individuals. The symptomatic-but-PCR-negative phenomenon, while frequent, is unlikely attributable to true SARS-CoV-2 infections missed by PCR.
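The ROC-derived threshold idea described above can be sketched in a few lines: choose the antibody-change cutoff that maximizes Youden's J (sensitivity + specificity − 1) when separating the infected (S[+]/P[+]) and uninfected (S[-]/P[-]) groups. The antibody-change values below are invented for illustration; the study's actual thresholds came from its own serologic data.

```python
# Minimal sketch of picking a classification cutoff by Youden's J.
# All antibody-change values are hypothetical, not study data.

def best_threshold(infected, uninfected):
    """Return (threshold, J) maximizing Youden's J over observed cutoffs.

    A value >= threshold is called seropositive. Sensitivity is computed on
    the infected group, specificity on the uninfected group.
    """
    candidates = sorted(set(infected) | set(uninfected))
    best = (None, -1.0)
    for t in candidates:
        sens = sum(x >= t for x in infected) / len(infected)
        spec = sum(x < t for x in uninfected) / len(uninfected)
        j = sens + spec - 1
        if j > best[1]:
            best = (t, j)
    return best

infected = [1.8, 2.4, 3.1, 0.9, 2.7]    # hypothetical 30-day antibody changes
uninfected = [0.1, 0.3, 0.2, 1.0, 0.4]
threshold, j = best_threshold(infected, uninfected)
print(threshold, j)
```

In practice a library routine (e.g. scikit-learn's `roc_curve`) would be used on the full antibody panel, but the underlying threshold selection follows this same logic.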
Since cannabis was legalized in Canada in 2018, its use among older adults has increased. Although cannabis may exacerbate cognitive impairment, there are few studies on its use among older adults being evaluated for cognitive disorders.
Methods:
We analyzed data from 238 patients who attended a cognitive clinic between 2019 and 2023 and provided data on cannabis use. Health professionals collected information using a standardized case report form.
Results:
Cannabis use was reported by 23 out of 238 patients (9.7%): 12 took cannabis for recreation, 8 for medicinal purposes and 3 for both purposes. Compared to non-users, cannabis users were younger (mean ± SD 62.0 ± 7.5 vs 68.9 ± 9.5 years; p = 0.001), more likely to have a mood disorder (p < 0.05) and be current or former cigarette smokers (p < 0.05). There were no significant differences in sex, race or education. The proportion with dementia compared with pre-dementia cognitive states did not differ significantly in users compared with non-users. Cognitive test scores were similar in users compared with non-users (Montreal Cognitive Assessment: 20.4 ± 5.0 vs 20.7 ± 4.5, p = 0.81; Folstein Mini-Mental State Examination: 24.5 ± 5.1 vs 26.0 ± 3.6, p = 0.25). The prevalence of insomnia, obstructive sleep apnea, anxiety disorders, alcohol use or psychotic disorders did not differ significantly.
Conclusion:
The prevalence of cannabis use among patients with cognitive concerns in this study was similar to the general Canadian population aged 65 and older. Further research is necessary to investigate patients’ motivations for use and explore the relationship between cannabis use and mood disorders and cognitive decline.
There is mounting interest in the dual health and environmental benefits of plant-based diets. Such diets prioritise whole foods of plant origin and moderate (though occasionally exclude) animal-sourced foods. However, the evidence base on plant-based diets and health outcomes in Australasia is limited and diverse, making it unsuitable for systematic review. This review aimed to assess the current state of play, identify research gaps and suggest good practice recommendations. The consulted evidence base included key studies on plant-based diets and cardiometabolic health or mortality outcomes in Australian and New Zealand adults. Most studies were observational, conducted in Australia, published within the last decade, and relied on a single dietary assessment about 10–30 years ago. Plant-based diets were often examined using categories of vegetarianism, intake of plant or animal protein, or dietary indices. Health outcomes included mortality, type 2 diabetes and insulin resistance, obesity, CVD and metabolic syndrome. While Australia has an emerging and generally favourable evidence base on plant-based diets and health outcomes, New Zealand’s evidence base is still nascent. The lack of similar studies hinders the ability to judge the overall certainty of evidence, which could otherwise inform public health policies and strategies without relying on international studies with unconfirmed applicability. The proportional role of plant- and animal-sourced foods in healthy, sustainable diets in Australasia is an underexplored research area with potentially far-reaching implications, especially concerning nutrient adequacy and the combined health and environmental impacts.
To facilitate and sustain community-engaged research (CEnR) conducted by academic-community partnerships (ACPs), a Clinical Translational Science Award (CTSA)-funded Community Engagement Core (CEC) and Community Partner Council (CPC) co-created two innovative microgrant programs. The Community Health Grant (CHG) and the Partnership Development Grant (PDG) programs are designed to specifically fund ACPs conducting pilot programs aimed at improving health outcomes. Collectively, these programs have engaged 94 community partner organizations while impacting over 55,000 individuals and leveraging $1.2 million in program funding into over $10 million from other grants and awards. A cross-sectional survey of 57 CHG awardees demonstrated high overall satisfaction with the programs and indicated that participation addressed barriers to CEnR, such as building trust in research and improving partnership and program sustainability. The goals of this paper are to (1) describe the rationale and development of the CHG and PDG programs; (2) examine their feasibility, impact, and sustainability; and (3) share lessons learned and best practices. Institutions seeking to implement similar programs should focus on integrating community partners throughout the design and review processes and prioritizing projects that align with specific, measurable goals.
The excavation of a stratified sequence of deposits spanning the Initial Late Formative period (250 BC–AD 120) at Iruhito, in the upper Desaguadero Valley of Bolivia, provides insight into this previously unrecognized, four-century period separating the well-documented Middle Formative (800–250 BC) from the Late Formative (~AD 120–590) period. By tracking subtle shifts in ceramic, architectural, lithic, and faunal data, we can explore tempos of change in social life during this dynamic time. These data lead us to suggest that, rather than being a “transitional” period or a “hiatus” in regional occupation, the Initial Late Formative period was a distinct mode of sociality characterized by the realignment and expansion of interaction networks, on the one hand, and rejection of the decorative aesthetics, monumentality, and public-oriented performances of earlier periods, on the other. We argue that the Late Formative period centers emerging after ~AD 120 intentionally cited architecture and aesthetics that were distant in time and space, constituting a sophisticated political strategy. Finally, these data suggest that the chronological schemata we use to build regional histories often obscure social variability.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
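The polygenic risk score (PRS) approach used above can be illustrated in miniature: a PRS is a weighted sum of an individual's risk-allele dosages, with weights taken from GWAS effect sizes. The variants, weights, and genotypes below are made-up examples, not values from the study.

```python
# Illustrative polygenic risk score: sum of (GWAS effect size x allele dosage).
# All numbers are hypothetical examples, not estimates from any real cohort.

def polygenic_risk_score(dosages, weights):
    """PRS = sum over variants of effect size times risk-allele dosage (0/1/2)."""
    return sum(w * d for w, d in zip(weights, dosages))

# Hypothetical per-variant log-odds effect sizes from a case-case GWAS
weights = [0.12, -0.05, 0.30, 0.08]
# One individual's risk-allele counts at the same four variants
dosages = [2, 1, 0, 1]

print(round(polygenic_risk_score(dosages, weights), 2))  # 0.27
```

Real pipelines add steps this sketch omits (linkage-disequilibrium pruning or shrinkage of weights, p-value thresholding, ancestry-matched standardization), but the score itself is this simple weighted sum, which is then tested for its ability to separate diagnostic groups.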