A decisive voter’s exact ideological preferences can be hard to predict, even for seasoned candidates. We develop a novel theory of electoral competition in which candidates are evaluated on both ideological and nonideological dimensions. The key feature of our theory is that an electorate’s partisan leaning serves as a signal of the median voter’s ideological position, with extreme leanings being more informative about voters than centrist leanings. We show that this leads to an endogenous sorting of districts between “extreme” and “centrist” and that an increase in the importance of candidate competence for voters increases polarization—but only in extreme districts. We evaluate our theory using data from mayoral elections in Brazil’s 95 largest municipalities and exploit COVID-19 as a shock to the salience of candidate competence. We show that COVID-19 increases the salience of competence in these elections, leading to increased political polarization that is concentrated in cities with extreme partisan leanings.
Validate a public health model identifying patients at high risk for carbapenem-resistant Enterobacterales (CRE) on admission and evaluate performance across a healthcare network.
Design:
Retrospective case-control studies
Participants:
Adults hospitalized with a clinical CRE culture within 3 days of admission (cases) and those hospitalized without a CRE culture (controls).
Methods:
Using public health data from Atlanta, GA (1/1/2016–9/1/2019), we validated a CRE prediction model created in Chicago. We then closely replicated this model using clinical data from a healthcare network in Atlanta (1/1/2015–12/31/2021) (“Public Health Model”) and optimized performance by adding variables from the healthcare system (“Healthcare System Model”). We frequency-matched cases and controls based on year and facility. We evaluated model performance in validation datasets using area under the curve (AUC).
Results:
Using public health data, we matched 181 cases to 764,408 controls, and the Chicago model performed well (AUC 0.85). Using clinical data, we matched 91 cases to 384,013 controls. The Public Health Model included age, prior infection diagnosis, and the number of, and mean length of stay in, acute care hospitalizations (ACH) in the prior year. The final Healthcare System Model added Elixhauser score, antibiotic days of therapy in the prior year, diabetes, and admission to the intensive care unit in the prior year, and removed the number of prior ACH. The AUC increased from 0.68 to 0.73.
Conclusions:
A CRE risk prediction model using prior healthcare exposures performed well in a geographically distinct area and in an academic healthcare network. Adding variables from healthcare networks improved model performance.
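The validation workflow described above (a multivariable logistic regression on frequency-matched case-control data, scored by AUC on a separate dataset) can be sketched as follows. This is a minimal illustration, not the study’s code; the column names are assumptions rather than the study’s actual schema.

```python
# Minimal sketch of the validation workflow: fit a logistic regression on
# prior-healthcare-exposure predictors and score discrimination with AUC
# on a held-out validation dataset. Column names are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Public Health Model predictors: age, prior infection diagnosis, and the
# number of and mean length of stay in prior-year acute care hospitalizations.
PUBLIC_HEALTH_VARS = ["age", "prior_infection_dx", "n_ach_prior_yr", "mean_ach_los_prior_yr"]

def fit_and_validate(train: pd.DataFrame, test: pd.DataFrame, predictors: list) -> float:
    """Fit on one matched dataset and return the AUC on another."""
    model = LogisticRegression(max_iter=1000)
    model.fit(train[predictors], train["is_cre_case"])
    risk = model.predict_proba(test[predictors])[:, 1]
    return roc_auc_score(test["is_cre_case"], risk)
```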
Honeybees (Apis mellifera) and native bee species have ecological, economic, social, and cultural importance to smallholder coffee farmers. While the ecological contributions of bees to the sustainability of coffee systems are well documented, particularly in relation to the coffee crop, fewer studies have examined socio-economic dimensions of beekeeping for honey as an agroecological diversification strategy for coffee producers. Yet, understanding the multiple values of different diversification strategies is important as many coffee farmers in different parts of the world are finding it increasingly difficult to make a living on coffee alone and are adopting alternative strategies, such as on-farm diversification. In this Participatory Action Research (PAR) study, we examined the opportunities, limitations, and trade-offs of beekeeping (with A. mellifera) as an agroecological diversification option for smallholder coffee farmers in Chiapas, Mexico. We applied a mixed-methods approach, which consisted of monthly surveys with 25 beekeepers of Campesinos Ecológicos de la Sierra Madre de Chiapas (CESMACH)/Apicultores Miel Real del Triunfo (ART) producer cooperatives for 12 months and five focus groups between 2018 and 2019. We found that beekeeping is less labor-intensive than coffee, and there are opportunities to integrate beekeeping into the annual farming cycle of coffee and maize production without causing competing labor demands or additional time pressures. We also found that beekeeping could generate economic gains for peasant families; however, profitability hinged on various factors, such as the price for honey, yield per hive, and the number of beehives. Our results further show that beekeeping yielded multiple non-monetary benefits by contributing to the nutrition and health of farmer families and their communities, serving as a vehicle for horizontal learning and relationship building, and contributing to the emotional well-being of beekeepers. Finally, producers who hoped to gain economically from beekeeping were generally interested in growing their apiaries but expressed concerns about limited technical knowledge and the impacts of climate change. Given the multiple social, economic, and ecological benefits of beekeeping, it has great promise as a part of agroecological food and farming systems. We argue that efforts to promote beekeeping as a diversification strategy should take a holistic approach, underscoring the potential of apiculture to enhance the well-being and resilience of beekeeping families and strengthen food sovereignty and local economies (including solidarity economies) in peasant communities. These findings can be useful in supporting beekeepers and their organizations in strategic planning for enhancing the long-term sustainability of beekeeping.
Product architecture decisions are made early in the product development process and have far-reaching effects. Unless anticipated through experience or intuition, many of these effects may not be apparent until much later in the development process, making changes to the architecture costly in time, effort, and resources. Many researchers through the years have studied various elements of product architecture and their effects. Using a repeatable process, statements on the effects of architecture strategies are aggregated from a selection of the literature and stored in a systematic database, from which they can be recalled and presented in the form of a Product Architecture Strategy and Effect (PASE) matrix. PASE matrices allow for the identification, comparison, evaluation, and selection of the most desirable product architecture strategies before resources are expended along a specific development path. This paper introduces the PASE Database and matrix and describes their construction and use in guiding design decisions. This paper also provides metrics for understanding the robustness of this database.
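To make the database-to-matrix idea concrete, here is a minimal sketch of how strategy-effect statements might be stored and aggregated into a PASE-style matrix. The field names and scoring are assumptions for illustration, not the published PASE schema.

```python
# Hypothetical sketch: store literature statements linking an architecture
# strategy to an effect, then aggregate them into a (strategy, effect)
# matrix that can be queried before committing to a development path.
from dataclasses import dataclass

@dataclass
class StrategyEffectStatement:
    strategy: str    # e.g., "modular architecture" (illustrative)
    effect: str      # e.g., "ease of variant creation" (illustrative)
    direction: int   # +1 = strategy improves effect, -1 = degrades it
    source: str      # citation the statement was extracted from

def build_pase_matrix(statements):
    """Aggregate statements into {(strategy, effect): net direction score}."""
    matrix = {}
    for s in statements:
        key = (s.strategy, s.effect)
        matrix[key] = matrix.get(key, 0) + s.direction
    return matrix
```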
Efficient evidence generation to assess the clinical and economic impact of medical therapies is critical amid rising healthcare costs and aging populations. However, drug development and clinical trials remain far too expensive and inefficient for all stakeholders. On October 25–26, 2023, the Duke Clinical Research Institute brought together leaders from academia, industry, government agencies, patient advocacy, and nonprofit organizations to explore how different entities and influencers in drug development and healthcare can realign incentive structures to efficiently accelerate evidence generation that addresses the highest public health needs. Prominent themes surfaced, including competing research priorities and incentives, inadequate representation of patient populations in clinical trials, opportunities to better leverage existing technology and infrastructure in trial design, and a need for heightened transparency and accountability in research practices. The group determined that together these elements contribute to an inefficient and costly clinical research enterprise, amplifying disparities in population health and sustaining gaps in evidence that impede advancements in equitable healthcare delivery and outcomes. The goal of addressing the identified challenges is to ultimately make clinical trials faster, more inclusive, and more efficient across diverse communities and settings.
Background: Prompt identification of patients colonized or infected with carbapenem-resistant Enterobacterales (CRE) upon admission can help ensure rapid initiation of infection prevention measures and may reduce intrafacility transmission of CRE. The Chicago CDC Prevention Epicenters Program previously created a CRE prediction model using state-wide public health data (doi: 10.1093/ofid/ofz483). We evaluated how well a similar model performed using data from a single academic healthcare system in Atlanta, Georgia, and we sought to determine whether including additional variables improved performance. Methods: We performed a case–control study using electronic medical record data. We defined cases as adult admissions to acute-care hospitals in a 4-hospital academic healthcare system from January 1, 2014, to December 31, 2021, with CRE identified from a clinical culture within the first 3 hospital days. Only the first qualifying encounter per patient was included. We frequency matched cases to control admissions (no CRE identified) from the same hospital and year. Using multivariable logistic regression, we compared 2 models. The “public health model” included 4 variables from the Chicago Epicenters model (age, number of hospitalizations in the prior 365 days, mean length of stay in hospitalizations in the prior 365 days, and hospital admission with an infection diagnosis in the prior 365 days). The “healthcare system model” added 4 variables (admission to the ICU in the prior 365 days, malignancy diagnosis, Elixhauser score, and inpatient antibiotic days of therapy in the prior 365 days) to the public health model. We used billing codes to determine Elixhauser score, malignancy status, and recent infection diagnoses. We compared model performance using the area under the receiver operating characteristic curve (AUC). Results: We identified 105 cases and 441,460 controls (Table 1). CRE was most frequently identified in urine cultures (46%). The 4 variables in the public health model and the 4 additional variables in the healthcare system model were all significantly associated with case status in unadjusted analyses (Table 1). The AUC for the public health model was 0.76, and the AUC for the healthcare system model was 0.79 (Table 2; Fig. 1). In both models, a prior admission with an infection diagnosis was the most significant risk factor. Conclusions: A modified CRE prediction model developed using public health data and focused on prior healthcare exposures performed reasonably well when applied to a different academic healthcare system. The addition of variables accessible in large healthcare networks did not meaningfully improve model discrimination.
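The frequency-matching step described above can be sketched as follows: controls are sampled so that their joint (hospital, year) distribution mirrors that of the cases. The matching ratio and column names are illustrative assumptions, not the study’s design.

```python
# Hypothetical sketch of frequency matching by hospital and year: sample
# controls within each stratum in proportion to the cases in that stratum.
import pandas as pd

def frequency_match(cases: pd.DataFrame, controls: pd.DataFrame,
                    keys=("hospital", "year"), ratio=4, seed=0) -> pd.DataFrame:
    """Return a control sample whose (hospital, year) counts track the cases'."""
    counts = cases.groupby(list(keys)).size()
    matched = [
        grp.sample(n=min(len(grp), ratio * counts.get(name, 0)), random_state=seed)
        for name, grp in controls.groupby(list(keys))
    ]
    return pd.concat(matched)
```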
Potato producers in Canada’s Atlantic provinces of Prince Edward Island (PE) and New Brunswick rely on photosystem II (PSII)-inhibiting herbicides to provide season-long weed control. Despite this, a high proportion of common lambsquarters populations in the region have been identified as resistant to this class of herbicides. Crop-topping is a late-season weed management practice that exploits the height differential between weeds and a developing crop canopy. Two field experiments were conducted in Harrington, PE, in 2020 and 2021, each evaluating the efficacy of a different crop-topping strategy (above-canopy mowing or wick-applied glyphosate), applied at two potato phenological stages, on common lambsquarters viable seed production and potato yield and quality. Mowing common lambsquarters postflowering decreased viable seed production (72% to 91%) in 2020 but increased seed production (78% to 278%) in 2021. Mowing had minimal impact on potato marketable yield across cultivars in both years. In contrast, treating common lambsquarters with wick-applied glyphosate had variable impacts on seed output in 2020 but dramatically reduced seed production (up to 95%) in 2021 when treatments were applied preflowering. Glyphosate damage to potato tubers was not influenced by application timing and resulted in a 14% to 15% increase in culled tubers due to black spotting and rot. Our results highlight the importance of potato and common lambsquarters phenology when selecting a crop-topping strategy and demonstrate that above-canopy mowing and wick-applied glyphosate can be utilized for seedbank management of herbicide-resistant common lambsquarters in potato production systems.
Arid regions are especially vulnerable to climate change and land use. More than one-third of Earth's population relies on these ecosystems. Modern observations lack the temporal depth to determine vegetation responses to climate and human activity, but paleoecological and archaeological records can be used to investigate these relationships. Decreasing rainfall across the Late Holocene provides a case study for vegetation response to changing hydroclimate. Rock hyrax (Procavia capensis) middens preserve paleoenvironmental indicators in arid environments where traditional archives are unavailable. Pollen from modern middens collected in Dhofar, Oman, demonstrates the reliability of this archive. Pollen, stable isotope (δ13C, δ15N), and microcharcoal data from fossil middens reveal changes in vegetation, relative moisture, and fire from 4000 cal yr BP to the present. Trees limited to moister areas (e.g., Terminalia) today existed farther inland at ~3100 cal yr BP. After ~2900 cal yr BP, taxa with more xeric affiliations (e.g., Senegalia) had increased. Coprophilous fungal spores (Sporormiella) and grazing indicator pollen revealed an amplified signal of domesticate grazing at ~1000 cal yr BP. This indicates that trees associated with semiarid environments were maintained in the interior desert during ~3000–4000 yr of decreasing rainfall and that impacts of human activity intensified after the transition to a drier environment.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Product architecture decisions are made early in the product development process and have far-reaching effects. Unless anticipated through experience or intuition, many of these effects may not be apparent until much later in the development process, making changes to the architecture costly in time, effort, and resources. Many researchers through the years have studied various elements of product architecture and their effects. By aggregating observations on the effects of architecture strategies from a selection of the literature on the topic and storing them in a systematic data set, this information can be recalled in a matrix structure that allows for the identification, comparison, evaluation, and selection of the most desirable product architecture strategies before expending resources along any development path. This paper introduces this matrix, referred to as the Product Architecture Strategy and Effect (PASE) Matrix, describes how to construct one, and demonstrates its use.
The causal impacts of recreational cannabis legalization are not well understood due to the number of potential confounds. We sought to quantify possible causal effects of recreational cannabis legalization on substance use, substance use disorder, and psychosocial functioning, and whether vulnerable individuals are more susceptible to the effects of cannabis legalization than others.
Methods
We used a longitudinal, co-twin control design in 4043 twins (N = 240 pairs discordant on residence), first assessed in adolescence and now aged 24–49, currently residing in states with different cannabis policies (40% resided in a recreationally legal state). We tested the effect of legalization on outcomes of interest and whether legalization interacts with established vulnerability factors (age, sex, or externalizing psychopathology).
Results
In the co-twin control design accounting for earlier cannabis frequency and alcohol use disorder (AUD) symptoms, respectively, the twin living in a recreational state used cannabis on average more often (βw = 0.11, p = 1.3 × 10−3) and had fewer AUD symptoms (βw = −0.11, p = 6.7 × 10−3) than their co-twin living in a non-recreational state. Cannabis legalization was associated with no other adverse outcome in the co-twin design, including cannabis use disorder. No risk factor significantly interacted with legalization status to predict any outcome.
Conclusions
Recreational legalization was associated with increased cannabis use and decreased AUD symptoms but was not associated with other maladaptations. These effects were maintained within twin pairs discordant for residence. Moreover, vulnerabilities to cannabis use were not exacerbated by the legal cannabis environment. Future research may investigate causal links between cannabis consumption and outcomes.
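The core co-twin contrast is a paired comparison within pairs discordant for residence, so that shared genetic and family-background factors cancel out. A minimal sketch follows; the actual analysis also adjusted for earlier cannabis frequency and AUD symptoms, and the column names are assumptions.

```python
# Minimal sketch of the within-pair (co-twin control) contrast. Restrict
# to pairs discordant on residence, then compare the twin in a recreational
# state against their co-twin. Column names are illustrative assumptions.
import pandas as pd
from scipy import stats

def within_pair_test(df: pd.DataFrame, outcome: str):
    """df: one row per twin with 'pair_id', 'legal' (1 = recreational state,
    0 = non-recreational), and outcome columns; discordant pairs only."""
    wide = df.pivot(index="pair_id", columns="legal", values=outcome).dropna()
    # Paired test: twin in a recreational state vs. co-twin elsewhere.
    return stats.ttest_rel(wide[1], wide[0])
```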
The rise of jawed vertebrates (gnathostomes) and extinction of nearly all jawless vertebrates (agnathans) is one of the most important transitions in vertebrate evolution, but the causes are poorly understood. Competition between agnathans and gnathostomes during the Devonian period is the most commonly hypothesized cause; however, no formal attempts to test this hypothesis have been made. Generally, competition between species increases as morphological similarity increases; therefore, this study uses the largest morphometric comparison to date of Silurian and Devonian agnathan and gnathostome groups to determine which groups were most and least likely to have competed. Five agnathan groups (Anaspida, Heterostraci, Osteostraci, Thelodonti, and Furcacaudiformes) were compared with five gnathostome groups (Acanthodii, Actinopterygii, Chondrichthyes, Placodermi, and Sarcopterygii) including taxa from most major orders. Morphological dissimilarity was measured by Gower's dissimilarity coefficient, and the differences between agnathan and gnathostome body forms across early vertebrate morphospace were compared using principal coordinate analysis. Our results indicate that competition between some agnathans and gnathostomes is plausible, but not all agnathan groups were similar to gnathostomes. Furcacaudiformes (fork-tailed thelodonts) are distinct from other early vertebrate groups and the least likely to have competed with other groups.
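The two analysis steps named above, Gower's dissimilarity and principal coordinate analysis, can be sketched for continuous traits as follows. The full Gower coefficient also accommodates categorical characters, which this simplified version omits.

```python
# Sketch of the morphospace pipeline named above, simplified to continuous
# traits: Gower dissimilarity (range-normalized mean absolute difference)
# followed by principal coordinate analysis (classical MDS).
import numpy as np

def gower_continuous(X: np.ndarray) -> np.ndarray:
    """X: (n_taxa, n_traits). Assumes every trait varies (nonzero range)."""
    rng = X.max(axis=0) - X.min(axis=0)
    diffs = np.abs(X[:, None, :] - X[None, :, :]) / rng
    return diffs.mean(axis=2)

def pcoa(D: np.ndarray, k: int = 2) -> np.ndarray:
    """Classical MDS: double-center -D**2/2 and keep the top k axes."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered matrix
    vals, vecs = np.linalg.eigh(B)
    top = np.argsort(vals)[::-1][:k]         # largest eigenvalues first
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))
```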
Despite extensive paleoenvironmental research on the postglacial history of the Kenai Peninsula, Alaska, uncertainties remain regarding the region's deglaciation, vegetation development, and past hydroclimate. To elucidate this complex environmental history, we present new proxy datasets from Hidden and Kelly lakes, located in the eastern Kenai lowlands at the foot of the Kenai Mountains, including sedimentological properties (magnetic susceptibility, organic matter, grain size, and biogenic silica), pollen and macrofossils, diatom assemblages, and diatom oxygen isotopes. We use a simple hydrologic and isotope mass balance model to constrain interpretations of the diatom oxygen isotope data. Results reveal that glacier ice retreated from Hidden Lake's headwaters by ca. 13.1 cal ka BP, and that groundwater was an important component of Kelly Lake's hydrologic budget in the Early Holocene. As the forest developed and the climate became wetter in the Middle to Late Holocene, Kelly Lake reached or exceeded its modern level. In the last ca. 75 years, rising temperature caused rapid changes in biogenic silica content and diatom oxygen isotope values. Our findings demonstrate the utility of mass balance modeling to constrain interpretations of paleolimnologic oxygen isotope data, and that groundwater can exert a strong influence on lake water isotopes, potentially confounding interpretations of regional climate.
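As an illustration of the kind of isotope mass balance used to constrain the interpretations above, the sketch below solves a steady-state lake balance in which inflow (precipitation plus groundwater) equals outflow plus evaporation. The evaporate's isotopic offset and all parameter values are deliberately simplified assumptions, not the study's calibrated model.

```python
# Hypothetical steady-state lake-water d18O balance: inflow carries a
# mixture of precipitation and groundwater signatures; evaporation removes
# isotopically depleted water, enriching the lake. Strongly simplified.
def steady_state_lake_d18O(d18O_precip: float, d18O_ground: float,
                           f_ground: float, f_evap: float,
                           evaporate_offset: float = -12.0) -> float:
    """f_ground: groundwater fraction of inflow; f_evap: evaporated fraction
    of inflow; evaporate_offset: d18O of evaporate relative to lake water
    (an illustrative constant standing in for fractionation physics)."""
    d_in = f_ground * d18O_ground + (1.0 - f_ground) * d18O_precip
    # Balance: d_in = (1 - f_evap)*dL + f_evap*(dL + offset)  =>  solve for dL
    return d_in - f_evap * evaporate_offset

# More evaporation (larger f_evap) yields a more enriched lake, while a
# larger groundwater fraction pulls the lake toward the groundwater value.
```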
To determine the incidence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among healthcare personnel (HCP) and to assess occupational risks for SARS-CoV-2 infection.
Design:
Prospective cohort of healthcare personnel (HCP) followed for 6 months from May through December 2020.
Setting:
Large academic healthcare system including 4 hospitals and affiliated clinics in Atlanta, Georgia.
Participants:
HCP, including those with and without direct patient-care activities, working during the coronavirus disease 2019 (COVID-19) pandemic.
Methods:
Incident SARS-CoV-2 infections were determined through serologic testing for SARS-CoV-2 IgG at enrollment, at 3 months, and at 6 months. HCP completed monthly surveys regarding occupational activities. Multivariable logistic regression was used to identify occupational factors that increased the risk of SARS-CoV-2 infection.
Results:
Of the 304 evaluable HCP who were seronegative at enrollment, 26 (9%) seroconverted for SARS-CoV-2 IgG by 6 months. Overall, 219 participants (73%) self-identified as White race, 119 (40%) were nurses, and 121 (40%) worked on inpatient medical-surgical floors. In a multivariable analysis, HCP who identified as Black race were more likely to seroconvert than HCP who identified as White (odds ratio, 4.5; 95% confidence interval, 1.3–14.2). Increased risk for SARS-CoV-2 infection was not identified for any occupational activity, including spending >50% of a typical shift at a patient’s bedside, working in a COVID-19 unit, or performing or being present for aerosol-generating procedures (AGPs).
Conclusions:
In our study cohort of HCP working in an academic healthcare system, <10% had evidence of SARS-CoV-2 infection over 6 months. No specific occupational activities were identified as increasing risk for SARS-CoV-2 infection.
Turbulent secondary flows are defined as Prandtl's secondary flow of the first or second kind, the former produced by stretching and/or tilting of vorticity, the latter produced via spatial heterogeneity of Reynolds stresses. Both mechanisms are instantaneously active within inertia-dominated wall turbulence; Reynolds stress spatial heterogeneity is required for Reynolds-averaged secondary flows. Spanwise-variable surface roughness can induce turbulent stress spatial heterogeneity in the spanwise–wall-normal plane and provide sustenance for streamwise-aligned mean secondary flows. Herein, we demonstrate that turbulent secondary flows can also be sustained by spanwise variability in the surface heat flux in unstably stratified turbulent channels, defined hereafter as Prandtl's secondary flow of the third kind. Support for this mechanism is established with scaling arguments, while large-eddy simulation is used to model inertia-dominated channel turbulence responding to a lower boundary with uniform aerodynamic/hydrodynamic roughness but spanwise-variable surface heat flux. Transport equations for streamwise vorticity and turbulent kinetic energy, $k$, outline the conditions needed for third-kind production: shear and buoyancy production over the elevated heat flux regions necessitates lateral entrainment of low-$k$ fluid, inducing mean counter-rotating secondary cells aligned such that upwelling and downwelling occur over the high and low heat flux regions, respectively. Buoyancy-driven production of $k$ alters aggregate flow response and thus is a distinctly different mechanism responsible for sustenance of secondary flows.
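For reference, the shear and buoyancy production pathways invoked above appear as separate terms in the textbook turbulent kinetic energy budget (standard Boussinesq notation, not necessarily the paper's exact formulation):

$$\frac{\bar{D}k}{\bar{D}t} = \underbrace{-\,\overline{u_i' u_j'}\,\frac{\partial \bar{u}_i}{\partial x_j}}_{\text{shear production}} \;+\; \underbrace{\frac{g}{\theta_0}\,\overline{w'\theta'}}_{\text{buoyancy production}} \;+\; T \;-\; \varepsilon,$$

where $T$ collects the transport terms and $\varepsilon$ is the dissipation rate. A spanwise-variable surface heat flux makes the buoyancy term, $(g/\theta_0)\overline{w'\theta'}$, spanwise heterogeneous, supplying the Reynolds stress heterogeneity that sustains the proposed secondary flow of the third kind.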
To describe the cumulative seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Methods:
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
Results:
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, which was below estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct care and was 3.4% among employees who did not perform direct patient care. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with statistically significant increased risk for seropositivity.
Conclusions:
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational protection measures to maximize protection of employees within the workplace during future COVID-19 waves or other epidemics.
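A minimal sketch of the post hoc Cox proportional hazards analysis described above, using the lifelines library; the column names are illustrative assumptions, not the study's dataset.

```python
# Hypothetical sketch of the post hoc Cox model: time to seroconversion
# (or censoring) regressed on occupational and community exposures.
from lifelines import CoxPHFitter

def fit_seroconversion_model(df):
    """df: one row per employee with 'followup_days', 'seroconverted' (0/1),
    and binary covariates such as 'direct_patient_care' and
    'community_case_exposure' (names assumed for illustration)."""
    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_days", event_col="seroconverted")
    return cph  # cph.hazard_ratios_ holds HRs like those quoted above
```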
To determine the impact of an inpatient stewardship intervention targeting fluoroquinolone use on inpatient and postdischarge Clostridioides difficile infection (CDI).
Design:
We used an interrupted time series study design to evaluate the rate of hospital-onset CDI (HO-CDI), postdischarge CDI (PD-CDI) within 12 weeks, and inpatient fluoroquinolone use from 2 years prior to 1 year after a stewardship intervention.
Setting:
An academic healthcare system with 4 hospitals.
Patients:
All inpatients hospitalized between January 2017 and September 2020, excluding those discharged from locations caring for oncology, bone marrow transplant, or solid-organ transplant patients.
Intervention:
Introduction of electronic order sets designed to reduce inpatient fluoroquinolone prescribing.
Results:
Among 163,117 admissions, there were 683 cases of HO-CDI and 1,104 cases of PD-CDI. Against a background 2% month-to-month decline that began in the preintervention period (P < .01), fluoroquinolone days of therapy per 1,000 patient days fell by 21% after the intervention (level change, P < .05). HO-CDI rates were stable throughout the study period. In contrast, we detected a change in the trend of PD-CDI rates from a stable monthly rate in the preintervention period to a monthly decrease of 2.5% in the postintervention period (P < .01).
Conclusions:
Our systemwide intervention reduced inpatient fluoroquinolone use immediately, but not HO-CDI. However, a downward trend in PD-CDI occurred. Relying on outcome measures limited to the inpatient setting may not reflect the full impact of inpatient stewardship efforts.
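The interrupted time series analysis described above is conventionally fit as a segmented regression with a level-change indicator and a post-intervention trend term. A minimal sketch on a log scale, so coefficients read as approximate month-to-month percent changes; the column names are assumptions.

```python
# Hypothetical segmented-regression sketch for the interrupted time series:
# log(rate) ~ baseline trend + level change at intervention + trend change.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_its(df: pd.DataFrame):
    """df: one row per month with 'rate' (e.g., DOT per 1,000 patient days),
    'month' (0..T), 'post' (0/1 after intervention), and 'months_post'
    (0 before the intervention, then 1, 2, ...)."""
    df = df.assign(log_rate=np.log(df["rate"]))
    fit = smf.ols("log_rate ~ month + post + months_post", data=df).fit()
    # 100 * (exp(coef) - 1) approximates the percent change: 'post' is the
    # immediate level change, 'months_post' the change in monthly trend.
    return fit
```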
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together, and fewer episodes. Depressive onset correlated with suicidality, and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Higher PGSs for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
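The PGS analysis reported above amounts to regressing age at onset on a standardized polygenic score, so each coefficient reads in years per standard deviation of the score, as in the betas quoted. A minimal sketch follows; real analyses also adjust for genetic ancestry principal components, and all column names here are assumptions.

```python
# Hypothetical sketch of a PGS association test: standardize the score and
# regress age at onset (AAO) on it with basic covariates.
import pandas as pd
import statsmodels.formula.api as smf

def pgs_association(df: pd.DataFrame, pgs_col: str):
    """df: one row per patient with 'aao', a raw polygenic score column,
    'sex', and 'cohort' (illustrative covariates)."""
    z = (df[pgs_col] - df[pgs_col].mean()) / df[pgs_col].std()
    fit = smf.ols("aao ~ pgs_z + sex + C(cohort)", data=df.assign(pgs_z=z)).fit()
    return fit  # coefficient on pgs_z is in years of AAO per SD of PGS
```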