Polygenic scores (PGSs) have garnered increasing attention in the clinical sciences due to their robust prediction signals for psychopathology, including externalizing (EXT) behaviors. However, studies leveraging PGSs have rarely accounted for the phenotypic and developmental heterogeneity in EXT outcomes. We used the National Longitudinal Study of Adolescent to Adult Health (analytic N = 4,416), spanning ages 13 to 41, to examine associations between EXT PGSs and trajectories of antisocial behaviors (ASB) and substance use behaviors (SUB) identified via growth mixture modeling. Four trajectories of ASB were identified: High Decline (3.6% of the sample), Moderate (18.9%), Adolescence-Peaked (10.6%), and Low (67%), while three were identified for SUB: High Use (35.2%), Typical Use (41.7%), and Low Use (23%). EXT PGSs were consistently associated with persistent trajectories of ASB and SUB (High Decline and High Use, respectively), relative to comparison groups. EXT PGSs were also associated with the Low Use trajectory of SUB, relative to the comparison group. Results suggest PGSs may be sensitive to developmental typologies of EXT, where PGSs are more strongly predictive of chronicity in addition to (or possibly rather than) absolute severity.
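Growth mixture models of the kind used above are typically fit in specialized software (e.g. Mplus or R's lcmm). As a rough illustration of the underlying idea of recovering latent trajectory classes, the sketch below fits a Gaussian mixture over simulated trajectory vectors; all data values and class shapes are hypothetical, not Add Health estimates.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
ages = np.arange(13, 42)  # ages 13-41, one score per age

# Three illustrative trajectory shapes (hypothetical, for demonstration only)
low = rng.normal(0.5, 0.2, size=(100, ages.size))                  # stably low
peaked = (2.0 * np.exp(-((ages - 17) ** 2) / 20.0)
          + rng.normal(0.0, 0.2, size=(60, ages.size)))            # adolescence-peaked
declining = (np.linspace(3.0, 1.0, ages.size)
             + rng.normal(0.0, 0.2, size=(40, ages.size)))         # high, declining
X = np.vstack([low, peaked, declining])

# A Gaussian mixture over whole trajectories stands in for a growth mixture
# model: each component corresponds to one latent trajectory class.
gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
classes = gmm.predict(X)
```

In a real growth mixture model, class trajectories are parameterized by growth factors (intercept, slope) rather than free per-age means, and the number of classes is chosen by comparing fit indices across candidate models.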
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
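Evaluating a PRS as a case–case discriminator, as described above, usually comes down to fitting a simple classifier on the score and summarizing its separation with an AUC. The sketch below uses hypothetical distributions and an invented effect size purely to illustrate the evaluation step, not the study's actual data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Hypothetical standardized PRS: BPD cases shifted upward relative to MDD cases
prs_bpd = rng.normal(0.3, 1.0, 1000)
prs_mdd = rng.normal(0.0, 1.0, 1000)
prs = np.concatenate([prs_bpd, prs_mdd]).reshape(-1, 1)
is_bpd = np.concatenate([np.ones(1000), np.zeros(1000)])

# Logistic regression on the PRS; the AUC summarizes case-case discrimination
model = LogisticRegression().fit(prs, is_bpd)
auc = roc_auc_score(is_bpd, model.predict_proba(prs)[:, 1])
```

In practice the PRS would be computed from GWAS summary statistics in a held-out target cohort, with ancestry principal components and genotyping batch included as covariates.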
We use the 529 college savings plan setting to investigate whether and why households make suboptimal choices to invest in local assets. We estimate that 67% of open accounts between 2010 and 2020 were located suboptimally due to the plans’ tax inefficiencies and high expenses. Over the accounts’ projected lifetimes, such investments yielded expected losses of 8% on average or $15.6 billion in 2020 alone. We then investigate why suboptimal investment is so prevalent. Consistent with households’ lack of understanding of state-level tax benefits, we find that a meaningful proportion of households does not account for the potential tax benefits and costs of local versus nonlocal 529 investment. Household financial literacy and plan disclosure complexity appear to explain suboptimal investment patterns, which further supports the role of information-processing frictions. Our study presents novel evidence on individuals’ preferences for local assets and how information-processing frictions shape their investment decisions, reducing their financial well-being.
The majority of children with maltreatment histories do not go on to develop depression in their adolescent and adult years. These individuals are often identified as being “resilient”, but this characterization may conceal difficulties that individuals with maltreatment histories might face in their interpersonal relationships, substance use, physical health, and/or socioeconomic outcomes in their later lives. This study examined how adolescents with maltreatment histories who exhibit low levels of depression function in other domains during their adult years. Longitudinal trajectories of depression (across ages 13–32) in individuals with (n = 3,809) and without (n = 8,249) maltreatment histories were modeled in the National Longitudinal Study of Adolescent to Adult Health. The same “low,” “increasing,” and “declining” depression trajectories were identified in individuals both with and without maltreatment histories. Youths with maltreatment histories in the “low” depression trajectory reported lower romantic relationship satisfaction, more exposure to intimate partner and sexual violence, more alcohol abuse/dependency, and poorer general physical health compared to individuals without maltreatment histories in the same “low” depression trajectory in adulthood. Findings add further caution against labeling individuals as “resilient” based on just a single domain of functioning (low depression), as childhood maltreatment has harmful effects on a broad spectrum of functional domains.
We assessed the efficacy of a culturally competent outreach model with promotoras in raising the coronavirus disease 2019 (COVID-19) first-dose vaccination rates in Chicago’s at-risk ZIP codes from February through May 2021. Utilizing community members from within target communities may reduce barriers, increase vaccination rates, and enhance COVID-19 prevention.
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
Ethical decision making has long been recognized as critical for industrial-organizational (I-O) psychologists in the variety of roles they fill in education, research, and practice. Decisions with ethical implications are not always readily apparent and often require consideration of competing concerns. The American Psychological Association (APA) Ethical Principles of Psychologists and Code of Conduct are the principles and standards to which all Society for Industrial and Organizational Psychology (SIOP) members are held accountable, and these principles serve to aid in decision making. To this end, the primary focus of this article is the presentation and application of an integrative ethical decision-making framework rooted in and inspired by empirical, philosophical, and practical considerations of professional ethics. The purpose of this framework is to provide a generalizable model that can be used to identify, evaluate, resolve, and engage in discourse about topics involving ethical issues. To demonstrate the efficacy of this general framework to contexts germane to I-O psychologists, we subsequently present and apply this framework to five scenarios, each involving an ethical situation relevant to academia, practice, or graduate education in I-O psychology. With this article, we hope to stimulate the refinement of this ethical decision-making model, illustrate its application in our profession, and, most importantly, advance conversations about ethical decision making in I-O psychology.
Diagnosis of sinus venosus defects, which are not infrequently associated with complex anomalous pulmonary venous drainage, may be delayed and may require multimodality imaging.
Methods:
Retrospective review of all patients from February 2008 to January 2019.
Results:
Thirty-seven children were diagnosed at a median age of 4.2 years (range 0.5−15.5 years). In 32 of 37 (86%) patients, diagnosis was achieved on transthoracic echocardiography, but five patients (14%) had complex variants (four had high insertion of anomalous vein into the superior caval vein and three had multiple anomalous veins draining to different sites, two of whom had drainage of one vein into the high superior caval vein). In these five patients, the final diagnosis was achieved by multimodality imaging and intra-operative findings. The median age at surgery was 5.2 years (range 1.6−15.8 years). Thirty-one patients underwent double patch repair, four patients a Warden repair, and two patients a single-patch repair. Of the four Warden repairs, two patients had a high insertion of right-sided anomalous pulmonary vein into the superior caval vein, one patient had bilateral superior caval veins, and one patient had right lower pulmonary vein insertion into the right atrium/superior caval vein junction. There was no post-operative mortality, reoperation, residual shunt or pulmonary venous obstruction. One patient developed superior caval vein obstruction and one patient developed atrial flutter.
Conclusion:
Complementary cardiac imaging modalities improve diagnosis of complex sinus venosus defects associated with a wide variation in the pattern of anomalous pulmonary venous connection. Nonetheless, surgical treatment is associated with excellent outcomes.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
A goosegrass [Eleusine indica (L.) Gaertn.] population uncontrolled by paraquat (R) in a vegetable production field in St. Clair County, AL, was collected in summer 2019. Research was conducted to assess the level of resistance of the suspected resistant population compared with three populations with no suspected paraquat resistance (S1, S2, and S3). Visual injury at all rating dates and biomass reduction at 28 d after treatment (DAT) in S populations increased exponentially with increasing paraquat rates. S biotypes were injured more than R at 3 DAT, with biomass recovery at 28 DAT occurring only at rates <0.28 kg ha−1. Plant death or biomass reduction did not occur for R at any rate on any date. Paraquat rates that induced 50% or 90% injury or reduced biomass by 50% or 90% compared with the nontreated control (I50 and I90, respectively) were 10 to 124 times higher (I50) and 54 to 116 times higher (I90) for R than for S biotypes. These data confirm a paraquat-resistant E. indica biotype in Alabama, providing additional germplasm for the study of photosystem I electron-diverting (PSI-ED) resistance mechanisms.
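I50 and I90 estimates like those reported above typically come from dose-response modeling; a common choice in herbicide resistance work is a four-parameter log-logistic curve. The sketch below uses made-up doses and parameters (not the study's data) to show how such values could be estimated with SciPy.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, lower, upper, i50, slope):
    """Four-parameter log-logistic dose-response curve."""
    return lower + (upper - lower) / (1.0 + (dose / i50) ** slope)

# Hypothetical biomass (% of nontreated) at increasing paraquat rates (kg ha-1)
doses = np.array([0.035, 0.07, 0.14, 0.28, 0.56, 1.12])
biomass = log_logistic(doses, 5.0, 100.0, 0.2, 2.0)  # noise-free, for illustration

popt, _ = curve_fit(
    log_logistic, doses, biomass,
    p0=[0.0, 90.0, 0.1, 1.0],
    bounds=([0.0, 50.0, 1e-3, 0.1], [50.0, 200.0, 5.0, 10.0]),
)
lower, upper, i50, slope = popt
# Dose producing 90% of the drop from upper to lower asymptote
i90 = i50 * 9.0 ** (1.0 / slope)
```

The resistance ratio (R/S) is then the ratio of the I50 (or I90) fitted for each population.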
Pharmacogenomic testing has emerged to aid medication selection for patients with major depressive disorder (MDD) by identifying potential gene-drug interactions (GDI). Many pharmacogenomic tests are available with varying levels of supporting evidence, including direct-to-consumer and physician-ordered tests. We retrospectively evaluated the safety of using a physician-ordered combinatorial pharmacogenomic test (GeneSight) to guide medication selection for patients with MDD in a large, randomized, controlled trial (GUIDED).
Materials and Methods
Patients diagnosed with MDD who had an inadequate response to ≥1 psychotropic medication were randomized to treatment as usual (TAU) or combinatorial pharmacogenomic test-guided care (guided-care). All received combinatorial pharmacogenomic testing and medications were categorized by predicted GDI (no, moderate, or significant GDI). Patients and raters were blinded to study arm, and physicians were blinded to test results for patients in TAU, through week 8. Measures included adverse events (AEs, present/absent), worsening suicidal ideation (increase of ≥1 on the corresponding HAM-D17 question), or symptom worsening (HAM-D17 increase of ≥1). These measures were evaluated based on medication changes [add only, drop only, switch (add and drop), any, and none] and study arm, as well as baseline medication GDI.
Results
Most patients had a medication change between baseline and week 8 (938/1,166; 80.5%), including 269 (23.1%) who added only, 80 (6.9%) who dropped only, and 589 (50.5%) who switched medications. In the full cohort, changing medications resulted in an increased relative risk (RR) of experiencing AEs at both weeks 4 and 8 [RR 2.00 (95% CI 1.41–2.83) and RR 2.25 (95% CI 1.39–3.65), respectively]. This was true regardless of arm, with no significant difference observed between guided-care and TAU, though the RRs for guided-care were lower than for TAU. Medication change was not associated with increased suicidal ideation or symptom worsening, regardless of study arm or type of medication change. Special attention was focused on patients who entered the study taking medications identified by pharmacogenomic testing as likely having significant GDI; those who were only taking medications subject to no or moderate GDI at week 8 were significantly less likely to experience AEs than those who were still taking at least one medication subject to significant GDI (RR 0.39, 95% CI 0.15–0.99, p=0.048). No other significant differences in risk were observed at week 8.
Conclusion
These data indicate that patient safety in the combinatorial pharmacogenomic test-guided care arm was no worse than TAU in the GUIDED trial. Moreover, combinatorial pharmacogenomic-guided medication selection may reduce some safety concerns. Collectively, these data demonstrate that combinatorial pharmacogenomic testing can be adopted safely into clinical practice without risking symptom degradation among patients.
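Relative risks of the kind reported in the Results are straightforward to compute from 2×2 counts. A minimal sketch, using hypothetical counts and a standard Wald confidence interval on the log scale:

```python
import math

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Relative risk with a 95% Wald CI computed on the log scale."""
    rr = (events_exposed / n_exposed) / (events_unexposed / n_unexposed)
    se_log_rr = math.sqrt(
        1.0 / events_exposed - 1.0 / n_exposed
        + 1.0 / events_unexposed - 1.0 / n_unexposed
    )
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return rr, lo, hi

# Hypothetical counts: 30/100 with AEs after a medication change vs 15/100 without
rr, lo, hi = relative_risk(30, 100, 15, 100)  # rr = 2.0
```

A CI excluding 1.0, as for the week-8 GDI comparison above (0.15–0.99), is what licenses calling the risk difference significant at the 5% level.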
Childhood exposure to interpersonal violence (IPV) may be linked to distinct manifestations of mental illness, yet the nature of this change remains poorly understood. Network analysis can provide unique insights by contrasting the interrelatedness of symptoms underlying psychopathology across exposed and non-exposed youth, with potential clinical implications for a treatment-resistant population. We anticipated marked differences in symptom associations among IPV-exposed youth, particularly in terms of ‘hub’ symptoms holding outsized influence over the network, as well as formation and influence of communities of highly interconnected symptoms.
Methods
Participants from a population-representative sample of youth (n = 4433; ages 11–18 years) completed a comprehensive structured clinical interview assessing mental health symptoms, diagnostic status, and history of violence exposure. Network analytic methods were used to model the pattern of associations between symptoms, quantify differences across diagnosed youth with (IPV+) and without (IPV–) IPV exposure, and identify transdiagnostic ‘bridge’ symptoms linking multiple disorders.
Results
Symptoms organized into six ‘disorder’ communities (e.g. Intrusive Thoughts/Sensations, Depression, Anxiety) that exhibited considerably greater interconnectivity in IPV+ youth. Five symptoms emerged in IPV+ youth as highly trafficked ‘bridges’ between symptom communities (11 in IPV– youth).
Conclusion
IPV exposure may alter mutually reinforcing symptom co-occurrence in youth, thus contributing to greater psychiatric comorbidity and treatment resistance. The presence of a condensed and unique set of bridge symptoms suggests trauma-enriched nodes which could be therapeutically targeted to improve outcomes in violence-exposed youth.
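Bridge symptoms of the kind discussed above are often quantified by ‘bridge strength’: the summed weight of a node's edges into other communities. A minimal sketch on a toy symptom graph, where the nodes, edge weights, and community labels are all hypothetical:

```python
import networkx as nx

def bridge_strength(G, community):
    """Sum of each node's edge weights to nodes outside its own community."""
    return {
        node: sum(
            data.get("weight", 1.0)
            for _, nbr, data in G.edges(node, data=True)
            if community[nbr] != community[node]
        )
        for node in G.nodes
    }

# Toy network: two symptom communities linked through 'irritability'
G = nx.Graph()
G.add_edge("sadness", "anhedonia", weight=0.8)       # depression community
G.add_edge("worry", "restlessness", weight=0.7)      # anxiety community
G.add_edge("sadness", "irritability", weight=0.4)
G.add_edge("irritability", "worry", weight=0.5)      # cross-community edge
G.add_edge("irritability", "restlessness", weight=0.3)  # cross-community edge
community = {"sadness": "dep", "anhedonia": "dep", "irritability": "dep",
             "worry": "anx", "restlessness": "anx"}

strengths = bridge_strength(G, community)
```

In applied work, the network edges are usually partial correlations from a regularized Gaussian graphical model, and communities are detected algorithmically rather than assigned by hand.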
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are increasingly common in the United States and have the potential to spread widely across healthcare networks. Only a fraction of patients with CRE carriage (ie, infection or colonization) are identified by clinical cultures. Interventions to reduce CRE transmission can be explored with agent-based models (ABMs) composed of unique agents (eg, patients) represented by a synthetic population or model-generated representation of the population. We used electronic health record data to determine CRE carriage risk, and we discuss how these results can inform CRE transmission parameters for hospitalized agents in a regional healthcare network ABM. Methods: We reviewed the laboratory data of patients admitted during July 1, 2016−June 30, 2017, to any of 7 short-term acute-care hospitals of a regional healthcare network in North Carolina (N = 118,022 admissions) to find clinically detected cases of CRE carriage. A case was defined as the first occurrence of Enterobacter spp, Escherichia coli, or Klebsiella spp resistant to any carbapenem isolated from a clinical specimen in an admitted patient. We used Poisson regression to estimate clinically detected CRE carriage risk according to variables common to data from both the electronic health records and the ABM synthetic population, including patient demographics, systemic antibiotic administration, intensive care unit stay, comorbidities, length of stay, and admitting hospital size. Results: We identified 58 (0.05%) cases of CRE carriage among all admissions. Among these cases, 30 (52%) were ≥65 years of age and 37 (64%) were female. During their admission, 47 cases (81%) were administered systemic antibiotics and 18 cases (31%) had an intensive care unit stay.
Patients administered systemic antibiotics and those with an intensive care unit stay had CRE carriage risk 6.5 times (95% CI, 3.4–12.5) and 4.9 times (95% CI, 2.8–8.5) higher, respectively, than patients without these exposures (Fig. 1). Patients ≥50 years of age and those with a higher Elixhauser comorbidity index score and with longer length of stay also had increased CRE carriage risk. Conclusions: Among admissions in our dataset, CRE carriage risk was associated with systemic antibiotic exposure, intensive care unit stay, higher Elixhauser comorbidity index score, and longer length of stay. We will use these risk estimates in the ABM to inform agents’ CRE carriage status upon hospital admission and the CRE transmission parameters for short-term acute-care hospitals. We will explore CRE transmission interventions in the parameterized regional healthcare network ABM and assess the impact of CRE carriage underestimation.
Funding: This work was supported by Centers for Disease Control and Prevention (CDC) Cooperative Agreement number U01CK000527. The conclusions, findings, and opinions expressed do not necessarily reflect the official position of CDC.
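The risk model described in the Methods can be sketched as a Poisson regression on binary exposures. The simulation below plugs in effect sizes equal to the logs of the reported risk ratios purely to illustrate the estimation step; the data, baseline rate, and sample size are all invented.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(42)
n = 5000
antibiotics = rng.integers(0, 2, n)  # systemic antibiotic exposure (0/1)
icu = rng.integers(0, 2, n)          # intensive care unit stay (0/1)

# Simulated carriage outcome with hypothetical true effects:
# log(6.5) ~ 1.87 for antibiotics, log(4.9) ~ 1.59 for ICU stay
log_rate = -4.5 + 1.87 * antibiotics + 1.59 * icu
y = rng.poisson(np.exp(log_rate))

# Unpenalized Poisson GLM; exponentiated coefficients are risk ratios
X = np.column_stack([antibiotics, icu])
model = PoissonRegressor(alpha=0.0, max_iter=1000).fit(X, y)
risk_ratio_antibiotics = float(np.exp(model.coef_[0]))
```

The fitted risk ratios can then seed the ABM agents' carriage-on-admission probabilities conditional on their simulated exposures.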
The Genomics Used to Improve DEpression Decisions (GUIDED) trial assessed outcomes associated with combinatorial pharmacogenomic (PGx) testing in patients with major depressive disorder (MDD). Analyses used the 17-item Hamilton Depression (HAM-D17) rating scale; however, studies demonstrate that the abbreviated, core depression symptom-focused HAM-D6 rating scale may have greater sensitivity for detecting differences between treatment and placebo. The sensitivity of the HAM-D6 has not, however, been tested for two active treatment arms. Here, we evaluated the sensitivity of the HAM-D6 scale, relative to the HAM-D17 scale, when assessing outcomes for actively treated patients in the GUIDED trial.
Methods:
Outpatients (N=1,298) diagnosed with MDD and an inadequate treatment response to ≥1 psychotropic medication were randomized into treatment as usual (TAU) or combinatorial PGx-guided (guided-care) arms. Combinatorial PGx testing was performed on all patients, though test reports were only available to the guided-care arm. All patients and raters were blinded to study arm until after week 8. Medications on the combinatorial PGx test report were categorized based on the level of predicted gene-drug interactions: ‘use as directed’, ‘moderate gene-drug interactions’, or ‘significant gene-drug interactions.’ Patient outcomes were assessed by arm at week 8 using HAM-D6 and HAM-D17 rating scales, including symptom improvement (percent change in scale), response (≥50% decrease in scale), and remission (HAM-D6 ≤4 and HAM-D17 ≤7).
Results:
At week 8, the guided-care arm demonstrated statistically significant symptom improvement over TAU using the HAM-D6 scale (Δ=4.4%, p=0.023), but not using the HAM-D17 scale (Δ=3.2%, p=0.069). The response rate increased significantly for guided-care compared with TAU using both HAM-D6 (Δ=7.0%, p=0.004) and HAM-D17 (Δ=6.3%, p=0.007). Remission rates were also significantly greater for guided-care versus TAU using both scales (HAM-D6 Δ=4.6%, p=0.031; HAM-D17 Δ=5.5%, p=0.005). Patients taking medication(s) predicted to have gene-drug interactions at baseline showed further increased benefit over TAU at week 8 using HAM-D6 for symptom improvement (Δ=7.3%, p=0.004), response (Δ=10.0%, p=0.001), and remission (Δ=7.9%, p=0.005). Comparatively, the magnitude of the differences in outcomes between arms at week 8 was lower using HAM-D17 (symptom improvement Δ=5.0%, p=0.029; response Δ=8.0%, p=0.008; remission Δ=7.5%, p=0.003).
Conclusions:
Combinatorial PGx-guided care achieved significantly better patient outcomes compared with TAU when assessed using the HAM-D6 scale. These findings suggest that the HAM-D6 scale is better suited than the HAM-D17 for evaluating change in randomized, controlled trials comparing active treatment arms.
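The three outcome definitions used for both scales (percent symptom improvement, response, remission) reduce to a few lines of arithmetic. A sketch using the cutoffs stated in the Methods, applied to a hypothetical patient:

```python
def hamd_outcomes(baseline, week8, remission_cutoff):
    """Percent improvement, response, and remission from two scale scores.

    remission_cutoff: 4 for HAM-D6, 7 for HAM-D17, per the trial definitions.
    """
    improvement = 100.0 * (baseline - week8) / baseline
    response = improvement >= 50.0       # response: >=50% decrease in scale
    remission = week8 <= remission_cutoff
    return improvement, response, remission

# Hypothetical patient: HAM-D17 falls from 20 to 8 by week 8
improvement, response, remission = hamd_outcomes(20, 8, remission_cutoff=7)
# improvement = 60.0, response = True, remission = False (8 > 7)
```

The example illustrates why the three outcomes can diverge: a patient can respond (large relative drop) without remitting (absolute score still above the cutoff).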
At present, analysis of diet and bladder cancer (BC) is mostly based on the intake of individual foods. The examination of food combinations provides scope to deal with the complexity and unpredictability of the diet and aims to overcome the limitations of the study of nutrients and foods in isolation. This article aims to demonstrate the usability of supervised data mining methods to extract the food groups related to BC. In order to derive key food groups associated with BC risk, we applied the data mining technique C5.0 with 10-fold cross-validation in the BLadder cancer Epidemiology and Nutritional Determinants study, including data from eighteen case–control and one nested case–cohort study, comprising 8320 BC cases among 31 551 participants. Dietary data, on the eleven main food groups of the Eurocode 2 Core classification codebook, and relevant non-diet data (i.e. sex, age and smoking status) were available. Primarily, five key food groups were extracted; in order of importance, beverages (non-milk); grains and grain products; vegetables and vegetable products; fats, oils and their products; meats and meat products were associated with BC risk. Since these food groups correspond to previously proposed BC-related dietary factors, data mining seems to be a promising technique in the field of nutritional epidemiology and deserves further examination.
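C5.0 is an R/proprietary implementation, so as a hedged stand-in, scikit-learn's CART decision tree with 10-fold cross-validation illustrates the same supervised data-mining workflow. The data below are synthetic placeholders, not the Eurocode 2 food groups or study variables.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for dietary food-group intakes plus covariates
X, y = make_classification(n_samples=1000, n_features=14, n_informative=5,
                           random_state=0)

# CART tree approximating C5.0; feature_importances_ plays the role of
# C5.0's attribute-usage ranking for extracting key food groups
tree = DecisionTreeClassifier(max_depth=5, random_state=0)
scores = cross_val_score(tree, X, y, cv=10)  # 10-fold cross-validated accuracy
tree.fit(X, y)
ranking = tree.feature_importances_.argsort()[::-1]  # most important first
```

Note that CART and C5.0 differ in split criteria (Gini vs. information gain with boosting/pruning extensions), so importance rankings are comparable in spirit but not identical.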
The Murchison Widefield Array (MWA) is an open access telescope dedicated to studying the low-frequency (80–300 MHz) southern sky. Since beginning operations in mid-2013, the MWA has opened a new observational window in the southern hemisphere enabling many science areas. The driving science objectives of the original design were to observe 21 cm radiation from the Epoch of Reionisation (EoR), explore the radio time domain, perform Galactic and extragalactic surveys, and monitor solar, heliospheric, and ionospheric phenomena. Altogether, more than 60 programs recorded 20 000 h, producing 146 papers to date. In 2016, the telescope underwent a major upgrade resulting in alternating compact and extended configurations. Other upgrades, including digital back-ends and a rapid-response triggering system, have been developed since the original array was commissioned. In this paper, we review the major results from the prior operation of the MWA and then discuss the new science paths enabled by the improved capabilities. We group these science opportunities by the four original science themes but also include ideas for directions outside these categories.