Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
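The ability of a PRS to distinguish BPD from MDD cases can be illustrated with a minimal sketch: the discrimination of a polygenic score between two diagnostic groups, summarised as an AUC computed from the Mann–Whitney statistic. The data below are simulated (a hypothetical shift of about 0.5 standard deviations in mean PRS between groups, numbers invented for illustration), not the study's cohort.

```python
import numpy as np

def auc_mann_whitney(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen score from the
    positive group outranks one from the negative group (ties count half)."""
    pos = np.asarray(scores_pos)[:, None]
    neg = np.asarray(scores_neg)[None, :]
    return (pos > neg).mean() + 0.5 * (pos == neg).mean()

# Hypothetical PRS distributions for BPD vs MDD cases, shifted by ~0.5 SD
# (effect size invented, not taken from the study).
rng = np.random.default_rng(42)
prs_bpd = rng.normal(loc=0.5, scale=1.0, size=2000)
prs_mdd = rng.normal(loc=0.0, scale=1.0, size=2000)

auc = auc_mann_whitney(prs_bpd, prs_mdd)
print(f"case-case discrimination AUC: {auc:.3f}")
```

For a 0.5 SD mean shift between two unit-variance normal distributions, the expected AUC is about 0.64, which is the order of magnitude typically reported for case–case PRS discrimination.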
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
OBJECTIVES/GOALS: To tackle population-level health disparities, quality dashboards can leverage individual socioeconomic status (SES) measures, which are not always readily accessible. This study aimed to assess the feasibility of a population health management strategy for colorectal cancer (CRC) screening rates using the HOUSES index and heatmap analysis. METHODS/STUDY POPULATION: We applied the 2019 Minnesota Community Measurement data for optimal CRC screening to eligible Mayo Clinic Midwest panel patients. SES was defined by the HOUSES index, a validated SES measure based on publicly available property data for the U.S. population. We first assessed the association of suboptimal CRC screening rates with the HOUSES index, adjusting for age, sex, race/ethnicity, comorbidity, and Zip-code-level deprivation using a mixed-effects logistic regression model. We then assessed changes in the performance ranking of individual clinics (i.e., % of patients with optimal CRC screening) before and after adjusting for the HOUSES index. Geographical hotspots with high proportions of low SES and high proportions of suboptimal CRC screening were superimposed to identify target populations for outreach. RESULTS/ANTICIPATED RESULTS: A total of 58,382 adults from 41 clinics were eligible for CRC screening assessment in 2019 (53% female). Patients with lower SES, defined by HOUSES quartiles 1–3, had significantly lower CRC screening rates compared to those with the highest SES (HOUSES quartile 4) (adj. OR [95% CI]: 0.52 [0.50–0.56] for Q1, 0.66 [0.62–0.70] for Q2, and 0.81 [0.76–0.85] for Q3). The ranking of 26 of 41 (63%) clinics went down after adjusting for the HOUSES index, suggesting a disproportionately higher proportion of underserved patients with suboptimal CRC screening.
We were able to successfully identify hotspots of suboptimal CRC screening (areas with greater than 130% of the expected value) and overlay them with areas with a high proportion of underserved patients (HOUSES Q1), which can be used for data-driven targeted interventions such as mobile health clinics. DISCUSSION/SIGNIFICANCE: The HOUSES index and associated heatmap analysis can contribute to advancing health equity. This approach can aid health care organizations in meeting the newly established standards of The Joint Commission, which have elevated health equity to a national safety priority.
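The hotspot criterion above (observed suboptimal screening exceeding 130% of the expected value) can be sketched as an observed-over-expected ratio per area. The counts below are invented for illustration, not the study's data.

```python
import numpy as np

# Hypothetical per-area counts: patients eligible for screening and
# patients with suboptimal screening (numbers invented).
n_eligible   = np.array([100, 100, 100, 100])
n_suboptimal = np.array([10, 20, 40, 30])

# Expected suboptimal count per area under a uniform overall rate.
overall_rate = n_suboptimal.sum() / n_eligible.sum()
expected = overall_rate * n_eligible

# Flag hotspots where observed exceeds 130% of expected, mirroring the
# ">130% of expected value" criterion in the abstract.
ratio = n_suboptimal / expected
hotspots = np.flatnonzero(ratio > 1.3)
print("observed/expected:", np.round(ratio, 2))   # [0.4 0.8 1.6 1.2]
print("hotspot area indices:", hotspots)          # [2]
```

In practice the flagged areas would then be intersected with low-SES (HOUSES Q1) areas to prioritise outreach.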
To investigate the symptoms of SARS-CoV-2 infection, their dynamics and their discriminatory power for the disease, using longitudinal, prospectively collected information reported at the time of occurrence, we analysed data from a large UK phase 3 COVID-19 vaccine trial. The alpha variant was the predominant strain. Participants were assessed for SARS-CoV-2 infection via nasal/throat PCR at recruitment, at vaccination appointments, and when symptomatic. Statistical techniques were implemented to infer estimates representative of the UK population, accounting for multiple symptomatic episodes per individual. An optimal diagnostic model for SARS-CoV-2 infection was derived. The 4-month prevalence of SARS-CoV-2 was 2.1%, increasing to 19.4% (16.0%–22.7%) in participants reporting loss of appetite and 31.9% (27.1%–36.8%) in those with anosmia/ageusia. The model identified anosmia and/or ageusia, fever, congestion and cough as significantly associated with SARS-CoV-2 infection. Symptom dynamics differed markedly between the two groups: in PCR-positive participants symptoms started slowly, peaked later and lasted longer, whereas in PCR-negative participants they declined consistently, with fewer than 3 days of symptoms reported on average. Anosmia/ageusia peaked late in confirmed SARS-CoV-2 infection (day 12), indicating low discriminatory power for early disease diagnosis.
To minimise infection during COVID-19, the clozapine haematological monitoring interval was extended from 4-weekly to 12-weekly intervals in South London and Maudsley NHS Foundation Trust.
Aims
To investigate the impact of this temporary policy change on clinical and safety outcomes.
Method
All patients who received clozapine treatment with extended (12-weekly) monitoring in a large London National Health Service trust were included in a 1-year mirror-image study. A comparison group receiving standard monitoring was selected. The proportion of participants with mild to severe neutropenia and the proportion of participants attending the emergency department for treatment of clozapine-induced severe neutropenia during the follow-up period were compared. Psychiatric hospital admission rates, clozapine dose and concomitant psychotropic medication in the 1 year before and the 1 year after extended monitoring were compared. All-cause clozapine discontinuation at 1-year follow-up was examined.
Results
Of 569 participants, 459 received clozapine with extended monitoring and 110 controls continued with standard monitoring. The total person-years were 458 in the intervention group and 109 in the control group, with a median follow-up time of 1 year in both groups. During follow-up, two participants (0.4%) in the intervention group and one (0.9%) in the control group developed mild to moderate neutropenia. There was no difference in the incidence of haematological events between the two groups (IRR = 0.48, 95% CI 0.02–28.15, P = 0.29). All neutropenia cases in the intervention group were mild and co-occurred with COVID-19 infection. The median number of admissions per patient during the pre-mirror period remained unchanged (0, IQR = 0) during the post-mirror period. There was one death in the control group, secondary to COVID-19 infection.
Conclusions
There was no evidence that the incidence of severe neutropenia was increased in those receiving extended monitoring.
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. The predictive combination of these variables might help clinicians better predict which patients will respond to lithium treatment.
Aims
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
Method
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
Results
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
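The leave-site-out cross-validation used here can be sketched with a minimal ridge regression implemented from scratch: train on all sites but one, predict the held-out site, and pool the out-of-site predictions into a variance-explained estimate. The data below are simulated stand-ins for clinical and PRS predictors of lithium response (sites, features and effect sizes all invented), not the ConLi+Gen data.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression with an unpenalised intercept."""
    Xb = np.column_stack([np.ones(len(X)), X])
    penalty = lam * np.eye(Xb.shape[1])
    penalty[0, 0] = 0.0                        # do not shrink the intercept
    return np.linalg.solve(Xb.T @ Xb + penalty, Xb.T @ y)

def ridge_predict(w, X):
    return np.column_stack([np.ones(len(X)), X]) @ w

# Simulated cohort: 10 study sites of 100 patients, 5 predictors with a
# known linear signal plus noise (all numbers invented for illustration).
rng = np.random.default_rng(0)
sites = np.repeat(np.arange(10), 100)
X = rng.normal(size=(1000, 5))
beta_true = np.array([0.5, -0.3, 0.2, 0.0, 0.4])
y = X @ beta_true + rng.normal(scale=1.0, size=1000)

# Leave-site-out cross-validation: train on 9 sites, predict the 10th.
preds = np.empty_like(y)
for s in np.unique(sites):
    held = sites == s
    w = ridge_fit(X[~held], y[~held], lam=1.0)
    preds[held] = ridge_predict(w, X[held])

r2 = 1 - np.sum((y - preds) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"leave-site-out R^2: {r2:.3f}")
```

Holding out entire sites, rather than random patients, tests whether the model generalises across recruitment centres, which is why the abstract reports site-wise cross-validation before the independent test set.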
Conclusions
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future this approach may help inform which patients are most likely to respond to lithium treatment.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Co-occurrence of common mental disorders (CMD) with psychotic experiences is well-known. There is little research on the public mental health relevance of concurrent psychotic experiences for service use, suicidality, and poor physical health. We aim to: (1) describe the distribution of psychotic experiences co-occurring with a range of non-psychotic psychiatric disorders [CMD, depressive episode, anxiety disorder, probable post-traumatic stress disorder (PTSD), and personality dysfunction], and (2) examine associations of concurrent psychotic experiences with secondary mental healthcare use, psychological treatment use for CMD, lifetime suicide attempts, and poor self-rated health.
Methods
We linked a prospective cross-sectional community health survey with a mental healthcare provider database. For each non-psychotic psychiatric disorder, patients with concurrent psychotic experiences were compared to those without psychotic experiences on use of secondary mental healthcare, psychological treatment for CMD, suicide attempt, physical functioning, and a composite multimorbidity score, using logistic regression and Cox regressions.
Results
In all disorders except anxiety disorder, concurrent psychotic experiences were accompanied by greater odds of all outcomes; odds ratios for a unit change in the composite multimorbidity score ranged between 2.21 [95% confidence interval (CI) 1.49–3.27] and 3.46 (95% CI 1.52–7.85). Hazard ratios for secondary mental health service use in non-psychotic disorders with concurrent psychotic experiences ranged from 0.53 (95% CI 0.15–1.86) for anxiety disorders with psychotic experiences to 4.99 (95% CI 1.22–20.44) for PTSD with psychotic experiences.
Conclusions
Co-occurring psychotic experiences indicate greater public mental health burden, suggesting psychotic experiences could be a marker for future preventive strategies improving public mental health.
Breakthrough Listen is a 10-yr initiative to search for signatures of technologies created by extraterrestrial civilisations at radio and optical wavelengths. Here, we detail the digital data recording system deployed for Breakthrough Listen observations at the 64-m aperture CSIRO Parkes Telescope in New South Wales, Australia. The recording system currently implements two modes: a dual-polarisation, 1.125-GHz bandwidth mode for single-beam observations, and a 26-input, 308-MHz bandwidth mode for the 21-cm multibeam receiver. The system is also designed to support a 3-GHz single-beam mode for the forthcoming Parkes ultra-wideband feed. In this paper, we present details of the system architecture, provide an overview of hardware and software, and present initial performance results.
Although psychotic experiences in people without diagnosed mental health problems are associated with mental health service use, few studies have assessed this prospectively or measured service use by real-world clinical data.
Aims
To describe and investigate the association between psychotic experiences and later mental health service use, and to assess the role of symptoms of common mental health disorders in this association.
Method
We linked a representative survey of south-east London (SELCoH-1, n = 1698) with health records from the local mental healthcare provider. Cox regression estimated the association of psychotic experiences (PEs) with the rate of mental health service use.
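The Cox regression used here can be illustrated from first principles: a hazard ratio estimated by maximising the Cox partial likelihood for a single binary exposure. The sketch below simulates a cohort in which the exposed group has a true hazard ratio of 2 and profiles the partial likelihood over a grid; it is an illustration of the method, not the study's adjusted model (which accounted for sociodemographic confounders and concurrent psychopathology).

```python
import numpy as np

def cox_neg_log_pl(beta, time, event, x):
    """Negative log Cox partial likelihood for a single covariate."""
    order = np.argsort(-time)                  # sort by descending time
    e, xv = event[order], x[order]
    risk = np.cumsum(np.exp(beta * xv))        # risk-set sums at each time
    return -np.sum(e * (beta * xv - np.log(risk)))

# Simulated cohort (invented for illustration): exposed group (x = 1)
# has a true hazard ratio of 2 relative to the unexposed group.
rng = np.random.default_rng(1)
n = 2000
x = np.repeat([0.0, 1.0], n // 2)
time = rng.exponential(1.0 / (0.1 * np.exp(np.log(2.0) * x)))
event = (time <= 15.0).astype(float)           # administrative censoring
time = np.minimum(time, 15.0)

# Profile the partial likelihood over a grid of beta values.
betas = np.linspace(-2.0, 2.0, 2001)
nll = np.array([cox_neg_log_pl(b, time, event, x) for b in betas])
beta_hat = betas[np.argmin(nll)]
hr = np.exp(beta_hat)
print(f"estimated hazard ratio: {hr:.2f}")     # should be near 2
```

The partial likelihood conditions on the observed event times, so no baseline hazard needs to be specified, which is what makes the Cox model natural for service-use survival data of this kind.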
Results
After adjustments, psychotic experiences were associated with a 1.75-fold increase in the rate of subsequent mental health service use (hazard ratio (HR) 1.75, 95% CI 1.03–2.97) compared with those without PEs. Participants with PEs experienced longer care episodes compared with those without.
Conclusions
Psychotic experiences in the general population are important predictors of public mental health need, aside from their relevance for psychoses. We found psychotic experiences to be associated with later mental health service use, after accounting for sociodemographic confounders and concurrent psychopathology.
Conservationists have adopted community-based conservation (CBC) strategies to support landscape conservation programmes in East Africa, and these projects often involve community development assistance in exchange for a commitment to dedicating a portion of community lands to conservation management. There is, however, a dearth of empirical evidence assessing the effectiveness of CBC programmes. This paper uses sub-metre-resolution satellite imagery to measure land-use change on four Kenyan group ranches that had created CBCs. Each ranch underwent a common participatory planning process that established a land-use plan involving three management zones: conservation, livestock grazing and settlement/cultivation. Using a satellite image time series, we recorded threat-based development – anthropogenic modification of natural areas and the density of structures – for each ranch. We found that CBCs with tourism lodges were more effective at controlling development than CBCs without a lodge, particularly in the conservation zones and, to a lesser degree, in the grazing zones. We conclude that our use of very-high-resolution satellite imagery offers conservationists a cost-effective, fast and replicable approach to measuring CBC land-use change and that CBC projects can lead to positive conservation results.
Greenhouse experiments were conducted to elucidate the effects of water stress on photosynthetic parameters of soybean [Glycine max (L.) Merr. ‘Hutton′] and velvetleaf (Abutilon theophrasti Medik. # ABUTH). Stomatal conductance of both species responded curvilinearly to reductions in leaf water potential. At leaf water potentials less negative than −2.5 MPa, stomatal conductance, net photosynthetic rate, and transpiration rate were greater in velvetleaf than in soybean. Soybean photosynthetic rate was linearly related to stomatal conductance. Velvetleaf photosynthetic rate increased linearly with stomatal conductances up to 1.5 cm s–1; however, no increase in photosynthetic rate was observed at stomatal conductances greater than 1.5 cm s–1, indicating nonstomatal limitations to photosynthesis. As water stress intensified, stomatal conductance, photosynthetic rate, and transpiration of velvetleaf declined more rapidly than in soybean.
In a 2-yr field study conducted on a Weswood silt loam soil (Fluventic Ustochrepts), interspecific competition between soybeans [Glycine max (L.) Merr. ‘Hutton’] and velvetleaf (Abutilon theophrasti Medik. # ABUTH) resulted in greater than 40 and 50% reductions in soybean and velvetleaf seed yield, respectively. Leaf area index, number of mainstem nodes, total number of leaves, and plant dry weight of monocultured and intercropped velvetleaf differed significantly as early as 4 weeks after emergence. Interspecific competition had little or no effect on soybean morphology before 8 weeks after emergence. Soil water extraction occurred to 1-m depths in a monoculture of velvetleaf (five plants/m2) in 1984 and 1985. Monocultured soybeans (32.5 plants/m2) extracted water from a 1.5-m or greater depth of the soil profile during the same years. Soil water extraction in the intercropped plots resembled that of the monocultured velvetleaf treatment until soybeans attained R6, when soil water was extracted to a 1.5-m depth. The potential for interspecific competition for water existed early in the season, before late-season soybean root development. Relative water content and leaf water potential (Ψwl) did not differ (P > 0.05) between monocultured and intercropped soybeans in 1984 or 1985. In 1985, Ψwl differed between monocultured and intercropped velvetleaf during anthesis. Leaf water potential values in the youngest, fully expanded leaves were approximately 0.3 and 0.4 MPa lower during midmorning and midday hours, respectively, in intercropped than in monocultured velvetleaf. Transpiration and stomatal conductance did not differ between monocultured and intercropped soybeans or velvetleaf at any time during 1984. Photosynthetic and transpiration rates, stomatal conductance, and Ψwl were lower in intercropped than in monocultured velvetleaf during anthesis in 1985, suggesting interspecific competition for soil water. Soybean water relations were not affected in either year.
The data suggest that soybean yield reductions in soybean-velvetleaf interspecific competition are attributable to resource limitations other than water in south-central Texas.
Variation in interference relationships has been shown for a number of crop–weed associations and may have an important effect on the implementation of decision support systems for weed management. Multiyear field experiments were conducted at eight locations to determine the stability of corn–foxtail interference relationships across years and locations. Two coefficients (I and A) of a rectangular hyperbola equation were estimated for each data set using nonlinear regression procedures. The I and A coefficients represent percent corn yield loss as foxtail density approaches zero and maximum percent corn yield loss, respectively. The coefficient I was stable across years at two locations and varied across years at four locations. Maximum yield loss (A) varied between years at one location. Both coefficients varied among locations. Although 3 to 4 foxtail plants m−1 of row was a conservative estimate of the single-year economic threshold (Tc) of foxtail density, variation in I and A resulted in large variation in Tc. Therefore, the utility of using common coefficient estimates to predict future crop yield loss from foxtail interference between years or among locations within a region is limited.
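The rectangular hyperbola described above (Cousens' model, with I the slope of yield loss as density approaches zero and A the asymptotic maximum loss) can be fitted by nonlinear regression as sketched below. The densities and parameter values (I = 5, A = 40) are simulated for illustration, not taken from the eight-location data set.

```python
import numpy as np
from scipy.optimize import curve_fit

def yield_loss(d, i, a):
    """Rectangular hyperbola: percent yield loss at weed density d,
    where i is the loss per weed as density -> 0 and a is the maximum loss."""
    return i * d / (1 + i * d / a)

# Hypothetical foxtail densities (plants per m of row) and observed percent
# corn yield loss, simulated around I = 5 and A = 40 (values invented).
rng = np.random.default_rng(7)
density = np.tile(np.arange(0.0, 21.0), 3)     # three replicates per density
loss = yield_loss(density, 5.0, 40.0) + rng.normal(scale=1.5, size=density.size)

(i_hat, a_hat), _ = curve_fit(yield_loss, density, loss, p0=[1.0, 20.0])
print(f"I = {i_hat:.2f} %/plant, A = {a_hat:.1f} % maximum loss")
```

Refitting this model to each site-year and comparing the I and A estimates is exactly the stability analysis the abstract describes: if the coefficients differ across years or locations, a single economic threshold derived from pooled estimates will be unreliable.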
Influenza A (H1N1) pdm09 became the predominant circulating strain in the United States during the 2013–2014 influenza season. Little is known about the epidemiology of severe influenza during this season.
METHODS
A retrospective cohort study of severely ill patients with influenza infection in intensive care units in 33 US hospitals from September 1, 2013, through April 1, 2014, was conducted to determine risk factors for mortality present on intensive care unit admission and to describe patient characteristics, spectrum of disease, management, and outcomes.
RESULTS
A total of 444 adults and 63 children were admitted to an intensive care unit in a study hospital; 93 adults (20.9%) and 4 children (6.3%) died. By logistic regression analysis, the following factors were significantly associated with mortality among adult patients: older age (>65 years, odds ratio, 3.1 [95% CI, 1.4–6.9], P=.006 and 50–64 years, 2.5 [1.3–4.9], P=.007; reference age 18–49 years), male sex (1.9 [1.1–3.3], P=.031), history of malignant tumor with chemotherapy administered within the prior 6 months (12.1 [3.9–37.0], P<.001), and a higher Sequential Organ Failure Assessment score (for each increase by 1 in score, 1.3 [1.2–1.4], P<.001).
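An odds ratio with a Wald 95% confidence interval, of the kind reported above, can be computed from a 2×2 table as sketched below. Note this gives a crude, unadjusted OR, whereas the study's values come from multivariable logistic regression; the counts are invented for illustration only.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed deaths, b = exposed survivors,
    c = unexposed deaths, d = unexposed survivors."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical mortality counts by exposure group (numbers invented,
# not the study's data).
or_, lo, hi = odds_ratio_ci(70, 200, 23, 151)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The standard error of the log odds ratio is the square root of the sum of reciprocal cell counts, which is why very sparse cells (as in the chemotherapy subgroup above) produce wide confidence intervals such as 3.9–37.0.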
CONCLUSION
In the 2013–2014 season, the first postpandemic season in which influenza A (H1N1) pdm09 was the predominant circulating strain, risk factors for death among US patients with severe influenza shifted toward those of a more typical epidemic influenza season.
Infect. Control Hosp. Epidemiol. 2015;36(11):1251–1260