Functional impairment in daily activities, such as work and socializing, is part of the diagnostic criteria for major depressive disorder and most anxiety disorders. Despite evidence that symptom severity and functional impairment are partially distinct, functional impairment is often overlooked. To assess whether functional impairment captures diagnostically relevant genetic liability beyond that of symptoms, we aimed to estimate the heritability of, and genetic correlations between, key measures of current depression symptoms, anxiety symptoms, and functional impairment.
Methods
In 17,130 individuals with lifetime depression or anxiety from the Genetic Links to Anxiety and Depression (GLAD) Study, we analyzed total scores from the Patient Health Questionnaire-9 (depression symptoms), Generalized Anxiety Disorder-7 (anxiety symptoms), and Work and Social Adjustment Scale (functional impairment). Genome-wide association analyses were performed with REGENIE. Heritability was estimated using GCTA-GREML and genetic correlations with bivariate GREML.
Results
The phenotypic correlations were moderate across the three measures (Pearson’s r = 0.50–0.69). All three scales were found to be under low but significant genetic influence (single-nucleotide polymorphism-based heritability [h2SNP] = 0.11–0.19) with high genetic correlations between them (rg = 0.79–0.87).
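For reference, the quantities reported above have standard definitions that are not specific to this paper: SNP-based heritability is the share of phenotypic variance attributable to genotyped variants, and the genetic correlation is the correlation of genetic effects across two traits. In conventional notation:

```latex
h^2_{\mathrm{SNP}} = \frac{\sigma^2_g}{\sigma^2_g + \sigma^2_e},
\qquad
r_g = \frac{\operatorname{cov}(g_1, g_2)}{\sqrt{\sigma^2_{g_1}\,\sigma^2_{g_2}}}
```

where σ²g and σ²e are the genetic and residual variance components estimated by GREML, and g1, g2 are the genetic values of the two traits.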
Conclusions
Among individuals with lifetime depression or anxiety from the GLAD Study, the genetic variants that underlie symptom severity largely overlap with those influencing functional impairment. This suggests that self-reported functional impairment, while clinically relevant for diagnosis and treatment outcomes, does not reflect substantial additional genetic liability beyond that captured by symptom-based measures of depression or anxiety.
During the COVID-19 pandemic, the United States Centers for Disease Control and Prevention provided strategies, such as extended use and reuse, to preserve N95 filtering facepiece respirators (FFR). We aimed to assess the prevalence of N95 FFR contamination with SARS-CoV-2 among healthcare personnel (HCP) in the Emergency Department (ED).
Design:
Real-world, prospective, multicenter cohort study. N95 FFR contamination (primary outcome) was measured by real-time quantitative polymerase chain reaction. Multiple logistic regression was used to assess factors associated with contamination.
Setting:
Six academic medical centers.
Participants:
ED HCP who practiced N95 FFR reuse and extended use during the COVID-19 pandemic between April 2021 and July 2022.
Primary exposure:
Total number of COVID-19-positive patients treated.
Results:
Two hundred forty-five N95 FFRs were tested. Forty-four N95 FFRs (18.0%, 95% CI 13.4, 23.3) were contaminated with SARS-CoV-2 RNA. The number of patients seen with COVID-19 was associated with N95 FFR contamination (adjusted odds ratio, 2.3 [95% CI 1.5, 3.6]). Wearing either surgical masks or face shields over FFRs was not associated with FFR contamination, and FFR contamination prevalence was high when using these adjuncts [face shields: 25% (16/64), surgical masks: 22% (23/107)].
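As a minimal sketch of the analysis named in the methods (multiple logistic regression yielding adjusted odds ratios), the following Python fragment fits such a model on simulated data; all column names and distributions here are hypothetical, not taken from the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "contaminated": rng.binomial(1, 0.18, 245),        # primary outcome
    "covid_patients_seen": rng.poisson(3, 245),        # primary exposure
    "face_shield": rng.binomial(1, 0.3, 245),
    "surgical_mask": rng.binomial(1, 0.4, 245),
})

X = sm.add_constant(df[["covid_patients_seen", "face_shield", "surgical_mask"]])
fit = sm.Logit(df["contaminated"], X).fit(disp=0)

# Exponentiated coefficients are adjusted odds ratios; exponentiating the
# coefficient confidence bounds gives the 95% CIs.
print(np.exp(fit.params))
print(np.exp(fit.conf_int()))
```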
Conclusions:
Exposure to patients with known COVID-19 was independently associated with N95 FFR contamination. Face shields and overlying surgical masks were not associated with N95 FFR contamination. N95 FFR reuse and extended use should be avoided due to the increased risk of contact exposure from contaminated FFRs.
Background: Our prior six-year review (n=2165) revealed that 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted QI interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-op, a significant reduction in the proportion of patients visiting the ED from 24% to 11.6%; overall ED utilization (accounting for multiple visits by the same patient) decreased from 42.1% to 16.6%. Early interventions, including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were “highly satisfied” and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50% through pre-operative education, focused QI initiatives, and an individualized, proactive approach.
Background: Dravet syndrome and genetic epilepsy with febrile seizures plus (GEFS+) are associated with pathogenic variants in SCN1A. While most such cases are heterozygous, there have been 16 reported homozygous cases. We report two new biallelic cases associated with divergent phenotypes. Methods: We performed a chart review for two patients with different homozygous SCN1A variants and reviewed all previously published biallelic SCN1A pathogenic variants. Results: Our first patient exhibited early afebrile seizures and severe developmental delay, without febrile seizures or status epilepticus. A homozygous c.1676T>A (p.Ile559Asn) variant of uncertain significance was identified, carried by asymptomatic parents. The second patient exhibited early, recurrent, and prolonged febrile seizures, moderate developmental delay, and motor dysfunction; a homozygous pathogenic c.4970G>A (p.Arg1657His) variant carried by asymptomatic parents was identified.
Of 18 known cases of biallelic SCN1A pathogenic variants, 15/18 (83%) had diagnoses of Dravet or GEFS+. The remaining 3/18 (17%) had pharmacoresponsive epilepsy with prominent global developmental delay (GDD). Cognitive phenotypes ranged from intact neurodevelopment to profound developmental delay. Eleven of 18 cases (61%) had motor concerns. Conclusions: These cases expand the phenotypic spectrum of biallelic SCN1A variants. While some patients present typically for Dravet/GEFS+, others present with developmental delay and controllable epilepsy.
Languages in contact commonly leave an imprint on one another. The most straightforward of these imprints to identify is MAT-borrowing, which results in clearly identifiable lexical items of one language (the donor language) being used in utterances of another language (the recipient language). This stands in contrast with PAT-borrowing, which does not involve any such incorporation of “other language” material but rather results in the reshaping of existing structures of the recipient language on the model of the donor language. This type of language change is therefore arguably more “invisible” to speakers, since no easily identifiable “other language” material is present.
This study presents a detailed examination of PAT-borrowing in Guernésiais, the Norman variety spoken in Guernsey (British Channel Islands), which is now at an advanced state of language shift. It also highlights a major difference between MAT- and PAT-borrowing, namely that, whereas MAT-borrowing can only be explained with reference to the dominant language, PAT-borrowing can on occasion admit an internal explanation.
Increased temporal variability in the gut microbiome is associated with intestinal conditions such as ulcerative colitis and Crohn’s disease, leading to the recently established concept of microbial volatility (1). Increased physiological stress has been shown to increase microbial volatility, indicating that microbial volatility is susceptible to external interventions (1). Dietary fibre positively affects the gut microbiome, but it is unclear if it impacts microbial volatility. The gut microbiota influences hypertension, and high-fibre intake reduces blood pressure (BP) (2). However, not all individuals exhibit a response to these fibre-based dietary changes, and the reasons for this variability remain unclear. Similarly, it is unknown whether the degree of stability of the gut microbiota consortium could be a determining factor in individual responsiveness to dietary interventions. Here, we aimed to identify: i) whether gut microbiome volatility differs between dietary fibre and placebo interventions, and ii) whether microbiome volatility discriminates between BP responders and non-responders to a high-fibre intervention. Twenty treatment-naive participants with hypertension received either placebo or 40 g per day of prebiotic acetylated and butyrylated high amylose maize starch (HAMSAB) supplementation for 3 weeks in a phase II randomised cross-over double-blind placebo-controlled trial (3). Blood pressure was monitored at baseline and each endpoint by 24-hour ambulatory BP monitoring, with those experiencing a reduction between timepoints of ≥ 2 mmHg classified as responders. Baseline stool samples were collected, and the V4 region of the 16S rRNA gene was sequenced. Taxonomy was assigned by reference to the SILVA database. Microbial volatility between timepoints (e.g., pre- and post-intervention) was calculated as the Euclidean distance of centred log-ratio transformed genera counts (Aitchison distance). No difference was observed in microbial volatility between individuals when they received the dietary fibre intervention or the placebo (21.5 ± 5.5 vs 20.5 ± 7.7, p = 0.51). Microbial volatility on the dietary intervention did not differ significantly between responders and non-responders (21.8 ± 4.9 vs 20.9 ± 7.2, p = 0.84). There was no association between the change in BP during intervention and microbial volatility during intervention (r = −0.09, p = 0.72). These data suggest that temporal volatility of the gut microbiota does not change with fibre intake or contribute to the BP response to dietary fibre intervention trials in people with hypertension.
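A minimal sketch of the volatility metric described above, the Aitchison distance (the Euclidean distance between centred log-ratio transformed genus counts at two timepoints), might look like this in Python; the pseudocount handling and the example counts are illustrative assumptions, not the study's pipeline.

```python
import numpy as np

def clr(counts, pseudocount=1.0):
    """Centred log-ratio transform of a vector of genus counts."""
    x = counts + pseudocount          # pseudocount avoids log(0)
    logx = np.log(x)
    return logx - logx.mean()

def volatility(pre_counts, post_counts):
    """Aitchison distance: Euclidean distance between CLR-transformed timepoints."""
    return np.linalg.norm(clr(pre_counts) - clr(post_counts))

pre = np.array([120, 30, 0, 55], dtype=float)    # genus counts, timepoint 1
post = np.array([90, 45, 5, 60], dtype=float)    # genus counts, timepoint 2
print(volatility(pre, post))
```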
Individuals with long-term physical health conditions (LTCs) experience higher rates of depression and anxiety. Conventional self-report measures do not distinguish distress related to LTCs from primary mental health disorders. This distinction is important because treatment protocols differ. We developed a transdiagnostic self-report measure of illness-related distress, applicable across LTCs.
Methods
The new Illness-Related Distress (IRD) scale was developed through thematic coding of interviews, systematic literature search, think-aloud interviews with patients and healthcare providers, and expert-consensus meetings. An internet sample (n = 1,398) of UK-based individuals with LTCs completed the IRD scale for psychometric analysis. We randomly split the sample (1:1) to conduct: (1) an exploratory factor analysis (EFA; n = 698) for item reduction, and (2) iterative confirmatory factor analysis (CFA; n = 700) and exploratory structural equation modeling (ESEM), during which further item reduction took place to generate a final version. Measurement invariance, internal consistency, convergent validity, test–retest reliability, and clinical cut-points were assessed.
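A minimal sketch of the EFA step on the first split-half might look like the following, using the factor_analyzer package; the file name, rotation choice, and loading threshold are illustrative assumptions rather than the authors' settings.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

items = pd.read_csv("ird_items.csv")      # hypothetical file: one column per item

efa = FactorAnalyzer(n_factors=2, rotation="oblimin")
efa.fit(items)

loadings = pd.DataFrame(efa.loadings_, index=items.columns)
# Illustrative item-reduction rule: keep items whose strongest loading
# reaches 0.40 (a common rule of thumb, not the authors' criterion).
retained = loadings[loadings.abs().max(axis=1) >= 0.40]
print(retained.round(2))
```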
Results
EFA suggested a 2-factor structure for the IRD scale, subsequently confirmed by iteratively comparing unidimensional, lower-order, and bifactor CFAs and ESEMs. A lower-order correlated 2-factor CFA model (two 7-item subscales: intrapersonal distress and interpersonal distress) was favored and was structurally invariant for gender. Subscales demonstrated excellent internal consistency, very good test–retest reliability, and good convergent validity. Clinical cut-points were identified (intrapersonal = 15, interpersonal = 12).
Conclusion
The IRD scale is the first transdiagnostic self-report measure of illness-related distress across LTCs. It may aid assessment within clinical practice and research related to psychological adjustment and distress in LTCs.
Epilepsy is a relatively common condition that affects approximately 4–5 per 1000 individuals in Ontario, Canada. While genetic testing is now prevalent in diagnostic and therapeutic care plans, optimal test selection and interpretation of results in a patient-specific context can be inconsistent and provider dependent.
Methods:
The first of its kind, the Ontario Epilepsy Genetic Testing Program (OEGTP) was launched in 2020 to develop clinical testing criteria, curate gene content, standardize technical testing criteria through a centralized testing laboratory, assess diagnostic yield and clinical utility, and increase genetics literacy among providers.
Results:
Here we present the results of the first two years of the program, demonstrating an overall diagnostic yield of 20.8%, including pathogenic sequence and copy number variants detected by next-generation sequencing panels. Routine follow-up testing of family members enabled the resolution of ambiguous findings. Post-test outcomes were collected as reported by the ordering clinicians, highlighting the clinical benefits of genetic testing.
Conclusion:
This programmatic approach to genetic testing in epilepsy by OEGTP, together with engagement of clinical and laboratory stakeholders, provided a unique opportunity to gather insight into province-wide implementation of a genetic testing program.
Objectives/Goals: The overall goal of this project is to determine bacterial transcriptional signatures from clinical sputum and assess their potential to monitor treatment response and predict the outcome of drug therapy in patients with tuberculosis (TB). Methods/Study Population: We are developing a novel transcript capture sequencing (TC-Seq) approach to sequence the mRNA of Mycobacterium tuberculosis (Mtb) and analyze transcriptomes from clinical samples containing minimal amounts of bacterial RNA. This protocol generates single-stranded biotinylated probes from Mtb DNA. Probes hybridize to Mtb-specific mRNA, allowing its enrichment within next-generation RNA sequencing libraries. We will apply TC-Seq to sputum samples collected throughout an 18-month Phase II clinical trial investigating response to TB treatment to compare the transcriptome of Mtb between patients whose treatment results in cure or relapse. Results/Anticipated Results: We have refined a technique to generate biotinylated probes starting from DNA of lab-grown Mtb. This protocol achieves robust and unbiased sampling of the Mtb transcriptome from mixed samples containing both human and Mtb RNA. Preliminary sequencing of clinical sputum collected pretreatment has generated 1–4 million Mtb-specific reads, a sequencing depth that allows examination of the entire bacterial transcriptome. We will measure differential gene expression before and during treatment as well as between cure and relapse cases. These results will allow us to characterize the bacterial response to treatment and identify bacterial markers that correlate with relapse. Discussion/Significance of Impact: Understanding Mtb activity during treatment will offer new ways to assess the efficacy of different treatment regimens. Crucially, identifying clear bacterial markers that demarcate a cure or relapse outcome will have a significant impact on determining patient eligibility for shorter drug therapy.
The marketing of unhealthy foods has been implicated in poor diet and rising levels of obesity. Rapid developments in the digital food marketing ecosystem and associated research mean that a contemporary review of the evidence is warranted. This preregistered (CRD42021233709) systematic review and meta-analysis aimed to provide an updated synthesis of the evidence for behavioural and health impacts of food marketing on both children and adults, using the 4Ps framework (Promotion, Product, Price, Place). Ten databases were searched from 2014 to 2021 for primary data articles of quantitative or mixed design, reporting on one or more outcome of interest following food marketing exposure compared with a relevant control. Reviews, abstracts, letters/editorials and qualitative studies were excluded. Eighty-two studies were included in the narrative review and twenty-three in the meta-analyses. Study quality (RoB2/Newcastle–Ottawa scale) was mixed. Studies examined ‘promotion’ (n 55), ‘product’ (n 17), ‘price’ (n 15) and ‘place’ (n 2) (some > 1 category). There is evidence of impacts of food marketing in multiple media and settings on outcomes, including increased purchase intention, purchase requests, purchase, preference, choice, and consumption in children and adults. Meta-analysis demonstrated a significant impact of food marketing on increased choice of unhealthy foods (OR = 2·45 (95 % CI 1·41, 4·27), Z = 3·18, P = 0·002, I2 = 93·1 %) and increased food consumption (standardised mean difference = 0·311 (95 % CI 0·185, 0·437), Z = 4·83, P < 0·001, I2 = 53·0 %). Evidence gaps were identified for the impact of brand-only and outdoor streetscape food marketing, and for data on the extent to which food marketing may contribute to health inequalities; such data, if available, would support UK and international public health policy development.
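For readers unfamiliar with the pooling step, a random-effects meta-analysis of odds ratios of the kind reported above can be sketched as follows, using the DerSimonian-Laird estimator; the input values are invented for illustration, not the review's data.

```python
import numpy as np

def dersimonian_laird(log_or, se):
    """Pool log odds ratios under a random-effects model (DerSimonian-Laird)."""
    w = 1.0 / se**2                               # inverse-variance weights
    fixed = np.sum(w * log_or) / np.sum(w)        # fixed-effect pooled estimate
    q = np.sum(w * (log_or - fixed) ** 2)         # Cochran's Q
    k = len(log_or)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)            # between-study variance
    w_star = 1.0 / (se**2 + tau2)                 # random-effects weights
    pooled = np.sum(w_star * log_or) / np.sum(w_star)
    i2 = 0.0 if q == 0 else max(0.0, (q - (k - 1)) / q) * 100.0  # heterogeneity %
    return np.exp(pooled), i2

# Invented per-study odds ratios and standard errors, for illustration only.
log_or = np.log(np.array([2.1, 3.0, 1.8, 2.6]))
se = np.array([0.30, 0.25, 0.40, 0.35])
print(dersimonian_laird(log_or, se))
```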
The primary purpose of this study was to assess perceived burdens and benefits of participating in implementation research among staff employed in resource-constrained healthcare settings. Another objective was to use findings to generate considerations for engaging staff in research across different phases of implementation research.
Methods:
This qualitative focus group and consensus building study involved researchers affiliated with the National Cancer Institute Implementation Science Centers in Cancer Control program and nine Community Health Centers (CHCs) in Massachusetts. Six focus groups (n = 3 with CHC staff; n = 3 with researchers) assessed barriers and facilitators to staff participation in implementation research. During consensus discussions, we used findings to develop considerations for engaging staff as participants and partners throughout phases of implementation research.
Results:
Sixteen researchers and 14 staff participated in separate focus groups; nine researchers and seven staff participated in separate consensus discussions. Themes emerged across participant groups in three domains: (1) influences on research participation; (2) research burdens and benefits; and (3) ways to facilitate staff participation in research. Practical considerations included: (a) aligning research with organizational and staff values and priorities; (b) applying user-centered design to research methods; (c) building organizational and individual research capacity; and (d) offering equitable incentives for staff participation.
Conclusions:
Engaging staff as participants and partners across different phases of implementation research requires knowledge of what contributes to research burdens and benefits, together with deliberate attention to addressing context-specific burdens and benefits.
Clinical research professionals (CRPs) are essential members of research teams serving in multiple job roles. However, recent turnover rates have reached crisis proportions, negatively impacting clinical trial metrics. Gaining an understanding of job satisfaction factors among CRPs working at academic medical centers (AMCs) can provide insights into retention efforts.
Materials/Methods:
A survey instrument was developed to measure key factors related to CRP job satisfaction and retention. The survey included 47 rating items in addition to demographic questions. An open-text question asked respondents to provide their top three factors for job satisfaction. The survey was distributed through listservs of three large AMCs. Here, we present a factor analysis of the instrument and quantitative and qualitative results of the subsequent survey.
Results:
A total of 484 CRPs responded to the survey. A principal components analysis with Varimax rotation was performed on the 47 rating items. The analysis resulted in seven key factors, and the survey instrument was reduced to 25 rating items. ‘Self-efficacy and pride in work’ was top ranked in the quantitative results; ‘work complexity and stress’ and ‘salary and benefits’ were top ranked in the qualitative findings. Opportunities for education and professional development were also themes in the qualitative data.
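A minimal sketch of the principal components analysis with Varimax rotation described above, again on hypothetical data, could use the factor_analyzer package; the file name and factor count shown are assumptions for illustration.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

ratings = pd.read_csv("crp_survey_items.csv")   # hypothetical: 47 rating items

pca = FactorAnalyzer(n_factors=7, rotation="varimax", method="principal")
pca.fit(ratings)

# Rotated loadings guide which items to retain for the shortened instrument.
loadings = pd.DataFrame(pca.loadings_, index=ratings.columns)
print(loadings.round(2))
```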
Discussion:
This study addresses the need for a tool to measure job satisfaction of CRPs. This tool may be useful for additional validation studies and research to measure the effectiveness of improvement initiatives to address CRP job satisfaction and retention.
Dysphagia is common in infants born with critical CHD. Thickened liquids are often used to treat dysphagia, but associated risks limit widespread use among feeding specialists. This survey aims to assess dysphagia treatment patterns and thickened liquid use across paediatric cardiac surgical centres.
Methods:
A 24-question, cross-sectional survey. Convenience and snowball sampling methods were used to engage 52 paediatric cardiac surgical centres affiliated with the Cardiac Newborn Neuroprotective Network. Descriptive statistics were used to analyse and compare responses.
Results:
Twenty-six individual respondents represented 21 unique paediatric cardiac surgical centres. Most responses were from experienced speech-language pathologists (78%) at medium-sized centres (88%). Ninety-three percent of responding centres used thickened liquids to treat dysphagia, and 81% did so only after formal instrumental assessment of swallowing. Thickened oral feedings were used by 85% of centres for single-ventricle patients versus 69% for two-ventricle patients. Barriers to recommending thickened oral feedings included the cost of thickening agents, parental non-adherence, and gastrointestinal concerns.
Conclusions:
This is the first survey to report multi-institutional dysphagia treatment practice variation at United States congenital cardiac surgical centres. Thickened oral feedings are frequently used across centres in high-risk critical CHD patients but treatment benefit remains unclear. This survey highlights a broad scientific community poised to direct dysphagia research in critical CHD to address practice variation, short- and long-term impact of thickened oral feeding on feeding outcomes, and barriers to use and access of thickening agents.
Accurate diagnosis of bipolar disorder (BPD) is difficult in clinical practice, with an average delay between symptom onset and diagnosis of about 7 years. A depressive episode often precedes the first manic episode, making it difficult to distinguish BPD from unipolar major depressive disorder (MDD).
Aims
We use genome-wide association analyses (GWAS) to identify differential genetic factors and to develop predictors based on polygenic risk scores (PRS) that may aid early differential diagnosis.
Method
Based on individual genotypes from case–control cohorts of BPD and MDD shared through the Psychiatric Genomics Consortium, we compile case–case–control cohorts, applying a careful quality control procedure. In a resulting cohort of 51 149 individuals (15 532 BPD patients, 12 920 MDD patients and 22 697 controls), we perform a variety of GWAS and PRS analyses.
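As background to the PRS analyses, a polygenic risk score is a weighted sum of risk-allele dosages, with the weights taken from GWAS summary statistics. A minimal sketch on simulated data follows; the cohort size, SNP count, and effect distribution are arbitrary stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_snps = 1_000, 5_000
dosages = rng.integers(0, 3, size=(n_people, n_snps)).astype(float)  # 0/1/2 alleles
betas = rng.normal(0.0, 0.01, n_snps)   # per-SNP effect sizes from a GWAS

prs = dosages @ betas                   # weighted sum: one score per person
prs = (prs - prs.mean()) / prs.std()    # standardise before use as a predictor
print(prs[:5])
```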
Results
Although our GWAS is not well powered to identify genome-wide significant loci, we find significant chip heritability and demonstrate the ability of the resulting PRS to distinguish BPD from MDD, including BPD cases with depressive onset (BPD-D). We replicate our PRS findings in an independent Danish cohort (iPSYCH 2015, N = 25 966). We observe strong genetic correlation between our case–case GWAS and that of case–control BPD.
Conclusions
We find that MDD and BPD, including BPD-D, are genetically distinct. Our findings support the view that controls, MDD patients, and BPD patients lie primarily on a continuum of genetic risk. Future studies with larger and richer samples will likely yield a better understanding of these findings and enable the development of better genetic predictors distinguishing BPD and, importantly, BPD-D from MDD.
This review aims to highlight the relative importance of lifestyle-associated risk factors for cardiovascular disease (CVD) among individuals with inflammatory bowel disease (IBD) and to examine the effectiveness of lifestyle interventions to improve these CVD risk factors. Adults with IBD are at higher risk of CVD due to systemic and gut inflammation. In addition, tobacco smoking, dyslipidaemia, hypertension, obesity, physical inactivity and poor diet can also increase CVD risk. Typical IBD behavioural modification, including food avoidance and reduced physical activity, as well as frequent corticosteroid use, can further increase CVD risk. We reviewed seven studies and found that there is insufficient evidence to draw conclusions about the effects of diet and/or physical activity interventions on CVD risk outcomes among populations with IBD. However, the limited findings suggest that people with IBD can adhere to a healthy diet or Mediterranean diet (for which there is most evidence) and safely participate in moderately intense aerobic and resistance training to potentially improve anthropometric risk factors. This review highlights the need for more robust controlled trials with larger sample sizes to assess and confirm the effects of lifestyle interventions to mitigate modifiable CVD risk factors among the IBD population.
Machine learning (ML) techniques have emerged as a powerful tool for predicting weather and climate systems. However, much of the progress to date focuses on predicting the short-term evolution of the atmosphere. Here, we look at the potential for ML methodology to predict the evolution of the ocean. The presence of land in the domain is a key difference between ocean modeling and previous work on atmospheric modeling. We train a convolutional neural network (CNN) to emulate a process-based General Circulation Model (GCM) of the ocean, in a configuration that contains land, and assess performance on predictions over the entire domain and near to land (coastal points). Our results show that the CNN replicates the underlying GCM well when assessed over the entire domain. RMS errors over the test dataset are low in comparison to the signal being predicted, and the CNN model gives an order of magnitude improvement over a persistence forecast. When we partition the domain into near-land points and the ocean interior and assess performance over these two regions, we see that the model performs notably worse over the near-land region, where RMS scores are comparable to those from a simple persistence forecast. Our results indicate that ocean interaction with land is something the network struggles with, and highlight that this may be an area where advanced ML techniques specifically designed for, or adapted to, the geosciences could bring further benefits.
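A minimal sketch of the evaluation described above, comparing prediction RMSE against a persistence baseline over near-land and interior points, might look as follows; the fields, grid size, and coastal mask are synthetic assumptions, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)
truth_t0 = rng.normal(size=(64, 64))                    # ocean field at time t
truth_t1 = truth_t0 + rng.normal(0, 0.1, (64, 64))      # field at time t+1
cnn_pred = truth_t1 + rng.normal(0, 0.05, (64, 64))     # stand-in for CNN output

near_land = np.zeros((64, 64), dtype=bool)
near_land[:, :4] = True                                 # hypothetical coastal strip

def rmse(pred, target, mask):
    return np.sqrt(np.mean((pred[mask] - target[mask]) ** 2))

persistence = truth_t0                                  # persistence: predict no change
for name, mask in [("interior", ~near_land), ("near land", near_land)]:
    print(name, rmse(cnn_pred, truth_t1, mask), rmse(persistence, truth_t1, mask))
```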
Protein circular dichroism (CD) and infrared absorbance (IR) spectra are widely used to estimate the secondary structure content of proteins in solution. A range of algorithms have been used for CD analysis (SELCON, CONTIN, CDsstr, SOMSpec), and some of these have been applied to IR data, though IR is more commonly analysed by bandfitting or statistical approaches. In this work we provide a Python version of SELCON3 and explore how to combine CD and IR data to best effect. We used CD data in Δε per amino acid residue and scaled the IR spectra to similar magnitudes. Normalising the IR amide I spectra to a maximum absorbance of 15 gives the best general performance. Combining CD and IR improves predictions for both helix and sheet by ~2% and helps identify anomalously large errors for high-helix proteins such as haemoglobin when using IR data alone, and for high-sheet proteins when using CD data alone.
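A minimal sketch of the preprocessing described above, scaling the IR amide I band to a maximum of 15 and concatenating it with CD data in Δε per residue, might look like this; the spectra and grid lengths are synthetic stand-ins, not the authors' data.

```python
import numpy as np

rng = np.random.default_rng(3)
cd = rng.normal(0, 5, 51)                   # CD spectrum in delta-epsilon per residue
ir = np.abs(rng.normal(0, 1, 100))          # IR amide I absorbance (arbitrary units)

ir_scaled = ir * (15.0 / ir.max())          # normalise amide I maximum to 15
combined = np.concatenate([cd, ir_scaled])  # joint input vector for the analysis
print(combined.shape)                       # (151,)
```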
There is a growing trend for studies run by academic and nonprofit organizations to have regulatory submission requirements. As a result, there is greater reliance on REDCap, an electronic data capture (EDC) system widely used by researchers in these organizations. This paper discusses the development and implementation of the Rapid Validation Process (RVP) developed by the REDCap Consortium, aimed at enhancing regulatory compliance and operational efficiency in response to the dynamic demands of modern clinical research. The RVP introduces a structured validation approach that categorizes REDCap functionalities, develops targeted validation tests, and applies structured and standardized testing syntax. This approach ensures that REDCap can meet regulatory standards while maintaining flexibility to adapt to new challenges. Results from the application of the RVP to recent successive REDCap software version releases illustrate significant improvements in testing efficiency and process optimization, demonstrating the project’s success in setting new benchmarks for EDC system validation. The project’s community-driven responsibility model fosters collaboration and knowledge sharing, and enhances the overall resilience and adaptability of REDCap. As REDCap continues to evolve based on feedback from clinical trialists, the RVP ensures that REDCap remains a reliable and compliant tool, ready to meet regulatory and future operational challenges.
The association between cannabis and psychosis is established, but the role of underlying genetics is unclear. We used data from the EU-GEI case-control study and UK Biobank to examine the independent and combined effect of heavy cannabis use and schizophrenia polygenic risk score (PRS) on risk for psychosis.
Methods
Genome-wide association study summary statistics from the Psychiatric Genomics Consortium and the Genomic Psychiatry Cohort were used to calculate schizophrenia and cannabis use disorder (CUD) PRS for 1,098 participants from the EU-GEI study and 143,600 from the UK Biobank. Both datasets had information on cannabis use.
Results
In both samples, schizophrenia PRS and cannabis use independently increased risk of psychosis. Schizophrenia PRS was not associated with patterns of cannabis use in the EU-GEI cases or controls or UK Biobank cases. It was associated with lifetime and daily cannabis use among UK Biobank participants without psychosis, but the effect was substantially reduced when CUD PRS was included in the model. In the EU-GEI sample, regular users of high-potency cannabis had the highest odds of being a case independently of schizophrenia PRS (adjusted OR for daily use of high-potency cannabis = 5.09, 95% CI 3.08–8.43, p = 3.21 × 10−10). We found no evidence of interaction between schizophrenia PRS and patterns of cannabis use.
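A minimal sketch of the interaction test reported above, regressing case status on schizophrenia PRS, a cannabis-use indicator, and their product term, might look like this; the data frame, sample size, and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "case": rng.binomial(1, 0.3, n),
    "scz_prs": rng.normal(size=n),
    "daily_high_potency": rng.binomial(1, 0.1, n),
})

fit = smf.logit("case ~ scz_prs * daily_high_potency", data=df).fit(disp=0)
# 'scz_prs:daily_high_potency' is the interaction coefficient; a null
# estimate is consistent with 'no evidence of interaction'.
print(fit.params)
```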
Conclusions
Regular use of high-potency cannabis remains a strong predictor of psychotic disorder independently of schizophrenia PRS, which does not seem to be associated with heavy cannabis use. These are important findings at a time of increasing use and potency of cannabis worldwide.