The recommended dietary allowance (RDA) for dietary protein is likely insufficient for individuals with cystic fibrosis (CF). This study sought to characterise protein intake and diet quality in adults with cystic fibrosis (awCF), before and after elexacaftor/tezacaftor/ivacaftor (ETI) therapy, compared with healthy controls. Dietary intake was assessed by diet diary in awCF at baseline (BL, n 40) and at follow-up > 3 months post ETI therapy (follow-up (FUP), n 40) and in age-matched healthy controls (CON, n 80) free from known disease at a single time point. Protein intake dose and daily distribution, protein quality, protein source and overall diet quality were calculated for each participant. Both CON (1·39 (sd 0·47) g·kg–1·day–1) and CF (BL: 1·44 (sd 0·52) g·kg–1·day–1, FUP: 1·12 (sd 0·32) g·kg–1·day–1) had a higher mean daily protein intake than the protein RDA of 0·75 g·kg–1·day–1. There was a significant reduction in daily protein intake in the CF group at FUP (P = 0·0003, d = 0·73), with levels falling below the alternative suggested dietary intake of ≥ 1·2 g·kg–1·day–1. There were no sex differences or noticeable effects on protein quality or source following the commencement of ETI therapy when compared with CON (all P > 0·05), although overall diet quality decreased between time points (P = 0·027, d = 0·57). The observed reduction in daily protein intake in the present cohort emphasises the importance of ensuring appropriate dietary protein intake to promote healthy ageing in adults with CF. More research is needed to establish an evidence base for dietary protein requirements in this at-risk population.
We sought to determine whether quality of life (QOL) in patients with subjective cognitive impairment (SCI) who performed normally on a neuropsychological battery significantly differed from QOL in those diagnosed with mild cognitive impairment (MCI), Alzheimer’s disease (AD) or non-Alzheimer’s dementia (non-AD) at initial assessment in a Rural and Remote Memory Clinic (RRMC).
Methods:
610 patients referred to our RRMC between 2004 and 2019 were included in this study. We compared self-reported and caregiver-reported patient QOL scores in those with SCI (n = 166) to those diagnosed with MCI (n = 98), AD (n = 228) and non-AD (n = 118).
Results:
Patients with SCI self-reported significantly lower QOL compared to patients with AD. Interestingly, the reverse was seen in caregivers: SCI caregivers rated patient QOL higher than AD caregivers. Patients with SCI also reported lower QOL than patients with MCI. SCI caregivers reported higher patient QOL than their non-AD counterparts. Caregiver-rated patient QOL was higher in those with MCI compared to AD. Patients with MCI self-reported higher QOL scores compared to patients with non-AD dementias. Similarly, MCI caregivers reported higher patient QOL than non-AD caregivers. No other comparisons were statistically significant.
Conclusion:
Although they lacked clinically significant cognitive deficits, patients with SCI self-reported significantly lower QOL than patients with MCI and AD. Conversely, caregiver-reported patient QOL was higher for patients with SCI than for patients with AD and non-AD. These findings indicate that SCI has a substantial impact on self-perceived QOL. More research is needed on how we can better support patients with SCI to improve their QOL.
Advances in artificial intelligence (AI) have great potential to help address societal challenges that are both collective in nature and present at national or transnational scale. Pressing challenges in healthcare, finance, infrastructure and sustainability, for instance, might all be productively addressed by leveraging and amplifying AI for national-scale collective intelligence. The development and deployment of this kind of AI faces distinctive challenges, both technical and socio-technical. Here, a research strategy for mobilising inter-disciplinary research to address these challenges is detailed and some of the key issues that must be faced are outlined.
Cannabis use and familial vulnerability to psychosis have been associated with social cognition deficits. This study examined the potential relationship between cannabis use and cognitive biases underlying social cognition and functioning in patients with first episode psychosis (FEP), their siblings, and controls.
Methods
We analyzed a sample of 543 participants with FEP, 203 siblings, and 1168 controls from the EU-GEI study using a correlational design. We used logistic regression analyses to examine the influence of clinical group, lifetime cannabis use frequency, and potency of cannabis use on cognitive biases, accounting for demographic and cognitive variables.
Results
FEP patients showed increased odds of facial recognition processing (FRP) deficits (OR = 1.642, CI 1.123–2.402) relative to controls but not of speech illusions (SI) or jumping to conclusions (JTC) bias, with no statistically significant differences relative to siblings. Daily and occasional lifetime cannabis use were associated with decreased odds of SI (OR = 0.605, CI 0.368–0.997 and OR = 0.646, CI 0.457–0.913, respectively) and JTC bias (OR = 0.625, CI 0.422–0.925 and OR = 0.602, CI 0.460–0.787, respectively) compared with lifetime abstinence, but not with FRP deficits, in the whole sample. Within the cannabis user group, low-potency cannabis use was associated with increased odds of SI (OR = 1.829, CI 1.297–2.578), FRP deficits (OR = 1.393, CI 1.031–1.882), and JTC bias (OR = 1.661, CI 1.271–2.171) relative to high-potency cannabis use, with comparable effects in the three clinical groups.
Conclusions
Our findings suggest increased odds of cognitive biases in FEP patients who have never used cannabis and in low-potency users. Future studies should elucidate this association and its potential implications.
In places where multiple related taxa are invasive and known to hybridize, it is important to have correct identifications to enable an appropriate legal, ecological, and management understanding of each kind of invader. Invasive knotweeds in the genus Reynoutria Houtt. are noxious weeds in Europe, North America, Africa, and Oceania, where they disrupt native plant communities and negatively impact human activities. Two species (Japanese knotweed [Reynoutria japonica Houtt.; syn.: Polygonum cuspidatum Siebold & Zucc.] and giant knotweed [Reynoutria sachalinensis (F. Schmidt ex Maxim.) Nakai; syn.: Polygonum sachalinense F. Schmidt ex Maxim.]) and their hybrid (known as Bohemian knotweed [Reynoutria ×bohemica Chrtek & Chrtková; syn.: Polygonum ×bohemicum (J. Chrtek & Chrtková) Zika & Jacobson [cuspidatum × sachalinense]]) have similar invasive tendencies, although there are some noted differences among them in their reproduction potential, ecological tolerance, and effect on native communities. Prior studies demonstrated that not only one kind of interspecific hybrid exists, but in fact there are at least four kinds that differ in the sequence variants they possess from each parent. Thus, in addition to identifying plants as hybrids, it may become important to distinguish each kind of hybrid when considering control or treatment strategies. In the current study, we expand the available genetic information for invasive Reynoutria by providing expanded DNA sequence data for the low-copy nuclear gene LEAFY, which has become important for characterizing hybrids. Our methods recover the same LEAFY genotypes that were identified previously for the commonly sequenced second intron, and we also provide sequence data for the first intron and second exon of the gene.
It is widely acknowledged that personal therapy positively contributes to the continued personal well-being and ongoing professional development of mental health professionals, including psychiatrists. As a result, most training bodies continue to recommend personal therapy to their trainees. Given its reported value and benefits, one might hypothesize that a high proportion of psychiatrists avail of personal therapy. This systematic review seeks to investigate whether this is the case.
Aim:
To identify and evaluate the findings derived from all available survey-based studies reporting quantitative data regarding psychiatrists’ and psychiatry trainees’ engagement in personal therapy.
Method:
A systematic search for survey-based studies about the use of personal therapy by psychiatric practitioners was conducted in four databases and platforms (PubMed, Scopus, Embase and EbscoHost) from inception to May 2022 following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Studies were assessed for quality using the quality assessment checklist for survey studies in psychology (Q-SSP) and findings summarized using narrative synthesis.
Results:
The proportion of trainees who engaged in personal therapy ranged from a low of 13.4% in a recent UK-based study to a high of 65.3% among Israeli residents. The proportion of fully qualified psychiatrists who engaged in personal therapy varied from 32.1% in South Korea to 89% in New Zealand.
Conclusion:
This review represents the first known attempt to collect and synthesize data aimed at providing insights into the past and current trends in psychiatrists’ use of personal therapy across different geographic regions and career stages.
Enteric bacterial infections are common among people who travel internationally. During 2017–2020, the Centers for Disease Control and Prevention investigated 41 multistate outbreaks of nontyphoidal Salmonella and Shiga toxin-producing Escherichia coli linked to international travel. Resistance to one or more antimicrobial agents was detected in at least 10% of isolates in 16 of 30 (53%) nontyphoidal Salmonella outbreaks and 8 of 11 (73%) Shiga toxin-producing E. coli outbreaks evaluated by the National Antimicrobial Resistance Monitoring System. In 14 nontyphoidal Salmonella outbreaks, at least 10% of isolates were resistant to one or more of the clinically significant antimicrobials used in human medicine. This report describes the epidemiology and antimicrobial resistance patterns of these travel-associated multistate outbreaks. Investigating illnesses among returned travellers and collaborating with international partners could result in the implementation of public health interventions to improve hygiene practices and food safety standards and to prevent illness and the spread of multidrug-resistant organisms domestically and internationally.
Cognitively healthy individuals who complete a neuropsychological test battery can obtain very low scores. These very low scores are not likely indicative of cognitive impairment but are rather considered spuriously low scores. The expected number of low scores varies based on number and type of neuropsychological tests. Typically, base rates have been determined from normative samples, which could differ from samples seen in clinical settings. The current study reports on base rates of spuriously low cognitive scores in older adults presenting to a memory clinic who were diagnosed with subjective cognitive impairment after interprofessional assessment and information from collateral informants ruled out objective cognitive impairment.
Participants and Methods:
Base rates of spuriously low scores for a neuropsychological battery of 12 scores were based on 92 cognitively healthy older adults presenting to a specialist memory clinic (M(age) = 61.00, SD = 12.00; M(edu) = 12.00, SD = 2.74). Crawford’s Monte Carlo simulation algorithm was used to estimate multivariate base rates by calculating the percentage of cognitively healthy memory clinic patients who produced age- and education-normed scores at or below the 5th percentile. The following tests were used to produce the 12 scores: block design, digit span backwards, and coding from the WAIS-IV; logical memory I and II from the WMS-IV; immediate and delayed memory scores from the California Verbal Learning Test, Second Edition (short form); immediate and delayed memory scores from the Brief Visuospatial Memory Test – Revised; category switching, letter number sequencing, and inhibition switching from the Delis–Kaplan Executive Function System.
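As a rough illustration of how such multivariate base rates can be estimated, the following Python sketch simulates correlated test scores and tabulates how often a cognitively healthy examinee would produce one or more low scores by chance. The 12-test battery size matches the study, but the uniform inter-test correlation of 0.4 and the simulation settings are illustrative assumptions; this is not Crawford’s published implementation or the clinic’s scoring pipeline.

```python
# Minimal sketch of a Monte Carlo multivariate base-rate estimate.
# Assumptions (hypothetical, for illustration only): 12 tests, all pairwise
# correlations fixed at 0.4, scores expressed as z-scores, "low" defined
# as at or below the 5th percentile (z <= -1.645).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

n_tests = 12
r = 0.4                      # assumed uniform inter-test correlation
n_sim = 100_000              # number of simulated "cognitively healthy" cases

# Build a compound-symmetric correlation matrix and simulate z-scores.
corr = np.full((n_tests, n_tests), r)
np.fill_diagonal(corr, 1.0)
scores = rng.multivariate_normal(mean=np.zeros(n_tests), cov=corr, size=n_sim)

# Count low scores (<= 5th percentile) per simulated case.
cutoff = norm.ppf(0.05)      # about -1.645
n_low = (scores <= cutoff).sum(axis=1)

# Multivariate base rates: % of healthy cases with >= k low scores.
for k in range(1, 6):
    print(f">= {k} low scores: {(n_low >= k).mean() * 100:.1f}%")
```

Replacing the assumed compound-symmetric matrix with the battery’s actual inter-test correlation matrix would bring such estimates closer to the rates reported below.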
Results:
An estimated 33.58% of the cognitively healthy memory clinic population would have one or more low scores (5th percentile cutoff), 14.7% would have two or more low scores, 6.55% would have three or more, 2.94% would have four or more, and 1.31% would have five or more very low scores due to chance.
Conclusions:
Determining base rates of spuriously low scores on a neuropsychological battery in a clinical sample of referred older adults with subjective memory complaints could assist in the diagnostic process. By understanding base rates in clinical samples, clinicians can use empirical data to adjust for expected low scores rather than relying on conventional corrections (such as expecting 1 in 20 test scores to be low). In a memory clinic sample, obtaining three or more low test scores out of 12 is expected to be relatively rare in those who were later determined to have no objective evidence of cognitive impairment based on interprofessional assessment. Understanding the normal frequency of low scores will prevent undue conclusions of cognitive impairment and thereby minimize false positives in diagnosis.
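For context, a back-of-the-envelope comparison with the conventional 1-in-20 expectation is instructive. If the 12 normed scores were independent, each with a 5% chance of falling at or below the 5th percentile, then

$$P(\geq 1 \text{ low score}) = 1 - 0.95^{12} \approx 0.46, \qquad P(\geq 3 \text{ low scores}) = 1 - \sum_{k=0}^{2}\binom{12}{k}(0.05)^{k}(0.95)^{12-k} \approx 0.02.$$

Because neuropsychological scores are positively correlated, low scores cluster within the same individuals, which is consistent with the rates reported above: fewer healthy examinees than the independence model predicts have at least one low score (33.58% v. roughly 46%), while more have three or more (6.55% v. roughly 2%). These figures are a worked illustration under a simplifying independence assumption, not an additional analysis from the study.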
Palmer amaranth (Amaranthus palmeri S. Watson) is the most problematic weed of cotton (Gossypium hirsutum L.)-cropping systems in the U.S. Southeast. Heavy reliance on herbicides has selected for resistance to multiple herbicide mechanisms of action. Effective management of this weed may require the integration of cultural practices that limit germination, establishment, and growth. Cover crops have been promoted as a cultural practice that targets these processes. We conducted a 2-yr study in Georgia, USA, to measure the effects of two annual cover crops (cereal rye [Secale cereale L.] and crimson clover [Trifolium incarnatum L.]), a perennial living mulch (‘Durana®’ white clover [Trifolium repens L.]), and a bare ground control on A. palmeri population dynamics. The study was conducted in the absence of herbicides. Growth stages were integrated into a basic demographic model to evaluate differences in population trajectories. The cereal rye and living mulch treatments suppressed weed seedling recruitment (seedlings seed−1) 19.2 and 13 times and 12 and 25 times more than the bare ground control, respectively. Low recruitment was positively correlated with low light transmission (photosynthetically active radiation [PAR]: above-canopy PAR/below-cover-crop PAR) at the soil surface. Recruitment rates were also negatively correlated with survival rates. Greater survival rates and reduced adult plant densities resulted in greater biomass (g plant−1) and fecundity (seeds plant−1) in the cereal rye and living mulch treatments in both years. The annual rate of population change (seeds seed−1) was equivalent across all treatments in the first year but was greater in the living mulch treatment in the second year. Our results highlight the potential of annual cover crops and living mulches to suppress A. palmeri seedling recruitment, and these practices would be valuable tools as part of an integrated weed management strategy.
The coronavirus disease 2019 (COVID-19) pandemic has demonstrated the importance of stewardship of viral diagnostic tests to aid infection prevention efforts in healthcare facilities. We highlight diagnostic stewardship lessons learned during the COVID-19 pandemic and discuss how diagnostic stewardship principles can inform management and mitigation of future emerging pathogens in acute-care settings. Diagnostic stewardship during the COVID-19 pandemic evolved as information regarding transmission (eg, routes, timing, and efficiency of transmission) became available. Diagnostic testing approaches varied depending on the availability of tests and when supplies and resources became available. Diagnostic stewardship lessons learned from the COVID-19 pandemic include the importance of prioritizing robust infection prevention mitigation controls above universal admission testing and considering preprocedure testing, contact tracing, and surveillance in the healthcare facility in certain scenarios. In the future, optimal diagnostic stewardship approaches should be tailored to specific pathogen virulence, transmissibility, and transmission routes, as well as disease severity, availability of effective treatments and vaccines, and timing of infectiousness relative to symptoms. This document is part of a series of papers developed by the Society for Healthcare Epidemiology of America on diagnostic stewardship in infection prevention and antibiotic stewardship.1
University students face substantial mental health challenges, as well as both attitudinal and structural barriers to seeking care. Embedding interventions in college courses is one solution. Acceptance and commitment therapy (ACT) is an ideal candidate intervention given its emphasis on values, context, and skill building from a transdiagnostic perspective. This study embedded a brief ACT intervention, delivered by trained but unlicensed graduate students, in a required freshman seminar. Of two class sessions of the freshman seminar taught by the same instructor, one session was randomly assigned to receive the course as usual, and one session received the ACT intervention. ACT content was delivered to all students in the intervention course over five consecutive weekly class periods. Students in both classes who chose to participate in the study completed assessments before and after the intervention and at follow-up. No statistically significant changes were found, including with non-parametric tests used given the small sample sizes. Descriptively, the intervention group had slight improvements in wellbeing and mindfulness and decreases in distress, whereas the control group had worsened wellbeing, mindfulness and distress. A moderate proportion of intervention group students enjoyed the intervention and indicated use of ACT skills, particularly mindfulness. Results suggest that this classroom-based intervention was feasible and acceptable, but further study should occur given the small sample sizes. Future work should continue course-based ACT interventions, and should also explore potential applications of student training to deliver interventions given the shortage of mental health providers on college campuses.
Key learning aims
(1) Can acceptance and commitment therapy content and skills be integrated into an existing freshman seminar curriculum?
(2) Can acceptance and commitment therapy improve wellbeing and decrease distress amongst college students?
(3) How will students engage with and practise acceptance and commitment therapy skills outside of the context of session delivery?
Methamphetamine and cannabis are two widely used, and frequently co-used, substances with possibly opposing effects on the central nervous system. Evidence of neurocognitive deficits related to use is robust for methamphetamine and mixed for cannabis. Findings regarding their combined use are inconclusive. We aimed to compare neurocognitive performance in people with lifetime cannabis or methamphetamine use disorder diagnoses, or both, relative to people without substance use disorders.
Method:
423 participants (71.9% male, aged 44.6 ± 14.2 years), stratified by presence or absence of lifetime methamphetamine (M−/M+) and/or cannabis (C−/C+) DSM-IV abuse/dependence, completed a comprehensive neuropsychological, substance use, and psychiatric assessment. Neurocognitive domain T-scores and impairment rates were examined using multiple linear and binomial regression, respectively, controlling for covariates that may impact cognition.
Results:
Globally, M+C+ performed worse than M−C− but better than M+C−. M+C+ outperformed M+C− on measures of verbal fluency, information processing speed, learning, memory, and working memory. M−C+ did not display lower performance than M−C− globally or on any domain measures, and M−C+ even performed better than M−C− on measures of learning, memory, and working memory.
Conclusions:
Our findings are consistent with prior work showing that methamphetamine use confers risk for worse neurocognitive outcomes, and that cannabis use does not appear to exacerbate and may even reduce this risk. People with a history of cannabis use disorders performed similarly to our non-substance-using comparison group and outperformed them in some domains. These findings warrant further investigation as to whether cannabis use may ameliorate methamphetamine neurotoxicity.
Tobacco is a highly prevalent substance of abuse in patients with psychosis. Previous studies have reported an association between tobacco use and schizophrenia. The aim of this study was to analyze the relationship between tobacco use and first-episode psychosis (FEP), age at onset of psychosis, and specific diagnosis of psychosis.
Methods
The sample consisted of 1105 FEP patients and 1355 controls from the European Network of National Schizophrenia Networks Studying Gene–Environment Interactions (EU-GEI) study. We assessed substance use with the Tobacco and Alcohol Questionnaire and performed a series of regression analyses using case-control status, age of onset of psychosis, and diagnosis as outcomes and tobacco use and frequency of tobacco use as predictors. Analyses were adjusted for sociodemographic characteristics, alcohol, and cannabis use.
Results
After controlling for cannabis use, FEP patients were 2.6 times more likely to use tobacco [p ⩽ 0.001; adjusted odds ratio (AOR) 2.6; 95% confidence interval (CI) [2.1–3.2]] and 1.7 times more likely to smoke 20 or more cigarettes a day (p = 0.003; AOR 1.7; 95% CI [1.2–2.4]) than controls. Tobacco use was associated with an earlier age at psychosis onset (β = −2.3; p ⩽ 0.001; 95% CI [−3.7 to −0.9]) and was 1.3 times more frequent in FEP patients with a diagnosis of schizophrenia than in other diagnoses of psychosis (AOR 1.3; 95% CI [1.0–1.8]); however, these results were no longer significant after controlling for cannabis use.
Conclusions
Tobacco and heavy-tobacco use are associated with increased odds of FEP. These findings further support the relevance of tobacco prevention in young populations.
To evaluate the impact of implementing clinical decision support (CDS) tools for outpatient antibiotic prescribing in the emergency department (ED) and clinic settings.
Design:
We performed a before-and-after, quasi-experimental study that employed an interrupted time-series analysis.
Setting:
The study institution was a quaternary, academic referral center in Northern California.
Participants:
We included prescriptions for patients in the ED and 21 primary-care clinics within the same health system.
Intervention:
We implemented a CDS tool for azithromycin on March 1, 2020, and a CDS tool for fluoroquinolones (FQs; ie, ciprofloxacin, levofloxacin, and moxifloxacin) on November 1, 2020. The CDS added friction to inappropriate ordering workflows while providing health information technology (HIT) features that made recommended actions easy to perform. The primary outcome was the number of monthly prescriptions for each antibiotic type, by implementation period (before vs after).
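For readers unfamiliar with interrupted time-series analysis, the segmented-regression sketch below illustrates the general structure of such a model in Python: a term for the underlying monthly trend, an indicator for the post-implementation period (the immediate level change), and a term for the change in slope after implementation. The data, column names, and the choice of a Poisson model are illustrative assumptions, not the study’s actual analysis.

```python
# Sketch of a segmented (interrupted time-series) regression on hypothetical
# monthly prescription counts. Column names and data are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# 24 months of made-up counts; the CDS is "implemented" at month 12.
months = np.arange(24)
post = (months >= 12).astype(int)
counts = rng.poisson(lam=np.where(post == 1, 60, 100))

df = pd.DataFrame({
    "month": months,                                  # underlying secular trend
    "post": post,                                     # 1 after CDS implementation
    "months_since": np.clip(months - 12, 0, None),    # post-implementation slope
    "rx_count": counts,
})

# Poisson model: exp(coef) gives rate ratios, e.g. exp(post) is the immediate
# level change and exp(months_since) the change in monthly trend.
model = smf.poisson("rx_count ~ month + post + months_since", data=df).fit()
print(np.exp(model.params))       # rate ratios
print(np.exp(model.conf_int()))   # 95% CIs on the rate-ratio scale
```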
Results:
Immediately after azithromycin-CDS implementation, monthly rates of azithromycin prescribing decreased significantly in both the ED (−24%; 95% CI, −37% to −10%; P < .001) and outpatient clinics (−47%; 95% CI, −56% to −37%; P < .001). In the first month after FQ-CDS implementation in the clinics, there was no significant drop in ciprofloxacin prescriptions; however, there was a significant decrease in ciprofloxacin prescriptions over time (−5% per month; 95% CI, −6% to −3%; P < .001), suggesting a delayed effect of the CDS.
Conclusion:
Implementing CDS tools was associated with an immediate decrease in azithromycin prescriptions, in both the ED and clinics. CDS may serve as a valuable adjunct to existing antimicrobial stewardship programs.
While unobscured and radio-quiet active galactic nuclei are regularly being found at redshifts $z > 6$, their obscured and radio-loud counterparts remain elusive. We build upon our successful pilot study, presenting a new sample of low-frequency-selected candidate high-redshift radio galaxies (HzRGs) over a sky area 20 times larger. We have refined our selection technique, in which we select sources with curved radio spectra between 72 and 231 MHz from the GaLactic and Extragalactic All-sky Murchison Widefield Array (GLEAM) survey. In combination with the requirements that our GLEAM-selected HzRG candidates have compact radio morphologies and be undetected in near-infrared $K_{\rm s}$-band imaging from the Visible and Infrared Survey Telescope for Astronomy Kilo-degree Infrared Galaxy (VIKING) survey, we find 51 new candidate HzRGs over a sky area of approximately $1200\ \mathrm{deg}^2$. Our sample also includes two sources from the pilot study: the second-most distant radio galaxy currently known, at $z=5.55$, with another source potentially at $z \sim 8$. We present our refined selection technique and analyse the properties of the sample. We model the broadband radio spectra between 74 MHz and 9 GHz by supplementing the GLEAM data with both publicly available data and new observations from the Australia Telescope Compact Array at 5.5 and 9 GHz. In addition, deep $K_{\rm s}$-band imaging from the High-Acuity Widefield K-band Imager (HAWK-I) on the Very Large Telescope and from the Southern Herschel Astrophysical Terahertz Large Area Survey Regions $K_{\rm s}$-band Survey (SHARKS) is presented for five sources. We discuss the prospects of finding very distant radio galaxies in our sample, potentially within the epoch of reionisation at $z \gtrsim 6.5$.
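Curved low-frequency radio spectra of the kind used for this selection are often modelled with a generic curved power law, $S_\nu = S_{\nu_0} (\nu/\nu_0)^{\alpha} \exp\!\left[q (\ln \nu/\nu_0)^2\right]$, where $\alpha$ is the spectral index and $q$ quantifies the curvature. The short Python sketch below fits that form to hypothetical flux densities with scipy; the functional form, data values and reference frequency are illustrative assumptions, not the modelling choices or measurements of this study.

```python
# Sketch: fit a generic curved power law to hypothetical flux densities.
# S(nu) = S0 * (nu/nu0)**alpha * exp(q * ln(nu/nu0)**2), nu in MHz, S in Jy.
import numpy as np
from scipy.optimize import curve_fit

NU0 = 150.0  # reference frequency in MHz (arbitrary choice for this sketch)

def curved_power_law(nu, s0, alpha, q):
    x = np.log(nu / NU0)
    return s0 * (nu / NU0) ** alpha * np.exp(q * x ** 2)

# Made-up flux densities spanning roughly the GLEAM band up to 9 GHz.
nu = np.array([76, 107, 151, 227, 887, 1400, 5500, 9000], dtype=float)  # MHz
s = np.array([0.62, 0.78, 0.80, 0.74, 0.35, 0.25, 0.08, 0.05])          # Jy
s_err = 0.1 * s                                                          # assumed 10% errors

popt, pcov = curve_fit(curved_power_law, nu, s, p0=[0.8, -0.8, -0.2], sigma=s_err)
perr = np.sqrt(np.diag(pcov))
print(f"S0 = {popt[0]:.2f} Jy, alpha = {popt[1]:.2f}, q = {popt[2]:.2f}")
print(f"1-sigma uncertainties: {perr}")
```

A negative fitted $q$ indicates a convex (peaked) spectrum, the signature exploited when selecting candidates with curved spectra at low frequencies.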
Patients with Fontan physiology may require non-cardiac surgery. Our objectives were to characterise perioperative outcomes of patients with Fontan physiology undergoing non-cardiac surgery and to identify characteristics that predict discharge on the same day.
Materials and Methods:
Children and young adults with Fontan physiology who underwent a non-cardiac surgery or an imaging study under anaesthesia between 2013 and 2019 at a single-centre academic children’s hospital were reviewed in a retrospective observational study. Continuous variables were compared using the Wilcoxon rank sum test, and categorical variables were analysed using the Chi-square test or Fisher’s exact test. Multivariable logistic regression analysis results are presented by adjusted odds ratios with 95% confidence intervals and p values.
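For readers less familiar with how adjusted odds ratios and their confidence intervals are derived, the sketch below shows a standard approach with a multivariable logistic regression in Python: fit the model on the binary outcome (same-day discharge), then exponentiate the coefficients and their confidence limits. The predictor names mirror those reported in the Results, but the data frame, column names and code are illustrative, not the study’s analysis.

```python
# Sketch: adjusted odds ratios (AOR) with 95% CIs from a multivariable
# logistic regression. 'df' is assumed to hold one row per procedure with
# the (hypothetical) columns named below; the study data are not reproduced.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_same_day_discharge_model(df: pd.DataFrame):
    """Fit a logistic model for same-day discharge and return AORs with CIs."""
    model = smf.logit(
        "same_day_discharge ~ chronic_condition_index + major_procedure"
        " + intraop_inotropes + preop_admission",
        data=df,
    ).fit()
    aor = np.exp(model.params)      # adjusted odds ratios
    ci = np.exp(model.conf_int())   # 95% confidence intervals on the OR scale
    ci.columns = ["2.5%", "97.5%"]
    return pd.concat([aor.rename("AOR"), ci], axis=1), model.pvalues

# Example usage with purely synthetic data (illustrative only):
rng = np.random.default_rng(1)
n = 344
df = pd.DataFrame({
    "chronic_condition_index": rng.poisson(3, n),
    "major_procedure": rng.integers(0, 2, n),
    "intraop_inotropes": rng.integers(0, 2, n),
    "preop_admission": rng.integers(0, 2, n),
})
logit_p = 0.5 - 0.1 * df.chronic_condition_index - 1.5 * df.major_procedure
df["same_day_discharge"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
print(fit_same_day_discharge_model(df)[0])
```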
Results:
182 patients underwent 344 non-cardiac procedures with anaesthesia. The median age was 11 years (IQR 5.2–18), and 56.4% were male. General anaesthesia was administered in 289 procedures (84%). 125 patients (36.3%) were discharged on the same day. On multivariable analysis, independent predictors that reduced the odds of same-day discharge included the chronic condition index (OR 0.91 per additional chronic condition, 95% CI 0.76–0.98, p = 0.022), undergoing a major surgical procedure (OR 0.17, 95% CI 0.05–0.64, p = 0.009), the use of intraoperative inotropes (OR 0.48, 95% CI 0.25–0.94, p = 0.031), and preoperative admission (OR 0.24, 95% CI 0.1–0.57, p = 0.001).
Discussion:
In a contemporary cohort of paediatric and young adult patients with Fontan physiology, 36.3% were discharged on the same day as their non-cardiac procedure. Well-selected patients with Fontan physiology can undergo anaesthesia without complications and be discharged the same day.
Gene × environment (G×E) interactions, i.e. genetic modulation of sensitivity to environmental factors and/or environmental control of gene expression, have not been reliably established regarding the aetiology of psychotic disorders. Moreover, recent studies have shown associations between polygenic risk scores for schizophrenia (PRS-SZ) and some risk factors of psychotic disorders, challenging the traditional gene v. environment dichotomy. In the present article, we studied the role of G×E interaction between psychosocial stressors (childhood trauma, stressful life events, self-reported discrimination experiences and low social capital) and the PRS-SZ on subclinical psychosis in a population-based sample.
Methods
Data were drawn from the European Network of National Schizophrenia Networks Studying Gene-Environment Interactions (EU-GEI) study, in which subjects without psychotic disorders were recruited in six countries. The sample was restricted to subjects of European descent (n = 706). Subclinical dimensions of psychosis (positive, negative, and depressive) were measured with the Community Assessment of Psychic Experiences (CAPE) scale. Associations between the PRS-SZ and the psychosocial stressors were tested. For each dimension, gene-environment interactions were assessed using linear models, comparing the explained variances of ‘Genetic’ models (fitted with the PRS-SZ only), ‘Environmental’ models (fitted with each environmental stressor only), ‘Independent’ models (fitted with the PRS-SZ and each environmental factor), and ‘Interaction’ models (Independent models plus an interaction term between the PRS-SZ and each environmental factor). Likelihood ratio tests (LRT) compared the fit of the different models.
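The model-comparison strategy described above amounts to fitting a series of nested linear models for each CAPE dimension and testing whether adding the PRS-SZ, the environmental stressor, or their product improves fit. A minimal sketch of that workflow in Python (statsmodels) is given below; the column names and the hand-rolled likelihood-ratio test are illustrative assumptions about how such a comparison could be coded, not the EU-GEI analysis scripts.

```python
# Sketch: compare 'Genetic', 'Environmental', 'Independent' and 'Interaction'
# models for one subclinical dimension with likelihood-ratio tests (LRT).
# 'df' is assumed to contain the (hypothetical) columns used below.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

def lrt(reduced, full):
    """Likelihood-ratio test between two nested fitted models."""
    stat = 2 * (full.llf - reduced.llf)
    ddof = int(full.df_model - reduced.df_model)
    return stat, chi2.sf(stat, ddof)

def compare_models(df: pd.DataFrame, stressor: str, outcome: str = "cape_positive"):
    genetic     = smf.ols(f"{outcome} ~ prs_sz", data=df).fit()
    environment = smf.ols(f"{outcome} ~ {stressor}", data=df).fit()
    independent = smf.ols(f"{outcome} ~ prs_sz + {stressor}", data=df).fit()
    interaction = smf.ols(f"{outcome} ~ prs_sz * {stressor}", data=df).fit()

    return {
        "R2": {name: round(fit.rsquared, 4) for name, fit in
               [("genetic", genetic), ("environment", environment),
                ("independent", independent), ("interaction", interaction)]},
        "independent_vs_environment_LRT": lrt(environment, independent),
        "interaction_vs_independent_LRT": lrt(independent, interaction),
    }

# Example: compare_models(df, stressor="childhood_trauma")
```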
Results
There were no gene-environment associations. The PRS-SZ was associated with the positive dimension (β = 0.092, R2 = 7.50%), and most psychosocial stressors were associated with all three subclinical psychotic dimensions (except social capital, which was not associated with the positive dimension). Concerning the positive dimension, Independent models fitted better than Environmental and Genetic models. No significant G×E interaction was observed for any dimension.
Conclusions
This study in subjects without psychotic disorders suggests that (i) the aetiological continuum hypothesis could particularly concern the positive dimension of subclinical psychosis, (ii) genetic and environmental factors have independent effects on the level of this positive dimension, and (iii) interactions between genetic and individual environmental factors could not be identified in this sample.