Characterizing the structure and composition of clay minerals on the surface of Mars is important for reconstructing past aqueous processes and environments. Data from the CheMin X-ray diffraction (XRD) instrument on the Mars Science Laboratory Curiosity rover demonstrate a ubiquitous presence of collapsed smectite (basal spacing of 10 Å) in ~3.6-billion-year-old lacustrine mudstone in Gale crater, except for expanded smectite (basal spacing of 13.5 Å) at the base of the stratigraphic section in a location called Yellowknife Bay. Hypotheses to explain expanded smectite include partial chloritization by Mg(OH)2 or solvation-shell H2O molecules associated with interlayer Mg2+. The objective of this work is to test these hypotheses by measuring partially chloritized and Mg-saturated smectite using laboratory instruments that are analogous to those on Mars rovers and orbiters. This work presents Mars-analog XRD, evolved gas analysis (EGA), and visible/shortwave-infrared (VSWIR) data from three smectite standards that were Mg-saturated and partially and fully chloritized with Mg(OH)2. Laboratory data are compared with XRD and EGA data collected from Yellowknife Bay by the Curiosity rover to examine whether the expanded smectite can be explained by partial chloritization and what this implies about the diagenetic history of Gale crater. Spectral signatures of partial chloritization by hydroxy-Mg are investigated that may allow the identification of partially chloritized smectite in Martian VSWIR reflectance spectra collected from orbit or in situ by the SuperCam instrument suite on the Mars 2020 Perseverance rover. Laboratory XRD and EGA data of partially chloritized saponite are consistent with data collected from Curiosity. 
The presence of partially chloritized (with Mg(OH)2) saponite in Gale crater suggests brief interactions between diagenetic alkaline Mg2+-bearing fluids and some of the mudstone exposed at Yellowknife Bay, but not in other parts of the stratigraphic section. The location of Yellowknife Bay at the base of the stratigraphic section may explain the presence of alkaline Mg2+-bearing fluids here but not in other areas of Gale crater investigated by Curiosity. Early diagenetic fluids may have had a sufficiently long residence time in a closed system to equilibrate with basaltic minerals, creating an elevated pH, whereas diagenetic environments higher in the section may have been in an open system, therefore preventing fluid pH from becoming alkaline.
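The basal spacings quoted above map directly onto diffraction peak positions through Bragg's law, which is how an XRD instrument like CheMin separates collapsed from expanded smectite. A minimal sketch, assuming a first-order (001) reflection and Co Kα radiation at roughly 1.79 Å (the anode CheMin uses; the exact wavelength here is approximate):

```python
import math

def two_theta_deg(d_angstrom: float, wavelength: float = 1.79) -> float:
    """Diffraction angle 2-theta (degrees) for basal spacing d via Bragg's law:
    n*lambda = 2*d*sin(theta), taking the first-order reflection (n = 1)."""
    theta = math.asin(wavelength / (2.0 * d_angstrom))
    return 2.0 * math.degrees(theta)

# Collapsed smectite (001) at ~10 A vs expanded smectite at ~13.5 A:
collapsed = two_theta_deg(10.0)   # ~10.3 degrees 2-theta
expanded = two_theta_deg(13.5)    # ~7.6 degrees 2-theta
```

Because the 10 Å and 13.5 Å phases sit several degrees apart in 2θ, the collapsed and expanded interlayer states produce clearly distinguishable basal peaks.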
It remains unclear which individuals with subthreshold depression benefit most from psychological intervention, and what long-term effects this has on symptom deterioration, response and remission.
Aims
To synthesise psychological intervention benefits in adults with subthreshold depression up to 2 years, and explore participant-level effect-modifiers.
Method
Randomised trials comparing psychological intervention with inactive control were identified via systematic search. Authors were contacted to obtain individual participant data (IPD), analysed using Bayesian one-stage meta-analysis. Treatment–covariate interactions were added to examine moderators. Hierarchical-additive models were used to explore treatment benefits conditional on baseline Patient Health Questionnaire 9 (PHQ-9) values.
Results
IPD from 10 671 individuals (50 studies) were included. We found significant effects on depressive symptom severity up to 12 months (standardised mean-difference [s.m.d.] = −0.48 to −0.27). Effects could not be ascertained up to 24 months (s.m.d. = −0.18). Similar findings emerged for 50% symptom reduction (relative risk = 1.27–2.79), reliable improvement (relative risk = 1.38–3.17), deterioration (relative risk = 0.67–0.54) and close-to-symptom-free status (relative risk = 1.41–2.80). Among participant-level moderators, only initial depression and anxiety severity were highly credible (P > 0.99). Predicted treatment benefits decreased with lower symptom severity but remained minimally important even for very mild symptoms (s.m.d. = −0.33 for PHQ-9 = 5).
Conclusions
Psychological intervention reduces the symptom burden in individuals with subthreshold depression up to 1 year, and protects against symptom deterioration. Benefits up to 2 years are less certain. We find strong support for intervention in subthreshold depression, particularly with PHQ-9 scores ≥ 10. For very mild symptoms, scalable treatments could be an attractive option.
This paper develops a unified approach, based on ranks, to the statistical analysis of data arising from complex experimental designs. In this way we answer a major objection to the use of rank procedures as a major methodology in data analysis. We show that the rank procedures, including testing, estimation and multiple comparisons, are generated in a natural way from a robust measure of scale. The rank methods closely parallel the familiar methods of least squares, so that estimates and tests have natural interpretations.
We present radio observations of the galaxy cluster Abell S1136 at 888 MHz, using the Australian Square Kilometre Array Pathfinder radio telescope, as part of the Evolutionary Map of the Universe Early Science program. We compare these findings with data from the Murchison Widefield Array, XMM-Newton, the Wide-field Infrared Survey Explorer, the Digitised Sky Survey, and the Australia Telescope Compact Array. Our analysis shows the X-ray and radio emission in Abell S1136 are closely aligned and centered on the Brightest Cluster Galaxy, while the X-ray temperature profile shows a relaxed cluster with no evidence of a cool core. We find that the diffuse radio emission in the centre of the cluster shows more structure than seen in previous low-resolution observations of this source, which appeared formerly as an amorphous radio blob, similar in appearance to a radio halo; our observations show the diffuse emission in the Abell S1136 galaxy cluster contains three narrow filamentary structures visible at 888 MHz, between $\sim$80 and 140 kpc in length; however, the properties of the diffuse emission do not fully match that of a radio (mini-)halo or (fossil) tailed radio source.
Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine if women and men differ in their vulnerability to risk factors for PTSD.
Methods
As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri- and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men.
Results
Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that for five of the 16 examined risk factors the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects.
Conclusions
Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. Analyses did not identify any risk factors to which women were more vulnerable than men, pointing toward further mechanisms to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
Comprehensive studies examining longitudinal predictors of dietary change during the coronavirus disease 2019 pandemic are lacking. Based on an ecological framework, this study used longitudinal data to test if individual, social and environmental factors predicted change in dietary intake during the peak of the coronavirus disease 2019 pandemic in Los Angeles County and examined interactions among the multilevel predictors.
Design:
We analysed two survey waves (i.e. baseline and follow-up) of the Understanding America Study, administered online to the same participants 3 months apart. The surveys assessed dietary intake and individual, social, and neighbourhood factors potentially associated with diet. Lagged multilevel regression models were used to predict change from baseline to follow-up in daily servings of fruits, vegetables and sugar-sweetened beverages.
Setting:
Data were collected in October 2020 and January 2021, during the peak of the coronavirus disease 2019 pandemic in Los Angeles County.
Participants:
903 adults representative of Los Angeles County households.
Results:
Individuals who had depression and less education or who identified as non-Hispanic Black or Hispanic reported unhealthy dietary changes over the study period. Individuals with smaller social networks, especially low-income individuals with smaller networks, also reported unhealthy dietary changes. After accounting for individual and social factors, neighbourhood factors were generally not associated with dietary change.
Conclusions:
Given poor diets are a leading cause of death in the USA, addressing ecological risk factors that put some segments of the community at risk for unhealthy dietary changes during a crisis should be a priority for health interventions and policy.
OBJECTIVES/GOALS: Contingency management (CM) procedures yield measurable reductions in cocaine use. This poster describes a trial aimed at using CM as a vehicle to show the biopsychosocial health benefits of reduced use, rather than total abstinence, the currently accepted metric for treatment efficacy. METHODS/STUDY POPULATION: In this 12-week, randomized controlled trial, CM was used to reduce cocaine use and evaluate associated improvements in cardiovascular, immune, and psychosocial well-being. Adults aged 18 and older who sought treatment for cocaine use (N=127) were randomized into three groups in a 1:1:1 ratio: High Value ($55) or Low Value ($13) CM incentives for cocaine-negative urine samples or a non-contingent control group. They completed outpatient sessions three days per week across the 12-week intervention period, totaling 36 clinic visits and four post-treatment follow-up visits. During each visit, participants provided observed urine samples and completed several assays of biopsychosocial health. RESULTS/ANTICIPATED RESULTS: Preliminary findings from generalized linear mixed-effects modeling demonstrate the feasibility of the CM platform. Abstinence rates from cocaine use were significantly greater in the High Value group (47% negative; OR = 2.80; p = 0.01) relative to the Low Value (23% negative) and Control groups (24% negative). In the planned primary analysis, the level of cocaine use reduction based on cocaine-negative urine samples will serve as the primary predictor of cardiovascular (e.g., endothelin-1 levels), immune (e.g., IL-10 levels) and psychosocial (e.g., Addiction Severity Index) outcomes using results from the fitted models. DISCUSSION/SIGNIFICANCE: This research will advance the field by prospectively and comprehensively demonstrating the beneficial effects of reduced cocaine use. These outcomes can, in turn, support the adoption of reduced cocaine use as a viable alternative endpoint in cocaine treatment trials.
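The abstinence comparison above is an odds ratio computed from a 2×2 table. A minimal sketch; the counts below are illustrative only, chosen to reproduce an OR near 2.8, and are not the trial's actual cell counts:

```python
def odds_ratio(events_a, n_a, events_b, n_b):
    """Odds ratio: odds of the event (e.g. a cocaine-negative urine sample)
    in group A relative to group B."""
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

# Illustrative counts only (47% vs 24% negative in equal-sized groups):
or_high_vs_control = odds_ratio(47, 100, 24, 100)  # ~2.81
```

In the trial itself this ratio would come from the fitted generalized linear mixed-effects model, which adjusts for repeated measures, rather than from raw counts.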
This study describes the local Emergency Medical Services (EMS) response and patient encounters corresponding to the civil unrest occurring over a four-day period in Spring 2020 in Indianapolis, Indiana (USA).
Methods:
This study describes the non-conventional EMS response to civil unrest. The study included patients encountered by EMS in the area of the civil unrest occurring in Indianapolis, Indiana from May 29 through June 1, 2020. The area of civil unrest defined by Indianapolis Metropolitan Police Department covered 15 blocks by 12 blocks (roughly 4.0 square miles) and included central Indianapolis. The study analyzed records and collected demographics, scene times, interventions, dispositions, EMS clinician narratives, transport destinations, and hospital course with outcomes from receiving hospitals for patients extracted from the area of civil unrest by EMS.
Results:
Twenty-nine patients were included with ages ranging from two to sixty-eight years. In total, EMS transported 72.4% (21 of 29) of the patients, with the remainder declining transport. Ballistic injuries from gun violence accounted for 10.3% (3 of 29) of injuries. Two additional fatalities from penetrating trauma occurred among patients without EMS contact within and during the civil unrest. Conditions not involving trauma occurred in 37.9% (11 of 29). Among transported patients, 33.3% (7 of 21) were admitted to the hospital and there was one fatality.
Conclusions:
While most EMS transports did not result in hospitalization, it is important to note that the majority of EMS calls did result in a transport. There was a substantial number of non-traumatic patient encounters. Trauma in many of the encounters was relatively severe, and the findings imply the need for rapid extraction methods from dangerous areas to facilitate timely in-hospital stabilization.
There is increasing recognition of cognitive and pathological heterogeneity in early-stage Alzheimer’s disease and other dementias. Data-driven approaches have demonstrated cognitive heterogeneity in those with mild cognitive impairment (MCI), but few studies have examined this heterogeneity and its association with progression to MCI/dementia in cognitively unimpaired (CU) older adults. We identified cluster-derived subgroups of CU participants based on comprehensive neuropsychological data and compared baseline characteristics and rates of progression to MCI/dementia or a Dementia Rating Scale (DRS) of <129 across subgroups.
Participants and Methods:
A hierarchical cluster analysis was conducted using 11 baseline neuropsychological test scores from 365 CU participants in the UCSD Shiley-Marcos Alzheimer’s Disease Research Center (age M=71.93 years, SD=7.51; 55.9% women; 15.6% Hispanic/Latino/a/x/e). A discriminant function analysis was then conducted to test whether the individual neuropsychological scores predicted cluster-group membership. Cox regressions examined the risk of progression to consensus diagnosis of MCI or dementia, or to DRS score <129, by cluster group.
Results:
Cluster analysis identified 5 groups: All-Average (n=139), Low-Visuospatial (n=46), Low-Executive (n=51), Low-Memory/Language (n=83), and Low-All Domains (n=46). The discriminant function analysis using the neuropsychological measures to predict group membership into these 5 clusters correctly classified 85.2% of the participants. Subgroups had unique demographic and clinical characteristics. Relative to the All-Average group, the Low-Visuospatial (hazard ratio [HR] 2.39, 95% CI [1.03, 5.56], p=.044), Low-Memory/Language (HR 4.37, 95% CI [2.24, 8.51], p<.001), and Low-All Domains (HR 7.21, 95% CI [3.59, 14.48], p<.001) groups had greater risk of progression to MCI/dementia. The Low-Executive group was also twice as likely to progress to MCI/dementia compared to the All-Average group, but did not statistically differ (HR 2.03, 95% CI [0.88, 4.70], p=.096). A similar pattern of results was found for progression to DRS score <129, with the Low-Executive (HR 2.82, 95% CI [1.26, 6.29], p=.012), Low-Memory/Language (HR 3.70, 95% CI [1.80, 7.56], p<.001) and Low-All Domains (HR 5.79, 95% CI [2.74, 12.27], p<.001) groups at greater risk of progression to a DRS score <129 than the All-Average group. The Low-Visuospatial group was also twice as likely to progress to DRS <129 compared to the All-Average group, but did not statistically differ (HR 2.02, 95% CI [0.80, 5.06], p=.135).
Conclusions:
Our results add to a growing literature documenting heterogeneity in the earliest cognitive and pathological presentations associated with Alzheimer’s disease and related disorders. Participants with subtle memory/language, executive, and visuospatial weaknesses all declined at faster rates than the All-Average group, suggesting that there are multiple pathways and/or unique subtle cognitive decline profiles that ultimately lead to a diagnosis of MCI/dementia. These results have important implications for early identification of individuals at risk for MCI/dementia. Given that the same classification approach may not be optimal for everyone, determining profiles of subtle cognitive difficulties in CU individuals and implementing neuropsychological test batteries that assess multiple cognitive domains may be a key step towards an individualized approach to early detection and fewer missed opportunities for early intervention.
An accurate accounting of prior sport-related concussion (SRC) is critical to optimizing the clinical care of athletes with SRC. Yet, obtaining such a history via medical records or lifetime monitoring is often not feasible necessitating the use of self-report histories. The primary objective of the current project is to determine the degree to which athletes consistently report their SRC history on serial assessments throughout their collegiate athletic career.
Participants and Methods:
Data were obtained from the NCAA-DoD CARE Consortium and included 1621 athletes (914 male) from a single Division 1 university who participated in athletics during the 2014-2017 academic years. From this initial cohort, 752 athletes completed a second-year assessment and 332 completed a third-year assessment. Yearly assessments included a brief self-report survey that queried SRC history of the previous year. Consistency of self-reported SRC history was defined as reporting the same number of SRC on subsequent yearly evaluation as had been reported the previous year.
For every year of participation, the number of SRC reported on the baseline exam (Reported) and the number of SRC recorded by athletes and medical staff during the ensuing season (Recorded) were tabulated. In a subsequent year, the expected number of SRC (Expected) was computed as the sum of Reported and Recorded. For participation years in which Expected could be computed, the reporting deviation (RepDev) gives the difference between the number of SRCs expected to be reported at a baseline exam, based on previous participation-year data, and the number of SRCs actually reported by the athlete or medical record during the baseline exam. A second reporting deviation, restricted to SRCs that occurred while the participant was enrolled in the current study, was also computed (RepDevSO). One-way intraclass correlations (ICC) were computed between the expected and reported numbers of SRC.
Results:
341 athletes had a history of at least one SRC and 206 of those (60.4%) had a RepDev of 0. The overall ICC for RepDev was 0.761 (95% CI 0.73-0.79). The presence of depression (ICC 0.87, 95% CI 0.79-0.92) and loss of consciousness (ICC 0.80, 95% CI 0.72-0.86) were associated with higher ICCs compared to athletes without these variables. Female athletes demonstrated higher self-report consistency (ICC 0.82, 95% CI 0.79-0.85) compared to male athletes (ICC 0.72, 95% CI 0.68-0.76). Differences in the classification of RepDev according to sex and sport were found to be significant (χ²=77.6, df=56, p=0.03). The sports with the highest consistency were Women’s Tennis, Men’s Diving, and Men’s Tennis with 100% consistency between academic years. Sports with the lowest consistency were Women’s Gymnastics (69%), Men’s Lacrosse (70%), and Football (72%). 96 athletes had at least one study-only SRC in the previous year and 69 of those (71.9%) had a RepDevSO of 0 (ICC 0.673, 95% CI 0.64-0.71).
Conclusions:
Approximately 40% of athletes do not consistently report their SRC history, potentially further complicating the clinical management of SRC. These findings encourage clinicians to be aware of factors which could influence the reliability of self-reported SRC history.
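The consistency measures in this abstract reduce to simple bookkeeping plus a one-way random-effects ICC over (expected, reported) pairs. A sketch under those assumptions; the function names are ours, not the consortium's:

```python
def reporting_deviation(reported_prev, recorded_season, reported_next):
    """Expected SRC count at the next baseline is last year's reported count
    plus SRCs recorded during the season; the reporting deviation is the
    shortfall (expected minus actually reported)."""
    expected = reported_prev + recorded_season
    return expected - reported_next

def icc_oneway(pairs):
    """One-way random-effects ICC(1,1) for paired counts (expected, reported),
    k = 2 measurements per athlete, via the usual ANOVA mean squares."""
    k = 2
    n = len(pairs)
    grand = sum(a + b for a, b in pairs) / (n * k)
    ms_between = k * sum(((a + b) / k - grand) ** 2 for a, b in pairs) / (n - 1)
    ms_within = sum((a - (a + b) / 2) ** 2 + (b - (a + b) / 2) ** 2
                    for a, b in pairs) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

With this convention, a RepDev of 0 means the athlete's baseline report matched expectation exactly, and an ICC of 1 means every pair agreed.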
Anterior temporal lobectomy is a common surgical approach for medication-resistant temporal lobe epilepsy (TLE). Prior studies have shown inconsistent findings regarding the utility of presurgical intracarotid sodium amobarbital testing (IAT; also known as Wada test) and neuroimaging in predicting postoperative seizure control. In the present study, we evaluated the predictive utility of IAT, as well as structural magnetic resonance imaging (MRI) and positron emission tomography (PET), on long-term (3-years) seizure outcome following surgery for TLE.
Participants and Methods:
Patients consisted of 107 adults (mean age=38.6, SD=12.2; mean education=13.3 years, SD=2.0; female=47.7%; White=100%) with TLE (mean epilepsy duration=23.0 years, SD=15.7; left TLE surgery=50.5%). We examined whether demographic, clinical (side of resection, resection type [selective vs. non-selective], hemisphere of language dominance, epilepsy duration), and presurgical studies (normal vs. abnormal MRI, normal vs. abnormal PET, correctly lateralizing vs. incorrectly lateralizing IAT) were associated with absolute (cross-sectional) seizure outcome (i.e., freedom vs. recurrence) with a series of chi-squared and t-tests. Additionally, we determined whether presurgical evaluations predicted time to seizure recurrence (longitudinal outcome) over a three-year period with univariate Cox regression models, and we compared survival curves with Mantel-Cox (log rank) tests.
Results:
Demographic and clinical variables (including type [selective vs. whole lobectomy] and side of resection) were not associated with seizure outcome. No associations were found among the presurgical variables. Presurgical MRI was not associated with cross-sectional (OR=1.5, p=.557, 95% CI=0.4-5.7) or longitudinal (HR=1.2, p=.641, 95% CI=0.4-3.9) seizure outcome. Normal PET scan (OR=4.8, p=.045, 95% CI=1.0-24.3) and IAT incorrectly lateralizing to seizure focus (OR=3.9, p=.018, 95% CI=1.2-12.9) were associated with higher odds of seizure recurrence. Furthermore, normal PET scan (HR=3.6, p=.028, 95% CI=1.0-13.5) and incorrectly lateralized IAT (HR=2.8, p=.012, 95% CI=1.2-7.0) were presurgical predictors of earlier seizure recurrence within three years of TLE surgery. Log rank tests indicated that survival functions differed significantly between patients with normal vs. abnormal PET and incorrectly vs. correctly lateralizing IAT; patients with a normal PET or an incorrectly lateralizing IAT relapsed on average five and seven months earlier, respectively.
Conclusions:
Presurgical normal PET scan and incorrectly lateralizing IAT were associated with increased risk of post-surgical seizure recurrence and shorter time-to-seizure relapse.
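Time-to-recurrence comparisons like these rest on estimated survival curves. A minimal Kaplan-Meier estimator, as a generic sketch rather than the study's actual code; `events` marks seizure recurrence and 0 means the patient was censored:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. times: follow-up (e.g. months);
    events: 1 if seizure recurred at that time, 0 if censored.
    Returns a step curve as [(time, S(time))] at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = n = 0
        # Group tied observation times; d = events, n = all leaving risk set.
        while i < len(data) and data[i][0] == t:
            n += 1
            d += data[i][1]
            i += 1
        if d:
            surv *= 1.0 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= n
    return curve
```

Log-rank (Mantel-Cox) tests then compare two such curves by pooling observed-minus-expected event counts at each event time; the Cox models add covariate adjustment on top of the same risk-set logic.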
The ability to extinguish a maladaptive conditioned fear response is crucial for healthy emotional processing and resiliency to aversive experiences. Therefore, enhancing fear extinction learning has immense potential emotional and health benefits. Mindfulness training enhances both fear conditioning and recall of extinguished fear; however, its effects on fear extinction learning are unknown. Here we investigated the impact of mindfulness training on brain mechanisms associated with fear-extinction learning, compared to an exercise-based program.
Methods
We investigated BOLD activations in response to a previously learned fear-inducing cue during an extinction paradigm, before and after an 8-week mindfulness-based stress reduction program (MBSR, n = 49) or exercise-based stress management education program (n = 27).
Results
The groups exhibited similar reductions in stress, but the MBSR group was uniquely associated with enhanced activation of salience network nodes and increased hippocampal engagement.
Conclusions
Our results suggest that mindfulness training increases attention to anticipatory aversive stimuli, which in turn facilitates decreased aversive subjective responses and enhanced reappraisal of the memory.
Several hypotheses may explain the association between substance use, posttraumatic stress disorder (PTSD), and depression. However, few studies have utilized a large multisite dataset to understand this complex relationship. Our study assessed the relationship between alcohol and cannabis use trajectories and PTSD and depression symptoms across 3 months in recently trauma-exposed civilians.
Methods
In total, 1618 (1037 female) participants provided self-report data on past 30-day alcohol and cannabis use and PTSD and depression symptoms during their emergency department (baseline) visit. We reassessed participants' substance use and clinical symptoms 2, 8, and 12 weeks posttrauma. Latent class mixture modeling determined alcohol and cannabis use trajectories in the sample. Changes in PTSD and depression symptoms were assessed across alcohol and cannabis use trajectories via a mixed-model repeated-measures analysis of variance.
Results
Three trajectory classes (low, high, increasing use) provided the best model fit for alcohol and cannabis use. The low alcohol use class exhibited lower PTSD symptoms at baseline than the high use class; the low cannabis use class exhibited lower PTSD and depression symptoms at baseline than the high and increasing use classes; these symptoms greatly increased at week 8 and declined at week 12. Participants who were already using alcohol and cannabis exhibited greater PTSD and depression symptoms at baseline that increased at week 8 with a decrease in symptoms at week 12.
Conclusions
Our findings suggest that alcohol and cannabis use trajectories are associated with the intensity of posttrauma psychopathology. These findings could potentially inform the timing of therapeutic strategies.
In adults with Clostridioides difficile infection (CDI), higher stool concentrations of toxins A and B are associated with severe baseline disease, CDI-attributable severe outcomes, and recurrence. We evaluated whether toxin concentration predicts these presentations in children with CDI.
Methods:
We conducted a prospective cohort study of inpatients aged 2–17 years with CDI who received treatment. Patients were followed for 40 days after diagnosis for severe outcomes (intensive care unit admission, colectomy, or death, categorized as CDI primarily attributable, CDI contributed, or CDI not contributing) and recurrence. Baseline stool toxin A and B concentrations were measured using ultrasensitive single-molecule array assay, and 12 plasma cytokines were measured when blood was available.
Results:
We enrolled 187 pediatric patients (median age, 9.6 years). Patients with severe baseline disease by IDSA-SHEA criteria (n = 34) had nonsignificantly higher median stool toxin A+B concentration than those without severe disease (n = 122; 3,217.2 vs 473.3 pg/mL; P = .08). Median toxin A+B concentration was nonsignificantly higher in children with a primarily attributed severe outcome (n = 4) versus no severe outcome (n = 148; 19,472.6 vs 429.1 pg/mL; P = .301). Recurrence occurred in 17 (9.4%) of 180 patients. Baseline toxin A+B concentration was significantly higher in patients with versus without recurrence: 4,398.8 versus 280.8 pg/mL (P = .024). Plasma granulocyte colony-stimulating factor concentration was significantly higher in CDI patients versus non-CDI diarrhea controls: 165.5 versus 28.5 pg/mL (P < .001).
Conclusions:
Higher baseline stool toxin concentrations are present in children with CDI recurrence. Toxin quantification should be included in CDI treatment trials to evaluate its use in severity assessment and outcome prediction.
The present chapter provides a review and analysis of research on mental health stigma among military personnel. Given expectations of psychological resilience for military personnel, a large amount of research has been conducted to examine the antecedents and consequences of mental health stigma in this population. The chapter first provides an analysis of the different types of mental health stigma that have been examined, including their definitions and assessments. The chapter then addresses the antecedents of mental health stigma in the military, including demographics, mental health symptoms, unit factors, and personality/individual differences. The consequences of mental health stigma are then examined, including treatment-seeking intentions and treatment seeking, mental health symptoms, treatment dropout, and suicidal ideation. Training and interventions to reduce mental health stigma are then discussed. The chapter concludes with recommendations for future research, including a larger focus on unit/organizational factors that influence mental health stigma in the military.
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. The predictive combination of these variables might help clinicians better predict which patients will respond to lithium treatment.
Aims
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
Method
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
Results
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Conclusions
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future this approach may help inform which patients are most likely to respond to lithium treatment.
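The "variance in lithium response explained" figures used to rank the models above correspond to out-of-sample R². A minimal sketch, for illustration only and not the ConLi+Gen pipeline:

```python
def variance_explained(y_true, y_pred):
    """Out-of-sample R^2: share of outcome variance captured by predictions,
    computed as 1 - SS_residual / SS_total."""
    mean_y = sum(y_true) / len(y_true)
    ss_tot = sum((y - mean_y) ** 2 for y in y_true)
    ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))
    return 1.0 - ss_res / ss_tot

# Perfect predictions give R^2 = 1; always predicting the mean gives R^2 = 0.
```

Under this metric, moving from 8.1% to 13.7% variance explained via genomic stratification is the "large improvement" the conclusions refer to; the stratification step simply fits separate clinical models within PRS-defined patient subgroups.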
Two introduced carnivores, the European red fox Vulpes vulpes and domestic cat Felis catus, have had extensive impacts on Australian biodiversity. In this study, we collate information on consumption of Australian birds by the fox, paralleling a recent study reporting on birds consumed by cats. We found records of consumption by foxes on 128 native bird species (18% of the non-vagrant bird fauna and 25% of those species within the fox’s range), a smaller tally than for cats (343 species, including 297 within the fox’s Australian range, a subset of that of the cat). Most (81%) bird species eaten by foxes are also eaten by cats, suggesting that predation impacts are compounded. As with consumption by cats, birds that nest or forage on the ground are most likely to be consumed by foxes. However, there is also some partitioning, with records of consumption by foxes but not cats for 25 bird species, indicating that impacts of the two predators may also be complementary. Bird species ≥3.4 kg were more likely to be eaten by foxes, and those <3.4 kg by cats. Our compilation provides an inventory and describes characteristics of Australian bird species known to be consumed by foxes, but we acknowledge that records of predation do not imply population-level impacts. Nonetheless, there is sufficient information from other studies to demonstrate that fox predation has significant impacts on the population viability of some Australian birds, especially larger birds, and those that nest or forage on the ground.
Micronutrients are important for normal cardiovascular function. They may play a role in the increased risk of cardiovascular disease observed in people with type 2 diabetes (T2D) and T2D-related heart failure. The aims of this study were to (1) examine micronutrient status in people with T2D v. healthy controls; (2) assess any changes following a nutritionally complete meal replacement plan (MRP) compared with routine care; (3) determine if any changes were associated with changes in cardiovascular structure/function. This was a secondary analysis of data from a prospective, randomised, open-label, blinded end-point trial of people with T2D, with a nested case–control [NCT02590822]. Anthropometrics, cardiac resonance imaging and fasting blood samples (to quantify vitamins B1, B6, B12, D and C; and iron and ferritin) were collected at baseline and 12 weeks following the MRP or routine care. Comparative data in healthy controls were collected at baseline. A total of eighty-three people with T2D and thirty-six healthy controls were compared at baseline; all had micronutrient status within reference ranges. Vitamin B1 was higher (148⋅9 v. 131⋅7; P 0⋅01) and B6 lower (37⋅3 v. 52⋅9; P 0⋅01) in T2D v. controls. All thirty participants randomised to routine care and twenty-four to the MRP completed the study. There was an increase in vitamins B1, B6, D and C following the MRP, which were not associated with changes in cardiovascular structure/function. In conclusion, changes in micronutrient status following the MRP were not independently associated with improvements in cardiovascular structure/function in people with T2D.