Climate change is causing Himalayan glaciers to shrink rapidly and natural hazards to increase, while downstream exposure is growing. Glacier shrinkage promotes the formation of glacial lakes, which can suddenly drain and produce glacial lake outburst floods (GLOFs). Bhutan is one of the most vulnerable countries globally to these hazards. Here we use remotely sensed imagery to quantify changes in supraglacial water storage on Tshojo Glacier, Bhutan, where previous supraglacial pond drainage events have necessitated downstream evacuation. Results showed a doubling of both total ponded area (104 529 m² to 213 943 m²) and its standard deviation (64 808 m² to 158 550 m²) between the periods 1987–2003 and 2007–2020, which was predominantly driven by increases in the areas of the biggest ponds. These ponds drained regularly and have occupied the same locations since at least 1967. Tshojo Glacier has remained in the first stage of proglacial lake development for 53 years, which we attribute to its moderate slopes and ice velocities. Numerical modelling shows that pond outbursts can reach between ~6 and 47 km downstream, impacting the remote settlement of Lunana. Our results highlight the need to better quantify variability in supraglacial water storage and its potential to generate GLOFs as the climate warms.
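As an illustrative sketch only (not the authors' pipeline), ponded area from classified satellite imagery is typically the count of water-flagged pixels multiplied by the pixel footprint, with per-period means and standard deviations then summarized across annual scenes. All numbers, array shapes, and the 30 m Landsat-like pixel size below are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-in for one classified satellite scene:
# a boolean water mask where True marks a supraglacial pond pixel.
mask = rng.random((500, 500)) < 0.001

PIXEL_AREA_M2 = 30 * 30  # assumed Landsat-like 30 m pixels -> 900 m^2 each
scene_area_m2 = mask.sum() * PIXEL_AREA_M2
print(f"ponded area in this scene: {scene_area_m2} m^2")

# Per-period summary over annual totals (synthetic numbers, not the study's):
period_1 = rng.normal(105_000, 65_000, size=17).clip(min=0)   # 1987-2003
period_2 = rng.normal(214_000, 159_000, size=14).clip(min=0)  # 2007-2020
for name, areas in (("P1", period_1), ("P2", period_2)):
    print(f"{name}: mean {areas.mean():,.0f} m^2, sd {areas.std(ddof=1):,.0f} m^2")
```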
Neural predictors underlying variability in depression outcomes are poorly understood. Functional MRI measures of subgenual cortex connectivity, self-blaming and negative perceptual biases have shown prognostic potential in treatment-naïve, medication-free and fully remitting forms of major depressive disorder (MDD). However, their role in more chronic, difficult-to-treat forms of MDD is unknown.
Methods:
Forty-five participants (n = 38 meeting minimum data-quality thresholds) fulfilled criteria for difficult-to-treat MDD. Clinical outcome was defined as the percentage change from baseline on the self-reported Quick Inventory of Depressive Symptomatology (16-item) at four-month follow-up. Baseline measures included self-blame-selective connectivity of the right superior anterior temporal lobe with an a priori Brodmann Area 25 region-of-interest, a priori blood-oxygen-level-dependent activation of the bilateral amygdala for subliminal sad vs happy faces, and resting-state connectivity of the subgenual cortex with an a priori defined ventrolateral prefrontal cortex/insula region-of-interest.
Findings:
A linear regression model showed that baseline severity of depressive symptoms explained 3% of the variance in outcomes at follow-up (F[3,34] = .33, p = .81). In contrast, our three pre-registered neural measures combined explained 32% of the variance in clinical outcomes (F[4,33] = 3.86, p = .01).
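A minimal sketch of this kind of analysis, using statsmodels' OLS (which reports the R² and overall F-test quoted above). All data and variable names below are synthetic and invented, not the study's:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 38  # participants meeting data-quality thresholds

# Illustrative data: QIDS-SR16 scores and three neural measures (all synthetic)
df = pd.DataFrame({
    "qids_baseline": rng.integers(10, 25, n),
    "qids_followup": rng.integers(5, 25, n),
    "atl_bm25_conn": rng.normal(size=n),   # self-blame-selective connectivity
    "amygdala_bold": rng.normal(size=n),   # subliminal sad vs happy faces
    "sgc_vlpfc_conn": rng.normal(size=n),  # subgenual resting-state connectivity
})
# Outcome: percentage change from baseline at follow-up
df["pct_change"] = 100 * (df["qids_followup"] - df["qids_baseline"]) / df["qids_baseline"]

X = sm.add_constant(df[["atl_bm25_conn", "amygdala_bold", "sgc_vlpfc_conn"]])
fit = sm.OLS(df["pct_change"], X).fit()
print(f"R^2 = {fit.rsquared:.2f}, F = {fit.fvalue:.2f}, p = {fit.f_pvalue:.3f}")
```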
Conclusion:
These findings corroborate the pathophysiological relevance of neural signatures of emotional biases and their potential as predictors of outcomes in difficult-to-treat depression.
Although preventable through established infection control practices, catheter-associated urinary tract infections (CAUTIs) remain prevalent in acute-care settings. Our goal was to improve CAUTI rates across multiple hospitals by implementing sustainable practices, including enhanced communication, provider engagement, accountability, and transparency in reporting, to achieve long-term improvements.
Design:
Quality improvement with multiple levels of interventions
Setting:
A health system in northern Ohio with 21 affiliated hospitals across 16 counties.
Patients:
Adult patients admitted to the hospital between June 2020 and June 2023.
Methods:
A broad set of quality improvement (QI) strategies was developed by an interdisciplinary team and guided by the Fractal Management System framework to ensure accountability, communication, and alignment across teams and facilities. Key drivers were indwelling urinary catheter (IUC) alternatives, insertion, maintenance, removal, and smart diagnostics. The main outcome measures were the standardized infection ratio (SIR) and standardized utilization ratio (SUR), comparing period 1 (P1, June 2020 to December 2021) and period 2 (P2, January 2022 to June 2023).
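The SIR and SUR are both NHSN-style standardized ratios of observed to predicted counts (infections and device-days, respectively), so a value below 1 indicates better-than-predicted performance. A minimal sketch with invented numbers:

```python
def standardized_ratio(observed: float, predicted: float) -> float:
    """Generic NHSN-style standardized ratio: observed events divided by
    the number predicted from baseline risk models."""
    if predicted <= 0:
        raise ValueError("predicted must be positive")
    return observed / predicted

# Illustrative numbers only (not the study's data)
sir_p1 = standardized_ratio(observed=40, predicted=32.0)       # CAUTIs, period 1
sir_p2 = standardized_ratio(observed=24, predicted=33.5)       # CAUTIs, period 2
sur_p2 = standardized_ratio(observed=9_800, predicted=12_400)  # catheter-days
print(f"SIR P1={sir_p1:.2f}, SIR P2={sir_p2:.2f}, SUR P2={sur_p2:.2f}")
```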
Results:
Enhanced communication and management played crucial roles in minimizing IUC placement. Updated policies and protocols, coupled with clear guidelines and decision support tools, facilitated effective urinary management. Performance tracking and visual management boards provided real-time insights, while collaborative efforts, including staff huddles and multidisciplinary teamwork, ensured consistent adherence to best practices.
Conclusions:
A systemwide QI initiative focused on enhanced communication, management, and collaboration contributed to improved SIR and reduced CAUTI rates across multiple hospitals, highlighting the impact of strong communication and proactive management in healthcare settings.
Major Depressive Disorder (MDD) is a complex mental health condition characterized by a wide spectrum of symptoms. According to the criteria of the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), patients can present with up to 1,497 different symptom combinations, yet all receive the same MDD diagnosis. This diversity in symptom presentation poses a significant challenge to understanding the disorder in the wider population. Subtyping offers a way to unpick this phenotypic diversity and enable improved characterization of the disorder. According to reviews, MDD subtyping work to date has lacked consistency in results due to inadequate statistics, non-transparent reporting, or inappropriate sample choice. By addressing these limitations, the current study aims to extend past phenotypic subtyping studies in MDD.
Objectives
(1) To investigate phenotypic subtypes at baseline in a sample of people with MDD;
(2) To determine if subtypes are consistent between baseline, 6-, and 12-month follow-ups; and
(3) To examine how participants move between subtypes over time.
Methods
This was a secondary analysis of a one-year longitudinal observational cohort study. We collected data from individuals with a history of recurrent MDD in the United Kingdom, the Netherlands and Spain (N=619). The presence or absence of symptoms was tracked at three-month intervals through the Inventory of Depressive Symptomatology: Self-Report (IDS-SR) assessment. We used latent class and three-step latent transition analysis to identify subtypes at baseline, determined their consistency at 6- and 12-month follow-ups, and examined participants’ transitions over time.
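Dedicated software is normally used for latent class and latent transition analysis; purely as a sketch of the underlying model, the EM updates for a latent class model on binary symptom indicators (a Bernoulli mixture) look like the following. The data, the 30-item layout, and the 4-class choice are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(619, 30))  # synthetic symptom presence/absence

def fit_lca(X, k, n_iter=200):
    """EM for a latent class model with conditionally independent
    binary indicators (a Bernoulli mixture)."""
    n, p = X.shape
    pi = np.full(k, 1.0 / k)                 # class prevalences
    theta = rng.uniform(0.25, 0.75, (k, p))  # P(symptom present | class)
    for _ in range(n_iter):
        # E-step: posterior class membership, computed in the log domain
        log_lik = X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        log_post = np.log(pi) + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        post = np.exp(log_post)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update prevalences and symptom probabilities (with smoothing)
        pi = post.mean(axis=0)
        theta = (post.T @ X + 1e-3) / (post.sum(axis=0)[:, None] + 2e-3)
    return pi, theta, post

pi, theta, post = fit_lca(X, k=4)
print("estimated class sizes:", np.round(pi, 3))
```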
Results
We identified a 4-class solution based on model fit and interpretability, comprising (Class 1) severe with appetite increase, (Class 2) severe with appetite decrease, (Class 3) moderate, and (Class 4) low severity. The classes mainly differed in terms of severity (the varying likelihood of symptom endorsement) and, for the two more severe classes, the type of neurovegetative symptoms reported (Figure 1). The four classes were stable over time (measurement invariant) and participants tended to remain in the same class across baseline and follow-up (Figure 2).
Conclusions
We identified four stable subtypes of depression, with individuals most likely to remain in their same class over 1-year follow-up. This suggests a chronic nature of depression, with (for example) individuals in severe classes more likely to remain in the same class throughout follow-up. Despite the vast number of heterogeneous symptom combinations possible in MDD, our results emphasize differences across severity rather than symptom type. This raises questions about the meaningfulness of these subtypes beyond established measures of depression severity. Implications of these findings are discussed and recommendations for future research are made.
Disclosure of Interest
C. Oetzmann Grant / Research support from: C.O. is supported by the UK Medical Research Council (MR/N013700/1) and King’s College London, as a member of the MRC Doctoral Training Partnership in Biomedical Sciences. N. Cummins: None Declared, F. Lamers: None Declared, F. Matcham: None Declared, K. White: None Declared, J. Haro: None Declared, S. Siddi: None Declared, S. Vairavan Employee of: S.V. is an employee of Janssen Research & Development, LLC and holds company stocks/stock options. B. Penninx: None Declared, V. Narayan: None Declared, M. Hotopf Grant / Research support from: M.H. is the principal investigator of the RADAR-CNS programme, a precompetitive public–private partnership funded by the Innovative Medicines Initiative and the European Federation of Pharmaceutical Industries and Associations. The programme received support from Janssen, Biogen, MSD, UCB and Lundbeck. E. Carr: None Declared
This review of the literature shows that there have been many attempts to modify or revise the original definition of halloysites as distinguished from kaolinites, which was based on the greater water content of the halloysites. In general, these various attempts have arrived at definitions of halloysites as distinguished from kaolinites that are based on one or more particular instrumental or chemical techniques. Further investigations with almost all of these techniques have shown the apparently clear distinctions of this kind to be misleading. All such instrumentally- or chemically-based definitions were shown either to complicate and confuse the distinction between halloysites and kaolinites or to provide only empirical and subtle distinctions. It is concluded that only the original definition, with slight adaptations, enables clear and unambiguous distinctions to be made between halloysites and kaolinites. It is noted, however, that a distinctive structure for halloysite has been postulated as a result of electron diffraction studies. Further studies of this kind could well establish such a structure as being definitive of the mineral species.
The literature also reveals a long-standing disagreement over the nomenclature of different forms of halloysite, and particularly over the nomenclature of, and distinction between, the two forms of halloysite at the extreme ends of the hydration series. An analysis of experimental studies of the relationship between these two and other hydration states of halloysite reveals that forms of halloysite with all possible interlayer water contents between 0 and 2 molecules of H2O per Al2Si2O5(OH)4 unit cell may exist, and that the two end members of the hydration series may not be regarded as distinct phases. The fully dehydrated halloysite is the only thermodynamically stable form of the mineral. The nomenclature system proposed by MacEwan in 1947 is consistent with these results. This system, amended only by the exclusion of the unnecessary term ‘metahalloysite’, should therefore be adopted in all studies of halloysites.
A study of the mineralogical changes taking place during the loss of interlayer water in a halloysite has been carried out in order to clarify the relationship between the most hydrated and least hydrated states of the mineral. A number of halloysite samples which together exhibit a variety of average interlayer water capacities were obtained by conditioning a largely hydrated sample in atmospheres of known relative humidity. Profiles were obtained of X-ray peaks which characterize the interlayer water capacities of halloysite samples. An attempt has been made to analyse these profiles into a sum of peaks attributable to the fully hydrated and dehydrated states of the mineral. Such an analysis does not satisfactorily explain the profile shapes. A mechanism of interstratification of hydrated and dehydrated kaolin layers, in which there is a tendency towards the segregation of these layer types, gives a more satisfactory explanation of these profile shapes. It is concluded that dehydration takes place through an interstratification in which there is a partial segregation of the two basic layer types. This conclusion implies that halloysites with all average interlayer water contents between 0 and 2 molecules per unit cell may exist and that fully hydrated halloysite and dehydrated halloysite are the end members of a continuous series of hydration states.
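Profile analysis of this sort amounts to least-squares fitting of a measured X-ray profile as a sum of component peaks. A hedged sketch using two Gaussian components, with synthetic data standing in for the measured profiles (the peak positions, widths, and the broad intermediate component are invented, not the paper's values):

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic diffraction profile over a 2-theta range spanning the basal
# reflections of hydrated (~10 A) and dehydrated (~7.2 A) halloysite.
two_theta = np.linspace(7, 14, 300)

def two_peaks(x, a1, c1, w1, a2, c2, w2):
    """Sum of two Gaussian peaks (one per end-member hydration state)."""
    return (a1 * np.exp(-((x - c1) / w1) ** 2)
            + a2 * np.exp(-((x - c2) / w2) ** 2))

# Synthetic observed profile with a broad intermediate component that a
# two-peak model fits poorly, mirroring the paper's conclusion.
rng = np.random.default_rng(1)
observed = (two_peaks(two_theta, 1.0, 8.8, 0.4, 0.8, 12.3, 0.4)
            + 0.5 * np.exp(-((two_theta - 10.5) / 1.5) ** 2)
            + rng.normal(0, 0.02, two_theta.size))

popt, _ = curve_fit(two_peaks, two_theta, observed,
                    p0=[1, 8.8, 0.5, 1, 12.3, 0.5])
residual = observed - two_peaks(two_theta, *popt)
print(f"RMS residual of two-peak fit: {np.sqrt((residual**2).mean()):.3f}")
```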
Medical researchers are increasingly prioritizing the inclusion of underserved communities in clinical studies. However, mere inclusion is not enough. People from underserved communities frequently experience chronic stress that may lead to accelerated biological aging and early morbidity and mortality. It is our hope and intent that the medical community come together to engineer improved health outcomes for vulnerable populations. Here, we introduce Health Equity Engineering (HEE), a comprehensive scientific framework to guide research on the development of tools to identify individuals at risk of poor health outcomes due to chronic stress, the integration of these tools within existing healthcare system infrastructures, and a robust assessment of their effectiveness and sustainability. HEE is anchored in the premise that strategic intervention at the individual level, tailored to the needs of the most at-risk people, can pave the way for achieving equitable health standards at a broader population level. HEE provides a scientific framework guiding health equity research to equip the medical community with a robust set of tools to enhance health equity for current and future generations.
Functional connectivity of the default mode network (DMN) during rest has been shown to differ among adults with Mild Cognitive Impairment (MCI) relative to age-matched individuals without MCI and is predictive of transition to dementia. Post-traumatic stress disorder (PTSD) is also associated with aberrant connectivity of the DMN. Prior work from this group has demonstrated a higher rate of MCI and PTSD among World Trade Center (WTC) responders relative to the general population. The current study sought to investigate the main and interactive effects of MCI and PTSD on DMN functioning. Based on prior work, we hypothesized that MCI, but not PTSD, would predict aberrant connectivity in the DMN.
Participants and Methods:
Ninety-nine WTC responders aged 44–65, stratified by MCI status (yes/no) and PTSD status (yes/no) and matched for age in years, sex (male vs. female), race (white, black, and other), educational attainment (high school or less, some college / technical school, and university degree), and occupation on September 11, 2001 (law enforcement vs. other), underwent fMRI using a 3T Siemens Biograph MR scanner. A single 10-minute continuous functional MR sequence was acquired while participants were at rest with their eyes open. Group-level analyses were conducted using SPM-12, with correction for multiple comparisons using AFNI's 3dClustSim. Based on this threshold, the number of comparisons in our imaging volume, and the smoothness of our imaging data as measured by 3dFWHMx-acf, a minimum cluster size of 1134 voxels was required for a corrected p < .05 with 2-sided thresholding. Spherical 3 mm seeds were placed in the dorsal (4, -50, 26) and ventral (4, -60, 46) posterior cingulate cortex (PCC).
Results:
Individuals with PTSD demonstrated significantly less connectivity of the dorsal posterior cingulate cortex (PCC) with the medial insula (T = 5.21), subthalamic nucleus (T = 4.66), and postcentral gyrus (T = 3.81). No differences in connectivity were found between groups stratified by MCI status. There were no significant results for the ventral PCC seed.
Conclusions:
Contrary to hypotheses that were driven by a study of cortical thickness in WTC responders, the impact of PTSD appears to outweigh the impact of MCI on dorsal DMN connectivity among WTC responders stratified by PTSD and MCI status. This study is limited by several issues, including low number of female and minority participants, relatively small group cell sizes (n = 23–27 per cell), a brief resting state sequence (10 minutes), and lack of a non-WTC control group. Importantly, responders are a unique population so generalizability to other populations may be limited. Individuals in the current study are now being followed longitudinally to relate baseline resting state functional connectivity with cognitive changes and changes in connectivity over a four-year period.
Significant advances in research on sport-related concussion (SRC) and repetitive head impacts (RHI) over the past decade have translated to improved injury identification, diagnosis, and management. However, an objective gold standard for SRC/RHI treatment has remained elusive. SRC often results in heterogeneous clinical outcomes, and the accumulation of RHI over time is associated with long-term declines in neurocognitive functioning. Medical management typically entails an amalgamation of outpatient medical treatment and psychiatric and/or behavioral interventions for specific symptoms rather than treatment of the underlying functional and/or structural brain injury. Transcranial photobiomodulation (tPBM), a form of light therapy, has been proposed as a non-invasive treatment for individuals with traumatic brain injuries (TBI), possibly including SRC/RHI. With the present proof-of-concept pilot study, we sought to address important gaps in the neurorehabilitation of former athletes with a history of SRC and RHI by examining the effects of tPBM on neurocognitive functioning.
Participants and Methods:
The current study included 49 participants (45 male) with a history of SRC and/or RHI. Study inclusion criteria were: age 18–65 years and a self-reported history of SRC and/or RHI. Exclusion criteria were: a history of neurologic disease, a history of psychiatric disorder, and MRI contraindication. We utilized a non-randomized proof-of-concept design of active treatment over the course of 8–10 weeks, and neurocognitive functioning was assessed at pre- and post-treatment. A Vielight Neuro Gamma at-home brain tPBM device was distributed to each participant following baseline assessment.
Participants completed standardized measures of neurocognitive functioning, including the California Verbal Learning Test (CVLT-3), Delis-Kaplan Executive Function System (D-KEFS), Continuous Performance Test (CPT-3), and the NIH Toolbox Cognition Battery. Neurocognitive assessments were collected prior to and following tPBM treatment. Paired t-tests and Wilcoxon signed-rank tests were used to evaluate change in performance on measures of neurocognitive functioning for normally and non-normally distributed variables, respectively, and estimates of effect size were obtained.
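A minimal sketch of this pre/post comparison using scipy, with synthetic scores; the paired-samples Cohen's d convention shown here is one common choice and an assumption on our part, not necessarily the study's:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(50, 10, 49)        # hypothetical pre-treatment scores
post = pre + rng.normal(4, 8, 49)   # hypothetical post-treatment scores

diff = post - pre
t, p_t = stats.ttest_rel(post, pre)   # for normally distributed change scores
w, p_w = stats.wilcoxon(post, pre)    # for non-normal change scores
d = diff.mean() / diff.std(ddof=1)    # Cohen's d for paired samples
print(f"t = {t:.2f}, p = {p_t:.3f}; W = {w:.0f}, p = {p_w:.3f}; d = {d:.2f}")
```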
Results:
Study participants’ ability to adapt to novel stimuli and task requirements (i.e., fluid cognition; t=5.96; p<.001; d=.90), verbal learning/encoding (t=3.20; p=.003; d=.48) and delayed recall (z=3.32; p=.002; d=.50), processing speed (t=3.13; p=.003; d=.47), sustained attention (t=-4.39; p<.001; d=-.71), working memory (t=3.61; p=.001; d=.54), and aspects of executive functioning improved significantly following tPBM treatment. No significant improvements in phonemic and semantic verbal fluency, reading ability, or vocabulary were observed following tPBM treatment.
Conclusions:
The results of this pilot study demonstrate that following 8–10 weeks of active tPBM treatment, retired athletes with a history of SRC and/or RHI experienced significant improvements in fluid cognition, learning and memory, processing speed, attention, working memory, and aspects of executive functioning. Importantly, the majority of effect sizes were moderate to large, suggesting that tPBM produced clinically meaningful improvements in neurocognitive functioning across various cognitive domains. These results offer support for future research employing more rigorous study designs on the potential neurorehabilitative effects of tPBM in athletes with SRC/RHI.
We identify a set of essential recent advances in climate change research with high policy relevance, across natural and social sciences: (1) looming inevitability and implications of overshooting the 1.5°C warming limit, (2) urgent need for a rapid and managed fossil fuel phase-out, (3) challenges for scaling carbon dioxide removal, (4) uncertainties regarding the future contribution of natural carbon sinks, (5) intertwinedness of the crises of biodiversity loss and climate change, (6) compound events, (7) mountain glacier loss, (8) human immobility in the face of climate risks, (9) adaptation justice, and (10) just transitions in food systems.
Technical summary
The Intergovernmental Panel on Climate Change Assessment Reports provide the scientific foundation for international climate negotiations and constitute an unmatched resource for researchers. However, the assessment cycles take multiple years. As a contribution to cross- and interdisciplinary understanding of climate change across diverse research communities, we have streamlined an annual process to identify and synthesize significant research advances. We collected input from experts in various fields using an online questionnaire and prioritized a set of 10 key research insights with high policy relevance. This year, we focus on: (1) the looming overshoot of the 1.5°C warming limit, (2) the urgency of fossil fuel phase-out, (3) challenges to scale up carbon dioxide removal, (4) uncertainties regarding future natural carbon sinks, (5) the need for joint governance of biodiversity loss and climate change, (6) advances in understanding compound events, (7) accelerated mountain glacier loss, (8) human immobility amidst climate risks, (9) adaptation justice, and (10) just transitions in food systems. We present a succinct account of these insights, reflect on their policy implications, and offer an integrated set of policy-relevant messages. This science synthesis and science communication effort also forms the basis for a policy report that helps elevate climate science each year in time for the United Nations Climate Change Conference.
Social media summary
We highlight recent and policy-relevant advances in climate change research – with input from more than 200 experts.
Caregivers of adult phase 1 oncology trial patients experience high levels of distress and face barriers to in-person supportive care. The Phase 1 Caregiver LifeLine (P1CaLL) pilot study assessed the feasibility, acceptability, and general impact of an individual telephone-based cognitive behavioral stress-management (CBSM) intervention for caregivers of phase 1 oncology trial patients.
Methods
The pilot study involved 4 weekly adapted CBSM sessions followed by participant randomization to 4 weekly cognitive behavioral therapy sessions or metta-meditation sessions. A mixed-methods design used quantitative data from 23 caregivers and qualitative data from 5 caregivers to examine the feasibility and acceptability outcomes. Feasibility was determined using recruitment, retention, and assessment completion rates. Acceptability was assessed with self-reported satisfaction with program content and participation barriers. Baseline to post-intervention changes in caregiver distress and other psychosocial outcomes were assessed for the 8-session intervention.
Results
The enrollment rate was 45.3%, which demonstrated limited feasibility relative to an a priori criterion enrollment rate of 50%. Participants completed an average of 4.9 sessions, with 9/25 (36%) completing all sessions and an 84% assessment completion rate. Intervention acceptability was high, and participants found the sessions helpful in managing stress related to the phase 1 oncology trial patient experience. Participants showed reductions in worry, isolation, and stress.
Significance of results
The P1CaLL study demonstrated adequate acceptability and limited feasibility and provided data on the general impact of the intervention on caregiver distress and other psychosocial outcomes. Caregivers of phase 1 oncology trial patients would benefit from supportive care services; a telephone-based intervention may achieve greater utilization and thus a larger impact.
Persons discharged from inpatient psychiatric services are at greatly elevated risk of harming themselves or inflicting violence on others, but no studies have reported gender-specific absolute risks for these two outcomes across the spectrum of psychiatric diagnoses. We aimed to estimate absolute risks for self-harm and interpersonal violence post-discharge according to gender and diagnostic category.
Methods
Danish national registry data were utilized to investigate 62,922 discharged inpatients, born 1967–2000. An age and gender matched cohort study was conducted to examine risks for self-harm and interpersonal violence at 1 year and at 10 years post-discharge. Absolute risks were estimated as cumulative incidence percentage values.
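As a sketch, cumulative incidence at fixed horizons can be read off a fitted survival curve. The simplification below (1 − Kaplan-Meier, which ignores competing risks such as death) and all data are illustrative, not the registry analysis itself:

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
n = 1_000
# Synthetic follow-up: years from discharge to first self-harm/violence
# event, administratively censored at 10 years.
time_to_event = rng.exponential(8.0, n)
durations = time_to_event.clip(max=10.0)
events = time_to_event <= 10.0

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events)
for t in (1, 10):
    # cumulative incidence as the complement of the survival function
    ci = 1.0 - kmf.survival_function_at_times(t).iloc[0]
    print(f"cumulative incidence at {t} y: {100 * ci:.1f}%")
```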
Results
Patients diagnosed with substance misuse disorders were at especially elevated risk: the absolute risk for either self-harm or interpersonal violence was 15.6% (95% CI 14.9, 16.3%) among males and 16.8% (15.6, 18.1%) among females at 1 year post-discharge, rising to 45.7% (44.5, 46.8%) and 39.0% (37.1, 40.8%), respectively, within 10 years. Diagnoses of personality disorders and early-onset behavioral and emotional disorders were also associated with particularly high absolute risks, whilst risks linked with schizophrenia and related disorders, mood disorders, and anxiety/somatoform disorders were considerably lower.
Conclusions
Patients diagnosed with substance misuse disorders, personality disorders and early onset behavioral and emotional disorders are at especially high risk for internally and externally directed violence. It is crucial, however, that these already marginalized individuals are not further stigmatized. Enhanced care at discharge and during the challenging transition back to life in the community is needed.
When comparing a pair of attribute values, English speakers can use a “larger” comparative (“A is larger/longer/higher/more than B”) or a “smaller” comparative (“B is smaller/shorter/lower/less than A”). This choice matters because it affects people’s inferences about the absolute magnitudes of the compared items, and influences the perceived truthfulness of the comparative sentence itself. In 4 studies (total N = 2335), we investigated the language that people use to describe ordinal relations between attributes. Specifically, we examined whether demography, emotion, and personality predict the tendency to use “larger” comparatives rather than “smaller” ones. Participants viewed pairs of items differing in a single attribute and indicated the word they would use to describe the relationship between them; they also completed a series of self-report measures. Replicating previous work, we found a robust tendency to use “larger” comparatives, both when people chose between two adjectives and when they freely produced their own words in a sentence completion task. We also found that this tendency was more pronounced in older participants, those with positive mood or outlook, and among people high in agreeableness, conscientiousness, and emotional stability. However, these effects were very small, with meta-analytic effect sizes indicating they explain less than 1% of the variance. We conclude that, although people’s use of comparative adjectives is influenced by properties of the items that are being compared, the way that people describe magnitude relations is relatively stable across variation in a range of important traits and dispositions, protecting decision-makers from a potentially undesirable source of bias in their inferences and representations of described options.
There is substantial variation in patient symptoms following psychological therapy for depression and anxiety. However, reliance on endpoint outcomes ignores additional interindividual variation during therapy. Knowing a patient's likely symptom trajectories could guide clinical decisions. We aimed to identify latent classes of patients with similar symptom trajectories over the course of psychological therapy and explore associations between baseline variables and trajectory class.
Methods
Patients received high-intensity psychological treatment for common mental health problems at National Health Service Improving Access to Psychological Therapies services in South London (N = 16 258). To identify trajectories, we performed growth mixture modelling of depression and anxiety symptoms over 11 sessions. We then ran multinomial regressions to identify baseline variables associated with trajectory class membership.
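A hedged sketch of the second step only: regressing already-assigned trajectory class on baseline variables with a multinomial model. The class assignments, predictors, and sample size here are synthetic stand-ins:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1_000  # illustrative subsample, not the full N = 16 258

df = pd.DataFrame({
    "trajectory_class": rng.integers(0, 4, n),  # 4 classes from the mixture step
    "medication": rng.integers(0, 2, n),        # prescribed psychotropics (0/1)
    "disability": rng.integers(0, 2, n),        # reported disability (0/1)
    "employed": rng.integers(0, 2, n),          # in employment (0/1)
    "impairment": rng.normal(size=n),           # baseline functional impairment
})

X = sm.add_constant(df[["medication", "disability", "employed", "impairment"]])
fit = sm.MNLogit(df["trajectory_class"], X).fit(disp=False)
print(fit.summary())
```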
Results
Trajectories of depression and anxiety symptoms were highly similar and best modelled by four classes. Three classes started with moderate-severe symptoms and showed (1) no change, (2) gradual improvement, and (3) fast improvement. A final class (4) showed initially mild symptoms and minimal improvement. Within the classes with moderate-severe baseline symptoms, patients in the two improving classes, compared with the no-change class, tended not to be prescribed psychotropic medication, tended not to report a disability, and were more often in employment. Patients showing fast improvement additionally reported lower baseline functional impairment on average.
Conclusions
Multiple trajectory classes of depression and anxiety symptoms were associated with baseline characteristics. Identifying the most likely trajectory for a patient at the start of treatment could inform decisions about the suitability and continuation of therapy, ultimately improving patient outcomes.
We summarize what we assess as the past year's most important findings within climate change research: limits to adaptation, vulnerability hotspots, new threats coming from the climate–health nexus, climate (im)mobility and security, sustainable practices for land use and finance, losses and damages, inclusive societal climate decisions and ways to overcome structural barriers to accelerate mitigation and limit global warming to below 2°C.
Technical summary
We synthesize 10 topics within climate research where there have been significant advances or emerging scientific consensus since January 2021. The selection of these insights was based on input from an international open call with broad disciplinary scope. Findings concern: (1) new aspects of soft and hard limits to adaptation; (2) the emergence of regional vulnerability hotspots from climate impacts and human vulnerability; (3) new threats on the climate–health horizon – some involving plants and animals; (4) climate (im)mobility and the need for anticipatory action; (5) security and climate; (6) sustainable land management as a prerequisite to land-based solutions; (7) sustainable finance practices in the private sector and the need for political guidance; (8) the urgent planetary imperative for addressing losses and damages; (9) inclusive societal choices for climate-resilient development and (10) how to overcome barriers to accelerate mitigation and limit global warming to below 2°C.
Social media summary
Science has evidence on barriers to mitigation and how to overcome them to avoid limits to adaptation across multiple fields.
Seabirds are declining globally and are one of the most threatened groups of birds. To halt or reverse this decline they need protection both on land and at sea, requiring site-based conservation initiatives based on seabird abundance and diversity. The Important Bird and Biodiversity Area (IBA) programme is a method of identifying the most important places for birds based on globally agreed standardised criteria and thresholds. However, while great strides have been made identifying terrestrial sites, at-sea identification is lacking. The Chagos Archipelago, central Indian Ocean, supports four terrestrial IBAs (tIBAs) and two proposed marine IBAs (mIBAs). The mIBAs are seaward extensions to breeding colonies based on outdated information, and other types of mIBA have not been explored. Here, we review the proposed seaward-extension mIBAs using up-to-date seabird status and distribution information, and use global positioning system (GPS) tracking from Red-footed Booby Sula sula – one of the most widely distributed breeding seabirds on the archipelago – to identify any pelagic mIBAs. We demonstrate that, owing to the overlapping boundaries of the seaward extension to breeding colonies and pelagic areas of importance, there is a single mIBA in the central Indian Ocean that lies entirely within the Chagos Archipelago Marine Protected Area (MPA). Covering 62,379 km², it constitutes ~10% of the MPA and, if designated, would become the 11th largest mIBA in the world and the 4th largest in the Indian Ocean. Our research strengthens the evidence of the benefits of large-scale MPAs for the protection of marine predators and provides a scientific foundation stone for marine biodiversity hotspot research in the central Indian Ocean.
No single environmental factor is a necessary or sufficient cause of mental disorder; multifactorial and transdiagnostic approaches are needed to understand the impact of the environment on the development of mental disorders across the life course.
Method
Using linked multi-agency administrative data for 71 932 children from the New South Wales Child Development Study, we applied logistic regression to examine associations between 16 environmental risk factors in early life (prenatal period to <6 years of age) and later diagnoses of mental disorder recorded in health service data (from age 6 to 13 years), both individually and summed as an environmental risk score (ERS).
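A minimal sketch of the summed-score approach, assuming synthetic binary exposures: the ERS is simply the row-sum of the 16 indicator variables, entered as a predictor in a logistic regression. All data, names, and effect sizes below are invented:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000  # illustrative, not the study's 71 932 children

# 16 hypothetical binary early-life exposures
risk_cols = [f"risk_{i}" for i in range(16)]
df = pd.DataFrame(rng.integers(0, 2, (n, 16)), columns=risk_cols)
df["ers"] = df[risk_cols].sum(axis=1)  # environmental risk score
# Synthetic outcome whose probability rises with the ERS
df["diagnosis"] = (rng.random(n) < 0.02 * (1 + df["ers"])).astype(int)

fit = sm.Logit(df["diagnosis"], sm.add_constant(df["ers"])).fit(disp=False)
print(f"OR per additional exposure: {np.exp(fit.params['ers']):.2f}")
```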
Results
The ERS was associated with all types of mental disorder diagnoses in a dose–response fashion, such that 2.8% of children with no exposure to any of the environmental factors (ERS = 0), compared to 18.3% of children with an ERS of 8 or more indicating exposure to 8 or more environmental factors (ERS ⩾ 8), had been diagnosed with any type of mental disorder up to age 13–14 years. Thirteen of the 16 environmental factors measured (including prenatal factors, neighbourhood characteristics and more proximal experiences of trauma or neglect) were positively associated with at least one category of mental disorder.
Conclusion
Exposure to cumulative environmental risk factors in early life is associated with an increased likelihood of presenting to health services in childhood for any kind of mental disorder. In many instances, these factors are preventable or capable of mitigation by appropriate public policy settings.
People diagnosed with a severe mental illness (SMI) are at elevated risk of dying prematurely compared to the general population. We aimed to understand the additional risk among people with SMI after discharge from inpatient psychiatric care, when many patients experience an acute phase of their illness.
Methods
In the Clinical Practice Research Datalink (CPRD) GOLD and Aurum datasets, adults aged 18 years and older who were discharged from psychiatric inpatient care in England between 2001 and 2018 with primary diagnoses of SMI (schizophrenia, bipolar disorder, other psychoses) were matched by age and gender with up to five individuals with SMI and without recent hospital stays. Using survival analysis approaches, cumulative incidence and adjusted hazard ratios were estimated for all-cause mortality, external and natural causes of death, and suicide. All analyses were stratified by younger, middle and older ages and also by gender.
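A sketch of the survival-analysis step using lifelines' Cox model on synthetic data; the variable names and covariate set are invented, and the real analysis used matched comparators, stratification by age/gender, and richer adjustment:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2_000

# Synthetic matched-cohort data: exposure = recent discharge from
# inpatient psychiatric care; comparison = SMI without a recent stay.
df = pd.DataFrame({
    "years": rng.exponential(3.0, n).clip(max=5.0),  # follow-up time
    "died": rng.integers(0, 2, n),                   # event indicator
    "discharged": rng.integers(0, 2, n),             # exposure of interest
    "age": rng.integers(18, 90, n),
    "prior_self_harm": rng.integers(0, 2, n),        # adjustment covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
print(cph.summary[["exp(coef)", "p"]])  # exp(coef) = adjusted hazard ratio
```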
Results
In the year after discharge, the risk of death from all causes examined was higher among discharged patients than among individuals with SMI who had not recently received inpatient psychiatric care. Suicide risk was 11.6 times (95% CI 6.4–20.9) higher in the first 3 months and remained elevated at 2–5 years after discharge (HR 2.3, 1.7–3.2). This risk elevation remained after adjustment for self-harm in the 6 months prior to the discharge date. The relative risk of dying from natural causes was raised in the first 3 months (HR 1.6, 1.3–1.9), with no evidence of elevation during the second year following discharge.
Conclusions
There is an additional risk of death by suicide and natural causes for people with SMI who have been recently discharged from inpatient care over and above the general risk among people with the same diagnosis who have not recently been treated as an inpatient. This mortality gap shows the importance of continued focus, following discharge, on individuals who require inpatient care.
This study describes risk factors associated with mortality among COVID-19 cases reported in the WHO African region between 21 March and 31 October 2020. Average hazard ratios of death were calculated using weighted Cox regression, as well as median time to death for key risk factors. We included 46 870 confirmed cases reported by eight Member States in the region. The overall incidence was 20.06 per 100 000, with a total of 803 deaths and a total observation time of 3 959 874 person-days. Male sex (aHR 1.54 (95% CI 1.31–1.81); P < 0.001), older age (aHR 1.08 (95% CI 1.07–1.08); P < 0.001), residence in a capital city (aHR 1.42 (95% CI 1.22–1.65); P < 0.001) and having one or more comorbidities (aHR 36.37 (95% CI 20.26–65.27); P < 0.001) were associated with a higher hazard of death. Being a healthcare worker reduced the average hazard of death by 40% (aHR 0.59 (95% CI 0.37–0.93); P = 0.024). Time to death was significantly shorter for persons ≥60 years (P = 0.038) and persons residing in capital cities (P < 0.001). COVID-19-related mortality in the African region is similar to that in other regions and is likely underestimated. The risk factors contributing to COVID-19-associated mortality are similar to those identified in other regions.
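Weighted Cox regression of this kind can be sketched with lifelines, which accepts per-record analysis weights; everything below (data, weights, covariates, event rates) is synthetic and illustrative, not the surveillance dataset:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 3_000

df = pd.DataFrame({
    "days": rng.exponential(60.0, n).clip(min=1.0, max=224.0),  # observation time
    "death": (rng.random(n) < 0.02).astype(int),
    "male": rng.integers(0, 2, n),
    "age": rng.integers(0, 100, n),
    "capital_city": rng.integers(0, 2, n),
    "comorbidity": rng.integers(0, 2, n),
    "weight": rng.uniform(0.5, 2.0, n),  # per-case analysis weights
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="death",
        weights_col="weight", robust=True)  # robust SEs advised with weights
print(cph.hazard_ratios_)
```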