Psychological therapies can be effective in reducing symptoms of depression and anxiety in people living with dementia (PLWD). However, factors associated with better therapy outcomes in PLWD are currently unknown.
Aims
To investigate whether dementia-specific and non-dementia-specific factors are associated with therapy outcomes in PLWD.
Method
National linked healthcare records were used to identify 1522 PLWD who attended psychological therapy services across England. Associations between various factors and therapy outcomes were explored.
Results
People with frontotemporal dementia were more likely to experience reliable deterioration in depression/anxiety symptoms compared with people with vascular dementia (odds ratio 2.98, 95% CI 1.08–8.22; P = 0.03) or Alzheimer's disease (odds ratio 2.95, 95% CI 1.15–7.55; P = 0.03). Greater depression severity (reliable recovery: odds ratio 0.95, 95% CI 0.92–0.98, P < 0.001; reliable deterioration: odds ratio 1.73, 95% CI 1.04–2.90, P = 0.04), lower work and social functioning (recovery: odds ratio 0.98, 95% CI 0.96–0.99, P = 0.002), psychotropic medication use (recovery: odds ratio 0.67, 95% CI 0.51–0.90, P = 0.01), being of working age (recovery: odds ratio 2.03, 95% CI 1.10–3.73, P = 0.02) and fewer therapy sessions (recovery: odds ratio 1.12, 95% CI 1.09–1.16, P < 0.001) were associated with worse therapy outcomes in PLWD.
Conclusions
Dementia type was generally not associated with outcomes, whereas clinical factors were consistent with those identified for the general population. Additional support and adaptations may be required to improve therapy outcomes in PLWD, particularly in those who are younger and have more severe depression.
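The odds ratios reported above are of the kind produced by a logistic regression of binary therapy outcomes (reliable recovery or reliable deterioration) on patient and treatment characteristics. A minimal sketch of such an analysis is shown below; the data file, variable names, and covariate set are illustrative assumptions, not the authors' actual code or dataset.

```python
# Hypothetical sketch of a logistic-regression analysis yielding odds ratios
# for reliable recovery; names and the input file are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("plwd_therapy_outcomes.csv")  # hypothetical linked-records extract

# Binary outcome: reliable recovery (1) vs not (0); predictors as described above
model = smf.logit(
    "reliable_recovery ~ baseline_depression + wsas + psychotropic_meds"
    " + working_age + n_sessions + C(dementia_type)",
    data=df,
).fit()

# Odds ratios with 95% confidence intervals and p-values
odds_ratios = pd.DataFrame({
    "OR": np.exp(model.params),
    "2.5%": np.exp(model.conf_int()[0]),
    "97.5%": np.exp(model.conf_int()[1]),
    "p": model.pvalues,
})
print(odds_ratios.round(3))
```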
We describe the association between job roles and coronavirus disease 2019 (COVID-19) among healthcare personnel. A wide range of hazard ratios were observed across job roles. Medical assistants had higher hazard ratios than nurses, while attending physicians, food service workers, laboratory technicians, pharmacists, residents and fellows, and temporary workers had lower hazard ratios.
Depression is an important, potentially modifiable dementia risk factor. However, it is not known whether effective treatment of depression through psychological therapies is associated with reduced dementia incidence. The aim of this study was to investigate associations between reduction in depressive symptoms following psychological therapy and the subsequent incidence of dementia.
Methods
National psychological therapy data were linked with hospital records of dementia diagnosis for 119 808 people aged 65+. Participants received a course of psychological therapy treatment in Improving Access to Psychological Therapies (IAPT) services between 2012 and 2019. Cox proportional hazards models were run to test associations between improvement in depression following psychological therapy and incidence of dementia diagnosis up to eight years later.
Results
Improvements in depression following treatment were associated with reduced rates of dementia diagnosis up to 8 years later (HR = 0.88, 95% CI 0.83–0.94), after adjustment for key covariates. Strongest effects were observed for vascular dementia (HR = 0.86, 95% CI 0.77–0.97) compared with Alzheimer's disease (HR = 0.91, 95% CI 0.83–1.00).
Conclusions
Reliable improvement in depression across psychological therapy was associated with reduced incidence of future dementia. Results are consistent with at least two possibilities. Firstly, psychological interventions to improve symptoms of depression may have the potential to contribute to dementia risk reduction efforts. Secondly, psychological therapies may be less effective in people with underlying dementia pathology or they may be more likely to drop out of therapy (reverse causality). Tackling the under-representation of older people in psychological therapies and optimizing therapy outcomes is an important goal for future research.
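The hazard ratios above come from Cox proportional hazards models of time to dementia diagnosis. The sketch below illustrates that type of survival analysis with the lifelines package; the input file, column names, and covariate choices are assumptions for illustration, not the study's actual pipeline.

```python
# Hedged sketch of a Cox proportional hazards model of dementia incidence;
# all column names and the data source are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("iapt_dementia_linked.csv")  # hypothetical linked dataset

# years_followup: time from end of therapy to dementia diagnosis or censoring
# dementia_dx: 1 if a dementia diagnosis was recorded, 0 if censored
# reliable_improvement: 1 if depression reliably improved across therapy
cph = CoxPHFitter()
cph.fit(
    df[["years_followup", "dementia_dx", "reliable_improvement",
        "age", "sex", "baseline_depression", "deprivation_decile"]],
    duration_col="years_followup",
    event_col="dementia_dx",
)
cph.print_summary()  # hazard ratios (exp(coef)) with 95% CIs
```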
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. The predictive combination of these variables might help clinicians better predict which patients will respond to lithium treatment.
Aims
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
Method
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
Results
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Conclusions
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future this approach may help inform which patients are most likely to respond to lithium treatment.
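The stratify-then-train approach described above can be illustrated with a short scikit-learn sketch: patients are first split by their combined polygenic loadings, then ridge, elastic net and random forest regressors are evaluated with leave-site-out cross-validation on clinical predictors. This is an illustrative reconstruction under assumed variable names, not the ConLi+Gen analysis code.

```python
# Illustrative sketch of genomic stratification followed by cross-validated
# regression on clinical predictors; all names are assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import ElasticNet, Ridge
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

df = pd.read_csv("lithium_response_data.csv")  # hypothetical training sample

# A priori stratification by combined MDD/SCZ polygenic loading (median-style split)
df["stratum"] = (df["prs_mdd"] + df["prs_scz"] > 0).astype(int)

clinical_cols = ["age_onset", "n_episodes", "psychosis", "family_history"]
logo = LeaveOneGroupOut()  # leave-site-out cross-validation

for stratum, sub in df.groupby("stratum"):
    X, y, sites = sub[clinical_cols], sub["lithium_response"], sub["site"]
    for name, model in [("ridge", Ridge()),
                        ("elastic net", ElasticNet()),
                        ("random forest", RandomForestRegressor(n_estimators=500))]:
        r2 = cross_val_score(model, X, y, groups=sites, cv=logo, scoring="r2")
        print(f"stratum {stratum} | {name}: mean leave-site-out R2 = {r2.mean():.3f}")
```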
Clinical and empirical reports suggest that individuals use non-suicidal self-injury (NSSI) not only to ameliorate dysphoria, but to curb suicidal ideation or avoid suicidal behaviour. To date, however, no study has quantitatively assessed whether NSSI leads to short-term reductions in suicidal ideation. Using real-time monitoring over 7 days in a sample with borderline personality disorder, we found evidence that NSSI is followed by reductions in suicidal ideation in the subsequent hours. This suggests that NSSI may serve as an effective, albeit maladaptive, coping strategy for suicidal states. These findings have important implications for the management of suicide risk and self-harm.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
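The polygenic score effects reported above (beta in years per standard deviation of PGS, with standard errors) correspond to a simple linear regression of age at onset on a standardized score with ancestry and cohort covariates. The sketch below shows that kind of test; the file, covariates, and column names are hypothetical.

```python
# Minimal sketch of a PGS association test for age at onset; illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bd_aao_pgs.csv")  # hypothetical per-patient data

# Standardize the PGS so beta is interpretable per standard deviation
df["pgs_scz_z"] = (df["pgs_scz"] - df["pgs_scz"].mean()) / df["pgs_scz"].std()

# Adjust for genetic ancestry principal components and cohort, as is typical
fit = smf.ols(
    "age_at_onset ~ pgs_scz_z + pc1 + pc2 + pc3 + pc4 + C(cohort)",
    data=df,
).fit()
print(fit.params["pgs_scz_z"], fit.bse["pgs_scz_z"])  # beta (years) and s.e.
```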
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool but clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics, including (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha−1 in corn and 3,000 to 5,500 kg ha−1 in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies based on weed life-history traits. Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
Mobile devices with health apps, direct-to-consumer genetic testing, crowd-sourced information, and other data sources have enabled research by new classes of researchers. Independent researchers, citizen scientists, patient-directed researchers, self-experimenters, and others are not covered by federal research regulations because they are not recipients of federal financial assistance or conducting research in anticipation of a submission to the FDA for approval of a new drug or medical device. This article addresses the difficult policy challenge of promoting the welfare and interests of research participants, as well as the public, in the absence of regulatory requirements and without discouraging independent, innovative scientific inquiry. The article recommends a series of measures, including education, consultation, transparency, self-governance, and regulation to strike the appropriate balance.
Organic grain producers are interested in reducing tillage to conserve soil and decrease labor and fuel costs. We examined agronomic and economic tradeoffs associated with alternative strategies for reducing tillage frequency and intensity in a cover crop–soybean [Glycine max (L.) Merr.] sequence within a corn (Zea mays L.)–soybean–spelt (Triticum spelta L.) organic cropping system experiment in Pennsylvania. Tillage-based soybean production preceded by a cover crop mixture of annual ryegrass (Lolium perenne L. ssp. multiflorum), orchardgrass (Dactylis glomerata L.) and forage radish (Raphanus sativus L.) interseeded into corn grain (Z. mays L.) was compared with reduced-tillage soybean production preceded by roller-crimped cereal rye (Secale cereale L.) that was sown after corn silage. Total aboveground weed biomass did not differ between soybean production strategies. Each strategy, however, was characterized by high inter-annual variability in weed abundance. Tillage-based soybean production marginally increased grain yield by 0.28 Mg ha−1 compared with reduced-tillage soybean. A path model of soybean yield indicated that soybean stand establishment and weed biomass were primary drivers of yield, but soybean production strategy had a measurable effect on yields due to factors other than within-season weed–crop competition. Cumulative tillage frequency and intensity were quantified for each cover crop–soybean sequence using the Soil Tillage Intensity Rating (STIR) index. The reduced-tillage soybean sequence resulted in 50% less soil disturbance than the tillage-based soybean sequence across study years. Finally, enterprise budget comparisons showed that the reduced-tillage soybean sequence resulted in lower input costs than the tillage-based soybean sequence but was approximately $114 ha−1 less profitable because of lower average yields.
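The enterprise-budget tradeoff described above amounts to simple partial-budget arithmetic: a yield penalty weighed against input-cost savings. The sketch below illustrates that calculation with placeholder prices and costs chosen so the gap is on the order of the reported ~$114 ha−1; the figures are not the study's budget data.

```python
# Back-of-the-envelope partial-budget comparison of the two soybean strategies;
# prices and cost figures are placeholders, not the study's enterprise budgets.
def net_return(yield_mg_ha: float, price_per_mg: float, input_costs: float) -> float:
    """Net return ($ ha-1) = gross revenue - input costs."""
    return yield_mg_ha * price_per_mg - input_costs

# Hypothetical values: reduced tillage lowers input costs but also lowers yield
tilled = net_return(yield_mg_ha=3.00, price_per_mg=550.0, input_costs=700.0)
reduced = net_return(yield_mg_ha=3.00 - 0.28, price_per_mg=550.0, input_costs=660.0)

print(f"tillage-based: ${tilled:.0f} ha-1, reduced-till: ${reduced:.0f} ha-1")
print(f"difference: ${tilled - reduced:.0f} ha-1")
```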
A 2018 workshop on the White Mountain Apache Tribe lands in Arizona examined ways to enhance investigations into cultural property crime (CPC) through applications of rapidly evolving methods from archaeological science. CPC (also looting, graverobbing) refers to unauthorized damage, removal, or trafficking in materials possessing blends of communal, aesthetic, and scientific values. The Fort Apache workshop integrated four generally partitioned domains of CPC expertise: (1) theories of perpetrators’ motivations and methods; (2) recommended practice in sustaining public and community opposition to CPC; (3) tactics and strategies for documenting, investigating, and prosecuting CPC; and (4) forensic sedimentology—uses of biophysical sciences to link sediments from implicated persons and objects to crime scenes. Forensic sedimentology served as the touchstone for dialogues among experts in criminology, archaeological sciences, law enforcement, and heritage stewardship. Field visits to CPC crime scenes and workshop deliberations identified pathways toward integrating CPC theory and practice with forensic sedimentology’s potent battery of analytic methods.
In preparation for a multisite antibiotic stewardship intervention, we assessed knowledge and attitudes toward management of asymptomatic bacteriuria (ASB) plus teamwork and safety climate among providers, nurses, and clinical nurse assistants (CNAs).
Design:
Prospective surveys during January–June 2018.
Setting:
All acute and long-term care units of 4 Veterans’ Affairs facilities.
Methods:
The survey instrument included 2 previously tested subcomponents: the Kicking CAUTI survey (ASB knowledge and attitudes) and the Safety Attitudes Questionnaire (SAQ).
Results:
A total of 534 surveys were completed, with an overall response rate of 65%. Cognitive biases impacting management of ASB were identified. For example, providers presented with a case scenario of an asymptomatic patient with a positive urine culture were more likely to give antibiotics if the organism was resistant to antibiotics. Additionally, more than 80% of both nurses and CNAs indicated that foul smell is an appropriate indication for a urine culture. We found significant interprofessional differences in teamwork and safety climate (defined as attitudes about issues relevant to patient safety), with CNAs having highest scores and resident physicians having the lowest scores on self-reported perceptions of teamwork and safety climates (P < .001). Among providers, higher safety-climate scores were significantly associated with appropriate risk perceptions related to ASB, whereas social norms concerning ASB management were correlated with higher teamwork climate ratings.
Conclusions:
Our survey revealed substantial misunderstanding regarding management of ASB among providers, nurses, and CNAs. Educating and empowering these professionals to discourage unnecessary urine culturing and inappropriate antibiotic use will be key components of antibiotic stewardship efforts.
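The interprofessional differences in safety-climate scores reported above (P < .001 across roles) could be tested with a standard nonparametric comparison of scores by professional role. The sketch below is illustrative only: the data file, column names, and the choice of a Kruskal-Wallis test are assumptions, not the study's analysis plan.

```python
# Hedged sketch of comparing safety-climate scores across professional roles.
import pandas as pd
from scipy import stats

df = pd.read_csv("saq_responses.csv")  # hypothetical survey extract

groups = [g["safety_climate_score"].dropna()
          for _, g in df.groupby("professional_role")]
h_stat, p_value = stats.kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")

# Mean scores by role (CNAs highest, resident physicians lowest per the abstract)
print(df.groupby("professional_role")["safety_climate_score"].mean().sort_values())
```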
Alzheimer's disease and vascular dementia are associated with overlapping symptoms of anxiety and depression. More accurate discrimination between emerging neuropsychiatric and cognitive symptoms would better assist illness detection. The potential for protection against cognitive decline and dementia following early identification and intervention of neuropsychiatric symptoms warrants investigation.
Drawing on a landscape analysis of existing data-sharing initiatives, in-depth interviews with expert stakeholders, and public deliberations with community advisory panels across the U.S., we describe features of the evolving medical information commons (MIC). We identify participant-centricity and trustworthiness as the most important features of an MIC and discuss the implications for those seeking to create a sustainable, useful, and widely available collection of linked resources for research and other purposes.
The hippocampus plays an important role in psychopathology and treatment outcome. While posterior hippocampus (PH) may be crucial for the learning process that exposure-based treatments require, affect-focused treatments might preferentially engage anterior hippocampus (AH). Previous studies have distinguished the different functions of these hippocampal sub-regions in memory, learning, and emotional processes, but not in treatment outcome. Examining two independent clinical trials, we hypothesized that anterior hippocampal volume would predict affect-focused treatment outcome [Interpersonal Psychotherapy (IPT); Panic-Focused Psychodynamic Psychotherapy (PFPP)], whereas posterior hippocampal volume would predict exposure-based treatment outcome [Prolonged Exposure (PE); Cognitive Behavioral Therapy (CBT); Applied Relaxation Training (ART)].
Methods
Thirty-five patients with posttraumatic stress disorder (PTSD) and 24 with panic disorder (PD) underwent structural magnetic resonance imaging (MRI) before randomization to affect-focused (IPT for PTSD; PFPP for PD) or exposure-based treatments (PE for PTSD; CBT or ART for PD). AH and PH volume were regressed with clinical outcome changes.
Results
Baseline whole hippocampal volume did not predict post-treatment clinical severity scores in any treatment. For affect-focused treatments, but not exposure-based treatments, anterior hippocampal volume predicted clinical improvement. Smaller AH correlated with greater affect-focused treatment improvement. Posterior hippocampal volume did not predict treatment outcome.
Conclusions
This is the first study to explore associations between hippocampal volume sub-regions and treatment outcome in PTSD and PD. Convergent results suggest that affect-focused treatment may influence the clinical outcome through the ‘limbic’ AH, whereas exposure-based treatments do not. These preliminary, theory-congruent, therapeutic findings require replication in a larger clinical trial.
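The core analysis described above regresses clinical outcome change on baseline hippocampal sub-region volumes, with the key question being whether the AH effect differs by treatment type. A minimal sketch is below; the merged data file, variable names, and covariates (e.g., intracranial volume) are assumptions, not the trials' analysis code.

```python
# Illustrative sketch: does anterior hippocampal volume predict improvement,
# and does that prediction differ between affect-focused and exposure-based arms?
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hippocampus_trials.csv")  # hypothetical merged trial data

fit = smf.ols(
    "symptom_change ~ ah_volume * C(treatment_type) + ph_volume + icv + age",
    data=df,
).fit()
print(fit.summary())  # interaction term tests treatment-specific prediction
```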
Bipolar disorder is less prevalent in older people but accounts for 8–10% of psychiatric admissions. Treating and managing bipolar disorder in older people is challenging because of medical comorbidity. We review the cognitive problems observed in older people, explore why these are important and consider current treatment options. There are very few studies examining the cognitive profiles of older people with bipolar disorder and symptomatic depression and mania, and these show significant impairments in executive function. Most studies have focused on cognitive impairment in euthymic older people: as in euthymic adults of working age, significant impairments are observed in tests of attention, memory and executive function/processing speeds. Screening tests are not always helpful in euthymic older people as the impairment can be relatively subtle, and more in-depth neuropsychological testing may be needed to show impairments. Cognitive impairment may be more pronounced in older people with ‘late-onset’ bipolar disorder than in those with ‘early-onset’ disorder. Strategies to address symptomatic cognitive impairment in older people include assertive treatment of the mood disorder, minimising drugs that can adversely affect cognition, optimising physical healthcare and reducing relapse rates.
LEARNING OBJECTIVES
After reading this article you will be able to:
• understand that cognitive impairment in euthymic older people with bipolar disorder is similar to that in working-age adults with the disorder, affecting attention, memory and executive function/processing speeds
• recognise that cognitive impairment in older people is likely to be a major determinant of functional outcomes
• implement approaches to treat cognitive impairment in bipolar disorder.
DECLARATION OF INTEREST
B.J.S. consults for Cambridge Cognition, PEAK (www.peak.net) and Mundipharma.
While our fascination with understanding the past is sufficient to warrant an increased focus on synthesis, solutions to important problems facing modern society require understandings based on data that only archaeology can provide. Yet, even as we use public monies to collect ever-greater amounts of data, modes of research that can stimulate emergent understandings of human behavior have lagged behind. Consequently, a substantial amount of archaeological inference remains at the level of the individual project. We can more effectively leverage these data and advance our understandings of the past in ways that contribute to solutions to contemporary problems if we adapt the model pioneered by the National Center for Ecological Analysis and Synthesis to foster synthetic collaborative research in archaeology. We propose the creation of the Coalition for Archaeological Synthesis coordinated through a U.S.-based National Center for Archaeological Synthesis. The coalition will be composed of established public and private organizations that provide essential scholarly, cultural heritage, computational, educational, and public engagement infrastructure. The center would seek and administer funding to support collaborative analysis and synthesis projects executed through coalition partners. This innovative structure will enable the discipline to address key challenges facing society through evidentially based, collaborative synthetic research.
The chemical composition of soil from the Glasgow (UK) urban area was used to identify the controls on the availability of potentially harmful elements (PHEs) in soil to humans. Total and bioaccessible concentrations of arsenic (As), chromium (Cr) and lead (Pb) in 27 soil samples, collected from different land uses, were coupled to information on their solid-phase partitioning derived from sequential extraction data. The total element concentrations in the soils were in the range <0.1–135 mg kg−1 for As; 65–3680 mg kg−1 for Cr and 126–2160 mg kg−1 for Pb, with bioaccessible concentrations averaging 27, 5 and 27% of the total values, respectively. Land use does not appear to be a predictor of contamination; however, the history of the contamination is critically important. The Chemometric Identification of Substrates and Element Distribution (CISED) sequential chemical extraction and associated self-modelling mixture resolution analysis identified three sample groupings and 16 geochemically distinct phases (substrates). These were related to iron (n = 3), aluminium–silicon (Al–Si; n = 2), calcium (n = 3), phosphorus (n = 1), magnesium (Mg; n = 3), manganese (n = 1) and easily extractable (n = 3), which was predominantly made up of sodium and sulphur. As, Cr and Pb were respectively found in 9, 10 and 12 of the identified phases, with bioaccessible As predominantly associated with easily extractable phases, bioaccessible Cr with the Mg-dominated phases and bioaccessible Pb with both the Mg-dominated and Al–Si phases. Using a combination of the Unified BARGE Method to measure the bioaccessibility of PHEs and CISED to identify the geochemical sources has allowed a much better understanding of the complexity of PHE mobility in the Glasgow urban environment. This approach can be applied to other urban environments and cases of soil contamination, and made part of land-use planning.
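The bioaccessible percentages quoted above (27, 5 and 27% of total As, Cr and Pb) are simply the ratio of bioaccessible to total concentration for each element. A small sketch of that calculation follows; the file and column names are illustrative, not the study's dataset.

```python
# Small sketch of computing bioaccessible fractions (bioaccessible / total, %)
# for As, Cr and Pb; column names and the input file are hypothetical.
import pandas as pd

df = pd.read_csv("glasgow_soils.csv")  # hypothetical 27-sample extract

for element in ["As", "Cr", "Pb"]:
    frac = 100 * df[f"{element}_bioaccessible_mg_kg"] / df[f"{element}_total_mg_kg"]
    print(f"{element}: mean bioaccessible fraction = {frac.mean():.0f}% of total")
```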
Palynological and sedimentological data from Lake Telmen, in north-central Mongolia, permit qualitative reconstruction of relative changes in moisture balance throughout the mid- to late Holocene. The climate of the Atlantic period (7500–4500 yr ago) was relatively arid, indicating that Lake Telmen lay beyond the region of enhanced precipitation delivered by the expanded Asian monsoon. Maximum humidity is recorded between ∼4500 and 1600 cal yr B.P., during the Subboreal (4500–2500 yr ago) and early Subatlantic (2500 yr ago–present) periods. Additional humid intervals during the Medieval Warm Epoch (∼1000–1300 A.D. or 950–650 yr B.P.) and the Little Ice Age (1500–1900 A.D. or 450–50 yr B.P.) demonstrate the lack of long-term correlation between temperature and moisture availability in this region. A brief aridification centered around 1410 cal yr B.P. encompasses a decade of cold temperatures and summer frost between A.D. 536 and 545 (1414–1405 yr B.P.) inferred from records of Mongolian tree-ring widths. These data suggest that steppe vegetation of the Lake Telmen region is sensitive to centennial- and decadal-scale climatic perturbations.