Objectives/Goals: Collaborations between translational science programs and academic health sciences libraries can enhance research impact by improving efficiency, leveraging diverse professional expertise, and opening new avenues for joint work between librarians and translational science programs. Methods/Study Population: A team science approach was utilized, integrating findings from a literature review, practical experiences of health sciences librarians, and collaborative writing. An analysis of case studies from institutions with successful partnerships explored the roles of libraries in partnering with translational science programs. The data collected were mapped to the Clinical and Translational Science Award Program’s five functional areas outlined in the Notice of Funding Opportunity PAR-24-272. Librarians from 21 institutions engaged in discussions and collaborative writing to share insights and identify key factors driving successful partnerships. Results/Anticipated Results: Academic health sciences libraries play a crucial role in enhancing translational science programs through expert knowledge management, facilitation of research dissemination, and support for interdisciplinary collaboration. Results from this project include a table outlining 16 specific opportunities, mapped across five functional areas and six topical categories, for translational science programs and libraries to collaborate effectively. Successful partnerships demonstrate improved research workflows, increased interactions between researchers and libraries, and accelerated translation of discoveries into clinical settings. These collaborations illustrate opportunities for other institutions to adopt as they consider best practices in supporting translational science. Discussion/Significance of Impact: By combining the resources and expertise of libraries and translational science programs, these partnerships enhance the ability to transform scientific discoveries into real-world clinical applications, drive innovation, and amplify the contributions of both libraries and translational science programs.
Contaminated surfaces in clinics pose a pathogen transmission risk. Far ultraviolet-C light (UVC), with a favorable safety profile for human exposure, has the potential for continuous pathogen inactivation in occupied clinical areas. This study demonstrated real-world bioburden reduction on surfaces, despite frequent contamination from routine use by staff and patients in clinics.
Emergency department (ED) visits for epilepsy are common, costly, and often clinically unnecessary. Care pathways (CPs) configured to divert people away from the ED offer an alternative. The aim was to measure patient and carer preferences for alternative CPs and to explore, with a wider group of stakeholders, the feasibility of implementing the preferred CPs in National Health Service (NHS) England.
Methods
Formative work (provider survey, service-user interviews, knowledge exchange, and think-aloud piloting) informed a discrete choice experiment (DCE) with six attributes: access to care plan, conveyance, time, epilepsy specialist today, general practitioner (GP) notification, and epilepsy specialist follow-up. This was hosted online with random assignment to two of three scenarios (home, public, or atypical). Logistic regression generated preference weights that were used to calculate the utility of CPs. The highest ranked CPs plus a status quo were discussed at three online knowledge exchange workshops. The nominal group technique was used to ascertain stakeholder views on preference evidence and to seek group consensus on optimal feasible alternatives.
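To illustrate the final analytic step described above, the sketch below scores candidate care pathways by summing attribute-level preference weights, as a DCE analysis does once logistic regression has produced the weights. The weights, attribute names, and pathway definitions are hypothetical placeholders, not the study's estimates.

```python
# Hedged sketch: ranking care pathways (CPs) by summed preference weights.
# The weights below are hypothetical placeholders on the log-odds scale,
# not the study's estimates.

weights = {
    "care_plan_access": 0.9,               # paramedic can access the care plan
    "non_conveyance": 0.7,                 # treated at scene, not conveyed to ED
    "time_1_3_hours": 0.5,                 # episode handled within 1-3 hours
    "specialist_today": 0.8,               # epilepsy specialist input the same day
    "gp_notified": 0.3,                    # GP notified of the episode
    "specialist_followup_2_3_weeks": 0.6,  # specialist follow-up in 2-3 weeks
}

def utility(cp_attributes):
    """Utility of a care pathway = sum of its attribute-level weights."""
    return sum(weights[a] for a in cp_attributes)

optimal_cp = list(weights)      # pathway with every preferred attribute level
status_quo = ["gp_notified"]    # e.g. conveyance to ED with GP notification only
ranking = sorted([("optimal", utility(optimal_cp)),
                  ("status quo", utility(status_quo))],
                 key=lambda kv: kv[1], reverse=True)
print(ranking)  # the optimal configuration outranks the status quo
```

Because these utilities sit on a log-odds scale, only differences between pathways are meaningful, which is why rankings rather than absolute values are reported.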
Results
A sample of 427 people with epilepsy and 167 friends or family members completed the survey. People with epilepsy preferred paramedics to have access to their care plan, non-conveyance, a time of one to three hours, contact with an epilepsy specialist the same day, GP notification, and specialist follow-up within two to three weeks. Family and friends differed when considering atypical seizures, favoring conveyance to urgent treatment centers and a shorter time. The optimal configuration of services from service users’ perspectives outranked current practice. Knowledge exchange (n=27 participants) judged the optimal CP feasible but identified two scenarios for resource reallocation: the care plan substituting for specialist advice on the same day, and times of strain on NHS resources.
Conclusions
Preferences differed from current practice but varied little by seizure type or stakeholder group. This study clearly identified optimal and feasible alternative CPs. The mixed-methods approach allowed for robust measurement of preferences, whilst knowledge exchange examined feasibility to enhance future implementation of optimal alternative CPs.
Deployment of an electronic automated advisory vital signs monitoring and notification system to signal clinical deterioration is associated with significant improvement in clinical outcomes. This study aimed to estimate the incremental cost per quality-adjusted life-year (QALY) gained with an electronic automated advisory notification system, compared with standard care.
Methods
A decision analytic model was developed to estimate the cost effectiveness of an electronic automated advisory notification system, compared with standard care, in adults admitted to a district general hospital. Analyses considered the following: (i) cost effectiveness (cost per event avoided) based on a before-and-after study (n=3,787) that recorded rates of acute myocardial infarction, pulmonary embolism, acute pulmonary edema, respiratory failure, stroke, severe sepsis, acute renal failure, cardiopulmonary arrest, admission to the intensive care unit, and death; and (ii) cost utility (cost per QALY) over a lifetime horizon extrapolated using published data. The analysis was conducted from the perspective of the National Health Service (NHS) in the UK.
Results
The automated notification system was more effective (2.7 fewer events per 100 patients) and was cost saving, with an incremental cost of −GBP12.17 [−EUR14.07] per patient admission (95% CI: −GBP182.07 [−EUR211.20], GBP154.80 [EUR179.57]). The automated notification system was dominant over a lifetime horizon, demonstrating a positive incremental QALY gain (0.0287 QALYs, equivalent to approximately 10 days of perfect health) and an incremental cost of −GBP55.35 (−EUR64.02). At a threshold of GBP20,000 per QALY (EUR23,126), the probability of automated monitoring being cost effective in the NHS was 0.81. Increased use of cableless sensors may reduce the cost savings, but the intervention remained cost effective at 100 percent usage (incremental cost-effectiveness ratio GBP3,107 per QALY [EUR3,594 per QALY]).
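For readers unfamiliar with the decision rules behind these figures, the short sketch below reproduces the arithmetic: an intervention that gains QALYs at negative incremental cost is dominant, and net monetary benefit (NMB) gives an equivalent threshold test. The inputs are the lifetime-horizon figures reported above; the code itself is illustrative only.

```python
# Hedged sketch of the cost-utility decision rules, using the
# lifetime-horizon figures reported above.

incremental_cost = -55.35   # GBP per admission (negative = cost saving)
incremental_qaly = 0.0287   # QALYs gained per patient
threshold = 20_000          # GBP per QALY willingness-to-pay threshold

if incremental_cost <= 0 and incremental_qaly > 0:
    print("Dominant: more effective and cost saving vs standard care")
else:
    icer = incremental_cost / incremental_qaly  # ICER = dCost / dQALY
    verdict = "cost effective" if icer < threshold else "not cost effective"
    print(f"ICER = GBP{icer:,.0f} per QALY ({verdict})")

# Net monetary benefit gives an equivalent test: adopt if NMB > 0.
nmb = threshold * incremental_qaly - incremental_cost
print(f"Net monetary benefit = GBP{nmb:,.2f} per admission")  # 629.35 here
```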
Conclusions
An automated notification system for adult patients admitted to general wards appears to be a cost-effective strategy in the NHS. The analysis suggests that adopting this technology could be good use of scarce resources. The impact of automated monitoring solutions on staffing warrants further exploration and may show additional value in adopting such technology.
Academic health sciences libraries (“libraries”) offer services that span the entire research lifecycle, positioning them as natural partners in advancing clinical and translational science. Many libraries enjoy active and productive collaborations with Clinical and Translational Science Award (CTSA) Program hubs and other translational initiatives like the IDeA Clinical & Translational Research Network. This article explores areas of potential partnership between libraries and Translational Science Hubs (TSH), highlighting areas where libraries can support the CTSA Program’s five functional areas outlined in the Notice of Funding Opportunity. It serves as a primer for TSH and libraries to explore potential collaborations, demonstrating how libraries can connect researchers to services and resources that support the information needs of TSH.
Research shows that highly educated individuals have at least 20 graphomotor features associated with clock drawing with hands set for '10 after 11' (Davoudi et al., 2021). Clock drawing features in individuals with fewer years of education remain poorly understood. In the current study, we compared older adults with < 8 years of education to those with > 9 years of education on the number and pattern of graphomotor feature relationships in the clock drawing command condition.
Participants and Methods:
Participants aged 65+ from the University of Florida (UF) and UF Health (N= 10,491) completed both command and copy conditions of the digital Clock Drawing Test (dCDT) as part of a federally-funded investigation. Participants were categorized into two education groups: < 8 years of education (n= 304) and > 9 years of education (n= 10,187). Propensity score matching was then used to match participants from each subgroup (n= 266 per subgroup) on the following demographic characteristics: age, sex, race, and ethnicity (n= 532, age= 74.99±6.21, education= 10.41±4.45, female= 42.7%, non-white= 32.0%). Network models were derived using Bayesian Structure Learning (BSL) with the hill-climbing algorithm to obtain optimal directed acyclic graphs (DAGs) from all possible solutions in each subgroup for the dCDT command condition.
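As a rough sketch of the matching step (not the study's actual pipeline), the code below fits a logistic-regression propensity model on synthetic demographic covariates and greedily pairs each low-education participant with the nearest high-education participant on the estimated score. The data, and the matching-with-replacement choice, are assumptions for illustration.

```python
# Hedged sketch of 1:1 propensity score matching on demographics.
# All data are synthetic; only the group sizes mirror those reported above.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n_low, n_high = 304, 10_187                      # <8 vs >9 years of education
X = rng.normal(size=(n_low + n_high, 4))         # age, sex, race, ethnicity (coded)
group = np.r_[np.ones(n_low), np.zeros(n_high)]  # 1 = low-education group

# Propensity score: modeled probability of low-education group membership.
ps = LogisticRegression(max_iter=1000).fit(X, group).predict_proba(X)[:, 1]

# Greedy nearest-neighbor matching on the propensity score (with
# replacement here; the study's exact procedure may differ).
low_idx = np.where(group == 1)[0]
high_idx = np.where(group == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[high_idx].reshape(-1, 1))
_, matches = nn.kneighbors(ps[low_idx].reshape(-1, 1))
matched_high = high_idx[matches.ravel()]
print(f"{len(low_idx)} matched pairs")           # study retained n = 266 per group
```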
Results:
Both education groups retained 13 of 91 possible edges (14.29%). For the < 8 years of education group (education= 6.65±1.74, ASA= 3.08±0.35), the network included 3 clock face (CF), 7 digit, and 3 hour hand (HH) and minute hand (MH) independent, or “parent,” features connected to the retained edges (BIC= -7395.24). In contrast, the > 9 years of education group (education= 14.17±2.88, ASA= 2.90±0.46) network retained 1 CF, 6 digit, and 5 HH and MH parent features, plus 1 additional parent feature representing the total number of pen strokes (BIC= -6689.92). In both groups, greater distance from the HH to the clock center was associated with greater distance from the MH to the clock center [ßz(< 8 years)= 0.73, ßz(> 9 years)= 0.76]. Groups were similar in digit height relative to the distance of the digits from the CF [ßz(< 8 years)= 0.27, ßz(> 9 years)= 0.56]. A larger HH angle was associated with a larger MH angle across groups [ßz(< 8 years)= 0.28, ßz(> 9 years)= 0.23].
Conclusions:
Education groups differed in the ratio of dCDT parent feature types. Specifically, command clock production in older adults with < 8 years of education relied more heavily on CF parent features. In contrast, older adults with > 9 years of education relied more heavily on HH and MH parent features. Individuals with < 8 years of education may less frequently represent the concept of time in the clock drawing command condition. This study highlights the importance of considering education level in interpreting dCDT scores and features.
Chronic musculoskeletal pain is associated with neurobiological, physiological, and cellular measures. Importantly, we have previously demonstrated that a biobehavioral and psychosocial resilience index appears to have a protective relationship with the same biomarkers. Less is known regarding the relationships between chronic musculoskeletal pain, protective factors, and brain aging. This study investigates the relationships between clinical pain, a resilience index, and brain age. We hypothesized that higher reported chronic pain would correlate with older-appearing brains and that the resilience index would attenuate the strength of the relationship between chronic pain and brain age.
Participants and Methods:
Participants were drawn from an ongoing observational multisite study and included adults with chronic pain who also reported knee pain (N = 135; age = 58.3 ± 8.1; 64% female; 49% non-Hispanic Black, 51% non-Hispanic White; education Mdn = some college; income level Mdn = $30,000 - $40,000; MoCA M = 24.27 ± 3.49). Measures included the Graded Chronic Pain Scale (GCPS) characteristic pain intensity (CPI) and disability scores, total painful body sites, and a cognitive screen (MoCA). The resilience index consisted of validated biobehavioral (e.g., smoking, waist/hip ratio, and active coping) and psychosocial measures (e.g., optimism, positive affect, negative affect, perceived stress, and social support). T1-weighted MRI data were obtained. Surface area metrics were calculated in FreeSurfer using the Human Connectome Project's multi-modal cortical parcellation scheme. We calculated brain age in R using previously validated and trained machine learning models. Chronological age was subtracted from predicted brain age to generate a brain age gap (BAG), with higher BAG scores indicating a predicted age older than chronological age. Three parallel hierarchical regression models (each containing one of the three pain measures), each with three blocks, were performed to assess the relationships between chronic pain and the resilience index in relation to BAG, adjusting for covariates. For each model, Block 1 entered the covariates, Block 2 entered a pain score, and Block 3 entered the resilience index.
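To make the outcome and model structure concrete, here is a minimal sketch, on synthetic stand-in data, of computing the BAG and the R-squared change contributed by each regression block. Variable names are illustrative and none of the numbers correspond to the study's data.

```python
# Hedged sketch of the brain age gap (BAG) outcome and the blockwise
# (hierarchical) regression logic. All data are synthetic placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 135
chron_age = rng.uniform(45, 75, n)
predicted_age = chron_age + rng.normal(0, 5, n)  # output of a trained model
bag = predicted_age - chron_age                  # > 0 = older-appearing brain

covariates = rng.normal(size=(n, 2))   # e.g. study site, education/income
pain = rng.normal(size=n)              # e.g. GCPS characteristic pain intensity
resilience = rng.normal(size=n)        # biobehavioral/psychosocial index

def r2(X):
    """R-squared of an OLS model of BAG on the given predictors."""
    return sm.OLS(bag, sm.add_constant(X)).fit().rsquared

r2_block1 = r2(covariates)                                       # covariates only
r2_block2 = r2(np.column_stack([covariates, pain]))              # + pain score
r2_block3 = r2(np.column_stack([covariates, pain, resilience]))  # + resilience
print("R2 change, pain block:", r2_block2 - r2_block1)
print("R2 change, resilience block:", r2_block3 - r2_block2)
```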
Results:
GCPS CPI (R2 change = 0.033, p = 0.027) and GCPS disability (R2 change = 0.038, p = 0.017) significantly predicted BAG beyond the effects of the covariates, but total pain sites (p = 0.865) did not. The resilience index was negatively correlated with and a significant predictor of BAG in all three models (p < 0.05). With the resilience index added in Block 3, both the GCPS CPI (p = 0.067) and GCPS disability (p = 0.066) measures were no longer significant in their respective models. Additionally, higher education/income (p = 0.016) and study site (p = 0.031) were also significant predictors of BAG.
Conclusions:
In this sample, higher reported chronic pain correlated with older-appearing brains, and higher resilience attenuated this relationship. The biobehavioral and psychosocial resilience index was associated with younger-appearing brains. While our data are cross-sectional, the findings are encouraging in suggesting that interventions targeting both chronic pain and biobehavioral and psychosocial factors (e.g., coping strategies, positive and negative affect, smoking, and social support) might buffer brain aging. Future directions include assessing whether chronic pain and resilience factors can predict brain aging over time.
Research shows that highly educated individuals have at least 20 graphomotor features associated with clock drawing with hands set for '10 after 11' (Davoudi et al., 2021). Clock drawing features in individuals with fewer years of education remain poorly understood. In the current study, we compared older adults with < 8 years of education to those with > 9 years of education on the number and pattern of graphomotor feature relationships in the clock drawing copy condition.
Participants and Methods:
Participants aged 65+ from the University of Florida (UF) and UF Health (N= 10,491) completed command and copy digital Clock Drawing Tests (dCDT) as part of a federally-funded investigation. Participants were categorized into two groups: < 8 years of education (n= 304) and > 9 years of education (n= 10,187). Propensity score matching was used to match participants from each subgroup (n= 266 per subgroup) on the following: age, sex, race, and ethnicity (n= 532, age= 74.99±6.21, education= 10.41±4.45, female= 42.7%, non-white= 32.0%). Network models were derived using Bayesian Structure Learning (BSL) with the hill-climbing algorithm to obtain optimal directed acyclic graphs (DAGs) from all possible solutions in each subgroup for the dCDT copy condition.
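For readers unfamiliar with score-based structure learning, the sketch below runs a hill-climbing search over DAGs with a BIC score, using pgmpy on synthetic, discretized stand-ins for a few graphomotor features. The library choice, feature names, and data are assumptions for illustration; the study's implementation is not specified in this abstract.

```python
# Hedged sketch of score-based Bayesian structure learning with a
# hill-climbing search. pgmpy is an assumed library choice, and the
# features are synthetic, discretized stand-ins for graphomotor measures.
import numpy as np
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

rng = np.random.default_rng(2)
n = 266  # matched subgroup size reported above
df = pd.DataFrame({
    "clock_face_area": rng.integers(0, 3, n),
    "digit_height": rng.integers(0, 3, n),
    "hour_hand_distance": rng.integers(0, 3, n),
    "minute_hand_distance": rng.integers(0, 3, n),
})

search = HillClimbSearch(df)
dag = search.estimate(scoring_method=BicScore(df))  # best-scoring DAG found
print(sorted(dag.edges()))  # directed edges retained by the search
```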
Results:
The < 8 years of education group (education= 6.65±1.74, ASA= 3.08±0.35) retained 12 of 91 possible edges (13.19%, BIC= -7775.50). The network retained 2 clock face (CF), 5 digit, and 5 hour hand (HH) and minute hand (MH) independent, or “parent,” features connected to the retained edges. In contrast, the > 9 years of education group (education= 14.17±2.88, ASA= 2.90±0.46) network retained 15 of 91 possible edges (16.48%, BIC= -8261.48). The network retained 2 CF, 6 digit, and 4 HH and MH parent features, plus an additional 3 total-stroke parent features. In both groups, greater distance from the HH to the clock center was associated with greater distance from the MH to the clock center (ßz= 0.73 in both). Groups were similar in digit width relative to digit height [ßz(< 8 years)= 0.72, ßz(> 9 years)= 0.74]. Digit height was related to CF area [ßz(< 8 years)= 0.44, ßz(> 9 years)= 0.62], and CF area was related to the distance of the digits from the CF across groups [ßz(< 8 years)= 0.39, ßz(> 9 years)= 0.46]. Greater distance from the MH to the clock center was associated with a smaller MH angle [ßz(< 8 years)= -0.35, ßz(> 9 years)= -0.31], whereas greater digit misplacement was associated with a larger MH angle across groups [ßz(< 8 years)= 0.14, ßz(> 9 years)= 0.29].
Conclusions:
Education groups differed in the ratio of dCDT parent feature types. Specifically, copy clock production in older adults with < 8 years of education relied more evenly on CF, digit, and HH and MH parent features. In contrast, those with > 9 years of education additionally relied on total-stroke parent features. Individuals with < 8 years of education may rely more heavily on visual referencing when copying a clock. This study highlights the importance of considering education level in interpreting dCDT scores and features.
Cognitive behavior therapy (CBT) is effective for most patients with social anxiety disorder (SAD), but a substantial proportion fail to remit. Experimental and clinical research suggests that enhancing CBT with imagery-based techniques could improve outcomes. It was hypothesized that imagery-enhanced CBT (IE-CBT) would be superior to verbally based CBT (VB-CBT) on pre-registered outcomes.
Methods
A randomized controlled trial of IE-CBT v. VB-CBT for social anxiety was completed in a community mental health clinic setting. Participants were randomized to IE (n = 53) or VB (n = 54) CBT, with 1-month (primary end point) and 6-month follow-up assessments. Participants completed 12 weekly, 2-hour sessions of IE-CBT or VB-CBT plus a 1-month follow-up session.
Results
Intention-to-treat analyses showed very large within-treatment effect sizes on social interaction anxiety at all time points (ds = 2.09–2.62), with no between-treatment differences on this outcome or on clinician-rated severity [1-month OR = 1.45 (0.45, 4.62), p = 0.53; 6-month OR = 1.31 (0.42, 4.08), p = 0.65], SAD remission (1-month: IE = 61.04%, VB = 55.09%, p = 0.59; 6-month: IE = 58.73%, VB = 61.89%, p = 0.77), or secondary outcomes. Three adverse events were noted (substance abuse, n = 1 in IE-CBT; temporary increase in suicide risk, n = 1 in each condition, with one participant withdrawn at 1-month follow-up).
Conclusions
Group IE-CBT and VB-CBT were safe and there were no significant differences in outcomes. Both treatments were associated with very large within-group effect sizes and the majority of patients remitted following treatment.
Even in complex cases, simple techniques can be useful for targeting a specific symptom. Intrusive mental images are highly disruptive, drive emotion, and contribute to maintaining psychopathology. Cognitive science suggests that we might target intrusive images using competing tasks.
Aims:
We describe an imagery competing task technique within cognitive behavioural therapy (CBT) with a patient with bipolar disorder and post-traumatic stress disorder (PTSD) symptoms. The intervention – including Tetris computer game-play – was used (1) to target a specific image within one therapy session, and (2) to manage multiple images in daily life.
Method:
A single case (AB) design was used. (1) To target a specific image, the patient brought the image to mind and, after mental rotation instructions and game-play practice, played Tetris for 10 minutes. Outcomes, pre- and post-technique, were: vividness/distress ratings when the image was brought to mind; reported intrusion frequency over a week. (2) To manage multiple images, the patient used the intervention after an intrusive image occurred. Outcomes were weekly measures of: (a) imagery characteristics; (b) symptoms of PTSD, anxiety, depression and mania.
Results:
(1) For the target image, there were reductions in vividness (80% to 40%), distress (70% to 0%), and intrusion frequency (daily to twice per week). (2) For multiple images, there were reductions from baseline to follow-up in (a) imagery vividness (38%), realness (66%) and compellingness (23%), and (b) PTSD symptoms (Impact of Event Scale-Revised score 26.33 to 4.83).
Conclusion:
This low-intensity intervention aiming to directly target intrusive mental imagery may offer an additional, complementary tool in CBT.
The effects of the COVID-19 pandemic on population mental health are unknown. We need to understand the scale of any such impact in different sections of the population, who is most affected and how best to mitigate, prevent and treat any excess morbidity. We propose a coordinated and interdisciplinary mental health science response.
To describe the initial results of implementing pharmacogenomics testing in a community-based psychiatry practice and potential impacts on medication management.
Method:
Retrospective chart review of prospectively maintained medical records of all adult patients with pharmacogenomics results from 9/01/2017 to 6/30/2019 under the care of a psychiatrist and a clinical pharmacist.
Results:
A total of 51 patients met inclusion criteria. A total of 7 pharmacokinetic genes and, due to changes in the test report over time, a range of 6-10 pharmacodynamic genes relevant to psychotropic medications were evaluated per patient. Every patient had genetic variations, with an average of 6.1 per patient (range 3-9; SD= 1.5). Patients were taking an average of 3.6 (range 1-8; SD=1.7) psychiatric medications at the time of the genetic test, to treat an average of 5 psychiatric conditions (range 1-9; SD=2.2). An average of 1.2 (range 0-4; SD=1.0) gene-drug interactions were uncovered per patient. Following review by the psychiatrist and pharmacist, medication adjustments resulted in patients remaining on an average of 3.6 psychiatric medications while decreasing the average number of gene-drug interactions per patient to 0.8 (range 0-3, SD=0.8).
Discussion:
The large number of genetic variations observed per patient is consistent with previous findings [1,2]. The decrease in the number of gene-drug interactions following testing demonstrates the practical utility of pharmacogenomics information in guiding medication therapy. This study did not examine outcomes such as improvement in psychiatric condition or reduction in medication adverse effects; however, these endpoints have been evaluated in other trials [3,4].
Conclusions:
Pharmacogenomics testing presents an opportunity for a personalized medicine approach in a community-based psychiatry practice.
In much of Europe, the advent of low-input cereal farming regimes between c. AD 800 and 1200 enabled landowners—lords—to amass wealth by greatly expanding the amount of land under cultivation and exploiting the labour of others. Scientific analysis of plant remains and animal bones from archaeological contexts is generating the first direct evidence for the development of such low-input regimes. This article outlines the methods used by the FeedSax project to resolve key questions regarding the ‘cerealization’ of the medieval countryside and presents preliminary results using the town of Stafford as a worked example. These indicate an increase in the scale of cultivation in the Mid-Saxon period, while the Late Saxon period saw a shift to a low-input cultivation regime and probably an expansion onto heavier soils. Crop rotation appears to have been practised from at least the mid-tenth century.
There is demand for new, effective and scalable treatments for depression, and the development of new forms of cognitive bias modification (CBM) of negative emotional processing biases has been suggested as a possible intervention to meet this need.
Methods
We report two double-blind RCTs in which volunteers with high levels of depressive symptoms (Beck Depression Inventory-II (BDI-II) > 14) completed a brief course of emotion recognition training (a novel form of CBM using faces) or sham training. In Study 1 (N = 36), participants completed a post-training emotion recognition task whilst undergoing functional magnetic resonance imaging to investigate the neural correlates of CBM. In Study 2 (N = 190), measures of mood were assessed post-training and at 2-week and 6-week follow-up.
Results
In both studies, CBM resulted in an initial change in emotion recognition bias, which (in Study 2) persisted for 6 weeks after the end of training. In Study 1, CBM resulted in increased neural activation to happy faces, an effect driven by increased neural activity in the medial prefrontal cortex and bilateral amygdala. In Study 2, CBM did not lead to a reduction in depressive symptoms on the BDI-II, or on related measures of mood, motivation and persistence, or depressive interpretation bias, at either the 2-week or 6-week follow-up.
Conclusions
CBM of emotion recognition has effects on neural activity that are similar in some respects to those induced by selective serotonin reuptake inhibitor (SSRI) administration (Study 1), but we find no evidence that this had any later effect on self-reported mood in an analogue sample of non-clinical volunteers with low mood (Study 2).
Despite a reported high rate of mental disorders in refugees, scientific knowledge on their risk of suicide attempt and suicide is scarce. We aimed to investigate (1) the risk of suicide attempt and suicide in refugees in Sweden, according to their country of birth, compared with Swedish-born individuals and (2) to what extent time period effects, socio-demographics, labour market marginalisation (LMM) and morbidity explain these associations.
Methods
Three cohorts comprising the entire population of Sweden aged 16–64 years on 31 December 1999, 2004 and 2009 (around 5 million individuals each, of whom 3.3–5.0% were refugees) were followed for 4 years each through register linkage. Additionally, the 2004 cohort was followed for 9 years to allow analyses by refugees' country of birth. Crude and multivariate hazard ratios (HRs) with 95% confidence intervals (CIs) were computed. The multivariate models were adjusted for socio-demographic, LMM and morbidity factors.
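As a schematic of this register-cohort analysis pattern, the sketch below fits crude and covariate-adjusted Cox proportional hazards models with the lifelines library and reads off the hazard ratio for refugee status. The data frame is synthetic, and the variable set is a drastically simplified stand-in for the study's socio-demographic, LMM and morbidity adjustments.

```python
# Hedged sketch of crude vs covariate-adjusted Cox proportional hazards
# models using the lifelines library. All data are synthetic; the
# covariates are a drastic simplification of the study's adjustments.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 5_000
df = pd.DataFrame({
    "followup_years": rng.exponential(4, n).clip(max=4.0),  # censored at 4 years
    "suicide_attempt": rng.integers(0, 2, n),               # event indicator
    "refugee": rng.integers(0, 2, n),                       # exposure of interest
    "age": rng.uniform(16, 64, n),                          # socio-demographic
    "unemployed": rng.integers(0, 2, n),                    # one LMM factor
})

crude = CoxPHFitter().fit(
    df[["followup_years", "suicide_attempt", "refugee"]],
    duration_col="followup_years", event_col="suicide_attempt")
adjusted = CoxPHFitter().fit(
    df, duration_col="followup_years", event_col="suicide_attempt")

print("crude HR:", crude.hazard_ratios_["refugee"])
print("adjusted HR:", adjusted.hazard_ratios_["refugee"])
```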
Results
In multivariate analyses, HRs for suicide attempt in refugees, compared with Swedish-born individuals, ranged from 0.38 to 1.25 according to country of birth, and HRs for suicide ranged from 0.16 to 1.20. Results were either non-significant or showed lower risks for refugees. The exception was refugees from Iran (HR 1.25; 95% CI 1.14–1.41) for suicide attempt. The risk of suicide attempt in refugees compared with the Swedish-born diminished slightly across time periods.
Conclusions
Refugees seem to be protected from suicide attempt and suicide relative to Swedish-born, which calls for more studies to disentangle underlying risk and protective factors.
The early Middle Ages saw a major expansion of cereal cultivation across large parts of Europe thanks to the spread of open-field farming. A major project to trace this expansion in England by deploying a range of scientific methods is generating direct evidence for this so-called ‘Medieval Agricultural Revolution’.
Subclinical delusional ideas, including persecutory beliefs, in otherwise healthy individuals are heritable symptoms associated with increased risk for psychotic illness, possibly representing an expression of one end of a continuum of psychosis severity. The identification of variation in brain function associated with these symptoms may provide insights about the neurobiology of delusions in clinical psychosis.
Methods
A resting-state functional magnetic resonance imaging scan was collected from 131 young adults with a wide range of severity of subclinical delusional beliefs, including persecutory ideas. Because of evidence for a key role of the amygdala in fear and paranoia, resting-state functional connectivity of the amygdala was measured.
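As a rough illustration of seed-based connectivity analysis (not the study's exact pipeline), the sketch below correlates an amygdala seed time series with a visual-cortex time series for each subject, applies the Fisher z-transform, and compares high- and low-belief groups. All time series are synthetic; only the group sizes mirror those reported below.

```python
# Hedged sketch of seed-based resting-state functional connectivity:
# Pearson correlation of an amygdala seed time series with a visual-
# cortex (V1) time series, Fisher z-transformed for group comparison.
# All time series are synthetic placeholders.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)
n_subjects, n_timepoints = 131, 200

def connectivity(seed_ts, target_ts):
    """Fisher z-transformed Pearson correlation between two time series."""
    r = np.corrcoef(seed_ts, target_ts)[0, 1]
    return np.arctanh(r)

z = np.array([connectivity(rng.normal(size=n_timepoints),
                           rng.normal(size=n_timepoints))
              for _ in range(n_subjects)])

high = z[:43]   # high delusional-belief group (n = 43)
low = z[-44:]   # low delusional-belief group (n = 44)
print(ttest_ind(high, low))  # tests the group difference in coupling
```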
Results
Connectivity between the amygdala and early visual cortical areas, including striate cortex (V1), was found to be significantly greater in participants with high (n = 43) v. low (n = 44) numbers of delusional beliefs, particularly in those who showed persistence of those beliefs. Similarly, across the full sample, both the number of delusional beliefs and the distress associated with them were positively correlated with the strength of amygdala-visual cortex connectivity. Moreover, further analyses revealed that these effects were driven by those who endorsed persecutory beliefs.
Conclusions
These findings are consistent with the hypothesis that aberrant assignments of threat to sensory stimuli may lead to the downstream development of delusional ideas. Taken together with prior findings of disrupted sensory-limbic coupling in psychosis, these results suggest that altered amygdala-visual cortex connectivity could represent a marker of psychosis-related pathophysiology across a continuum of symptom severity.
Mental imagery refers to the experience of perception in the absence of external sensory input. Deficits in the ability to generate mental imagery or to distinguish it from actual sensory perception are linked to neurocognitive conditions such as dementia and schizophrenia, respectively. However, the importance of mental imagery to psychiatry extends beyond neurocognitive impairment. Mental imagery has a stronger link to emotion than verbal-linguistic cognition, serving to maintain and amplify emotional states, with downstream impacts on motivation and behavior. As a result, anomalies in the occurrence of emotion-laden mental imagery have transdiagnostic significance for emotion, motivation, and behavioral dysfunction across mental disorders. This review aims to demonstrate the conceptual and clinical significance of mental imagery in psychiatry through examples of mood and anxiety disorders, self-harm and suicidality, and addiction. We contend that focusing on mental imagery assessment in research and clinical practice can increase our understanding of the cognitive basis of psychopathology in mental disorders, with the potential to drive the development of algorithms to aid treatment decision-making and inform transdiagnostic treatment innovation.
Background: Despite the global impact of bipolar disorder (BD), treatment success is limited. Challenges include syndromal and subsyndromal mood instability, comorbid anxiety, and uncertainty around mechanisms to target. The Oxford Mood Action Psychology Programme (OxMAPP) offered a novel approach within a cognitive behavioural framework, via mental imagery-focused cognitive therapy (ImCT). Aims: This clinical audit evaluated referral rates, clinical outcomes and patient satisfaction with the OxMAPP service. Method: Eleven outpatients with BD received ImCT in addition to standard psychiatric care. Mood data were collected weekly from 6 months pre-treatment to 6 months post-treatment via routine mood monitoring. Anxiety was measured weekly from start of treatment until 1 month post-treatment. Patient feedback was provided via questionnaire. Results: Referral and treatment uptake rates indicated acceptability to referrers and patients. From pre- to post-treatment, there was (i) a significant reduction in the duration of depressive episode relapses, and (ii) a non-significant trend towards a reduction in the number of episodes, with small to medium effect size. There was a large effect size for the reduction in weekly anxiety symptoms from assessment to 1 month follow-up. Patient feedback indicated high levels of satisfaction with ImCT, and underscored the importance of the mental imagery focus. Conclusions: This clinical audit provides preliminary evidence that ImCT can help improve depressive and anxiety symptoms in BD as part of integrated clinical care, with high patient satisfaction and acceptability. Formal assessment designs are needed to further test the feasibility and efficacy of the new ImCT treatment on anxiety and mood instability.