Persons discharged from inpatient psychiatric services are at greatly elevated risk of harming themselves or inflicting violence on others, but no studies have reported gender-specific absolute risks for these two outcomes across the spectrum of psychiatric diagnoses. We aimed to estimate absolute risks for self-harm and interpersonal violence post-discharge according to gender and diagnostic category.
Methods
Danish national registry data were utilized to investigate 62,922 discharged inpatients, born 1967–2000. An age- and gender-matched cohort study was conducted to examine risks for self-harm and interpersonal violence at 1 year and at 10 years post-discharge. Absolute risks were estimated as cumulative incidence percentage values.
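The cumulative incidence percentages reported below can be illustrated with a minimal sketch. The function and figures here are hypothetical, and the sketch ignores censoring and competing risks, which a registry-based survival analysis would handle properly:

```python
# Hypothetical sketch of a cumulative incidence percentage: the fraction
# of a cohort experiencing an event (self-harm or interpersonal violence)
# on or before a given follow-up horizon. Data below are toy values,
# not figures from the study.

def cumulative_incidence(event_times, n_total, horizon):
    """Percentage of the cohort with an event on or before `horizon`.

    event_times : times (years post-discharge) at which events occurred;
                  subjects without an event contribute no entry.
    n_total     : cohort size at baseline.
    horizon     : follow-up horizon in the same time units.
    """
    events = sum(1 for t in event_times if t <= horizon)
    return 100.0 * events / n_total

# Toy cohort: 1560 events within the first year, 2000 more by year 5
times = [0.5] * 1560 + [5.0] * 2000
print(cumulative_incidence(times, 10_000, 1))   # prints 15.6
```

In practice the registry analysis would estimate these curves with survival methods rather than a simple proportion, but the reported percentages have this interpretation.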
Results
Patients diagnosed with substance misuse disorders were at especially elevated risk, with absolute risks for either self-harm or interpersonal violence of 15.6% (95% CI 14.9, 16.3%) among males and 16.8% (15.6, 18.1%) among females at 1 year post-discharge, rising to 45.7% (44.5, 46.8%) and 39.0% (37.1, 40.8%), respectively, within 10 years. Diagnoses of personality disorders and early onset behavioral and emotional disorders were also associated with particularly high absolute risks, whilst risks linked with schizophrenia and related disorders, mood disorders, and anxiety/somatoform disorders were considerably lower.
Conclusions
Patients diagnosed with substance misuse disorders, personality disorders and early onset behavioral and emotional disorders are at especially high risk for internally and externally directed violence. It is crucial, however, that these already marginalized individuals are not further stigmatized. Enhanced care at discharge and during the challenging transition back to life in the community is needed.
Observational studies suggest that 25-hydroxyvitamin D (25(OH)D) concentration is inversely associated with pain. However, findings from intervention trials are inconsistent. We assessed the effect of vitamin D supplementation on pain using data from a large, double-blind, population-based, placebo-controlled trial (the D-Health Trial). 21 315 participants (aged 60–84 years) were randomly assigned to a monthly dose of 60 000 IU vitamin D3 or matching placebo. Pain was measured using the six-item Pain Impact Questionnaire (PIQ-6), administered 1, 2 and 5 years after enrolment. We used regression models (linear for continuous PIQ-6 score and log-binomial for binary categorisations of the score, namely ‘some or more pain impact’ and ‘presence of any bodily pain’) to estimate the effect of vitamin D on pain. We included 20 423 participants who completed ≥1 PIQ-6. In blood samples collected from 3943 randomly selected participants (∼800 per year), the mean 25(OH)D concentrations were 77 (sd 25) and 115 (sd 30) nmol/l in the placebo and vitamin D groups, respectively. Most (76 %) participants were predicted to have a 25(OH)D concentration >50 nmol/l at baseline. The mean PIQ-6 score was similar in all surveys (∼50·4). The adjusted mean difference in PIQ-6 score (vitamin D minus placebo) was 0·02 (95 % CI −0·20, 0·25). The proportions of participants with some or more pain impact and with the presence of bodily pain were also similar between groups (both prevalence ratios 1·01, 95 % CI 0·99, 1·03). In conclusion, supplementation with 60 000 IU of vitamin D3/month had a negligible effect on bodily pain.
People diagnosed with a severe mental illness (SMI) are at elevated risk of dying prematurely compared to the general population. We aimed to understand the additional risk among people with SMI after discharge from inpatient psychiatric care, when many patients experience an acute phase of their illness.
Methods
In the Clinical Practice Research Datalink (CPRD) GOLD and Aurum datasets, adults aged 18 years and older who were discharged from psychiatric inpatient care in England between 2001 and 2018 with primary diagnoses of SMI (schizophrenia, bipolar disorder, other psychoses) were matched by age and gender with up to five individuals with SMI and without recent hospital stays. Using survival analysis approaches, cumulative incidence and adjusted hazard ratios were estimated for all-cause mortality, external and natural causes of death, and suicide. All analyses were stratified by younger, middle and older ages and also by gender.
Results
In the year after discharge, the risk of dying from each of the causes examined was higher than that among individuals with SMI who had not recently received inpatient psychiatric care. Suicide risk was 11.6 times higher (95% CI 6.4–20.9) in the first 3 months and remained greater at 2–5 years after discharge (HR 2.3, 1.7–3.2). This risk elevation remained after adjustment for self-harm in the 6 months prior to the discharge date. The relative risk of dying from natural causes was raised in the first 3 months (HR 1.6, 1.3–1.9), with no evidence of elevation during the second year following discharge.
Conclusions
There is an additional risk of death by suicide and natural causes for people with SMI who have been recently discharged from inpatient care over and above the general risk among people with the same diagnosis who have not recently been treated as an inpatient. This mortality gap shows the importance of continued focus, following discharge, on individuals who require inpatient care.
Optical parametric chirped-pulse amplification implemented using multikilojoule Nd:glass pump lasers is a promising approach for producing ultra-intense pulses (>10²³ W/cm²). We report on the MTW-OPAL Laser System, an optical parametric amplifier line (OPAL) pumped by the Nd:doped portion of the multi-terawatt (MTW) laser. This midscale prototype was designed to produce 0.5-PW pulses with technologies scalable to tens of petawatts. Technology choices made for MTW-OPAL were guided by the longer-term goal of two full-scale OPALs pumped by the OMEGA EP to produce 2 × 25-PW beams that would be co-located with kilojoule-nanosecond ultraviolet beams. Several MTW-OPAL campaigns that have been completed since “first light” in March 2020 show that the laser design is fundamentally sound, and optimization continues as we prepare for “first-focus” campaigns later this year.
Concerns have been raised about the utility of self-report assessments in predicting future suicide attempts. Clinicians in pediatric emergency departments (EDs) often are required to assess suicidal risk. The Death Implicit Association Test (IAT) is an alternative to self-report assessment of suicidal risk that may have utility in ED settings.
Methods
A total of 1679 adolescents recruited from 13 pediatric emergency departments in the Pediatric Emergency Care Applied Research Network were assessed using a self-report survey of risk and protective factors for a suicide attempt and the IAT, and then followed up 3 months later to determine whether an attempt had occurred. The accuracy of prediction was compared between self-reports and the IAT using the area under the receiver operating characteristic curve (AUC).
Results
A few self-report variables, namely current and past suicidal ideation, past suicidal behavior, total negative life events, and school or social connectedness, predicted an attempt at 3 months with an AUC of 0.87 [95% confidence interval (CI) 0.84–0.90] in the entire sample, and an AUC of 0.91 (95% CI 0.85–0.95) for those who presented without reported suicidal ideation. The IAT did not add significantly to the predictive power of the selected self-report variables. The IAT alone was modestly predictive of 3-month attempts in the overall sample (AUC = 0.59, 95% CI 0.52–0.65) and was a better predictor in patients who were non-suicidal at baseline (AUC = 0.67, 95% CI 0.55–0.79).
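The AUC statistic compared here has a simple probabilistic interpretation: the chance that a randomly chosen attempter receives a higher risk score than a randomly chosen non-attempter. A minimal sketch of that pairwise-concordance definition, with invented toy scores (not study data):

```python
# Sketch of the AUC as pairwise concordance between cases (attempters)
# and non-cases. Scores below are invented for illustration only.

def auc(scores_pos, scores_neg):
    """AUC: P(random positive scores above random negative); ties count 1/2."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

attempters     = [0.9, 0.8, 0.6]        # toy risk scores, cases
non_attempters = [0.7, 0.4, 0.2, 0.1]   # toy risk scores, non-cases
print(round(auc(attempters, non_attempters), 3))   # prints 0.917
```

An AUC of 0.5 corresponds to chance-level ranking, which is why 0.59 for the IAT alone counts as only modest discrimination.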
Conclusions
In pediatric EDs, a small set of self-reported items predicted suicide attempts within 3 months more accurately than did the IAT.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity would change the expected event rate for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year, and could potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
This study sought to compare disease recidivism rates between canal wall up mastoidectomy and a canal wall down with obliteration technique.
Methods
Patients undergoing primary cholesteatoma surgery at our institution over a five-year period (2013–2017) using the aforementioned techniques were eligible for inclusion in the study. Rates of discharge and disease recidivism were analysed using chi-square statistics.
Results
A total of 104 ears (98 patients) were included. The mean follow-up period was 30 months (range, 12–52 months). A canal wall down with mastoid obliteration technique was performed in 55 cases and a canal wall up approach was performed in 49 cases. Disease recidivism rates were 7.3 per cent and 16.3 per cent in the canal wall down with mastoid obliteration and canal wall up groups respectively (p = 0.02), whilst discharge rates were similar (7.3 per cent and 10.2 per cent respectively).
Conclusion
Our direct comparative data suggest that canal wall down mastoidectomy with obliteration is superior to a canal wall up technique in primary cholesteatoma surgery, providing a lower recidivism rate combined with a low post-operative ear discharge rate.
Optical parametric chirped-pulse amplification (OPCPA) [Dubietis et al., Opt. Commun. 88, 437 (1992)] implemented by multikilojoule Nd:glass pump lasers is a promising approach to produce ultraintense pulses (${>}10^{23}~\text{W}/\text{cm}^{2}$). Technologies are being developed to upgrade the OMEGA EP Laser System with the goal to pump an optical parametric amplifier line (EP OPAL) with two of the OMEGA EP beamlines. The resulting ultraintense pulses (1.5 kJ, 20 fs, $10^{24}~\text{W}/\text{cm}^{2}$) would be used jointly with picosecond and nanosecond pulses produced by the other two beamlines. A midscale OPAL pumped by the Multi-Terawatt (MTW) laser is being constructed to produce 7.5-J, 15-fs pulses and demonstrate scalable technologies suitable for the upgrade. MTW OPAL will share a target area with the MTW laser (50 J, 1 to 100 ps), enabling several joint-shot configurations. We report on the status of the MTW OPAL system, and the technology development required for this class of all-OPCPA laser system for ultraintense pulses.
Excessive mobilization of body reserves during the transition from pregnancy to lactation imposes a risk for metabolic diseases on dairy cows. We aimed to establish an experimental model for high v. normal mobilization and herein characterized performance, metabolic and endocrine changes from 7 weeks antepartum (a.p.) to 12 weeks postpartum (p.p.). Fifteen weeks a.p., 38 pregnant multiparous Holstein cows were allocated to two groups that were fed differently to reach either high or normal body condition scores (HBCS: 7.2 NEL MJ/kg dry matter (DM); NBCS: 6.8 NEL MJ/kg DM) at dry-off. Allocation was also based on differences in body condition score (BCS) in the previous and the ongoing lactation, further promoted by feeding to reach the targeted BCS and back fat thickness (BFT) at dry-off (HBCS: >3.75 and >1.4 cm; NBCS: <3.5 and <1.2 cm). Thereafter, both groups were fed identical diets. Blood samples were drawn weekly from 7 weeks a.p. to 12 weeks p.p. to assess the serum concentrations of metabolites and hormones. The HBCS cows had greater BCS, BFT and BW than the NBCS cows throughout the study and lost more than twice as much BFT during the first 7 weeks p.p. as the NBCS cows. Milk yield and composition were not different between groups, except that lactose concentrations were greater in NBCS than in HBCS. Feed intake was also greater in NBCS, and NBCS reached a positive energy balance earlier than HBCS. The greater reduction in body mass in HBCS was accompanied by greater serum concentrations of non-esterified fatty acids and β-hydroxybutyrate after calving than in NBCS, indicating increased lipomobilization and ketogenesis. The mean insulin concentrations across all time-points were greater in HBCS than in NBCS. In both groups, insulin and IGF-1 concentrations were lower p.p. than a.p. Greater free thyroxine (fT4) concentrations and a lower free 3,3′,5-triiodothyronine (fT3)/fT4 ratio were observed in HBCS than in NBCS a.p., whereas the fT3/fT4 ratio followed the reverse pattern p.p. The variables indicative of oxidative status had characteristic time courses; group differences were limited to greater plasma ferric reducing ability values in NBCS. The results demonstrate that the combination of pre-selection according to BCS and differential feeding before dry-off to promote the difference was successful in obtaining cows that differ in the intensity of mobilizing body reserves. The HBCS cows were metabolically challenged owing to intense mobilization of body fat, associated with reduced early-lactation dry matter intake and compromised antioxidative capacity.
Major depressive disorder (MDD) is a highly heterogeneous condition in terms of symptom presentation and, likely, underlying pathophysiology. Accordingly, it is possible that only certain individuals with MDD are well-suited to antidepressants. A potentially fruitful approach to parsing this heterogeneity is to focus on promising endophenotypes of depression, such as neuroticism, anhedonia, and cognitive control deficits.
Methods
Within an 8-week multisite trial of sertraline v. placebo for depressed adults (n = 216), we examined whether the combination of machine learning with a Personalized Advantage Index (PAI) can generate individualized treatment recommendations on the basis of endophenotype profiles coupled with clinical and demographic characteristics.
Results
Five pre-treatment variables moderated treatment response. Higher depression severity and neuroticism, older age, less impairment in cognitive control, and being employed were each associated with better outcomes with sertraline than with placebo. Across 1000 iterations of a 10-fold cross-validation, the PAI model predicted that 31% of the sample would exhibit a clinically meaningful advantage [post-treatment Hamilton Rating Scale for Depression (HRSD) difference ⩾3] with sertraline relative to placebo. Although there were no overall outcome differences between treatment groups (d = 0.15), those identified as optimally suited to sertraline at pre-treatment had better week-8 HRSD scores if randomized to sertraline (10.7) rather than placebo (14.7) (d = 0.58).
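The core of the Personalized Advantage Index is a per-patient contrast between model-predicted outcomes under each treatment. A hedged sketch of that final step (the prediction values are toy numbers, not fitted model output; in the study these predictions come from machine-learning models evaluated under cross-validation):

```python
# Hypothetical sketch of the Personalized Advantage Index (PAI): the
# predicted end-of-treatment HRSD difference between arms for one
# patient, computed from two (here imaginary) fitted outcome models.

def personalized_advantage(pred_sertraline, pred_placebo):
    """Positive PAI -> sertraline predicted to yield a lower (better)
    post-treatment HRSD score than placebo for this patient."""
    return pred_placebo - pred_sertraline

# A clinically meaningful advantage was defined as an HRSD difference >= 3
pai = personalized_advantage(pred_sertraline=10.7, pred_placebo=14.7)
recommend_sertraline = pai >= 3
print(round(pai, 1), recommend_sertraline)
```

The key design point is that the index is patient-specific: the same fitted models yield a different predicted advantage, and hence potentially a different recommendation, for each pre-treatment profile.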
Conclusions
A subset of MDD patients optimally suited to sertraline can be identified on the basis of pre-treatment characteristics. This model must be tested prospectively before it can be used to inform treatment selection. However, findings demonstrate the potential to improve individual outcomes through algorithm-guided treatment recommendations.
Factors associated with relapse among children discharged after reaching a threshold denoted ‘recovered’ from moderate acute malnutrition (MAM) are not well understood. The aim of this study was to identify factors associated with sustained recovery, defined as maintaining a mid-upper-arm circumference ≥12·5 cm for 1 year after release from treatment. On the basis of an observational study design, we analysed data from an in-depth household (HH) survey on a sub-sample of participants within a larger cluster randomised controlled trial (cRCT) that followed up children for 1 year after recovery from MAM. Of the 1497 children participating in the cRCT, a subset of 315 children participated in this sub-study. Accounting for other factors, the presence of fitted lids on HH water storage containers (P=0·004) was a significant predictor of sustained recovery. In addition, sustained recovery was better among children whose caregivers were observed to have clean hands (P=0·053) and in HH using an improved sanitation facility (P=0·083). By contrast, socio-economic status, infant and young child feeding practices at the time of discharge, and HH food security throughout the follow-up period were not significant. Given these results, we hypothesise that improved water, sanitation and hygiene conditions, in tandem with management of MAM through supplemental feeding programmes, may decrease relapse following recovery from MAM. Furthermore, the absence of associations between relapse and nearly all HH-level factors indicates that the causal factors of relapse may relate mostly to the child’s individual underlying health and nutrition status.
Previous studies have demonstrated that several major psychiatric disorders are influenced by shared genetic factors. This shared liability may influence clinical features of a given disorder (e.g. severity, age at onset). However, findings have largely been limited to European samples; little is known about the consistency of shared genetic liability across ethnicities.
Method
The relationship between polygenic risk for several major psychiatric diagnoses and major depressive disorder (MDD) was examined in a sample of unrelated Han Chinese women. Polygenic risk scores (PRSs) were generated using European discovery samples and tested in the China, Oxford, and VCU Experimental Research on Genetic Epidemiology [CONVERGE (maximum N = 10 502)], a sample ascertained for recurrent MDD. Genetic correlations between discovery phenotypes and MDD were also assessed. In addition, within-case characteristics were examined.
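At their core, the polygenic risk scores described above are weighted allele counts: each variant's risk-allele dosage is multiplied by an effect size estimated in the external discovery sample. A minimal sketch (the SNP weights and genotypes below are invented for illustration; real PRS pipelines also handle LD pruning, p-value thresholds and strand alignment):

```python
# Hypothetical sketch of a polygenic risk score (PRS): a weighted sum
# of risk-allele counts, with per-SNP weights (effect sizes) taken from
# a discovery GWAS in an external sample. All values below are invented.

def polygenic_risk_score(allele_counts, effect_sizes):
    """allele_counts: per-SNP risk-allele dosages (0, 1 or 2);
    effect_sizes: per-SNP weights (e.g. log-odds) from the discovery sample."""
    return sum(g * w for g, w in zip(allele_counts, effect_sizes))

# Three toy SNPs for one individual
score = polygenic_risk_score([2, 0, 1], [0.10, -0.05, 0.20])
print(round(score, 2))   # prints 0.4
```

Scores computed this way for each CONVERGE participant can then be tested as predictors of case-control status, which is the analysis the Method describes.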
Results
European-based polygenic risk for several major psychiatric disorder phenotypes was significantly associated with the MDD case status in CONVERGE. Risk for clinically significant indicators (neuroticism and subjective well-being) was also associated with case–control status. The variance accounted for by PRS for both psychopathology and for well-being was similar to estimates reported for within-ethnicity comparisons in European samples. However, European-based PRS were largely unassociated with CONVERGE family history, clinical characteristics, or comorbidity.
Conclusions
The shared genetic liability across severe forms of psychopathology is largely consistent across European and Han Chinese ethnicities, with little attenuation of genetic signal relative to within-ethnicity analyses. The overall absence of associations between PRS for other disorders and within-MDD variation suggests that clinical characteristics of MDD may arise due to contributions from ethnicity-specific factors and/or pathoplasticity.
Dissimilarity coefficients measure the difference between multivariate samples and provide a quantitative aid to the identification of modern analogs for fossil pollen samples. We tested how eight coefficients responded to differences among modern pollen samples from eastern North America. These coefficients represent three different classes: (1) unweighted coefficients, which are most strongly influenced by large-valued pollen types; (2) equal-weight coefficients, which weight all pollen types equally but can be too sensitive to variations among rare types; and (3) signal-to-noise coefficients, which are intermediate in their weighting of pollen types. The studies with modern pollen allowed definition of critical values for each coefficient which, when not exceeded, indicate that two pollen samples originate from the same vegetation region. Dissimilarity coefficients were then used to compare modern and fossil pollen samples; modern samples similar enough to the fossil samples were found that most of three late Quaternary pollen diagrams could be “reconstructed” by substituting modern samples for fossil samples. When the coefficients indicated that the fossil spectra had no modern analogs, the reconstructed diagrams did not match all aspects of the originals. No modern analogs existed for samples from before 9300 yr B.P. at Kirchner Marsh, Minnesota, or from before 11,000 yr B.P. at Wintergreen Lake, Michigan, but modern analogs existed for almost all Holocene samples from these two sites and from Brandreth Bog, New York.
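One coefficient widely applied to pollen percentage data, and usually placed in the intermediate signal-to-noise class because the square-root transform damps abundant types without over-weighting rare ones, is the squared chord distance. A minimal sketch (the proportions and the critical value below are illustrative, not values from this study):

```python
import math

# Illustrative sketch of one dissimilarity coefficient for pollen
# spectra: the squared chord distance between two samples expressed
# as proportions. Data and the analog threshold are toy values.

def squared_chord(p, q):
    """Squared chord distance between two spectra of proportions."""
    return sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))

modern = [0.50, 0.30, 0.15, 0.05]   # toy proportions by pollen type
fossil = [0.45, 0.35, 0.15, 0.05]
d = squared_chord(modern, fossil)
# A fossil sample "has a modern analog" when d stays below a critical
# value calibrated on modern samples from known vegetation regions.
print(d < 0.15)   # prints True
```

The analog test in the abstract works exactly this way: each fossil spectrum is compared against the modern set, and the minimum dissimilarity is judged against the coefficient's critical value.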
Stereotypies are used as indicators of poor animal welfare and it is, therefore, important to understand underlying factors mediating their development. In calves, two oral stereotypies, that is, tongue playing and object manipulation, result mostly from insufficient structure in the diet. Three hypotheses were studied: (1) oral stereotypies in calves are one of two alternative strategies, the alternative being hypo-activity; (2) stereotyping and non-stereotyping calves differ in terms of cortisol secretion; (3) oral stereotypy development in calves rests on a gene by environment interaction. Eight-week-old bull calves (n=48) were assigned to one of four solid feed allowances (0, 9, 18 or 27 g dry matter/kg metabolic weight per day) with the following composition: 50% concentrate, 25% maize silage and 25% straw on dry matter basis. The calves received milk replacer in buckets, the provision of which was adjusted to achieve equal growth rates. At 14 to 18 weeks of age, calves were exposed to a challenge, that is, tethering inside cages. Oral stereotypies and inactivity were recorded in the home pens in the 4 weeks before the challenge using instantaneous scan sampling. Salivary cortisol levels were measured at −120, +40, +80, +120 min and +48 h relative to the challenge. Individual differences in behaviour were recorded in the first 30 min after challenge implementation using focal animal sampling and continuous recording, and these elements were entered into a principal component (PC) analysis to extract PCs. Regression analyses were performed to find relationships between stereotypies and inactivity, stereotypies and cortisol, and stereotypies and PCs (individual differences, genes) and solid feed (environment). Relationships between PCs and cortisol were also investigated to help with the interpretation of PCs. Hypotheses 1 and 2 were rejected. 
Hypothesis 3, however, was supported: calves with a zero solid feed allowance, that is, in the most barren environment, showed links between stereotypies and two of the PCs. Calves that displayed high levels of idle and rapid locomotion and low levels of oral contact with the cage during the challenge also displayed high levels of object manipulation in the home pens. Calves that displayed low levels of stepping and turning attempts during the challenge also displayed high levels of tongue playing in the home pens. This study corroborates the gene by environment interaction on the development of oral stereotypies in calves.
Head louse (Pediculus humanus capitis) infestations are a public health concern. The insecticidal properties of the Australian native plant Kunzea ambigua (commonly known as tick bush) have been documented. In this study, we tested the activity of kunzea oil (KO) against head lice in in vitro bioassays. Head lice were exposed for 120 min to filter paper treated with KO (as either a 5% or a 100% oil) or with commercial formulations containing either permethrin or tea tree oil (TTO). Exposure to KO, as both a 5% and a 100% oil, resulted in 100% mortality within 120 min, with mean survival times of 17·1 and 34·8 min, respectively. There was no significant difference between the mean survival times of head lice exposed to 5% KO (17·1 ± 1·0 min; 95% CI 15·2–19·0) and 5% TTO (21·2 ± 1·9 min; 95% CI 17·4–25·1). This study revealed, for the first time, that KO holds great potential as an effective alternative to the current active ingredients of commercial pediculicide formulations.
Observational studies have suggested that 25-hydroxyvitamin D (25(OH)D) levels are associated with inflammatory markers. Most trials reporting significant associations between vitamin D intake and inflammatory markers used specific patient groups. Thus, we aimed to determine the effect of supplementary vitamin D using secondary data from a population-based, randomised, placebo-controlled, double-blind trial (Pilot D-Health trial 2010/0423). Participants were 60- to 84-year-old residents of one of the four eastern states of Australia. They were randomly selected from the electoral roll and were randomised to one of three trial arms: placebo (n 214), 750 μg (n 215) or 1500 μg (n 215) vitamin D3, each taken once per month for 12 months. Post-intervention blood samples for the analysis of C-reactive protein (CRP), IL-6, IL-10, leptin and adiponectin levels were available for 613 participants. Associations between intervention group and biomarker levels were evaluated using quantile regression. There were no statistically significant differences in distributions of CRP, leptin, adiponectin, leptin:adiponectin ratio or IL-10 levels between the placebo group and either supplemented group. The 75th percentile IL-6 level was 2·8 pg/ml higher (95 % CI 0·4, 5·8 pg/ml) in the 1500 μg group than in the placebo group (75th percentiles: 11·0 v. 8·2 pg/ml), with a somewhat smaller, non-significant difference in 75th percentiles between the 750 μg and placebo groups. Despite large differences in serum 25(OH)D levels between the three groups after 12 months of supplementation, we found little evidence of an effect of vitamin D supplementation on cytokine or adipokine levels, with the possible exception of IL-6.
Influenza is rarely laboratory-confirmed and the outpatient influenza burden is rarely studied due to a lack of suitable data. We used the Clinical Practice Research Datalink (CPRD) and surveillance data from Public Health England in a linear regression model to assess the number of persons consulting UK general practitioners (GP episodes) for respiratory illness, otitis media and antibiotic prescriptions attributable to influenza during 14 seasons, 1995–2009. In CPRD we ascertained influenza vaccination status in each season and risk status (conditions associated with severe influenza outcomes). Seasonal mean estimates of influenza-attributable GP episodes in the UK were 857 996 for respiratory disease including 68 777 for otitis media, with wide inter-seasonal variability. In an average season, 2·4%/0·5% of children aged <5 years and 1·3%/0·1% of seniors aged ⩾75 years had a GP episode for respiratory illness attributed to influenza A/B. Two-thirds of influenza-attributable GP episodes were estimated to result in prescription of antibiotics. These estimates are substantially greater than those derived from clinically reported influenza-like illness in surveillance programmes. Because health service costs of influenza are largely borne in general practice, these are important findings for cost-benefit assessment of influenza vaccination programmes.
Depression is characterized by poor executive function, but – counterintuitively – in some studies, it has been associated with highly accurate performance on certain cognitively demanding tasks. The psychological mechanisms responsible for this paradoxical finding are unclear. To address this issue, we applied a drift diffusion model (DDM) to flanker task data from depressed and healthy adults participating in the multi-site Establishing Moderators and Biosignatures of Antidepressant Response for Clinical Care for Depression (EMBARC) study.
Method
One hundred unmedicated, depressed adults and 40 healthy controls completed a flanker task. We investigated the effect of flanker interference on accuracy and response time, and used the DDM to examine group differences in three cognitive processes: prepotent response bias (tendency to respond to the distracting flankers), response inhibition (necessary to resist prepotency), and executive control (required for execution of correct response on incongruent trials).
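The three DDM constructs map naturally onto simulation parameters: prepotent response bias corresponds to an evidence starting point shifted toward the distractor-driven boundary, and executive control to the drift rate toward the correct boundary. A toy random-walk sketch of a single trial (parameters are illustrative; this is not the hierarchical model actually fitted to the EMBARC data):

```python
import math
import random

# Toy sketch of one drift-diffusion trial: evidence starts at `start`
# (a start shifted toward the wrong boundary plays the role of prepotent
# response bias) and accumulates at rate `drift` (executive control)
# toward the +threshold (correct) boundary. All parameters illustrative.

def simulate_trial(drift, start=0.0, threshold=1.0, dt=0.001, sd=1.0,
                   rng=None):
    """Return (choice, rt); choice is +1 at the upper (correct) bound.
    Uses a fixed-seed RNG by default so runs are reproducible."""
    rng = rng or random.Random(0)
    x, t = start, 0.0
    while abs(x) < threshold:
        x += drift * dt + sd * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
    return (1 if x >= threshold else -1), t

# Noise-free limit: slower drift (sluggish executive control) lengthens
# the response time; a start nearer the wrong bound would cut accuracy.
choice, rt = simulate_trial(drift=1.0, sd=0.0)
print(choice, round(rt, 3))
```

Under this picture, the Results below are coherent: a sluggish drift rate slows responses, while a reduced starting-point bias toward the flankers protects accuracy.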
Results
Consistent with prior reports, depressed participants responded more slowly but more accurately than controls on incongruent trials. The DDM indicated that although executive control was sluggish in depressed participants, this was more than offset by a decreased prepotent response bias. Among the depressed participants, anhedonia was negatively correlated with a parameter indexing the speed of executive control (r = −0.28, p = 0.007).
Conclusions
Executive control was delayed in depression but this was counterbalanced by reduced prepotent response bias, demonstrating how participants with executive function deficits can nevertheless perform accurately in a cognitive control task. Drawing on data from neural network simulations, we speculate that these results may reflect tonically reduced striatal dopamine in depression.