Laboratory experiments are used to evaluate the extent to which players in games can coordinate investments that diminish the probability of losses due to security breaches or terrorist attacks. In this environment, economically sensible investments may be forgone if their potential benefits are negated by failures to invest in security at other sites. The result is a coordination game with a desirable high-payoff, high-security equilibrium and an undesirable low-security equilibrium that may result if players do not expect others to invest in security. One unique feature of this coordination situation is that investment in security by one player generates a positive externality, such that all other players’ expected payoffs are increased, regardless of those other players’ investment decisions. Coordination failures are pervasive in a baseline experiment with simultaneous decisions, but coordination is improved if players are allowed to move in an endogenously determined sequence. In addition, coordinated security investments are observed more often when the largest single security threat to individuals is preventable by their own decisions to invest in security. The security coordination game is a “potential game,” and the success of coordination on the more secure equilibrium is related to the notions of potential function maximization and basins of attraction.
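To make the potential-game idea concrete, the following sketch enumerates a stylized binary-investment security game and verifies an exact potential function. The payoff form and parameter values are hypothetical illustrations of the qualitative features described above (coordination incentives plus a positive externality from others' investment); they are not the experiment's actual design.

```python
from itertools import product

N = 4      # number of players (assumed for illustration)
A = 1.0    # complementarity: value of own investment per other investor (assumed)
B = 0.5    # pure positive externality received from each other investor (assumed)
C = 1.0    # cost of investing in security (assumed)

def payoff(i, x):
    """Player i's payoff at profile x (a tuple of 0/1 investment choices)."""
    k_other = sum(x) - x[i]                       # number of *other* players investing
    return A * x[i] * k_other + B * k_other - C * x[i]

def potential(x):
    """Candidate exact potential: pairwise complementarities minus investment costs."""
    pairs = sum(x[i] * x[j] for i in range(N) for j in range(i + 1, N))
    return A * pairs - C * sum(x)

def is_nash(x):
    """No player can gain by unilaterally switching their investment decision."""
    for i in range(N):
        y = list(x); y[i] = 1 - y[i]
        if payoff(i, tuple(y)) > payoff(i, x):
            return False
    return True

profiles = list(product((0, 1), repeat=N))

# Exact-potential check: every unilateral deviation changes the deviator's payoff
# by exactly the change in the potential (the externality term B cancels out).
for x in profiles:
    for i in range(N):
        y = list(x); y[i] = 1 - y[i]; y = tuple(y)
        assert abs((payoff(i, y) - payoff(i, x)) - (potential(y) - potential(x))) < 1e-9

print("Nash equilibria:", [x for x in profiles if is_nash(x)])
print("Potential-maximizing profile:", max(profiles, key=potential))
```

With these illustrative parameters, the no-invest and all-invest profiles are the only Nash equilibria, and the all-invest (high-security) profile maximizes the potential, mirroring the equilibrium-selection logic referred to in the abstract.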
The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations and the species should continue to be categorised as Vulnerable under the IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
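As a side note on the Criterion A2 arithmetic, the short sketch below converts an assumed constant annual rate of change into a decline over three generations; the rate and the generation time used here are placeholders for illustration, not the assessment's inputs.

```python
def three_generation_decline(annual_lambda, generation_years):
    """Fractional decline over three generations, assuming exponential change."""
    return 1 - annual_lambda ** (3 * generation_years)

# e.g. an assumed 1.5% annual decline with an assumed 10-year generation time
print(f"{three_generation_decline(0.985, 10):.0%} decline over three generations")
```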
Access to local, population-specific, and timely data is vital in understanding factors that impact population health. The impact of place (neighborhood, census tract, and city) is particularly important in understanding the Social Determinants of Health. The University of Rochester Medical Center’s Clinical and Translational Science Institute created the web-based tool RocHealthData.org to provide access to thousands of geographically displayed, publicly available health-related datasets. The site has also hosted a variety of locally curated datasets (e.g., COVID-19 vaccination rates and community-derived health indicators), helping set community priorities and impacting outcomes. Usage statistics (available through Google Analytics) show that returning visitors had a lower bounce rate (leaving the site after a single page access) and spent longer at the site than new visitors. Of the 1033 currently registered users, 51.7% were from within our host university, 20.1% were from another educational institution, and 28.2% identified as community members. Our assessments indicate that these data are useful and valued across a variety of domains. Continuing site improvement depends on new sources of locally relevant data, as well as increased usage of data beyond our local region.
Individuals with Down syndrome (DS) experience intellectual disability, such that measures of cognitive and adaptive functioning are near the normative floor upon evaluation. Individuals with DS are also at increased risk for Alzheimer's disease (AD) beginning around age 40, and test performances and adaptive ratings at the normative floor make it difficult to detect change in cognition and functioning. This study first assessed the range of raw intelligence scores and raw adaptive functioning of individuals with DS at the normative floor. Next, we assessed whether those raw intelligence scores were predictive of raw adaptive functioning scores and, by association, whether they may be meaningful when assessing change in individuals with a lower baseline of cognitive functioning.
Participants and Methods:
Participants were selected from a cohort of 117 adults with DS in a longitudinal study examining AD risk. Participants (n=96; M=40.9 years, SD=10.67; 57.3% female) were selected if they had both a completed measure of IQ (Kaufman Brief Intelligence Test, Second Edition; KBIT-2) and informant ratings of adaptive functioning (Vineland Adaptive Behavior Scales; VABS-II). Multiple regression was conducted predicting VABS-II total raw score from KBIT-2 total raw score, while controlling for age.
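A minimal sketch of the hierarchical regression described above, using synthetic placeholder data and hypothetical column names (age, kbit2_raw, vabs2_raw); this is not the study's analysis code, but it shows how the incremental variance explained by raw KBIT-2 scores could be computed.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 96
# Synthetic placeholder data spanning roughly the reported raw-score ranges (illustration only).
df = pd.DataFrame({
    "age": rng.uniform(20, 60, n),
    "kbit2_raw": rng.uniform(2, 41, n),
})
df["vabs2_raw"] = 200 + 6.0 * df["kbit2_raw"] - 1.5 * df["age"] + rng.normal(0, 40, n)

m_age = smf.ols("vabs2_raw ~ age", data=df).fit()                 # step 1: age only
m_full = smf.ols("vabs2_raw ~ age + kbit2_raw", data=df).fit()    # step 2: add raw IQ

delta_r2 = m_full.rsquared - m_age.rsquared  # additional variance explained by KBIT-2 raw scores
print(f"R^2 (age only) = {m_age.rsquared:.3f}")
print(f"Delta R^2 for KBIT-2 raw scores = {delta_r2:.3f}")
```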
Results:
A slight majority (57.3%) of the sample had a standardized IQ score of 40, with the majority (95.7%) having a standardized score at or below 60. Additionally, 85.3% of the sample had a standard VABS-II score at or below 60. Within the normative floor for the KBIT-2 (IQ=40), there was a normal distribution and substantial range of both KBIT-2 raw scores (M = 31.19, SD = 13.19, range: 2 to 41) and VABS-II raw scores (M = 406.33, SD = 84.91, range: 198 to 569). Using the full sample, age significantly predicted raw VABS-II scores (β = -.283, p = .008). When KBIT-2 raw scores were included in the model, age was no longer an independently significant predictor. KBIT-2 raw scores significantly predicted raw VABS-II scores (β = .689, p < .001). Age alone accounted for 8.0% of variance in VABS-II raw scores, and KBIT-2 raw scores accounted for an additional 43.8% of variance in VABS-II raw scores. This relationship was maintained when the sample was reduced to individuals at the normative floor (n = 51), where KBIT-2 raw scores accounted for 23.7% of the variance in raw VABS-II scores (β = .549, p < .001).
Conclusions:
The results indicate that meaningful variability exists among raw intelligence test performances that may be masked by scores at the normative floor. Further, the variability in raw intelligence scores is associated with variability in adaptive functioning, such that lower intelligence scores are associated with lower ratings of adaptive functioning. Considering this relationship would be masked by a reduction of range due to norming, these findings indicate that raw test performances and adaptive functioning ratings may have value when monitoring change in adults with DS at risk for AD.
Congenital heart disease (CHD) care is resource-intensive. Unwarranted variation in care may increase cost and result in poorer health outcomes. We hypothesise that process variation exists within the pre-operative evaluation and planning process for children undergoing repair of atrial septal defect or ventricular septal defect, and that substantial variation occurs in a small number of care points.
Methods:
From interviews with staff of an integrated congenital heart centre, an initial process map was constructed. A retrospective chart review of patients who underwent isolated surgical atrial septal defect or ventricular septal defect repair from 7/1/2018 through 11/1/2020 informed revisions of the process map. The map was assessed for points of consistency and variability.
Results:
Thirty-two surgical atrial septal defect/ventricular septal defect repair patients were identified. Ten (31%) were reviewed by interventional cardiology before surgical review. Of these, 6 (60%) had a failed catheter-based closure and 4 (40%) were deemed inappropriate for catheter-based closure. Thirty (94%) were reviewed in case conference, all attended surgical clinic, and none were admitted prior to surgery. The process map from interviews alone identified surgery rescheduling as a point of major variability; however, chart review revealed this was not as prominent a source of variability as pre-operative interventional cardiology review.
Conclusions:
Significant variation in the pre-operative evaluation and planning process for surgical atrial septal defect/ventricular septal defect patients was identified. If such process variation is widespread throughout CHD care, it may contribute to the variations in outcome and cost previously documented within CHD surgery. Future research will focus on determining whether the variation is warranted or unwarranted, and on the health outcomes and cost variation attributable to these variations in care processes.
Reward processing has been proposed to underpin the atypical social features of autism spectrum disorder (ASD). However, previous neuroimaging studies have yielded inconsistent results regarding the specificity of atypicalities for social reward processing in ASD.
Aims
Utilising a large sample, we aimed to assess reward processing in response to reward type (social, monetary) and reward phase (anticipation, delivery) in ASD.
Method
Functional magnetic resonance imaging during social and monetary reward anticipation and delivery was performed in 212 individuals with ASD (7.6–30.6 years of age) and 181 typically developing participants (7.6–30.8 years of age).
Results
Across social and monetary reward anticipation, whole-brain analyses showed hypoactivation of the right ventral striatum in participants with ASD compared with typically developing participants. Further, region of interest analysis across both reward types yielded ASD-related hypoactivation in both the left and right ventral striatum. Across delivery of social and monetary reward, hyperactivation of the ventral striatum in individuals with ASD did not survive correction for multiple comparisons. Dimensional analyses of autism and attention-deficit hyperactivity disorder (ADHD) scores were not significant. In categorical analyses, post hoc comparisons showed that ASD effects were most pronounced in participants with ASD without co-occurring ADHD.
Conclusions
Our results do not support current theories linking atypical social interaction in ASD to specific alterations in social reward processing. Instead, they point towards a generalised hypoactivity of the ventral striatum in ASD during anticipation of both social and monetary rewards. We suggest this indicates attenuated reward seeking in ASD independent of social content, and that elevated ADHD symptoms may attenuate altered reward seeking in ASD.
Childhood adversities (CAs) predict heightened risks of posttraumatic stress disorder (PTSD) and major depressive episode (MDE) among people exposed to adult traumatic events. Identifying which CAs put individuals at greatest risk for these adverse posttraumatic neuropsychiatric sequelae (APNS) is important for targeting prevention interventions.
Methods
Data came from n = 999 patients ages 18–75 presenting to 29 U.S. emergency departments after a motor vehicle collision (MVC) and followed for 3 months, the amount of time traditionally used to define chronic PTSD, in the Advancing Understanding of Recovery After Trauma (AURORA) study. Six CA types were self-reported at baseline: physical abuse, sexual abuse, emotional abuse, physical neglect, emotional neglect and bullying. Both dichotomous measures of ever experiencing each CA type and numeric measures of exposure frequency were included in the analysis. Risk ratios (RRs) of these CA measures as well as complex interactions among these measures were examined as predictors of APNS 3 months post-MVC. APNS was defined as meeting self-reported criteria for either PTSD based on the PTSD Checklist for DSM-5 and/or MDE based on the PROMIS Depression Short-Form 8b. We controlled for pre-MVC lifetime histories of PTSD and MDE. We also examined mediating effects through peritraumatic symptoms assessed in the emergency department and PTSD and MDE assessed in 2-week and 8-week follow-up surveys. Analyses were carried out with robust Poisson regression models.
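For readers unfamiliar with the modelling approach, the sketch below fits a Poisson regression with a robust (sandwich) covariance to a binary outcome, which yields risk ratios as described above. The column names and the synthetic data are hypothetical placeholders, not AURORA data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 999
# Synthetic placeholder data with hypothetical variable names (illustration only).
df = pd.DataFrame({
    "emotional_abuse_freq": rng.integers(0, 5, n),
    "bullying_freq": rng.integers(0, 5, n),
})
p = 1 / (1 + np.exp(-(-2 + 0.4 * df["emotional_abuse_freq"] + 0.2 * df["bullying_freq"])))
df["apns_3mo"] = rng.binomial(1, p)   # binary outcome: APNS at 3 months (synthetic)

# Modified Poisson approach: a Poisson GLM on the binary outcome with a robust
# (heteroscedasticity-consistent) covariance gives risk ratios rather than odds ratios.
model = smf.glm("apns_3mo ~ emotional_abuse_freq + bullying_freq",
                data=df, family=sm.families.Poisson()).fit(cov_type="HC1")
rr = np.exp(model.params)        # exponentiated coefficients = risk ratios
ci = np.exp(model.conf_int())    # 95% confidence intervals on the RR scale
print(pd.concat([rr.rename("RR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```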
Results
Most participants (90.9%) reported at least rarely having experienced some CA. Ever experiencing each CA other than emotional neglect was univariably associated with 3-month APNS (RRs = 1.31–1.60). Each CA frequency was also univariably associated with 3-month APNS (RRs = 1.65–2.45). In multivariable models, joint associations of CAs with 3-month APNS were additive, with frequency of emotional abuse (RR = 2.03; 95% CI = 1.43–2.87) and bullying (RR = 1.44; 95% CI = 0.99–2.10) being the strongest predictors. Control variable analyses found that these associations were largely explained by pre-MVC histories of PTSD and MDE.
Conclusions
Although individuals who experience frequent emotional abuse and bullying in childhood have a heightened risk of experiencing APNS after an adult MVC, these associations are largely mediated by prior histories of PTSD and MDE.
We assess the composition and geometry of four individual rock glaciers in Alaska, Wyoming and Colorado by measuring their radio wave speed and applying these results to ground-penetrating radar depth corrections and dielectric mixing models. Our method includes a correction for subsurface reflector dip angle, which we show can lead to an incorrect determination of wave speeds using common midpoint configurations. By observing the radar properties of the rock glaciers and their supraglacial debris, we find that some of the sites exhibit nearly pure ice cores, and all of the sites indicate volumetric ice fractions >50%. These results have implications for terrestrial glaciology and hydrology because the present ice volume is connected to past ice accumulation and subsurface ice preservation, which may affect the future availability of alpine water resources. An understanding of the processes that govern rock glacier evolution over a wide range of latitudes and elevations will also contribute to the exploration of planetary surfaces such as Mars, which hosts a significant population of debris-covered glaciers. Our subsurface composition and geometry estimates will inform simulations of rock glacier formation and evolution to test hypothesized ice origin mechanisms along with the preservation of climate signals.
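As an illustration of the dielectric mixing approach mentioned above, the sketch below converts a radar wave speed to a bulk permittivity and inverts a simple two-phase CRIM (complex refractive index method) model for volumetric ice fraction. The permittivity constants and the two-phase (ice plus debris) simplification are assumptions for illustration, not the paper's full treatment.

```python
import math

C_VAC = 0.2998     # speed of light in vacuum, m/ns
EPS_ICE = 3.2      # relative permittivity of pure ice (assumed)
EPS_ROCK = 6.0     # relative permittivity of rock debris (assumed)

def bulk_permittivity(v_mpns):
    """Low-loss approximation: v = c / sqrt(eps), so eps = (c / v)^2."""
    return (C_VAC / v_mpns) ** 2

def ice_fraction_crim(v_mpns):
    """Two-phase CRIM: sqrt(eps_bulk) = f*sqrt(eps_ice) + (1 - f)*sqrt(eps_rock)."""
    n_bulk = math.sqrt(bulk_permittivity(v_mpns))
    return (n_bulk - math.sqrt(EPS_ROCK)) / (math.sqrt(EPS_ICE) - math.sqrt(EPS_ROCK))

# Wave speeds approaching that of pure ice (~0.168 m/ns) imply high ice fractions.
for v in (0.13, 0.15, 0.165):
    print(f"v = {v:.3f} m/ns -> volumetric ice fraction ≈ {ice_fraction_crim(v):.2f}")
```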
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective symptom change assessments, and to identify and examine the effects of key risk factors.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety [Generalised Anxiety Disorder scale, 7 items (GAD-7): −0.33 points] and increases in PTSD symptoms (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40 points; GAD-7: 1.97 points), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations herein exposited. They are based on empirical evidence and expert opinion—this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice, and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge, and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature), and prior to electroconvulsive therapy—while taking into account patient preference. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
Non-alcoholic fatty liver disease (NAFLD) is an increasing cause of chronic liver disease that accompanies obesity and the metabolic syndrome. Excess fructose consumption can initiate or exacerbate NAFLD, in part as a consequence of impaired hepatic fructose metabolism. Preclinical data have emphasized that a fructose-induced altered gut microbiome, increased gut permeability, and endotoxemia play an important role in NAFLD, but human studies are sparse. The present study aimed to determine if two weeks of excess fructose consumption significantly alters gut microbiota or permeability in humans.
Methods:
We performed a pilot double-blind, cross-over, metabolic unit study in 10 subjects with obesity (body mass index [BMI] 30–40 kg/m2). Each arm provided 75 grams of either fructose or glucose added to subjects’ individual diets for 14 days, substituted isocalorically for complex carbohydrates, with a 19-day wash-out period between arms. Fructose intake in the fructose arm of the study averaged 20.1% of calories. Outcome measures included fecal microbiota distribution, fecal metabolites, intestinal permeability, markers of endotoxemia, and plasma metabolites.
Results:
Routine blood, uric acid, liver function, and lipid measurements were unaffected by the fructose intervention. The fecal microbiome (including Akkermansia muciniphila), fecal metabolites, gut permeability, indices of endotoxemia, gut damage or inflammation, and plasma metabolites were essentially unchanged by either intervention.
Conclusions:
In contrast to rodent preclinical findings, excess fructose did not alter the gut microbiome, metabolome, or permeability, nor did it induce endotoxemia, in humans with obesity fed fructose for 14 days in amounts known to enhance NAFLD.
Background: In the Erasmus MC University Medical Center, Rotterdam, the Netherlands, patients considered at risk for carrying highly resistant microorganisms (HRMO) are placed in isolation on admission until tested negative for HRMO (ie, targeted screening). Patients without risk factors are not routinely screened (ie, nontargeted screening). However, nontargeted screening could identify patients colonized with HRMO who are missed by targeted screening. To determine the additional value of nontargeted screening, we compared the outcomes of the nontargeted screening approach with all available clinical cultures. Objective: We aimed to identify patients colonized with HRMO but missed by targeted screening, and to determine whether nontargeted screening has additional value. Methods: For the MOVE study, nontargeted admission and discharge cultures (nose and perianal) were obtained from randomly selected patients admitted to specific wards, regardless of HRMO risk factors. This study was part of a research initiative to identify the relationship between a contaminated environment and a patient's risk of becoming infected or colonized. All bacteriological clinical samples positive for at least 1 HRMO from January 1, 2018, until August 31, 2019, were compared with the nontargeted screening samples. Samples were screened for methicillin-susceptible Staphylococcus aureus (MSSA) and methicillin-resistant Staphylococcus aureus (MRSA) as well as highly resistant Pseudomonas aeruginosa, Acinetobacter baumannii, Enterococcus faecium, and Enterobacteriales. Broth enrichment was used for all cultures. Results: During the study period, 50,653 patients were admitted; 706 patients (1%) had a clinical sample positive for at least 1 HRMO during their hospital stay. In total, 936 patients (1.8%) were included in the nontargeted screening for the MOVE study, and 40 patients (4.3%) were found to have at least 1 culture positive for HRMO. Among these 40 patients, 28 were positive at admission and 12 were positive at discharge. Extended-spectrum β-lactamase (ESBL)–producing Enterobacteriales were most prevalent (n = 36, 90.0%) both at admission and discharge (n = 26 and n = 10, respectively). At admission, 1 patient was identified with MRSA and 1 patient was positive for vancomycin-resistant E. faecium (VRE). At discharge, 1 patient was identified with VRE and 1 had Verona integron-encoded metallo-β-lactamase (VIM)–positive P. aeruginosa. Conclusions: Our results show that the current targeted screening does not identify all HRMO carriers. Furthermore, patients who acquire an HRMO during admission are missed. The nontargeted screening identified 40 previously unknown carriers (4.3%). The limitations of the study are the restricted number of sample sites and the fact that we were unable to culture all patients. Therefore, it is likely that our study underestimates the true number of patients with HRMO.
Background: Studies have shown that patients colonized with highly resistant microorganisms (HRMO) contaminate the hospital environment, and that transmission from contaminated environments to patients occurs. In May 2018, the Erasmus MC University Medical Center, Rotterdam, moved from a hospital with mostly multiple-occupancy rooms to a new hospital with 100% single-patient rooms with private bathrooms. This move provided the unique opportunity to determine environmental contamination before and after the new hospital opened for admissions and to compare the environmental contamination with the number of patients colonized with HRMO. Methods: Environmental sampling took place twice in the old building and 12 times in the new building, from 2 weeks before to 15 months after relocating patients. At each time point, ~306 samples were taken from 13 locations (eg, nightstands, sinks) in 40 patient rooms. Samples were screened for Staphylococcus aureus (methicillin-susceptible [MSSA] and methicillin-resistant [MRSA]) and highly resistant Pseudomonas aeruginosa, Acinetobacter baumannii, Enterococcus faecium, and Enterobacteriales. During the study period, January 1, 2018, until August 31, 2019, all clinical samples positive for HRMO were included. Results: Environmental sampling revealed that 29 of 724 (4.0%) locations were positive for HRMO in the old building, whereas 4 of 3,358 (0.1%) samples in the new building were positive for HRMO (P < .001). In the old building, 14 of 29 positive locations were positive for extended-spectrum β-lactamase (ESBL)–producing bacteria and 15 were positive for carbapenemase-producing bacteria. In the new building, 3 of the 4 positive samples were positive for vancomycin-resistant E. faecium (VRE) and 1 was positive for ESBL-producing K. pneumoniae; for both of these HRMO, no corresponding patient carriers were detected. In the old building, 145 of 12,256 adult patients (1.2%) had clinical samples positive for HRMO, compared with 561 of 38,397 (1.5%) in the new building, a small but significant increase (P = .02). Conclusions: The transition from mainly 2- and 4-person rooms to 100% single-patient rooms resulted in a significant decrease in environmental contamination, even though the number of patients colonized with HRMO slightly increased. No molecular typing to determine transfer from environment to patients and vice versa has yet been performed. Future sampling is needed to determine whether the low environmental contamination is a long-term effect of the transition to single rooms.
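As a quick plausibility check of the contamination comparison reported above (29 of 724 locations in the old building versus 4 of 3,358 samples in the new building), the snippet below runs Fisher's exact test on the 2×2 table; this is an illustrative re-computation, not the study's own analysis code.

```python
from scipy.stats import fisher_exact

old_pos, old_n = 29, 724      # positive locations / total sampled, old building
new_pos, new_n = 4, 3358      # positive samples / total sampled, new building

table = [[old_pos, old_n - old_pos],
         [new_pos, new_n - new_pos]]
odds_ratio, p_value = fisher_exact(table)
print(f"old: {old_pos/old_n:.1%}, new: {new_pos/new_n:.1%}, p = {p_value:.2g}")
```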
Background: Isolation precautions are recommended when caring for patients identified with highly resistant microorganisms (HRMO). However, the direct costs of isolating patients are largely unknown. Therefore, we aimed to obtain detailed information on the daily direct costs associated with isolating patients identified with HRMO. Methods: This study was performed from November until December 2017 on a 12-bed surgical ward. This ward contained only isolation rooms, each with an anteroom. The daily direct costs of isolation were based on three cost items: (1) additional personal protective equipment (PPE), measured by counting the consumption of empty packaging materials; (2) cleaning and disinfection of the isolation room, based on the costs of an outsourced cleaning company; and (3) additional workload for healthcare workers, based on the literature and multiplied by the average gross hourly salary of nurses. A distinction was made between the costs for strict isolation, contact-plus isolation, and contact isolation. Results: During the study period, 26 patients were nursed in isolation because of HRMO carriage, resulting in a total of 304 isolation days (median 7 isolation days; range 1–44). Gloves were consumed the most and hair caps the least. The average daily direct costs of isolation were lowest for contact isolation, €28/$31, and highest for strict isolation, €41/$47. Conclusions: Using a novel, easy method to estimate consumption of PPE, we conclude that the daily direct costs of isolating a patient differ by type of isolation. Insight into the direct costs of isolation is of utmost importance when developing or revising policies.
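A minimal sketch of the three-component daily cost calculation described above; all unit costs, item counts, and time estimates below are hypothetical placeholders rather than the study's measured values.

```python
def daily_isolation_cost(ppe_counts, ppe_unit_costs, cleaning_cost, extra_minutes, hourly_wage):
    """Sum of (1) PPE consumption, (2) room cleaning/disinfection, and (3) extra nursing time."""
    ppe = sum(ppe_counts[item] * ppe_unit_costs[item] for item in ppe_counts)
    workload = (extra_minutes / 60) * hourly_wage
    return ppe + cleaning_cost + workload

# Hypothetical example for one contact-isolation day (all figures assumed):
cost = daily_isolation_cost(
    ppe_counts={"gloves": 30, "gowns": 10, "masks": 8},               # items used per day
    ppe_unit_costs={"gloves": 0.10, "gowns": 0.80, "masks": 0.15},    # € per item
    cleaning_cost=10.0,                                               # € per day
    extra_minutes=20,
    hourly_wage=30.0,                                                 # € gross per hour
)
print(f"≈ €{cost:.2f} per isolation day")
```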
Whilst it is important that we treat patients with depression in primary care if possible, there are many patients with depression who will need the more expert support provided in secondary care.
Aims and methods
An anonymised database held by the Bedford East Community Mental Health Team was studied to assess which factors were related to the use of augmentation strategies to treat resistant depression.
Results
Of the 282 patients in total, 109 (38.7%) were on augmentation therapy. In the F32 and F33 groups, just over a third of patients (35.8% and 37.1%, respectively) were on augmentation therapy, and in the F41.2 group, over half of patients (56.7%) were on augmentation therapy.
Discussion
There does seem to be a relationship between the number of risk factors a patient has and the likelihood that they are on augmentation. Particularly strong factors are another psychiatric diagnosis and ‘other suicide risk factors’.
Conclusion
Generally, patients coming to secondary care with more of the specified risk factors are more likely to need augmentation.
People with severe mental illnesses, such as schizophrenia, depression or bipolar disorder, have worse physical health and reduced life expectancy compared with the general population. The excess cardiovascular mortality associated with schizophrenia and bipolar disorder is attributed in part to an increased risk of the modifiable coronary heart disease risk factors: obesity, smoking, diabetes, hypertension and dyslipidaemia. Antipsychotic medication, and possibly other psychotropic medication such as antidepressants, can induce weight gain or worsen other metabolic cardiovascular risk factors. Patients may have limited access to general healthcare, with less opportunity for cardiovascular risk screening and prevention than would be expected in a non-psychiatric population. The European Psychiatric Association (EPA), supported by the European Association for the Study of Diabetes (EASD) and the European Society of Cardiology (ESC), published this statement with the aim of improving the care of patients suffering from severe mental illness. The intention is to initiate cooperation and shared care between the different healthcare professionals and to increase the awareness of psychiatrists and primary care physicians caring for patients with severe mental illness of the need to screen and treat cardiovascular risk factors and diabetes.
In recent years there has been a move towards treating depressed patients in the community. One factor that may reduce the likelihood of discharge from secondary care is suicidality. The aim of this audit was to identify factors associated with continued suicidality among community patients.
Subjects and methods
We searched an anonymised database of patients and identified all those with previously documented suicidal thoughts or attempts. We also noted the presence of factors such as alcohol problems, drug problems, augmentation therapy and ‘other risk’ factors (e.g. financial problems or homelessness). We assessed the latest clinic letter, to see if patients were still reporting suicidality. We compared the aforementioned factors between the group of patients in which suicidality was still present (group N) and the group of patients in which suicidality was no longer a feature (group Y).
Results
Of the 56 patients with suicidal thoughts or attempts, there were 44 in group N (79%) and 12 in group Y (21%). Alcohol problems, drug problems and ‘other’ risk factors were more common among group Y than group N. Conversely, the percentage of patients on augmentation therapy was greater in group N than group Y.
Discussion
The audit provides an insight into the factors that might influence outcomes among depressed patients.
Conclusions
Although the results are suggestive, it is difficult to draw firm conclusions about patient outcomes from the present data. The audit provides a useful starting point, especially in considering the treatment of patients within the CMHT.