According to International Union for the Conservation of Nature (IUCN) guidelines, all species must be assessed against all criteria during the Red Listing process. For organismal groups that are diverse and understudied, assessors face considerable challenges in assembling evidence due to difficulty in applying definitions of key terms used in the guidelines. Challenges also arise because of uncertainty in population sizes (Criteria A, C, D) and distributions (Criteria A2/3/4c, B). Lichens, which are often small, difficult to identify, or overlooked during biodiversity inventories, are one such group for which specific difficulties arise in applying Red List criteria. Here, we offer approaches and examples that address challenges in completing Red List assessments for lichens in a rapidly changing arena of data availability and analysis strategies. While assessors still contend with far from perfect information about individual species, we propose practical solutions for completing robust assessments given the currently available knowledge of individual lichen life-histories.
Disordered eating (DE) is associated with elevated cardiometabolic risk (CMR) factors, yet little is known about this association in non-Western countries. We examined the association between DE characteristics and CMR and tested the potential mediating role of BMI. This cross-sectional study included 2005 Chinese women (aged 18–50 years) from the 2015 China Health and Nutrition Survey. Loss of control, restraint, shape concern and weight concern were assessed using selected questions from the SCOFF questionnaire and the Eating Disorder Examination-Questionnaire. Eight CMR factors were measured by trained staff. Generalised linear models examined associations between DE characteristics and CMR, accounting for dependencies between individuals in the same household. We tested whether BMI potentially mediated significant associations using structural equation modelling. Shape concern was associated with systolic blood pressure (β = 0·06, 95 % CI 0·01, 0·10), diastolic blood pressure (DBP) (β = 0·07, 95 % CI 0·03, 0·11) and high-density lipoprotein (HDL)-cholesterol (β = –0·08, 95 % CI –0·12, −0·04). Weight concern was associated with DBP (β = 0·06, 95 % CI 0·02, 0·10), triglycerides (β = 0·06, 95 % CI 0·02, 0·10) and HDL-cholesterol (β = –0·10, 95 % CI –0·14, −0·07). Higher scores on DE characteristics were associated with higher BMI, and higher BMI was in turn associated with lower HDL-cholesterol and higher values of the other CMR factors. In summary, we observed significant associations of shape and weight concerns with some CMR factors in Chinese women, and these associations were potentially partially mediated by BMI. Our findings suggest that prevention and intervention strategies addressing DE could help reduce the burden of CMR in China, possibly through controlling BMI.
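The mediation logic described above (DE characteristic → BMI → CMR factor) can be illustrated with a product-of-coefficients sketch. The snippet below is a minimal illustration in Python, not the authors' structural equation model; the variable names (shape_concern, bmi, hdl) and the simulated data are hypothetical stand-ins.

```python
# Minimal product-of-coefficients mediation sketch on simulated data.
# The study used structural equation modelling; this simplified
# two-regression version illustrates the same logic: exposure -> BMI
# (path a), BMI -> CMR factor adjusting for the exposure (path b),
# indirect effect = a * b, with a bootstrap confidence interval.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2005  # sample size reported in the abstract
df = pd.DataFrame({"shape_concern": rng.standard_normal(n)})
df["bmi"] = 0.3 * df["shape_concern"] + rng.standard_normal(n)
df["hdl"] = -0.2 * df["bmi"] + rng.standard_normal(n)

def indirect_effect(d):
    a = smf.ols("bmi ~ shape_concern", data=d).fit().params["shape_concern"]
    b = smf.ols("hdl ~ bmi + shape_concern", data=d).fit().params["bmi"]
    return a * b

point = indirect_effect(df)
boot = [indirect_effect(df.sample(frac=1, replace=True)) for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.3f} (95% CI {lo:.3f}, {hi:.3f})")
```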
Psychiatric disorders and type 2 diabetes mellitus (T2DM) are heritable, polygenic, and often comorbid conditions, yet knowledge about their potential shared familial risk is lacking. We used family designs and a T2DM polygenic risk score (T2DM-PRS) to investigate the genetic associations between psychiatric disorders and T2DM.
Methods
We linked 659 906 individuals born in Denmark 1990–2000 to their parents, grandparents, and aunts/uncles using population-based registers. We compared rates of T2DM in relatives of children with and without a diagnosis of any or one of 11 specific psychiatric disorders, including neuropsychiatric and neurodevelopmental disorders, using Cox regression. In a genotyped sample (iPSYCH2015) of individuals born 1981–2008 (n = 134 403), we used logistic regression to estimate associations between a T2DM-PRS and these psychiatric disorders.
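As a rough illustration of the two analyses named above, the hedged sketch below pairs a Cox model (T2DM in relatives, exposure being a psychiatric diagnosis in the index child) with a logistic regression of diagnosis on a standardized T2DM-PRS. The input files and column names are hypothetical placeholders; the actual study additionally handled family clustering and covariates.

```python
# Sketch of (1) Cox regression of T2DM risk in relatives and (2) logistic
# regression of psychiatric diagnosis on a standardized T2DM-PRS.
# Input files and columns are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

relatives = pd.read_csv("relatives.csv")
# expected columns: followup_years, t2dm (0/1 event),
# child_psych_dx (0/1 exposure)
cph = CoxPHFitter()
cph.fit(relatives[["followup_years", "t2dm", "child_psych_dx"]],
        duration_col="followup_years", event_col="t2dm")
cph.print_summary()  # hazard ratio for child_psych_dx

geno = pd.read_csv("ipsych_prs.csv")
# expected columns: t2dm_prs (raw score), any_psych_dx (0/1 outcome)
geno["t2dm_prs_z"] = (geno["t2dm_prs"] - geno["t2dm_prs"].mean()) / geno["t2dm_prs"].std()
fit = sm.Logit(geno["any_psych_dx"], sm.add_constant(geno[["t2dm_prs_z"]])).fit()
print(np.exp(fit.params))  # odds ratio per 1 SD increase in PRS
```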
Results
Among 5 235 300 relative pairs, relatives of individuals with a psychiatric disorder had an increased risk for T2DM with stronger associations for closer relatives (parents: hazard ratio = 1.38, 95% confidence interval 1.35–1.42; grandparents: 1.14, 1.13–1.15; and aunts/uncles: 1.19, 1.16–1.22). In the genetic sample, a one standard deviation increase in T2DM-PRS was associated with an increased risk for any psychiatric disorder (odds ratio = 1.11, 1.08–1.14). Both familial T2DM and T2DM-PRS were significantly associated with seven of 11 psychiatric disorders, most strongly with attention-deficit/hyperactivity disorder and conduct disorder, and inversely with anorexia nervosa.
Conclusions
Our findings of familial co-aggregation and of higher T2DM polygenic liability associated with psychiatric disorders point toward shared familial risk, suggesting that part of the comorbidity between these conditions is explained by shared familial liability. The underlying mechanisms remain largely unknown, and the contributions of genetics and environment need further investigation.
Two studies were conducted to investigate the growth and activity of the fungus Duddingtonia flagrans within cattle faecal pats. Artificial faecal pats were constructed with the centre separated from the outer layer by a nylon mesh. Eight treatments were tested by varying the presence/absence of Cooperia oncophora eggs and fungal spores within each layer. With parasite eggs in the centre layer, a significantly lower recovery of larvae was observed compared with both pats with parasite eggs in the periphery and pats with parasite eggs throughout both layers. Regardless of location within the pat, if co-located with the parasite eggs, D. flagrans was effective in trapping developing larvae. The reduction in recovery of larvae from pats with parasite eggs and fungal spores in the centre was significantly greater than when parasite eggs were in the centre and fungal spores in the periphery. In the second study, pats were made up in two treatments: pats containing fungal spores and C. oncophora eggs (fungus) and pats containing only C. oncophora eggs (control). The pats were incubated at low or high humidity. Ten pats were used in a cross-over design: five pats incubated at low humidity for 7 weeks were removed, water was added, and they were then incubated at high humidity for 1 week; another five pats were incubated at high humidity for 7 weeks, aerated, and incubated at low humidity for 1 week. There was no apparent growth of fungus in faecal pats incubated at high humidity, and less than 20% of larvae were recovered. Growth of D. flagrans was observed in faecal pats incubated at low humidity, but a corresponding reduction in the percentage recovery of larvae did not occur, except in week 4. No statistical difference between fungal and control pats was seen in the cross-over pats. Nematophagous activity was assessed throughout the study and was observed in the first 4 weeks within the pats containing fungus.
The fungus Duddingtonia flagrans is able to trap and kill free-living nematode larvae of the cattle parasite Cooperia oncophora when chlamydospores are mixed in cattle faeces. Isolates of Bacillus subtilis (two isolates), Pseudomonas spp. (three isolates) and single isolates of the fungal genera Alternaria, Cladosporium, Fusarium, Trichoderma and Verticillium were isolated from cattle faeces and shown to reduce D. flagrans growth on agar plates. When these isolates were added to cattle faeces containing D. flagrans and C. oncophora larvae developing from eggs, none of the isolates reduced nematode mortality attributed to D. flagrans. Similarly, the coprophilic fungus Pilobolus kleinii, which cannot be cultivated on agar, also failed to suppress the ability of D. flagrans to trap and kill developing larvae of C. oncophora. Increasing chlamydospore doses of D. flagrans in faecal cultures resulted in higher nematode mortality. Thus, no evidence of interspecific or intraspecific competition was observed. The consequences of these findings are discussed.
The effect of feeding the nematode-trapping fungus Duddingtonia flagrans at two dose levels to first-time grazing calves on pasture contamination with infective trichostrongylid larvae was examined in Lithuania. Thirty heifer-calves, aged 3–6 months, were divided into three comparable groups, A, B and C. Each group was turned out on a 1.07 ha paddock (a, b and c). The paddocks were naturally contaminated with infective trichostrongylid larvae from infected cattle grazing the previous year. Fungal material was fed to the animals daily during a two-month period starting 3 weeks after turnout. Groups A and B were given 10⁶ and 2.5×10⁵ chlamydospores per kg of live weight per day, respectively, while group C served as a non-dosed control group. Every two weeks the heifers were weighed and clinically inspected. On the same dates, faeces, blood and grass samples were collected. From mid-July onwards, the number of infective larvae in grass samples increased markedly (P<0.05) on paddock c, whereas low numbers of infective larvae were observed on paddocks a and b, grazed by the fungus-treated groups. However, the results indicate that administering fungal spores at a dose of 2.5×10⁵ chlamydospores per kg live weight per day did not significantly prevent parasitism in calves, presumably due to insufficient suppression of developing infective larvae in the faeces. In contrast, a dose of 10⁶ chlamydospores per kg lowered the parasite larval population on the pasture, reduced pepsinogen levels (P<0.05), and prevented calves from developing parasitosis.
The ability of two isolates of the nematode-trapping fungus Duddingtonia flagrans to reduce the numbers of gastrointestinal nematode larvae on herbage was tested in three plot studies. Artificially prepared cow pats containing Ostertagia ostertagi eggs, with and without fungal spores, were deposited on pasture plots two or three times during the grazing season in 1995, 1996 and 1997. The herbage around each pat was sampled fortnightly over a period of 2 months and the number of infective larvae was recorded. At the end of the sampling period, the remainder of the faecal pats was collected to determine the wet weight, dry weight, and content of organic matter. The infective larvae remaining in the pats were extracted. Faecal cultures showed that both fungal isolates significantly reduced the number of infective larvae. Significantly fewer larvae were recovered from herbage surrounding fungus-treated pats compared with control pats in all three experiments, reflecting the ability of the fungus to destroy free-living larval stages in the faecal pat environment. After 8 weeks on pasture there were no differences between control and fungus-treated pats with respect to wet weight, dry weight, and organic matter content. This indicates that the degradation of faeces was not negatively affected by the presence of the fungus.
A series of experiments on corn meal agar was carried out to evaluate the efficacy of the nematode-trapping fungus Duddingtonia flagrans under different abiotic and biotic conditions which occur in cow pats. Above a concentration of 50 parasitic larvae (L3) cm⁻², the fungus produced a maximum of between 500 and 600 nets cm⁻² at 20°C in 2 days on the surface of corn meal agar. There were no differences in the trap-producing capacity of three strains of D. flagrans (CIII4, CI3 and Trol A). On agar at 30°C and 20°C, the fungus responded to Cooperia oncophora L3 very quickly, producing a maximum number of trapping nets 1 day after induction. At 10°C, traps were produced slowly, starting on day 4 after induction and continuing over the following week. Duddingtonia flagrans (CI3) grew at a normal rate at least down to an oxygen concentration of 6 vol.% O₂, but it did not grow anaerobically. On agar, D. flagrans (CI3) did not produce trapping nets in an anaerobic atmosphere. Moreover, C. oncophora L3 stopped migrating under anaerobic conditions. When the fungal cultures were transferred to a normal aerobic atmosphere after 1 and 2 weeks under anaerobic conditions, the C. oncophora L3 resumed migrating on the agar and, in response, D. flagrans produced as many traps as when it had not been under anaerobic stress. Under microaerophilic conditions (6 vol.% O₂), D. flagrans was able to grow, but the C. oncophora L3 were not able to induce trapping nets in D. flagrans (Trol A) because of larval immobility. However, as under anaerobic conditions, the fungus returned to a nematode-trapping state within 1 or 2 weeks of transfer to a normal aerobic atmosphere if migrating nematodes were present. Under natural conditions in the cow pat, the fungus is therefore expected to be ready to attack parasitic larvae when the oxygen tension increases as a result of, for example, the activity of the coprophilic fauna. Artificial light giving 3000–3400 lux on the surface of the agar significantly depressed the growth rate and the production of trapping nets in D. flagrans (CI3). On agar, D. flagrans (CI3) could grow and produce trapping nets at pH levels of 6.3 to 9.3, with net production being optimal between pH 7 and 8. On dry faeces, mycelial growth was 7–10 mm during a 15-day period, while on moist faeces the fungus expanded 15–20 mm during the same period. Based on the parameters investigated, D. flagrans is expected to be especially active in the well-aerated surface layer of a cow pat, an area which normally contains a high concentration of infective nematode parasite larvae, but also an area where the temperature can be high and the water content low.
Although several types of risk factors for anorexia nervosa (AN) have been identified, including birth-related, somatic, and psychosocial risk factors, their interplay with genetic susceptibility remains unclear. We examined the genetic and epidemiological interplay in AN risk using data from Danish nationwide registers. We estimated associations between the AN polygenic risk score (PRS) and risk factors, confounding by AN PRS and/or parental psychiatric history of the associations between risk factors and AN risk, and interactions between AN PRS and each level of the target risk factors on AN risk.
Methods
Participants were individuals born in Denmark between 1981 and 2008, including nationwide-representative data from the iPSYCH2015 sample and Danish AN cases from the Anorexia Nervosa Genetics Initiative and Eating Disorder Genetics Initiative cohorts. A total of 7003 individuals with AN and 45 229 individuals without a registered AN diagnosis were included. We included 22 AN risk factors from Danish registers.
Results
Risk factors showing associations with the PRS for AN included urbanicity, parental ages, genitourinary tract infection, and parental socioeconomic factors. Risk factors showed the expected associations with AN risk, and these associations were only slightly attenuated when adjusted for parental history of psychiatric disorders and/or the AN PRS. The interaction analyses revealed a differential effect of AN PRS according to the level of the following risk factors: sex, maternal age, genitourinary tract infection, C-section, parental socioeconomic factors and psychiatric history.
Conclusions
Our findings provide evidence for interactions between AN PRS and certain risk factors, illustrating potentially diverse risk pathways to AN diagnosis.
Behavioral (externalizing) and emotional (internalizing) problems have been shown to be associated with the prenatal environment. Changes in placental DNA methylation have been identified as a relevant potential mechanism of this association.
Objectives
We aimed to assess the associations between placental DNA methylation and child behavior in order to explore pathways that could link prenatal exposures to child behavior.
Methods
Data included 441 children aged 3 years from the EDEN mother–child cohort. Child behavior was assessed using the Strengths and Difficulties Questionnaire (SDQ). Both hypothesis-driven and exploratory analyses (including epigenome-wide association studies (EWAS) and differentially methylated region (DMR) analyses) were conducted. The analyses were adjusted for confounding and technical factors and for estimated placental cell composition. All p-values were corrected for multiple tests using a false discovery rate (FDR) procedure.
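The FDR step mentioned above corresponds to a standard Benjamini-Hochberg correction across the tested CpG sites; a minimal sketch with toy p-values (not the study's results) follows.

```python
# Benjamini-Hochberg FDR correction across CpG-site p-values.
# The p-values below are toy numbers for illustration only.
import numpy as np
from statsmodels.stats.multitest import multipletests

pvals = np.array([1e-7, 3e-4, 0.004, 0.03, 0.2, 0.6])
reject, p_fdr, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
for p, q, sig in zip(pvals, p_fdr, reject):
    print(f"p = {p:.1e}  pFDR = {q:.3f}  significant = {sig}")
```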
Results
In the hypothesis-driven analysis, cg26703534 (AHRR) was significantly associated with emotional problems (pFDR = 0.03). In the exploratory analyses, cg09126090 (pFDR = 0.04) and cg10305789 (PPP1R16B; pFDR < 0.01) were significantly associated with peer-relationship problems, and 33 DMRs were significantly associated with at least one of the SDQ subscales. Placental DNA methylation showed more associations with internalizing than externalizing symptoms, especially among girls. DMRs tended to include highly methylated CpGs.
Conclusions
This study investigated for the first time the associations between placental DNA methylation and internalizing and externalizing symptoms in preschoolers. Further analyses, such as consortium meta-analyses, would be necessary to confirm and extend our results.
Machine learning (ML) approaches are a promising avenue for identifying vocal markers of neuropsychiatric disorders such as schizophrenia. While recent studies have shown that voice-based ML models can reliably predict diagnosis and clinical symptoms of schizophrenia, it is unclear to what extent such ML markers generalize to new speech samples collected using a different task or in a different language; assessing generalization performance is, however, crucial for testing their clinical applicability.
Objectives
In this research, we systematically assessed the generalizability of ML models across contexts and languages, relying on a large cross-linguistic dataset of audio recordings of patients with schizophrenia and controls.
Methods
We trained ML models of vocal markers of schizophrenia on a large cross-linguistic dataset of audio recordings of 231 patients with schizophrenia and 238 matched controls (>4000 recordings in Danish, German, Mandarin and Japanese). We developed a rigorous pipeline to minimize overfitting, including a cross-validated training set and Mixture of Experts (MoE) models. We tested the generalizability of the ML models on: (i) different participants speaking the same language (hold-out test set); (ii) different participants speaking a different language. Finally, we compared the predictive performance of: (i) models trained on a single language (e.g., Danish); (ii) MoE models, i.e., ensembles of models (experts) trained on single languages whose predictions are combined using a weighted sum; (iii) multi-language models trained on multiple languages (e.g., Danish and German).
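To make the MoE idea concrete, here is a hedged sketch of per-language experts whose probability outputs are combined with a weighted sum; the toy data, logistic-regression experts, and uniform weights are assumptions for illustration, not the paper's actual feature set or pipeline.

```python
# Sketch of a Mixture-of-Experts ensemble: one classifier ("expert") per
# language, with predicted probabilities combined as a weighted sum.
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_experts(data_by_language):
    """data_by_language: {lang: (X, y)} with acoustic feature matrices X."""
    return {lang: LogisticRegression(max_iter=1000).fit(X, y)
            for lang, (X, y) in data_by_language.items()}

def moe_predict(experts, weights, X_new):
    """Weighted sum of each expert's predicted P(schizophrenia)."""
    probs = np.stack([experts[l].predict_proba(X_new)[:, 1] for l in experts])
    w = np.array([weights[l] for l in experts])[:, None]
    return (w * probs).sum(axis=0) / w.sum()  # normalized weighted sum

# Example with toy data (10 hypothetical acoustic features):
rng = np.random.default_rng(1)
data = {lang: (rng.standard_normal((200, 10)), rng.integers(0, 2, 200))
        for lang in ["danish", "german", "mandarin", "japanese"]}
experts = train_experts(data)
weights = {l: 1.0 for l in experts}  # could instead be validation F1 per expert
print(moe_predict(experts, weights, rng.standard_normal((5, 10))))
```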
Results
Model performance was comparable to state-of-the-art findings (F1: 70%-80%) when models were trained and tested on participants speaking the same language (out-of-sample performance). Crucially, however, the ML models did not generalize well, showing a substantial decrease in performance (close to chance), when trained on one language and tested on new languages (e.g., trained on Danish and tested on German). MoE and multi-language models showed better performance (F1: 55%-60%), but still far below the levels required for clinical applicability.
Conclusions
Our results show that the cross-linguistic generalizability of ML models of vocal markers of schizophrenia is very limited. This is a critical issue if the goal is to translate these vocal markers into effective clinical applications. We argue that more emphasis needs to be placed on collecting large open datasets to test the generalizability of voice-based ML models, for example, across different speech tasks or across the heterogeneous clinical profiles that characterize schizophrenia spectrum disorder.
Insomnia in depression is common and difficult to resolve. Music is commonly used as a sleep aid, and clinical trials pointing to positive effects of music as a sleep aid are increasing, adding to the evidence base. However, little is known about the effectiveness of music for depression-related insomnia.
Objectives
A recent randomized controlled trial (RCT) conducted in psychiatry at Aalborg University Hospital examined the effects of a music intervention for insomnia in depression. The intervention group listened to music at bedtime for four weeks; controls were offered the music intervention post-test. The primary outcome measure was the Pittsburgh Sleep Quality Index (PSQI). Secondary outcomes included actigraphy, the Hamilton Depression Rating Scale (HAMD-17) and World Health Organisation well-being questionnaires (WHO-5, WHOQOL-BREF).
Methods
A two-armed randomized controlled trial (n=112) and a qualitative interview study (n=4).
Results
The RCT showed significant improvements for the music intervention group in sleep quality and quality of life at four weeks according to global PSQI scores (effect size = -2.1, 95% CI -3.3; -0.9) and WHO-5 scores (effect size = 8.4, 95% CI 2.7; 14.0). Actigraphy measures showed no changes, and no changes in depression symptoms (HAMD-17) were detected.
The interview study yielded examples of the influences of music on sleep and relaxation: music provided distraction, positively affected mood and arousal, and supported the formation of sleep habits.
Results from the trial are discussed and merged with findings from the interview study. The trial results suggested moderate effects of music listening for the population as a whole, while findings from the interview study showed examples of individual and highly varying outcomes.
Conclusions
Music is suggested as a low-cost, safe and side-effect-free intervention to supplement existing treatments for improving sleep in depression.
Research on the effect of oral contraceptive (OC) use on the risk of depression shows inconsistent findings, especially in adult OC users. One possible reason for this inconsistency is the omission of women who discontinue OCs due to adverse mood effects, leading to healthy user bias. To address this issue, we aim to estimate the risk of depression that is associated with the initiation of OCs as well as the effect of OC use on lifetime risk of depression.
Methods
This is a population-based cohort study based on data from 264,557 women from the UK Biobank. Incident depression was ascertained via interviews, inpatient hospital data or primary care data. The hazard ratio (HR) between OC use and incident depression was estimated by multivariable Cox regression with OC use as a time-varying exposure. To assess evidence for causality, we examined familial confounding in 7,354 sibling pairs.
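A Cox model with a time-varying exposure, as described above, can be sketched with lifelines' CoxTimeVaryingFitter on a long-format table (one row per interval per woman); the input file and all column names below are hypothetical placeholders, and the actual analysis included additional covariates.

```python
# Sketch of a Cox model with OC use as a time-varying exposure.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.read_csv("oc_intervals.csv")
# expected columns: id, start, stop (age in years), oc_use (0/1 within
# the interval), depression (0/1 event at the end of the interval)
ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", start_col="start", stop_col="stop",
        event_col="depression")
ctv.print_summary()  # hazard ratio for oc_use
```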
Results
We observed that the first 2 years of OC use were associated with a higher rate of depression compared with never users (HR = 1.71, 95% confidence interval [CI]: 1.55–1.88). Although the risk was not as pronounced beyond the first 2 years, ever use of OCs was still associated with an increased lifetime risk of depression (HR = 1.05, 95% CI: 1.01–1.09). Previous OC use was associated with a higher rate of depression compared with never users, with adolescent OC users driving the increased hazard (HR = 1.18, 95% CI: 1.12–1.25). No significant association was observed among adult OC users who had previously used OCs (HR = 1.00, 95% CI: 0.95–1.04). Notably, the sibling analysis provided further evidence for a causal effect of OC use on the risk of depression.
Conclusions
Our findings suggest that the use of OCs, particularly during the first 2 years, increases the risk of depression. Additionally, OC use during adolescence might increase the risk of depression later in life. Our results are consistent with a causal relationship between OC use and depression, as supported by the sibling analysis. This study highlights the importance of considering the healthy user bias as well as family-level confounding in studies of OC use and mental health outcomes. Physicians and patients should be aware of this potential risk when considering OCs, and individualized risk–benefit assessments should be conducted.
Cocaine is a highly addictive psychostimulant that affects synaptic activity through structural and functional adaptations of neurons. The transmembrane synaptic vesicle glycoprotein 2A (SV2A) of pre-synaptic vesicles is commonly used to measure synaptic density, offering a novel approach to the detection of synaptic changes. It is not known whether a single dose of cocaine suffices to affect pre-synaptic SV2A density, especially during adolescence when synapses undergo intense maturation. Here, we explored potential changes in pre-synaptic SV2A density in target brain areas associated with the cocaine-induced boost of dopaminergic neurotransmission, specifically testing whether the effects would last after dopamine levels returned to baseline.
Methods:
We administered cocaine (20 mg/kg i.p.) or saline to rats in early adolescence, tested their activity levels, and removed the brains 1 hour and 7 days after injection. To evaluate immediate and lasting effects, we performed autoradiography with [3H]UCB-J, a specific tracer for SV2A, in the medial prefrontal cortex, striatum, nucleus accumbens, amygdala, and dorsal and ventral areas of the hippocampus. We also measured the striatal binding of [3H]GBR-12935 to test cocaine's occupancy of the dopamine transporter at both time points.
Results:
We found a significant increase of [3H]UCB-J binding in the dorsal and ventral sections of the hippocampus 7 days after cocaine administration compared with saline-injected rats, but no differences 1 hour after the injection. The [3H]GBR-12935 binding remained unchanged at both time points.
Conclusion:
Cocaine provoked lasting changes in hippocampal synaptic SV2A density after a single exposure during adolescence.
Mismatch negativity (MMN) amplitude is reduced in psychotic disorders and associated with symptoms and functioning. Due to these robust associations, it is often considered a biomarker for psychotic illness. The relationship between MMN and clinical outcomes has been well examined in early-onset psychotic illness; however, its stability and predictive utility in chronic samples are not clear.
Method
We examined the five-year stability of MMN amplitude over two timepoints in individuals with established psychotic disorders (cases; N = 132) and never-psychotic participants (NP; N = 170), as well as longitudinal associations with clinical symptoms and functioning.
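The stability estimate reported below reduces to a per-group test-retest correlation of MMN amplitude across the two timepoints; a minimal sketch, assuming a hypothetical table with one amplitude per participant per timepoint:

```python
# Test-retest (five-year) stability of MMN amplitude by group.
# The input file and column names are hypothetical placeholders.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("mmn_visits.csv")  # columns: group, mmn_t1, mmn_t2
for group, sub in df.groupby("group"):
    r, p = pearsonr(sub["mmn_t1"], sub["mmn_t2"])
    print(f"{group}: r = {r:.2f} (p = {p:.3g})")
```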
Results
MMN amplitude exhibited good temporal stability (cases, r = 0.53; never-psychotic, r = 0.52). In cases, structural equation models revealed MMN amplitude to be a significant predictor of worsening auditory hallucinations (β = 0.19), everyday functioning (β = −0.13), and illness severity (β = −0.12) at follow-up. Meanwhile, initial IQ (β = −0.24), negative symptoms (β = 0.23), and illness severity (β = −0.16) were significant predictors of worsening MMN amplitude five years later.
Conclusions
These results imply that MMN measures a neural deficit that is reasonably stable up to five years. Results support disordered cognition and negative symptoms as preceding reduced MMN, which then may operate as a mechanism driving reductions in everyday functioning and the worsening of auditory hallucinations in chronic psychotic disorders. This pattern may inform models of illness course, clarifying the relationships amongst biological mechanisms of predictive processing and clinical deficits in chronic psychosis and allowing us to better understand the mechanisms driving such impairments over time.
This article is a clinical guide which discusses the “state-of-the-art” usage of the classic monoamine oxidase inhibitor (MAOI) antidepressants (phenelzine, tranylcypromine, and isocarboxazid) in modern psychiatric practice. The guide is for all clinicians, including those who may not be experienced MAOI prescribers. It discusses indications, drug-drug interactions, side-effect management, and the safety of various augmentation strategies. There is a clear and broad consensus (more than 70 international expert endorsers), based on 6 decades of experience, for the recommendations herein. They are based on empirical evidence and expert opinion; this guide is presented as a new specialist-consensus standard. The guide provides practical clinical advice and is the basis for the rational use of these drugs, particularly because it improves and updates knowledge and corrects the various misconceptions that have hitherto been prominent in the literature, partly due to insufficient knowledge of pharmacology. The guide suggests that MAOIs should always be considered in cases of treatment-resistant depression (including those melancholic in nature) and prior to electroconvulsive therapy, while taking into account patient preference. In selected cases, they may be considered earlier in the treatment algorithm than has previously been customary, and should not be regarded as drugs of last resort; they may prove decisively effective when many other treatments have failed. The guide clarifies key points on the concomitant use of incorrectly proscribed drugs such as methylphenidate and some tricyclic antidepressants. It also illustrates the straightforward “bridging” methods that may be used to transition simply and safely from other antidepressants to MAOIs.
There is concern that the COVID-19 pandemic will lead to an increase in suicides. Several reports from the first months of the pandemic showed no increase in suicide rates, while studies with longer observation periods report contrasting results. In this study, we explore suicide rates in Norway during the first year of the pandemic for the total population as well as for relevant subgroups such as sex, age, geographical area, and pandemic phase.
Methods
This is a cohort study covering the entire Norwegian population between 2010 and 2020. The main outcome was age-standardized suicide rates (per 100,000 inhabitants) in 2020 according to the Norwegian Cause of Death Registry. This was compared with 95% prediction intervals (95% PI) based on the suicide rates between 2010 and 2019.
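The comparison described above amounts to checking whether the observed 2020 rate falls inside a 95% prediction interval derived from the 2010–2019 rates. The sketch below illustrates that logic with placeholder rates and a simple linear-trend model; the authors' exact interval method may differ.

```python
# Fit a linear trend to historical age-standardized suicide rates and
# compute a 95% prediction interval for 2020. The rates below are
# illustrative placeholders, not data from the Cause of Death Registry.
import numpy as np
import statsmodels.api as sm

years = np.arange(2010, 2020, dtype=float)
rates = np.array([11.9, 12.3, 11.4, 12.1, 11.5, 12.0, 11.8, 11.6, 12.5, 11.9])
fit = sm.OLS(rates, sm.add_constant(years)).fit()
x_2020 = np.array([[1.0, 2020.0]])  # intercept + year
pred = fit.get_prediction(x_2020).summary_frame(alpha=0.05)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])  # 95% PI for 2020
```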
Results
In 2020, there were 639 suicides in Norway, corresponding to a rate of 12.1 per 100,000 (95% PI 10.2–14.4). There were no significant deviations from the predicted values for suicides in 2020 when analyzing age, sex, pandemic phase, or geographical area separately. We observed a trend toward a lower-than-predicted suicide rate among females (6.5, 95% PI 6.0–9.2) and during the two COVID-19 outbreak phases in 2020 (2.8, 95% PI 2.3–4.3 and 2.8, 95% PI 2.3–4.3).
Conclusion
There is no indication that the COVID-19 pandemic led to an increase in suicide rates in Norway in 2020.
Music therapy (MT) is frequently provided to patients at the end of life, and studies have shown a benefit in the relief of symptoms and a positive impact on quality of life (QoL), but little is known regarding the effect of MT on caregivers. Caregivers are at risk for anxiety and emotional distress, and they experience anticipatory grief as the patient nears death. Caregivers are present with patients and may also benefit from MT.
Objective
To assess the impact of MT on caregivers for hospice patients and determine the feasibility of research in this population.
Methods
Twenty caregivers of patients hospitalized for general inpatient hospice care were enrolled. MT was provided by a board-certified music therapist, and sessions included pre-MT assessment, 20-45 minutes of MT, and post-MT assessment. Caregiver stress was measured with the Pearlin Role Overload Measure (ROM), QoL was measured with the Linear Analogue Self-Assessment (LASA), and depression and anxiety were measured with the Patient Health Questionnaire for Depression and Anxiety (PHQ-4). These three measures were taken pre-MT, post-MT and 6 months post-MT. Caregivers were also asked to complete a Music Therapy Program Survey post-MT.
Results
The MT intervention was completed for 15/20 caregivers (75%). Of those who did not complete MT, two withdrew beforehand, one was not available, one patient died during the MT session, and one patient died prior to MT. Fourteen caregivers completed pre-MT and post-MT assessments, and nine caregivers completed assessments at all three timepoints. The MT Program Survey (post-MT assessment, n=14) showed that 100% of caregivers were very satisfied with MT and would recommend it to others; 78% found MT effective for stress relief, 69% for relaxation, 71% for spiritual support, 86% for emotional support, and 71% for a feeling of wellness.
Conclusion
Research on MT is feasible for caregivers in acute hospice care, with a majority of caregivers consenting to research and about half completing surveys pre-MT, post-MT, and 6 months post-MT (9/20). Future larger studies should be conducted to better assess the impact of MT on caregivers.
A subgroup of patients with anorexia nervosa (AN) undergoing involuntary treatment (IT) seems to account for most IT events. Little is known about these patients and their treatment, including the temporal distribution of IT events and factors associated with subsequent utilization of IT. Hence, this study explores (1) utilization patterns of IT events, and (2) factors associated with subsequent utilization of IT in patients with AN.
Methods
In this nationwide Danish register-based retrospective exploratory cohort study, patients were identified from their first (index) hospital admission with an AN diagnosis and followed up for 5 years. We explored data on IT events, including estimated yearly and total 5-year rates, and factors associated with subsequent increased IT rates and restraint, using regression analyses and descriptive statistics.
Results
IT utilization peaked in the initial few years starting at or following the index admission. A small percentage (1.0%) of patients accounted for 67% of all IT events. The most frequent measures reported were mechanical and physical restraint. Factors associated with subsequent increased IT utilization were female sex, lower age, previous admissions with psychiatric disorders before index admission, and IT related to those admissions. Factors associated with subsequent restraint were lower age, previous admissions with psychiatric disorders, and IT related to these.
Conclusions
High IT utilization in a small percentage of individuals with AN is concerning and can lead to adverse treatment experiences. Exploring alternative approaches to treatment that reduce the need for IT is an important focus for future research.