Depression is the largest global contributor to non-fatal disease burden(1). A growing body of evidence suggests that dietary behaviours, such as higher fruit and vegetable intake, may be protective against the risk of depression(2). However, this evidence comes primarily from high-income countries, despite over 80% of the burden of depression being experienced in low- and middle-income countries(1). Few studies to date have focused on older adults. The aim of this study was to prospectively examine the associations between baseline fruit and vegetable intake and the incidence of depression in adults aged 45 years and older from 10 cohorts across six continents, including four cohorts from low- and middle-income countries. The association between baseline fruit and vegetable intake and incident depression over a 3–6-year follow-up period was examined using Cox proportional hazards regression after controlling for a range of potential confounders. Participants were 7771 community-based adults aged 45+ years from 10 diverse cohorts. All cohorts were members of the Cohort Studies of Memory in an International Consortium collaboration(3). Fruit intake (excluding juice) and vegetable intake were collected using either a comprehensive food frequency questionnaire, a short food questionnaire or a diet history. Depressive symptoms were assessed using validated depression measures, and depression was defined as a score greater than or equal to a validated cut-off. All data were harmonised prior to analysis. Analyses were performed by cohort, and cohort results were then combined using meta-analysis. Subgroup analyses were performed by sex, age (45–64 versus 65+ years) and country income level (high-income versus low- and middle-income countries). There were 1537 incident cases of depression over 32,420 person-years of follow-up. Mean daily intakes were 1.7 ± 1.5 serves of fruit and 1.9 ± 1.4 serves of vegetables. We found no association between fruit and vegetable intakes and the risk of incident depression in any of the analyses, and this was consistent across the subgroup analyses. The low fruit and vegetable intakes of participants, the diverse measures used across cohorts, and the modest sample size of our study compared with prior studies may have prevented an association from being detected. Further investigation using standardised measures in larger cohorts of older adults from low- and middle-income countries is needed. Future research should consider the potential relationship between different types of fruits and vegetables and depression.
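For readers who want to see the shape of such an analysis, a minimal sketch follows (Python, using the lifelines and numpy libraries): a per-cohort Cox proportional hazards fit, then DerSimonian–Laird random-effects pooling of the log hazard ratios. All file and column names, and the choice of fruit serves as the exposure, are illustrative assumptions rather than details taken from the study.

```python
# Illustrative only: per-cohort Cox model, then random-effects pooling.
# Column names (followup_years, incident_depression, fruit_serves_per_day)
# are hypothetical stand-ins, not variables from the study.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def cohort_loghr(df: pd.DataFrame) -> tuple[float, float]:
    """Fit a Cox model in one cohort; return log-HR and SE for fruit intake.
    df should also carry the confounder columns, which cph.fit adjusts for."""
    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="incident_depression")
    row = cph.summary.loc["fruit_serves_per_day"]
    return row["coef"], row["se(coef)"]

def dersimonian_laird(betas, ses):
    """Pool per-cohort log-HRs with a DerSimonian-Laird random-effects model."""
    betas, ses = np.asarray(betas, float), np.asarray(ses, float)
    w = 1.0 / ses**2                              # inverse-variance weights
    beta_fe = np.sum(w * betas) / np.sum(w)       # fixed-effect estimate
    q = np.sum(w * (betas - beta_fe) ** 2)        # Cochran's Q heterogeneity
    tau2 = max(0.0, (q - (len(betas) - 1)) /
               (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_re = 1.0 / (ses**2 + tau2)                  # random-effects weights
    beta_re = np.sum(w_re * betas) / np.sum(w_re)
    return np.exp(beta_re), np.sqrt(1.0 / np.sum(w_re))  # pooled HR, SE(log-HR)
```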
Direct numerical simulations (DNSs) of three-dimensional cylindrical-release gravity currents in a linearly stratified ambient are presented. The simulations cover a range of stratification strengths $0< S\leq 0.8$ (where $S=(\rho _b^*-\rho _0^*)/(\rho _c^*-\rho _0^*)$, with $\rho _b^*$, $\rho _0^*$ and $\rho _c^*$ the dimensional densities at the bottom of the domain, the top of the domain and of the dense fluid, respectively) at two different Reynolds numbers. A comparison between the stratified and unstratified cases illustrates the influence of stratification strength on the dynamics of cylindrical gravity currents. Specifically, the front velocity in the slumping phase decreases with increasing stratification strength, whereas the duration of the slumping phase increases with $S$. The Froude number calculated in this phase shows good agreement with the models proposed by Ungarish & Huppert (J. Fluid Mech., vol. 458, 2002, pp. 283–301) and Ungarish (J. Fluid Mech., vol. 548, 2006, pp. 49–68), originally developed for planar gravity currents in a stratified ambient. In the inertial phase, the front velocity across cases with different stratification strengths adheres to a power-law scaling with an exponent of $-1/2$. Higher Reynolds numbers lead to more frequent lobe splitting and merging, with lobe size diminishing as stratification strength increases. Strong interactions among inner vortex rings occur during the slumping phase, leading to the early formation of hairpin vortices in weakly stratified cases, while strongly stratified cases exhibit delayed vortex formation and less turbulence.
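The key quantities can be restated compactly. The block below reproduces the stratification parameter defined in the abstract and writes out the inertial-phase scaling; reading the $-1/2$ exponent as an exponent in time $t$ follows the classical result for axisymmetric (cylindrical) releases and is our assumption, not a detail stated above.

```latex
% Stratification parameter as defined in the abstract (starred = dimensional):
\[
  S \;=\; \frac{\rho_b^{*}-\rho_0^{*}}{\rho_c^{*}-\rho_0^{*}}, \qquad 0 < S \le 0.8 ,
\]
% Inertial-phase front-velocity scaling, assuming the exponent is in time:
\[
  u_f \;\propto\; t^{-1/2} \quad\Longleftrightarrow\quad r_f \;\propto\; t^{1/2},
\]
% the classical inertial scaling for axisymmetric releases.
```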
Identifying neuroimaging biomarkers of antidepressant response may help guide treatment decisions and advance precision medicine.
Aims
To examine the relationship between anhedonia and functional neurocircuitry in key reward processing brain regions in people with major depressive disorder receiving aripiprazole adjunct therapy with escitalopram.
Method
Data were collected as part of the CAN-BIND-1 study. Participants experiencing a current major depressive episode received escitalopram for 8 weeks; escitalopram non-responders received adjunct aripiprazole for an additional 8 weeks. Functional magnetic resonance imaging (at weeks 0 and 8) and clinical assessment of anhedonia (at weeks 0, 8 and 16) were completed. Seed-based correlational analysis was employed to examine the relationship between baseline resting-state functional connectivity (rsFC), using the nucleus accumbens (NAc) and anterior cingulate cortex (ACC) as key regions of interest, and change in anhedonia severity after adjunct aripiprazole.
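At its core, a seed-based correlational analysis of this kind correlates a seed region's mean BOLD time series with every other voxel's time series. The minimal numpy sketch below illustrates that core step only; it is not the CAN-BIND-1 pipeline, and the array layout is an assumption.

```python
# Core of a seed-based correlation map, illustrative only (not the study pipeline).
import numpy as np

def seed_correlation_map(voxel_ts: np.ndarray, seed_ts: np.ndarray) -> np.ndarray:
    """voxel_ts: (T, V) array of BOLD time series, one column per voxel;
    seed_ts: (T,) mean time series of a seed (e.g. a NAc or ACC mask).
    Returns the Fisher z-transformed correlation of the seed with each voxel."""
    vx = voxel_ts - voxel_ts.mean(axis=0)
    sd = seed_ts - seed_ts.mean()
    denom = np.linalg.norm(vx, axis=0) * np.linalg.norm(sd) + 1e-12
    r = (vx * sd[:, None]).sum(axis=0) / denom
    return np.arctanh(np.clip(r, -0.999999, 0.999999))  # z-maps for group stats
```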
Results
Anhedonia severity significantly improved after treatment with adjunct aripiprazole.
There was a positive correlation between anhedonia improvement and rsFC between the ACC and the posterior cingulate cortex, the ACC and the posterior praecuneus, and the NAc and the posterior praecuneus. There was a negative correlation between anhedonia improvement and rsFC between the ACC and the anterior praecuneus, and between the NAc and the anterior praecuneus.
Conclusions
Eight weeks of aripiprazole, adjunct to escitalopram, was associated with improved anhedonia symptoms. Changes in functional connectivity between key reward regions were associated with anhedonia improvement, suggesting aripiprazole may be an effective treatment for individuals experiencing reward-related deficits. Future studies are required to replicate our findings and explore their generalisability, using other agents with partial dopamine (D2) agonism and/or serotonin (5-HT2A) antagonism.
Public stigma and fear are heightened in cases of extreme violence perpetrated by persons with serious mental illness (SMI). Prevention efforts require an understanding of illness patterns and treatment needs before these events unfold.
Aims
To examine mental health service utilisation by persons who committed homicide and entered forensic care, and to investigate the adequacy of the mental healthcare they received preceding these offences.
Method
Forensic patients across two mental health hospitals in Ontario with an admitting offence of homicide between 2011 and 2021 were identified (n = 112). Sociodemographic, clinical and offence-related variables were coded from the health record and reports prepared for the forensic tribunal.
Results
Most patients (75.7%) had mental health contacts preceding the homicide, with 28.4% having a psychiatric in-patient admission in the year prior. Of those with service contacts in the year preceding the homicide, 50.9% had had only sporadic contact and 70.7% were non-adherent with prescribed medications. Victims were commonly known to the individual (35.7%) and were often family members in care-providing roles (55.4%). Examination of age at illness onset and offending patterns suggested that most persons admitted to forensic care for homicide act in the context of illness and exhibit a low frequency of pre-homicide offending.
Conclusions
Many individuals admitted to forensic care for homicide have had inadequate mental healthcare leading up to this point. Effective responses to reduce and manage risk should encompass services that proactively address illness-related (e.g. earlier access and better maintenance in care) and criminogenic (e.g. substance use treatment, employment and psychosocial supports) domains.
Despite replicated cross-sectional evidence of aberrant levels of peripheral inflammatory markers in individuals with major depressive disorder (MDD), there is limited literature on associations between inflammatory tone and response to sequential pharmacotherapies.
Objectives
To assess associations between plasma levels of pro-inflammatory markers and treatment response to escitalopram and adjunctive aripiprazole in adults with MDD.
Methods
In a 16-week open-label clinical trial, 211 participants with MDD were treated with escitalopram 10–20 mg daily for 8 weeks. Responders continued on escitalopram while non-responders received adjunctive aripiprazole 2–10 mg daily for a further 8 weeks. Plasma levels of pro-inflammatory markers – C-reactive protein, interleukin (IL)-1β, IL-6, IL-17, interferon-gamma (IFN-γ), tumour necrosis factor (TNF)-α and chemokine C–C motif ligand 2 (CCL-2) – measured at baseline and after 2, 8 and 16 weeks were included in logistic regression analyses to assess associations between inflammatory markers and treatment response.
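As a rough illustration of the modelling step described, the sketch below fits a logistic regression of week-8 response on two baseline markers. The file name, column names and the log-transformation of cytokine levels are assumptions for illustration, not details from the trial.

```python
# Illustrative sketch: response at week 8 vs baseline inflammatory markers.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("markers.csv")                         # one row per participant
X = sm.add_constant(np.log(df[["IFN_gamma", "CCL2"]]))  # cytokines are skewed
y = df["response_week8"]                                # 1 = responder, 0 = not
fit = sm.Logit(y, X).fit(disp=0)
print(np.exp(fit.params))      # odds ratios: values < 1 mean lower odds of response
print(np.exp(fit.conf_int()))  # 95% confidence intervals on the OR scale
```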
Results
Pre-treatment levels of IFN-γ and CCL-2 were significantly higher in escitalopram non-responders than in responders. Pre-treatment IFN-γ and CCL-2 levels were significantly associated with lower odds of response to escitalopram at 8 weeks. Increases in CCL-2 levels from weeks 8 to 16 in escitalopram non-responders were significantly associated with higher odds of non-response to adjunctive aripiprazole at week 16.
Conclusions
Pre-treatment levels of IFN-γ and CCL-2 were predictive of response to escitalopram. Increasing levels of these pro-inflammatory markers may predict non-response to adjunctive aripiprazole. These findings require validation in independent clinical populations.
Fontan baffle punctures and the creation of Fontan fenestrations for cardiac catheterisation procedures remain challenging, especially owing to heavy calcification of prosthetic material and complex anatomy.
Objectives:
We sought to evaluate our experience using radiofrequency current via surgical electrocautery needle for Fontan baffle puncture to facilitate diagnostic, electrophysiology, and interventional procedures.
Methods:
A retrospective chart review was performed of all Fontan patients (pts) who underwent Fontan baffle puncture using radiofrequency energy via surgical electrocautery at three centres between January 2011 and July 2021.
Results:
A total of 19 pts underwent 22 successful Fontan baffle punctures. The median age and weight were 17 years (range 3–36) and 55 kg (range 14–88), respectively. The procedural indications for Fontan fenestration creation included: diagnostic study (n = 1), atrial septostomy and stenting (n = 1), electrophysiology study and ablation procedures (n = 8), Fontan baffle stenting for Fontan failure including protein-losing enteropathy (n = 7), and occlusion of veno-venous collaterals for cyanosis (n = 2). The types of Fontan baffle included: extra-cardiac conduit (n = 12), lateral tunnel (n = 5), classic atrio-pulmonary connection (n = 1), and intra-cardiac baffle (n = 1). Fontan baffle puncture was initially attempted unsuccessfully using the traditional method in 6 pts and the Baylis radiofrequency trans-septal system in 2 pts. In all pts, Fontan baffle puncture using radiofrequency energy via an electrocautery needle was successful. The radiofrequency energy utilised was 10–50 W, requiring 1–5 attempts of 2–5 seconds each. There were no vascular or neurological complications.
Conclusions:
Radiofrequency current delivery using surgical electrocautery facilitates Fontan baffle puncture in patients with complex and calcified Fontan baffles for diagnostic, interventional, and electrophysiology procedures.
This study quantifies the effect of fertilizer and irrigation management on water use efficiency (WUE), crop growth and crop yield in sub-humid to semi-arid conditions of Limpopo Province, South Africa. An approach coupling a cropping system model (DSSAT) with an agro-hydrological model (SWAT) was developed and applied to simulate crop yield at the field and catchment scales. Simulation results indicated that the application of fertilizer has a greater positive effect on maize yield than irrigation. WUE ranged from 0.10–0.57 kg/m3 (rainfed) to 0.84–1.39 kg/m3 (irrigated) and was positively correlated with fertilizer application rate. The combination of deficit irrigation and fertilizer application (120:60 kg N:P/ha) for maize turned out to be the best option, giving the highest WUE and increasing average yield by up to 5.7 t/ha compared with rainfed cultivation without fertilization (1.3 t/ha). The simulated results at the catchment scale showed considerable spatial variability of maize yield across agricultural fields with different soils, slopes and climate conditions. The average annual simulated maize yield across the catchment corresponding to the highest WUE ranged from 4.0 to 7.0 t/ha. The yield gaps ranged from 3.0 to 6.0 t/ha under deficit irrigation combined with 120N:60P kg/ha, and from 0.2 to 1.5 t/ha when applying deficit irrigation without fertilizer. This information can support regional decision makers in identifying appropriate interventions to improve crop yield and WUE at the catchment or regional scale.
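The WUE figures are straightforward to reproduce in form: yield per unit of water consumed, with 1 mm of water depth over 1 ha equal to 10 m³/ha. The numbers in the sketch below are illustrative examples, not the study's data.

```python
# WUE arithmetic with made-up example numbers (not the study's data):
def wue_kg_per_m3(yield_t_ha: float, water_mm: float) -> float:
    """WUE = yield / water used; 1 mm depth over 1 ha = 10 m^3/ha."""
    return (yield_t_ha * 1000.0) / (water_mm * 10.0)

# e.g. 5.7 t/ha on 410 mm of seasonal water -> 1.39 kg/m3,
# which sits at the top of the irrigated range reported above.
print(round(wue_kg_per_m3(5.7, 410.0), 2))
```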
Aggressive challenging behavior in people with intellectual disability is a frequent reason for referral to secondary care services and is associated with direct harm, social exclusion, and criminal sanctions. Understanding the factors underlying aggressive challenging behavior and predictors of adverse clinical outcome is important in providing services and developing effective interventions.
Methods
This was a retrospective total-population cohort study using electronic records linked with Hospital Episode Statistics data. Participants were adults with intellectual disability accessing secondary services at a large mental healthcare provider in London, United Kingdom, between 2014 and 2018. An adverse outcome was defined as at least one of the following: admission to a mental health hospital, Mental Health Act assessment, contact with a psychiatric crisis team or attendance at an emergency department.
Results
There were 1,515 patient episodes relating to 1,225 individuals; aggressive challenging behavior was reported in 1,019 episodes. Increased episode length, younger age, psychotropic medication use, pervasive developmental disorder (PDD), more mentions of mood instability, agitation and irritability, more contact with mental health professionals, and more mentions of a social and/or home care package in-episode were all associated with increased odds of medium-high levels of aggression. Risk factors for an adverse clinical outcome among those who exhibited aggression included increased episode length, personality disorder, common mental disorder (CMD), more mentions of agitation in-episode, and contact with mental health professionals. PDD predicted a better outcome.
Conclusions
Routinely collected data confirm aggressive challenging behavior as a common concern in adults with intellectual disability who are referred for specialist support and highlight factors likely to signal an adverse outcome. Treatment targets may include optimizing management of CMDs and agitation.
To describe the evolution of respiratory antibiotic prescribing during the coronavirus disease 2019 (COVID-19) pandemic across 3 large hospitals that maintained antimicrobial stewardship services throughout the pandemic.
Design:
Retrospective interrupted time-series analysis.
Setting:
A multicenter study was conducted including medical and intensive care units (ICUs) from 3 hospitals within a Canadian epicenter for COVID-19.
Methods:
Interrupted time-series analysis was used to analyze rates of respiratory antibiotic utilization, measured in days of therapy per 1,000 patient days (DOT/1,000 PD), in medical units and ICUs. Each of the first 3 waves of the pandemic was compared to the baseline.
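A minimal version of such a model is a Poisson regression of antibiotic days of therapy on wave indicators with patient-days as an offset, as sketched below; a full interrupted time-series specification would also include baseline level and trend terms. File and variable names are assumptions.

```python
# Minimal ITS-style Poisson model (illustrative; names hypothetical).
import numpy as np
import pandas as pd
import statsmodels.api as sm

ts = pd.read_csv("resp_abx_monthly.csv")             # one row per unit-month
X = sm.add_constant(ts[["wave1", "wave2", "wave3"]])  # 0/1 wave indicators
model = sm.GLM(ts["resp_abx_dot"], X, family=sm.families.Poisson(),
               offset=np.log(ts["patient_days"] / 1000.0))
res = model.fit()
print(np.exp(res.params))   # rate ratios in DOT/1,000 PD, each wave vs baseline
```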
Results:
Within the medical units, use of respiratory antibiotics increased during the first wave of the pandemic (rate ratio [RR], 1.76; 95% CI, 1.38–2.25) but returned to the baseline in waves 2 and 3 despite more COVID-19 admissions. In the ICUs, the use of respiratory antibiotics increased in wave 1 (RR, 1.30; 95% CI, 1.16–1.46) and wave 2 (RR, 1.21; 95% CI, 1.11–1.33) and returned to the baseline in the third wave, which had the most COVID-19 admissions.
Conclusions:
After an initial surge in respiratory antibiotic prescribing, we observed the normalization of prescribing trends at 3 large hospitals throughout the COVID-19 pandemic. This trend may have been due to the timely generation of new research and guidelines developed with frontline clinicians, allowing for the active application of new research to clinical practice.
Background: Peritoneal dialysis is a type of dialysis performed by patients in their homes; patients receive training from dialysis clinic staff. Peritonitis is a serious complication of peritoneal dialysis, most commonly caused by gram-positive organisms. During March–April 2019, a dialysis provider organization transitioned ~400 patients to a different manufacturer of peritoneal dialysis equipment and supplies (from product A to product B). Shortly thereafter, patients experienced an increase in peritonitis episodes, caused predominantly by gram-negative organisms. In May 2019, we initiated an investigation to determine the source. Methods: We conducted case finding, reviewed medical records, observed peritoneal dialysis procedures and trainings, and performed patient home visits and interviews. A 1:1 matched case–control study was performed in 1 state. A case had ≥2 of the following: (1) positive peritoneal fluid culture, (2) high peritoneal fluid white cell count with ≥50% polymorphonuclear cells, or (3) cloudy peritoneal fluid and/or abdominal pain. Controls were matched to cases by week of clinic visit. Conditional logistic regression was used to estimate univariate matched odds ratios (mORs) and 95% confidence intervals (CIs). We conducted microbiological testing of peritoneal dialysis fluid bags to rule out product contamination. Results: During March–September 2019, we identified 157 cases of peritonitis across 15 clinics in 2 states (attack rate ≈39%). Staphylococcus spp (14%), Serratia spp (12%) and Klebsiella spp (6.3%) were the most common pathogens. The steps to perform peritoneal dialysis using product B differed from product A in several key areas; however, no common errors in practice were identified to explain the outbreak. Patient training on transitioning products was not standardized. Outcomes of the 73 cases in the case–control study included hospitalization (77%), peritoneal dialysis failure (40%), and death (7%). The median duration of training prior to product transition was 1 day for both cases and controls (P = .86). Transitioning to product B (mOR, 18.00; 95% CI, 2.40–134.83), using product B (mOR, 18.26; 95% CI, 3.86–∞), drain-line reuse (mOR, 4.67; 95% CI, 1.34–16.24) and performing daytime exchanges (mOR, 3.63; 95% CI, 1.71–8.45) were associated with peritonitis. After several interventions, including transition of patients back to product A (Fig. 1), overall cases declined. Sterility testing of samples from 23 unopened product B peritoneal dialysis solution bags showed no contamination. Conclusions: Multiple factors may have contributed to this large outbreak, including a rapid transition in peritoneal dialysis products and potentially inadequate patient training. Efforts are needed to identify and incorporate best training practices, and product advances are desired to improve the safety of patient transitions between different types of peritoneal dialysis equipment.
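For illustration, the univariate matched odds ratios described can be estimated with conditional logistic regression grouped by matched pair, as in the sketch below; the exact inference that produced the infinite upper confidence bound above is not reproduced here, and all names are hypothetical.

```python
# Univariate matched case-control analysis (illustrative; names hypothetical).
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

df = pd.read_csv("pd_case_control.csv")  # columns: case (0/1), pair_id, exposures
res = ConditionalLogit(df["case"], df[["drain_line_reuse"]],
                       groups=df["pair_id"]).fit()
print(np.exp(res.params))      # matched odds ratio for the exposure
print(np.exp(res.conf_int()))  # 95% CI on the OR scale
```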
To compare long-term survival of Parkinson’s disease (PD) patients with deep brain stimulation (DBS) to matched controls, and examine whether DBS was associated with differences in injurious falls, long-term care, and home care.
Methods:
Using administrative health data (Ontario, Canada), we examined DBS outcomes within a cohort of individuals diagnosed with PD between 1997 and 2012. Patients receiving DBS were matched with non-DBS controls by age, sex, PD diagnosis date, time with PD, and a propensity score. Survival between groups was compared using the log-rank test and marginal Cox proportional hazards regression. Cumulative incidence function curves and marginal subdistribution hazard models were used to assess effects of DBS on falls, long-term care admission, and home care use, with death as a competing risk.
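The survival and competing-risks pieces of this design can be sketched as follows: the log-rank test matches the method named above, while the Aalen–Johansen fitter stands in for the cumulative incidence function step (the marginal subdistribution hazard models themselves are not reproduced). Column names and event codes are assumptions.

```python
# Survival and competing-risks sketch (column names and event codes assumed).
import pandas as pd
from lifelines import AalenJohansenFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("dbs_cohort.csv")
dbs, ctl = df[df["dbs"] == 1], df[df["dbs"] == 0]

# Log-rank comparison of survival between DBS recipients and matched controls:
lr = logrank_test(dbs["years"], ctl["years"],
                  event_observed_A=dbs["died"], event_observed_B=ctl["died"])
print(lr.p_value)

# Cumulative incidence of first injurious fall, treating death as competing
# (event codes: 0 = censored, 1 = fall, 2 = death):
ajf = AalenJohansenFitter()
ajf.fit(dbs["years_to_fall"], dbs["fall_event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())
```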
Results:
There were 260 DBS recipients matched with 551 controls. Patients undergoing DBS did not experience a significant survival advantage compared to controls (log-rank test p = 0.50; HR: 0.89, 95% CI: 0.65–1.22). Among patients <65 years of age, DBS recipients had a significantly reduced risk of death (HR: 0.49, 95% CI: 0.28–0.84). Patients receiving DBS were more likely than controls to receive care for falls (HR: 1.56, 95% CI: 1.19–2.05) and home care (HR: 1.59, 95% CI: 1.32–1.90), while long-term care admission was similar between groups.
Conclusions:
DBS may improve survival for younger PD patients. Future studies should examine whether survival benefits are attributable to effects on PD itself or to the absence of comorbidities that influence mortality.
Echinoderms make up a substantial component of Ordovician marine invertebrates, yet their speciation and dispersal history, as inferred within a rigorous phylogenetic and statistical framework, is lacking. We use biogeographic stochastic mapping (BSM; implemented in the R package BioGeoBEARS) to infer ancestral area relationships and the number and type of dispersal events through the Ordovician for diploporan blastozoans and related species. The BSM analysis was divided into three time slices to analyze how dispersal paths changed before and during the great Ordovician biodiversification event (GOBE) and within the Late Ordovician mass extinction intervals. The best-fit biogeographic model incorporated jump dispersal, indicating this was an important speciation strategy. Reconstructed areas within the phylogeny indicate the first diploporan blastozoans likely originated within Baltica or Gondwana. Dispersal, jump dispersal, and sympatry dominated the BSM inference through the Ordovician, while dispersal paths varied in time. Long-distance dispersal events in the Early Ordovician indicate distance was not a significant predictor of dispersal, whereas increased dispersal events between Baltica and Laurentia are apparent during the GOBE, indicating these areas were important to blastozoan speciation. During the Late Ordovician, there was an increase in dispersal events among all paleocontinents. The drivers of dispersal are attributed to oceanic and epicontinental currents. Speciation events plotted against geochemical data indicate that blastozoans may not have responded to climate cooling events and other geochemical perturbations, but additional data will continue to shed light on the drivers of early Paleozoic blastozoan speciation and dispersal patterns.
Background: Unintentional opioid overdoses in and around acute care hospitals, including in the ED, are of increasing concern. In April 2018, the Addiction Recovery and Community Health (ARCH) Team at the Royal Alexandra Hospital opened the first acute care Supervised Consumption Service (SCS) in North America available to inpatients. In the SCS, patients can consume substances by injection, oral or intranasal routes under nursing supervision; immediate assistance is provided if an overdose occurs. After a quality assurance review, work began to expand SCS access to ED patients as well. Aim Statement: By expanding SCS access to ED patients, we aim to reduce unintentional and unwitnessed opioid overdoses in registered ED patients to 0 per month by the end of 2020. Measures & Design: Between June 13-July 15, 2019, ARCH ED Registered Nurses were asked to identify ED patients with a history of active substance use who may potentially require SCS access. Nurses identified 69 patients over 43 8-hour shifts (range 0-4 patients per shift); thus, we anticipated an average of 5 ED patients per 24-hour period to potentially require SCS access. Based on this evidence of need, ARCH leadership worked with a) hospital legal team and Health Canada to expand SCS access to ED patients; b) ED leadership to develop a procedure and flowchart for ED SCS access. ED patients were able to access the SCS effective October 1, 2019. Evaluation/Results: From October 1 to December 1, 2019, the SCS had 35 visits by 23 unique ED patients. The median time spent in the SCS was 42.5 minutes (range 14.0-140.0 minutes). Methamphetamine was the most commonly used substance (19, 45.2%), followed by fentanyl (10, 23.8%); substances were all injected (91.4% into a vein and 8.6% into an existing IV). In this time period, there were zero unintentional, unwitnessed opioid poisonings in registered ED patients. Data collection is ongoing and will expand to include chief complaint, ED length of stay and discharge status. Discussion/Impact: Being able to reduce unintentional overdoses and unwitnessed injection drug use in the ED has the potential to improve both patient and staff safety. Next steps include a case series designed to examine the impact of SCS access on emergency care, retention in treatment and uptake into addiction treatment.
Recent botanical explorations in the province of Palawan, Philippines, have resulted in the discovery of two new ginger species, namely Boesenbergia eburnea Docot and Boesenbergia leonardocoi Funak. & Docot, which are described and illustrated here, including information on their distribution, habitat, phenology, ecology and conservation status. Additionally, a key to Boesenbergia species in the Philippines is provided.
To examine whether sociodemographic characteristics and health care utilization are associated with receiving deep brain stimulation (DBS) surgery for Parkinson’s disease (PD) in Ontario, Canada.
Methods:
Using health administrative data, we identified a cohort of individuals aged 40 years or older diagnosed with incident PD between 1995 and 2009. A case-control study was used to examine whether select factors were associated with DBS for PD. Patients were classified as cases if they underwent DBS surgery at any point from 1 year after cohort entry until December 31, 2016. Conditional logistic regression modeling was used to estimate the adjusted odds of DBS surgery for sociodemographic and health care utilization indicators.
Results:
A total of 46,237 individuals with PD were identified, with 543 (1.2%) receiving DBS surgery. Individuals residing in northern Ontario were more likely than southern patients to receive DBS surgery [adjusted odds ratio (AOR) = 2.23, 95% confidence interval (CI) = 1.15–4.34]; however, regional variations were not observed after accounting for medication use among older adults (AOR = 1.04, 95% CI = 0.26–4.21). Patients living in neighborhoods with the highest concentration of visible minorities were less likely to receive DBS surgery compared to patients living in predominantly white neighborhoods (AOR = 0.27, 95% CI = 0.16–0.46). Regular neurologist care and use of multiple PD medications were positively associated with DBS surgery.
Conclusions:
Variations in use of DBS may reflect differences in access to care, specialist referral pathways, health-seeking behavior, or need for DBS. Future studies are needed to understand drivers of potential disparities in DBS use.
To examine the effectiveness of an internet-based therapy (IBT) for bulimia nervosa (BN) compared with a brief psychoeducational group therapy (PET) and a waiting list (WL).
Method:
Participants were 93 female BN patients diagnosed according to DSM-IV criteria. An experimental group (31 IBT patients) was compared with two further groups (31 PET and 31 WL patients). The PET and WL groups were matched to the IBT group in terms of age, disorder duration, previous treatments and severity. All patients completed assessment before and after treatment.
Results:
In the IBT group, mean scores were lower at the end of treatment on some EDI scales and on the BITE symptoms scale, while mean BMI was higher at post-therapy. The main predictors of good IBT outcome were higher scores on EDI perfectionism and on reward dependence. Drop-out was related to higher SCL obsessive/compulsive (p = 0.045) and novelty-seeking (p = 0.044) scores and to lower reward dependence (p = 0.018). At the end of treatment, bingeing and vomiting abstinence rates (22.6% for IBT, 33.3% for PET, and 0.0% for WL; p = 0.003) and drop-out rates (35.5% IBT, 12.9% PET and 0% WL; p = 0.001) differed significantly between groups. While a direct comparison of the two treatments (IBT and PET) showed no significant difference in success proportions (p = 0.375), a significant difference in drop-out rates (p = 0.038) was found.
Conclusions:
The results of this study suggest that an online self-help approach is a valid treatment option for BN, especially for people who present with lower severity of eating disorder (ED) symptomatology and certain specific personality traits.
The existing literature on chronic pain points to the effects of anxiety sensitivity, pain hypervigilance, and pain catastrophizing on pain-related fear; however, the nature of these relationships remains unclear. The three dispositional factors may affect one another in predicting pain adjustment outcomes. In particular, adding one disposition may increase the association between another disposition and outcomes, a phenomenon known in statistical terms as a suppressor effect.
Objective
This study examined the possible statistical suppressor effects of anxiety sensitivity, pain hypervigilance and pain catastrophizing in predicting pain-related fear and adjustment outcomes (disability and depression).
Methods
Chinese patients with chronic musculoskeletal pain (n = 401) completed a battery of assessments on pain intensity, depression, anxiety sensitivity, pain vigilance, pain catastrophizing, and pain-related fear. Multiple regression analyses assessed the mediating/moderating role of pain hypervigilance. Structural equation modeling (SEM) was used to evaluate suppression effects.
Results
Our results showed that pain hypervigilance mediated the effects of anxiety sensitivity (Model 1: Sobel z = 4.86) and pain catastrophizing (Model 3: Sobel z = 5.08) on pain-related fear. A net suppression effect of pain catastrophizing on anxiety sensitivity was found in SEM when both anxiety sensitivity and pain catastrophizing were included in the same full model to predict disability (Model 9: CFI = 0.95) and depression (Model 10: CFI = 0.93) (all P < 0.001) (Figs. 1–4).
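The Sobel z-values quoted above come from a standard formula, shown here for reference; its inputs are the two regression paths of the mediation model and their standard errors.

```python
# The Sobel test in one function: a is the predictor -> mediator path,
# b the mediator -> outcome path (controlling for the predictor),
# se_a and se_b their standard errors.
import math

def sobel_z(a: float, se_a: float, b: float, se_b: float) -> float:
    return (a * b) / math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
```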
Conclusions
Our findings showed that pain hypervigilance mediated the relationships of two dispositional factors, pain catastrophic cognition and anxiety sensitivity, with pain-related fear. The net suppression effects of pain catastrophizing suggest that anxiety sensitivity enhanced the effect of pain catastrophic cognition on pain hypervigilance.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
A body of evidence has accrued supporting the Fear-Avoidance Model (FAM) of chronic pain, which postulates that pain-related fear mediates the relationships between pain catastrophizing and pain anxiety in affecting pain-related outcomes. Yet relatively little data address the extent to which the FAM can be extended to understand chronic pain in the Chinese population and its impact on quality of life (QoL).
Objective
This study explored the relationships between FAM components and their effects on QoL in a Chinese sample.
Methods
A total of 401 Chinese patients with chronic musculoskeletal pain completed measures of three core FAM components (pain catastrophizing, pain-related fear, and pain anxiety) and QoL. Cross-sectional structural equation modeling (SEM) assessed the goodness of fit of the FAM for two QoL outcomes, Physical (Model 1) and Mental (Model 2). In both models, pain catastrophizing was hypothesized to underpin pain-related fear, thereby influencing pain anxiety and subsequently QoL outcomes.
Results
Results of SEM showed adequate data–model fit (CFI ≥ 0.90) for the two models tested (Model 1: CFI = 0.93; Model 2: CFI = 0.94). Specifically, pain catastrophizing significantly predicted pain-related fear (Model 1: stdβ = 0.90; Model 2: stdβ = 0.91), which in turn significantly predicted pain anxiety (Model 1: stdβ = 0.92; Model 2: stdβ = 0.929) and QoL outcomes in a negative direction (Model 1: stdβ = −0.391; Model 2: stdβ = −0.651) (all P < 0.001) (Table 1, Fig. 1).
Conclusion
Our data substantiated the existing FAM literature and offered evidence for the cross-cultural validity of the FAM in the Chinese population with chronic pain.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Low socioeconomic status (SES) has been established as a risk factor for poor mental health; however, the relationship between SES and mental health problems can be confounded by genetic and environmental factors in standard regression analyses and observational studies of unrelated individuals. In this study, we used a within-pair twin design to control for unmeasured genetic and environmental confounders in investigating the association between SES and psychological distress. We also employed within–between pair regression analysis to assess whether the association was consistent with causality. SES was measured using the Index of Relative Socio-economic Disadvantage (IRSD), income and the Australian Socioeconomic Index 2006 (AUSEI06); psychological distress was measured using the Kessler 6 Psychological Distress Scale (K6). Data were obtained from Twins Research Australia’s Health and Lifestyle Questionnaire (2014–2017), providing a maximum sample size of 1395 pairs. Twins with higher AUSEI06 scores had significantly lower K6 scores than their co-twins after controlling for shared genetic and environmental traits (βW [within-pair regression coefficient] = −0.012 units, p = .006). Twins with higher income had significantly lower K6 scores than their co-twins after controlling for familial confounders (βW = −0.182 units, p = .002). There was no evidence of an association between the IRSD and K6 scores within pairs (βW, p = .6). Using a twin design to eliminate the effect of potential confounders, these findings further support the association between low SES and poor mental health, reinforcing the need to address social determinants of poor mental health, in addition to interventions targeted to individuals.
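A within-between pair regression of this kind can be sketched as follows: each twin's SES measure is split into the pair mean (between-pair component) and the deviation from that mean (within-pair component), so the within-pair coefficient βW is free of confounders shared by the pair. The sketch below is illustrative only; file and variable names are assumptions.

```python
# Within-between pair decomposition (illustrative; names hypothetical).
import pandas as pd
import statsmodels.formula.api as smf

tw = pd.read_csv("twin_pairs.csv")    # one row per twin: pair_id, ses, k6
tw["ses_between"] = tw.groupby("pair_id")["ses"].transform("mean")
tw["ses_within"] = tw["ses"] - tw["ses_between"]   # deviation from pair mean

# Random intercept per pair; the ses_within coefficient is the betaW analogue:
res = smf.mixedlm("k6 ~ ses_within + ses_between", tw,
                  groups=tw["pair_id"]).fit()
print(res.params[["ses_within", "ses_between"]])
```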
Global inequity in access to and availability of essential mental health services is well recognized. The mental health treatment gap is approximately 50% in all countries, with up to 90% of people in the lowest-income countries lacking access to required mental health services. Increased investment in global mental health (GMH) has spurred innovation in mental health service delivery in low- and middle-income countries (LMICs). Situational analyses in areas where mental health services and systems are poorly developed and resourced are essential when planning for research and implementation; however, little guidance is available to inform methodological approaches to conducting these types of studies. This scoping review provides an analysis of methodological approaches to situational analysis in GMH, including an assessment of the extent to which situational analyses incorporate equity in their study designs. It is intended as a resource that identifies current gaps and areas for future development in GMH. Formative research, including situational analysis, is an essential first step in conducting robust implementation research, an essential area of study in GMH that will help to promote improved availability of, access to, and reach of mental health services for people living with mental illness in LMICs. While strong leadership in this field exists, there remain significant opportunities for enhanced research representing different LMICs and regions.