Inappropriate diagnosis and treatment of urinary tract infections (UTIs) contribute to antibiotic overuse. The Inappropriate Diagnosis of UTI (ID-UTI) measure uses a standard definition of asymptomatic bacteriuria (ASB) and was validated in large hospitals. Critical access hospitals (CAHs) have different resources, which may make ASB stewardship challenging. To address this inequity, we adapted the ID-UTI metric for use in CAHs and assessed the adapted measure’s feasibility, validity, and reliability.
Design:
Retrospective observational study
Participants:
10 CAHs
Methods:
From October 2022 to July 2023, CAHs submitted clinical information for adults admitted or discharged from the emergency department who received antibiotics for a positive urine culture. Feasibility of case submission was assessed as the number of CAHs achieving the goal of 59 cases. Validity (sensitivity/specificity) and reliability of the ID-UTI definition were assessed by dual-physician review of a random sample of submitted cases.
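As a worked illustration of the validity and reliability calculations described above, the sketch below computes sensitivity, specificity, and percent agreement from dual-review labels; the data layout and all names are our hypothetical assumptions, not the study's code.

```python
# Sketch of the validity/reliability arithmetic described above.
# Labels are hypothetical booleans; True = case classified as ASB.

def validity_stats(metric_asb, physician_asb):
    """Sensitivity and specificity of the ID-UTI metric vs physician review."""
    pairs = list(zip(metric_asb, physician_asb))
    tp = sum(m and p for m, p in pairs)
    fn = sum(not m and p for m, p in pairs)
    tn = sum(not m and not p for m, p in pairs)
    fp = sum(m and not p for m, p in pairs)
    sensitivity = tp / (tp + fn)  # share of true ASB cases the metric flags
    specificity = tn / (tn + fp)  # share of non-ASB cases the metric clears
    return sensitivity, specificity

def percent_agreement(reviewer_a, reviewer_b):
    """Reliability as simple percent agreement between two reviewers."""
    return sum(a == b for a, b in zip(reviewer_a, reviewer_b)) / len(reviewer_a)
```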
Results:
Among 10 CAHs able to participate throughout the study period, only 40% (4/10) submitted >59 cases (the goal); an additional 3 submitted >35 cases (a secondary goal). Per the ID-UTI metric, 28% (16/58) of cases were ASB. Compared with physician review, the ID-UTI metric had 100% specificity (i.e., all cases called ASB were ASB on clinical review) but poor sensitivity (48.5%; i.e., it did not identify all ASB cases). Measure reliability was high (93% [54/58] agreement).
Conclusions:
Similar to measure performance in non-CAHs, the ID-UTI measure had high reliability and specificity—all cases identified as ASB were considered ASB—but poor sensitivity. Though feasible for a subset of CAHs, barriers remain.
Asymptomatic bacteriuria (ASB) treatment is a common form of antibiotic overuse and diagnostic error. Antibiotic stewardship using the inappropriate diagnosis of urinary tract infection (ID-UTI) measure has reduced ASB treatment in diverse hospitals. However, critical access hospitals (CAHs) have differing resources that could impede stewardship. We aimed to determine if stewardship including the ID-UTI measure could reduce ASB treatment in CAHs.
Methods:
From October 2022 to July 2023, ten CAHs participated in an Intensive Quality Improvement Cohort (IQIC) program including 3 interventions to reduce ASB treatment: 1) learning labs (ie, didactics with shared learning), 2) mentoring, and 3) data-driven performance reports including hospital peer comparison based on the ID-UTI measure. To assess effectiveness of the IQIC program, change in the ID-UTI measure (ie, percentage of patients treated for a UTI who had ASB) was compared to two non-equivalent control outcomes (antibiotic duration and unjustified fluoroquinolone use).
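The trend test behind the reported aOR per month can be sketched as a logistic regression of ASB status on study month with hospital-clustered standard errors; the simulated data and column names below are our assumptions, and the published model may include additional adjustment terms.

```python
# Sketch: monthly trend in the ID-UTI measure (hypothetical data/columns).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "asb": rng.binomial(1, 0.25, 608),     # 1 = treated patient had ASB
    "month": rng.integers(0, 10, 608),     # study month 0..9
    "hospital": rng.integers(0, 10, 608),  # CAH identifier
})

# Logistic regression with standard errors clustered by hospital.
fit = smf.logit("asb ~ month", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["hospital"]})
print(np.exp(fit.params["month"]))  # odds ratio per month
```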
Results:
Ten CAHs abstracted a total of 608 positive urine culture cases. Over the cohort period, the percentage of patients treated for a UTI who had ASB declined (aOR per month = 0.935, 95% CI: 0.873, 1.001, P = 0.055) from 28.4% (range across hospitals, 0%-63%) in the first to 18.6% (range, 0%-33%) in the final month. In contrast, antibiotic duration and unjustified fluoroquinolone use were unchanged (P = 0.768 and 0.567, respectively).
Conclusions:
The IQIC intervention, including learning labs, mentoring, and performance reports using the ID-UTI measure, was associated with a non-significant decrease in treatment of ASB, while control outcomes (duration and unjustified fluoroquinolone use) did not change.
Auditory verbal hallucinations (AVHs) in schizophrenia have been suggested to arise from failure of corollary discharge mechanisms to correctly predict and suppress self-initiated inner speech. However, it is unclear whether such dysfunction is related to motor preparation of inner speech, during which sensorimotor predictions are formed. The contingent negative variation (CNV) is a slow negative event-related potential that occurs prior to executing an action. A recent meta-analysis revealed a large effect for CNV blunting in schizophrenia. Given that inner speech, like overt speech, has been shown to be preceded by a CNV, the present study tested the notion that AVHs are associated with inner speech-specific motor preparation deficits.
Objectives
The present study aimed to provide a useful framework for directly testing the long-held idea that AVHs may be related to inner speech-specific CNV blunting in patients with schizophrenia. This may hold promise for a reliable biomarker of AVHs.
Methods
Hallucinating (n=52) and non-hallucinating (n=45) patients with schizophrenia, along with matched healthy controls (n=42), participated in a novel electroencephalographic (EEG) paradigm. In the Active condition, participants were asked to imagine a single phoneme at a cue while an auditory probe was presented at precisely the same moment. In the Passive condition, they passively listened to the auditory probes. The amplitude of the CNV preceding the production of inner speech was examined.
Results
Healthy controls showed a larger CNV amplitude (p = .002, d = .50) in the Active compared with the Passive condition, replicating previous reports of a CNV preceding inner speech. However, neither patient group showed a difference between the two conditions (p > .05). Importantly, a repeated-measures ANOVA revealed a significant interaction effect (p = .007, ηp2 = .05). Follow-up contrasts showed that healthy controls exhibited a larger CNV amplitude in the Active condition than both the hallucinating (p = .013, d = .52) and non-hallucinating patients (p < .001, d = .88). No difference was found between the two patient groups (p = .320, d = .20).
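The Group x Condition interaction reported here corresponds to a mixed-design ANOVA (between-subjects factor: group; within-subjects factor: condition). A minimal sketch with simulated data using the pingouin package; all names and sample sizes below are ours, not the study's.

```python
# Sketch: mixed-design ANOVA for a Group x Condition interaction.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)
n = 30  # simulated subjects, not the study's sample sizes
long_df = pd.DataFrame({
    "subject": np.repeat(np.arange(n), 2),
    "condition": ["Active", "Passive"] * n,
    "group": np.repeat(rng.choice(["HC", "AVH", "nonAVH"], n), 2),
    "cnv": rng.normal(size=2 * n),  # CNV amplitude (arbitrary units)
})

aov = pg.mixed_anova(data=long_df, dv="cnv", within="condition",
                     between="group", subject="subject")
print(aov[["Source", "F", "p-unc", "np2"]])
```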
Conclusions
The results indicate that motor preparation of inner speech is disrupted in schizophrenia. While the production of inner speech resulted in a larger CNV than passive listening in healthy controls, indicative of the involvement of motor planning, patients exhibited markedly blunted motor preparatory activity to inner speech. This may reflect dysfunction in the formation of corollary discharges. Interestingly, the deficits did not differ between hallucinating and non-hallucinating patients. Future work is needed to elucidate how specifically these motor preparation deficits for inner speech relate to AVHs. Overall, this study provides evidence of atypical inner speech monitoring in schizophrenia.
Adolescents with depression show distinct affective reactions to daily events, but current findings are conflicting. The emotional context insensitivity theory suggests blunted reactivity in depression, whereas the negative potentiation and mood brightening hypotheses suggest otherwise. While nonlinear associations between depression severity and affective reactivity have been observed, studies including a separate subclinical group remain rare. Subthreshold depression (SD), defined by two to four symptoms lasting for two weeks or more, provides a dimensional view of the underpinnings of affective reactivity. In this study, we compared positive affect (PA) and negative affect (NA) reactivity to positive and negative daily events (uplifts and stress) among adolescents with Major Depressive Disorder (MDD), SD, and healthy controls (HC) using experience sampling methods (ESM).
Objectives
We hypothesized a stepped difference in affective reactivity along the depression spectrum: the MDD group would show the strongest PA and NA reactivity to uplifts and stress, followed by the SD and HC groups.
Methods
Three groups (MDD, SD, and HC) of adolescents were recruited from an epidemiologic sample entitled ‘Hong Kong Child and Adolescent Psychiatric Epidemiologic Survey: Age 6 to 17’. Group status was determined by the Diagnostic Interview Schedule for Children Version 5. Participants completed an experience sampling diary on their smartphones for 14 consecutive days, with 5-10 entries per day. Momentary levels of PA (happy, relaxed, contented), NA (irritated, low, nervous), and uplifts and stress experienced before each entry were measured on a 7-point (1-7) Likert scale.
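Reactivity analyses of this kind are typically multilevel models with ESM entries nested within participants. The sketch below simulates such data and fits a random-intercept model with Group x Uplift and Group x Stress interactions; all column names and effect sizes are our hypothetical assumptions.

```python
# Sketch: multilevel model of momentary PA on uplifts/stress by group.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_sub, n_obs = 40, 30
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n_sub), n_obs),
    "group": np.repeat(rng.choice(["HC", "SD", "MDD"], n_sub), n_obs),
    "uplift": rng.integers(1, 8, n_sub * n_obs),  # 1..7 Likert
    "stress": rng.integers(1, 8, n_sub * n_obs),
})
df["pa"] = 4 + 0.3 * df["uplift"] - 0.2 * df["stress"] + rng.normal(0, 1, len(df))

# Random intercept per participant; interaction terms test reactivity
# differences between groups.
m = smf.mixedlm("pa ~ uplift * group + stress * group", df,
                groups=df["pid"]).fit()
print(m.summary())
```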
Results
The sample consisted of 19 adolescents with MDD, 30 with SD, and 59 HC. The M:F ratio was 17:19. The age range was 12-18, with a mean of 14.8. The overall ESM completion rate was 46%. The MDD group had the highest levels of stress and NA and the lowest levels of uplifts and PA, followed by the SD and HC groups, respectively (p<0.01). Across groups, levels of PA were positively associated with uplifts and negatively associated with stress, whereas levels of NA were positively associated with stress and negatively associated with uplifts. The Group x Uplift interaction effect on PA was significant, with greater PA reactivity in SD (p<0.01) and MDD (p=0.07) when compared with HC. The Group x Uplift interaction effect on NA was significant, with greater NA reactivity in SD than HC (p<0.01). The Group x Stress interaction effect on PA was significant, with greater PA reactivity in SD than HC (p<0.01) and MDD (p<0.01). The Group x Stress interaction effect on NA was non-significant.
Conclusions
Contrary to our hypothesis, adolescents with SD experienced the strongest PA and NA reactivity to uplifts and the strongest PA reactivity to stress. This provides evidence for a nonlinear relationship between severity of depression and affective reactivity.
Although the link between alcohol involvement and behavioral phenotypes (e.g. impulsivity, negative affect, executive function [EF]) is well-established, the directionality of these associations, specificity to stages of alcohol involvement, and extent of shared genetic liability remain unclear. We estimate longitudinal associations between transitions among alcohol milestones, behavioral phenotypes, and indices of genetic risk.
Methods
Data came from the Collaborative Study on the Genetics of Alcoholism (n = 3681; ages 11–36). Alcohol transitions (first: drink, intoxication, alcohol use disorder [AUD] symptom, AUD diagnosis), internalizing, and externalizing phenotypes came from the Semi-Structured Assessment for the Genetics of Alcoholism. EF was measured with the Tower of London and Visual Span Tasks. Polygenic scores (PGS) were computed for alcohol-related and behavioral phenotypes. Cox models estimated associations among PGS, behavior, and alcohol milestones.
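As a hedged illustration of the survival modelling named above, the sketch below fits a Cox proportional hazards model for one milestone with a polygenic score as a predictor, using the lifelines package; the data and column names are simulated stand-ins rather than COGA variables.

```python
# Sketch: Cox model of time to an alcohol milestone on a polygenic score.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "age_first_drink": 11 + rng.exponential(5, n),  # duration (years)
    "event": rng.binomial(1, 0.8, n),               # 1 = milestone observed
    "pgs_dpw": rng.normal(size=n),                  # drinks-per-week PGS
    "sex": rng.binomial(1, 0.5, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="age_first_drink", event_col="event")
print(cph.summary[["exp(coef)", "p"]])  # exp(coef) = hazard ratio (HR)
```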
Results
Externalizing phenotypes (e.g. conduct disorder symptoms) were associated with future initiation and drinking problems (hazard ratio (HR)⩾1.16). Internalizing (e.g. social anxiety) was associated with hazards for progression from first drink to severe AUD (HR⩾1.55). Initiation and AUD were associated with increased hazards for later depressive symptoms and suicidal ideation (HR⩾1.38), and initiation was associated with increased hazards for future conduct symptoms (HR = 1.60). EF was not associated with alcohol transitions. Drinks per week PGS was linked with increased hazards for alcohol transitions (HR⩾1.06). Problematic alcohol use PGS increased hazards for suicidal ideation (HR = 1.20).
Conclusions
Behavioral markers of addiction vulnerability precede and follow alcohol transitions, highlighting dynamic, bidirectional relationships between behavior and emerging addiction.
The prevalence of childhood obesity is increasing globally(1). While BMI is commonly used to define obesity, it is unable to differentiate between fat and muscle mass, leading to calls to measure body composition specifically(2). While several tools are available to assess body composition in infancy, it is unclear if they are directly comparable. Among a subset of healthy infants born to mothers participating in a randomised controlled trial of a preconception and antenatal nutritional supplement(3), measurements were made at ages 6 weeks (n = 58) and 6 months (n = 70) using air displacement plethysmography (ADP), whole-body dual-energy X-ray absorptiometry (DXA), and bioelectrical impedance spectroscopy (BIS). Estimates of percentage fat mass (%FM) were compared using Cohen’s kappa statistic (κ) and Bland-Altman analysis(4,5). Agreement ranged from none to weak when comparing tertiles of %FM (κ = 0.15–0.59). When comparing absolute values, the bias (i.e., mean difference) was smallest when comparing BIS to ADP at 6 weeks (+1.7%). A similar bias was observed at 6 months when comparing DXA to ADP (+1.8%). However, when comparing BIS to DXA at both ages, biases were much larger (+7.6% and +4.7% at 6 weeks and 6 months, respectively). Furthermore, there was wide interindividual variance (limits of agreement [LOA], i.e., ±1.96 SD) for each comparison. At 6 weeks, LOA ranged from ±4.8% to ±6.5% (BIS vs. DXA and BIS vs. ADP, respectively). At 6 months, LOA were even wider, ranging from ±7.3% to ±8.1% (DXA vs. ADP and BIS vs. DXA, respectively). Proportional biases were apparent when comparing BIS to the other tools at both ages, with BIS generally overestimating %FM more among infants with low adiposity. In addition to differences according to tool type, within-tool factors affected body composition estimates. For ADP measurements, the choice of FFM density reference (Fomon vs. Butte) had minimal impact; however, the choice of DXA software version (GE Lunar enCORE basic vs. enhanced) and BIS analysis approach (empirical equation vs. mixture theory prediction) led to very different estimates of body composition. In conclusion, there was limited agreement between three commonly used body composition assessment tools in infancy. Researchers and clinicians must therefore be cautious when conducting longitudinal analyses or comparing findings across studies, as estimates are not comparable across tools.
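The two agreement analyses named above can be sketched briefly: Cohen's kappa on tool-specific %FM tertiles, and a Bland-Altman bias with 95% limits of agreement on absolute %FM. The paired measurements below are simulated and the names are ours.

```python
# Sketch: agreement statistics for paired %FM estimates from two tools.
import numpy as np
from sklearn.metrics import cohen_kappa_score

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement."""
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

def tertile(x):
    """Assign each value to a tertile (0, 1, 2) within its own tool."""
    return np.digitize(x, np.quantile(x, [1 / 3, 2 / 3]))

rng = np.random.default_rng(3)
fm_adp = rng.normal(20, 4, 70)               # %FM by ADP (simulated)
fm_bis = fm_adp + rng.normal(1.7, 3.0, 70)   # %FM by BIS, biased upward

bias, loa = bland_altman(fm_bis, fm_adp)
kappa = cohen_kappa_score(tertile(fm_adp), tertile(fm_bis))
print(f"bias={bias:.1f}%  LOA={loa[0]:.1f}% to {loa[1]:.1f}%  kappa={kappa:.2f}")
```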
Previous studies suggest that influenza virus infection may provide temporary non-specific immunity and hence lower the risk of non-influenza respiratory virus infection. In a randomized controlled trial of influenza vaccination, 1330 children were followed up in 2009–2011. Respiratory swabs were collected when they reported acute respiratory illness and tested for influenza and other respiratory viruses. We used Poisson regression to compare the incidence of non-influenza respiratory virus infection before and after influenza virus infection. Based on 52 children with influenza B virus infection, the incidence rate ratio (IRR) of non-influenza respiratory virus infection after influenza virus infection was 0.47 (95% confidence interval: 0.27–0.82) compared with before infection. Simulation suggested that this IRR would have been 0.87 if the temporary protection did not exist. We identified a decreased risk of non-influenza respiratory virus infection after influenza B virus infection in children. Further investigation is needed to determine if this decreased risk could be attributed to temporary non-specific immunity acquired from influenza virus infection.
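The before/after incidence comparison above is a Poisson rate model with person-time as an offset. A minimal sketch with hypothetical aggregate counts (the actual analysis was at the child level with more structure):

```python
# Sketch: incidence rate ratio before vs after influenza infection.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "infections": [14, 5],        # non-influenza infections (hypothetical)
    "person_days": [4000, 3000],  # follow-up time in each period
    "after_flu": [0, 1],          # 0 = before, 1 = after influenza
})

m = smf.poisson("infections ~ after_flu", data=df,
                offset=np.log(df["person_days"])).fit()
print(np.exp(m.params["after_flu"]))  # incidence rate ratio (IRR)
```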
Persons newly diagnosed with dementia and their family members often experience uncertainty and inadequate support. This study aimed to evaluate the effect of a post-diagnostic support programme, guided by the 5 Pillars Model proposed by Alzheimer Scotland, on self-efficacy among persons with early dementia and their family members.
Methods:
A prospective cohort study was conducted between 2019 and 2022. Subject recruitment was conducted in four non-government organizations. A multi-domain empowerment programme, covering dementia knowledge, management skills, peer support, future decision-making, and community resources, was developed. The programme was provided to people newly diagnosed with early dementia in a small-group format over 2 months and to family members individually through an eLearning platform over 9 months. Self-efficacy in dementia management of people with dementia and their family members was measured using the Chronic Disease Self-efficacy Scale and the Caregiver Self-efficacy Scale (CSES), respectively, whereas caregiving burden was measured using the Zarit Burden Interview (ZBI). Study outcomes were measured at baseline, immediately post-intervention, and 6 months post-intervention. Paired t-tests were performed to detect within-subject changes over time.
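The within-subject tests named above are standard paired comparisons. A minimal sketch with simulated scores (the scale, effect size, and variable names are our assumptions):

```python
# Sketch: paired t-test of baseline vs follow-up self-efficacy scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
baseline = rng.normal(6.0, 1.5, 151)             # baseline scores
followup = baseline + rng.normal(0.3, 1.0, 151)  # 6 months post-intervention

t, p = stats.ttest_rel(followup, baseline)
print(f"t = {t:.2f}, p = {p:.3f}")
```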
Results:
A total of 151 persons with early dementia and 294 family caregivers completed assessments at baseline and follow-up. Self-efficacy in dementia management reported by persons with dementia at 6-month post-intervention was significantly higher than at baseline (p = .021) and at immediate post-intervention (i.e., 2-month follow-up) (p = .006). Family members reported a significantly higher CSES score (p < .001) and subscale scores for thoughts (p = .001) and disruptive behaviour management (p = .001) at the 9-month follow-up, but a significant reduction in caregiving burden (p < .001) was noted only among those who perceived higher burden than the local norms at baseline (ZBI score ≥ 25, n = 110).
Discussion:
This study provides empirical evidence that post-diagnostic support can empower persons with early dementia and their family members in adapting to the impacts of dementia. Further study examining the longer-term effects on care outcomes and health service utilisation would be valuable.
People with dementia are more prone to premature nursing home placement after hospitalization due to physical and mental deconditioning, which makes care at home more difficult. This study aimed to evaluate the effect of a post-hospital-discharge transitional care program on reducing nursing home placement in people with dementia.
Methods:
A matched case-control study was conducted between 2018 and 2021. A transitional care program using a case management approach was developed. Participants enrolled in the program by self-enrolment or referral from hospitals or NGOs. Community-dwelling people with dementia discharged from hospitals received four-week residential care at a dementia care centre, with intensive nursing care, physiotherapy, and group activities promoting social engagement, followed by eight-week day care rehabilitation activities to improve their mobility and cognitive functioning. They were matched on a 1:5 ratio by age and sex to people with dementia discharged from a convalescent hospital who did not participate in this program. The study outcome was nursing home admission, measured three months (i.e., post-intervention), six months, and nine months after hospital discharge. Multinomial logistic regression was conducted to investigate factors associated with nursing home placement at each measurement time point.
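The study fitted multinomial logistic regression per time point; with a binary placed/not-placed outcome at a single time point this reduces to an ordinary logit, sketched below with simulated data and hypothetical column names.

```python
# Sketch: regression of nursing home placement on intervention status.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 361
df = pd.DataFrame({
    "placed_3mo": rng.binomial(1, 0.2, n),        # 1 = placed at 3 months
    "intervention": rng.binomial(1, 67 / 361, n),  # program participation
    "los_days": rng.integers(3, 60, n),            # hospital length of stay
})

fit = smf.logit("placed_3mo ~ intervention + los_days", data=df).fit()
print(np.exp(fit.params))  # odds ratios
```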
Results:
361 hospital admission episodes (n=67 intervention, n=294 control) were examined. The regression results showed that participants in the intervention group were significantly less likely than controls to be admitted to a nursing home three months (OR = 0.023, 95% CI: 0.003-0.201, p = .001) and six months (OR = 0.094, 95% CI: 0.025-0.353, p = .001) after hospital discharge, but the intervention effect did not persist at nine months after hospital discharge. Longer hospital length of stay and hospital admission due to dementia, mental disturbances such as delirium, or mental disorders such as schizophrenia significantly predicted nursing home admission three months and six months after hospital discharge.
Conclusion:
The transitional care program could help reduce nursing home placement in people with dementia after hospital discharge. To sustain the intervention effect, continued support after the intervention as well as family caregiver training would be required.
Attention-deficit/hyperactivity disorder (ADHD) symptoms are associated with myriad adverse outcomes, including interpersonal difficulties, but factors that moderate the developmental course and functional impact of ADHD over time are not well understood. The present study evaluated developmental contributions of the triarchic neurobehavioral traits (boldness, meanness, and disinhibition) to ADHD symptomatology and its subdimensions from adolescence to young adulthood. Participants were twins and triplets assessed at ages 14, 17, and 19 (initial N = 1,185, 51.2% female). Path analyses using negative binomial regression revealed that boldness at age 14 was associated with more ADHD symptoms cross-sectionally (especially hyperactivity/impulsivity), but fewer symptoms (especially inattention) at age 19 in the prospective analysis. Notably, inclusion of interpersonal problems at ages 14 and 17 as covariates rendered the latter effect nonsignificant. Disinhibition concurrently and prospectively predicted higher levels of ADHD symptoms, including both subdimensions, and the prospective effects were partially mediated by greater social impairment at age 17. Meanness prospectively (but not concurrently) predicted higher levels of hyperactivity/impulsivity symptoms. Sex moderated certain associations of meanness and disinhibition with ADHD symptoms. These findings highlight how fundamental neurobehavioral traits shape both psychopathology and adaptive outcomes in the developmental course of ADHD.
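A stripped-down version of the count-outcome regression used above can be sketched as follows; the simulated data and coefficients are ours, and the study's path models additionally handled twin clustering, mediation, and sex moderation.

```python
# Sketch: negative binomial regression of an ADHD symptom count on the
# triarchic traits (simulated data; names hypothetical).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 500
df = pd.DataFrame({
    "boldness": rng.normal(size=n),
    "meanness": rng.normal(size=n),
    "disinhibition": rng.normal(size=n),
})
mu = np.exp(0.5 + 0.3 * df["disinhibition"])  # true rate for simulation
df["adhd_sx"] = rng.poisson(mu)               # symptom count outcome

fit = smf.negativebinomial("adhd_sx ~ boldness + meanness + disinhibition",
                           data=df).fit()
print(fit.params)
```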
There is evidence that child maltreatment is associated with shorter telomere length in early life.
Aims
This study aims to examine if child maltreatment is associated with telomere length in middle- and older-age adults.
Method
This was a retrospective cohort study of 141 748 UK Biobank participants aged 37–73 years at recruitment. Leukocyte telomere length was measured with quantitative polymerase chain reaction, and log-transformed and scaled to have unit standard deviation. Child maltreatment was recalled by participants. Linear regression was used to analyse the association.
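The outcome transformation and model described above can be sketched directly: log-transform telomere length, scale to unit standard deviation, then regress on maltreatment burden with covariates. Data and column names below are simulated stand-ins, not UK Biobank fields.

```python
# Sketch: standardized log telomere length regressed on maltreatment.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 1000
df = pd.DataFrame({
    "telomere": rng.lognormal(0.0, 0.3, n),
    "maltx_types": rng.integers(0, 4, n),  # 0..3+ types of maltreatment
    "age": rng.integers(37, 74, n),
    "sex": rng.binomial(1, 0.5, n),
})
tl = np.log(df["telomere"])
df["tl_z"] = (tl - tl.mean()) / tl.std()  # log-transform, unit-SD scale

fit = smf.ols("tl_z ~ C(maltx_types) + age + sex", data=df).fit()
print(fit.params.filter(like="maltx"))  # betas per maltreatment level
```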
Results
After adjusting for sociodemographic characteristics, participants with three or more types of maltreatment presented with the shortest telomere lengths (β = −0.05, 95% CI −0.07 to −0.03; P < 0.0001), followed by those with two types of maltreatment (β = −0.02, 95% CI −0.04 to 0.00; P = 0.02), relative to those who had none. When adjusted for depression and post-traumatic stress disorder, the telomere lengths of participants with three or more types of maltreatment were still shorter (β = −0.04, 95% CI −0.07 to −0.02; P = 0.0008). The telomere lengths of those with one type of maltreatment were not significantly different from those who had none. When mutually adjusted, physical abuse (β = −0.05, 95% CI −0.07 to −0.03; P < 0.0001) and sexual abuse (β = −0.02, 95% CI −0.04 to 0.00; P = 0.02) were independently associated with shorter telomere length.
Conclusions
Our findings showed that child maltreatment is associated with shorter telomere length in middle- and older-aged adults, independent of sociodemographic and mental health factors.
Nixtun-Ch'ich', on the western edge of Lake Peten Itza in Peten, northern Guatemala, features an axis urbis and an urban grid dating to the Middle Preclassic period (800–500 b.c.). New research reveals that Middle Preclassic constructions—five circular or oval artificial pools and planned surface drainage—facilitated or impeded the movement of water. Large limestone rubble lines at least two of the pools (aguadas) in the city's core; two pools lie on the axis urbis, demonstrating that they were central ceremonial constructions. The gridded streets facilitated drainage: they consistently slope from west to east and from the center to the north and south. In areas with intense water flow, the streets divide into waterways and pedestrian ways and/or were given special paving. Many scholars argue that water management contributed to the power of despotic kings, but no evidence of such rulers exists among the Middle Preclassic Maya. Nonetheless, we believe that such systems emerged in the Middle Preclassic. Nixtun-Ch'ich' appears to have been cooperative in its organization, and its water management system was a public good.
The purpose of this study was to examine possible pathways by which genetic risk associated with externalizing is transmitted in families. We used molecular data to disentangle the genetic and environmental pathways contributing to adolescent externalizing behavior in a sample of 1,111 adolescents (50% female; 719 European and 392 African ancestry) and their parents from the Collaborative Study on the Genetics of Alcoholism. We found evidence for genetic nurture such that parental externalizing polygenic scores were associated with adolescent externalizing behavior, over and above the effect of adolescents’ own externalizing polygenic scores. Mediation analysis indicated that parental externalizing psychopathology partly explained the effect of parental genotype on children’s externalizing behavior. We also found evidence for evocative gene-environment correlation, whereby adolescent externalizing polygenic scores were associated with lower parent–child communication, less parent–child closeness, and lower parental knowledge, controlling for parental genotype. These effects were observed among participants of European ancestry but not African ancestry, likely due to the limited predictive power of polygenic scores across ancestral background. These results demonstrate that in addition to genetic transmission, genes influence offspring behavior through the influence of parental genotypes on their children’s environmental experiences, and the role of children’s genotypes in shaping parent–child relationships.
Young people are most vulnerable to suicidal behaviours but least likely to seek help. A more elaborate study of the intrinsic and extrinsic correlates of suicidal ideation and behaviours, particularly amid ongoing population-level stressors, and the identification of less stigmatising markers in representative youth populations are essential.
Methods
Participants (n = 2540, aged 15–25) were consecutively recruited from an ongoing large-scale household-based epidemiological youth mental health study in Hong Kong between September 2019 and 2021. Lifetime and 12-month prevalence of suicidal ideation, plan, and attempt were assessed, alongside suicide-related rumination, hopelessness and neuroticism, personal and population-level stressors, family functioning, cognitive ability, lifetime non-suicidal self-harm, 12-month major depressive disorder (MDD), and alcohol use.
Results
The 12-month prevalence of suicidal ideation, ideation-only (no plan or attempt), plan, and attempt was 20.0%, 15.4%, 4.6%, and 1.3%, respectively. Importantly, multivariable logistic regression findings revealed that suicide-related rumination was the only factor associated with all four suicidal outcomes (all p < 0.01). Among those with suicidal ideation (two-stage approach), intrinsic factors, including suicide-related rumination, poorer cognitive ability, and 12-month MDD, were specifically associated with suicide plan, while extrinsic factors, including coronavirus disease 2019 (COVID-19) stressors, poorer family functioning, and personal life stressors, as well as non-suicidal self-harm, were specifically associated with suicide attempt.
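The two-stage approach referenced above can be sketched as two nested logistic regressions: ideation modelled in the full sample, then plan (or attempt) modelled among ideators only. The simulated data and column names below are our assumptions.

```python
# Sketch: two-stage logistic modelling of ideation, then plan.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2540
df = pd.DataFrame({
    "rumination": rng.normal(size=n),   # suicide-related rumination
    "family_fn": rng.normal(size=n),    # family functioning
    "ideation": rng.binomial(1, 0.2, n),
})
df["plan"] = np.where(df["ideation"] == 1, rng.binomial(1, 0.2, n), 0)

stage1 = smf.logit("ideation ~ rumination + family_fn", data=df).fit()
stage2 = smf.logit("plan ~ rumination + family_fn",
                   data=df[df["ideation"] == 1]).fit()
```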
Conclusions
Suicide-related rumination, population-level COVID-19 stressors, and poorer family functioning may be important less-stigmatising markers for youth suicidal risks. The respective roles played by not only intrinsic but also extrinsic factors in suicide plan and attempt using a two-stage approach should be considered in future preventative intervention work.
Many patients with mental health disorders become increasingly isolated at home due to anxiety about going outside. A cognitive perspective on this difficulty is that threat cognitions lead to the safety-seeking behavioural response of agoraphobic avoidance.
Aims:
We sought to develop a brief questionnaire, suitable for research and clinical practice, to assess a wide range of cognitions likely to lead to agoraphobic avoidance. We also included two additional subscales assessing two types of safety-seeking defensive responses: anxious avoidance and within-situation safety behaviours.
Method:
198 patients with psychosis and agoraphobic avoidance and 1947 non-clinical individuals completed the item pool and measures of agoraphobic avoidance, generalised anxiety, social anxiety, depression and paranoia. Factor analyses were used to derive the Oxford Cognitions and Defences Questionnaire (O-CDQ).
Results:
The O-CDQ consists of three subscales: threat cognitions (14 items), anxious avoidance (11 items), and within-situation safety behaviours (8 items). Separate confirmatory factor analyses demonstrated a good model fit for all subscales. The cognitions subscale was significantly associated with agoraphobic avoidance (r = .672, p < .001), social anxiety (r = .617, p < .001), generalized anxiety (r = .746, p < .001), depression (r = .619, p < .001) and paranoia (r = .655, p < .001). Additionally, both the O-CDQ avoidance (r = .867, p < .001) and within-situation safety behaviours (r = .757, p < .001) subscales were highly correlated with agoraphobic avoidance. The O-CDQ demonstrated excellent internal consistency (cognitions Cronbach’s alpha = .93, avoidance Cronbach’s alpha = .94, within-situation Cronbach’s alpha = .93) and test–retest reliability (cognitions ICC = 0.88, avoidance ICC = 0.92, within-situation ICC = 0.89).
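For reference, the internal-consistency statistic reported above is computed as below; the respondents-by-items matrix is simulated, and the item count matches the cognitions subscale only for illustration.

```python
# Sketch: Cronbach's alpha for a 14-item subscale.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = items."""
    items = np.asarray(items)
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return k / (k - 1) * (1 - sum_item_var / total_var)

rng = np.random.default_rng(8)
latent = rng.normal(size=(200, 1))                       # shared trait
responses = latent + rng.normal(0, 0.7, size=(200, 14))  # 14 noisy items
print(round(cronbach_alpha(responses), 2))
```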
Conclusions:
The O-CDQ, consisting of three separate scales, has excellent psychometric properties and may prove a helpful tool for understanding agoraphobic avoidance across mental health disorders.
Brief measurements of the subjective experience of stress with good predictive capability are important in a range of community mental health and research settings. The potential for large-scale implementation of such a measure for screening may facilitate early risk detection and intervention opportunities. Few such measures, however, have been developed and validated in epidemiological and longitudinal community samples. We designed a new single-item measure of the subjective level of stress (SLS-1) and tested its validity and ability to predict long-term mental health outcomes of up to 12 months through two separate studies.
Methods
We first examined the content and face validity of the SLS-1 with a panel consisting of mental health experts and laypersons. Two studies were conducted to examine its validity and predictive utility. In study 1, we tested the convergent and divergent validity as well as incremental validity of the SLS-1 in a large epidemiological sample of young people in Hong Kong (n = 1445). In study 2, in a consecutively recruited longitudinal community sample of young people (n = 258), we first performed the same procedures as in study 1 to ensure replicability of the findings. We then examined in this longitudinal sample the utility of the SLS-1 in predicting long-term depressive, anxiety and stress outcomes assessed at 3 months and 6 months (n = 182) and at 12 months (n = 84).
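Incremental validity of the kind tested here is conventionally assessed by comparing nested regression models with and without the new measure. A minimal sketch with simulated data (column names and effect sizes are ours):

```python
# Sketch: incremental validity as the change in R^2 when SLS-1 is added.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 1445
df = pd.DataFrame({
    "age": rng.integers(15, 26, n),
    "neuroticism": rng.normal(size=n),
    "sls1": rng.integers(0, 11, n),  # single-item stress rating
})
df["symptoms"] = 0.5 * df["neuroticism"] + 0.3 * df["sls1"] + rng.normal(size=n)

base = smf.ols("symptoms ~ age + neuroticism", data=df).fit()
full = smf.ols("symptoms ~ age + neuroticism + sls1", data=df).fit()
print(full.rsquared - base.rsquared)  # variance explained beyond covariates
```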
Results
The SLS-1 demonstrated good content and face validity. Findings from the two studies showed that the SLS-1 was moderately to strongly correlated with a range of mental health outcomes, including depressive, anxiety, stress and distress symptoms. We also demonstrated its ability to explain additional variance in symptoms beyond other known personal and psychological factors. Using the longitudinal sample in study 2, we further showed the significant predictive capability of the SLS-1 for long-term symptom outcomes for up to 12 months, even when accounting for demographic characteristics.
Conclusions
The findings altogether support the validity and predictive utility of the SLS-1 as a brief measure of stress that is strongly associated with both concurrent and long-term mental health outcomes. Given the value of brief measures of mental health risks at a population level, the SLS-1 may have potential for use as an early screening tool to inform early preventative intervention work.
Disease-related malnutrition is prevalent among older adults; therefore, identifying the modifiable risk factors in the diet is essential for the prevention and management of disease-related malnutrition. The present study examined the cross-sectional association between dietary patterns and malnutrition in Chinese community-dwelling older adults aged ≥65 years in Hong Kong. Dietary patterns, including Diet Quality Index International (DQI-I), Dietary Approaches to Stop Hypertension (DASH), the Mediterranean Diet Score, ‘vegetable–fruit’ pattern, ‘snack–drink–milk product’ pattern and ‘meat–fish’ pattern, were estimated and generated from a validated food frequency questionnaire. Malnutrition was classified according to the modified Global Leadership Initiative on Malnutrition (GLIM) criteria based on two phenotypic components (low body mass index and reduced muscle mass) and one aetiologic component (inflammation/disease burden). The association between the tertile or level of adherence of each dietary pattern and modified GLIM criteria was analysed using adjusted binary logistic regression models. Data from 3694 participants were available (49 % men). Malnutrition was present in 397 participants (10⋅7 %). In men, a higher DQI-I score, a higher ‘vegetable–fruit’ pattern score and a lower ‘meat–fish’ pattern score were associated with a lower risk of malnutrition. In women, higher adherence to the DASH diet was associated with a lower risk of malnutrition. After the Bonferroni correction, the association remained statistically significant only in men for the DQI-I score. To conclude, a higher DQI-I score was associated with a lower risk of malnutrition in Chinese older men. Nutritional strategies for the prevention and management of malnutrition could potentially target dietary quality.
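The multiple-testing adjustment applied above can be sketched in one call; the p-values below are hypothetical placeholders, not the study's results.

```python
# Sketch: Bonferroni correction across the dietary-pattern tests.
from statsmodels.stats.multitest import multipletests

pvals = [0.004, 0.03, 0.04, 0.21, 0.47, 0.62]  # hypothetical raw p-values
reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="bonferroni")
print(list(zip(p_adj.round(3), reject)))  # adjusted p, significant or not
```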
Despite its efficacy in treating comorbid insomnia and depression, cognitive behavioral therapy for insomnia (CBT-I) is limited in its accessibility and, in many countries, cultural compatibility. Smartphone-based treatment is a low-cost, convenient alternative modality. This study evaluated a self-help smartphone-based CBT-I in alleviating major depression and insomnia.
Methods
A parallel-group randomized, waitlist-controlled trial was conducted with 320 adults with major depression and insomnia. Participants were randomized to receive either a 6-week CBT-I via a smartphone application, proACT-S, or waitlist condition. The primary outcomes included depression severity, insomnia severity, and sleep quality. The secondary outcomes included anxiety severity, subjective health, and acceptability of treatment. Assessments were administered at baseline, post-intervention (week 6) follow-up, and week 12 follow-up. The waitlist group received treatment after the week 6 follow-up.
Results
Intention-to-treat analysis was conducted with multilevel modeling. In all but one model, the interaction between treatment condition and time at week 6 follow-up was significant. Compared with the waitlist group, the treatment group had lower levels of depression [Center for Epidemiologic Studies Depression Scale (CES-D): Cohen's d = 0.86, 95% CI (−10.11 to −5.37)], insomnia [Insomnia Severity Index (ISI): Cohen's d = 1.00, 95% CI (−5.93 to −3.53)], and anxiety [Hospital Anxiety and Depression Scale – Anxiety subscale (HADS-A): Cohen's d = 0.83, 95% CI (−3.75 to −1.96)]. They also had better sleep quality [Pittsburgh Sleep Quality Index (PSQI): Cohen's d = 0.91, 95% CI (−3.34 to −1.83)]. No differences across any measures were found at week 12, after the waitlist control group had received the treatment.
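The standardized effect sizes reported above are Cohen's d values; a pooled-SD version of that computation is sketched below on simulated change scores (the study derived its effects from multilevel models, so this is an approximation).

```python
# Sketch: between-group Cohen's d with pooled standard deviation.
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference between two independent groups."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1) +
                  (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

rng = np.random.default_rng(10)
treat = rng.normal(-8, 9, 160)  # simulated CES-D change, treatment arm
wait = rng.normal(0, 9, 160)    # simulated CES-D change, waitlist arm
print(round(cohens_d(wait, treat), 2))
```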
Conclusion
proACT-S is an efficacious sleep-focused self-help treatment for major depression and insomnia.