Auditory verbal hallucinations (AVHs) in schizophrenia have been suggested to arise from a failure of corollary discharge mechanisms to correctly predict and suppress self-initiated inner speech. However, it is unclear whether such dysfunction is related to the motor preparation of inner speech, during which sensorimotor predictions are formed. The contingent negative variation (CNV) is a slow, negative-going event-related potential that develops prior to executing an action. A recent meta-analysis has revealed a large effect for CNV blunting in schizophrenia. Given that inner speech, like overt speech, has been shown to be preceded by a CNV, the present study tested the notion that AVHs are associated with inner speech-specific motor preparation deficits.
Objectives
The present study aimed to provide a useful framework for directly testing the long-held idea that AVHs may be related to inner speech-specific CNV blunting in patients with schizophrenia. This may hold promise for a reliable biomarker of AVHs.
Methods
Hallucinating (n=52) and non-hallucinating (n=45) patients with schizophrenia, along with matched healthy controls (n=42), participated in a novel electroencephalographic (EEG) paradigm. In the Active condition, they were asked to imagine a single phoneme at the moment of a cue while, at precisely the same time, an auditory probe was presented. In the Passive condition, they were asked to passively listen to the auditory probes. The amplitude of the CNV preceding the production of inner speech was examined.
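For readers unfamiliar with slow pre-stimulus potentials, the sketch below illustrates one generic way a CNV-like amplitude could be quantified from cue-locked EEG epochs in Python: a baseline-corrected mean in a late pre-cue window, compared across conditions. The sampling rate, window limits, and placeholder arrays are assumptions for illustration, not the study's actual processing pipeline.

```python
import numpy as np

# Illustrative assumptions (not the study's actual parameters):
# epochs_active / epochs_passive are (n_trials, n_samples) arrays of
# cue-locked EEG from a fronto-central electrode, sampled at 500 Hz,
# with t = 0 s at the cue and a 2.0 s pre-cue period included.
fs = 500
t = np.arange(-2.0, 1.0, 1.0 / fs)            # epoch time axis in seconds
epochs_active = np.random.randn(50, t.size)    # placeholder data
epochs_passive = np.random.randn(50, t.size)   # placeholder data

def cnv_amplitude(epochs, t, baseline=(-2.0, -1.5), window=(-0.5, 0.0)):
    """Mean amplitude in a late pre-cue window after baseline correction."""
    base = epochs[:, (t >= baseline[0]) & (t < baseline[1])].mean(axis=1, keepdims=True)
    corrected = epochs - base
    win = corrected[:, (t >= window[0]) & (t < window[1])]
    return win.mean(axis=1)                    # one CNV value per trial

active_cnv = cnv_amplitude(epochs_active, t)
passive_cnv = cnv_amplitude(epochs_passive, t)
print(active_cnv.mean(), passive_cnv.mean())   # condition means for one subject
```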
Results
Healthy controls showed a larger CNV amplitude (p = .002, d = .50) in the Active than in the Passive condition, replicating previous reports of a CNV preceding inner speech. However, neither patient group showed a difference between the two conditions (p > .05). Importantly, a repeated-measures ANOVA revealed a significant interaction effect (p = .007, ηp² = .05). Follow-up contrasts showed that healthy controls exhibited a larger CNV amplitude in the Active condition than both the hallucinating (p = .013, d = .52) and non-hallucinating patients (p < .001, d = .88). No difference was found between the two patient groups (p = .320, d = .20).
Conclusions
The results indicated that motor preparation of inner speech was disrupted in schizophrenia. While the production of inner speech elicited a larger CNV than passive listening in healthy controls, indicating the involvement of motor planning, patients exhibited markedly blunted motor preparatory activity to inner speech. This may reflect dysfunction in the formation of corollary discharges. Interestingly, the deficits did not differ between hallucinating and non-hallucinating patients. Future work is needed to elucidate how specifically these inner speech-related motor preparation deficits are linked to AVHs. Overall, this study provides evidence in support of atypical inner speech monitoring in schizophrenia.
The Central Mental Hospital is the Republic of Ireland's only secure forensic hospital and the seat of its National Forensic Mental Health Service (NFMHS). We scrutinised admission patterns in the NFMHS during the period 01/01/2018–01/10/2023, before and after relocating from the historic 1850 site in Dundrum to a modern facility in Portrane on 13/11/2022.
Methods
This prospective longitudinal cohort study included all patients admitted during the above period. The study initially commenced in Dundrum and continued afterwards in Portrane. Data gathered included demographics, diagnoses, capacity to consent to treatment, and the need for intramuscular medication (IM) after admission. Therapeutic security needs and urgency of need for admission were collated from DUNDRUM-1 and DUNDRUM-2 scores rated pre-admission. Hours spent in seclusion during the first day, week, and month after admission were calculated. Data were collected as part of the Dundrum Forensic Redevelopment Evaluation Study (D-FOREST).
Results
There were 117 admissions during the 69-month period. The majority were male (n = 98). Most were admitted from prisons (87%). Schizophrenia was the most common diagnosis (55.8%). Mean DUNDRUM-1 triage security scores were in the medium-security range (2.84–3.15) during this period. At the time of admission, 53.8% required seclusion, 25.6% required IM medication, and 79.5% lacked capacity to consent to treatment. Those who required seclusion on admission had worse scores on the DUNDRUM-2 triage urgency scale (F = 20.9, p < 0.001). On regression analysis, the most parsimonious model resolved with five predictors of hours in seclusion during the first day and week: D1 item 8 – Victim sensitivity/public confidence issues, D1 item 10 – Institutional behaviour, D2 item 2 – Mental health, D2 item 4 – Humanitarian, and D2 item 6 – Legal urgency. Fifty percent required IM medication during their first week of admission; these patients had significantly worse scores on D1 item 8 – Victim sensitivity/public confidence issues, D1 item 10 – Institutional behaviour, D2 item 2 – Mental health, and D2 item 4 – Humanitarian (all p < 0.05).
Conclusion
The frequency of admissions increased after the relocation to Portrane. The results suggest that overall triage security and urgency needs did not change during the period in question. Factors related to major mental illness drove the need for seclusion early in the admission, whereas factors linked to prison behaviour or personality were more strongly associated with an ongoing need for seclusion at one month.
Forensic psychiatric services address the therapeutic needs of mentally disordered offenders in a secure setting. Clinical, ethical, and legal considerations underpinning treatment emphasize that the Quality of Life (QOL) of patients admitted to forensic hospitals should be optimised. This study aims to examine changes in the QOL in Ireland's National Forensic Mental Health Service (NFMHS) following its relocation from the historic 1850 site in Dundrum to a new campus in Portrane, Dublin.
Methods
This multisite prospective longitudinal study is part of the Dundrum Forensic Redevelopment Evaluation Study (D-FOREST). Repeated measures were taken for all inpatients in the service at regular six-monthly intervals. The WHOQOL-BREF questionnaire was offered to all inpatients. An anonymised EssenCES questionnaire was used to measure ward atmosphere. Data were obtained at 5 time points for each individual patient and ward. WHOQOL-BREF ratings were obtained across the 5 time points, with comparisons available for 4 time intervals, including immediately before and after relocation. For 101 subjects across the 4 time intervals, 215 sets of data were obtained: 140 before relocation, 65 after relocation and 10 from community patients who did not move. Using Generalised Estimating Equations (GEE) to account for repeated measures over time, the effect of relocation, with community patients as a control, was analysed by ward cluster and by whether patients moved between wards. Observations were categorised according to security level – high dependency, medium secure, rehabilitation, or community – and trichotomised into positive moves to less secure wards, negative moves to more secure wards, or no moves.
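For readers unfamiliar with GEE, the following Python sketch (using statsmodels) shows the general form of such an analysis on synthetic repeated-measures data with an exchangeable working correlation. All variable names and values are invented stand-ins, not D-FOREST data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic long-format data standing in for repeated QOL ratings: one row
# per patient per time point (variable names are illustrative only).
rng = np.random.default_rng(0)
n_patients, n_times = 60, 5
df = pd.DataFrame({
    "patient_id": np.repeat(np.arange(n_patients), n_times),
    "time": np.tile(np.arange(n_times), n_patients),
})
df["post_move"] = (df["time"] >= 3).astype(int)                 # relocation after time 3
df["cluster"] = np.repeat(rng.integers(0, 4, n_patients), n_times)
df["qol_env"] = 13 + 1.5 * df["post_move"] + rng.normal(0, 2, len(df))

# GEE with an exchangeable working correlation accounts for the repeated
# measurements nested within each patient.
model = smf.gee(
    "qol_env ~ post_move + C(cluster)",
    groups="patient_id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
)
print(model.fit().summary())   # coefficients with robust standard errors
```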
Results
Relocation of the NFMHS was associated with a significant increase in environmental QOL (Wald χ² = 15.9, df = 1, p < 0.001), even when controlling for cluster location and for positive and negative moves. When controlling for ward atmosphere, environmental QOL remained significantly increased after the move (Wald χ² = 10.0, df = 1, p = 0.002). EssenCES scores were obtained within the hospital at 3 time points before relocation and 2 time points afterwards. No significant differences were found on the three subscales before and after the move. All three EssenCES subscales progressively improved with decreasing security level (Patient Cohesion: Wald χ² = 958.3, df = 1, p < 0.001; Experiencing Safety: Wald χ² = 152.9, df = 5, p < 0.001; Therapeutic Hold: Wald χ² = 33.6, df = 3, p < 0.001).
Conclusion
The GEE model demonstrated that the move of the NFMHS improved self-reported environmental QOL. Cluster location made a significant difference, as expected in a system of stratified therapeutic security, with steady improvement in scores on all three atmosphere subscales as security level decreased.
OBJECTIVES/GOALS: Childhood Sjögren’s disease (cSD) is a rare autoimmune disease. Despite the profound impact on children and their families, pediatric-specific clinical trials to inform therapeutic strategies in cSD are lacking. In 2022 we participated in the Trial Innovation Network (TIN) Design Lab with the purpose of designing a series of N-of-1 trials for cSD. METHODS/STUDY POPULATION: New medications have the potential to be safe/effective treatments for cSD but must be evaluated in randomized trials. To overcome limitations of traditional parallel-group designs given the rarity of cSD, we developed an N-of-1 trial approach. Our proposal was selected by the Tufts TIN Design Lab. The Design Lab multi-stakeholder process involved parents of and patients with cSD, pediatric and adult rheumatologists, and experts in clinical trial design and outcomes. We engaged all stakeholders in protocol development to maximize the impact of the proposed approach on clinical care, ensure a successful recruitment plan, and inform the choice of endpoints as there are no widely accepted cSD outcome measures to determine treatment efficacy. RESULTS/ANTICIPATED RESULTS: Using the Design Lab methodology, we clarified the N-of-1 study goals and engaged in an iterative process to develop a “briefing book” that ensured a sound premise for our study. We reviewed and accumulated published literature to support our focus on mucosal/glandular manifestations, identified potential interventions to be used in the N-of-1 trials, and enumerated possible outcomes, including outcomes important to patient/parents. This work culminated in a full-day Design Lab event that included multiple stakeholders who provided expertise from different perspectives on the full drug development pathway. Study design feedback focused on three specific areas. 1) Inclusion and exclusion criteria; 2) Identification of outcome measures; 3) Treatment and washout periods. DISCUSSION/SIGNIFICANCE: To address the critical need and move treatment of cSD forward, we are designing a prototype N-of-1 trial in children with rheumatic disease. We will continue to engage stakeholders by using a series of Delphi surveys and an in-person meeting to create composite outcome measures to test cSD therapies in personalized trials.
Despite their documented efficacy, substantial proportions of patients discontinue antidepressant medication (ADM) without a doctor's recommendation. The current report integrates data on patient-reported reasons into an investigation of patterns and predictors of ADM discontinuation.
Methods
Face-to-face interviews with community samples from 13 countries (n = 30 697) in the World Mental Health (WMH) Surveys included n = 1890 respondents who used ADMs within the past 12 months.
Results
10.9% of 12-month ADM users reported discontinuation based on the recommendation of the prescriber, while 15.7% discontinued in the absence of a prescriber recommendation. The main patient-reported reason for discontinuation was feeling better (46.6%), which was reported by a higher proportion of patients who discontinued within the first 2 weeks of treatment than later. Perceived ineffectiveness (18.5%), predisposing factors (e.g. fear of dependence) (20.0%), and enabling factors (e.g. inability to afford treatment cost) (5.0%) were much less commonly reported reasons. Discontinuation in the absence of prescriber recommendation was associated with low country income level, being employed, and having above-average personal income. In comparison, age, prior history of psychotropic medication use, and being prescribed treatment by a psychiatrist rather than by a general medical practitioner were associated with a lower probability of this type of discontinuation. However, these predictors varied substantially depending on patient-reported reasons for discontinuation.
Conclusion
Dropping out early is not necessarily negative, with almost half of individuals noting that they felt better. The study underscores the diverse reasons given for dropping out and the need to evaluate how and whether dropping out influences short- or long-term functioning.
Childhood adversities (CAs) predict heightened risks of posttraumatic stress disorder (PTSD) and major depressive episode (MDE) among people exposed to adult traumatic events. Identifying which CAs put individuals at greatest risk for these adverse posttraumatic neuropsychiatric sequelae (APNS) is important for targeting prevention interventions.
Methods
Data came from n = 999 patients ages 18–75 presenting to 29 U.S. emergency departments after a motor vehicle collision (MVC) and followed for 3 months, the amount of time traditionally used to define chronic PTSD, in the Advancing Understanding of Recovery After Trauma (AURORA) study. Six CA types were self-reported at baseline: physical abuse, sexual abuse, emotional abuse, physical neglect, emotional neglect and bullying. Both dichotomous measures of ever experiencing each CA type and numeric measures of exposure frequency were included in the analysis. Risk ratios (RRs) of these CA measures as well as complex interactions among these measures were examined as predictors of APNS 3 months post-MVC. APNS was defined as meeting self-reported criteria for either PTSD based on the PTSD Checklist for DSM-5 and/or MDE based on the PROMIS Depression Short-Form 8b. We controlled for pre-MVC lifetime histories of PTSD and MDE. We also examined mediating effects through peritraumatic symptoms assessed in the emergency department and PTSD and MDE assessed in 2-week and 8-week follow-up surveys. Analyses were carried out with robust Poisson regression models.
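The sketch below illustrates the general form of such a robust ("modified") Poisson regression for a binary outcome in Python with statsmodels, using synthetic stand-in data. The variable names and effect sizes are invented for illustration and do not reproduce the AURORA analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in data: a binary 3-month APNS indicator plus two childhood
# adversity frequency measures (variable names are illustrative only).
rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "emotional_abuse_freq": rng.integers(0, 5, n),
    "bullying_freq": rng.integers(0, 5, n),
})
logit_risk = -2.0 + 0.35 * df["emotional_abuse_freq"] + 0.2 * df["bullying_freq"]
df["apns_3mo"] = rng.binomial(1, 1 / (1 + np.exp(-logit_risk)))

# Modified Poisson regression: a Poisson GLM on the binary outcome with
# robust (sandwich) standard errors yields risk ratios rather than odds ratios.
fit = smf.glm(
    "apns_3mo ~ emotional_abuse_freq + bullying_freq",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")

print(np.exp(fit.params))       # risk ratios per unit of exposure frequency
print(np.exp(fit.conf_int()))   # 95% confidence intervals
```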
Results
Most participants (90.9%) reported at least rarely having experienced some CA. In univariable analyses, ever experiencing each CA type other than emotional neglect was associated with 3-month APNS (RRs = 1.31–1.60), as was the frequency of each CA type (RRs = 1.65–2.45). In multivariable models, joint associations of CAs with 3-month APNS were additive, with frequency of emotional abuse (RR = 2.03; 95% CI = 1.43–2.87) and bullying (RR = 1.44; 95% CI = 0.99–2.10) being the strongest predictors. Control-variable analyses found that these associations were largely explained by pre-MVC histories of PTSD and MDE.
Conclusions
Although individuals who experience frequent emotional abuse and bullying in childhood have a heightened risk of experiencing APNS after an adult MVC, these associations are largely mediated by prior histories of PTSD and MDE.
Racially and ethnically minoritized populations have been historically excluded and underrepresented in research. This paper will describe best practices in multicultural and multilingual awareness-raising strategies used by the Recruitment Innovation Center to increase minoritized enrollment into clinical trials. The Passive Immunity Trial for Our Nation will be used as a primary example to highlight real-world application of these methods to raise awareness, engage community partners, and recruit diverse study participants.
Recently, the Health of the Nation Outcome Scales 65+ (HoNOS65+) were revised. Twenty-five experts from Australia and New Zealand completed an anonymous web-based survey about the content validity of the revised measure, the HoNOS Older Adults (HoNOS OA).
Results
All 12 HoNOS OA scales were rated by most (≥75%) experts as ‘important’ or ‘very important’ for determining overall clinical severity among older adults. Ratings of sensitivity to change, comprehensibility and comprehensiveness were more variable, but mostly positive. Experts’ comments provided possible explanations. For example, some experts suggested modifying or expanding the glossary examples for some scales (e.g. those measuring problems with relationships and problems with activities of daily living) to be more older adult-specific.
Clinical implications
Experts agreed that the HoNOS OA measures important constructs. Training may need to orient experienced raters to the rationale for some revisions. Further psychometric testing of the HoNOS OA is recommended.
A method for three-dimensional reconstruction of objects from defocused images collected at multiple illumination directions in high-resolution transmission electron microscopy is presented. The method effectively corrects for the Ewald sphere curvature by taking into account the in-particle propagation of the electron beam. Numerical simulations demonstrate that the proposed method is capable of accurately reconstructing biological molecules or nanoparticles from high-resolution defocused images under conditions achievable in single-particle electron cryo-microscopy or electron tomography with realistic radiation doses, non-trivial aberrations, multiple scattering, and other experimentally relevant factors. The physics of the method is based on the well-known diffraction tomography formalism, but with the phase-retrieval step modified to include a conjugation of the phase (i.e., multiplication of the phase by a negative constant). At each illumination direction, numerically backpropagating the beam with the conjugated phase produces maximum contrast at the location of individual atoms in the molecule or nanoparticle. The resultant algorithm, Conjugated Holographic Reconstruction, can potentially be incorporated into established software tools for single-particle analysis, such as RELION or FREALIGN, in place of the conventional contrast transfer function correction procedure, in order to account for the Ewald sphere curvature and improve the spatial resolution of the three-dimensional reconstruction.
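The following toy Python sketch illustrates only the core idea named above, conjugating a retrieved phase and numerically backpropagating with an angular-spectrum propagator, on placeholder arrays. It is not the published Conjugated Holographic Reconstruction implementation, and all numerical values are arbitrary.

```python
import numpy as np

def angular_spectrum_propagate(wave, dz, wavelength, pixel_size):
    """Free-space propagation of a 2D complex wavefield by distance dz
    using the standard angular spectrum method."""
    ny, nx = wave.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    k = 2 * np.pi / wavelength
    arg = k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2
    kz = np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * dz) * (arg > 0)      # evanescent components suppressed
    return np.fft.ifft2(np.fft.fft2(wave) * H)

# Toy illustration of the phase-conjugation idea: form a wave from a defocused
# image (amplitude = sqrt(intensity), phase from some phase-retrieval step),
# conjugate the phase, and numerically backpropagate so that contrast
# concentrates at the particle plane. All values below are placeholders.
intensity = np.random.rand(256, 256)           # stand-in for a defocused image
retrieved_phase = np.zeros_like(intensity)     # stand-in for a retrieved phase
wave = np.sqrt(intensity) * np.exp(-1j * retrieved_phase)   # conjugated phase
in_focus = angular_spectrum_propagate(wave, dz=-1.0e-6,
                                      wavelength=2.0e-12, pixel_size=1.0e-10)
contrast = np.abs(in_focus)**2
print(contrast.max())
```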
Gatherings where people are eating and drinking can increase the risk of getting and spreading SARS-CoV-2 among people who are not fully vaccinated; prevention strategies like wearing masks and physical distancing continue to be important for some groups. We conducted an online survey to characterise fall/winter 2020–2021 holiday gatherings, decisions to attend and prevention strategies employed during and before gatherings. We determined associations between practicing prevention strategies, demographics and COVID-19 experience. Among 502 respondents, one-third attended in-person holiday gatherings; 73% wore masks and 84% practiced physical distancing, but fewer did so consistently (29% and 23%, respectively). Younger adults were 44% more likely to attend gatherings than adults ≥35 years. Younger adults (adjusted prevalence ratio (aPR) 1.53, 95% CI 1.19–1.97), persons who did not experience COVID-19 themselves or have relatives/close friends experience severe COVID-19 (aPR 1.56, 95% CI 1.18–2.07), and non-Hispanic White persons (aPR 1.57, 95% CI 1.13–2.18) were more likely to not always wear masks in public during the 2 weeks before gatherings. Public health messaging emphasizing consistent application of COVID-19 prevention strategies is important to slow the spread of COVID-19.
Whole-genome sequencing (WGS) shotgun metagenomics (metagenomics) attempts to sequence the entire genetic content directly from a sample. Its diagnostic advantages lie in the ability to detect unsuspected, uncultivatable, or very slow-growing organisms.
Objective:
To evaluate the clinical and economic effects of using WGS and metagenomics for outbreak management in a large metropolitan hospital.
Design:
Cost-effectiveness study.
Setting:
Intensive care unit and burn unit of large metropolitan hospital.
Patients:
Simulated intensive care unit and burn unit patients.
Methods:
We built a complex simulation model to estimate pathogen transmission, associated hospital costs, and quality-adjusted life years (QALYs) during a 32-month outbreak of carbapenem-resistant Acinetobacter baumannii (CRAB). Model parameters were determined using microbiology surveillance data, genome sequencing results, hospital admission databases, and local clinical knowledge. The model was calibrated to the actual pathogen spread within the intensive care unit and burn unit (scenario 1) and compared with early use of WGS (scenario 2) and early use of WGS and metagenomics (scenario 3) to determine their respective cost-effectiveness. Sensitivity analyses were performed to address model uncertainty.
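As a rough illustration of how such a scenario comparison can be framed, the toy Monte Carlo sketch below compares cumulative colonisations under late versus early outbreak detection and converts the difference into placeholder costs and QALYs. Every parameter is invented and the model is far simpler than the calibrated simulation described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_outbreak(detection_day, beta=0.02, ward_size=20,
                      days=365, n_runs=5000):
    """Toy simulation: each colonised patient transmits to a susceptible
    patient with daily probability beta; transmission stops (enhanced
    precautions) once the outbreak is detected."""
    cases = np.zeros(n_runs)
    for r in range(n_runs):
        colonised = 1
        for day in range(days):
            if day >= detection_day:
                break
            susceptible = max(ward_size - colonised, 0)
            new = rng.binomial(susceptible, 1 - (1 - beta) ** colonised)
            colonised += new
        cases[r] = colonised
    return cases

# Placeholder scenarios: routine surveillance vs earlier genomic detection.
baseline = simulate_outbreak(detection_day=90)
early_wgs = simulate_outbreak(detection_day=30)

cost_per_case, qaly_loss_per_case = 30000.0, 0.5   # invented values
averted = baseline.mean() - early_wgs.mean()
print("cases averted:", averted)
print("cost saved:", averted * cost_per_case)
print("QALYs gained:", averted * qaly_loss_per_case)
```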
Results:
On average compared with scenario 1, scenario 2 resulted in 14 fewer patients with CRAB, 59 additional QALYs, and $75,099 cost savings. Scenario 3, compared with scenario 1, resulted in 18 fewer patients with CRAB, 74 additional QALYs, and $93,822 in hospital cost savings. The likelihoods that scenario 2 and scenario 3 were cost-effective were 57% and 60%, respectively.
Conclusions:
The use of WGS and metagenomics in infection control processes was predicted to produce favorable economic and clinical outcomes.
The most common treatment for major depressive disorder (MDD) is antidepressant medication (ADM). Results are reported on frequency of ADM use, reasons for use, and perceived effectiveness of use in general population surveys across 20 countries.
Methods
Face-to-face interviews with community samples totaling n = 49 919 respondents in the World Health Organization (WHO) World Mental Health (WMH) Surveys asked about ADM use anytime in the prior 12 months in conjunction with validated fully structured diagnostic interviews. Treatment questions were administered independently of diagnoses and asked of all respondents.
Results
3.1% of respondents reported ADM use within the past 12 months. In high-income countries (HICs), depression (49.2%) and anxiety (36.4%) were the most common reasons for use. In low- and middle-income countries (LMICs), depression (38.4%) and sleep problems (31.9%) were the most common reasons for use. Prevalence of use was 2–4 times as high in HICs as in LMICs across all examined diagnoses. Newer ADMs were proportionally used more often in HICs than in LMICs. Across all conditions, ADMs were reported as very effective by 58.8% of users and somewhat effective by an additional 28.3% of users, with both proportions higher in LMICs than in HICs. Neither ADM class nor reason for use was a significant predictor of perceived effectiveness.
Conclusion
ADMs are in widespread use for a variety of conditions, including but extending beyond depression and anxiety. In a general population sample from multiple LMICs and HICs, ADMs were widely perceived to be either very or somewhat effective by the people who use them.
Depression and anxiety are among the most common mental health conditions treated in primary care. They frequently co-occur and involve recommended treatments that overlap. Evidence from randomised controlled trials (RCTs) shows specific stepped care interventions to be cost-effective in improving symptom remission. However, most RCTs have focused on either depression or anxiety, which limits their generalisability to routine primary care settings. This study aimed to evaluate the cost-effectiveness of a collaborative stepped care (CSC) intervention to treat depression and/or anxiety among adults in Australian primary care settings.
Method
A quasi-decision tree model was developed to evaluate the cost-effectiveness of a CSC intervention relative to care-as-usual (CAU). The model adapted a CSC intervention described in a previous Dutch RCT to the Australian context. This 8-month, cluster RCT recruited patients with depression and/or anxiety (n = 158) from 30 primary care clinics in the Netherlands. The CSC intervention involved two steps: (1) guided self-help with a nurse at a primary care clinic; and (2) referral to specialised mental healthcare. The cost-effectiveness model adopted a health sector perspective and synthesised data from two main sources: RCT data on intervention pathways, remission probabilities and healthcare service utilisation; and Australia-specific data on demography, epidemiology and unit costs from external sources. Incremental costs and incremental health outcomes were estimated across a 1-year time horizon. Health outcomes were measured as disability-adjusted life years (DALYs) due to remitted cases of depression and/or anxiety. Incremental cost-effectiveness ratios (ICERs) were measured in 2019 Australian dollars (A$) per DALY averted. Uncertainty and sensitivity analyses were performed to test the robustness of cost-effectiveness findings.
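The sketch below illustrates, on invented placeholder draws, how an ICER and the probability of cost-effectiveness against a willingness-to-pay threshold are typically computed. It does not reproduce the study's quasi-decision tree model or its inputs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder probabilistic draws (e.g., from a probabilistic sensitivity
# analysis): incremental costs and DALYs averted for CSC vs care-as-usual.
n_draws = 10000
incr_cost = rng.normal(loc=300.0, scale=150.0, size=n_draws)     # A$ per person
dalys_averted = rng.normal(loc=0.06, scale=0.02, size=n_draws)   # per person

icer = incr_cost.mean() / dalys_averted.mean()   # A$ per DALY averted
wtp = 50000.0                                    # willingness-to-pay threshold

# Net monetary benefit per draw; the share of positive draws approximates
# the probability that the intervention is cost-effective at this threshold.
nmb = wtp * dalys_averted - incr_cost
prob_cost_effective = (nmb > 0).mean()
print(round(icer), prob_cost_effective)
```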
Result
The CSC intervention had a high probability (99.6%) of being cost-effective relative to CAU. The resulting ICER (A$5207/DALY; 95% uncertainty interval: dominant to 25 345) fell below the willingness-to-pay threshold of A$50 000/DALY. ICERs were robust to changes in model parameters and assumptions.
Conclusions
This study found that a Dutch CSC intervention, with nurse-delivered guided self-help treatment as a first step, could potentially be cost-effective in treating depression and/or anxiety if transferred to the Australian primary care context. However, adaptations may be required to ensure feasibility and acceptability in the Australian healthcare context. In addition, further evidence is needed to verify the real-world cost-effectiveness of the CSC intervention when implemented in routine practice and to evaluate its effectiveness/cost-effectiveness when compared to other viable stepped care interventions for the treatment of depression and/or anxiety.
Clinical trials continue to face significant challenges in participant recruitment and retention. The Recruitment Innovation Center (RIC), part of the Trial Innovation Network (TIN), has been funded by the National Center for Advancing Translational Sciences of the National Institutes of Health to develop innovative strategies and technologies to enhance participant engagement in all stages of multicenter clinical trials. In collaboration with investigator teams and liaisons at Clinical and Translational Science Award institutions, the RIC is charged with the mission to design, field-test, and refine novel resources in the context of individual clinical trials. These innovations are disseminated via newsletters, publications, a virtual toolbox on the TIN website, and RIC-hosted collaboration webinars. The RIC has designed, implemented, and provided customized recruitment support for 173 studies across many diverse disease areas. This support has incorporated site feasibility assessments, community input sessions, recruitment materials recommendations, social media campaigns, and an array of study-specific suggestions. The RIC’s goal is to evaluate the efficacy of these resources and provide access to all investigating teams, so that more trials can be completed on time, within budget, with diverse participation, and with enough accrual to power statistical analyses and make substantive contributions to the advancement of healthcare.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Epidemiological studies indicate that individuals with one type of mental disorder have an increased risk of subsequently developing other types of mental disorders. This study aimed to undertake a comprehensive analysis of pair-wise lifetime comorbidity across a range of common mental disorders based on a diverse range of population-based surveys.
Methods
The WHO World Mental Health (WMH) surveys assessed 145 990 adult respondents from 27 countries. Based on retrospectively-reported age-of-onset for 24 DSM-IV mental disorders, associations were examined between all 548 logically possible temporally-ordered disorder pairs. Overall and time-dependent hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated using Cox proportional hazards models. Absolute risks were estimated using the product-limit method. Estimates were generated separately for men and women.
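The following Python sketch (using the lifelines package) shows the general shape of such an analysis for a single ordered disorder pair on synthetic data. It uses a simplified time-fixed exposure rather than the time-varying specification a full analysis would require, and all values are invented.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter

# Synthetic stand-in for one temporally ordered disorder pair (prior disorder
# A -> subsequent disorder B): years to onset of B or censoring, an onset
# indicator, and a prior-disorder indicator (all invented).
rng = np.random.default_rng(2)
n = 2000
prior_a = rng.binomial(1, 0.3, n)
baseline_hazard = 0.02
time_to_b = rng.exponential(1 / (baseline_hazard * np.exp(0.8 * prior_a)))
censor_time = rng.uniform(1, 30, n)
df = pd.DataFrame({
    "time": np.minimum(time_to_b, censor_time),
    "onset_b": (time_to_b <= censor_time).astype(int),
    "prior_a": prior_a,
})

# Cox proportional hazards model: the hazard ratio for prior_a is exp(coef).
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="onset_b")
cph.print_summary()

# Product-limit (Kaplan-Meier) estimate of absolute risk among the exposed.
km = KaplanMeierFitter()
exposed = df[df["prior_a"] == 1]
km.fit(exposed["time"], event_observed=exposed["onset_b"])
print(1 - km.survival_function_.iloc[-1])   # cumulative risk by end of follow-up
```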
Results
Each prior lifetime mental disorder was associated with an increased risk of subsequent first onset of each other disorder. The median HR was 12.1 (mean = 14.4; range 5.2–110.8, interquartile range = 6.0–19.4). The HRs were most prominent between closely-related mental disorder types and in the first 1–2 years after the onset of the prior disorder. Although HRs declined with time since prior disorder, significantly elevated risk of subsequent comorbidity persisted for at least 15 years. Appreciable absolute risks of secondary disorders were found over time for many pairs.
Conclusions
Survey data from a range of sites confirm that comorbidity between mental disorders is common. Understanding the risks of temporally secondary disorders may help design practical programs for primary prevention of secondary disorders.
Community care units (CCUs) are a model of residential psychiatric rehabilitation aiming to improve the independence and community functioning of people with severe and persistent mental illness. This study examined factors predicting improvement in outcomes among CCU consumers.
Methods
Hierarchical regression using data from a retrospective cohort (N = 501) of all consumers admitted to five CCUs in Queensland, Australia between 2005 and 2014. The primary outcome was change in mental health and social functioning (Health of the Nation Outcome Scales). Secondary outcomes were disability (Life Skills Profile-16), service use, accommodation instability, and involuntary treatment. Potential predictors covered service, consumer, and treatment characteristics. Group-level and individualised change were assessed between the year pre-admission and the year post-discharge. Where relevant and available, reliable and clinically significant (RCS) change was assessed by comparison with a normative sample.
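For context, the sketch below shows the widely used Jacobson-Truax logic for reliable and clinically significant change on placeholder numbers. The study's exact reliability estimates, cutoffs, and normative comparison may differ.

```python
import numpy as np

# Placeholder inputs: pre/post HoNOS totals for a few consumers, plus
# normative statistics and a test-retest reliability value (all illustrative).
pre = np.array([22.0, 18.0, 25.0, 15.0])
post = np.array([12.0, 17.0, 15.0, 14.0])
sd_baseline, reliability = 6.0, 0.80
norm_mean, norm_sd = 6.0, 4.0

# Jacobson-Truax reliable change index.
sem = sd_baseline * np.sqrt(1 - reliability)
s_diff = np.sqrt(2 * sem**2)
rci = (post - pre) / s_diff
reliable_improvement = rci < -1.96          # lower HoNOS scores = improvement

# One common clinical-significance criterion: the post-treatment score falls
# within two normative standard deviations of the normative mean.
clinically_significant = post <= norm_mean + 2 * norm_sd
rcs = reliable_improvement & clinically_significant
print(rci.round(2), rcs)
```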
Results
Group-level analyses showed statistically significant improvements in mental health and social functioning, and reductions in psychiatry-related bed-days, emergency department (ED) presentations and involuntary treatment. There were no significant changes in disability or accommodation instability. A total of 54.7% of consumers demonstrated reliable improvement in mental health and social functioning, and 43.0% showed RCS improvement. The majority (60.6%) showed a reliable improvement in psychiatry-related bed-use; a minority demonstrated reliable improvement in ED presentations (12.5%). Significant predictors of improvement included variables related to the CCU care (e.g. episode duration), consumer characteristics (e.g. primary diagnosis) and treatment variables (e.g. psychiatry-related bed-days pre-admission). Higher baseline impairment in mental health and social functioning (β = 1.12) and longer episodes of CCU care (β = 1.03) increased the likelihood of RCS improvement in mental health and social functioning.
Conclusions
CCU care was followed by reliable improvements in relevant outcomes for many consumers. Consumers with poorer mental health and social functioning, and a longer episode of CCU care were more likely to make RCS improvements in mental health and social functioning.
The transmission rate of methicillin-resistant Staphylococcus aureus (MRSA) to gloves or gowns of healthcare personnel (HCP) caring for MRSA patients in a non–intensive care unit setting was 5.4%. Contamination rates were higher among HCP performing direct patient care and when patients had detectable MRSA on their body. These findings may inform risk-based contact precautions.
We studied the association between chlorhexidine gluconate (CHG) concentration on skin and resistant bacterial bioburden. CHG was almost always detected on the skin, and detection of methicillin-resistant Staphylococcus aureus, carbapenem-resistant Enterobacteriaceae, and vancomycin-resistant Enterococcus on skin sites was infrequent. However, we found no correlation between CHG concentration and bacterial bioburden.