Background: Our prior six-year review (n=2165) revealed that 24% of patients undergoing posterior decompression surgeries (laminectomy or discectomy) sought emergency department (ED) care within three months post-surgery. We established an integrated Spine Assessment Clinic (SAC) to enhance patient outcomes and minimize unnecessary ED visits through pre-operative education, targeted QI interventions, and early post-operative follow-up. Methods: We reviewed 13 months of posterior decompression data (n=205) following SAC implementation. These patients received individualized, comprehensive pre-operative education and follow-up phone calls within 7 days post-surgery. ED visits within 90 days post-surgery were tracked using provincial databases and compared to our pre-SAC implementation data. Results: Of 205 patients, 24 (11.6%) accounted for 34 ED visits within 90 days post-op, a significant reduction in the proportion of patients visiting the ED from 24% to 11.6%; overall ED utilization (accounting for multiple visits by the same patient) decreased from 42.1% to 16.6%. Early interventions, including wound monitoring, outpatient bloodwork, and prescription adjustments for pain management, helped mitigate ED visits. Patient satisfaction surveys (n=62) indicated 92% were “highly satisfied” and 100% would recommend the SAC. Conclusions: The SAC reduced ED visits after posterior decompression surgery by over 50% through pre-operative education, focused QI initiatives, and an individualized, proactive approach.
Background: Transcranial Doppler ultrasound (TCD) in a pediatric neurocritical care setting can characterize cerebral hemodynamics by assessing blood flow velocity in the main cerebral arteries. In large vessel occlusions (LVO) requiring endovascular thrombectomy (EVT), TCD can monitor recanalization and arterial re-occlusion. We describe one case in a previously healthy 13-year-old girl with a right M1 middle cerebral artery occlusion. Methods: Analysis was done via a retrospective case review. Results: Our patient underwent a successful EVT six hours after symptom onset. Follow-up TCDs at 4, 8, and 24 hours showed stable peak systolic velocities (PSV) at the narrowed segment of the right M1, ranging from 245 to 270 cm/s, with stable pre-stenotic PSV around 110 cm/s, indicating focal and stable narrowing of the M1 without re-occlusion. No high-intensity transient signals (HITS) were identified on sub-10-minute TCD recordings. An urgent echocardiogram revealed a bicuspid aortic valve with vegetations, with later confirmation of infective endocarditis. The patient made an impressive recovery with only mild deficits. Conclusions: TCD can be an effective tool in a pediatric neurocritical care setting for guiding initial recanalization after EVT and monitoring for arterial re-occlusion, HITS, and hyperperfusion. TCD monitoring also reduces the radiation exposure associated with repeated CTA.
Multicenter clinical trials are essential for evaluating interventions but often face significant challenges in study design, site coordination, participant recruitment, and regulatory compliance. To address these issues, the National Institutes of Health’s National Center for Advancing Translational Sciences established the Trial Innovation Network (TIN). The TIN offers a scientific consultation process, providing access to clinical trial and disease experts who offer input and recommendations throughout the trial’s duration, at no cost to investigators. This approach aims to improve trial design, accelerate implementation, foster interdisciplinary teamwork, and spur innovations that enhance multicenter trial quality and efficiency. The TIN leverages resources of the Clinical and Translational Science Awards (CTSA) program, complementing local capabilities at the investigator’s institution. The Initial Consultation process focuses on the study’s scientific premise, design, site development, recruitment and retention strategies, funding feasibility, and other support areas. As of June 1, 2024, the TIN has provided 431 Initial Consultations to increase efficiency and accelerate trial implementation by delivering customized support and tailored recommendations. Across a range of clinical trials, the TIN has developed standardized, streamlined, and adaptable processes. We describe these processes, provide operational metrics, and include a set of lessons learned for consideration by other trial support and innovation networks.
Recent changes to US research funding are having far-reaching consequences that imperil the integrity of science and the provision of care to vulnerable populations. Resisting these changes, the BJPsych Portfolio reaffirms its commitment to publishing mental science and advancing psychiatric knowledge that improves the mental health of one and all.
Individuals with long-term physical health conditions (LTCs) experience higher rates of depression and anxiety. Conventional self-report measures do not distinguish distress related to LTCs from primary mental health disorders. This difference is important as treatment protocols differ. We developed a transdiagnostic self-report measure of illness-related distress, applicable across LTCs.
Methods
The new Illness-Related Distress (IRD) scale was developed through thematic coding of interviews, a systematic literature search, think-aloud interviews with patients and healthcare providers, and expert-consensus meetings. An internet sample (n = 1,398) of UK-based individuals with LTCs completed the IRD scale for psychometric analysis. We randomly split the sample (1:1) to conduct: (1) an exploratory factor analysis (EFA; n = 698) for item reduction, and (2) iterative confirmatory factor analysis (CFA; n = 700) and exploratory structural equation modeling (ESEM), in which further item reduction took place to generate the final version. Measurement invariance, internal consistency, convergent validity, test–retest reliability, and clinical cut-points were assessed.
Results
EFA suggested a 2-factor structure for the IRD scale, subsequently confirmed by iteratively comparing unidimensional, lower order, and bifactor CFAs and ESEMs. A lower-order correlated 2-factor CFA model (two 7-item subscales: intrapersonal distress and interpersonal distress) was favored and was structurally invariant for gender. Subscales demonstrated excellent internal consistency, very good test–retest reliability, and good convergent validity. Clinical cut-points were identified (intrapersonal = 15, interpersonal = 12).
Conclusion
The IRD scale is the first measure to capture transdiagnostic illness-related distress across LTCs. It may aid assessment within clinical practice and research related to psychological adjustment and distress in LTCs.
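The internal-consistency results reported for the two subscales are conventionally expressed as Cronbach's alpha. As a minimal sketch of how alpha is computed from an item-score matrix (the respondent data and 0–3 scoring below are simulated for illustration only, not the IRD dataset):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Simulated 7-item subscale: items share a common component, so they covary
rng = np.random.default_rng(0)
base = rng.integers(0, 4, size=(100, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(100, 7)), 0, 3).astype(float)
alpha = cronbach_alpha(scores)
```

Alpha approaches 1 as items covary more strongly; values above roughly 0.9 are usually described as "excellent" internal consistency.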
This editorial considers the value and nature of academic psychiatry by asking what defines the specialty and psychiatrists as academics. We frame academic psychiatry as a way of thinking that benefits clinical services and discuss how to inspire the next generation of academics.
Military Servicemembers and Veterans are at elevated risk for suicide, but rarely self-identify to their leaders or clinicians regarding their experience of suicidal thoughts. We developed an algorithm to identify posts containing suicide-related content on a military-specific social media platform.
Methods
Publicly-shared social media posts (n = 8449) from a military-specific social media platform were reviewed and labeled by our team for the presence/absence of suicidal thoughts and behaviors and used to train several machine learning models to identify such posts.
Results
The best performing model was a deep learning (RoBERTa) model that incorporated post text and metadata and detected the presence of suicidal posts with relatively high sensitivity (0.85), specificity (0.96), precision (0.64), F1 score (0.73), and an area under the precision-recall curve of 0.84. Compared to non-suicidal posts, suicidal posts were more likely to contain explicit mentions of suicide, descriptions of risk factors (e.g. depression, PTSD) and help-seeking, and first-person singular pronouns.
Conclusions
Our results demonstrate the feasibility and potential promise of using social media posts to identify at-risk Servicemembers and Veterans. Future work will use this approach to deliver targeted interventions to social media users at risk for suicide.
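The performance figures reported for the RoBERTa classifier follow from standard confusion-matrix definitions. A minimal sketch of those definitions — the counts below are illustrative values chosen to roughly reproduce the reported metrics, not the study's actual confusion matrix:

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Sensitivity (recall), specificity, precision, and F1 from
    confusion-matrix counts (true/false positives and negatives)."""
    sensitivity = tp / (tp + fn)          # fraction of suicidal posts caught
    specificity = tn / (tn + fp)          # fraction of non-suicidal posts passed
    precision = tp / (tp + fp)            # fraction of flagged posts truly suicidal
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"sensitivity": sensitivity, "specificity": specificity,
            "precision": precision, "f1": f1}

# Illustrative counts only (hypothetical)
m = classification_metrics(tp=85, fp=48, tn=1150, fn=15)
```

The gap between high sensitivity/specificity and a lower precision is typical when the positive class (suicidal posts) is rare: even a small false-positive rate over many negatives dilutes precision.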
The COVID-19 pandemic has had major direct (e.g., deaths) and indirect (e.g., social inequities) effects in the United States. While the public health response to the pandemic featured some important successes (e.g., universal masking, and rapid development and approval of vaccines and therapeutics), there were systemic failures (e.g., inadequate public health infrastructure) that overshadowed these successes. Key deficiencies in the U.S. response were shortages of personal protective equipment (PPE) and supply chain failures. Recommendations are provided for mitigating supply shortages and supply chain failures in healthcare settings in future pandemics. Key recommendations for preventing shortages of essential components of infection prevention and control include increasing the stockpile of PPE in the U.S. Strategic National Stockpile, increasing transparency of the Stockpile, invoking the Defense Production Act at an early stage, and rapid review and authorization by FDA/EPA/OSHA of products not approved in the U.S. Recommendations are also provided for mitigating shortages of diagnostic testing, medications, and medical equipment.
Throughout the COVID-19 pandemic, many areas in the United States experienced healthcare personnel (HCP) shortages tied to a variety of factors. Infection prevention programs, in particular, faced increasing workload demands with little opportunity to delegate tasks to others without specific infectious diseases or infection control expertise. Shortages of clinicians providing inpatient care to critically ill patients during the early phase of the pandemic were multifactorial, largely attributed to increasing demands on hospitals to provide care to patients hospitalized with COVID-19, as well as furloughs.1 HCP shortages and challenges during later surges, including the Omicron variant-associated surges, were largely attributed to HCP infections and associated work restrictions during isolation periods and the need to care for family members, particularly children, with COVID-19. Additionally, the detrimental physical and mental health impact of COVID-19 on HCP has led to attrition, which further exacerbates shortages.2 Demands increased in post-acute and long-term care (PALTC) settings, which already faced critical staffing challenges, difficulty with recruitment, and high rates of turnover. Although individual healthcare organizations and state and federal governments have taken actions to mitigate recurring shortages, additional work and innovation are needed to develop longer-term solutions to improve healthcare workforce resiliency. The critical role of those with specialized training in infection prevention, including healthcare epidemiologists, was well demonstrated in pandemic preparedness and response. The COVID-19 pandemic underscored the need to support growth in these fields.3 This commentary outlines the need to develop the US healthcare workforce in preparation for future pandemics.
Throughout history, pandemics and their aftereffects have spurred society to make substantial improvements in healthcare. After the Black Death in 14th-century Europe, changes were made to elevate standards of care and nutrition that resulted in improved life expectancy.1 The 1918 influenza pandemic spurred a movement that emphasized public health surveillance and detection of future outbreaks and eventually led to the creation of the World Health Organization Global Influenza Surveillance Network.2 Most recently, the COVID-19 pandemic exposed many pre-existing problems within the US healthcare system, including (1) a lack of capacity to manage a large influx of contagious patients while simultaneously maintaining routine and emergency care for non-COVID patients; (2) a “just in time” supply network that led to shortages and competition among hospitals, nursing homes, and other care sites for essential supplies; and (3) longstanding inequities in the distribution of healthcare and the healthcare workforce. The decades-long shift from domestic manufacturing to reliance on global supply chains has compounded ongoing gaps in preparedness for supplies such as personal protective equipment and ventilators. Inequities in racial and socioeconomic outcomes highlighted during the pandemic have accelerated the call to focus on diversity, equity, and inclusion (DEI) within our communities. The pandemic also accelerated cooperation between government entities and the healthcare system, resulting in swift implementation of mitigation measures and rollout of new therapies and vaccinations at unprecedented speed, despite our fragmented healthcare delivery system and political divisions.
Still, widespread misinformation or disinformation and political divisions contributed to eroded trust in the public health system and prevented an even uptake of mitigation measures, vaccines and therapeutics, impeding our ability to contain the spread of the virus in this country.3 Ultimately, the lessons of COVID-19 illustrate the need to better prepare for the next pandemic. Rising microbial resistance, emerging and re-emerging pathogens, increased globalization, an aging population, and climate change are all factors that increase the likelihood of another pandemic.4
Background: Sedation in the PICU masks physical exam findings, leading to diagnostic challenges. In adult models, electroencephalography can evaluate the brain’s response to sedation using feedforward connectivity and anteriorization of alpha hubs, proving useful for prognostication. We assessed the feasibility of translating this model to a pediatric population, with the hypothesis that the same markers of adaptive reconfiguration would correlate with a higher potential for recovering consciousness. Methods: Electroencephalograms from children undergoing sedation were analyzed for strength and direction of functional connectivity using the weighted and directed phase lag index. The target population was refined using iterative inclusion criteria. We examined relationships between hub location reconfiguration, directed phase lag index, baseline Glasgow Coma Scale, and 3-month post-treatment Glasgow Outcome Scale-Extended. Results: Evaluation of 14 subjects showed promise in children aged 5-18 undergoing sedation with midazolam, dexmedetomidine, and propofol. Further analysis of five subjects revealed a correlation between adaptive reconfiguration during anesthesia and both higher baseline Glasgow Coma Scale and Glasgow Outcome Scale-Extended scores post-treatment. Conclusions: The findings indicate that the functional brain network connectivity model may have diagnostic and prognostic potential regarding children’s consciousness levels. While the initial data are promising, analysis of six additional cases is pending and essential to thoroughly evaluate the model’s efficacy.
Background: We performed a network meta-analysis of randomized controlled trials to assess the comparative effectiveness of available pharmacological prophylaxis for migraines. Methods: We searched MEDLINE, EMBASE, Web of Science, Scopus, PsycINFO, and Cochrane CENTRAL up to October 2023 for trials that: (1) enrolled adults diagnosed with chronic migraine, and (2) randomized them to any prophylactic medication vs. another medication or placebo. We performed a random-effects frequentist network meta-analysis for patient-important outcomes. Results: We included 193 randomized trials. Compared to placebo, CGRP monoclonal antibodies (mean difference [MD] -1.7, 95% CI -2.2 to -1.1), injection of botulinum toxin (MD -1.8, 95% CI -2.9 to -0.7), calcium channel blockers (MD -1.8, 95% CI -3.0 to -0.5), beta-blockers (MD -1.4, 95% CI -2.6 to -0.2), and anticonvulsants (MD -1.1, 95% CI -1.8 to -0.4) were among the most effective treatments in reducing the average number of headache days per month. Anticonvulsants (risk ratio [RR] 2.3, 95% CI 1.8 to 3.0), calcium channel blockers (RR 1.8, 95% CI 1.1 to 3.1), and tricyclic antidepressants (RR 2.3, 95% CI 1.3 to 3.8) showed the highest risk of discontinuation due to adverse events. Conclusions: Our findings suggest that CGRP inhibitors, botulinum toxin, and beta-blockers may provide the greatest benefit and tolerability for reducing the frequency of migraine headaches.
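The pairwise building block of a random-effects synthesis like this is inverse-variance pooling with an estimate of between-study variance (e.g., the DerSimonian-Laird method); a full network meta-analysis extends this across the whole treatment network. A minimal sketch of the pairwise step, using hypothetical trial-level mean differences in monthly headache days (not the review's data):

```python
import math

def dersimonian_laird(effects, ses):
    """Random-effects pooled estimate and 95% CI via DerSimonian-Laird."""
    w = [1 / se**2 for se in ses]                                # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed)**2 for wi, yi in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                # between-study variance
    w_re = [1 / (se**2 + tau2) for se in ses]                    # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return pooled, (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)

# Hypothetical trials: effect = mean difference in headache days/month vs. placebo
md, ci = dersimonian_laird(effects=[-1.9, -1.4, -1.8], ses=[0.4, 0.5, 0.6])
```

When the studies agree closely (Q below its degrees of freedom), the between-study variance estimate truncates to zero and the result coincides with the fixed-effect pooled estimate.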
Knowledge of sex differences in risk factors for posttraumatic stress disorder (PTSD) can contribute to the development of refined preventive interventions. Therefore, the aim of this study was to examine if women and men differ in their vulnerability to risk factors for PTSD.
Methods
As part of the longitudinal AURORA study, 2924 patients seeking emergency department (ED) treatment in the acute aftermath of trauma provided self-report assessments of pre-, peri-, and post-traumatic risk factors, as well as 3-month PTSD severity. We systematically examined sex-dependent effects of 16 risk factors that have previously been hypothesized to show different associations with PTSD severity in women and men.
Results
Women reported higher PTSD severity at 3 months post-trauma. Z-score comparisons indicated that for five of the 16 examined risk factors, the association with 3-month PTSD severity was stronger in men than in women. In multivariable models, interaction effects with sex were observed for pre-traumatic anxiety symptoms and acute dissociative symptoms; both showed stronger associations with PTSD in men than in women. Subgroup analyses suggested trauma type-conditional effects.
Conclusions
Our findings indicate mechanisms to which men might be particularly vulnerable, demonstrating that known PTSD risk factors might behave differently in women and men. Analyses did not identify any risk factors to which women were more vulnerable than men, pointing toward further mechanisms to explain women's higher PTSD risk. Our study illustrates the need for a more systematic examination of sex differences in contributors to PTSD severity after trauma, which may inform refined preventive interventions.
We present and evaluate the prospects for detecting coherent radio counterparts to gravitational wave (GW) events using Murchison Widefield Array (MWA) triggered observations. The MWA rapid-response system, combined with its buffering mode ($\sim$4 min negative latency), enables us to catch any radio signals produced from seconds prior to hours after a binary neutron star (BNS) merger. The large field of view of the MWA ($\sim$$1\,000\,\textrm{deg}^2$ at 120 MHz) and its location under the high-sensitivity sky region of the LIGO-Virgo-KAGRA (LVK) detector network forecast a high chance of being on-target for a GW event. We consider three observing configurations for the MWA to follow up GW BNS merger events: a single dipole per tile, the full array, and four sub-arrays. We then perform a population synthesis of BNS systems to predict the radio-detectable fraction of GW events using these configurations. We find that the configuration with four sub-arrays is the best compromise between sky coverage and sensitivity, as it is capable of placing meaningful constraints on the radio emission from 12.6% of GW BNS detections. Based on the timescales of four BNS merger coherent radio emission models, we propose an observing strategy that involves triggering the buffering mode to target coherent signals emitted prior to, during, or shortly following the merger, followed by continued recording for up to three hours to target later-time post-merger emission. We expect the MWA to trigger on $\sim$$5-22$ BNS merger events during the LVK O4 observing run, which could potentially result in two detections of predicted coherent emission.
Several hypotheses may explain the association between substance use, posttraumatic stress disorder (PTSD), and depression. However, few studies have utilized a large multisite dataset to understand this complex relationship. Our study assessed the relationship between alcohol and cannabis use trajectories and PTSD and depression symptoms across 3 months in recently trauma-exposed civilians.
Methods
In total, 1618 (1037 female) participants provided self-report data on past 30-day alcohol and cannabis use and PTSD and depression symptoms during their emergency department (baseline) visit. We reassessed participants' substance use and clinical symptoms at 2, 8, and 12 weeks posttrauma. Latent class mixture modeling determined alcohol and cannabis use trajectories in the sample. Changes in PTSD and depression symptoms were assessed across alcohol and cannabis use trajectories via mixed-model repeated-measures analysis of variance.
Results
Three trajectory classes (low, high, and increasing use) provided the best model fit for both alcohol and cannabis use. The low alcohol use class exhibited lower PTSD symptoms at baseline than the high use class; the low cannabis use class exhibited lower PTSD and depression symptoms at baseline than the high and increasing use classes, in which symptoms greatly increased at week 8 and declined at week 12. Participants already using alcohol and cannabis exhibited greater PTSD and depression symptoms at baseline, which increased at week 8 and decreased at week 12.
Conclusions
Our findings suggest that alcohol and cannabis use trajectories are associated with the intensity of posttrauma psychopathology. These findings could potentially inform the timing of therapeutic strategies.
Preoperatively, the patient will transition through different depths of anesthesia, including the levels of sedation, to general anesthesia (GA). Sedation is a continuum, ranging from minimal sedation (anxiolysis) to moderate and deep sedation. Moderate sedation is defined by the patient remaining asleep but easily arousable. Deep sedation is achieved when the patient is arousable only to painful stimulation. GA refers to medically induced loss of consciousness with concurrent loss of protective reflexes and skeletal muscle relaxation. GA is most commonly achieved via induction with intravenous sedatives and analgesics, followed by maintenance with volatile anesthetics [1]. Table 9.1 lists the depths of anesthesia and associated characteristics.
Childhood adversities (CAs) predict heightened risks of posttraumatic stress disorder (PTSD) and major depressive episode (MDE) among people exposed to adult traumatic events. Identifying which CAs put individuals at greatest risk for these adverse posttraumatic neuropsychiatric sequelae (APNS) is important for targeting prevention interventions.
Methods
Data came from n = 999 patients ages 18–75 presenting to 29 U.S. emergency departments after a motor vehicle collision (MVC) and followed for 3 months, the amount of time traditionally used to define chronic PTSD, in the Advancing Understanding of Recovery After Trauma (AURORA) study. Six CA types were self-reported at baseline: physical abuse, sexual abuse, emotional abuse, physical neglect, emotional neglect and bullying. Both dichotomous measures of ever experiencing each CA type and numeric measures of exposure frequency were included in the analysis. Risk ratios (RRs) of these CA measures as well as complex interactions among these measures were examined as predictors of APNS 3 months post-MVC. APNS was defined as meeting self-reported criteria for either PTSD based on the PTSD Checklist for DSM-5 and/or MDE based on the PROMIS Depression Short-Form 8b. We controlled for pre-MVC lifetime histories of PTSD and MDE. We also examined mediating effects through peritraumatic symptoms assessed in the emergency department and PTSD and MDE assessed in 2-week and 8-week follow-up surveys. Analyses were carried out with robust Poisson regression models.
Results
Most participants (90.9%) reported at least rarely having experienced some CA. Ever experiencing each CA other than emotional neglect was univariably associated with 3-month APNS (RRs = 1.31–1.60). Each CA frequency was also univariably associated with 3-month APNS (RRs = 1.65–2.45). In multivariable models, joint associations of CAs with 3-month APNS were additive, with frequency of emotional abuse (RR = 2.03; 95% CI = 1.43–2.87) and bullying (RR = 1.44; 95% CI = 0.99–2.10) being the strongest predictors. Control variable analyses found that these associations were largely explained by pre-MVC histories of PTSD and MDE.
Conclusions
Although individuals who experience frequent emotional abuse and bullying in childhood have a heightened risk of experiencing APNS after an adult MVC, these associations are largely mediated by prior histories of PTSD and MDE.
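The study above estimated risk ratios with robust Poisson regression; as a simpler illustration of the measure itself, a crude risk ratio and its 95% CI can be computed from a 2×2 table. The counts below are hypothetical values for exposition, not AURORA data:

```python
import math

def risk_ratio(a: int, b: int, c: int, d: int):
    """Crude risk ratio with 95% CI from a 2x2 table:
    a/b = outcome present/absent among exposed,
    c/d = outcome present/absent among unexposed."""
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of log(RR)
    log_se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lo = math.exp(math.log(rr) - 1.96 * log_se)
    hi = math.exp(math.log(rr) + 1.96 * log_se)
    return rr, (lo, hi)

# Hypothetical counts: APNS among those with vs. without frequent emotional abuse
rr, (lo, hi) = risk_ratio(a=60, b=140, c=120, d=679)
```

Regression-based estimates (as used in the study) additionally adjust for covariates such as prior PTSD/MDE history, which a crude 2×2 risk ratio cannot do.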
Understanding the quality of seed dispersal effectiveness of frugivorous species can elucidate how endozoochory structures tropical forests. Large seeds, which contain more resources for growth, and gut passage by frugivores, which removes seed pulp, both typically enhance the speed and probability of germination of tropical seeds. However, the interaction of seed size and gut passage has not been well studied. We assessed the role of two species of toucans (Ramphastos spp.) in seed germination of the tropical tree Eugenia uniflora, which produces seeds that vary considerably in size (3.7–14.3 mm), using 151 control and 137 regurgitated seeds in germination trials. We found that toucan regurgitation did not significantly increase germination success (93.4% of regurgitated seeds germinated, compared to 76.8% of control seeds); however, larger seeds germinated more often and at faster rates. Although only marginally significant, germination was 3.6× faster when seeds were both large and regurgitated by toucans, demonstrating that toucan regurgitation can disproportionately benefit larger E. uniflora seeds. As tropical forests are increasingly disturbed and fragmented by human activities, the ability of toucans to continue providing seed dispersal services to degraded habitats may be vital to the persistence of many tropical plants that have larger seeds and depend on larger dispersers.