Patients with posttraumatic stress disorder (PTSD) exhibit smaller volumes in commonly reported brain regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes, and cerebellum, with the most significant effect in the left cerebellum (Hedges' g = 0.22, p_corrected = .001), and smaller cerebellar WM volume (peak Hedges' g = 0.14, p_corrected = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (p_corrected = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (p_corrected = .001).
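The effect sizes above are Hedges' g values. As a point of reference, here is a minimal sketch of how Hedges' g is computed from group summary statistics (the generic formula, not the ENIGMA-VBM pipeline; all inputs are illustrative):

```python
import math

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Bias-corrected standardized mean difference between two groups."""
    # Pooled standard deviation across both groups.
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp            # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)     # small-sample correction factor
    return d * j

# Example with made-up numbers: two groups of 100, means 1.0 vs 0.0, SD 1.0.
g = hedges_g(1.0, 0.0, 1.0, 1.0, 100, 100)
```

The correction factor j shrinks Cohen's d slightly toward zero, which matters most for small samples.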
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
The Biden administration requested comments regarding “Public and Private Sector Uses of Biometric Technologies” in the Federal Register from October 2021 to January 2022. This generated 130 responses, helped shape the “Blueprint for an AI Bill of Rights,” and resulted in Executive Order 14110 on “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.” While the Trump administration immediately rescinded this executive order, these comments provide insight into salient AI biometrics technologies and relevant political players. We first identify AI biometric technologies before asking which institutions and individuals commented (RQ1), and what the substance and tenor of responses were regarding the opportunities and threats posed by AI biometrics (RQ2-a) based on respondent type (RQ2-b). We use text mining and qualitative analyses to illuminate how uncertainty about AI biometric technology in this nascent policy subsystem is reflected in participants’ language use and policy preferences.
The stars of the Milky Way carry the chemical history of our Galaxy in their atmospheres as they journey through its vast expanse. Like barcodes, we can extract the chemical fingerprints of stars from high-resolution spectroscopy. The fourth data release (DR4) of the Galactic Archaeology with HERMES (GALAH) Survey, based on a decade of observations, provides the chemical abundances of up to 32 elements for 917 588 stars that also have exquisite astrometric data from the Gaia satellite. For the first time, these elements include life-essential nitrogen to complement carbon and oxygen, as well as more measurements of rare-earth elements critical to modern-life electronics, offering unparalleled insights into the chemical composition of the Milky Way. For this release, we use neural networks to simultaneously fit stellar parameters and abundances across the whole wavelength range, leveraging synthetic grids computed with Spectroscopy Made Easy. These grids account for atomic line formation in non-local thermodynamic equilibrium for 14 elements. In a two-iteration process, we first fit stellar labels to all 1 085 520 spectra, then co-add repeated observations and refine these labels using astrometric data from Gaia and 2MASS photometry, improving the accuracy and precision of stellar parameters and abundances. Our validation thoroughly assesses the reliability of spectroscopic measurements and highlights key caveats. GALAH DR4 represents yet another milestone in Galactic archaeology, combining detailed chemical compositions from multiple nucleosynthetic channels with kinematic information and age estimates. The resulting dataset, covering nearly a million stars, opens new avenues for understanding not only the chemical and dynamical history of the Milky Way but also the broader questions of the origin of elements and the evolution of planets, stars, and galaxies.
Syncope is common among pediatric patients and is rarely pathologic. The mechanisms for symptoms during exercise are less well understood than the resting mechanisms. Additionally, inert gas rebreathing analysis, a non-invasive examination of haemodynamics including cardiac output, has not previously been studied in youth with neurocardiogenic syncope.
Methods:
This was a retrospective (2017–2023), single-center cohort study in pediatric patients ≤ 21 years with prior peri-exertional syncope evaluated with echocardiography and cardiopulmonary exercise testing with inert gas rebreathing analysis performed on the same day. Patients with and without symptoms during or immediately following exercise were noted.
Results:
Of the 101 patients (15.2 ± 2.3 years; 31% male), there were 22 patients with symptoms during exercise testing or recovery. Resting echocardiography stroke volume correlated with resting (r = 0.53, p < 0.0001) and peak stroke volume (r = 0.32, p = 0.009) by inert gas rebreathing and with peak oxygen pulse (r = 0.61, p < 0.0001). Patients with syncopal symptoms peri-exercise had lower left ventricular end-diastolic volume (Z-score –1.2 ± 1.3 vs. –0.36 ± 1.3, p = 0.01) and end-systolic volume (Z-score –1.0 ± 1.4 vs. −0.1 ± 1.1, p = 0.001) by echocardiography, lower percent predicted peak oxygen pulse during exercise (95.5 ± 14.0 vs. 104.6 ± 18.5%, p = 0.04), and slower post-exercise heart rate recovery (31.0 ± 12.7 vs. 37.8 ± 13.2 bpm, p = 0.03).
Discussion:
Among youth with a history of peri-exertional syncope, those who become syncopal with exercise testing have lower left ventricular volumes at rest, decreased peak oxygen pulse, and slower heart rate recovery after exercise than those who remain asymptomatic. Peak oxygen pulse and resting stroke volume on inert gas rebreathing are associated with stroke volume on echocardiogram.
We present a re-discovery of G278.94+1.35a as possibly one of the largest known Galactic supernova remnants (SNRs) – which we name Diprotodon. While previously established as a Galactic SNR, Diprotodon is visible in our new Evolutionary Map of the Universe (EMU) and GaLactic and Extragalactic All-sky MWA (GLEAM) radio continuum images at an angular size of 3.33° × 3.23°, much larger than previously measured. At the previously suggested distance of 2.7 kpc, this implies a diameter of 157 × 152 pc. This size would qualify Diprotodon as the largest known SNR and pushes our estimates of SNR sizes to the upper limits. We investigate the environment in which the SNR is located and examine various scenarios that might explain such a large and relatively bright SNR appearance. We find that Diprotodon is most likely at a much closer distance of ~1 kpc, implying a diameter of 58 × 56 pc and placing it in the radiative evolutionary phase. We also present a new Fermi-LAT data analysis that confirms the angular extent of the SNR in gamma rays. The origin of the high-energy emission remains somewhat puzzling, and the scenarios we explore reveal new puzzles, given this unexpected and unique observation of a seemingly evolved SNR having a hard GeV spectrum with no breaks. We explore both leptonic and hadronic scenarios, as well as the possibility that the high-energy emission arises from the leftover particle population of a historic pulsar wind nebula.
Objective:
Identify risk factors for central line-associated bloodstream infections (CLABSI) in pediatric intensive care settings in an era with a high focus on prevention measures.
Design:
Matched, case–control study.
Setting:
Quaternary children’s hospital.
Patients:
Cases had a CLABSI during an intensive care unit (ICU) stay between January 1, 2015 and December 31, 2020. Controls were matched 4:1 by ICU and admission date and did not develop a CLABSI.
Methods:
Multivariable, mixed-effects logistic regression.
Results:
129 cases were matched to 516 controls. Central venous catheter (CVC) maintenance bundle compliance was >70%. Independent CLABSI risk factors included administration of continuous non-opioid sedative (adjusted odds ratio (aOR) 2.96, 95% CI [1.16, 7.52], P = 0.023), number of days with one or more CVC in place (aOR 1.42 per 10 days [1.16, 1.74], P = 0.001), and the combination of a chronic CVC with administration of parenteral nutrition (aOR 4.82 [1.38, 16.9], P = 0.014). Variables independently associated with lower odds of CLABSI included CVC location in an upper extremity (aOR 0.16 [0.05, 0.55], P = 0.004); non-tunneled CVC (aOR 0.17 [0.04, 0.63], P = 0.008); presence of an endotracheal tube (aOR 0.21 [0.08, 0.6], P = 0.004), Foley catheter (aOR 0.3 [0.13, 0.68], P = 0.004); transport to radiology (aOR 0.31 [0.1, 0.94], P = 0.039); continuous neuromuscular blockade (aOR 0.29 [0.1, 0.86], P = 0.025); and administration of histamine H2 blocking medications (aOR 0.17 [0.06, 0.48], P = 0.001).
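The adjusted odds ratios above come from a mixed-effects logistic regression. As a hedged illustration (not the study's code), a model coefficient on the log-odds scale converts to an odds ratio with a Wald 95% confidence interval like this:

```python
import math

def odds_ratio_with_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a Wald 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative coefficient and standard error, not values from the study.
or_, lo, hi = odds_ratio_with_ci(beta=1.085, se=0.475)
```

Because exponentiation is monotonic, a coefficient significantly above zero maps to an odds ratio with a CI lying above 1, and vice versa.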
Conclusions:
Pediatric intensive care patients with chronic CVCs receiving parenteral nutrition, those on non-opioid sedative infusions, and those with more central line days are at increased risk for CLABSI despite current prevention measures.
Postural orthostatic tachycardia syndrome is a debilitating disorder. We compared paediatric patients with this dysautonomia presenting with and without peak upright heart rate > 100 beats per minute.
Materials and Methods:
Subjects were drawn from the Postural Orthostatic Tachycardia Syndrome Program database of the Children’s Hospital of Philadelphia and were diagnosed between 2007 and 2018. Subjects were aged 12–18 years at diagnosis and had demographic data, supine and peak heart rates from a 10-minute stand, symptoms, and family history recorded. Patients were divided into “low heart rate” (peak less than 100 beats/minute) and “high heart rate” (peak at least 100 beats/minute) groups.
Results:
In total, 729 subjects were included (low heart rate group: 131 patients, high heart rate group: 598 patients). The low heart rate group had later age at diagnosis (16.1 versus 15.7, p = 0.0027). Median heart rate increase was 32 beats/minute in the low heart rate group versus 40 beats/minute in the high heart rate group (p < 0.00001). Excluding palpitations and tachypalpitations, there were no differences in symptom type or frequency between groups.
Discussion:
Paediatric patients meeting heart rate criteria for postural orthostatic tachycardia syndrome but without peak heart rate > 100 demonstrate no difference in symptom type or frequency versus those who meet both criteria. Differences observed reached statistical significance due to population size but are not clinically meaningful. This suggests that increased heart rate, but not necessarily tachycardia, is seen in these patients, supporting previous findings suggesting maximal heart rate is not a major determinant of symptom prevalence in paediatric postural orthostatic tachycardia syndrome.
Contemporary understanding of the mechanisms of disease increasingly points to examples of “genetic diseases” with an infectious component and of “infectious diseases” with a genetic component. Such blurred boundaries generate ethical, legal, and social issues and highlight historical contexts that must be examined when incorporating host genomic information into the prevention, outbreak control, and treatment of infectious diseases.
Evaluation of adult antibiotic order sets (AOSs) on antibiotic stewardship metrics has been limited. The primary outcome was the standardized antimicrobial administration ratio (SAAR). Secondary outcomes included antibiotic days of therapy (DOT) per 1,000 patient days (PD); selected antibiotic use; AOS utilization; Clostridioides difficile infection (CDI) cases; and clinicians’ perceptions of the AOS via a survey following the final study phase.
Design:
This 5-year, single-center, quasi-experimental study comprised 5 phases from 2017 to 2022 over 10-month periods between August 1 and May 31.
Setting:
The study was conducted in a 752-bed tertiary care, academic medical center.
Intervention:
Our institution implemented AOSs in the electronic medical record (EMR) for common infections among hospitalized adults.
Results:
For the primary outcome, a statistically significant decrease in SAAR was detected from phase 1 to phase 5 (1.0 vs 0.90; P < .001). Statistically significant decreases were also detected in DOT per 1,000 PD (4,884 vs 3,939; P = .001), fluoroquinolone orders (407 vs 175; P < .001), carbapenem orders (147 vs 106; P = .024), and clindamycin orders (113 vs 73; P = .01). No statistically significant change in mean vancomycin orders was detected (991 vs 902; P = .221). A statistically significant decrease in CDI cases was also detected (7.8 vs 2.4; P = .002) but may have been attributable to changes in CDI case diagnosis. Clinicians indicated that the AOSs were easy to use overall and that they helped them select the appropriate antibiotics.
Conclusions:
Implementing AOS into the EMR was associated with a statistically significant reduction in SAAR, antibiotic DOT per 1,000 PD, selected antibiotic orders, and CDI cases.
Although some animal research suggests possible sex differences in response to THC exposure (e.g., Cooper & Craft, 2018), human studies are limited. One study found that among individuals who rarely use cannabis, females given similar amounts of oral and vaporized THC reported greater subjective intoxication than males (Sholler et al., 2020). However, in a study of daily users, females reported levels of intoxication indistinguishable from those of males after smoking similar amounts (Cooper & Haney, 2014), while males and females using 1–4x/week showed similar levels of intoxication despite females having lower blood THC and metabolite concentrations (Matheson et al., 2020). It is important to elucidate sex differences in biological indicators of cannabis intoxication given potential driving/workplace implications as states increasingly legalize use. The current study examined whether, when males and females are closely matched on cannabis use variables, there are predictable sex differences in residual whole blood THC and metabolite concentrations, and in THC/metabolite concentrations, subjective appraisals of intoxication, and driving performance following acute cannabis consumption.
Participants and Methods:
The current study was part of a randomized clinical trial (Marcotte et al., 2022). Participants smoked ad libitum THC cigarettes and then completed driving simulations, blood draws, and subjective measures of intoxication. The main outcomes were the change in Composite Drive Score (CDS; global measure of driving performance) from baseline, whole blood THC, 11-OH-THC, and THC-COOH levels (ng/mL), and subjective ratings of how “high” participants felt (0 = not at all, 100 = extremely). For this analysis of participants receiving active THC, males were matched to females on 1) estimated THC exposure (g) in the last 6 months (24M, 24F) or 2) whole blood THC concentrations immediately post-smoking (23M, 23F).
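The matching step described above can be sketched as simple 1:1 nearest-neighbor matching on a single continuous variable (e.g., estimated THC exposure in the last 6 months). This is an illustrative sketch, not the study's procedure; all names and values are hypothetical:

```python
def match_pairs(group_a, group_b):
    """Greedily pair each value in group_a with its closest
    unused value in group_b (1:1 nearest-neighbor matching)."""
    available = list(group_b)
    pairs = []
    for a in group_a:
        best = min(available, key=lambda b: abs(a - b))
        available.remove(best)  # each match can be used only once
        pairs.append((a, best))
    return pairs

# Hypothetical exposure values (grams) for two groups.
pairs = match_pairs([10.0, 50.0], [48.0, 12.0, 30.0])
```

A greedy pass is order-dependent; real matching pipelines typically use optimal matching or calipers, but the idea of pairing subjects on a covariate is the same.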
Results:
When matched on THC exposure in the past 6 months (overall mean of 46 grams; p = .99), there were no sex differences in any cannabinoid/metabolite concentrations at baseline (all p > .83) or after cannabis administration (all p > .72). Nor were there differences in the change in CDS from pre-to-post-smoking (p = .26) or subjective “highness” ratings (p = .53). When matched on whole blood THC concentrations immediately after smoking (mean of 34 ng/mL for both sexes, p = .99), no differences were found in CDS change from pre-to-post smoking (p = .81), THC metabolite concentrations (all p > .25), or subjective “highness” ratings (p = .56). For both analyses, males and females did not differ in BMI (both p > .7).
Conclusions:
When male and female cannabis users are well-matched on use history, we find no significant differences in cannabinoid concentrations following a mean of 5 days of abstinence, suggesting that there are no clear biological sex differences in carryover residual effects. We also find no significant sex differences following ad libitum smoking in driving performance, subjective ratings of “highness,” or whole blood THC and metabolite concentrations, indicating that there are no biological sex differences in acute response to THC. This improves upon previous research by closely matching participants over a wider range of use-intensity variables, although the small sample size precludes definitive conclusions.
Multiple sclerosis (MS) is a debilitating neurological disease associated with a variety of psychological, cognitive, and motoric symptoms. Walking is among the most important functions compromised by MS. Dual-task walking (DTW), an everyday activity in which people walk and engage in a concurrent, discrete task, has been assessed in MS, but little is known about how it relates to other MS symptoms. Self-awareness theory suggests that DTW may be a function of the interactions among psychological, cognitive, and motor processes.
Method:
Cognitive testing, self-report assessments for depression and falls self-efficacy (FSE), and walk evaluations [DTW and single-task walk (STW)] were assessed in seventy-three people with MS in a clinical care setting. Specifically, we assessed whether psychological factors (depression and FSE) that alter subjective evaluations regarding one’s abilities would moderate the relationships between physical and cognitive abilities and DTW performance.
Results:
DTW speed is related to diverse physical and cognitive predictors. In support of self-awareness theory, FSE moderated the relationship between STW and DTW speeds such that lower FSE attenuated the strength of the relationship between them. DTW costs – the change in speed normalized by STW speed – did not relate to cognitive and motor predictors. DTW costs did relate to depressive symptoms, and depressive symptoms moderated the effect of information processing on DTW costs.
Conclusions:
Findings indicate that an interplay of physical ability and psychological factors – like depression and FSE – may enhance understanding of walking performance under complex, real-world, DTW contexts.
Disruptive behavior disorders (DBD) are heterogeneous at the clinical and the biological level. Therefore, the aims were to dissect the heterogeneous neurodevelopmental deviations of the affective brain circuitry and provide an integration of these differences across modalities.
Methods
We combined two novel approaches. First, normative modeling to map individual-level deviations from the typical age-related pattern in (i) activity during emotion matching and (ii) anatomical images, derived from DBD cases (n = 77) and controls (n = 52) aged 8–18 years from the EU-funded Aggressotype and MATRICS consortia. Second, linked independent component analysis to integrate subject-specific deviations from both modalities.
Results
While cases exhibited, on average, higher activity during face processing than would be expected for their age in regions such as the amygdala when compared to controls, these positive deviations were widespread at the individual level. A multimodal integration of all functional and anatomical deviations explained 23% of the variance in the clinical DBD phenotype. Most notably, the top marker, encompassing the default mode network (DMN) and subcortical regions such as the amygdala and the striatum, was related to aggression across the whole sample.
Conclusions
Overall, increased age-related deviations in the amygdala in DBD suggest a maturational delay, which has to be further validated in future studies. Further, the integration of individual deviation patterns from multiple imaging modalities allowed us to dissect some of the heterogeneity of DBD and identified the DMN, the striatum, and the amygdala as neural signatures associated with aggression.
Cross-species evidence suggests that the ability to exert control over a stressor is a key dimension of stress exposure that may sensitize frontostriatal-amygdala circuitry to promote more adaptive responses to subsequent stressors. The present study examined neural correlates of stressor controllability in young adults. Participants (N = 56; mean age = 23.74 years, range = 18–30) completed either the controllable or uncontrollable stress condition of the first of two novel stressor controllability tasks during functional magnetic resonance imaging (fMRI) acquisition. Participants in the uncontrollable stress condition were yoked to age- and sex-matched participants in the controllable stress condition. All participants were subsequently exposed to uncontrollable stress in the second task, which is the focus of fMRI analyses reported here. A whole-brain searchlight classification analysis revealed that patterns of activity in the right dorsal anterior insula (dAI) during subsequent exposure to uncontrollable stress could be used to classify participants' initial exposure to either controllable or uncontrollable stress with a peak of 73% accuracy. Previous experience of exerting control over a stressor may change the computations performed within the right dAI during subsequent stress exposure, shedding further light on the neural underpinnings of stressor controllability.
Guidelines recommend empowering patients and families to remind healthcare workers (HCWs) to perform hand hygiene (HH). The effectiveness of empowerment tools for patients and their families in Southeast Asia is unknown.
Methods:
We performed a prospective study in a pediatric intensive care unit (PICU) of a Vietnamese pediatric referral hospital. With family and HCW input, we developed a visual tool for families to prompt HCW HH. We used direct observation to collect baseline HH data. We then enrolled families to receive the visual tool and education on its use while continuing prospective collection of HH data. Multivariable logistic regression was used to identify independent predictors of HH in baseline and implementation periods.
Results:
In total, 2,014 baseline and 2,498 implementation-period HH opportunities were observed. During the implementation period, 73 families were enrolled. Overall, HCW HH was 46% preimplementation, which increased to 73% in the implementation period (P < .001). The lowest HH adherence in both periods occurred after HCW contact with patient surroundings: 16% at baseline increased to 24% after implementation. In multivariable analyses, the odds of HCW HH during the implementation period were significantly higher than baseline (adjusted odds ratio [aOR], 2.94; 95% confidence interval [CI], 2.54–3.41; P < .001) after adjusting for observation room, HCW type, time of observation (weekday business hours vs evening or weekend), and HH moment.
Conclusions:
The introduction of a visual empowerment tool was associated with significant improvement in HH adherence among HCWs in a Vietnamese PICU. Future research should explore acceptability and barriers to use of similar tools in low- and middle-income settings.
Brainstem auditory evoked potentials (BAEP) are useful indicators of auditory function during posterior fossa surgery. Several potential mechanisms of injury may affect the cochlear nerve, and complete loss of BAEP is often associated with postoperative hearing loss. We report two cases of intraoperative auditory loss related to vascular compression upon the cochlear nerve.
Methods:
Intra-operative BAEP were monitored in a consecutive series of over 300 microvascular decompressions (MVD) performed in a recent twelve-month period. In two patients undergoing treatment for trigeminal neuralgia, BAEP waveforms suddenly disappeared completely during closure of the dura.
Results:
The cerebello-pontine angle was immediately re-explored and there was no evidence of hemorrhage or cerebellar swelling. The cochlear nerve and brainstem were inspected, and prominent vascular compression was identified in both patients. A cochlear nerve MVD resulted in immediate restoration of BAEP, and both patients recovered without hearing loss.
Conclusion:
These cases illustrate that vascular compression upon the cochlear nerve may disrupt function, and is reversible with MVD. Awareness of this event and recognition of BAEP changes alert the neurosurgeon to a potential reversible cause of hearing loss during posterior fossa surgery.
Among 39 pediatric hospitals, we observed that pediatric S. aureus hospitalizations decreased 36%, from 26.3 to 16.8 infections per 1,000 admissions, between 2009 and 2016, with methicillin-resistant S. aureus (MRSA) decreasing by 52% and methicillin-susceptible S. aureus decreasing by 17%. Similar decreases were observed for days of therapy of anti-MRSA antibiotics.
Novel approaches to improving disaster response have begun to include the use of big data and information and communication technology (ICT). However, there remains a dearth of literature on the use of these technologies in disasters. We have conducted an integrative literature review on the role of ICT and big data in disasters. Included in the review were 113 studies that met our predetermined inclusion criteria. Most studies used qualitative methods (39.8%, n=45) over mixed methods (31%, n=35) or quantitative methods (29.2%, n=33). Nearly 80% (n=88) covered only the response phase of disasters and only 15% (n=17) of the studies addressed disasters in low- and middle-income countries. The 4 most frequently mentioned tools were geographic information systems, social media, patient information, and disaster modeling. We suggest testing ICT and big data tools more widely, especially outside of high-income countries, as well as in nonresponse phases of disasters (eg, disaster recovery), to increase an understanding of the utility of ICT and big data in disasters. Future studies should also include descriptions of the intended users of the tools, as well as implementation challenges, to assist other disaster response professionals in adapting or creating similar tools. (Disaster Med Public Health Preparedness. 2019;13:353–367)
Advances in emergency medicine research can be slow to make their way into clinical care, and implementing a new evidence-based intervention can be challenging in the emergency department. The Canadian Association of Emergency Physicians (CAEP) Knowledge Translation Symposium working group set out to produce recommendations for best practice in the implementation of new science in Canadian emergency departments.
Methods
A systematic review of implementation strategies to change health care provider behaviour in the emergency department was conducted simultaneously with a national survey of emergency physician experience. We summarized our findings into a list of draft recommendations that were presented at the national CAEP Conference 2017 and further refined based on feedback through social media strategies.
Results
We produced 10 recommendations for implementing new evidence-based interventions in the emergency department, which cover identifying a practice gap, evaluating the evidence, planning the intervention strategy, monitoring, providing feedback during implementation, and desired qualities of future implementation research.
Conclusions
We present recommendations to guide future emergency department implementation initiatives. There is a need for robust and well-designed implementation research to guide future emergency department implementation initiatives.
Depression contributes to persistent opioid analgesic use (OAU). Treating depression may increase opioid cessation.
Aims
To determine if adherence to antidepressant medications (ADMs) v. non-adherence was associated with opioid cessation in patients with a new depression episode after >90 days of OAU.
Method
Patients with non-cancer, non-HIV pain (n = 2821), with a new episode of depression following >90 days of OAU, were eligible if they received ≥1 ADM prescription from 2002 to 2012. ADM adherence was defined as >80% of days covered. Opioid cessation was defined as ≥182 days without a prescription refill. Confounding was controlled by inverse probability of treatment weighting.
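The inverse-probability-of-treatment weighting (IPTW) named above can be sketched as follows. This is a generic illustration with hypothetical treatment indicators and propensity scores, not the study's implementation:

```python
def iptw_weights(treated, propensity):
    """Stabilized inverse-probability-of-treatment weights:
    P(T=1)/e(x) for treated subjects, P(T=0)/(1-e(x)) for untreated,
    where e(x) is the estimated propensity score."""
    p_treat = sum(treated) / len(treated)  # marginal treatment probability
    return [p_treat / e if t else (1 - p_treat) / (1 - e)
            for t, e in zip(treated, propensity)]

# Hypothetical binary treatment indicators and propensity scores.
w = iptw_weights([1, 1, 0, 0, 0], [0.8, 0.6, 0.5, 0.3, 0.2])
```

Weighting each subject this way creates a pseudo-population in which treatment is independent of the measured confounders that entered the propensity model.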
Results
In weighted data, the incidence rate of opioid cessation was significantly (P = 0.007) greater in patients who adhered v. those who did not adhere to antidepressants (57.2/1000 v. 45.0/1000 person-years). ADM adherence was significantly associated with opioid cessation (odds ratio (OR) = 1.24, 95% CI 1.05–1.46).
Conclusions
ADM adherence, compared with non-adherence, is associated with opioid cessation in non-cancer pain. Opioid taper and cessation may be more successful when depression is treated to remission.
Firefighters represent an important population for understanding the consequences of exposure to potentially traumatic stressors.
Hypothesis/Problem
The researchers were interested in the effects of pre-employment disaster exposure on firefighter recruits’ depression and posttraumatic stress disorder (PTSD) symptoms during the first three years of fire service and hypothesized that: (1) disaster-exposed firefighters would have greater depression and PTSD symptoms than non-exposed overall; and (2) depression and PTSD symptoms would worsen over years in fire service in exposed firefighters, but not in their unexposed counterparts.
Methods
In a baseline interview, 35 male firefighter recruits from seven US cities reported lifetime exposure to natural disaster. These disaster-exposed male firefighter recruits were matched on age, city, and education with non-exposed recruits.
Results
A generalized linear mixed model revealed a significant exposure × time interaction (e^coefficient = 1.04; P < .001), such that depression symptoms increased with time only for those with pre-employment disaster exposure. This pattern persisted after controlling for social support from colleagues (e^coefficient = 1.05; P < .001), social support from families (e^coefficient = 1.04; P = .001), and on-the-job trauma exposure (coefficient = 0.06; e^coefficient = 1.11; P < .001). Posttraumatic stress disorder symptoms did not vary significantly between exposure groups at baseline (P = .61).
Conclusion
Depression symptoms increased with time for those with pre-employment disaster exposure only, even after controlling for social support. Posttraumatic stress disorder symptoms did not vary between exposure groups.
Pennington ML, Carpenter TP, Synett SJ, Torres VA, Teague J, Morissette SB, Knight J, Kamholz BW, Keane TM, Zimering RT, Gulliver SB. The Influence of Exposure to Natural Disasters on Depression and PTSD Symptoms among Firefighters. Prehosp Disaster Med. 2018;33(1):102–108.