Patients with posttraumatic stress disorder (PTSD) exhibit smaller regional brain volumes, most commonly reported in the amygdala and hippocampus, regions associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and the cerebellum, with the largest effect in the left cerebellum (Hedges’ g = 0.22, pcorrected = .001), and smaller cerebellar WM volume (peak Hedges’ g = 0.14, pcorrected = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed that PTSD severity was negatively associated with GM volumes within the cerebellum (pcorrected = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (pcorrected = .001).
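The effect sizes above are Hedges’ g values. As a point of reference, Hedges’ g is Cohen’s d with a small-sample bias correction; the sketch below is a generic illustration of that formula, not the ENIGMA-VBM tool’s pipeline, and the numbers in the example are invented.

```python
import math

def hedges_g(mean1, mean2, sd1, sd2, n1, n2):
    """Hedges' g: standardized mean difference with small-sample correction."""
    # Pooled standard deviation across the two groups
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / s_pooled          # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)         # approximate bias-correction factor
    return d * j

# Illustrative numbers only (not values from the study)
g = hedges_g(5.0, 5.2, 0.9, 0.9, 1309, 2198)  # negative: patients < controls
```

With equal group SDs the pooled SD reduces to that common SD, and the correction factor shrinks g slightly toward zero for small samples.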
Conclusions
PTSD patients exhibited widespread, regional differences in brain volumes where greater regional deficits appeared to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
The COVID-19 pandemic resulted in a mental health crisis in adolescents. To evaluate resource needs, we attempted to collect data from Children’s Health Fund’s national network of pediatric practices working in resource-limited settings.
Methods
Data could not be collected largely due to other disaster response priorities for our network. Using a STROBE flowchart, we characterize the inability to collect data, provide insight into network challenges, and offer this report as a case example for the limitations in collecting data during disaster response.
Results
Only 2 of 24 programs had the capacity and the data to participate. Causes of non-participation included the shifting of work toward other aspects of disaster response, which limited data collection, and a lack of human resources to extract the data.
Conclusions
Disasters disproportionately affect under-resourced communities. The lack of resources impairs disaster response because of conflicting priorities among those working within these communities.
Objectives/Goals: Second-generation antipsychotics (SGAs) are used to treat mental disorders in youth but are linked to metabolic syndrome (MetS). Most data on prescribing practices and risk factors are from short-term studies (6–12 months). We aim to characterize prescribing and identify clinical and genetic predictors of MetS using electronic health records (EHR). Methods/Study Population: EHR data were extracted from Cincinnati Children’s Hospital Medical Center (CCHMC) for patients aged ≤21 years prescribed SGAs between 7/1/2009 and 7/1/2024, identifying prescribing prevalence. Next steps are to create an SGA-MetS case–control dataset 8 weeks after an SGA prescription. A case will be defined by meeting 3 of 5 criteria: 1) BMI ≥95th percentile for age/sex; 2) fasting glucose ≥100 mg/dL or use of anti-diabetics; 3) triglycerides ≥110 mg/dL; 4) HDL-C ≤40 mg/dL; 5) systolic/diastolic BP ≥90th percentile for age/sex or use of antihypertensives. The prevalence of SGA-MetS will be calculated by dividing SGA-MetS cases by total SGA users. Logistic regression will identify clinical predictors of MetS, and we will evaluate the association of polygenic risk scores (PRS) for BMI and type 2 diabetes with SGA-MetS risk. Results/Anticipated Results: Our preliminary analysis identified 30,076 patients who were prescribed SGAs (mean age 12 years, SD = 4; 58.8% female, n = 17,685). Most self-identified as non-Hispanic (95%, n = 28,595) and of White race (76%, n = 22,935), with 18.5% self-identifying as Black or African American (n = 5,579). The most commonly prescribed SGAs were risperidone (n = 12,382, 41.1%), aripiprazole (n = 9,847, 32.7%), and quetiapine (n = 5,263, 17.5%), with much lower prescribing rates for other SGAs known for their lower risk of MetS (e.g., ziprasidone, 5.5%; lurasidone, 1.4%; paliperidone, n = 316, 1.1%) and for others (cariprazine, n = 72; asenapine, n = 43; brexpiprazole, n = 39; iloperidone, n = 24; clozapine, n = 20).
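The 3-of-5 case definition described in the methods above is a simple rule-based check; it can be sketched as follows. The function and its argument names are illustrative, not the study’s code, though the thresholds follow the abstract.

```python
def meets_mets_criteria(bmi_pct, fasting_glucose, on_antidiabetics,
                        triglycerides, hdl_c, bp_pct, on_antihypertensives):
    """Return True if >=3 of the 5 SGA-MetS criteria are met."""
    criteria = [
        bmi_pct >= 95,                               # 1) BMI >=95th percentile for age/sex
        fasting_glucose >= 100 or on_antidiabetics,  # 2) glucose >=100 mg/dL or anti-diabetics
        triglycerides >= 110,                        # 3) triglycerides >=110 mg/dL
        hdl_c <= 40,                                 # 4) HDL-C <=40 mg/dL
        bp_pct >= 90 or on_antihypertensives,        # 5) BP >=90th percentile or antihypertensives
    ]
    return sum(criteria) >= 3

# Example: elevated BMI, glucose, and triglycerides -> 3 criteria -> case
is_case = meets_mets_criteria(96, 105, False, 120, 50, 50, False)
```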
Discussion/Significance of Impact: Our analyses found that risperidone, quetiapine, and aripiprazole were the most prescribed SGAs, with risperidone and quetiapine linked to a higher risk of MetS. We will present ongoing work identifying risk factors for SGA-MetS and examining its association with PRS. Our work has the potential to identify high-risk patients for personalized treatment.
To better understand clinicians’ rationale for ordering testing for C. difficile infection (CDI) for patients receiving laxatives and the impact of implementing a clinical decision support (CDS) intervention.
Design:
A mixed-methods case series was performed from March 2, 2017, to December 31, 2018.
Setting:
Yale New Haven Hospital, a 1,541-bed tertiary academic medical center.
Participants:
Hospitalized patients ≥18 years old and clinicians who were alerted by the CDS.
Intervention:
CDS was triggered in real-time when a clinician sought to order testing for CDI for a patient who received one or more doses of laxatives within the preceding 24 hours.
Results:
A total of 3,376 CDS alerts were triggered during the 21-month study period from 2,567 unique clinician interactions. Clinicians bypassed the CDS alert 74.5% of the time; bypassing was more frequent among residents (48.3% bypass vs. 39.9% accept) and advanced practice providers (APPs) (34.9% bypass vs. 30.6% accept) than among attendings (11.3% bypass vs. 22.5% accept). Ordering clinicians cited increased stool frequency/output (48%), current antibiotic exposure (34%), and instructions by an attending physician to test (28%) as the most common reasons for overriding the alert and proceeding with testing for CDI.
Conclusions:
Testing for CDI despite patient laxative use was associated with an increased clinician concern for CDI, patient risk for CDI, and attending physician instruction for testing. Attendings frequently accepted CDS guidance while residents and APPs often reinstated CDI test orders, suggesting a need for greater empowerment and discretion when ordering tests.
New drugs targeting different pathways in pulmonary hypertension have resulted in increased use of combination therapy, but details of this use in infants are not well described. In this large multicenter database study, we describe the pharmacoepidemiology of combination pulmonary vasodilator therapy in critically ill infants.
Methods:
We identified inborn infants discharged home from a Pediatrix neonatal ICU from 1997 to 2020 who were exposed to inhaled nitric oxide, sildenafil, epoprostenol, or bosentan for more than two consecutive days. We compared clinical variables and drug utilisation between infants receiving simultaneous combination therapy and monotherapy. We reported each combination’s frequency, timing, and duration and graphically represented drug use over time.
Results:
Of the 7,681 infants who met inclusion criteria, 664 (9%) received combination therapy. These infants had a lower median gestational age and birth weight, were more likely to have cardiac and pulmonary anomalies and to receive cardiorespiratory support, and had higher in-hospital mortality than those receiving monotherapy. Inhaled nitric oxide and sildenafil were most frequently used, and utilisation of both combination therapy and monotherapy for all drugs increased over time. Inhaled nitric oxide and epoprostenol were used in infants with a higher gestational age, at an earlier postnatal age, and for a shorter duration than sildenafil and bosentan. Dual therapy with inhaled nitric oxide and sildenafil was the most common combination therapy.
Conclusion:
Our study revealed an increased use of combination pulmonary vasodilator therapy, favouring inhaled nitric oxide and sildenafil, yet with considerable practice variation. Further research is needed to determine the optimal combination, sequence, dosing, and disease-specific indications for combination therapy.
The heavy atom content and distribution in chlorite were estimated using the relative intensities of basal X-ray powder diffraction (XRD) peaks. For these peaks to be meaningful, however, corrections had to be made for the effects of sample thickness, sample length, and preferred orientation of the mineral grains, all of which are 2θ dependent. The effects of sample thickness were corrected for by a simple formula. The effects of sample length were accounted for by using rectangular samples and by ensuring that the sample intersected the X-ray beam through the range of diffraction angles of interest. Preferred orientation of the mineral grains was either measured directly or estimated. Estimated values were quicker and easier to obtain and were within 5% of measured values. A comparison of the compositional parameters of chlorite estimated before correcting for these sample effects with those estimated after the corrections had been applied indicated that the uncorrected values differed from the corrected values by as much as 55% of the latter. Mounts of a single sample prepared by different filter-membrane peel and porous-plate techniques yielded widely different compositions until the measurements were corrected for sample effects. Analyses in triplicate indicated that the XRD intensity ratio 003/001 is preferred for calculating heavy atom distributions and abundances in chlorite because of the relative strength of the 001 peak.
Helium or neopentane can be used as surrogate gas fills for deuterium (D2) or deuterium–tritium (DT) in laser-plasma interaction studies. Surrogates are convenient to avoid flammability hazards or the integration of cryogenics in an experiment. To test the degree of equivalency between deuterium and helium, experiments were conducted in the Pecos target chamber at Sandia National Laboratories. Observables such as laser propagation and signatures of laser-plasma instabilities (LPI) were recorded for multiple laser and target configurations. It was found that some observables can differ significantly despite the apparent similarity of the gases with respect to molecular charge and weight. While the qualitative behaviour of the interaction may very well be studied by finding a suitable compromise of laser absorption, electron density, and LPI cross sections, a quantitative investigation of expected values for deuterium fills at high laser intensities is not likely to succeed with surrogate gases.
Homeless shelter residents and staff may be at higher risk of SARS-CoV-2 infection. However, SARS-CoV-2 infection estimates in this population have been reliant on cross-sectional or outbreak investigation data. We conducted routine surveillance and outbreak testing in 23 homeless shelters in King County, Washington, to estimate the occurrence of laboratory-confirmed SARS-CoV-2 infection and risk factors during 1 January 2020–31 May 2021. Symptom surveys and nasal swabs were collected for SARS-CoV-2 testing by RT-PCR for residents aged ≥3 months and staff. We collected 12,915 specimens from 2,930 unique participants. We identified 4.74 (95% CI 4.00–5.58) SARS-CoV-2 infections per 100 individuals (residents: 4.96, 95% CI 4.12–5.91; staff: 3.86, 95% CI 2.43–5.79). Most infections were asymptomatic at the time of detection (74%) and detected during routine surveillance (73%). Outbreak testing yielded higher test positivity than routine surveillance (2.7% versus 0.9%). Among those infected, residents were less likely to report symptoms than staff. Participants who were vaccinated against seasonal influenza and were current smokers had lower odds of having an infection detected. Active surveillance that includes SARS-CoV-2 testing of all persons is essential in ascertaining the true burden of SARS-CoV-2 infections among residents and staff of congregate settings.
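The headline estimate above is a simple proportion (infections per 100 unique participants) with a confidence interval. The abstract does not state which interval method was used, so the sketch below assumes a normal-approximation (Wald) binomial CI for illustration; it will not reproduce the reported bounds exactly, and the case count in the example is back-calculated, not taken from the paper.

```python
import math

def rate_per_100_with_ci(cases, n, z=1.96):
    """Point estimate and approximate 95% CI for infections per 100 individuals."""
    p = cases / n
    se = math.sqrt(p * (1 - p) / n)              # Wald standard error of a proportion
    lo, hi = max(0.0, p - z * se), p + z * se    # clamp lower bound at zero
    return 100 * p, 100 * lo, 100 * hi

# Illustrative: ~139 infections among 2,930 participants is roughly 4.7 per 100
est, lo, hi = rate_per_100_with_ci(139, 2930)
```

Exact (Clopper–Pearson) or Wilson intervals are common alternatives when counts are small; with n near 3,000 the three methods give similar widths.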
Migratory birds are implicated in dispersing haemosporidian parasites over great geographic distances. However, their role in sharing these vector-transmitted blood parasites with resident avian host species along their migration flyway is not well understood. We studied avian haemosporidian parasites in 10 localities where Chilean Elaenia, a long-distance Neotropical austral migrant species, spends part of its annual cycle to determine local parasite transmission among resident sympatric host species in the elaenia's distributional range across South America. We sampled 371 Chilean Elaenias and 1,818 birds representing 243 additional sympatric species from Brazilian wintering grounds to Argentinian breeding grounds. The 23 haemosporidian lineages found in Chilean Elaenias exhibited considerable variation in distribution, specialization, and turnover across the 10 avian communities in South America. Parasite lineage dissimilarity increased with geographic distance, and infection probability by Parahaemoproteus decreased in localities harbouring a more diverse haemosporidian fauna. Furthermore, blood smears from migrating Chilean Elaenias and local resident avian host species did not contain infective stages of Leucocytozoon, suggesting that transmission did not take place in the Brazilian stopover site. Our analyses confirm that this Neotropical austral migrant connects avian host communities and transports haemosporidian parasites along its distributional range in South America. However, the lack of transmissive stages at the stopover site and the infrequent parasite lineage sharing between migratory host populations and residents at breeding and wintering grounds suggest that Chilean Elaenias do not play a significant role in dispersing haemosporidian parasites, nor do they influence local transmission across South America.
Firm-level variables that predict cross-sectional stock returns, such as price-to-earnings and short interest, are often averaged and used to predict market returns. Using various samples of cross-sectional predictors and accounting for the number of predictors and their interdependence, we find only weak evidence that cross-sectional predictors make good time-series predictors, especially out-of-sample. The results suggest that cross-sectional predictors do not generally contain systematic information.
Disruptive behavior disorders (DBD) are heterogeneous at the clinical and the biological level. Our aims were therefore to dissect the heterogeneous neurodevelopmental deviations of the affective brain circuitry and to integrate these differences across modalities.
Methods
We combined two novel approaches. First, normative modeling to map individual-level deviations from the typical age-related pattern in (i) activity during emotion matching and (ii) anatomical images, derived from DBD cases (n = 77) and controls (n = 52) aged 8–18 years from the EU-funded Aggressotype and MATRICS consortia. Second, linked independent component analysis to integrate subject-specific deviations from both modalities.
Results
While cases on average exhibited higher activity during face processing than would be expected for their age in regions such as the amygdala when compared to controls, these positive deviations were widespread at the individual level. A multimodal integration of all functional and anatomical deviations explained 23% of the variance in the clinical DBD phenotype. Most notably, the top marker, encompassing the default mode network (DMN) and subcortical regions such as the amygdala and the striatum, was related to aggression across the whole sample.
Conclusions
Overall, increased age-related deviations in the amygdala in DBD suggest a maturational delay, which has to be further validated in future studies. Further, the integration of individual deviation patterns from multiple imaging modalities allowed us to dissect some of the heterogeneity of DBD and identified the DMN, the striatum, and the amygdala as neural signatures associated with aggression.
Identification of treatment-specific predictors of drug therapies for bipolar disorder (BD) is important because only about half of individuals respond to any specific medication. However, medication response in pediatric BD is variable and not well predicted by clinical characteristics.
Methods
A total of 121 youth with early-course BD (acute manic/mixed episode) were prospectively recruited and randomized to 6 weeks of double-blind treatment with quetiapine (n = 71) or lithium (n = 50). Participants completed structural magnetic resonance imaging (MRI) at baseline before treatment and 1 week after treatment initiation, and brain morphometric features were extracted for each individual based on the MRI scans. Positive antimanic treatment response at week 6 was defined as a greater than 50% reduction in Young Mania Rating Scale scores from baseline. A two-stage deep learning prediction model was established to distinguish responders from non-responders based on different feature sets.
Results
Pre-treatment morphometry and morphometric changes occurring during the first week can each independently predict treatment outcome for quetiapine and lithium with balanced accuracy over 75% (all p < 0.05). Combining brain morphometry at baseline and week 1 allows prediction with the highest balanced accuracy (quetiapine: 83.2%; lithium: 83.5%). Predictions in the quetiapine and lithium groups were found to be driven by different morphometric patterns.
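Balanced accuracy, the metric reported above, averages sensitivity and specificity so that unequal numbers of responders and non-responders cannot inflate the score. A minimal illustration (not the study's code; the labels in the example are invented):

```python
def balanced_accuracy(y_true, y_pred):
    """Mean of sensitivity (true-positive rate) and specificity (true-negative rate)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    pos = sum(1 for t in y_true if t == 1)   # actual responders
    neg = sum(1 for t in y_true if t == 0)   # actual non-responders
    return 0.5 * (tp / pos + tn / neg)

# Illustrative labels: responders (1) vs non-responders (0)
acc = balanced_accuracy([1, 1, 1, 0, 0], [1, 1, 0, 0, 1])
```

A classifier that predicts the majority class for everyone scores 0.5 on this metric regardless of class imbalance, which is why it is preferred over plain accuracy here.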
Conclusions
These findings demonstrate that pre-treatment morphometric measures and acute brain morphometric changes can serve as medication response predictors in pediatric BD. Brain morphometric features may provide promising biomarkers for developing biologically-informed treatment outcome prediction and patient stratification tools for BD treatment development.
To investigate a cluster of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infections in employees working on 1 floor of a hospital administration building.
Methods:
Contact tracing was performed to identify potential exposures and all employees were tested for SARS-CoV-2. Whole-genome sequencing was performed to determine the relatedness of SARS-CoV-2 samples from infected personnel and from control cases in the healthcare system with coronavirus disease 2019 (COVID-19) during the same period. Carbon dioxide levels were measured during a workday to assess adequacy of ventilation; readings >800 parts per million (ppm) were considered an indication of suboptimal ventilation. To assess the potential for airborne transmission, DNA-barcoded aerosols were released, and real-time polymerase chain reaction was used to quantify particles recovered from air samples in multiple locations.
Results:
Between December 22, 2020, and January 8, 2021, 17 coworkers tested positive for SARS-CoV-2, including 13 symptomatic and 4 asymptomatic individuals. Of the 5 cluster SARS-CoV-2 samples sequenced, 3 were genetically related, but these employees denied higher-risk contacts with one another. None of the sequences from the cluster were genetically related to the 17 control sequences of SARS-CoV-2. Carbon dioxide levels increased during a workday but never exceeded 800 ppm. DNA-barcoded aerosol particles were dispersed from the sites of release to locations throughout the floor; 20% of air samples had >1 log10 particles.
Conclusions:
In a hospital administration building outbreak, sequencing of SARS-CoV-2 confirmed transmission among coworkers. Transmission occurred despite the absence of higher-risk exposures and in a setting with adequate ventilation based on monitoring of carbon dioxide levels.
The fossil record is notoriously imperfect and biased in representation, hindering our ability to place fossil specimens into an evolutionary context. For groups with fossil records mostly consisting of disarticulated parts (e.g., vertebrates, echinoderms, plants), the limited morphological information preserved sparks concerns about whether fossils retain reliable evidence of phylogenetic relationships and lends uncertainty to analyses of diversification, paleobiogeography, and biostratigraphy in Earth's history. To address whether a fragmentary past can be trusted, we need to assess whether incompleteness affects the quality of phylogenetic information contained in fossil data. Herein, we characterize skeletal incompleteness bias in a large dataset (6585 specimens; 14,417 skeletal elements) of fossil squamates (lizards, snakes, amphisbaenians, and mosasaurs). We show that jaws + palatal bones, vertebrae, and ribs appear more frequently in the fossil record than other parts of the skeleton. This incomplete anatomical representation in the fossil record is biased against regions of the skeleton that contain the majority of morphological phylogenetic characters used to assess squamate evolutionary relationships. Despite this bias, parsimony- and model-based comparative analyses indicate that the most frequently occurring parts of the skeleton in the fossil record retain similar levels of phylogenetic signal as parts of the skeleton that are rarer. These results demonstrate that the biased squamate fossil record contains reliable phylogenetic information and support our ability to place incomplete fossils in the tree of life.
Cross-species evidence suggests that the ability to exert control over a stressor is a key dimension of stress exposure that may sensitize frontostriatal-amygdala circuitry to promote more adaptive responses to subsequent stressors. The present study examined neural correlates of stressor controllability in young adults. Participants (N = 56; Mage = 23.74, range = 18–30 years) completed either the controllable or uncontrollable stress condition of the first of two novel stressor controllability tasks during functional magnetic resonance imaging (fMRI) acquisition. Participants in the uncontrollable stress condition were yoked to age- and sex-matched participants in the controllable stress condition. All participants were subsequently exposed to uncontrollable stress in the second task, which is the focus of fMRI analyses reported here. A whole-brain searchlight classification analysis revealed that patterns of activity in the right dorsal anterior insula (dAI) during subsequent exposure to uncontrollable stress could be used to classify participants' initial exposure to either controllable or uncontrollable stress with a peak of 73% accuracy. Previous experience of exerting control over a stressor may change the computations performed within the right dAI during subsequent stress exposure, shedding further light on the neural underpinnings of stressor controllability.
To describe the cumulative seroprevalence of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antibodies during the coronavirus disease 2019 (COVID-19) pandemic among employees of a large pediatric healthcare system.
Design, setting, and participants:
Prospective observational cohort study open to adult employees at the Children’s Hospital of Philadelphia, conducted April 20–December 17, 2020.
Methods:
Employees were recruited starting with high-risk exposure groups, utilizing e-mails, flyers, and announcements at virtual town hall meetings. At baseline, 1 month, 2 months, and 6 months, participants reported occupational and community exposures and gave a blood sample for SARS-CoV-2 antibody measurement by enzyme-linked immunosorbent assays (ELISAs). A post hoc Cox proportional hazards regression model was performed to identify factors associated with increased risk for seropositivity.
Results:
In total, 1,740 employees were enrolled. At 6 months, the cumulative seroprevalence was 5.3%, which was below estimated community point seroprevalence. Seroprevalence was 5.8% among employees who provided direct care and was 3.4% among employees who did not perform direct patient care. Most participants who were seropositive at baseline remained positive at follow-up assessments. In a post hoc analysis, direct patient care (hazard ratio [HR], 1.95; 95% confidence interval [CI], 1.03–3.68), Black race (HR, 2.70; 95% CI, 1.24–5.87), and exposure to a confirmed case in a nonhealthcare setting (HR, 4.32; 95% CI, 2.71–6.88) were associated with statistically significant increased risk for seropositivity.
Conclusions:
Employee SARS-CoV-2 seroprevalence rates remained below the point-prevalence rates of the surrounding community. Provision of direct patient care, Black race, and exposure to a confirmed case in a nonhealthcare setting conferred increased risk. These data can inform occupational protection measures to maximize protection of employees within the workplace during future COVID-19 waves or other epidemics.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, there was a positive correlation between individual weed plant biomass and delayed weed seed–shattering rates during harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals of plants within the same population that shatter seed before harvest pose a risk of escaping early-season management and HWSC.
Substantial progress has been made in the standardization of nomenclature for paediatric and congenital cardiac care. In 1936, Maude Abbott published her Atlas of Congenital Cardiac Disease, which was the first formal attempt to classify congenital heart disease. The International Paediatric and Congenital Cardiac Code (IPCCC) is now utilized worldwide and has most recently become the paediatric and congenital cardiac component of the Eleventh Revision of the International Classification of Diseases (ICD-11). The most recent publication of the IPCCC was in 2017. This manuscript provides an updated 2021 version of the IPCCC.
The International Society for Nomenclature of Paediatric and Congenital Heart Disease (ISNPCHD), in collaboration with the World Health Organization (WHO), developed the paediatric and congenital cardiac nomenclature that is now within the eleventh version of the International Classification of Diseases (ICD-11). This unification of IPCCC and ICD-11 is the IPCCC ICD-11 Nomenclature and is the first time that the clinical nomenclature for paediatric and congenital cardiac care and the administrative nomenclature for paediatric and congenital cardiac care are harmonized. The resultant congenital cardiac component of ICD-11 was increased from 29 congenital cardiac codes in ICD-9 and 73 congenital cardiac codes in ICD-10 to 318 codes submitted by ISNPCHD through 2018 for incorporation into ICD-11. After these 318 terms were incorporated into ICD-11 in 2018, the WHO ICD-11 team added an additional 49 terms, some of which are acceptable legacy terms from ICD-10, while others provide greater granularity than the ISNPCHD thought was originally acceptable. Thus, the total number of paediatric and congenital cardiac terms in ICD-11 is 367. In this manuscript, we describe and review the terminology, hierarchy, and definitions of the IPCCC ICD-11 Nomenclature. This article, therefore, presents a global system of nomenclature for paediatric and congenital cardiac care that unifies clinical and administrative nomenclature.
The members of ISNPCHD realize that the nomenclature published in this manuscript will continue to evolve. The version of the IPCCC that was published in 2017 has evolved and changed, and it is now replaced by this 2021 version. In the future, ISNPCHD will again publish updated versions of IPCCC, as IPCCC continues to evolve.
Although important treatment decisions are made in the Emergency Department (ED), conversations about patients’ goals, values, and priorities often do not occur. There is a critical need to improve the frequency of these conversations, so that ED providers can align treatment plans with these goals, values, and priorities. The Serious Illness Conversation Guide has been used in other care settings and has been demonstrated to improve the frequency, quality, and timing of conversations, but it has not been used in the ED setting. Additionally, ED social workers, although integrated into hospital- and home-based palliative care, have not been engaged in programs to advance serious illness conversations in the ED. We set out to adapt the Serious Illness Conversation Guide for use in the ED by social workers.
Methods
We undertook a four-phase process for the adaptation of the Serious Illness Conversation Guide for use in the ED by social workers. This included simulated testing exercises, pilot testing, and deployment with patients in the ED.
Results
During each phase of the Guide's adaptation, changes were made to reflect both the environment of care (ED) and the clinicians (social workers) that would be using the Guide. A final guide is presented.
Significance of results
This report presents an adapted Serious Illness Conversation Guide for use in the ED by social workers. This Guide may provide a tool that can be used to increase the frequency and quality of serious illness conversations in the ED.
We assessed long-term incidence and prevalence trends of dementia and parkinsonism across major ethnic and immigrant groups in Ontario.
Methods:
Linking administrative databases, we established two cohorts (dementia 2001–2014 and parkinsonism 2001–2015) of all residents aged 20 to 100 years with incident diagnosis of dementia (N = 387,937) or parkinsonism (N = 59,617). We calculated age- and sex-standardized incidence and prevalence of dementia and parkinsonism by immigrant status and ethnic groups (Chinese, South Asian, and the General Population). We assessed incidence and prevalence trends using Poisson regression and Cochran–Armitage trend tests.
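Age- and sex-standardized rates of the kind described above are typically obtained by direct standardization: stratum-specific rates are averaged with weights taken from a reference population. The sketch below is a generic illustration with an assumed stratum layout, not the study's code.

```python
def directly_standardized_rate(stratum_rates, stratum_weights):
    """Direct standardization: weighted average of stratum-specific rates,
    with weights from a reference (standard) population."""
    total_weight = sum(stratum_weights)
    return sum(r * w for r, w in zip(stratum_rates, stratum_weights)) / total_weight

# Illustrative: rates per 100,000 in three hypothetical age-sex strata,
# weighted by the reference population's share in each stratum
rates = [50.0, 400.0, 1200.0]
weights = [0.5, 0.3, 0.2]
std_rate = directly_standardized_rate(rates, weights)
```

Because every group is weighted by the same reference population, the resulting rates can be compared across groups (and over time) without age or sex composition confounding the comparison.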
Results:
Across selected ethnic groups, dementia incidence and prevalence were higher in long-term residents than recent or longer-term immigrants from 2001 to 2014. During this period, age- and sex-standardized incidence of dementia in Chinese, South Asian, and the General Population increased, respectively, among longer-term immigrants (by 41%, 58%, and 42%) and long-term residents (28%, 7%, and 4%), and to a lesser degree among recent immigrants. The small number of cases precluded us from assessing parkinsonism incidence trends. For Chinese, South Asian, and the General Population, respectively, prevalence of dementia and parkinsonism modestly increased over time among recent immigrants but significantly increased among longer-term immigrants (dementia: 134%, 217%, and 117%; parkinsonism: 55%, 54%, and 43%) and long-term residents (dementia: 97%, 132%, and 71%; parkinsonism: 18%, 30%, and 29%). Adjustment for pre-existing conditions did not appear to explain incidence trends, except for stroke and coronary artery disease as potential drivers of dementia incidence.
Conclusion:
Recent immigrants across major ethnic groups in Ontario had considerably lower rates of dementia and parkinsonism than long-term residents, but this difference diminished with longer-term immigrants.