The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic, and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations, and that the species should continue to be categorised as Vulnerable under IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
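The Criterion A2 threshold invoked here reduces to simple arithmetic. Below is a minimal sketch, assuming a hypothetical constant annual rate of decline and a placeholder generation length; neither number comes from the assessment itself.

```python
# Hypothetical illustration of the IUCN Criterion A2 arithmetic: a population
# declining at a constant annual rate r shrinks by 1 - (1 - r)**T over T years.
GENERATION_LENGTH_YEARS = 9   # placeholder; assessments use a species-specific value
ANNUAL_DECLINE = 0.013        # assumed 1.3% per year, for illustration only

window = 3 * GENERATION_LENGTH_YEARS
three_gen_decline = 1 - (1 - ANNUAL_DECLINE) ** window
print(f"Decline over {window} years: {three_gen_decline:.1%}")
# Prints ~29.8%; a decline of >= 30% over three generations meets the
# Vulnerable threshold under Criterion A2.
```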
Background: Parkinson’s disease (PD) varies widely across individuals in terms of clinical manifestations and course of progression. We aimed to compare patterns of brain atrophy between PD clinical subtypes using longitudinally acquired brain MRIs. Methods: We used T1-weighted MRIs from the Parkinson’s Progression Markers Initiative (PPMI) for 134 individuals with PD and 60 healthy controls with at least two MRIs. Patients were classified into three clinical subtypes at the de novo stage using validated subtyping criteria based on major motor and non-motor classifiers (early cognitive impairment, REM sleep behavior disorder, dysautonomia): mild-motor predominant (n=74), intermediate (n=44), and diffuse-malignant (n=16). Deformation-based morphometry (DBM) maps were calculated, and mixed-effects models were used to examine the interaction between PD subtype and rate of atrophy across brain regions over time, controlling for sex and age at baseline. Results: Individuals with diffuse-malignant PD showed a significantly higher rate of atrophy across multiple brain regions, including the lateral nucleus of the forebrain, precuneus, paracentral lobule, inferior temporal gyrus, fusiform gyrus, and lateral hemisphere of the cerebellum (FDR-corrected p<0.05). Conclusions: We demonstrated an accelerated atrophy pattern within several brain regions in the diffuse-malignant PD subtype. These findings suggest the presence of a more diffuse multidomain neurodegenerative process in a subgroup of people with PD, favoring the existence of diverse underlying pathophysiologies.
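A minimal sketch of the subtype-by-time interaction test described above, using a linear mixed-effects model in statsmodels; the file and column names (dbm, years, subtype, sex, age_baseline, subject_id) are hypothetical stand-ins for the PPMI-derived variables, not the study's actual code.

```python
# Random intercept per subject; the years:subtype interaction captures subtype
# differences in the rate of atrophy, controlling for sex and baseline age.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("dbm_long_format.csv")  # hypothetical long format: one row per scan

model = smf.mixedlm(
    "dbm ~ years * C(subtype) + C(sex) + age_baseline",
    data=df,
    groups=df["subject_id"],
)
result = model.fit()
print(result.summary())
# In practice one such model would be fitted per brain region, with FDR
# correction applied across regions (the abstract reports FDR-corrected p < 0.05).
```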
It has been posited that alcohol use may confound the association between greater concussion history and poorer neurobehavioral functioning. However, while greater alcohol use is positively correlated with neurobehavioral difficulties, the association between alcohol use and concussion history is not well understood. Therefore, this study investigated the cross-sectional and longitudinal associations between cumulative concussion history, years of contact sport participation, and health-related/psychological factors with alcohol use in former professional football players across multiple decades.
Participants and Methods:
Former professional American football players completed general health questionnaires in 2001 and 2019, including demographic information, football history, concussion/medical history, and health-related/psychological functioning. Alcohol use frequency and amount were reported for three timepoints: during the professional career (collected retrospectively in 2001), in 2001, and in 2019. For the professional-career and 2001 timepoints, alcohol use frequency included none, 1-2, 3-4, or 5-7 days/week, while amount included none, 1-2, 3-5, 6-7, or 8+ drinks/occasion. For 2019, frequency included never, monthly or less, 2-4 times/month, 2-3 times/week, or >4 times/week, while amount included none, 1-2, 3-4, 5-6, 7-9, or 10+ drinks/occasion. Scores on a screening measure for Alcohol Use Disorder (CAGE) were also available at the professional-career and 2001 timepoints. Concussion history was recorded in 2001 and binned into five groups: 0, 1-2, 3-5, 6-9, and 10+. Depression and pain interference were assessed via PROMIS measures at all timepoints. Sleep disturbance was assessed via a separate instrument in 2001 and with the PROMIS Sleep Disturbance measure in 2019. Spearman’s rho correlations tested associations of concussion history and years of sport participation with alcohol use across timepoints, and whether poor health functioning (depression, pain interference, sleep disturbance) in 2001 and 2019 was associated with alcohol use both within and between timepoints.
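A minimal sketch of the correlation analysis described above, using SciPy's Spearman's rho; the dataset and column names are hypothetical placeholders for the study variables.

```python
# Rank-based correlation of ordinal concussion-history bins with ordinal
# alcohol-use frequency at each timepoint.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("football_alcohol.csv")  # hypothetical wide-format dataset

for col in ["alc_freq_career", "alc_freq_2001", "alc_freq_2019"]:
    rho, p = spearmanr(df["concussion_bin"], df[col], nan_policy="omit")
    print(f"concussion_bin vs {col}: rho = {rho:.3f}, p = {p:.3f}")
```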
Results:
Among the 351 participants (Mage = 47.86 years, SD = 10.18, in 2001), there were no significant associations of concussion history or years of contact sport participation with CAGE scores or alcohol use frequency/amount during the professional career, in 2001, or in 2019 (rhos = -.072 to .067, ps > .05). In 2001, greater depressive symptomatology and sleep disturbance were related to higher CAGE scores (rho = .209, p < .001; rho = .176, p < .001, respectively), while greater depressive symptomatology, pain interference, and sleep disturbance were related to higher alcohol use frequency (rho = .176, p = .002; rho = .109, p = .045; rho = .132, p = .013, respectively) and amount/occasion (rho = .215, p < .001; rho = .127, p = .020; rho = .153, p = .004, respectively). In 2019, depressive symptomatology, pain interference, and sleep disturbance were not related to alcohol use (rhos = -.047 to .087, ps > .05). Between timepoints, more sleep disturbance in 2001 was associated with higher alcohol amount/occasion in 2019 (rho = .115, p = .036).
Conclusions:
Increased alcohol intake has been theorized to be a consequence of greater concussion history and, as such, is thought to confound associations between concussion history and neurobehavioral function later in life. Our findings indicate that concussion history and years of contact sport participation were not significantly associated with alcohol use cross-sectionally or longitudinally, regardless of how alcohol use was characterized. While higher levels of depression, pain interference, and sleep disturbance in 2001 were related to greater alcohol use in 2001, they were not associated cross-sectionally in 2019. Results support the need to concurrently address health-related and psychological factors in the implementation of alcohol use interventions for former NFL players, particularly earlier in the sport discontinuation timeline.
Objective:
To assess the relative risk of hospital-onset Clostridioides difficile infection (HO-CDI) during each month of the early coronavirus disease 2019 (COVID-19) pandemic and to compare it with the historical expectation based on patient characteristics.
Design:
This study used a retrospective cohort design. We collected secondary data from the institution’s electronic health record (EHR).
Setting:
The Ohio State University Wexner Medical Center, Ohio, a large tertiary healthcare system in the Midwest.
Patients or participants:
All adult patients admitted to the inpatient setting between January 2018 and May 2021 were eligible for the study. Prisoners, children, individuals presenting with Clostridioides difficile on admission, and patients with <4 days of inpatient stay were excluded from the study.
Results:
After controlling for patient characteristics, the observed number of HO-CDI cases over the study period as a whole was not significantly different from what was expected. However, during 3 months of the pandemic period, the observed numbers of cases were significantly different from what would be expected based on patient characteristics. Of these 3 months, 2 months had more cases than expected and 1 month had fewer.
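The month-level comparison of observed and expected counts can be illustrated with an exact Poisson test; this is a hedged sketch with invented numbers, not the study's actual model or data.

```python
# Compare an observed monthly HO-CDI count against the expectation implied by a
# patient-level risk model (here, a hypothetical sum of predicted risks).
from scipy.stats import poisson

expected = 14.2   # hypothetical sum of patient-level predicted risks for the month
observed = 24     # hypothetical observed count

p_hi = poisson.sf(observed - 1, expected)   # P(X >= observed)
p_lo = poisson.cdf(observed, expected)      # P(X <= observed)
p_two_sided = min(1.0, 2 * min(p_hi, p_lo))
print(f"O/E = {observed / expected:.2f}, two-sided p = {p_two_sided:.4f}")
```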
Conclusions:
Variations in HO-CDI incidence seemed to trend with COVID-19 incidence but were not fully explained by our case mix. Other factors contributing to the variability in HO-CDI incidence, beyond the patient characteristics listed, need to be explored.
In Paper I, we presented an overview of the Southern-sky MWA Rapid Two-metre (SMART) survey, including the survey design and search pipeline. While the combination of the MWA’s large field of view and the voltage capture system brings a survey speed of ${\sim} 450\, {\textrm{deg}}^{2}\,\textrm{h}^{-1}$, the progression of the survey relies on the availability of the compact configuration of the Phase II array. Over the past few years, by taking advantage of multiple windows of opportunity when the compact configuration was available, we have advanced the survey to 75% of the planned sky coverage. About 10% of the data collected thus far have been processed for a first-pass search, in which 10 min of observation is processed for dispersion measures out to 250 ${\textrm{pc cm}}^{-3}$, to realise a shallow survey that is largely sensitive to long-period pulsars. The ongoing analysis has led to two new pulsar discoveries, as well as an independent discovery and a rediscovery of a previously incorrectly characterised pulsar, all from the ${\sim} 3\% $ of the data for which candidate scrutiny is complete. In this sequel to Paper I, we describe the strategies for further detailed follow-up, including improved sky localisation and convergence to a timing solution, and illustrate them using example pulsar discoveries. The processing has also led to the re-detection of 120 pulsars in the SMART observing band, bringing the total number of pulsars detected to date with the MWA to 180; these are used to assess the search sensitivity of the current processing pipelines. The planned second-pass (deep survey) processing is expected to yield a three-fold increase in sensitivity for long-period pulsars, and a substantial improvement for millisecond pulsars through the adoption of optimal de-dispersion plans. The SMART survey will complement the highly successful Parkes High Time Resolution Universe survey at 1.2–1.5 GHz, and inform future large survey efforts such as those planned with the low-frequency Square Kilometre Array (SKA-Low).
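For context on the DM = 250 pc cm^-3 search limit, the standard cold-plasma dispersion relation fixes how far a pulse drifts across the observing band; the formula below is standard, and the band edges are taken from the survey description.

```python
# Dispersive delay between two frequencies (in MHz) for a given dispersion
# measure: dt = 4.149e3 s * DM * (f_lo**-2 - f_hi**-2).
K_DM = 4.149e3  # dispersion constant in s MHz^2 pc^-1 cm^3

def dispersion_delay(dm, f_lo_mhz, f_hi_mhz):
    """Arrival-time delay (s) of the low-frequency band edge relative to the high."""
    return K_DM * dm * (f_lo_mhz**-2 - f_hi_mhz**-2)

# Across the 140-170 MHz SMART band at the first-pass search limit:
print(f"{dispersion_delay(250, 140.0, 170.0):.1f} s")  # ~17 s of drift across the band
```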
We present an overview of the Southern-sky MWA Rapid Two-metre (SMART) pulsar survey, which exploits the Murchison Widefield Array’s large field of view and voltage-capture system to survey the sky south of 30$^{\circ}$ in declination for pulsars and fast transients in the 140–170 MHz band. The survey is enabled by the advent of the Phase II MWA’s compact configuration, which offers enormous savings in beam-forming and processing costs and thereby makes an all-sky survey of this magnitude tractable with the MWA. Even with the long dwell times employed for the survey (4800 s), data collection can be completed in $<$100 h of telescope time, while still retaining the ability to reach a limiting sensitivity of $\sim$2–3 mJy (at 150 MHz, near zenith), which is effectively 3–5 times deeper than the previous-generation low-frequency southern-sky pulsar survey, completed in the 1990s. Each observation is processed to generate $\sim$5000–8000 tied-array beams that tessellate the full $\sim 610\, {\textrm{deg}^{2}}$ field of view (at 155 MHz), which are then searched for pulsars. The voltage-capture recording of the survey also allows a multitude of post hoc processing options, including the reprocessing of data at higher time resolution and even the exploration of image-based techniques for pulsar candidate identification. Owing to the substantial computational cost of pulsar searches at low frequencies, the survey data processing is undertaken in multiple passes: in the first pass, a shallow survey is performed, in which 10 min of each observation is processed, reaching about one-third of the full-search sensitivity. Here we present the system overview, including details of ongoing processing and initial results. Further details, including first pulsar discoveries and a census of low-frequency detections, are presented in a companion paper. Future plans include deeper searches to reach the full sensitivity and acceleration searches to target binary and millisecond pulsars. Our simulation analysis forecasts $\sim$300 new pulsars upon the completion of full processing. The SMART survey will also generate a complete digital record of the low-frequency sky, which will serve as a valuable reference for future pulsar searches planned with the low-frequency Square Kilometre Array.
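The quoted one-third figure follows from radiometer scaling, under which the minimum detectable flux density goes as 1/sqrt(t); a quick check, using the dwell and first-pass durations given above:

```python
# Sensitivity of a 10-min first-pass search relative to the full 4800-s dwell,
# assuming radiometer-equation scaling (S_min proportional to 1/sqrt(t_obs)).
from math import sqrt

dwell_s = 4800     # full dwell time per pointing (80 min)
shallow_s = 600    # first-pass processing length (10 min)

relative_sensitivity = sqrt(shallow_s / dwell_s)
print(f"{relative_sensitivity:.2f}")  # ~0.35, i.e. roughly one-third of full depth
```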
Terracing is found widely in the Mediterranean and in other hilly and mountainous regions of the world. Yet while archaeological attention to these ‘mundane’ landscape features has grown, they remain understudied, particularly in Northern Europe. Here, the authors present a multidisciplinary study of terraces in the Breamish Valley, Northumberland. The results date their construction to the Early to Middle Bronze Age, when they were built by cutting back the hillside, clearing stone, and constructing walls. Environmental evidence points to their use for cereal cultivation. The authors suggest that the construction and use of these terraces formed part of an Early to Middle Bronze Age agricultural intensification, which may have been both demographically and culturally driven.
Auxin is a key regulator of root morphogenesis across angiosperms. To better understand auxin-regulated networks underlying maize root development, we have characterized auxin-responsive transcription across two time points (30 and 120 min) and four regions of the primary root: the meristematic zone, elongation zone, cortex, and stele. Hundreds of auxin-regulated genes involved in diverse biological processes were quantified in these different root regions. In general, most auxin-regulated genes are unique to a single region and are predominantly observed in differentiated tissues rather than the root meristem. Auxin gene regulatory networks were reconstructed from these data to identify key transcription factors that may underlie auxin responses in maize roots. Additionally, Auxin-Response Factor subnetworks were generated to identify target genes that exhibit tissue or temporal specificity in response to auxin. These networks describe novel molecular connections underlying maize root development and provide a foundation for functional genomic studies in a key crop.
Background: Eye movements reveal neurodegenerative disease processes because of the overlap between oculomotor circuitry and disease-affected areas. Characterizing oculomotor behaviour in the context of cognitive function may enhance disease diagnosis and monitoring. We therefore aimed to quantify cognitive impairment in neurodegenerative disease using saccade behaviour and neuropsychology. Methods: The Ontario Neurodegenerative Disease Research Initiative recruited individuals with one of several neurodegenerative diseases: Alzheimer’s disease, mild cognitive impairment, amyotrophic lateral sclerosis, frontotemporal dementia, Parkinson’s disease, or cerebrovascular disease. Patients (n=450, age 40-87) and healthy controls (n=149, age 42-87) completed a randomly interleaved pro- and anti-saccade task (IPAST) while their eyes were tracked. We explored the relationships of saccade parameters (e.g. task errors, reaction times) to one another and to cognitive domain-specific neuropsychological test scores (e.g. executive function, memory). Results: Task performance worsened with cognitive impairment across multiple diseases. Subsets of saccade parameters were interrelated and also differentially related to neuropsychology-based cognitive domain scores (e.g. antisaccade errors and reaction times were associated with executive function). Conclusions: IPAST detects global cognitive impairment across neurodegenerative diseases. Subsets of parameters associate with one another, suggesting disparate underlying circuitry, and with different cognitive domains. This may have implications for the use of IPAST as a cognitive screening tool in neurodegenerative disease.
Grasshoppers are one of the most predominant insects in the grasslands of the southern Pampas. In this region, Dichroplus elongatus, Dichroplus maculipennis, Dichroplus pratensis and Borellia bruneri are the most abundant species and have the greatest economic importance. This study aimed to assess the relationship between temporal changes in the density of these species and climate variables associated with temperature and rainfall over an 11-year study period. We monitored 22 sites in different areas of Laprida county from 2005 to 2016. A total of 25 grasshopper species were collected. The most abundant species were D. maculipennis and B. bruneri, which reached the highest densities from 2008–2009 to 2010–2011. The rainfall accumulated from September to the sampling date (RAS) and the number of rainy days (RD) largely explained the density variation of B. bruneri. Besides RD and RAS, winter rainfall, rainfall accumulated from October to the sampling date, and the thermal amplitude of October (TAO) influenced the density of D. maculipennis. Our results indicated that seasons with less rainfall and fewer RD favored the abundance of these two species. We identified that RD and TAO contributed significantly to variations in the density of D. elongatus. In contrast to the other two species, we recorded D. elongatus in seasons with high rainfall and high RD. A better understanding of the climate influence on the life cycle of these economically important insects may identify key factors in their population dynamics, which in turn may improve management options.
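A minimal sketch of relating seasonal density to the rainfall variables described above, as an ordinary least-squares regression in statsmodels; the dataset and variable names are hypothetical placeholders, and the published analysis may have used a different model.

```python
# Density of B. bruneri explained by accumulated rainfall since September (ras)
# and number of rainy days (rd), per site and season.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("laprida_density.csv")  # hypothetical site-by-season dataset

fit = smf.ols("density ~ ras + rd", data=df).fit()
print(fit.summary())
```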
Surgical site infections (SSIs) greatly burden healthcare systems around the world, particularly in low- and middle-income countries. We sought to employ the Systems Engineering Initiative for Patient Safety (SEIPS) model to better characterize SSI prevention practices and factors affecting adherence to prevention guidelines at Jimma University Medical Center (JUMC).
Design:
Our cross-sectional study consisted of semistructured interviews, designed to elicit perceptions of, and barriers and facilitators to, SSI prevention among surgical staff, together with observations of current preoperative, perioperative, and postoperative SSI prevention practices in surgical cases. Interviews were recorded, manually transcribed, and thematically coded within the SEIPS framework. Trained observers recorded compliance with the World Health Organization’s SSI prevention recommendations.
Setting:
A tertiary-care hospital in Jimma, Ethiopia.
Participants:
Surgical nurses, surgeons, and anesthetists at JUMC.
Results:
Within 16 individual and group interviews, participants cited multiple barriers to SSI prevention, including shortages of water and antiseptic materials, a lack of clear SSI guidelines and training, minimal infection prevention and control (IPC) interaction with surgical staff, and poor SSI tracking. Observations from 19 surgical cases revealed high compliance with antibiotic prophylaxis (94.7%), hand scrubbing (100%), sterile glove and instrument use (100%), incision-site sterilization (100%), and use of the surgical safety checklist (94.7%), but lower compliance with preoperative bathing (26.3%), MRSA screening (0%), and pre- and postoperative glucose (0%, 10.5%) and temperature (57.9%, 47.3%) monitoring.
Conclusions:
Utilizing the SEIPS model helped identify institution-specific barriers and facilitators that can inform targeted interventions to increase compliance with currently underperformed SSI prevention practices at JUMC.
Research indicates that Internet use positively influences cognitive functioning in later life, but the behavioral pathways that explain this association are not known. This study explored participation in activities as a potential mediator of the relationship between Internet use and cognitive functioning over a 4-year period. We analyzed representative data from the Survey of Health, Ageing and Retirement in Europe (SHARE). The sample included 8353 European participants between 50 and 97 years of age. We used data from 2013 (T1), 2015 (T2), and 2017 (T3). Participants reported whether they took part in a diverse range of social and leisure activities. In addition, they provided information about their Internet use as well as cognitive functioning measures. Findings from a cross-lagged panel analysis indicated a positive association between Internet use and change in cognition over the course of 4 years. This relationship was partly mediated by the number of reported activities. Internet use was positively associated with the change in activities after 2 years, which, in turn, positively predicted cognitive functioning 2 more years later. This is the first study to explore the temporal sequence of Internet use, participation in activities, and cognitive functioning. It sheds light on the mechanisms that account for the positive effects of Internet use on healthy aging.
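A minimal sketch of the mediation logic described above (Internet use at T1 -> activities at T2 -> cognition at T3), estimated as simple path regressions in statsmodels rather than the full cross-lagged panel model used in the study; all column names are hypothetical placeholders for SHARE variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("share_panel.csv")  # hypothetical wide-format panel, one row per person

# Path a: Internet use at T1 predicts activities at T2, controlling the T1 level
# (i.e., predicting change in activities).
a = smf.ols("activities_t2 ~ internet_t1 + activities_t1", data=df).fit()

# Path b: activities at T2 predict cognition at T3, controlling prior cognition
# and Internet use, isolating the indirect route from Internet use to cognition.
b = smf.ols("cognition_t3 ~ activities_t2 + cognition_t2 + internet_t1", data=df).fit()

print(a.params["internet_t1"], b.params["activities_t2"])
# The indirect (mediated) effect is the product of the two path coefficients.
```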
Opioid use disorder is a major public health crisis, and evidence suggests ways of better serving patients who live with opioid use disorder in the emergency department (ED). A multi-disciplinary team developed a quality improvement project to implement this evidence.
Methods
The intervention was developed by an expert working group consisting of specialists and stakeholders. The group set goals of increasing the prescribing of buprenorphine/naloxone and providing next-day walk-in referrals to opioid use disorder treatment clinics. From May to September 2018, three Alberta ED sites and three opioid use disorder treatment clinics worked together to trial the intervention. We used administrative data to track the number of ED visits at which patients were given buprenorphine/naloxone. Monthly ED prescribing rates before and after the intervention were examined and compared with those of eight non-intervention sites. To measure continuity of treatment, we determined whether patients continued to fill opioid agonist treatment prescriptions at 30, 60, and 90 days after their index ED visit.
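A minimal sketch of the continuity-of-treatment metric described above, computing the share of discharged patients still filling opioid agonist treatment prescriptions at each follow-up point; the file and column names are hypothetical placeholders.

```python
# One row per patient index ED visit; boolean columns flag whether an opioid
# agonist treatment prescription was filled by each follow-up day.
import pandas as pd

df = pd.read_csv("oat_fills.csv")  # hypothetical patient-level dataset

for day in (30, 60, 90):
    filled = df[f"filled_day_{day}"]
    print(f"day {day}: {filled.sum()}/{len(df)} = {filled.mean():.1%}")
```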
Results
The intervention sites increased their prescribing of buprenorphine/naloxone during the intervention period and prescribed more of it than the control sites did. Thirty-five of 47 patients (74.4%) discharged from the ED with buprenorphine/naloxone continued to fill opioid agonist treatment prescriptions at 30 and 60 days after their index ED visit. Thirty-four patients (72.3%) filled prescriptions at 90 days.
Conclusions
Emergency clinicians can effectively initiate patients on buprenorphine/naloxone when supports for this standardized evidence-based care are in place within their practice setting and timely follow-up in the community is available.
The radiocarbon (14C) calibration curve so far contains annually resolved data for only a short period of time. With accelerator mass spectrometry (AMS) now matching the precision of decay counting, it is possible to efficiently produce large datasets of annual resolution for calibration purposes using small amounts of wood. The radiocarbon intercomparison on single-year tree-ring samples presented here is the first designed specifically to investigate possible offsets between AMS laboratories at high precision. The results show that AMS laboratories are capable of measuring samples of Holocene age with an accuracy and precision that is comparable to, or even beyond, what is possible with decay counting, even though they require a thousand times less wood. The results also show that not all AMS laboratories always produce results consistent with their stated uncertainties. The long-term benefit of studies of this kind is more accurate radiocarbon measurements with, in the future, better-quantified uncertainties.
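The consistency question above (do the laboratories' results scatter as their quoted errors predict?) is commonly checked with an error-weighted mean and a reduced chi-square; this is a hedged sketch with invented measurements, not data from the intercomparison.

```python
# Hypothetical 14C ages (BP) of the same single-year tree-ring sample measured
# by five labs, with their quoted 1-sigma uncertainties.
import numpy as np
from scipy.stats import chi2

ages = np.array([4523.0, 4519.0, 4531.0, 4515.0, 4526.0])
sigmas = np.array([12.0, 10.0, 15.0, 11.0, 13.0])

# Error-weighted mean and reduced chi-square; a value well above 1 suggests the
# quoted uncertainties understate the true inter-laboratory scatter.
w = 1.0 / sigmas**2
mean = np.sum(w * ages) / np.sum(w)
chisq = np.sum(((ages - mean) / sigmas) ** 2)
dof = len(ages) - 1
print(f"weighted mean = {mean:.1f} BP, reduced chi-square = {chisq / dof:.2f}, "
      f"p = {chi2.sf(chisq, dof):.3f}")
```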
OBJECTIVES/GOALS: The glucocorticoid receptor (GR) is a ubiquitous steroid hormone receptor that is emerging as a mediator of breast cancer metastasis. We aim to better understand the biology associated with phospho-GR species in triple-negative breast cancer (TNBC) and their contribution to tumor progression. METHODS/STUDY POPULATION: To better understand how pS134-GR may impact TNBC cell biology, we probed GR regulation by soluble factors that are rich within the tumor microenvironment (TME), such as TGFβ. TNBC cells harboring endogenous wild-type or S134A-GR species were created by CRISPR/Cas knock-in and subjected to in vitro assays of advanced cancer behavior. RNA-seq was employed to identify pS134-GR target genes that are uniquely regulated by TGFβ in the absence of exogenously added GR ligands. Direct regulation of selected TGFβ-induced pS134-GR target genes was then validated. Bioinformatics tools were used to probe publicly available TNBC patient datasets for expression of a pS134-GR 24-gene signature. RESULTS/ANTICIPATED RESULTS: In the absence of GR ligands, GR is transcriptionally activated via p38-MAPK-dependent phosphorylation of Ser134 upon exposure of TNBC cells to TME-derived agents (TGFβ, HGF). The ligand-independent pS134-GR transcriptome primarily encompasses gene sets associated with TNBC cell survival and migration/invasion. Accordingly, pS134-GR was essential for TGFβ-induced TNBC cell migration, anchorage-independent growth in soft agar, and tumorsphere formation, an in vitro readout of breast cancer stemness properties. Finally, a 24-gene pS134-GR-dependent signature induced by TGFβ1 predicts shortened survival in breast cancer. We expect to find similar results using an in-house tissue microarray. DISCUSSION/SIGNIFICANCE OF IMPACT: pS134-GR is a critical downstream mediator of p38 MAPK signaling and of TNBC migration, survival, and stemness properties. Our studies define GR as a required effector of TGFβ1 signaling and nominate pS134-GR as a biomarker of elevated risk of breast cancer dissemination.
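A minimal sketch of testing a gene-signature score against patient survival, in the spirit of the 24-gene analysis above, using the lifelines library; the scoring scheme (mean expression of signature genes), the files, and the gene names are assumptions, not the study's actual pipeline.

```python
import pandas as pd
from lifelines import CoxPHFitter

expr = pd.read_csv("tnbc_expression.csv", index_col=0)   # hypothetical: genes x patients
clin = pd.read_csv("tnbc_clinical.csv", index_col=0)     # hypothetical: time/event columns

signature_genes = ["GENE1", "GENE2", "GENE3"]  # placeholders for the 24-gene set
clin["sig_score"] = expr.loc[signature_genes].mean(axis=0)

# Cox proportional-hazards regression of survival on the signature score.
cph = CoxPHFitter()
cph.fit(clin[["sig_score", "time", "event"]], duration_col="time", event_col="event")
cph.print_summary()  # hazard ratio > 1 would indicate shortened survival with high score
```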
Introduction: Compared to other areas in Alberta Health Services (AHS), internal data show that emergency departments (EDs) and urgent care centres (UCCs) experience a high rate of workforce violence. As such, reducing violence in AHS EDs and UCCs is a key priority. This project explored staff's lived experience with patient violence, with the goal of better understanding its impact and what strategies and resources could be put in place. Methods: To obtain a representative sample, we recruited staff from EDs and a UCC (n = 6 sites) situated in urban and rural settings across Alberta. As the interviews had the potential to be upsetting, we conducted in-person interviews in a private space. Interviews were conducted with over 60 staff members, including RNs, LPNs, unit clerks, physicians, and protective services. Data collection and analysis occurred simultaneously and iteratively until saturation was reached. The analysis involved data reduction, category development, and synthesis. Key phrases and statements were first highlighted. Preliminary labels were then assigned to the data, and the data were then organized into meaningful clusters. Finally, we identified common themes of participants’ lived experience. Triangulation of sources, independent and team analysis, and frequent debriefing sessions were used to enhance the trustworthiness of the data. Results: Participants frequently noted the worry they carry with them when coming into work, but also said that a high threshold of acceptance dominated ED culture. A recurring feature of this experience was the limited resources (e.g., no peace officers, limited scope of security staff) available to staff to respond when patients behave violently or are threatening. Education such as non-violent crisis intervention training, although helpful, was insufficient to make staff feel safe. Participants voiced the need for more protective services, the addition of physical barriers like locking doors and glass partitions, more investment in addictions and mental health services (e.g., increased access to psychiatrists or addictions counsellors), and a greater shared understanding of AHS’ zero tolerance policy. Conclusion: ED and UCC staff describe being regularly exposed to violence from patients and visitors. Many of these incidents go unreported and unresolved, leaving the workforce feeling worried and unsupported. Beyond education, the ED and UCC workforce need additional resources to support them in feeling safe coming to work.
Introduction: Emergency Departments (EDs) are at high risk of workforce-directed violence (WDV). To address ED violence in Alberta Health Services (AHS), we conducted key informant interviews to identify successful strategies that could be adopted in AHS EDs. Methods: The project team identified potential participants through their ED network; additional contacts were identified through snowball sampling. We emailed 197 individuals from Alberta (123), elsewhere in Canada (46), and abroad (28). The interview guide was developed and reviewed in partnership with ED managers and Workplace Health and Safety. We conducted semi-structured phone interviews with 26 representatives from urban and rural EDs or similar settings in Canada, the United States, and Australia. The interview process received an ARECCI score of 2. Two researchers conducted a content analysis of the interview notes; rural and urban sites were analyzed separately. We extracted strategies, their impact, and implementation barriers and facilitators. The strategies identified were categorized into emergent themes. We aggregated similar strategies and highlighted key or unique findings. Results: Interview results showed that there is no single solution to ED violence. Sites with effective violence prevention used a comprehensive approach in which multiple strategies were deployed to address the issue. For example, through a violence prevention working group, one site implemented weekly violence simulations, a peer mentorship support team, security rounding, and more. This multifaceted approach had positive results: a decrease in code whites, staff feeling more supported, and the site no longer appearing on union “concerned” lists. Another promising strategy was addressing the culture of violence by increasing reporting, clarifying policies (i.e., zero tolerance), and establishing flagging or alert systems for visitors with violent histories. Physician involvement and support were highly valued in responding to violence (e.g., support when refusing care, participation on the code white response team, and flagging). Conclusion: Overall, one strategy is not enough to successfully address WDV in EDs. Strategies need to be comprehensive and context-specific, especially when considering urban and rural sites with different resources available. We note that few strategies were formally evaluated, and we recommend that future work focus on developing comprehensive metrics to evaluate strategies and define success.
Introduction: Emergency department (ED) flow is a strong predictor of patient safety, quality of care, and provider satisfaction. Throughput interventions have been shown to improve flow metrics, yet few studies have considered MD leadership roles and evaluated provider experience. Our objective was to evaluate the emergency physician lead (EPL) role, a novel MD staffing initiative. Methods: This mixed-method observational time series analysis evaluated ED metrics at two tertiary EDs, including ED length of stay (LOS), EMS Park LOS, and physician initial assessment (PIA) time, as well as 72-hour readmission and left-without-being-seen (LWBS) rates. Data were collected from the ED information system database for control (Dec 6, 2017–Feb 28, 2018 at SITE1; Mar 1–May 31, 2018 at SITE2), pre-intervention (Sept 3–Nov 30, 2018 at SITE1; Dec 3, 2018–Feb 28, 2019 at SITE2), and post-intervention (Dec 3, 2018–Feb 28, 2019 at SITE1; Mar 1–May 31, 2019 at SITE2) periods for adult patients presenting to each site. Site data were analyzed independently using descriptive and inferential statistics to calculate differences in means, and means were compared using t-tests. A survey elicited feedback from ED physicians, nurses, and EMS professionals on the effect of the EPL on throughput, timeliness of admissions and discharges, provider workload, and the EPL as a resource to other professionals. Results: The numbers of ED visits at SITE1 were 13,136 (control), 13,236 (pre), and 13,137 (post); at SITE2 they were 14,371 (control), 13,866 (pre), and 14,962 (post). Mean ED LOS decreased by 17 min post vs control and 20 min vs pre at SITE1 (p < 0.01). SITE2 saw an increase in ED LOS of 7 min vs control and 8 min vs pre (p < 0.01). EMS LOS at SITE1 decreased by 21 min vs control and 22 min vs pre (p < 0.01), but increased at SITE2 by 2 min vs control (p = 0.09) and 14 min vs pre (p < 0.01). PIA time at SITE1 decreased by 15 min vs control (p < 0.01) and 13 min vs pre, and increased by 5 min vs control and 12 min vs pre at SITE2 (p < 0.01). The 72-hour readmission and LWBS rates were unchanged at both sites. Qualitative feedback from ED providers highlighted the early provision of treatments and investigations by the EPL, and many felt the EPL was an important resource. Conclusion: The inclusion of both quantitative and qualitative data in this study provided a robust analysis of the impact of the EPL role and demonstrated modest but important improvements. A site-dependent, carefully considered implementation of the EPL role may improve ED metrics and provider experiences.
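A minimal sketch of the before/after comparison of mean LOS described above, as a Welch's t-test in SciPy; the visit-level dataset and column names are hypothetical placeholders for the ED information system extract.

```python
# Compare mean ED LOS between the pre- and post-intervention periods at one site.
import pandas as pd
from scipy.stats import ttest_ind

df = pd.read_csv("ed_visits.csv")  # hypothetical visit-level data, LOS in minutes

pre = df.loc[df["period"] == "pre", "ed_los_min"]
post = df.loc[df["period"] == "post", "ed_los_min"]

t, p = ttest_ind(post, pre, equal_var=False)  # Welch's t-test (unequal variances)
print(f"mean diff = {post.mean() - pre.mean():.1f} min, t = {t:.2f}, p = {p:.4f}")
```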