The use of community treatment orders (CTOs) has increased in many jurisdictions despite very limited evidence for their efficacy. In this context, it is important to investigate any differences in outcome by subgroup.
Aims
To investigate the variables associated with CTO placement and the impact of CTOs on admissions and bed-days over the following 12 months, including differences by diagnosis.
Method
Cases and controls from a complete jurisdiction, the state of Queensland, Australia, were analysed. Using administrative health data, each case subject to a CTO was matched by age, sex and time of hospital discharge (index date) with two controls. Multivariate analyses were used to examine factors associated with CTOs, as well as the impact on admissions and bed-days over the 12 months after CTO placement. Registration: Australian and New Zealand Clinical Trials Registry (ACTRN12624000152527).
Results
We identified 10 872 cases and 21 710 controls from January 2018 to December 2022 (total n = 32 582). CTO use was more likely in First Nations people (adjusted odds ratio = 1.14; 95% CI: 1.06–1.23), people from culturally diverse backgrounds (adjusted odds ratio = 1.45; 95% CI: 1.33–1.59) and those with a preferred language other than English (adjusted odds ratio = 1.21; 95% CI: 1.02–1.44). When all diagnostic groups were considered, there were no differences in subsequent admissions or bed-days between cases and controls. However, both re-admissions and bed-days were significantly reduced for CTO cases compared with controls in analyses restricted to non-affective psychoses (e.g. adjusted odds ratio = 0.77, 95% CI: 0.71–0.84 for re-admission).
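The adjusted odds ratios above come from multivariate models, but the basic arithmetic of an unadjusted odds ratio and its Wald confidence interval can be sketched in a few lines; the counts below are hypothetical and not taken from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only
or_, lo, hi = odds_ratio_ci(120, 880, 180, 1820)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```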
Conclusions
Queenslanders from culturally or linguistically diverse backgrounds and First Nations peoples are more likely to be placed on CTOs. Targeting CTO use to people with non-affective psychosis would both address rising CTO rates and mean that people placed on these orders derive possible benefit. This has implications for both clinical practice and policy.
Some trials have evaluated peer support for people with mental ill health in high-income, mainly English-speaking countries, but the quality of the evidence is weak.
Aims
To investigate the effectiveness of UPSIDES peer support in high-, middle- and low-income countries.
Method
This pragmatic multicentre parallel-group wait-list randomised controlled trial (registration: ISRCTN26008944) with three measurement points (baseline and 4 and 8 months) took place at six study sites: two in Germany, and one each in Uganda, Tanzania, Israel and India. Participants were adults with long-standing severe mental health conditions. Outcomes were improvements in social inclusion (primary) and empowerment, hope, recovery, health and social functioning (secondary). Participants allocated to the intervention group were offered UPSIDES peer support.
Results
Of the 615 participants (305 in the intervention group), 337 (54.8%) identified as women. The average age was 38.3 (s.d. = 11.2) years, and the mean illness duration was 14.9 (s.d. = 38.4) years. Those allocated to the intervention group received 6.9 (s.d. = 4.2) peer support sessions on average. Intention-to-treat analysis showed effects on two of the three subscales of the Social Inclusion Scale, as well as on the Empowerment Scale and HOPE Scale. Per-protocol analysis with participants who had received three or more intervention sessions also showed an effect on the Social Inclusion Scale total score (β = 0.18, P = 0.031, 95% CI: 0.02–0.34).
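The per-protocol coefficient illustrates the usual Wald relationship between an estimate, its standard error and its confidence interval; the sketch below back-calculates the implied standard error from the reported interval (an approximation, not the trial's own computation).

```python
def wald_ci(beta, se, z=1.96):
    """Two-sided Wald confidence interval for a regression coefficient."""
    return beta - z * se, beta + z * se

# Reported: beta = 0.18, 95% CI 0.02-0.34; the implied SE is the
# CI half-width divided by 1.96
se = (0.34 - 0.02) / (2 * 1.96)
lo, hi = wald_ci(0.18, se)
print(f"95% CI: {lo:.2f} to {hi:.2f}")  # 0.02 to 0.34
```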
Conclusions
Peer support has beneficial impacts on social inclusion, empowerment and hope among people with severe mental health conditions across diverse settings. As social isolation is a key driver of mental ill health, and empowerment and hope are both crucial for recovery, peer support can be recommended as an effective component of mental healthcare. Peer support has the potential to move global mental health closer towards a recovery- and rights-based orientation.
The use of compulsory community treatment (CCT) in Australia is some of the highest worldwide despite limited evidence of effectiveness. Even within Australia, use varies widely across jurisdictions despite general similarities in legislation and health services. However, there is much less information on whether variation occurs within the same jurisdiction.
Aims
To measure variations in the use of CCT in a standardised way across the following four Australian jurisdictions: Queensland, South Australia, New South Wales (NSW) and Victoria. We also investigated associated sociodemographic variables.
Methods
We used aggregated administrative data from the Australian Institute of Health and Welfare.
Results
There were data on 402 060 individuals who were in contact with specialist mental health services, of whom 51 351 (12.8%) were receiving CCT. Percentages varied from 8% in NSW to 17.6% in South Australia. There were also wide variations within jurisdictions: prevalence ranged from 2% to 13% in NSW, from 6% to 24% in Victoria, from 11% to 25% in Queensland and from 6% to 36% in South Australia. People in contact with services who were male, single and aged between 25 and 44 years were significantly more likely to be subject to CCT, as were people living in metropolitan areas or those born outside Oceania.
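The headline prevalence can be reproduced directly from the reported counts:

```python
# CCT prevalence among people in contact with specialist services,
# using the counts reported above
receiving_cct = 51_351
in_contact = 402_060
pct = 100 * receiving_cct / in_contact
print(f"{pct:.1f}%")  # 12.8%
```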
Conclusions
There are marked variations in the use of CCT both within and between Australian jurisdictions. It is unclear how much of this variation is determined by clinical need and these findings may be of relevance to jurisdictions with similar clinician-initiated orders.
The focus of the job satisfaction literature remains on the subordinate, even though supervisors are responsible for evaluating employee performance and determining pay, raises, promotions and growth opportunities, all of which shape employees’ subsequent performance and thus their contribution (or not) to organizational success. Using a psychological contracts lens, we develop and test theoretical arguments predicting that supervisors’ responses to contributions are not uniformly positive but depend on the type and amount of contribution involved. Across two studies, we ask supervisors to evaluate subordinates’ delivered contributions relative to promised contributions. Our results challenge the assumption that supervisors always desire larger amounts of work from their subordinates: excess contributions were associated with lower supervisor satisfaction with subordinates for some types of contributions. The results imply that subordinates’ contributions of work may influence supervisors’ satisfaction with them, perhaps affecting subordinates’ performance reviews and career opportunities.
The study sought to explore nutrition graduates’ employability and the role of employability capitals in supporting nutrition science graduate outcomes.
Design:
In-depth semi-structured, audio-recorded interviews were conducted with nutrition graduates who had completed a nutrition science degree between 2015 and 2021. Interpretivism guided this study, which endeavoured to co-construct meaning with participants. Transcribed interviews were thematically analysed, whereby data were coded, themes identified and discussed by all authors. The data were further mapped against the graduate capital model (GCM) by deductively coding against the five graduate capitals (human, identity, social, psychological and cultural).
Setting:
Ireland and Australia.
Participants:
Forty-two nutrition graduates from across nine universities in Ireland and twenty-two from a single university programme in Australia.
Results:
All elements of the GCM were identified, with human, social and identity capital the most dominant and identified as significantly influential on employability. The presence or absence of these capitals could be clearly identified within each graduate’s experience. Formation of professional identity and connection to the profession was strongest amongst Irish graduates. However, more than half of the Australian cohort perceived barriers to professional identity formation, including lack of regulation, imposter syndrome, the presence of non-qualified individuals and comparison to dietetics. Both psychological and cultural capitals were rarely spoken about.
Conclusion:
The development of human, social and identity capital was evident among nutrition science graduates. Further investigation is required to enhance the process of identity development and to identify potential remedies for the barriers described. The relative absence of psychological and cultural capital poses a significant issue for the resilience and self-understanding of prospective graduates.
Shark vertebrae and their centra (vertebral bodies) are high-performance structures able to survive millions of cycles of high-amplitude strain despite lacking a repair mechanism for accumulating damage. Shark centra consist of mineralized cartilage, a biocomposite of bioapatite (bAp) and collagen, and the nanocrystalline bAp's contribution to functionality remains largely uninvestigated. Using the multiple-detector energy-dispersive diffraction (EDD) system at 6-BM-B, the Advanced Photon Source, and 3D tomographic sampling, the 3D functionality of entire centra was probed. Immersion in ethanol vs phosphate-buffered saline produces only small changes in bAp d-spacing within a great hammerhead centrum. EDD mapping under in situ loading was performed on an entire blue shark centrum, and 3D maps of bAp strain showed that the two structural zones of the centrum, the corpus calcareum and intermedialia, contained opposite-signed strains approaching 0.5%; application of ~8% nominal strain did not alter these strain magnitudes or their spatial distribution.
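Strain values of this kind are, in diffraction work generally, derived from shifts in lattice d-spacing relative to an unstrained reference. A minimal sketch of that relationship, with hypothetical spacings rather than the paper's data:

```python
def lattice_strain(d, d0):
    """Elastic lattice strain from a diffraction d-spacing shift:
    strain = (d - d0) / d0, where d0 is the unstrained reference."""
    return (d - d0) / d0

# Hypothetical bioapatite d-spacings in angstroms, chosen to land
# near the ~0.5% strain scale discussed above
print(f"{lattice_strain(3.457, 3.440):+.2%}")  # +0.49%
```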
Essential minerals are cofactors for the synthesis of neurotransmitters supporting cognition and mood. An 8-week fully blind randomised controlled trial of multinutrients for attention-deficit/hyperactivity disorder (ADHD) demonstrated that three times as many children (age 6–12) had significantly improved behaviour (‘treatment responders’) on multinutrients (54 %) compared with placebo (18 %). The aim of this secondary study was to evaluate changes in fasted plasma and urinary mineral concentrations following the intervention and their role as mediators and moderators of treatment response. Fourteen essential or trace minerals were measured in plasma and/or urine at baseline and week eight from eighty-six participants (forty-nine multinutrients, thirty-seven placebo). Two-sample t tests/Mann–Whitney U tests compared 8-week change between treatment and placebo groups; these changes were also evaluated as potential mediators. Baseline levels were evaluated as potential moderators, using logistic regression models with clinical treatment response as the outcome. After 8 weeks, plasma boron, Cr (in females only), Li, Mo, Se and vanadium and urinary iodine, Li and Se increased more with multinutrients than placebo, while plasma phosphorus decreased. These changes did not mediate treatment response. However, baseline urinary Li trended towards moderation: participants with lower baseline urinary Li were more likely to respond to multinutrients (P = 0·058). Additionally, participants with higher baseline Fe were more likely to be treatment responders regardless of treatment group (P = 0·036). These results show that multinutrient treatment response among children with ADHD is independent of baseline plasma mineral levels, while baseline urinary Li shows potential as a non-invasive biomarker of treatment response requiring further study.
In this chapter, the authors focus on working collaboratively with the wider community to engage in ECEfS values. They do this through discussing a story from the field from Korea, where children actively engaged in a project to protect local wildlife, and a collaboration in Australia between an early childhood university academic with an interest in participatory and arts-based approaches that support listening to children, and an environmental educator with the local council. Each project demonstrates the value of shared goals, openness and trust between partners. The Australian project was based at the Mary Cairncross Scenic Reserve, a public environmental visitor centre situated in a popular rainforest reserve on the Sunshine Coast in Queensland, Australia, and comprised partnerships between environmental education centre staff and volunteers, student teachers, early childhood practitioners and children aged from 3 to 10 years. The ripple effects of these projects led to powerful ways of thinking and doing ECEfS that enriched child, family and community connections, and strengthened individual, collective and organisational commitments to sustainability.
There is growing evidence that disasters may increase the risk of developing chronic diseases, including diabetes, dyslipidemia, chronic kidney disease, and cardiovascular disease. However, how much disaster exposure specifically affects chronic disease risk is unknown. This presentation introduces the study protocol for the Risk of hEalth ConditiOn AdVerse Events after disasteRs (RECOVER) Cohort Study, which addresses this gap.
Method:
The primary aim of RECOVER is to determine the extent to which disaster exposure specifically increases the risk of developing chronic disease (Aim 1). The secondary aims of the study are to determine if the nature, duration and severity of disaster exposure are risk factors for disease (Aim 2), to map mediators of post-disaster chronic disease risk (Aim 3), and to identify potential biomarkers of post-disaster chronic disease risk (Aim 4). RECOVER will recruit over 6000 adults (1:1 disaster exposed vs unexposed) in Australia to a nationally representative cohort for longitudinal follow-up. Detailed data will be obtained annually on disaster exposure, demographic, social and health factors. The primary health outcome (Aim 1) of chronic disease will be defined as new, incident diabetes, cardiovascular or respiratory disease, and will be ascertained through data linkage with the Pharmaceutical Benefits Scheme. A biomarker sub-stream will include ~1,000 participants who provide a hair and saliva sample for cortisol and epigenetic analysis.
Results:
N/A
Conclusion:
There is an urgent need for detailed individual-level data to analyze the nature of the association between disaster exposure and chronic disease. In 2020 alone, 16.8 million Australians were exposed to disasters. The frequency and severity of disasters are only expected to grow due to climate change. As the first prospective cohort study to longitudinally track individual-level disaster exposure and chronic disease outcomes, RECOVER will fill a critical evidence gap.
OBJECTIVES/GOALS: Characterizing the zebrafish as an animal model for Cockayne Syndrome may guide us towards the role of Transcription-Coupled Nucleotide Excision Repair (TC-NER) defects in sensorineural hearing loss. METHODS/STUDY POPULATION: To examine our model, we have developed a zebrafish line with a 9+1 base-pair deletion in the ercc6 gene using TALENs. The mutation has been confirmed by PCR and subsequent restriction digest with StuI. A series of assays evaluating hair cell morphology, structure and function, as well as ribbon synapse structure, will be used to analyze potential differences between the ercc6 mutant zebrafish line and their wild-type siblings. Additionally, electron microscopy will be used to assess differences in hair cell ultrastructure between the two groups. Finally, UVC exposure assays will be used to determine the role TC-NER plays in our novel zebrafish model and to evaluate its potential implications in sensorineural hearing loss. RESULTS/ANTICIPATED RESULTS: We anticipate that biallelic loss-of-function mutations in the zebrafish ercc6 gene will result in abnormalities in hair cell structure, mechanotransduction, or cell number. Additionally, we anticipate that hair cell ultrastructure and ribbon synapse structure will be impacted by loss of ercc6 expression. DISCUSSION/SIGNIFICANCE: Hearing loss mechanisms associated with defects in TC-NER are yet to be described. We believe our model will provide the tools for a faster and more efficient way to carry out Cockayne Syndrome studies while laying the groundwork for the association between TC-NER and hearing loss.
Improving equity in the context of protected areas conservation cannot be achieved in situations where people have different capabilities to participate. Participatory video has the potential to uncover hidden perspectives and worldviews and to build trustworthy, transparent and accountable relationships between marginalized communities and external agencies. We present findings from video-mediated dialogues between Indigenous peoples and decision makers involved in the management of three protected areas in Guyana. Participatory films created by Indigenous researchers in their communities were screened and discussed with protected area managers. We recorded their responses and presented them back to the communities. We show how the video-mediated process provided a rich and contextualized understanding of equity issues. It enabled recognition and respect by protected area managers for Indigenous lived experiences and the contribution of their values and knowledge. For Indigenous peoples, the participatory video process built confidence and critical reflection on their own activities and responsibilities whilst allowing them to challenge decision makers on issues of transparency, communication and accountability. We show that equity is an evolving process and that different protected areas with their differing histories and relationships with Indigenous communities produce distinct outcomes over time. Thus, promoting equity in protected areas and conservation must be a long-term process, enabling participation and producing the conditions for regular, transparent and honest communications. Standardized indicators of protected areas equity could be useful for reporting on international targets, but video-mediated dialogue can facilitate deeper understanding, greater representation and a recognition of rights.
Although associations among borderline personality disorder (BPD), social rejection, and frontal EEG alpha asymmetry scores (FAA, a neural correlate of emotion regulation and approach-withdrawal motivations) have been explored in different studies, relatively little work has examined these relations during adolescence in the same study. We examined whether FAA moderated the relation between BPD features and rejection sensitivity following a validated social exclusion paradigm, Cyberball. A mixed, clinical-community sample of 64 adolescents (females = 62.5%; Mage = 14.45 years; SD = 1.6; range = 11-17 years) completed psychodiagnostic interviews and a self-report measure of BPD (Time 1). Approximately two weeks later (Time 2), participants completed a resting EEG recording followed by Cyberball. FAA moderated the relation between BPD features and overall feelings of rejection following Cyberball: individuals with greater relative left FAA had the highest and lowest feelings of social rejection depending on whether they had high and low BPD feature scores, respectively. Results remained after controlling for age, sex, gender, depression, and BPD diagnosis. These results suggest that FAA may moderate the relation between BPD features and social rejection, and that left frontal brain activity at rest may be differentially associated with those feelings in BPD. Findings are discussed in terms of the link between left frontal brain activity in the regulation and dysregulation of social approach behaviors, characteristic of BPD.
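FAA scores of the kind analysed above are conventionally computed as the difference in log-transformed alpha power between homologous right and left frontal electrodes (the Davidson convention); the values below are hypothetical.

```python
import math

def faa(right_alpha, left_alpha):
    """Frontal alpha asymmetry: ln(right) - ln(left) alpha power.
    Alpha power is inversely related to cortical activity, so higher
    scores index relatively greater LEFT frontal activity."""
    return math.log(right_alpha) - math.log(left_alpha)

# Hypothetical alpha power values (uV^2) at homologous frontal sites
print(round(faa(12.0, 10.0), 3))  # 0.182
```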
Measurement burst designs, in which assessments of a set of constructs are made at two or more times in quick succession (e.g., within days), can be used as a novel method to improve the stability of basic measures typically used in longitudinal peer research. In this Element, we hypothesized that the stabilities of adolescent-reported peer acceptance, anxiety, and self-concept would be stronger when using the measurement burst approach versus a single-time observation. Participants included youth between 10 and 13 years old who completed (a) sociometric assessments of acceptance, (b) measures of social and test anxiety, and (c) a measure of self-concept at three measurement points, with two assessments made at each burst. Findings broadly showed that the stabilities were significantly stronger with the measurement burst when compared to the single-time assessment, supporting our main hypothesis. We discuss the utility of the measurement burst in a broader context and considerations for researchers.
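The core idea, that aggregating the two assessments within a burst yields more stable scores, parallels the classical reliability gain described by the Spearman-Brown prophecy formula; the sketch below is illustrative and not taken from the Element itself.

```python
def spearman_brown(r, k):
    """Reliability of the mean of k parallel assessments, given a
    single-assessment reliability r (Spearman-Brown prophecy formula)."""
    return k * r / (1 + (k - 1) * r)

# Hypothetical: a measure with single-occasion reliability 0.60,
# averaged over the two assessments of one burst
print(round(spearman_brown(0.60, 2), 2))  # 0.75
```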
Stem cells give rise to the entirety of cells within an organ. Maintaining stem cell identity and coordinately regulating stem cell divisions is crucial for proper development. In plants, mobile proteins, such as WUSCHEL-RELATED HOMEOBOX 5 (WOX5) and SHORTROOT (SHR), regulate divisions in the root stem cell niche. However, how these proteins coordinately function to establish systemic behaviour is not well understood. We propose a non-cell autonomous role for WOX5 in the cortex endodermis initial (CEI) and identify a regulator, ANGUSTIFOLIA (AN3)/GRF-INTERACTING FACTOR 1, that coordinates CEI divisions. Here, we show with a multi-scale hybrid model integrating ordinary differential equations (ODEs) and agent-based modeling that quiescent center (QC) and CEI divisions have different dynamics. Specifically, by combining continuous models to describe regulatory networks and agent-based rules, we model systemic behaviour, which led us to predict cell-type-specific expression dynamics of SHR, SCARECROW, WOX5, AN3 and CYCLIND6;1, and experimentally validate CEI cell divisions. Taken together, our results show an interdependency between CEI and QC divisions.
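The continuous half of such a hybrid model boils down to integrating regulatory ODEs. The toy sketch below integrates a generic production-degradation equation, dW/dt = a − b·W, with forward Euler; the parameters and variable name are illustrative stand-ins, not the paper's actual model.

```python
def euler(deriv, y0, dt, steps):
    """Forward-Euler integration of dy/dt = deriv(y)."""
    y = y0
    for _ in range(steps):
        y += dt * deriv(y)
    return y

# Toy production-degradation dynamics, dW/dt = a - b*W, which
# relaxes to the steady state W* = a / b
a, b = 2.0, 0.5
w_final = euler(lambda w: a - b * w, 0.0, 0.01, 5000)
print(round(w_final, 3))  # ~4.0, the steady state a/b
```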
ABSTRACT IMPACT: The implementation of DPYD and UGT1A1 pharmacogenetic testing, a promising tool of precision medicine, translates evidence-based research into clinical oncology practice with personalized dosing to better predict interpatient variability in chemotherapy tolerability. OBJECTIVES/GOALS: Patients with DPYD and UGT1A1 genetic variants are at risk for severe toxicity from fluoropyrimidines and irinotecan, respectively. We propose that providing clinicians with the option to order a pharmacogenetic (PGx) test with relevant dose recommendations will increase test uptake to guide pharmacotherapy decisions and improve safety outcomes. METHODS/STUDY POPULATION: We plan to conduct a non-randomized, pragmatic, open-label study in 600 adult patients with gastrointestinal (GI) cancers initiating a fluoropyrimidine- and/or irinotecan-based regimen at three cancer centers within a health system. Implementation metrics of a new, in-house laboratory developed PGx test will be measured, including feasibility of returning results within one week, fidelity of providers following dose recommendations, and penetrance via test ordering rates. Clinical aims will include assessing severe toxicity during the first six months of chemotherapy. Outcomes will be compared to a historical control of GI cancer patients enrolled in a biobank and treated with standard dose chemotherapy. RESULTS/ANTICIPATED RESULTS: We anticipate that there will be an increase in PGx test uptake given its shorter turnaround time to facilitate clinical decision-making prior to the first dose of chemotherapy. Through integration of test results in the electronic health record (EHR) and clinical decision support tools for patients with actionable genotypes, we also expect that providers will have a high level of agreement to the recommended dose adjustments. 
We anticipate a decreased incidence of severe (Grade >3) toxicity among prospectively genotyped patients in the first six months of chemotherapy compared to DPYD and UGT1A1 variant carriers in the historical control group. Exploratory clinical utility data on costs of hospitalizations, chemotherapy treatment, PGx testing, and medical services will also be reported. DISCUSSION/SIGNIFICANCE OF FINDINGS: This study aims to address barriers identified by key stakeholders to implementing PGx testing to better tailor chemotherapy dosing to the genetic profiles of patients. This may prevent adverse event-related hospitalizations, improve quality of life for patients, and reduce health system resource utilization costs.
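Clinical decision support of the kind described, surfacing a dose recommendation for an actionable genotype, can be sketched as a simple phenotype-to-recommendation lookup. The mapping below is purely illustrative and is NOT clinical guidance; real recommendations come from sources such as the CPIC guidelines.

```python
# Hypothetical phenotype-to-recommendation lookup for illustration
# only; NOT actual clinical guidance
DPYD_RECOMMENDATIONS = {
    "normal metabolizer": "standard fluoropyrimidine dosing",
    "intermediate metabolizer": "consider a reduced starting dose",
    "poor metabolizer": "avoid fluoropyrimidines or seek specialist advice",
}

def recommend(phenotype):
    """Return a dosing note for a DPYD phenotype, if one is on file."""
    return DPYD_RECOMMENDATIONS.get(phenotype, "no recommendation on file")

print(recommend("intermediate metabolizer"))
```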
To investigate (i) changes in stakeholder commitment and (ii) perceptions of the purpose, challenges and benefits of healthy food and beverage provision in community sports settings during the stepwise implementation of a healthy beverage policy.
Design:
Convergent, parallel, mixed-methods design complemented (i) repeat semi-structured interviews with council stakeholders (n 17 interviews, n 6 interviewees), with (ii) repeat quantitative stakeholder surveys measuring Commitment to Organisational Change; (iii) weekly sales data examining health behaviour and revenue effects (15 months pre-intervention; 14 months post-intervention); (iv) customer exit surveys (n 458); and (v) periodic photographic audits of beverage availability. Interviews were analysed inductively. Stakeholder surveys, sales data, customer surveys and audits were analysed descriptively.
Setting:
Four local government-owned sports and recreation centres in Melbourne, Australia, completed a 3-month trial to increase the availability of healthy beverages and decrease the availability of unhealthy beverages in food outlets.
Participants:
Interviews were conducted with council managers and those involved in implementation (September 2016–October 2017). Customers were surveyed (September–October 2017).
Results:
Interviews and surveys indicated that stakeholders’ commitment to the policy varied; over time, optimism that changing beverage availability could increase the healthiness of customers’ purchases became more widespread among interviewees. Stakeholder focus generally progressed from anticipatory concern to solutions-focused discussion. Sales, audit and customer survey data supported the interview findings.
Conclusions:
We found a general increase in optimism regarding policy outcomes over time during the implementation of a healthy beverage policy. Stepwise trials should be further explored as an engagement tool within community retail settings.
There are significant concerns over the long- and short-term implications of continuous glyphosate use and potential problems associated with weed species shifts and the development of glyphosate-resistant weed species. Field research was conducted to determine the effect of herbicide treatment and application timing on weed control in glyphosate-resistant soybean. Ten herbicide treatments were evaluated that represented a range of PPI, PRE, and POST-only application timings. All herbicide treatments included a reduced rate of glyphosate applied POST. PRE herbicides with residual properties followed by (fb) glyphosate POST provide more effective control of broadleaf weed species than POST-only treatments. There was no difference in soybean yield between PRE fb POST and POST-only treatments in 2008. Conversely, PRE fb POST herbicide treatments resulted in greater yield than POST-only treatments in 2009. Using PRE fb POST herbicide tactics improves weed control and reduces the risk for crop yield loss when dealing with both early- and late-emerging annual broadleaf weed species across variable cropping environments.
Cover crops play an important role in agricultural sustainability. Unlike commodity cash crops, however, there has been relatively little cover crop breeding research and development. We conducted an online survey to evaluate: (a) the perspectives of organic and conventional farmers in the USA who use cover crops and (b) the specific cover crop traits that are important to farmers. We recruited participants from both organic and conventional agriculture networks and 69% of respondents reported that they farmed organic land. In addition to demographic data and information on management practices, we quantified farmer perspectives on four winter annual cover crops: (1) Austrian winter pea, (2) crimson clover, (3) hairy vetch and (4) cereal rye. Overall, respondents represented a wide range of states, farm sizes, plant hardiness zones and cash crops produced. Of the 417 full responses received, 87% of respondents reported that they used cover crops. The maximum amount farmers were willing to spend on cover crop seed varied by farmer type: 1% of conventional farmers versus 19% of organic farmers were willing to spend over US$185 ha⁻¹ (US$75 acre⁻¹). Organic and conventional farmers differed in terms of the reasons why they grew cover crops, with organic farmers placing greater value on the ecosystem services from cover crops. More organic (63%) than conventional (51%) farmers agreed that participatory breeding was important for cover crop variety development (P = 0.047). Both groups shared strong support for cover crop research and considered many of the same traits to be important for breeding. For the legume cover crops, nitrogen fixation was considered the most important trait, whereas winter hardiness, early vigor, biomass production and weed suppression were the most important traits for cereal rye.
Our results illustrate common interests as well as differences in the perspectives between organic and conventional farmers on cover crops and can be used to inform nascent cover crop breeding efforts.
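The organic-versus-conventional comparison (63% vs 51% agreement) is the kind of result usually checked with a two-proportion z-test. The counts below are approximate reconstructions from the reported percentages, so the statistic need not match the paper's exactly.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using a pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# Approximate counts: ~63% of ~288 organic vs ~51% of ~129
# conventional respondents (hypothetical reconstruction)
z, p = two_proportion_z(181, 288, 66, 129)
print(f"z = {z:.2f}, p = {p:.3f}")
```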
In 2008–2009, the Canadian Institute for Health Information reported over 30,000 cases of sepsis hospitalizations in Canada, an increase of almost 4,000 from 2005. Mortality rates from severe sepsis and septic shock continue to remain greater than 30% in Canada and are significantly higher than other critical conditions treated in the emergency department (ED). Our group formed a multidisciplinary sepsis committee, conducted an ED process of care analysis, and developed a quality improvement protocol. The objective of this study was to evaluate the effects of this sepsis management bundle on patient mortality.
Methods
This before and after study was conducted in two large Canadian tertiary care EDs and included adult patients with suspected severe infection that met at least two systemic inflammatory response syndrome (SIRS) criteria. We studied the implementation of a sepsis bundle including triage flagging, RN medical directive, education campaign, and a modified sepsis protocol. The primary outcomes were 30-day all-cause mortality and sepsis protocol use.
Results
We included a total of 167 and 185 patients in the pre- and post-intervention analysis, respectively. Compared to the pre-intervention group, mortality was significantly lower in the post-intervention group (30.7% versus 17.3%; absolute difference, 13.4%; 95% CI 9.8–17.0; p=0.006). There was also a higher rate of sepsis protocol use in the post-intervention group (20.3% versus 80.5%; absolute difference, 60.2%; 95% CI 55.1–65.3; p<0.001). Additionally, we found shorter time intervals from triage to MD assessment, fluid resuscitation, and antibiotic administration, as well as lower rates of vasopressor requirements and ICU admission.
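The mortality comparison rests on an absolute risk difference; one standard large-sample (Wald) formula for that difference and its confidence interval is sketched below. The trial's reported interval may have been computed by a different method, so the figures need not coincide.

```python
import math

def risk_difference_ci(p1, n1, p2, n2, z=1.96):
    """Absolute risk difference (p1 - p2) with a Wald 95% CI."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Proportions and group sizes as reported in the abstract
diff, lo, hi = risk_difference_ci(0.307, 167, 0.173, 185)
print(f"ARD = {diff * 100:.1f} percentage points "
      f"(Wald 95% CI {lo * 100:.1f} to {hi * 100:.1f})")
```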
Interpretation
The implementation of our multidisciplinary ED sepsis bundle, including improved early identification and protocolized medical care, was associated with improved time to achieve key therapeutic interventions and a reduction in 30-day mortality. Similar low-cost initiatives could be implemented in other EDs to potentially improve outcomes for this high-risk group of patients.