The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective symptom change assessments, and to identify and examine the effects of key risk factors.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety [generalised anxiety disorder scale, 7 items (GAD-7): −0.33 points] and an increase in PTSD symptoms (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may reflect underestimation of prior symptom severity due to recall bias.
We apply the author's computational approach to groups in our empirical work studying and modelling riots. We suggest that assigning roles in particular gives insight, and that measuring the frequency of bystander behaviour provides a method for understanding the dynamic nature of intergroup conflict, allowing social identity to be incorporated into models of riots.
Poor mental health is a state of psychological distress that is influenced by lifestyle factors such as sleep, diet, and physical activity. Compulsivity is a transdiagnostic phenotype that cuts across a range of mental illnesses, including obsessive–compulsive disorder and substance-related and addictive disorders, and is also influenced by lifestyle. Yet how lifestyle relates to compulsivity is presently unknown, and understanding this relationship is important for gaining insight into individual differences in mental health. We assessed (a) the relationships between compulsivity and diet quality, sleep quality, and physical activity, and (b) whether psychological distress statistically contributes to these relationships.
Methods
We collected harmonized data on compulsivity, psychological distress, and lifestyle from two independent samples (Australian n = 880 and US n = 829). We used mediation analyses to investigate bidirectional relationships between compulsivity and lifestyle factors, and the role of psychological distress.
Results
Higher compulsivity was significantly related to poorer diet and sleep. Psychological distress statistically mediated the relationship between poorer sleep quality and higher compulsivity, and partially statistically mediated the relationship between poorer diet and higher compulsivity.
Conclusions
Lifestyle interventions in compulsivity may target psychological distress in the first instance, followed by sleep and diet quality. As psychological distress links aspects of lifestyle and compulsivity, focusing on mitigating and managing distress may offer a useful therapeutic approach to improve physical and mental health. Future research may focus on the specific sleep and diet patterns which may alter compulsivity over time to inform lifestyle targets for prevention and treatment of functionally impairing compulsive behaviors.
This paper presents the current state of mathematical modelling of the electrochemical behaviour of lithium-ion batteries (LIBs) as they are charged and discharged. It reviews the models developed by Newman and co-workers, in the cases of both dilute and moderately concentrated electrolytes, and indicates the modelling assumptions required for their development. Particular attention is paid to the interface conditions imposed between the electrolyte and the active electrode material; necessary conditions are derived for one of these, the Butler–Volmer relation, in order to ensure physically realistic solutions. Insight into the origin of the differences between various models found in the literature is gained by considering formulations obtained using different measures of the electric potential. Materials commonly used for electrodes in LIBs are considered, and the various mathematical models used to describe lithium transport in them are discussed. The problem of upscaling from models of behaviour at the single-electrode-particle scale to the cell scale is addressed using homogenisation techniques, resulting in the pseudo-2D model commonly used to describe charge transport and discharge behaviour in lithium-ion cells. Numerical solution of this model is discussed and illustrative results for a common device are computed.
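For orientation, the Butler–Volmer relation referred to above is commonly written in Newman-type models in the following standard form; this is a sketch for readers unfamiliar with the condition, and the concentration dependence shown for the exchange current density is one widely used choice rather than the precise form analysed in the paper.

```latex
% Butler–Volmer interfacial current density (standard form used in Newman-type models)
j = j_0\left[\exp\!\left(\frac{\alpha_a F \eta}{R T}\right)
           - \exp\!\left(-\frac{\alpha_c F \eta}{R T}\right)\right],
\qquad
\eta = \Phi_s - \Phi_e - U(c_s),
\qquad
j_0 = k\, c_e^{\alpha_a}\,\bigl(c_s^{\max} - c_s\bigr)^{\alpha_a}\, c_s^{\alpha_c}.
% The expression for j_0 is an assumed, commonly used form, with rate constant k.
```

Here α_a and α_c are the anodic and cathodic transfer coefficients, F is Faraday's constant, R the gas constant, T the temperature, Φ_s and Φ_e the electrode and electrolyte potentials, U the open-circuit potential, c_s the lithium concentration at the particle surface and c_e the electrolyte concentration.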
Our objective was to assess how, and to what extent, a systems-level perspective is considered in decision-making processes for health interventions, by illustrating how studies define the boundaries of the system in their analyses and by defining the decision-making context in which a systems-level perspective is adopted.
Method
We conducted a scoping review following the Joanna Briggs Institute methodology. MEDLINE, EMBASE, Cochrane Library, and EconLit were searched and key search concepts included decision making, system, and integration. Studies were classified according to an interpretation of the “system” of analysis used in each study based on a four-level model of the health system (patient, care team, organization, and/or policy environment) and using categories (based on intervention type and system impacts considered) to describe the decision-making context.
Results
A total of 2,664 articles were identified and 29 were included for analysis. Most studies (16/29; 55%) considered multiple levels of the health system (i.e., patient, care team, organization, environment) in their analysis and assessed multiple classes of interventions versus a single class of intervention (e.g., pharmaceuticals, screening programs). Approximately half (15/29; 52%) of the studies assessed the influence of policy options on the system as a whole, and the other half assessed the impact of interventions on other phases of the disease pathway or life trajectory (14/29; 48%).
Conclusions
We found that systems thinking is not common in areas where health technology assessments (HTAs) are typically conducted. Against this background, our study demonstrates the need for future conceptualizations and interpretations of systems thinking in HTA.
Alexander Disease (AD) is a rare and ultimately lethal leukodystrophy, typically presenting in infants who exhibit developmental delay, macrocephaly, seizures, spasticity and quadriparesis. Classic infantile forms are generally due to sporadic mutations in GFAP that result in the massive deposition of intra-astrocytic Rosenthal fibres, particularly in the frontal white matter. However, phenotypic manifestations are broad and include both juvenile and adult forms that often display infratentorial pathology and a paucity of leukodystrophic features. We describe the unique case of an 8.5-year-old female who presented with an 8-month history of progressively worsening vomiting and cachexia, and whose extensive multidisciplinary systemic workup, including GI biopsies, proved negative. Neuroimaging ultimately revealed bilaterally symmetric, anterior-predominant supratentorial signal alterations in the white matter plus a 1.7 × 1.2 × 0.7 mm right dorsal medullary mass. Biopsy of this presumed low-grade glioma revealed features in keeping with AD, which was later confirmed on whole-exome sequencing. The proband exhibited a pathogenic p.Arg239Cys heterozygous missense mutation in GFAP, which was apparently inherited from her asymptomatic mother (1% mosaicism in the mother’s blood). Germline mosaic inheritance patterns of young-onset AD, particularly those presenting with a tumor-like mass of the brainstem, are rarely reported in the literature and serve to expand the clinicopathologic spectrum of AD.
LEARNING OBJECTIVES
This presentation will enable the learner to:
1. Recognize an uncommon clinical presentation of AD.
2. Describe the underlying genetics of AD, including a rare familial juvenile onset form featuring germline mosaicism.
Objective:
To develop a fully automated algorithm using data from the Veterans’ Affairs (VA) electronic medical record (EMR) to identify deep-incisional surgical site infections (SSIs) after cardiac surgeries and total joint arthroplasties (TJAs) for use in research studies.
Design:
Retrospective cohort study.
Setting:
This study was conducted in 11 VA hospitals.
Participants:
Patients who underwent coronary artery bypass grafting or valve replacement between January 1, 2010, and March 31, 2018 (cardiac cohort) and patients who underwent total hip arthroplasty or total knee arthroplasty between January 1, 2007, and March 31, 2018 (TJA cohort).
Methods:
Relevant clinical information and administrative code data were extracted from the EMR. The outcomes of interest were mediastinitis, endocarditis, or deep-incisional or organ-space SSI within 30 days after surgery. Multiple logistic regression analysis with a repeated regular bootstrap procedure was used to select variables and to assign points in the models. Sensitivities, specificities, positive predictive values (PPVs) and negative predictive values were calculated in comparison with outcomes collected by the Veterans’ Affairs Surgical Quality Improvement Program (VASQIP).
Results:
Overall, 49 (0.5%) of the 13,341 cardiac surgeries were classified as mediastinitis or endocarditis, and 83 (0.6%) of the 12,992 TJAs were classified as deep-incisional or organ-space SSIs. With at least 60% sensitivity, the PPVs of the SSI detection algorithms after cardiac surgeries and TJAs were 52.5% and 62.0%, respectively.
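To illustrate why the low prevalence makes a PPV in this range expected, the sketch below applies Bayes' rule to the cardiac-cohort figures reported above (49 events among 13,341 surgeries, 60% sensitivity, ~52.5% PPV); the specificity used is an assumed, back-calculated value chosen to reproduce the reported PPV, not a number taken from the study.

```python
# Illustrative back-of-envelope only: shows why a low SSI prevalence drags the
# positive predictive value (PPV) down even when specificity is very high.
# Prevalence and sensitivity come from the abstract; specificity is an assumed
# value chosen so that the resulting PPV matches the reported ~52.5%.

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value via Bayes' rule."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

prevalence = 49 / 13_341      # mediastinitis/endocarditis after cardiac surgery
sensitivity = 0.60            # minimum sensitivity targeted by the algorithm
specificity = 0.998           # assumed for illustration (not reported in the abstract)

print(f"PPV ≈ {ppv(sensitivity, specificity, prevalence):.1%}")  # ≈ 52.5%
```

With prevalence this low, even a specificity near 99.8% leaves roughly half of algorithm-flagged cases as false positives, which is consistent with the reported PPVs.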
Conclusions:
Considering the low prevalence rate of SSIs, our algorithms were successful in identifying a majority of patients with a true SSI while simultaneously reducing false-positive cases. As a next step, validation of these algorithms in different hospital systems with EMR will be needed.
Community pharmacies were underutilized as vaccination locations during the 2009 H1N1 pandemic. Since that time, community pharmacies have become a common location for seasonal influenza vaccination, with approximately one-third of adults now getting vaccinated at a pharmacy. Leveraging community pharmacies to vaccinate during a pandemic, such as pandemic influenza or the current coronavirus disease (COVID-19) pandemic, will result in a more timely and comprehensive public health response. The purpose of this article is to summarize the results of a strategic planning meeting held in 2017 that focused on operationalizing pandemic influenza vaccinations at a regional supermarket chain pharmacy. Participating in the planning session from the supermarket chain were organizational experts in pharmacy clinical programs, managed care, operations leadership, supply chain, information technology, loss prevention, marketing, and compliance. Additionally, experts from the county and state departments of health and university faculty collaborated in the planning session. Topics addressed included (1) establishing a memorandum of understanding with the state, (2) developing an internal emergency response plan, (3) scaling the pandemic response, (4) considerations for pharmacy locations, (5) staffing for pandemic response, (6) pandemic vaccine-specific training, (7) pharmacy workflow, (8) billing considerations, (9) documentation, (10) supplies and equipment, (11) vaccine supply chain, (12) communications, and (13) security and crowd control. Information from this planning session may be valuable to community pharmacies across the nation that seek to participate in COVID-19 pandemic vaccinations.
This project will work closely with existing service partners involved in street-level services and focus on testing and evaluating three approaches to street-level interventions for youth who are homeless and who have moderate or severe mental illness. Youth will be asked to choose their preferred service approach:
1. Housing First initiatives, focused on interventions designed to move youth to appropriate and available housing with ongoing housing supports.
2. Treatment First initiatives, providing Mental Health/Addiction supports and treatment solutions.
3. Housing and Treatment Together, with simultaneous attention to both housing and treatment.
Our primary objective is to understand the service delivery preferences of homeless youth and understand the outcomes of these choices. Our research questions include:
1. Which approaches to service are chosen by youth?
2. What are the differences and similarities between groups choosing each approach?
3. What are the critical ingredients needed to effectively implement services for homeless youth from the perspectives of youth, families and service providers?
Focus groups with staff and family members will be held to assist in understanding the nature of each service approach, the changes that evolve within services, and the facilitators of and barriers to service delivery. This work will be important in determining which approach is chosen by youth and why. Evaluating the outcomes associated with each choice will provide valuable information about the service options chosen by youth. This will assist in better identifying weaknesses in the services offered and inform further development of treatment options that youth will accept.
Gene × environment (G × E) interactions in eating pathology have been increasingly investigated; however, studies have been limited by sample size owing to the difficulty of obtaining genetic data.
Objective
To synthesize existing G × E research in the eating disorders (ED) field and provide a clear picture of the current state of knowledge with analyses of larger samples.
Method
Complete data from seven studies investigating community (n = 1750, 64.5% female) and clinical (n = 426, 100% female) populations, identified via systematic review, were included. Data were combined to perform five analyses: 5-HTTLPR × Traumatic Life Events (0–17 events) to predict ED status (n = 909), 5-HTTLPR × Sexual and Physical Abuse to predict bulimic symptoms (n = 1097), 5-HTTLPR × Depression to predict bulimic symptoms (n = 1256), and 5-HTTLPR × Impulsivity to predict disordered eating (n = 1149).
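For readers unfamiliar with how such G × E analyses are typically specified, the sketch below fits a logistic regression with a genotype-by-environment interaction term; the data, variable names and effect sizes are synthetic and invented for illustration, and this is not the authors' analysis pipeline.

```python
# Generic illustration of a gene x environment (G x E) interaction test:
# logistic regression of ED status on 5-HTTLPR s-allele carrier status,
# number of traumatic life events, and their interaction.
# All data below are synthetic; effect sizes are invented for demonstration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 909
s_allele = rng.integers(0, 2, n)    # 1 = carries the low-function (s) allele
trauma = rng.integers(0, 18, n)     # 0-17 traumatic life events
logit = -3.0 + 0.2 * s_allele + 0.05 * trauma + 0.10 * s_allele * trauma
ed_status = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

df = pd.DataFrame({"ed_status": ed_status, "s_allele": s_allele, "trauma": trauma})
model = smf.logit("ed_status ~ s_allele * trauma", data=df).fit(disp=False)
print(model.summary())  # the s_allele:trauma coefficient is the G x E interaction term
```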
Results
The low function (s) allele of 5-HTTLPR interacted with number of traumatic life events (P < .01) and sexual and physical abuse (P < .05) to predict increased likelihood of an ED in females but not males (Fig. 1). No other G × E interactions were significant, possibly due to the medium to low compatibility between datasets (Fig. 1).
Conclusion
These early, promising results suggest that knowledge of G × E interactions could be advanced if studies measured ED and environmental variables more uniformly, allowing continued collaboration to overcome the restrictions on obtaining genetic samples.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Given the common view that pre-exercise nutrition/breakfast is important for performance, the present study investigated whether breakfast influences resistance exercise performance via a physiological or psychological effect. Twenty-two resistance-trained, breakfast-consuming men completed three experimental trials, consuming water-only (WAT), or semi-solid breakfasts containing 0 g/kg (PLA) or 1·5 g/kg (CHO) maltodextrin. PLA and CHO meals contained xanthan gum and low-energy flavouring (approximately 122 kJ), and subjects were told both ‘contained energy’. At 2 h post-meal, subjects completed four sets of back squat and bench press to failure at 90 % ten repetition maximum. Blood samples were taken pre-meal, 45 min and 105 min post-meal to measure serum/plasma glucose, insulin, ghrelin, glucagon-like peptide-1 and peptide tyrosine-tyrosine concentrations. Subjective hunger/fullness was also measured. Total back squat repetitions were greater in CHO (44 (sd 10) repetitions) and PLA (43 (sd 10) repetitions) than WAT (38 (sd 10) repetitions; P < 0·001). Total bench press repetitions were similar between trials (WAT 37 (sd 7) repetitions; CHO 39 (sd 7) repetitions; PLA 38 (sd 7) repetitions; P = 0·130). Performance was similar between CHO and PLA trials. Hunger was suppressed and fullness increased similarly in PLA and CHO, relative to WAT (P < 0·001). During CHO, plasma glucose was elevated at 45 min (P < 0·05), whilst serum insulin was elevated (P < 0·05) and plasma ghrelin suppressed at 45 and 105 min (P < 0·05). These results suggest that breakfast/pre-exercise nutrition enhances resistance exercise performance via a psychological effect, although a potential mediating role of hunger cannot be discounted.
The Battelle Critical Care Decontamination System™ (CCDS™) decontaminates N95 filtering facepiece respirators (FFRs) using vapor phase hydrogen peroxide (VPHP) so that they can be reused when there is a critical supply shortage. The Battelle CCDS received an Emergency Use Authorization (EUA) from the US Food and Drug Administration (FDA) in March 2020. This research focused on evaluating the mechanical properties of the straps as an indicator of respirator fit. The objective was to characterize the load generated by the straps following up to 20 don/doff and decontamination cycles in Battelle's CCDS. In general, the measured loads at 50 and 100% strains after 20 cycles were similar (±15%) to those of the as-received controls. Qualitatively, reductions in load may be associated with loss of elasticity in the straps, potentially reducing the ability to obtain a proper fit. However, small changes in strap elasticity may not affect the ability to obtain a proper fit, given the potential for variation in strap length and positioning on the head. Regardless, prior to reusing an N95 respirator, it is important to complete a visual inspection to ensure it is not damaged, malformed, or soiled; if it is, the respirator should be discarded and a different one used. Similarly, the respirator should be discarded if the wearer cannot obtain a proper fit during the user seal check.
Feed represents a substantial proportion of production costs in the dairy industry and is a useful target for improving overall system efficiency and sustainability. The objective of this study was to develop methodology to estimate the economic value for a feed efficiency trait and the associated methane production relevant to Canada. The approach quantifies the level of economic savings achieved by selecting animals that convert consumed feed into product while minimizing the feed energy used for inefficient metabolism, maintenance and digestion. We define a selection criterion trait called Feed Performance (FP) as a 1 kg increase in more efficiently used feed in a first parity lactating cow. The impact of a change in this trait on the total lifetime value of more efficiently used feed via correlated selection responses in other life stages is then quantified. The resulting improved conversion of feed was also applied to determine the resulting reduction in output of emissions (and their relative value based on a national emissions value) under an assumption of constant methane yield, where methane yield is defined as kg methane/kg dry matter intake (DMI). Overall, increasing the FP estimated breeding value by one unit (i.e. 1 kg of more efficiently converted DMI during the cow’s first lactation) translates to a total lifetime saving of 3.23 kg in DMI and 0.055 kg in methane with the economic values of CAD $0.82 and CAD $0.07, respectively. Therefore, the estimated total economic value for FP is CAD $0.89/unit. The proposed model is robust and could also be applied to determine the economic value for feed efficiency traits within a selection index in other production systems and countries.
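As a worked check of the arithmetic above, the sketch below simply recombines the reported figures; the implied per-kilogram prices are back-calculated from the abstract's numbers and are not parameters quoted by the authors.

```python
# Back-of-envelope recombination of the figures reported in the abstract.
# The per-kg prices are implied (derived) values, not inputs stated by the authors.

dmi_saving_kg = 3.23        # lifetime saving in dry matter intake per unit of FP EBV
methane_saving_kg = 0.055   # associated lifetime methane reduction
dmi_value_cad = 0.82        # reported economic value of the DMI saving (CAD)
methane_value_cad = 0.07    # reported economic value of the methane reduction (CAD)

implied_feed_price = dmi_value_cad / dmi_saving_kg              # ≈ CAD $0.25 / kg DMI
implied_methane_value = methane_value_cad / methane_saving_kg   # ≈ CAD $1.27 / kg CH4
total_value = dmi_value_cad + methane_value_cad                 # CAD $0.89 per unit of FP

print(f"Implied feed price:      ${implied_feed_price:.2f}/kg DMI")
print(f"Implied methane value:   ${implied_methane_value:.2f}/kg CH4")
print(f"Total FP economic value: ${total_value:.2f}/unit")
```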
Introduction: Although use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP < 100 or shock index > 1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses recorded at 60 minutes after ED presentation were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered, including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550–2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458–1916 mL); cardiogenic control 768 mL (194–1341 mL) vs. cardiogenic PoCUS 981 mL (341–1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive. Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
Introduction: Point of care ultrasound has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared the diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was the diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (No PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses, including shock category, were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: 273 patients were enrolled, with follow-up for the primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity of 95.5% (90.0 to 98.1%), LR+ve of 17.9 (7.34 to 43.8), LR-ve of 0.21 (0.08 to 0.58), diagnostic OR of 85.6 (18.2 to 403.6) and accuracy of 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity of 93.8% (87.8 to 97.0%), LR+ve of 14.8 (7.1 to 30.9), LR-ve of 0.09 (0.01 to 0.58), diagnostic OR of 166.6 (18.7 to 1481) and accuracy of 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (-11.7% (-37.8 to 18.3%)) or specificity (1.73% (-4.67 to 8.29%)). Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS-based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However, performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
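The reported likelihood ratios and diagnostic odds ratio follow directly from the sensitivity and specificity point estimates; as a worked check, the sketch below recomputes them from the PoCUS-group values above (the function name is ours, and confidence intervals are not reproduced).

```python
# Relationship between sensitivity/specificity and the derived accuracy metrics,
# using the PoCUS-group point estimates reported above for cardiogenic shock.

def likelihood_ratios(sensitivity: float, specificity: float):
    lr_pos = sensitivity / (1.0 - specificity)   # positive likelihood ratio
    lr_neg = (1.0 - sensitivity) / specificity   # negative likelihood ratio
    dor = lr_pos / lr_neg                        # diagnostic odds ratio
    return lr_pos, lr_neg, dor

lr_pos, lr_neg, dor = likelihood_ratios(sensitivity=0.800, specificity=0.955)
print(f"LR+ ≈ {lr_pos:.1f}, LR- ≈ {lr_neg:.2f}, DOR ≈ {dor:.0f}")  # ≈ 17.8, 0.21, 85
```

Small differences from the published 17.9 and 85.6 arise because the study's underlying counts carry more precision than the rounded percentages used here.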
Potassium-rich mafic dykes and lavas from the Highwood Mountains Igneous Province, USA, were studied by electron-microprobe and bulk-rock analysis. For the mafic phonolites, compositional trends for olivine and augite phenocrysts and for groundmass biotite, alkali feldspar and titanomagnetite are presented and substitution mechanisms discussed. Phenocrysts of biotite and augite in the minettes are also characterized, together with groundmass alkali feldspar and titanomagnetite. The alkali feldspars and biotites are commonly enriched in Ba. Olivine, clinopyroxene and biotite phenocrysts are generally quite magnesium-rich, which is consistent with the primitive nature of the least evolved rocks.
Bulk-rock major-element compositions are combined with modal and microprobe data for the principal phenocrysts to calculate model residual liquid compositions for mafic phonolites, minettes and a syenitic rock. On the basis of phase equilibria, it is suggested that the main controls on differentiation are polybaric crystallization during transport of primary magmas from the mantle for the minettes, and low-pressure differentiation for the mafic phonolites. Whereas magma mixing might have contributed to petrogenesis, many of the disequilibrium features exhibited by clinopyroxene and biotite phenocrysts can also be attributed to pre-existing phenocrysts undergoing decompression melting during magma uprise from its mantle source, followed by rapid crystal growth and episodic volatile loss in sub-volcanic magma chambers.
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP < 100 or shock index > 1), who were randomized to PoCUS or control (standard care without PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's test, and continuous data by Student's t test and multi-level log-regression testing (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality: PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes, including ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately. Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefit with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit.
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour in the treatment of adult patients presenting with undifferentiated hypotension (SBP < 100 mmHg or a shock index > 1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. Four North American and three South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED and changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate at one and four hours. Comparisons utilized a t test as well as stratified binomial log-regression to assess for any significant improvement in resuscitation among the outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 mL; 95% CI 1365-1950) and PoCUS groups (1609 mL; 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard of care in hypotensive ED patients. No significant difference in fluid administered or in markers of resuscitation was found when comparing the use of a PoCUS protocol to standard of care in the resuscitation of patients with undifferentiated hypotension.