Both impulsivity and compulsivity have been identified as risk factors for problematic use of the internet (PUI). Yet little is known about the relationship between impulsivity, compulsivity and individual PUI symptoms, limiting a more precise understanding of mechanisms underlying PUI.
Aims
The current study is the first to use network analysis to (a) examine the unique associations among impulsivity, compulsivity and PUI symptoms, and (b) identify the most influential drivers in relation to the PUI symptom community.
Method
We estimated a Gaussian graphical model consisting of five facets of impulsivity, compulsivity and individual PUI symptoms among 370 Australian adults (51.1% female, mean age = 29.8, s.d. = 11.1). Network structure and bridge expected influence were examined to elucidate differential associations among impulsivity, compulsivity and PUI symptoms, as well as identify influential nodes bridging impulsivity, compulsivity and PUI symptoms.
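The abstract does not name the software used to estimate the network; analyses of this kind are often run in R (e.g. the bootnet package), so the following Python sketch, with placeholder data and hypothetical node labels, is only a rough analogue of the general approach: it estimates a regularised Gaussian graphical model via the graphical lasso, converts the precision matrix to partial correlations, and computes a simple one-step bridge expected influence for nodes outside a hypothetical PUI symptom community.

# Illustrative sketch only; not the authors' analysis code. Data and node
# assignments below are placeholders.
import numpy as np
from sklearn.covariance import GraphicalLassoCV

rng = np.random.default_rng(0)
X = rng.normal(size=(370, 20))              # 370 participants x 20 hypothetical nodes
pui_nodes = list(range(6, 20))              # hypothetical indices of the PUI symptom community

# Regularised precision matrix -> partial correlation network (edge weights)
precision = GraphicalLassoCV().fit(X).precision_
d = np.sqrt(np.diag(precision))
edges = -precision / np.outer(d, d)
np.fill_diagonal(edges, 0.0)

# One-step bridge expected influence: sum of a node's edge weights into the PUI community
for node in range(6):                       # nodes 0-5 = impulsivity facets + compulsivity (hypothetical)
    print(node, round(edges[node, pui_nodes].sum(), 3))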
Results
Four facets of impulsivity (i.e. negative urgency, positive urgency, lack of premeditation and lack of perseverance) and compulsivity were related to different PUI symptoms. Further, compulsivity and negative urgency showed the highest bridge expected influence, making them the most influential nodes in relation to the PUI symptom community.
Conclusions
The current findings delineate distinct relationships across impulsivity, compulsivity and PUI, which offer insights into potential mechanistic pathways and targets for future interventions in this space. To realise this potential, future studies are needed to replicate the identified network structure in different populations and determine the directionality of the relationships among impulsivity, compulsivity and PUI symptoms.
Background: Pain is a common symptom in adult-onset idiopathic dystonia (AOID). An appropriate tool to assess this symptom is needed to improve the care of patients with AOID. We developed a rating instrument for pain in AOID and validated it in cervical dystonia (CD). Methods: The Pain in Dystonia Scale (PIDS) was developed and validated in three phases: (1) international experts and participants generated and evaluated the preliminary items for content validity; (2) the PIDS was drafted and revised, followed by cognitive interviews to ensure suitability for self-administration; and (3) the clinimetric properties of the final PIDS were assessed in 85 participants. Results: The PIDS evaluates pain severity (by body part), functional impact and external modulating factors. It showed high test-retest reliability for the total score (0.9, p<0.001), intraclass correlation coefficients higher than 0.7 for all items and high internal consistency (Cronbach’s alpha 0.9). Convergent validity analysis revealed a strong correlation between the PIDS severity score and the TWSTRS pain subscale (0.8, p<0.001), the Brief Pain Inventory Short Form (0.7, p<0.001) and the impact of pain on daily functioning (0.7, p<0.001). Conclusions: The PIDS is the first questionnaire developed specifically to evaluate pain in patients with AOID, and it showed high-level clinimetric properties in people with CD.
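As a point of reference for the internal consistency reported above, Cronbach's alpha follows from the item variances and the variance of the total score; the Python sketch below uses placeholder scores rather than PIDS data.

# Standard Cronbach's alpha; scores below are placeholders, not PIDS responses.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(1)
scores = rng.integers(0, 5, size=(85, 10)).astype(float)   # 85 participants x 10 hypothetical items
print(round(cronbach_alpha(scores), 2))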
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective assessments of symptom change, and to identify and examine the effects of key risk factors.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety [Generalised Anxiety Disorder scale – 7 items (GAD-7): −0.33 points] and small increases in PTSD symptoms (PCL-6: 0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: 2.40; GAD-7: 1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of change in mental health may reflect underestimation of prior symptom severity due to recall bias.
Primary care providers (PCPs) are expected to help patients with obesity to lose weight through behavior change counseling and patient-centered use of available weight management resources. Yet, many PCPs face knowledge gaps and clinical time constraints that hinder their ability to successfully support patients’ weight loss. Fortunately, a small and growing number of physicians are now certified in obesity medicine through the American Board of Obesity Medicine (ABOM) and can provide personalized and effective obesity treatment to individual patients. Little is known, however, about how to extend the expertise of ABOM-certified physicians to support PCPs and their many patients with obesity.
Aim:
To develop and pilot test an innovative care model – the Weight Navigation Program (WNP) – to integrate ABOM-certified physicians into primary care settings and to enhance the delivery of personalized, effective obesity care.
Methods:
Quality improvement program with an embedded, 12-month, single-arm pilot study. Patients with obesity and ≥1 weight-related co-morbidity may be referred to the WNP by PCPs. All patients seen within the WNP during the first 12 months of clinical operations will be compared to a matched cohort of patients from another primary care site. We will recruit a subset of WNP patients (n = 30) to participate in a remote weight monitoring pilot program, which will include surveys at 0, 6, and 12 months, qualitative interviews at 0 and 6 months, and use of an electronic health record (EHR)-based text messaging program for remote weight monitoring.
Discussion:
Obesity is a complex chronic condition that requires evidence-based, personalized, and longitudinal care. To deliver such care in general practice, the WNP leverages the expertise of ABOM-certified physicians, health system and community weight management resources, and EHR-based population health management tools. The WNP is an innovative model with the potential to be implemented, scaled, and sustained in diverse primary care settings.
Poor mental health is a state of psychological distress that is influenced by lifestyle factors such as sleep, diet, and physical activity. Compulsivity is a transdiagnostic phenotype cutting across a range of mental illnesses, including obsessive–compulsive disorder and substance-related and addictive disorders, and is also influenced by lifestyle. Yet how lifestyle relates to compulsivity is presently unknown, and understanding this relationship is important for gaining insights into individual differences in mental health. We assessed (a) the relationships between compulsivity and diet quality, sleep quality, and physical activity, and (b) whether psychological distress statistically contributes to these relationships.
Methods
We collected harmonized data on compulsivity, psychological distress, and lifestyle from two independent samples (Australian n = 880 and US n = 829). We used mediation analyses to investigate bidirectional relationships between compulsivity and lifestyle factors, and the role of psychological distress.
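The abstract does not specify the modelling software; as a rough sketch of a product-of-coefficients mediation of the kind described (with psychological distress as the mediator), the Python snippet below uses hypothetical variable names and simulated data. In practice, the significance of the indirect effect would usually be assessed by bootstrapping.

# Rough product-of-coefficients mediation sketch; variables and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 880
df = pd.DataFrame({
    "poor_sleep": rng.normal(size=n),      # higher = poorer sleep quality (assumed coding)
    "distress": rng.normal(size=n),        # psychological distress (mediator)
    "compulsivity": rng.normal(size=n),
})

# Path a: predictor -> mediator; paths b and c': mediator and predictor -> outcome
a = smf.ols("distress ~ poor_sleep", df).fit().params["poor_sleep"]
outcome = smf.ols("compulsivity ~ distress + poor_sleep", df).fit()
b, c_prime = outcome.params["distress"], outcome.params["poor_sleep"]
print("indirect (a*b):", round(a * b, 3), "direct (c'):", round(c_prime, 3))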
Results
Higher compulsivity was significantly related to poorer diet and sleep. Psychological distress statistically mediated the relationship between poorer sleep quality and higher compulsivity, and partially statistically mediated the relationship between poorer diet and higher compulsivity.
Conclusions
Lifestyle interventions in compulsivity may target psychological distress in the first instance, followed by sleep and diet quality. As psychological distress links aspects of lifestyle and compulsivity, focusing on mitigating and managing distress may offer a useful therapeutic approach to improve physical and mental health. Future research may focus on the specific sleep and diet patterns which may alter compulsivity over time to inform lifestyle targets for prevention and treatment of functionally impairing compulsive behaviors.
This project will work closely with existing service partners involved in street-level services and focus on testing and evaluating three approaches to street-level interventions for youth who are homeless and who have moderate or severe mental illness. Youth will be asked to choose their preferred service approach:
Housing First-related initiatives, focused on interventions designed to move youth into appropriate and available housing with ongoing housing supports;
Treatment First initiatives, providing mental health/addiction supports and treatment solutions; and
Simultaneous attention to both housing and treatment together.
Our primary objective is to understand the service delivery preferences of homeless youth and understand the outcomes of these choices. Our research questions include:
1. Which approaches to service are chosen by youth?
2. What are the differences and similarities between groups choosing each approach?
3. What are the critical ingredients needed to effectively implement services for homeless youth from the perspectives of youth, families and service providers?
Focus groups with staff and family members will be conducted to assist in understanding the nature of each service approach, changes that evolve within services, and facilitators and barriers to service delivery. This work will be important in determining which approach is chosen by youth and why. Evaluating the outcomes associated with each choice will provide valuable information about the service options chosen by youth. This will assist in better identifying weaknesses in the services offered and inform further development of treatment options that youth will accept.
There is increasing evidence for a neurobiological basis of antisocial personality disorder (ASPD), including genetic liability, aberrant serotonergic function, neuropsychological deficits and structural and functional brain abnormalities. However, few functional brain imaging studies have been conducted using tasks of clinically relevant functions such as impulse control and reinforcement processing. Here we report on a study investigating the neural basis of behavioural inhibition and reward sensitivity in ASPD using functional magnetic resonance imaging (fMRI).
Methods
Seventeen medication-free male individuals with DSM-IV ASPD and 14 healthy controls were included. All subjects were screened for Axis I pathology and substance misuse. Scanner tasks comprised two block-design tasks: a Go/No-Go task and a reward task. Scanning was carried out on a 1.5 T Philips system. Whole-brain coverage was achieved using 40 axial slices with 3.5 mm spacing and a TR of 5 seconds. Data were analysed in SPM5 using random-effects models.
Results
Results of the Go/No-Go task confirmed brain activation previously described in the processing of impulse inhibition, namely in the orbitofrontal and dorsolateral prefrontal cortex and the anterior cingulate, and these activations were enhanced in the ASPD group. The reward task was associated with BOLD response changes in the reward network in both groups. However, these BOLD responses were reduced in the ASPD group, particularly in prefrontal areas.
Conclusions
Our results further support the notion of prefrontal dysfunction in ASPD. However, contrary to previous studies suggesting “hypofrontality” in this disorder, we found task-specific increases and decreases in BOLD responses.
Gene × environment (G × E) interactions in eating pathology have been increasingly investigated; however, studies have been limited by sample size owing to the difficulty of obtaining genetic data.
Objective
To synthesize existing G × E research in the eating disorders (ED) field and provide a clear picture of the current state of knowledge with analyses of larger samples.
Method
Complete data from seven studies investigating community (n = 1750, 64.5% female) and clinical (n = 426, 100% female) populations, identified via systematic review, were included. Data were combined to perform five analyses: 5-HTTLPR × Traumatic Life Events (0–17 events) to predict ED status (n = 909), 5-HTTLPR × Sexual and Physical Abuse (n = 1097) to predict bulimic symptoms, 5-HTTLPR × Depression to predict bulimic symptoms (n = 1256), and 5-HTTLPR × Impulsivity to predict disordered eating (n = 1149).
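For the first of these analyses, a G × E test of this kind is typically expressed as a regression with an interaction term; the Python sketch below (logistic regression for ED status with hypothetical variable coding and simulated data) illustrates the form of such a model rather than the authors' exact specification.

# Illustrative G x E interaction model (hypothetical coding, simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 909
df = pd.DataFrame({
    "ed_status": rng.integers(0, 2, size=n),        # 1 = eating disorder (placeholder outcome)
    "s_allele": rng.integers(0, 2, size=n),         # 1 = carries the low-function (s) allele
    "trauma_events": rng.integers(0, 18, size=n),   # number of traumatic life events (0-17)
})

# "a * b" expands to the main effects plus the a:b interaction term
fit = smf.logit("ed_status ~ s_allele * trauma_events", data=df).fit(disp=False)
print(fit.params["s_allele:trauma_events"])         # interaction coefficient on the log-odds scale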
Results
The low-function (s) allele of 5-HTTLPR interacted with the number of traumatic life events (P < .01) and with sexual and physical abuse (P < .05) to predict an increased likelihood of an ED in females but not males (Fig. 1). No other G × E interactions were significant, possibly owing to the medium to low compatibility between datasets (Fig. 1).
Conclusion
These early, promising results suggest that greater knowledge of G × E interactions could be achieved if studies measured ED and environmental variables more uniformly, allowing continued collaboration to overcome the restrictions of obtaining genetic samples.
Disclosure of interest
The authors have not supplied their declaration of competing interest.
Early in a foodborne disease outbreak investigation, illness incubation periods can help focus case interviews, case definitions, clinical and environmental evaluations and predict an aetiology. Data describing incubation periods are limited. We examined foodborne disease outbreaks from laboratory-confirmed, single-aetiology, enteric bacterial and viral pathogens reported to United States foodborne disease outbreak surveillance from 1998 to 2013. We grouped pathogens by clinical presentation and analysed the reported median incubation period among all illnesses from the implicated pathogen for each outbreak as the outbreak incubation period. Outbreaks from preformed bacterial toxins (Staphylococcus aureus, Bacillus cereus and Clostridium perfringens) had the shortest outbreak incubation periods (4–10 h medians), distinct from that of Vibrio parahaemolyticus (17 h median). Norovirus, salmonella and shigella had longer but similar outbreak incubation periods (32–45 h medians); campylobacter and Shiga toxin-producing Escherichia coli had the longest among bacteria (62–87 h medians); hepatitis A had the longest overall (672 h median). Our results can help guide diagnostic and investigative strategies early in an outbreak investigation to suggest or rule out specific aetiologies or, when the pathogen is known, the likely timeframe for exposure. They also point to possible differences in pathogenesis among pathogens causing broadly similar syndromes.
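The two-step summary described above (per-outbreak median incubation period, then comparison across pathogen groups) can be written compactly; the Python/pandas sketch below uses a small, entirely hypothetical line list.

# Hypothetical line-list data; illustrates the outbreak incubation period summary only.
import pandas as pd

cases = pd.DataFrame({
    "outbreak_id":  [1, 1, 1, 2, 2, 3, 3, 3],
    "pathogen":     ["S. aureus"] * 3 + ["Norovirus"] * 2 + ["Hepatitis A"] * 3,
    "incubation_h": [3, 4, 6, 30, 34, 650, 680, 700],
})

# Step 1: outbreak incubation period = median incubation among the outbreak's illnesses
outbreaks = (cases.groupby(["pathogen", "outbreak_id"])["incubation_h"]
                  .median()
                  .rename("outbreak_incubation_h")
                  .reset_index())

# Step 2: summarise outbreak incubation periods by pathogen group
print(outbreaks.groupby("pathogen")["outbreak_incubation_h"].median())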
Introduction: Although use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP <100 or shock index >1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses, recorded at 60 minutes after ED presentation, were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered, including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550 – 2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458 – 1916 mL); and cardiogenic control 768 mL (194 – 1341 mL) vs. cardiogenic PoCUS 981 mL (341 – 1620 mL)). Likewise, there were no differences in rates of inotrope administration or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive. Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
Introduction: Point of care ultrasound has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared the diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (No PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses including shock category were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: 273 patients were enrolled, with follow-up for the primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity 95.5% (90.0 to 98.1%), LR+ve 17.9 (7.34 to 43.8), LR-ve 0.21 (0.08 to 0.58), diagnostic OR 85.6 (18.2 to 403.6) and accuracy 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity 93.8% (87.8 to 97.0%), LR+ve 14.8 (7.1 to 30.9), LR-ve 0.09 (0.01 to 0.58), diagnostic OR 166.6 (18.7 to 1481) and accuracy of 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (-11.7% (-37.8 to 18.3%)) or specificity (1.73% (-4.67 to 8.29%)). Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS-based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However, performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
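The diagnostic accuracy measures reported above all follow from a 2×2 table of test result against final diagnosis; the Python sketch below shows the standard formulas with purely hypothetical counts, not data reconstructed from the trial.

# Standard diagnostic accuracy formulas; the counts used are hypothetical examples.
def diagnostic_metrics(tp, fn, fp, tn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return {
        "sensitivity": sens,
        "specificity": spec,
        "LR+": sens / (1 - spec),
        "LR-": (1 - sens) / spec,
        "diagnostic_OR": (sens / (1 - spec)) / ((1 - sens) / spec),
        "accuracy": (tp + tn) / (tp + fn + fp + tn),
    }

print(diagnostic_metrics(tp=40, fn=10, fp=5, tn=95))   # e.g. sensitivity 0.80, specificity 0.95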
Background: EMBRACE (NCT02462759) Part 1 is a randomized, double-blind, sham-procedure controlled study assessing the safety/tolerability of intrathecal nusinersen (12-mg equivalent dose) in symptomatic infants/children with SMA who were not eligible to participate in ENDEAR or CHERISH. Methods: Eligible participants had onset of SMA symptoms at ≤6 months with 3 SMN2 copies; onset at ≤6 months, age >7 months and 2 copies; or onset at >6 months, age ≤18 months, and 2/3 copies. Safety/tolerability was the primary endpoint. Exploratory endpoints included Hammersmith Infant Neurological Examination Section 2 (HINE-2) motor milestone attainment, change in ventilator use, and growth. Results: EMBRACE Part 1 was terminated early based on positive results from ENDEAR. Safety/tolerability was similar to previous trials. More nusinersen-treated (11/14; 79%) vs. sham-treated individuals (2/7; 29%) were HINE-2 motor milestone responders. Between Days 183 and 302, mean (SD) hours of ventilator use changed by +1.236 (3.712) hours in nusinersen-treated (n=12) and +2.123 (3.023) hours in sham-treated individuals (n=7). Similar increases in weight and body length were observed in nusinersen-treated and sham-treated individuals by Day 183. Conclusions: In EMBRACE Part 1, nusinersen demonstrated a favorable benefit-risk profile. These results add to the aggregated efficacy and safety/tolerability data for nusinersen in SMA.
Fluid residence time is a key concept in the understanding and design of chemically reacting flows. In order to investigate how turbulent mixing affects the residence time distribution within a flow, this study examines statistics of fluid residence time from a direct numerical simulation (DNS) of a statistically stationary turbulent round jet with a jet Reynolds number of 7290. The residence time distribution in the flow is characterised by solving transport equations for the residence time of the jet fluid and for the jet fluid mass fraction. The product of the jet fluid residence time and the jet fluid mass fraction, referred to as the mass-weighted stream age, gives a quantity that has stationary statistics in the turbulent jet. Based on the observation that the statistics of the mass fraction and velocity are self-similar downstream of an initial development region, the transport equation for the jet fluid residence time is used to derive a model describing a self-similar profile for the mean of the mass-weighted stream age. The self-similar profile predicted is dependent on, but different from, the self-similar profiles for the mass fraction and the axial velocity. The DNS data confirm that the first four moments and the shape of the one-point probability density function of mass-weighted stream age are indeed self-similar, and that the model derived for the mean mass-weighted stream-age profile provides a useful approximation. Using the self-similar form of the moments and probability density functions presented, it is therefore possible to estimate the local residence time distribution in a wide range of practical situations in which fluid is introduced by a high-Reynolds-number jet of fluid.
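For readers unfamiliar with the quantity, a commonly used formulation of the mass-weighted stream age (written here assuming constant density and equal molecular diffusivities, which may differ in detail from the equations solved in the study) is that the jet fluid mass fraction Y and the age-weighted quantity α = Yτ obey advection-diffusion equations that differ only by a source term:

\[
\frac{\partial Y}{\partial t} + \nabla\cdot(\mathbf{u}\,Y) = \nabla\cdot(D\,\nabla Y),
\qquad
\frac{\partial (Y\tau)}{\partial t} + \nabla\cdot(\mathbf{u}\,Y\tau) = \nabla\cdot\big(D\,\nabla (Y\tau)\big) + Y,
\]

where τ is the jet fluid residence time and the source term Y expresses that marked fluid accumulates age at unit rate; a mean residence time then follows as the ratio of the mean of Yτ to the mean of Y.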
Absorption coefficient (k), infinite reflectance (Ri), and scattering coefficient (s) were tabulated for five wavelengths and analyzed for statistical differences for seven weed species. The wavelengths were: 0.55 μm, 0.65 μm, 0.85 μm, 1.65 μm, and 2.20 μm. The Ri of common lambsquarters (Chenopodium album L.), johnsongrass [Sorghum halepense (L.) Pers.], and annual sowthistle (Sonchus oleraceus L.) leaves at the 0.85-μm wavelength were significantly (p = 0.05) higher than for sunflower (Helianthus annuus L.), ragweed parthenium (Parthenium hysterophorus L.), or London rocket (Sisymbrium irio L.). Annual sowthistle had the largest k value, and Palmer amaranth (Amaranthus palmeri S. Wats.) had the smallest k value at the 0.65-μm chlorophyll absorption wavelength. In general, johnsongrass, ragweed parthenium, or London rocket had the largest s values among the five wavelengths, whereas annual sowthistle and Palmer amaranth were usually lowest.
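The quantities k, s and Ri correspond to those of Kubelka-Munk two-flux theory, in which the infinite reflectance is determined entirely by the k/s ratio; whether the study's Ri values were derived exactly this way is an assumption, and the numbers in the Python sketch below are illustrative only.

# Kubelka-Munk relation between absorption (k), scattering (s) and infinite
# reflectance (R_inf). Illustrative values; not measurements from the study.
import math

def infinite_reflectance(k, s):
    ratio = k / s
    return 1.0 + ratio - math.sqrt(ratio * ratio + 2.0 * ratio)

for k, s in [(0.1, 1.0), (0.5, 1.0), (1.0, 1.0)]:
    print(f"k/s = {k/s:.2f} -> R_inf = {infinite_reflectance(k, s):.3f}")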
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP<100 or shock index>1), who were randomized to PoCUS or control (standard care and no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's exact test, and continuous data by Student's t-test and multi-level log-regression testing (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality: PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. Control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately. Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefits with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit.
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour in the treatment of adult patients presenting with undifferentiated hypotension (SBP<100 mmHg or a Shock Index >1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. 4 North American and 3 South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED, changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate, at one and four hours. Comparisons utilized a t-test as well as stratified binomial log-regression to assess for any significant improvement in resuscitation among the outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95% CI 1365-1950) and PoCUS groups (1609 ml; 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard of care in hypotensive ED patients. No significant difference in fluid used or markers of resuscitation was found when comparing the use of a PoCUS protocol to standard of care in the resuscitation of patients with undifferentiated hypotension.
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and improve diagnostic certainty; however, evidence on its benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy, as well as other key outcomes including mortality, which are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 Southern African sites. Screening at triage identified patients (SBP<100 mmHg or shock index >1) who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's exact test (two-tailed). Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group, 20/127 (15.7%), vs. control, 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups: PoCUS 39/123 (31.7%) vs control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis: PoCUS 90/127 (70%) vs control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00). Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS did change physicians’ perceived shock category. PoCUS did not improve diagnostic accuracy for category of shock or diagnosis.
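The headline comparison above (change in perceived shock category: 20/127 with PoCUS vs. 7/125 controls) illustrates the standard risk ratio calculation with a log-scale confidence interval; the short Python sketch below closely reproduces the reported RR 2.81 (95% CI 1.23 to 6.42).

# Risk ratio with a log-scale 95% CI for the change-in-perceived-shock-category result.
import math

def risk_ratio(events_1, n_1, events_2, n_2, z=1.96):
    rr = (events_1 / n_1) / (events_2 / n_2)
    se_log_rr = math.sqrt(1/events_1 - 1/n_1 + 1/events_2 - 1/n_2)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

rr, lo, hi = risk_ratio(20, 127, 7, 125)
print(f"RR {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")   # approx. RR 2.81 (1.23 to 6.41)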
Flue-cured tobacco is sensitive to foliar and soil residues from off-target synthetic auxin drift. Aminocyclopyrachlor is a newly developed synthetic auxin herbicide that may be used in right-of-way applications for broadleaf weed and brush control. Aminocyclopyrachlor is considered a reduced-risk alternative in rights-of-way compared with similar compounds because of its low application rate and low volatility risk. However, no research is available on the response of field-grown, flue-cured tobacco to aminocyclopyrachlor drift exposure. Research was conducted in 2009 and 2010 at the Border Belt Tobacco Research Station in Whiteville, NC, to determine the response of ‘NC 71’ flue-cured tobacco to five simulated drift rates of aminocyclopyrachlor (0.31, 1.6, 3.1, 15.7, and 31.4 g ae ha−1) and one simulated drift rate of aminopyralid (6.1 g ae ha−1), applied pretransplant incorporated, pretransplant unincorporated, 3 wk after transplant, and 6 wk after transplant. All herbicide rates and application timings caused significant visual tobacco injury, ranging from slight to severe with increasing herbicide drift rate. Tobacco plant heights and fresh weights were reduced at all application timings receiving ≥ 15.7 g ha−1 aminocyclopyrachlor and at the comparative aminopyralid rate.