A synthetic octahedral-site-vacancy-free annite sample and its progressive oxidation, induced by heating in air, were studied by powder X-ray diffraction (pXRD), Mössbauer spectroscopy, nuclear reaction analysis (NRA), Raman spectroscopy, X-ray fluorescence (XRF) spectroscopy, gas chromatography (GC), thermogravimetric analysis (TGA), differential thermal analysis (DTA), scanning electron microscopy (SEM), and size-fraction separation methods. For a set heating time and as temperature is increased, the sample first evolves along an annite-oxyannite join, until all H is lost via the oxybiotite reaction (Fe2+ + OH− ⇌ Fe3+ + O2− + H↑). It then evolves along an oxyannite-ferrioxyannite join, where ideal ferrioxyannite, KFe3+8/3□1/3AlSi3O12, is defined as the product resulting from complete oxidation of ideal oxyannite, KFe3+2Fe2+AlSi3O12, via the vacancy mechanism (3 Fe2+ ⇌ 2 Fe3+ + [6]□ + Fe↑). A pillaring-collapse transition is observed as a collapse of the c lattice parameter near the point where all OH groups are predicted and observed to be lost. Quantitative analyses of H, using NRA, GC, and Raman spectroscopy, corroborate this interpretation and, in combination with accurate ferric/ferrous ratios from Mössbauer spectroscopy and lattice parameter determinations, allow a clear distinction to be made between vacancy-free and vacancy-bearing annite. The amount of Fe in ancillary Fe oxide phases produced by the vacancy mechanism is measured by Mössbauer spectroscopy to be 11.3(5)% of total Fe, in agreement with both the theoretical prediction of 1/9 = 11.1% and the observed TGA weight gain. The initiation of Fe oxide formation near the point of completion of the oxybiotite reaction is corroborated by pXRD, TGA, Raman spectroscopy, and the appearance of an Fe oxide hyperfine field sextet in the Mössbauer spectra. The region of Fe oxide formation is shown to coincide with a region of octahedral site vacancy formation, using a new Mössbauer spectral signature of vacancies that consists of a component at 2.2 mm/s in the [6]Fe3+ quadrupole splitting distribution (QSD). The crystal chemical behaviors of annite-oxyannite and of oxyannite-ferrioxyannite are best compared and contrasted with the behaviors of other layer-silicate series in terms of b vs. the average octahedral cation to O bond length. This also leads to a diagnostic test for the presence of octahedral site vacancies in hydrothermally synthesized annite, based on a graph of b vs. Fe2+/Fe. The implications of the observed sequence of thermal oxidation reactions for the thermodynamic relevance of the oxybiotite and vacancy reactions in hydrothermal syntheses are examined, and it is concluded that the oxybiotite reaction is the relevant reaction in the single-phase stability field of annite, at high hydrogen fugacity and using ideal starting cation stoichiometry. The vacancy reaction is only relevant in a multi-phase field, at lower hydrogen fugacity, that includes an Fe oxide equilibrium phase (magnetite) that can effectively compete for Fe, or when using non-ideal starting cation stoichiometries.
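A brief mass-balance check of the quoted 1/9 prediction, written as a LaTeX fragment. It only restates the oxyannite and ferrioxyannite formulas and the vacancy reaction given above; the oxygen taken up from air by the expelled Fe is not shown.

```latex
% Ideal oxyannite has 3 Fe per formula unit, one of which is Fe2+.
% The vacancy mechanism (3 Fe2+ -> 2 Fe3+ + vacancy + Fe) expels one Fe
% for every three Fe2+ oxidized, i.e. 1/3 Fe per formula unit:
\[
  \mathrm{KFe^{3+}_{2}Fe^{2+}AlSi_{3}O_{12}}
  \;\longrightarrow\;
  \mathrm{KFe^{3+}_{8/3}\square_{1/3}AlSi_{3}O_{12}}
  \;+\;\tfrac{1}{3}\,\mathrm{Fe\ (to\ ancillary\ oxide)}
\]
\[
  \frac{\text{Fe expelled}}{\text{total Fe}}
  \;=\; \frac{1/3}{3} \;=\; \frac{1}{9} \;\approx\; 11.1\%,
\]
% consistent with the Mössbauer estimate of 11.3(5)% of total Fe.
```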
We model linear, inviscid, internal tides generated by the interaction of a barotropic tide with one-dimensional topography. Starting from the body-forcing formulation of the hydrodynamic problem, we derive a coupled-mode system (CMS) using a local eigenfunction expansion of the stream function. For infinitesimal topography, we solve this CMS analytically, recovering the classical weak topography approximation (WTA) formula for the barotropic-to-baroclinic energy conversion rate. For arbitrary topographies, we solve the CMS numerically. The CMS converges faster than existing modal solutions and can be applied in the subcritical and supercritical regimes for both ridge and shelf profiles. We show that the non-uniform barotropic tide affects the baroclinic field locally over topographies with large slopes, and we study the dependence of the radiated energy conversion rate on the criticality. We show that non-radiating or weakly radiating topographies are common in the subcritical regime. We also assess the region of validity of the WTA for the commonly used Gaussian ridge and for a compactly supported bump ridge studied here for the first time. Finally, we provide numerical evidence showing that in the strongly supercritical regime, the energy conversion rate for a ridge (respectively shelf) approaches the value obtained for the knife-edge (respectively step) topography.
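A minimal numpy sketch, not taken from the paper, of the textbook building blocks behind such modal expansions: the baroclinic vertical modes and horizontal wavenumbers for a flat bottom of depth H, constant buoyancy frequency N, tidal frequency omega and Coriolis parameter f (all values below are illustrative assumptions).

```python
import numpy as np

def flat_bottom_modes(H, N, omega, f, n_modes=10, nz=200):
    """Vertical modes and horizontal wavenumbers for linear internal tides
    over a flat bottom with constant buoyancy frequency N (rigid lid).

    Modes: phi_n(z) = sin(n*pi*z/H); dispersion relation
    k_n = (n*pi/H) * sqrt((omega**2 - f**2) / (N**2 - omega**2)).
    """
    z = np.linspace(0.0, H, nz)
    n = np.arange(1, n_modes + 1)
    k = (n * np.pi / H) * np.sqrt((omega**2 - f**2) / (N**2 - omega**2))
    phi = np.sin(np.outer(n, np.pi * z / H))   # shape (n_modes, nz)
    return z, k, phi

# Illustrative values: M2 tide at mid-latitude, 3 km depth, N = 1e-3 s^-1.
z, k, phi = flat_bottom_modes(H=3000.0, N=1e-3, omega=1.4e-4, f=1e-4)
print(k[:3])   # horizontal wavenumbers of the first three modes (rad/m)
```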
Background: Smoking is the leading cause of preventable morbidity worldwide, and developing effective smoking cessation strategies is therefore a public health priority. However, which brain networks support long-term maintenance of smoking cessation remains unexplored. Methods: We analyzed baseline resting-state fMRI data acquired in 23 smokers (mean age = 61.52 ± 3.7 years) who were followed longitudinally in a cohort of cognitively normal older adults. Self-reported smoking status and amount were recorded at baseline and again after 4 years. We investigated the effect of smoking behaviour change on functional brain connectivity using a seed-to-voxel approach. We examined a priori regions of interest (ROIs) including the reward network (ventromedial prefrontal cortex (vMPFC) and ventral striatum) and the right insula. These ROIs are promising target mechanisms given prior behavioural research linking them to smoking cessation. Results: Our results revealed that reduced smoking was associated with reduced connectivity between the ventral striatum and the middle frontal gyrus and enhanced connectivity between the right insula and the middle temporal gyrus (voxel p < 0.001, cluster p < 0.05 FDR-corrected). However, change in smoking did not reveal any significant effects in the vMPFC. Conclusions: Our findings suggest that successful smoking behaviour change is associated with altered reward network and insular functional connectivity in the long term.
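An illustrative numpy sketch of the seed-to-voxel principle described above (correlating the mean seed time course with every voxel's time course and Fisher z-transforming the map). The array names, dimensions and toy data are hypothetical; this is not the study's actual pipeline.

```python
import numpy as np

def seed_to_voxel_connectivity(voxel_ts, seed_mask):
    """Seed-to-voxel functional connectivity from preprocessed resting-state data.

    voxel_ts  : (n_timepoints, n_voxels) BOLD time series
    seed_mask : boolean (n_voxels,) array marking the seed ROI (e.g. ventral striatum)

    Returns the Fisher z-transformed correlation of the mean seed signal with
    every voxel, i.e. the subject-level map that would enter a group analysis.
    """
    seed_ts = voxel_ts[:, seed_mask].mean(axis=1)           # mean seed time course
    vt = voxel_ts - voxel_ts.mean(axis=0)                   # demean voxels
    st = seed_ts - seed_ts.mean()                           # demean seed
    r = (vt * st[:, None]).sum(axis=0) / (
        np.sqrt((vt**2).sum(axis=0)) * np.sqrt((st**2).sum()) + 1e-12)
    return np.arctanh(np.clip(r, -0.999999, 0.999999))      # Fisher z

# Toy usage with random data, for illustration only.
rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 5000))
mask = np.zeros(5000, dtype=bool); mask[:50] = True
zmap = seed_to_voxel_connectivity(ts, mask)
```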
Objective:
To assess demographic, clinical, and injury characteristics associated with health-related quality of life (HRQOL) in adults with persistent post-concussion symptoms (PPCS).
Methods:
Adults with PPCS presenting to a specialized brain injury clinic completed demographic, injury, and clinical outcome questionnaires at the initial clinic assessment. Clinical outcome measures were collected including the Rivermead Post-Concussion Symptoms Questionnaire (RPQ), Patient Health Questionnaire-9 (PHQ-9), Generalized Anxiety Disorder Scale-7 (GAD-7), and the Fatigue Severity Scale (FSS). HRQOL was measured using the Quality of Life after Brain Injury (QOLIBRI) questionnaire. Stepwise hierarchical multiple regression analysis adjusting for age, sex, and months since injury was used to determine associations between quality of life and clinical outcome measures.
Results:
Overall, 125 participants were included. The PHQ-9, FSS, and GAD-7 were significant predictors of QOLIBRI scores (R2 = 0.481, p < .001), indicating that participants with higher levels of depressive symptoms, fatigue, and anxiety reported poorer HRQOL. The PHQ-9 score was the strongest predictor, accounting for 42.0% of the variance in QOLIBRI scores. No demographic or injury characteristics significantly predicted QOLIBRI scores. There was a high prevalence of depressive symptoms with 72.8% of participants having PHQ-9 scores ≥ 10.
Conclusion:
Among patients with PPCS, mental health and fatigue are important contributors to HRQOL. Given the high burden of mood disorders and fatigue in this population, targeted treatments for these concerns may improve quality of life.
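A hedged sketch of the kind of hierarchical regression described in the Methods above, using statsmodels with hypothetical column names (block 1: age, sex, months since injury; block 2: PHQ-9, GAD-7, FSS). It is not the authors' code or data, only a worked illustration of how the incremental R² for the clinical block would be obtained.

```python
import pandas as pd
import statsmodels.formula.api as smf

# df is a hypothetical DataFrame with one row per participant:
# qolibri, age, sex (0/1), months_since_injury, phq9, gad7, fss
def hierarchical_regression(df: pd.DataFrame):
    # Block 1: demographic / injury covariates only
    m1 = smf.ols("qolibri ~ age + sex + months_since_injury", data=df).fit()
    # Block 2: add the clinical outcome measures
    m2 = smf.ols(
        "qolibri ~ age + sex + months_since_injury + phq9 + gad7 + fss",
        data=df,
    ).fit()
    # Increment in explained variance attributable to the clinical block
    delta_r2 = m2.rsquared - m1.rsquared
    return m1, m2, delta_r2
```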
Introduction: Selecting appropriate patients for hospitalization following emergency department (ED) evaluation of syncope is critical for serious adverse event (SAE) identification. The primary objective of this study was to determine the association between hospitalization and SAE detection using propensity score (PS) matching. The secondary objective was to determine whether SAE identification with hospitalization varied by Canadian Syncope Risk Score (CSRS) risk category. Methods: This was a secondary analysis of two large prospective cohort studies that enrolled adults (age ≥ 16 years) with syncope at 11 Canadian EDs. Patients with a serious condition identified during the index ED evaluation were excluded. The outcome was a 30-day SAE identified either in-hospital for hospitalized patients or after ED disposition for discharged patients, and included death, ventricular arrhythmia, non-lethal arrhythmia and non-arrhythmic SAE (myocardial infarction, structural heart disease, pulmonary embolism, hemorrhage). Patients were propensity matched using age, sex, blood pressure, prodrome, presumed ED diagnosis, ECG abnormalities, troponin, heart disease, hypertension, diabetes, arrival by ambulance and hospital site. Multivariable logistic regression assessed the interaction between CSRS and SAE detection, and we report odds ratios (OR). Results: Of the 8183 patients enrolled, 743 (9.0%) were hospitalized and 658 (88.6%) were PS matched. The OR for detection of any SAE among hospitalized patients, compared with those discharged from the ED, was 5.0 (95% CI 3.3–7.4); for non-lethal arrhythmia it was 5.4 (95% CI 3.1–9.6) and for non-arrhythmic SAE 6.3 (95% CI 2.9–13.5). Overall, the odds of identification of any SAE, and specifically of non-lethal arrhythmia and non-arrhythmic SAE, were significantly higher in-hospital among hospitalized patients than among those discharged from the ED (p < 0.001). There were no significant differences in 30-day mortality (p = 1.00) or ventricular arrhythmia detection (p = 0.21). The interaction between ED disposition and CSRS was significant (p = 0.04), and the probability of 30-day SAEs while in-hospital was greater for medium- and high-risk CSRS patients. Conclusion: In this multicenter prospective cohort, 30-day SAE detection was greater for hospitalized than for discharged patients. CSRS low-risk patients are least likely to have SAEs identified in-hospital; outpatient monitoring for moderate-risk patients requires further study.
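A hedged sketch of one common way to do the propensity-score matching described above: a logistic-regression propensity score followed by greedy 1:1 nearest-neighbour matching on the logit with a caliper. The covariate layout and caliper value are assumptions, not details taken from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def propensity_match(X, treated, caliper=0.2):
    """1:1 nearest-neighbour matching on the logit of the propensity score.

    X        : (n, p) covariate matrix (age, sex, vitals, ECG flags, site, ...)
    treated  : boolean (n,) array, True for hospitalized patients
    caliper  : max allowed logit difference, in SD units of the logit
    Returns a list of (treated_index, control_index) pairs.
    """
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    logit = np.log(ps / (1 - ps))
    width = caliper * logit.std()
    t_idx = np.flatnonzero(treated)
    c_idx = np.flatnonzero(~treated)
    used, pairs = set(), []
    for i in t_idx:
        d = np.abs(logit[c_idx] - logit[i])
        for j in np.argsort(d):                 # closest available control first
            cand = c_idx[j]
            if cand not in used and d[j] <= width:
                pairs.append((i, cand))
                used.add(cand)
                break
    return pairs
```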
Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none was lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics including mean age (66.3 vs 63.4 yrs), duration of AAFL (30.1 vs 24.5 hrs), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock Only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%, 95% CI −0.6 to 14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group 21.2% of patients converted with the infusion. There were no statistically significant differences for time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one fifth of patients, much less than for acute AF.
Introduction: Older (age ≥65 years) trauma patients suffer increased morbidity and mortality. This is due in part to under-triage of older trauma victims, resulting in lack of transfer to a trauma centre or failure to activate the trauma team. There are currently no Canadian guidelines for the management of older trauma patients. The objective of this study was to identify modifiers to the prehospital and emergency department (ED) phases of major trauma care for older adults based on expert consensus. Methods: We conducted a modified Delphi study to assess senior-friendly major trauma care modifiers based on national expert consensus. The panel consisted of 24 trauma care providers across Canada, including medical directors, paramedics, emergency physicians, emergency nurses, trauma surgeons and trauma administrators. Following a literature review, we developed an online Delphi survey consisting of 16 trauma care modifiers. Three online survey rounds were distributed, and panelists were asked to score items on a 9-point Likert scale. The following predetermined thresholds were used: appropriate (median score 7–9, without disagreement), inappropriate (median score 1–3, without disagreement), and uncertain (any median score with disagreement). The disagreement index (DI) is a method for measuring consensus within groups; agreement was defined a priori as a DI score <1. Results: There was a 100% response rate for all survey rounds. Three new trauma care modifiers were suggested by panelists. Of 19 trauma care modifiers, the expert panel achieved consensus agreement for 17 items. The prehospital modifier with the strongest agreement to transfer to a trauma centre was a respiratory rate <10 or >20 breaths/minute or needing ventilatory support (DI = 0.24). The ED modifier with the strongest level of agreement was obtaining a 12-lead electrocardiogram following the primary and secondary survey for all older adults (DI = 0.01). Two trauma care modifiers failed to reach consensus agreement: transporting older patients with ground-level falls to a trauma centre and activating the trauma team based solely on an age ≥65 years. Conclusion: Using a modified Delphi process, an expert panel agreed upon 17 trauma care modifiers for older adults in the prehospital and ED phases of care. These modifiers may improve the delivery of senior-friendly trauma care and should be considered when developing local and national trauma guidelines.
Introduction: CAEP recently developed the acute atrial fibrillation (AF) and flutter (AFL) [AAFF] Best Practices Checklist to promote optimal care and guidance on cardioversion and rapid discharge of patients with AAFF. We sought to assess the impact of implementing the Checklist in large Canadian EDs. Methods: We conducted a pragmatic stepped-wedge cluster randomized trial in 11 large Canadian ED sites in five provinces, over 14 months. All hospitals started in the control period (usual care) and then crossed over to the intervention period in random sequence, one hospital per month. We enrolled consecutive, stable patients presenting with AAFF whose symptoms required ED management. Our intervention was informed by qualitative stakeholder interviews to identify perceived barriers and enablers for rapid discharge of AAFF patients. The multifaceted intervention included local champions, presentation of the Checklist to physicians in group sessions, an online training module, a smartphone app, and targeted audit and feedback. The primary outcome was ED length of stay in minutes from time of arrival to time of disposition, analyzed at the individual patient level using linear mixed-effects regression accounting for the stepped-wedge design. We estimated a sample size of 800 patients. Results: We enrolled 844 patients with none lost to follow-up. Those in the control (N = 316) and intervention periods (N = 528) were similar for all characteristics including mean age (61.2 vs 64.2 yrs), duration of AAFF (8.1 vs 7.7 hrs), AF (88.6% vs 82.9%), AFL (11.4% vs 17.1%), and mean initial heart rate (119.6 vs 119.9 bpm). Median lengths of stay for the control and intervention periods were 413.0 vs 354.0 minutes, respectively (P < 0.001). Comparing control to intervention, there was an increase in the use of antiarrhythmic drugs (37.4% vs 47.4%; P < 0.01), electrical cardioversion (45.1% vs 56.8%; P < 0.01), and discharge in sinus rhythm (75.3% vs 86.7%; P < 0.001). There was a decrease in ED consultations to cardiology and medicine (49.7% vs 41.1%; P < 0.01), and a small, non-significant increase in anticoagulant prescriptions (39.6% vs 46.5%; P = 0.21). Conclusion: This multicenter implementation of the CAEP Best Practices Checklist led to a significant decrease in ED length of stay along with more ED cardioversions, fewer ED consultations, and more discharges in sinus rhythm. Widespread and rigorous adoption of the CAEP Checklist should lead to improved care of AAFF patients in all Canadian EDs.
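A hedged sketch of a patient-level analysis consistent with the stepped-wedge design described above: a linear mixed-effects model with a random intercept per site and fixed effects for the intervention indicator and the calendar step. Variable names are hypothetical and the exact model specification of the trial may differ.

```python
import statsmodels.formula.api as smf

# df: one row per patient, with hypothetical columns
#   ed_los_min   ED length of stay in minutes (primary outcome)
#   intervention 0 = control period, 1 = post-Checklist period
#   period       calendar step of the stepped-wedge design (categorical)
#   site         hospital identifier (cluster)
def stepped_wedge_model(df):
    # The random intercept for site accounts for clustering; the fixed
    # C(period) term adjusts for secular time trends inherent to the design.
    model = smf.mixedlm("ed_los_min ~ intervention + C(period)",
                        data=df, groups=df["site"])
    return model.fit()
```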
Introduction: Mild traumatic brain injury (mTBI) is a common problem: each year in Canada, its incidence is estimated at 500-600 cases per 100 000. Between 10% and 56% of mTBI patients develop persistent post-concussion symptoms (PPCS) that can last for more than 90 days. It is therefore important for clinicians to identify patients who are at risk of developing PPCS. We hypothesized that blood biomarkers drawn upon patient arrival to the Emergency Department (ED) could help predict PPCS. The main objective of this project was to measure the association between four biomarkers and the incidence of PPCS 90 days post mTBI. Methods: Patients were recruited in seven Canadian EDs. Non-hospitalized patients, aged ≥14 years, with a documented mTBI that occurred within 24 hrs of ED consultation and a GCS ≥13 on arrival were included. Sociodemographic and clinical data as well as blood samples were collected in the ED. A standardized telephone questionnaire was administered at 90 days post ED visit. The following biomarkers were analyzed using enzyme-linked immunosorbent assay (ELISA): S100B protein, neuron specific enolase (NSE), cleaved-Tau (c-Tau) and glial fibrillary acidic protein (GFAP). The primary outcome measure was the presence of persistent symptoms at 90 days after mTBI, as assessed using the Rivermead Post-Concussion Symptoms Questionnaire (RPQ). A ROC curve was constructed for each biomarker. Results: A total of 1276 patients were included in the study. The median age of the cohort was 39 (IQR 23-57) years, 61% were male and 15% suffered PPCS. The median values (IQR) for patients with PPCS compared to those without were: 43 pg/mL (26-67) versus 42 pg/mL (24-70) for S100B protein, 50 pg/mL (50-223) versus 50 pg/mL (50-199) for NSE, 2929 pg/mL (1733-4744) versus 3180 pg/mL (1835-4761) for c-Tau and 1644 pg/mL (650-3215) versus 1894 pg/mL (700-3498) for GFAP. The corresponding areas under the curve (AUC) were 0.495, 0.495, 0.51 and 0.54, respectively. Conclusion: Among mTBI patients, levels of S100B protein, NSE, c-Tau and GFAP during the first 24 hours after trauma do not appear to predict PPCS. Future research should test other biomarkers to determine their usefulness in predicting PPCS when combined with relevant clinical data.
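A hedged scikit-learn sketch of the ROC analysis described above: computing the AUC of each biomarker level against the 90-day PPCS outcome. The DataFrame layout and column names are hypothetical.

```python
from sklearn.metrics import roc_auc_score, roc_curve

def biomarker_aucs(df, outcome="ppcs_90d",
                   biomarkers=("s100b", "nse", "c_tau", "gfap")):
    """AUC of each ED biomarker level for predicting persistent symptoms.

    df is a hypothetical DataFrame with a binary outcome column and one
    numeric column per biomarker (pg/mL), mirroring the abstract above.
    """
    results = {}
    for b in biomarkers:
        y = df[outcome].values          # 1 = PPCS at 90 days, 0 = no PPCS
        x = df[b].values                # biomarker concentration
        results[b] = {
            "auc": roc_auc_score(y, x),
            "roc": roc_curve(y, x),     # (fpr, tpr, thresholds)
        }
    return results
```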
Introduction: Clinical assessment of patients with mTBI is challenging and overuse of head CT in the emergency department (ED) is a major problem. Over the last decades, studies have attempted to reduce unnecessary head CTs following mTBI by identifying new tools aiming to predict intracranial bleeding. The S100B serum protein level might help reduce such imaging, since higher S100B levels have been associated with intracranial hemorrhage following mTBI in previous literature. The main objective of this study was to assess whether the S100B serum protein level is associated with clinically important brain injury and could be used to reduce the number of head CTs following mTBI. Methods: This prospective multicenter cohort study was conducted in five Canadian EDs. mTBI patients with a Glasgow Coma Scale (GCS) score of 13-15 in the ED and a blood sample drawn within 24 hours after the injury were included. S100B protein was analyzed using enzyme-linked immunosorbent assay (ELISA). All types of intracranial bleeding were reviewed by a radiologist who was blinded to the biomarker results. The main outcome was the presence of clinically important brain injury. Results: A total of 476 patients were included. Mean age was 41 ± 18 years and 150 (31.5%) were female. Twenty-four (5.0%) patients had a clinically significant intracranial hemorrhage while 37 (7.8%) had any type of intracranial bleeding. The median S100B value (Q1-Q3) was 0.043 μg/L (0.008-0.080) for patients with clinically important brain injury versus 0.039 μg/L (0.023-0.059) for patients without. Sensitivity and specificity of the S100B protein level, if used alone to detect clinically important brain injury, were 16.7% (95% CI 4.7-37.4) and 88.5% (95% CI 85.2-91.3), respectively. Conclusion: S100B serum protein level was not associated with clinically significant intracranial hemorrhage in mTBI patients. This protein did not appear to be useful for reducing the number of CTs ordered in the ED and would have missed many clinically important brain injuries. Future research should focus on different ways to assess mTBI patients and ultimately reduce unnecessary head CT.
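A small worked illustration of how the sensitivity and specificity quoted above are computed from 2×2 counts at a fixed S100B cut-off. The counts below are hypothetical, chosen only so that they reproduce the reported proportions; they are not the study's actual cross-tabulation.

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); Specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts at a fixed S100B cut-off: 24 patients with clinically
# important brain injury (16.7% flagged) and 452 without (88.5% below cut-off).
sens, spec = sensitivity_specificity(tp=4, fn=20, tn=400, fp=52)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")  # 16.7%, 88.5%
```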
Introduction: Wide variability exists in emergency department (ED) syncope management. The Canadian Syncope Risk Score (CSRS) was derived and validated to predict the probability of 30-day serious outcomes after ED disposition. The objective was to identify barriers and facilitators among physicians for CSRS use to stratify risk and guide disposition decisions. Methods: We conducted semi-structured interviews with physicians involved in ED syncope care at 8 Canadian sites. We used purposive sampling, contacting ED physicians, cardiologists, internists, and hospitalists until theme saturation was reached. Interview questions were designed to understand whether the CSRS recommendations are consistent with current practice, barriers and facilitators for application into practice, and intention for future CSRS use. Interviews were conducted via telephone or videoconference. Two independent raters coded interviews using an inductive approach to identify themes, with discrepancies resolved through consensus. Our methods were consistent with the Knowledge to Action Framework, which highlights the need to assess barriers and facilitators for knowledge use and to adapt new interventions to local contexts. Results: We interviewed 14 ED physicians, 7 cardiologists, and 10 hospitalists/internists across 8 sites. All physicians reported the use of electrocardiograms for patients with syncope, a key component of the CSRS criteria. Almost all physicians reported that the low-risk recommendation (discharge without specific follow-up) was consistent with current practice, while less consistency was seen for the moderate-risk (15 days of outpatient monitoring) and high-risk recommendations (outpatient monitoring and/or admission). Key barriers to following the CSRS included a lack of access to outpatient monitoring and uncertainty over timely follow-up care. Other barriers included patient/family concerns, social factors, and necessary bloodwork. Facilitators included assisting with patient education, reassurance of their clinical gestalt, and optimal patient factors (e.g. reliability to return, support at home, few comorbidities). Conclusion: Physicians are receptive to using the CSRS tool for risk stratification and decision support. Implementation should address identified barriers, and adaptation to local settings may involve modifying the recommended clinical actions based on local resources and feasibility.
Introduction: Each year, 3/1000 Canadians sustain a mild traumatic brain injury (mTBI). Many of these mTBIs are accompanied by various co-injuries such as dislocations, sprains, fractures or internal injuries. A number of these patients, with or without co-injuries, will suffer from persistent post-concussive symptoms (PPCS) more than 90 days post injury. However, little is known about the impact of co-injuries on mTBI outcome. This study aims to describe the impact of co-injuries on PPCS and on patient return to normal activities. Methods: This multicenter prospective cohort study took place in seven large Canadian Emergency Departments (ED). Inclusion criteria: patients aged ≥14 years who had a documented mTBI that occurred within 24 hours of ED visit, with a Glasgow Coma Scale score of 13-15. Patients who were admitted following their ED visit or unable to consent were excluded. Clinical and sociodemographic information was collected during the initial ED visit. A research nurse then conducted three follow-up phone interviews at 7, 30 and 90 days post-injury, in which they assessed symptom evolution using the validated Rivermead Post-concussion Symptoms Questionnaire (RPQ). Adjusted risk ratios (RR) were calculated to estimate the influence of co-injuries. Results: A total of 1674 patients were included, of whom 1023 (61.1%) had at least one co-injury. At 90 days, patients with co-injuries seemed to be at higher risk of having three symptoms scored ≥2 points on the RPQ (RR: 1.28, 95% CI 1.02-1.61) and of experiencing the following symptoms: dizziness (RR: 1.50, 95% CI 1.03-2.20), fatigue (RR: 1.35, 95% CI 1.05-1.74), headaches (RR: 1.53, 95% CI 1.10-2.13), taking longer to think (RR: 1.50, 95% CI 1.07-2.11) and feeling frustrated (RR: 1.45, 95% CI 1.01-2.07). We also observed that patients with co-injuries were at higher risk of non-return to their normal activities (RR: 2.31, 95% CI 1.37-3.90). Conclusion: Patients with co-injuries could be at higher risk of suffering from specific symptoms at 90 days post-injury and of being unable to return to normal activities 90 days post-injury. A better understanding of the impact of co-injuries on mTBI could improve patient management. However, further research is needed to determine whether the differences shown in this study are due to the impact of co-injuries on mTBI recovery or to the co-injuries themselves.
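A hedged sketch of one common way to obtain adjusted risk ratios for a binary outcome, as reported above: Poisson regression with robust (sandwich) standard errors, the "modified Poisson" approach. This is not necessarily the authors' exact method, and the column names and adjustment set are hypothetical.

```python
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df: one row per patient with hypothetical columns
#   ppcs_90d   1 if >= 3 RPQ symptoms scored >= 2 points at 90 days, else 0
#   co_injury  1 if at least one co-injury was documented, else 0
#   age, sex   adjustment covariates
def adjusted_risk_ratio(df):
    model = smf.glm("ppcs_90d ~ co_injury + age + sex", data=df,
                    family=sm.families.Poisson())
    fit = model.fit(cov_type="HC0")            # robust SEs (modified Poisson)
    rr = np.exp(fit.params["co_injury"])       # adjusted risk ratio
    ci = np.exp(fit.conf_int().loc["co_injury"])
    return rr, tuple(ci)
```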
Introduction: Emergency department (ED) syncope management is extremely variable. We developed practice recommendations based on the validated Canadian Syncope Risk Score (CSRS) and an outpatient cardiac monitoring strategy, with physician input. Methods: We used a 2-step approach. Step 1: We pooled data from the derivation and validation prospective cohort studies (with adequate sample size) conducted at 11 Canadian sites (Sep 2010 to Apr 2018). Adults with syncope were enrolled, excluding those with a serious outcome identified during the index ED evaluation. 30-day adjudicated serious outcomes were arrhythmic (arrhythmias, unknown cause of death) and non-arrhythmic (MI, structural heart disease, pulmonary embolism, hemorrhage). We compared the serious outcome proportion among risk categories using the Cochran-Armitage test. Step 2: We conducted semi-structured interviews using observed risk to develop and refine the recommendations. We used purposive sampling of physicians involved in syncope care at 8 sites from Jun-Dec 2019 until theme saturation was reached. Two independent raters coded interviews using an inductive approach to identify themes; discrepancies were resolved by consensus. Results: Of the 8176 patients (mean age 54, 55% female), 293 (3.6%; 95% CI 3.2-4.0%) experienced 30-day serious outcomes: 0.4% deaths, 2.5% arrhythmic and 1.1% non-arrhythmic outcomes. The serious outcome proportion increased significantly from the low- to the high-risk category (p < 0.001; overall 0.6% to 27.7%; arrhythmic 0.2% to 17.3%; non-arrhythmic 0.4% to 5.9%, respectively). The C-statistic was 0.88 (95% CI 0.86–0.90). The non-arrhythmic risk per day for the first 2 days was 0.5% for medium-risk and 2% for high-risk patients, and very low thereafter. We recruited 31 physicians (14 ED, 7 cardiologists, 10 hospitalists/internists). 80% of physicians agreed that low-risk patients can be discharged without specific follow-up, with inconsistencies around length of ED observation. For cardiac monitoring of medium- and high-risk patients, 64% indicated that they do not have access; 56% currently admit high-risk patients and an additional 20% agreed with this recommendation. A deeper exploration led to the following refinement: discharge without specific follow-up for low-risk, a shared decision approach for medium-risk and a short course of hospitalization for high-risk patients. Conclusion: The recommendations were developed (with an online calculator) based on in-depth feedback from key stakeholders to improve uptake during implementation.
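A hedged, self-contained implementation of the Cochran-Armitage test for trend named in the Methods above, which tests for a linear trend in event proportions across ordered risk categories. The category counts in the usage line are purely illustrative, not the study's data.

```python
import numpy as np
from scipy.stats import norm

def cochran_armitage_trend(events, totals, scores=None):
    """Cochran-Armitage test for a linear trend in proportions across
    ordered groups (e.g. low / medium / high CSRS risk categories).

    events : number of patients with a 30-day serious outcome per category
    totals : number of patients per category
    scores : ordinal scores for the categories (default 0, 1, 2, ...)
    Returns (z statistic, two-sided p-value).
    """
    r = np.asarray(events, dtype=float)
    n = np.asarray(totals, dtype=float)
    t = np.arange(len(r)) if scores is None else np.asarray(scores, dtype=float)
    p_bar = r.sum() / n.sum()                      # pooled event proportion
    T = np.sum(t * (r - n * p_bar))                # trend statistic
    var_T = p_bar * (1 - p_bar) * (np.sum(n * t**2) - np.sum(n * t) ** 2 / n.sum())
    z = T / np.sqrt(var_T)
    return z, 2 * norm.sf(abs(z))

# Hypothetical category counts, for illustration only:
z, p = cochran_armitage_trend(events=[20, 150, 123], totals=[3500, 3800, 876])
```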
Introduction: Mild traumatic brain injury (mTBI) is a serious public health issue and as many as one third of mTBI patients may be affected by persistent post-concussion symptoms (PPCS) three months after their injury. Even though a significant proportion of all mTBIs are sports-related (SR), little is known about the recovery process of SR mTBI patients and the potential differences between SR and non-sports-related mTBI patients. The objective of this study was to describe the evolution of PPCS among patients who sustained a SR mTBI compared to those who sustained a non-sports-related mTBI. Methods: This Canadian multicenter prospective cohort study included patients aged ≥14 years who had a documented mTBI that occurred within 24 hours of Emergency Department (ED) visit, with a Glasgow Coma Scale score of 13-15. Patients who were hospitalized following their ED visit or unable to consent were excluded. Clinical and sociodemographic information was collected during the initial ED visit. Three follow-up phone interviews were conducted by a research nurse at 7, 30 and 90 days post-injury to assess symptom evolution using the validated Rivermead Post-concussion Symptoms Questionnaire (RPQ). Adjusted risk ratios (RR) were calculated to estimate the impact of the mechanism of injury (sports vs non-sports) on the presence and severity of PPCS. Results: A total of 1676 mTBI patients were included, of whom 358 (21.4%) sustained a SR mTBI. At 90 days post-injury, patients who suffered a SR mTBI seemed to be significantly less affected by fatigue (RR: 0.70, 95% CI 0.50-0.97) and irritability (RR: 0.60, 95% CI 0.38-0.94). However, no difference was observed between the two groups for the other symptoms evaluated in the RPQ. Moreover, the proportions of patients with three symptoms or more, with a score ≥21 on the RPQ, and who returned to their normal activities were also comparable. Conclusion: Although persistent post-concussion symptoms differ slightly depending on the mechanism of trauma, our results show that patients who sustained a SR mTBI could be at lower risk of experiencing some types of symptoms 90 days post-injury, in particular fatigue and irritability.
We consider the unbounded settling dynamics of a circular disk of diameter $d$ and finite thickness $h$ evolving with a vertical speed $U$ in a linearly stratified fluid of kinematic viscosity $\nu$ and diffusivity $\kappa$ of the stratifying agent, at moderate Reynolds numbers ($Re=Ud/\nu$). The influence of the disk geometry (diameter $d$ and aspect ratio $\chi=d/h$) and of the stratified environment (buoyancy frequency $N$, viscosity and diffusivity) is investigated experimentally and numerically. Three regimes for the settling dynamics have been identified for a disk reaching its gravitational equilibrium level. The disk first falls broadside-on, experiencing an enhanced drag force that can be linked to the stratification. A second regime corresponds to a change of stability for the disk orientation, from broadside-on to edgewise settling. This occurs when the non-dimensional velocity $U/\sqrt{\nu N}$ becomes smaller than some threshold value. Uncertainties in identifying the threshold value are discussed in terms of disk quality. This behaviour differs from that in a homogeneous fluid, which is associated with a fixed orientation (at its initial value) in the Stokes regime and a broadside-on settling orientation at low but finite Reynolds numbers. Finally, the third regime corresponds to the disk returning to its broadside-on orientation after stopping at its neutrally buoyant level.
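A small helper, not from the paper, computing the dimensionless groups named in the abstract: the Reynolds number $Re=Ud/\nu$, the velocity scale $U/\sqrt{\nu N}$ controlling the broadside-to-edgewise transition, and the aspect ratio $\chi=d/h$. The example values are hypothetical laboratory-scale numbers.

```python
import numpy as np

def settling_disk_numbers(U, d, h, nu, N):
    """Dimensionless groups for a disk settling in a linearly stratified fluid.

    U  : settling speed (m/s)        d, h : disk diameter and thickness (m)
    nu : kinematic viscosity (m^2/s)    N : buoyancy frequency (rad/s)
    """
    return {
        "Re": U * d / nu,                        # Reynolds number, Re = U d / nu
        "U_over_sqrt_nuN": U / np.sqrt(nu * N),  # controls broadside -> edgewise transition
        "chi": d / h,                            # aspect ratio
    }

# Hypothetical laboratory-scale values, for illustration only:
print(settling_disk_numbers(U=0.02, d=0.01, h=0.001, nu=1e-6, N=1.0))
```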
The expanding market for concentrated solar power plants requires high-temperature materials for solar surface receivers that would ideally heat an air coolant beyond 1300 K. This work presents an investigation of high-temperature alloys with ceramic coatings (AlN or SiC/AlN stacking) intended to combine the properties of the substrate (creep resistance, machinability) and of the coating (slow oxidation kinetics, high solar absorptivity). The first results showed that the high-temperature oxidation resistance and optical properties of the metallic alloys were improved by the different coatings. However, fast thermal shocks led to high stress levels, not compatible with coating integrity, owing to the differences in thermal expansion coefficients between coating and substrate.
Stratification due to salt or heat gradients greatly affects the distribution of inert particles and living organisms in the ocean and the lower atmosphere. Laboratory studies considering the settling of a sphere in a linearly stratified fluid confirmed that stratification may dramatically enhance the drag on the body, but failed to identify the generic physical mechanism responsible for this increase. We present a rigorous splitting scheme of the various contributions to the drag on a settling body, which allows them to be properly disentangled whatever the relative magnitude of inertial, viscous, diffusive and buoyancy effects. We apply this splitting procedure to data obtained via direct numerical simulation of the flow past a settling sphere over a range of parameters covering a variety of situations of laboratory and geophysical interest. Contrary to widespread belief, we show that, in the parameter range covered by the simulations, the drag enhancement is generally not primarily due to the extra buoyancy force resulting from the dragging of light fluid by the body, but rather to the specific structure of the vorticity field set in by buoyancy effects. Simulations also reveal how the different buoyancy-induced contributions to the drag vary with the flow parameters. To unravel the origin of these variations, we analyse the different possible leading-order balances in the governing equations. Thanks to this procedure, we identify several distinct regimes which differ by the relative magnitude of length scales associated with stratification, viscosity and diffusivity. We derive the scaling laws of the buoyancy-induced drag contributions in each of these regimes. Considering tangible examples, we show how these scaling laws combined with numerical results may be used to obtain reliable predictions beyond the range of parameters covered by the simulations.
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15 mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an a priori-specified modified intention-to-treat (MITT) basis excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%, 95% CI 0.2-9.9; P = 0.04). The multivariable analyses confirmed the superiority of the Drug-Shock strategy (P = 0.04). There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
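A hedged sketch of a 2×2 chi-squared comparison of the primary conversion outcome. The cell counts are back-calculated from the percentages quoted above (approximately 192/198 Drug-Shock and 166/180 Shock Only) and are shown only for illustration; this is not the trial's actual analysis code, and the uncorrected test is used here.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Conversion to sinus rhythm in the MITT population; counts reconstructed
# from the reported percentages, for illustration only.
table = np.array([[192, 198 - 192],     # Drug-Shock: converted, not converted
                  [166, 180 - 166]])    # Shock Only: converted, not converted

chi2, p, dof, expected = chi2_contingency(table, correction=False)
diff = table[0, 0] / table[0].sum() - table[1, 0] / table[1].sum()
print(f"absolute difference {diff:.1%}, chi-square p = {p:.3f}")
```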