Acid-base titrations and attenuated total reflectance-infrared (ATR-IR) spectroscopy of solutions containing Zn(NO3)2 and the herbicide 3-amino-1,2,4-triazole suggested that soluble complexes ZnL2+ and Zn(OH)L+ form, where L represents aminotriazole. Sorption experiments and modeling in systems containing K-saturated Wyoming (SWy-K) montmorillonite suggested that at low concentrations the aminotriazole sorbs primarily in cationic form via an ion-exchange mechanism. Sorption isotherms for aminotriazole are S-shaped, indicating a co-operative sorption mechanism as the concentration of the molecule increases. At higher concentrations, ATR-IR spectroscopy indicated the presence of cationic and neutral triazole molecules on the surface, while X-ray diffraction data suggested interaction with interlayer regions of the clay. When the concentration of the herbicide was high, initial sorption of aminotriazole cations modified the clay to make the partitioning of neutral molecules to the surface more favorable. Experiments conducted in the presence of Zn(II) indicated that below pH 7, Zn(II) and aminotriazole compete for sorption sites, while above pH 7 the presence of Zn(II) enhances the uptake of aminotriazole. The enhancement was attributed to the formation of an inner-sphere ternary surface complex at hydroxyl sites (SOH) on crystal edges, having the form [SOZn(OH)L]0.
Simple extended constant capacitance surface complexation models have been developed to represent the adsorption of polyaromatic dyes (9-aminoacridine, 3,6-diaminoacridine, azure A and safranin O) to kaolinite, and the competitive adsorption of the dyes with Cd. The formulation of the models was based on data from recent publications, including quantitative adsorption measurements over a range of conditions (varying pH and concentration), acid-base titrations and attenuated total reflectance-Fourier transform infrared spectroscopic data. In the models the dye molecules adsorb as aggregates of three or four, forming outer-sphere complexes with sites on the silica face of kaolinite. Both electrostatic and hydrophobic interactions are implicated in the adsorption processes. Despite their simplicity, the models fit a wide range of experimental data, thereby supporting the underlying hypothesis that the flat, hydrophobic, but slightly charged silica faces of kaolinite facilitate the aggregation and adsorption of the flat, aromatic, cationic dye molecules.
Over the last 25 years, radiowave detection of neutrino-generated signals, using cold polar ice as the neutrino target, has emerged as perhaps the most promising technique for detection of extragalactic ultra-high energy neutrinos (corresponding to neutrino energies in excess of 0.01 joules, or $10^{17}$ electron volts). During the summer of 2021 and in tandem with the initial deployment of the Radio Neutrino Observatory in Greenland (RNO-G), we conducted radioglaciological measurements at Summit Station, Greenland, to refine our understanding of the ice target. We report the result of one such measurement, the radio-frequency electric field attenuation length $L_\alpha$. We find an approximately linear dependence of $L_\alpha$ on frequency, with the best fit of the average field attenuation for the upper 1500 m of ice: $\langle L_\alpha \rangle = ((1154 \pm 121) - (0.81 \pm 0.14)\,(\nu/{\rm MHz}))\,{\rm m}$ for frequencies ν ∈ [145, 350] MHz.
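For readers who want to apply the reported fit, the following Python sketch evaluates the linear parameterization of the average attenuation length at a given frequency; treating the two fit parameters as uncorrelated when propagating the quoted uncertainties is our assumption, not something stated in the abstract.

```python
# Evaluate the reported fit <L_alpha>(nu) = (1154 - 0.81 * nu/MHz) m for the
# upper 1500 m of ice. Uncorrelated fit parameters are assumed when
# propagating the quoted 1-sigma uncertainties.
import math

A, SIG_A = 1154.0, 121.0   # intercept and its uncertainty [m]
B, SIG_B = 0.81, 0.14      # slope magnitude and its uncertainty [m/MHz]

def attenuation_length_m(freq_mhz: float) -> float:
    """Average field attenuation length in metres, valid for 145-350 MHz."""
    return A - B * freq_mhz

def attenuation_length_err_m(freq_mhz: float) -> float:
    """1-sigma uncertainty, assuming uncorrelated fit parameters."""
    return math.sqrt(SIG_A**2 + (freq_mhz * SIG_B)**2)

for nu in (145, 250, 350):
    print(f"{nu} MHz: {attenuation_length_m(nu):.0f} "
          f"+/- {attenuation_length_err_m(nu):.0f} m")
```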
Introduction: Selecting appropriate patients for hospitalization following emergency department (ED) evaluation of syncope is critical for serious adverse event (SAE) identification. The primary objective of this study was to determine the association between hospitalization and SAE detection using propensity score (PS) matching. The secondary objective was to determine if SAE identification with hospitalization varied by the Canadian Syncope Risk Score (CSRS) risk category. Methods: This was a secondary analysis of two large prospective cohort studies that enrolled adults (age ≥ 16 years) with syncope at 11 Canadian EDs. Patients with a serious condition identified during the index ED evaluation were excluded. The outcome was a 30-day SAE identified either in-hospital for hospitalized patients or after ED disposition for discharged patients, and included death, ventricular arrhythmia, non-lethal arrhythmia and non-arrhythmic SAE (myocardial infarction, structural heart disease, pulmonary embolism, hemorrhage). Patients were propensity matched using age, sex, blood pressure, prodrome, presumed ED diagnosis, ECG abnormalities, troponin, heart disease, hypertension, diabetes, arrival by ambulance and hospital site. Multivariable logistic regression assessed the interaction between CSRS and SAE detection, and we report odds ratios (OR). Results: Of the 8183 patients enrolled, 743 (9.0%) patients were hospitalized and 658 (88.6%) were PS matched. The OR for SAE detection for hospitalized patients in comparison to those discharged from the ED was 5.0 (95% CI 3.3, 7.4), for non-lethal arrhythmia 5.4 (95% CI 3.1, 9.6) and for non-arrhythmic SAE 6.3 (95% CI 2.9, 13.5). Overall, the odds of any SAE identification, and specifically of non-lethal arrhythmia and non-arrhythmic SAE, were significantly higher in-hospital among hospitalized patients than among those discharged from the ED (p < 0.001). There were no significant differences in 30-day mortality (p = 1.00) or ventricular arrhythmia detection (p = 0.21). The interaction between ED disposition and CSRS was significant (p = 0.04), and the probability of 30-day SAEs while in-hospital was greater for medium- and high-risk CSRS patients. Conclusion: In this multicenter prospective cohort, 30-day SAE detection was greater for hospitalized compared with discharged patients. CSRS low-risk patients are least likely to have SAEs identified in-hospital; out-patient monitoring for moderate-risk patients requires further study.
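To make the matching step concrete, here is a minimal, hedged Python sketch of 1:1 propensity-score matching on covariates like those listed above; the greedy nearest-neighbour rule, the caliper of 0.05 and the column names are illustrative assumptions, not the study's actual procedure.

```python
# Minimal sketch of 1:1 greedy propensity-score matching (hospitalized vs
# discharged). Covariate names, the caliper and the greedy matching rule
# are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def ps_match(df: pd.DataFrame, treat_col: str, covariates: list) -> pd.DataFrame:
    """Return matched (hospitalized, discharged) row-position pairs."""
    ps = (LogisticRegression(max_iter=1000)
          .fit(df[covariates], df[treat_col])
          .predict_proba(df[covariates])[:, 1])
    treated = np.flatnonzero(df[treat_col].to_numpy() == 1)
    controls = list(np.flatnonzero(df[treat_col].to_numpy() == 0))
    pairs, caliper = [], 0.05
    for t in treated:
        if not controls:
            break
        j = min(range(len(controls)), key=lambda k: abs(ps[controls[k]] - ps[t]))
        if abs(ps[controls[j]] - ps[t]) <= caliper:
            pairs.append((t, controls.pop(j)))
    return pd.DataFrame(pairs, columns=["hospitalized_row", "discharged_row"])

# Example with hypothetical column names:
# matched = ps_match(cohort, "hospitalized",
#                    ["age", "sex", "sbp", "troponin_abnormal", "csrs"])
```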
Introduction: For rhythm control of acute atrial flutter (AAFL) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAFL, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an intention-to-treat basis. Statistical significance was assessed using chi-squared tests and multivariable logistic regression. Results: We randomized 76 patients, and none was lost to follow-up. The Drug-Shock (N = 33) and Shock Only (N = 43) groups were similar for all characteristics including mean age (66.3 vs 63.4 yrs), duration of AAFL (30.1 vs 24.5 hrs), previous AAFL (72.7% vs 69.8%), median CHADS2 score (1 vs 1), and mean initial heart rate (128.9 vs 126.0 bpm). The Drug-Shock and Shock Only groups were similar for the primary outcome of conversion (100% vs 93%; absolute difference 7.0%, 95% CI −0.6 to 14.6; P = 0.25). The multivariable analyses confirmed the similarity of the two strategies (P = 0.19). In the Drug-Shock group 21.2% of patients converted with the infusion. There were no statistically significant differences for time to conversion (84.2 vs 97.6 minutes), total ED length of stay (9.4 vs 7.5 hours), disposition home (100% vs 95.3%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion (usually for transient hypotension) was more common in the Drug-Shock group (9.1% vs 0.0%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAFL patients to go home in sinus rhythm. IV procainamide alone was effective in only one fifth of patients, much less than for acute AF.
Introduction: For rhythm control of acute atrial fibrillation (AAF) in the emergency department (ED), choices include initial drug therapy or initial electrical cardioversion (ECV). We compared the strategies of pharmacological cardioversion followed by ECV if necessary (Drug-Shock), and ECV alone (Shock Only). Methods: We conducted a randomized, blinded, placebo-controlled trial (1:1 allocation) comparing two rhythm control strategies at 11 academic EDs. We included stable adult patients with AAF, where onset of symptoms was <48 hours. Patients underwent central web-based randomization stratified by site. The Drug-Shock group received an infusion of procainamide (15mg/kg over 30 minutes) followed 30 minutes later, if necessary, by ECV at 200 joules x 3 shocks. The Shock Only group received an infusion of saline followed, if necessary, by ECV x 3 shocks. The primary outcome was conversion to sinus rhythm for ≥30 minutes at any time following onset of infusion. Patients were followed for 14 days. The primary outcome was evaluated on an a priori-specified modified intention-to-treat (MITT) basis excluding patients who never received the study infusion (e.g. spontaneous conversion). Data were analyzed using chi-squared tests and logistic regression. Our target sample size was 374 evaluable patients. Results: Of 395 randomized patients, 18 were excluded from the MITT analysis; none were lost to follow-up. The Drug-Shock (N = 198) and Shock Only (N = 180) groups (total = 378) were similar for all characteristics including mean age (60.0 vs 59.5 yrs), duration of AAF (10.1 vs 10.8 hrs), previous AF (67.2% vs 68.3%), median CHADS2 score (0 vs 0), and mean initial heart rate (119.9 vs 118.0 bpm). More patients converted to normal sinus rhythm in the Drug-Shock group (97.0% vs 92.2%; absolute difference 4.8%, 95% CI 0.2 to 9.9; P = 0.04). The multivariable analyses confirmed the Drug-Shock strategy superiority (P = 0.04). There were no statistically significant differences for time to conversion (91.4 vs 85.4 minutes), total ED length of stay (7.1 vs 7.7 hours), disposition home (97.0% vs 96.1%), and stroke within 14 days (0 vs 0). Premature discontinuation of infusion was more common in the Drug-Shock group (8.1% vs 0.6%) but there were no serious adverse events. Conclusion: Both the Drug-Shock and Shock Only strategies were highly effective and safe in allowing AAF patients to go home in sinus rhythm. A strategy of initial cardioversion with procainamide was superior to a strategy of immediate ECV.
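As a sanity check on the primary comparison, the sketch below reproduces the difference-in-proportions arithmetic with event counts back-calculated from the reported percentages (approximately 192/198 vs 166/180); the Wald interval and uncorrected chi-squared test shown here are illustrative and may differ slightly from the trial's published analysis.

```python
# Illustrative recomputation of the primary outcome comparison.
# Event counts are back-calculated from 97.0% and 92.2% and are approximate.
import math
from scipy.stats import chi2_contingency

conv = [192, 166]   # assumed conversion counts (Drug-Shock, Shock Only)
n = [198, 180]      # group sizes reported above

p1, p2 = conv[0] / n[0], conv[1] / n[1]
diff = p1 - p2
se = math.sqrt(p1 * (1 - p1) / n[0] + p2 * (1 - p2) / n[1])
ci = (diff - 1.96 * se, diff + 1.96 * se)   # simple Wald interval

table = [[conv[0], n[0] - conv[0]],
         [conv[1], n[1] - conv[1]]]
chi2, p, dof, _ = chi2_contingency(table, correction=False)  # uncorrected test
print(f"difference = {diff:.3f}, 95% Wald CI ({ci[0]:.3f}, {ci[1]:.3f}), p = {p:.3f}")
```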
Salmonella spp. continue to be a leading cause of foodborne morbidity worldwide. To assess the risk of foodborne disease, current national regulatory schemes focus on prevalence estimates of Salmonella and other pathogens. The role of pathogen quantification as a risk management measure and its impact on public health is not well understood. To address this information gap, a quantitative risk assessment model was developed to evaluate the impact of pathogen enumeration strategies on public health after consumption of contaminated ground turkey in the USA. Public health impact was evaluated by using several dose–response models for high- and low-virulence strains to account for potential under- or overestimation of human health impacts. The model predicted 2705–21 099 illnesses that would result in 93–727 reported cases of salmonellosis. Sensitivity analysis identified cooking an unthawed product at home as the riskiest consumption scenario and microbial concentration as the most influential input on the incidence of human illness. Model results indicated that removing ground turkey lots exceeding contamination levels of 1 MPN/g and 1 MPN in 25 g would decrease the median number of illnesses by 86–94% and 99%, respectively. For a single production lot, contamination levels higher than 1 MPN/g would be needed to result in a reported case to public health officials. At contamination levels of 10 MPN/g, there would be a 13% chance of detecting an outbreak, and at 100 MPN/g, the likelihood of detecting an outbreak would increase to 41%. Based on these model prediction results, risk management strategies should incorporate pathogen enumeration. This would have a direct impact on illness incidence, linking public health outcomes with measurable food safety objectives.
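The following Monte Carlo sketch illustrates, in a hedged way, how a lot-level concentration limit can be screened in a model of this general kind: the lognormal contamination distribution, the serving size and the exponential dose-response parameter are hypothetical placeholders, not the published model's inputs, and cooking lethality is ignored for brevity.

```python
# Toy Monte Carlo: compare expected illnesses with and without rejecting
# production lots above 1 MPN/g. All parameters below are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
N_LOTS = 100_000
conc_mpn_per_g = rng.lognormal(mean=-3.0, sigma=2.0, size=N_LOTS)  # hypothetical
SERVING_G = 100.0   # hypothetical serving size [g]
R = 1e-4            # hypothetical exponential dose-response parameter

def expected_illnesses(conc: np.ndarray) -> float:
    dose = conc * SERVING_G                  # MPN ingested per serving
    p_ill = 1.0 - np.exp(-R * dose)          # exponential dose-response
    return float(p_ill.sum())

baseline = expected_illnesses(conc_mpn_per_g)
kept = conc_mpn_per_g[conc_mpn_per_g <= 1.0]  # reject lots above 1 MPN/g
reduction = 1.0 - expected_illnesses(kept) / baseline
print(f"illness reduction from a 1 MPN/g lot limit: {reduction:.0%}")
```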
Paper chromatographic separation of hydroxydimethylarsine oxide (cacodylic acid), monosodium methanearsonate (MSMA), sodium arsenate, and sodium arsenite was achieved with the aid of four solvent systems. Aqueous extracts of plant tissues removed essentially all the arsenicals applied, but methanolic fractionation was required before the extracts could be analyzed by paper chromatographic procedures. A standard nitric-sulfuric acid digestion procedure was employed for arsenic analyses, but great care was taken to avoid sulfuric-acid-induced charring by first adding relatively large amounts of nitric acid to drive off chlorides present. Depending upon the amount of chloride present, substantial losses of arsenic as arsine chlorides were observed if the samples charred. Five minutes in fuming sulfuric acid to completely break the carbon-arsenic bonds was another critical requirement for the quantitative determination of arsenic from cacodylic acid and MSMA. The silver diethyldithiocarbamate colorimetric method was useful for detecting as little as 0.6 μg or as much as 20 μg of arsenic per sample.
Introduction: Acute heart failure (AHF) is a common, serious condition that frequently results in morbidity and death and is a leading cause of hospital admissions. There is little evidence to guide ED physician disposition decisions for AHF patients. We sought to create a risk-stratification tool for use by ED physicians to determine which AHF patients are at high risk for poor outcomes. Methods: We conducted a prospective cohort study in 9 tertiary hospital EDs and enrolled adult patients presenting with shortness of breath due to AHF. Patients were assessed for standardized clinical and laboratory variables and then followed to determine short-term serious outcome (SSO), defined as death, intubation, myocardial infarction, or relapse requiring admission within 14 days. We identified predictors of SSO by stepwise logistic regression and then rounded beta coefficients to create a risk scale. Results: We enrolled 1,733 patients (mean age 77.1 years; 54.5% male; 50.1% initially admitted). SSOs occurred in 202 (11.7%) cases (14.0% in those admitted and 9.3% in those discharged from the ED). We created the CHFRS, consisting of: (1) initial assessment: (a) history of valvular heart disease, (b) on an anti-arrhythmic, (c) arrival heart rate ≥110, (d) treated with non-invasive ventilation; (2) investigations: (a) urea >12 mmol/L or Cr >150 µmol/L, (b) serum CO2 >35 mmol/L or pCO2 >60 mmHg (VBG or ABG), (c) troponin >5x upper reference level; and (3) fails reassessment after ED treatment: (i) abnormal resting vital signs (SaO2 <90% on room air or usual O2, HR >110, or RR >28), or (ii) unable to complete a 3-minute walk test. The risk of SSO varied from 5.0% for a score of 0 to 77.4% for a score of 9. Discrimination between SSO and no-SSO cases was good, with an area under the ROC curve of 0.70 (95% CI 0.66-0.74). There was good calibration between the observed and expected probability of SSO, and internal validation showed the risk scores to be very accurate across 1,000 replications using the bootstrap method. Conclusion: We have created the CHFRS tool, which consists of 8 simple variables and estimates the short-term risk of SSOs in AHF patients. CHFRS should help improve and standardize admission practices, diminishing both unnecessary admissions for low-risk patients and unsafe discharge decisions for high-risk patients. This will ultimately lead to better safety for patients and more efficient use of hospital resources.
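A hedged sketch of how a CHFRS-style point total might be tallied from the eight variables listed above follows; the abstract reports only that totals range from 0 to 9, so the equal 1-point placeholder weights are purely illustrative and are not the study's fitted weights.

```python
# Illustrative tally of a CHFRS-style score. Real weights come from the
# fitted, rounded regression coefficients (totals reach 9, so at least one
# item must carry more than 1 point); 1-point weights here are placeholders.
from dataclasses import dataclass

@dataclass
class ChfrsInputs:
    valvular_heart_disease: bool
    on_antiarrhythmic: bool
    arrival_hr_ge_110: bool
    noninvasive_ventilation: bool
    urea_gt_12_or_cr_gt_150: bool
    co2_gt_35_or_pco2_gt_60: bool
    troponin_gt_5x_url: bool
    fails_reassessment: bool   # abnormal resting vitals or failed 3-minute walk

def chfrs_points(x: ChfrsInputs) -> int:
    """Sum placeholder 1-point weights across the eight listed variables."""
    return sum(int(v) for v in vars(x).values())

# Example: a patient meeting three criteria scores 3 points with these weights.
print(chfrs_points(ChfrsInputs(True, False, True, False, False, False, True, False)))
```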
Background: It has been hypothesized that [18F]-sodium fluoride (NaF), imaged with positron emission tomography (PET), binds to hydroxyapatite molecules expressed in regions of active calcification. Therefore, we aimed to validate NaF as a marker of hydroxyapatite expression in high-risk carotid plaque. Methods: Eleven patients (69 ± 5 years, 3 female) scheduled for carotid endarterectomy were prospectively recruited for NaF PET/CT. One patient received a second, contralateral endarterectomy; two patients were excluded (intolerance to contrast media and PET/CT misalignment). The bifurcation of the common carotid artery was used as the reference point; NaF uptake (tissue-to-blood ratio, TBR) was measured at every PET slice extending 2 cm above and below the bifurcation. Excised plaque was stained with Goldner's trichrome, and whole-slide digitized images were used to quantify hydroxyapatite expression. Pathology was co-registered with PET. Results: NaF uptake was related to the extent of hydroxyapatite expression (r=0.45, p<0.001). When bilateral plaques were classified by symptomatology, symptomatic plaque associated with cerebrovascular events (TBR 3.75±1.1, n=9) had greater NaF uptake than clinically silent asymptomatic plaque (TBR 2.79±0.6, n=11) (p=0.04). Conclusion: NaF uptake is related to hydroxyapatite expression and is increased in plaque associated with cerebrovascular events. NaF may serve as a novel biomarker of active calcification and plaque vulnerability.
Altered levels of selenium and copper have been linked with altered cardiovascular disease risk factors including changes in blood triglyceride and cholesterol levels. However, it is unclear whether this can be observed prenatally. This cross-sectional study includes 274 singleton births from 2004 to 2005 in Baltimore, Maryland. We measured umbilical cord serum selenium and copper using inductively coupled plasma mass spectrometry. We evaluated exposure levels vis-à-vis umbilical cord serum triglyceride and total cholesterol concentrations in multivariable regression models adjusted for gestational age, birth weight, maternal age, race, parity, smoking, prepregnancy body mass index, n-3 fatty acids and methyl mercury. The percent difference in triglycerides comparing those in the highest v. lowest quartile of selenium was 22.3% (95% confidence interval (CI): 7.1, 39.7). For copper this was 43.8% (95% CI: 25.9, 64.3). In multivariable models including both copper and selenium as covariates, copper, but not selenium, maintained a statistically significant association with increased triglycerides (percent difference: 40.7%, 95% CI: 22.1, 62.1). There was limited evidence of a relationship of increasing selenium with increasing total cholesterol. Our findings provide evidence that higher serum copper levels are associated with higher serum triglycerides in newborns, but should be confirmed in larger studies.
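One common way to obtain a "percent difference" of this kind is to regress log-transformed triglycerides on exposure quartiles and convert the coefficient via 100 × (exp(β) − 1); whether this study used exactly that transformation is an assumption on our part, and the snippet below only checks the arithmetic against the reported 22.3% estimate.

```python
# Convert a log-scale regression coefficient into a percent difference in the
# outcome. The coefficient value is chosen to reproduce the reported 22.3%.
import math

beta = math.log(1.223)                        # log-scale coefficient (illustrative)
pct_diff = 100.0 * (math.exp(beta) - 1.0)     # percent difference in triglycerides
print(f"percent difference: {pct_diff:.1f}%")  # -> 22.3%
```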
The emergence of invasive fungal wound infections (IFIs) in combat casualties led to the development of a combat trauma-specific IFI case definition and classification. Prospective data were collected from 1133 US military personnel injured in Afghanistan (June 2009–August 2011). The IFI rate was 0·2% among ward admissions and 11·7% among intensive care unit admissions (6·8% overall). Seventy-seven IFI cases were classified as proven/probable (n = 54) and possible/unclassifiable (n = 23) and compared in a case-case analysis. There was no difference in clinical characteristics between the proven/probable and possible/unclassifiable cases. Possible/unclassifiable IFI cases had a shorter time to diagnosis (P = 0·02) and to initiation of antifungal therapy (P = 0·05) and fewer operative visits (P = 0·002) compared to proven/probable cases, but clinical outcomes were similar between the groups. Although the trauma-related IFI classification scheme did not provide prognostic information, it is an effective tool for clinical and epidemiological surveillance and research.
Inflammation is associated with preterm premature rupture of membranes (PPROM) and adverse neonatal outcomes. Subchorionic thrombi, with or without inflammation, may also be a significant pathological finding in PPROM. Patterns of inflammation and thrombosis may give insight into mechanisms of adverse neonatal outcomes associated with PPROM. To characterize histologic findings of placentas from pregnancies complicated by PPROM at altitude, 44 placentas were evaluated for gross and histological indicators of inflammation and thrombosis. Student's t-test (or Mann–Whitney U-test), χ2 analysis (or Fisher's exact test), mean square contingency and logistic regression were used when appropriate. The prevalence of histologic acute chorioamnionitis (HCA) was 59%. Fetal-derived inflammation (funisitis and chorionic plate vasculitis) was seen at lower frequency (30% and 45%, respectively) and not always in association with HCA. There was a trend for Hispanic women to have higher odds of funisitis (OR = 5.9; P = 0.05). Subchorionic thrombi were seen in 34% of all placentas. The odds of subchorionic thrombi without HCA were 6.3 times greater than the odds of subchorionic thrombi with HCA (P = 0.02). There was no difference in gestational age or rupture-to-delivery interval with the presence or absence of inflammatory or thrombotic lesions. These findings suggest that PPROM is caused by or can result in fetal inflammation, placental malperfusion, or both, independent of gestational age or rupture-to-delivery interval; maternal ethnicity and altitude may contribute to these findings. Future studies focused on this constellation of PPROM placental findings, genetic polymorphisms and neonatal outcomes are needed.
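For readers unfamiliar with the calculation, the sketch below shows how an odds ratio and a Fisher's exact p-value are obtained from a 2x2 cross-tabulation of HCA status against subchorionic thrombi; the cell counts used are hypothetical placeholders, not the study's data.

```python
# Odds ratio and Fisher's exact test from a 2x2 table. The counts below are
# hypothetical placeholders, not the 44-placenta cross-tabulation.
from scipy.stats import fisher_exact

#                thrombi   no thrombi
table = [[10, 8],            # HCA absent   (hypothetical counts)
         [5, 21]]            # HCA present  (hypothetical counts)

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, Fisher exact p = {p_value:.3f}")
```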
In paediatric practice, mean reference energy requirements for groups are often used to predict individual infant energy requirements. References from the FAO/WHO/United Nations University are based on infants not fed according to the current infant feeding recommendations. The objective of the present study was to measure total energy expenditure (TEE) and determine energy requirements using criterion methods, and to validate the use of a TEE prediction equation and mean energy requirement references for predicting individual TEE and energy requirements, respectively, in infants who were exclusively breast-fed (EBF) to 6 months of age. EBF infants were included from Greater Glasgow for measurements at 3·5 (n 36) and 6 (n 33) months of age. TEE was measured using doubly labelled water and energy requirements were determined using the factorial approach. TEE and energy requirements were also predicted using equations based on body weight. Relationships between criterion methods and predictions were assessed using correlations. Paired t tests and Bland–Altman plots were used to assess agreement. At the population level, predicted and measured TEE were similar. The energy requirement reference significantly underestimated energy requirements by 7·2 % at 3·5 months at the population level, but there was no bias at 6 months. Errors at the individual level were large, and energy requirements were underestimated to a larger extent for infants with higher energy requirements. This indicates that references presently used in clinical practice to estimate energy requirements may not fully account for the different growth pattern of EBF infants. More studies in infants EBF to 6 months of age are needed to understand how growth of EBF infants influences energy requirements.
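A minimal sketch of the Bland–Altman agreement assessment mentioned above follows; the toy TEE values (in MJ/day) are hypothetical, and the point is only the bias and limits-of-agreement calculation, not the study's data.

```python
# Bland-Altman style agreement check: mean bias and 95% limits of agreement
# between predicted and measured TEE. The arrays below are hypothetical.
import numpy as np

def bland_altman(measured: np.ndarray, predicted: np.ndarray):
    """Return mean bias and 95% limits of agreement (bias +/- 1.96 SD)."""
    diff = predicted - measured
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

measured_tee = np.array([1.95, 2.10, 2.30, 2.05, 2.40])   # MJ/day, hypothetical
predicted_tee = np.array([2.00, 2.05, 2.25, 2.15, 2.35])  # MJ/day, hypothetical

bias, (low, high) = bland_altman(measured_tee, predicted_tee)
print(f"bias = {bias:.3f} MJ/day, limits of agreement ({low:.3f}, {high:.3f})")
```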
The serological responses of conventionally reared sheep were compared after vaccination with inactivated parainfluenza 3 (PI3) virus incorporated in three different adjuvants. Inactivated PI3 virus with the double-stranded RNA, BRL 5907 in an oil emulsion was shown to stimulate higher serum antibody titres over the first 5 weeks after vaccination than virus with and without BCG emulsified in oil. The ability of this vaccine to protect specific pathogen-free lambs against challenge with PI3 virus was examined in a second experiment. In this experiment the vaccine stimulated virus neutralizing and haemagglutination inhibiting antibodies in the serum. After intranasal and intratracheal inoculation with PI3 virus at challenge, vaccinated lambs showed no clinical illness and virus isolation was confined, except in one lamb, to the first two days. In contrast, unvaccinated lambs developed respiratory disease and virus was isolated daily for 7 days after challenge.