Clostridioides difficile infection (CDI) is a common and often nosocomial infection associated with increased mortality and morbidity. Antibiotic use is the most important modifiable risk factor, but many patients require empiric antibiotics. We estimated the increased risk of hospital-onset CDI with one daily dose-equivalent (DDE) of various empiric antibiotics compared to management without that daily dose-equivalent.
Methods:
Using a multicenter retrospective cohort of adults admitted between March 2, 2020 and February 11, 2021 for the treatment of SARS-CoV-2, we used a series of three-level logistic regression models to estimate the probability of receiving each of several antibiotics of interest. For each antibiotic, we then limited our data set to patient-days at intermediate probability of receipt and used augmented inverse-probability weighted models to estimate the average treatment effect of one daily dose-equivalent, compared to management without that daily dose-equivalent, on the probability of hospital-onset CDI.
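For intuition, a minimal single-level sketch of the augmented inverse-probability weighted (AIPW) estimator is shown below. It is not the study's three-level specification; the data frame columns (treated, cdi, and the covariate list) are hypothetical placeholders, and a simple propensity-trimming step stands in for the restriction to patient-days at intermediate probability of receipt.

```python
# Minimal, single-level sketch of an AIPW average treatment effect estimate.
# The actual study used three-level logistic regression models; column names
# and the trimming thresholds here are illustrative assumptions only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def aipw_ate(df: pd.DataFrame, covariates: list) -> float:
    X = df[covariates].to_numpy()
    t = df["treated"].to_numpy()   # 1 if the patient-day received one daily dose-equivalent
    y = df["cdi"].to_numpy()       # 1 if hospital-onset CDI followed

    # Propensity model: probability of receiving the antibiotic on that patient-day.
    e = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]

    # Keep patient-days at "intermediate" propensity, mirroring the study's restriction.
    keep = (e > 0.1) & (e < 0.9)
    X, t, y, e = X[keep], t[keep], y[keep], e[keep]

    # Outcome models fit separately among treated and untreated patient-days.
    mu1 = LogisticRegression(max_iter=1000).fit(X[t == 1], y[t == 1]).predict_proba(X)[:, 1]
    mu0 = LogisticRegression(max_iter=1000).fit(X[t == 0], y[t == 0]).predict_proba(X)[:, 1]

    # Doubly robust AIPW estimator of the average treatment effect.
    psi = (t * (y - mu1) / e + mu1) - ((1 - t) * (y - mu0) / (1 - e) + mu0)
    return float(psi.mean())
```

Because the estimator combines a propensity model with outcome models, it remains consistent if either component is correctly specified, which is the usual motivation for the doubly robust approach.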
Results:
In 24,406 patient-days at intermediate probability of receipt, parenteral vancomycin increased risk of hospital-onset CDI, with an average treatment effect of 0.0096 cases per daily dose-equivalent (95% CI: 0.0053–0.0138). In 38,003 patient-days at intermediate probability of receipt, cefepime also increased subsequent CDI risk, with an estimated effect of 0.0074 more cases per daily dose-equivalent (95% CI: 0.0022–0.0126).
Conclusions:
Among common empiric antibiotics, parenteral vancomycin and cefepime appeared to increase risk of hospital-onset CDI. Causal inference observational study designs can be used to estimate patient-level harms of interventions such as empiric antimicrobials.
Introduction: EMS time factors such as total prehospital, activation, response, scene and transport intervals have been used as measures of EMS system quality, on the assumption that shorter EMS time factors save lives. The objective was to assess, in adults and children accessing ground EMS (population), whether operational time factors (intervention and control) were associated with survival at hospital discharge (outcome). Methods: Medline, EMBASE, and CINAHL were searched up to January 2015 for articles reporting original data associating EMS operational time factors with survival. Conference abstracts and non-English language articles were excluded. Two investigators independently assessed the candidate titles, abstracts, and full text, with discrepant reviews resolved by consensus. Risk of bias was assessed using GRADE. Results: A total of 10,151 abstracts were screened for potential inclusion, 199 articles were reviewed in full text, and 73 met inclusion criteria. Amongst included studies, 49 investigated response time, while 24 investigated other time factors. All articles were observational studies. Amongst the 14 (28.6%) studies where response time was the primary analysis, statistically significant associations between shorter response time and increased survival were found in 5 of 7 cardiac arrest studies, 1 of 5 general EMS population studies, and 0 of 2 trauma studies. Other time factors were reported in the primary analysis in 10 (41.7%) studies. One study reported that shorter combined scene and transport intervals were associated with increased survival in acute heart failure patients. Two studies in trauma patients had somewhat conflicting results, with one reporting shorter prehospital interval associated with increased survival, whereas the other reported increased survival associated with longer scene and transport intervals. Study design, analysis, and methodological quality varied considerably, and thus meta-analyses were not possible. Conclusion: There is a substantial body of literature describing the association between EMS time factors and survival, but the evidence informing these relationships is heterogeneous and complex. Important details such as patient population, EMS system characteristics, and analytical approach must be taken into consideration to appropriately translate these findings to practice. These results will be important for EMS leaders wishing to create evidence-based time policies.
Introduction: Outside of key conditions such as cardiac arrest and trauma, little is known about the epidemiology of mortality among all transported EMS patients. The objective of this study is to describe the characteristics of EMS patients who die in a health care facility after transport. Methods: EMS transport events over one year (April 2015–16) from a BLS/ALS system serving an urban/rural population of approximately 2 million were linked with in-hospital datasets to determine the proportion of all-cause in-hospital mortality by Medical Priority Dispatch System (MPDS) determinant (911 call triage system), age in years (>=18 yrs adult, <=17 yrs pediatric), sex, day of week, season, time (in six-hour periods), and emergency department Canadian Triage and Acuity Scale (CTAS). The MPDS card, patient chief complaint, and ED diagnosis category (International Classification of Diseases v.10 - Canadian) with the highest proportion of mortality are also reported. Analyses included two-sided t-tests or chi-square tests with alpha <0.05. Results: A total of 239,534 EMS events resulted in 159,507 patient transports; 141,114 were included for analysis after duplicate removal (89.1% linkage), with 127,867 reporting a final healthcare system outcome. There were 4,269 patients who died (3.3%; 95%CI 3.2%, 3.4%). The proportion of mortality by MPDS determinant, from most to least critical 911 call, was Echo (7.3%), Delta (37.2%), Charlie (31.3%), Bravo (5.8%), Alpha (18.3%), and Omega (0.3%). For adults, the mean age of survivors was lower than that of non-survivors (57.7 vs. 75.8; p<0.001), but pediatric survivors were older than non-survivors (8.7 vs. 2.8; p<0.001). More males died than females (53.0% vs. 47.0%; p<0.001). There was no statistically significant difference by day of week (p=0.592), but there was by season, with the highest mortality in winter (27.1%; p=0.045). The highest mortality occurred among patients presenting to EMS between 0600-1200 hours (34.6%), and the lowest between 0000-0600 hours (11.8%; p<0.001). Mortality by CTAS was category 1 (27.1%), 2 (36.7%), 3 (29.9%), 4 (4.3%), and 5 (0.5%). The highest mortality was seen in MPDS card 26-Sick Person (specific diagnosis) (19.1%), chief complaint shortness of breath (19.3%), and ED diagnoses pertaining to the circulatory system (31.1%). Conclusion: Significant all-cause in-hospital mortality differences were found across event, patient, and clinical characteristics. These data provide foundational and hypothesis-generating knowledge regarding mortality in transported EMS patients that can be used to guide research and training. Future research should further explore the characteristics of those who access health care through the EMS system.
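As a worked check of the headline mortality figure, a normal-approximation interval for 4,269 deaths among the 127,867 transports with a final outcome reproduces the reported 3.3% (95% CI 3.2%, 3.4%). The abstract does not state which interval method was used, so this is only an illustrative reconstruction.

```python
# Worked check of the reported all-cause mortality proportion and its 95% CI,
# using a normal-approximation (Wald) interval for illustration only.
from math import sqrt

deaths, n = 4269, 127867            # transported patients with a final outcome
p = deaths / n                      # about 0.033 -> 3.3%
se = sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"{p:.1%} (95% CI {lo:.1%}, {hi:.1%})")   # 3.3% (95% CI 3.2%, 3.4%)
```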
Introduction: Community Paramedics (CPs) require access to timely blood analysis in the field to guide treatment and transport decisions. Point of care testing (POCT), as opposed to traditional laboratory analysis, may offer a solution, but limited research exists on CP POCT. The objective of this study is to compare the validity of two POCT devices (Abbott i-STAT® and Alere epoc®) and their use by CPs in the community. Methods: In a CP programme responding to 6,000 annual patient care events, a split-sample validation of POCT against traditional laboratory analysis for seven analytes (sodium, potassium, chloride, creatinine, hemoglobin, hematocrit, and glucose) was conducted on a consecutive sample of patients. The proportions of discrepant results between POCT and the laboratory were compared using a two-sample proportion test. Usability was analysed by a survey of CP experience, an expert heuristic evaluation of the devices, a review of device-logged errors, coded observations of POCT use during quality control testing, and a linear mixed-effects model of System Usability Scale (SUS) scores adjusted for CP clinical and POCT experience. Results: Of 1,649 CP calls for service screened for enrollment, 174 had a blood draw, with 108 patient care encounters (62.1%) enrolled from 73 participants. Participants had a mean age of 58.7 years (SD 16.3); 49% were female. In 4 of 646 (0.6%) individual comparisons, POCT reported a critical value that the laboratory did not, with no statistically significant difference in the number of discrepant critical values reported with epoc® compared to i-STAT®. There were no instances of the laboratory reporting a critical value when POCT did not. In 88 of 1,046 (8.4%) individual comparisons, the a priori defined acceptable difference between POCT and the laboratory was exceeded, occurring more often with epoc® (10.7%; 95%CI: 8.1%, 13.3%) than with i-STAT® (6.1%; 95%CI: 4.1%, 8.2%) (p=0.007). Eighteen of 19 CP surveys were returned, with 11/18 (61.1%) preferring i-STAT® over epoc®. The i-STAT® had a higher mean SUS score (higher usability) compared to the epoc® (84.0/100 vs. 59.6/100; p=0.011). Fewer field blood analysis device-logged errors occurred with i-STAT® (7.8%; 95%CI: 2.9%, 12.7%) compared to epoc® (15.5%; 95%CI: 9.3%, 21.7%), although this difference was not statistically significant (p=0.063). Conclusion: CP programs can expect valid results from POCT. Usability assessment suggests a preference for the i-STAT®.
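The key validity comparison can be reproduced approximately with a two-sample proportion test. The per-device counts below (56 of 523 for epoc® and 32 of 523 for i-STAT®) are inferred from the reported percentages rather than stated in the abstract, so treat this as an illustrative sketch.

```python
# Illustrative two-sample proportion test for the rate of comparisons exceeding
# the acceptable POCT-laboratory difference. Per-device counts are inferred
# from the reported percentages, not stated directly in the abstract.
from statsmodels.stats.proportion import proportions_ztest

exceed = [56, 32]       # comparisons exceeding the a priori acceptable difference
total = [523, 523]      # assumed individual comparisons per device
z, p = proportions_ztest(exceed, total)
print(f"epoc 10.7% vs i-STAT 6.1%: z = {z:.2f}, p = {p:.3f}")   # p close to 0.007
```

With these inferred counts the test returns a p-value of about 0.007, consistent with the reported result.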
A comparison of two electron microscopy techniques used to determine the polarity of GaN nanowires is presented. The techniques are convergent beam electron diffraction (CBED) in TEM mode and annular bright field (ABF) imaging in aberration-corrected STEM mode. Both measurements were made at nominally the same locations on a variety of GaN nanowires. In all cases the two techniques gave the same polarity result. An important aspect of the study was the calibration of the CBED pattern rotation relative to the TEM image. Three different microscopes were used for CBED measurements. For all three instruments there was a substantial rotation of the diffraction pattern (120° or 180°) relative to the image, which, if unaccounted for, would have resulted in incorrect polarity determination. The study also shows that structural defects such as inversion domains can be readily identified by ABF imaging, but may escape identification by CBED. The relative advantages of the two techniques are discussed.
Eco-efficiency is a useful guide to dairy farm sustainability analysis aimed at increasing output (physical or value added) and minimizing environmental impacts (EIs). Widely used partial eco-efficiency ratios (EIs per some functional unit, e.g. kg milk) can be problematic because (i) substitution possibilities between EIs are ignored, (ii) multiple ratios can complicate decision making and (iii) EIs are not usually associated with just the functional unit in the ratio’s denominator. The objective of this study was to demonstrate a ‘global’ eco-efficiency modelling framework dealing with issues (i) to (iii) by combining Life Cycle Analysis (LCA) data and the multiple-input, multiple-output production efficiency method Data Envelopment Analysis (DEA). With DEA, each dairy farm’s outputs and LCA-derived EIs are aggregated into a single, relative, bounded, dimensionless eco-efficiency score, thus overcoming issues (i) to (iii). A novelty of this study is that a model providing a number of additional desirable properties was employed, known as the Range Adjusted Measure (RAM) of inefficiency. Together, these properties make RAM advantageous over other DEA models; they are as follows. First, RAM is able to simultaneously minimize EIs and maximize outputs. Second, it indicates which EIs and/or outputs contribute the most to a farm’s eco-inefficiency. Third, it can be used to rank farms in terms of eco-efficiency scores. Thus, non-parametric rank tests can be employed to test for significant differences in eco-efficiency score ranks between different farm groups. An additional DEA methodology was employed to ‘correct’ the farms’ eco-efficiency scores for inefficiencies attributed to managerial factors. By removing managerial inefficiencies it was possible to detect differences in eco-efficiency between farms solely attributed to uncontrollable factors such as region. Such analysis is lacking in previous dairy studies combining LCA with DEA. RAM and the ‘corrective’ methodology were demonstrated with LCA data from French specialized dairy farms grouped by region (West France, Continental France) and feeding strategy (regardless of region). Mean eco-efficiency score ranks were significantly higher for farms with <10% and 10% to 30% maize than for farms with >30% maize in the total forage area before correcting for managerial inefficiencies. Mean eco-efficiency score ranks were higher for West than for Continental farms, but significantly higher only after correcting for managerial inefficiencies. These results helped identify the eco-efficiency potential of each region and feeding strategy and could therefore aid advisors and policy makers at farm or region/sector level. The proposed framework helped better measure and understand (dairy) farm eco-efficiency, both within and between different farm groups.
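For readers unfamiliar with the Range Adjusted Measure, the sketch below sets up the RAM linear programme under variable returns to scale, treating LCA-derived EIs as inputs to be minimized and farm outputs as outputs to be maximized. The farm data are hypothetical, all ranges are assumed to be positive, and the second-stage ‘correction’ for managerial inefficiency is not included.

```python
# Minimal sketch of the Range Adjusted Measure (RAM) of inefficiency under
# variable returns to scale, solved with scipy's linprog. Data are hypothetical.
import numpy as np
from scipy.optimize import linprog

def ram_efficiency(X: np.ndarray, Y: np.ndarray, o: int) -> float:
    """X: (n_farms, n_impacts) LCA-derived EIs; Y: (n_farms, n_outputs); o: farm index."""
    n, m = X.shape
    _, s = Y.shape
    Rx = X.max(axis=0) - X.min(axis=0)          # ranges used to normalise slacks
    Ry = Y.max(axis=0) - Y.min(axis=0)          # assumed strictly positive here

    # Decision variables: lambda (n), input slacks (m), output slacks (s).
    c = np.concatenate([np.zeros(n), -1.0 / ((m + s) * Rx), -1.0 / ((m + s) * Ry)])

    # sum_j lam_j x_ij + s_i = x_io ; sum_j lam_j y_rj - s_r = y_ro ; sum_j lam_j = 1
    A_eq = np.zeros((m + s + 1, n + m + s))
    A_eq[:m, :n] = X.T
    A_eq[:m, n:n + m] = np.eye(m)
    A_eq[m:m + s, :n] = Y.T
    A_eq[m:m + s, n + m:] = -np.eye(s)
    A_eq[-1, :n] = 1.0
    b_eq = np.concatenate([X[o], Y[o], [1.0]])

    res = linprog(c, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + m + s), method="highs")
    inefficiency = -res.fun                      # maximised normalised slack sum
    return 1.0 - inefficiency                    # RAM eco-efficiency score in [0, 1]

# Hypothetical example: 4 farms, 2 EIs (e.g. GHG, eutrophication), 1 output (milk).
X = np.array([[10.0, 5.0], [12.0, 7.0], [8.0, 6.0], [15.0, 9.0]])
Y = np.array([[100.0], [110.0], [95.0], [105.0]])
print([round(ram_efficiency(X, Y, o), 3) for o in range(4)])
```

Because both the EI slacks and the output slacks enter the objective, the score reflects simultaneous EI reduction and output expansion, and the individual normalised slacks indicate which EIs or outputs drive a farm's eco-inefficiency.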
A recent mixed-methods study on the state of emergency medical services (EMS) research in Canada led to the generation of nineteen actionable recommendations. As part of the dissemination plan, a survey was distributed to EMS stakeholders to determine the anticipated impact and feasibility of implementing these recommendations in Canadian systems.
Methods
An online survey explored both the implementation impact and the feasibility of each recommendation using a five-point scale. The sample consisted of participants from the Canadian National EMS Research Agenda study (published in 2013) and additional EMS research stakeholders identified through snowball sampling. Responses were analysed descriptively using medians and plotted on a matrix. Participants reported any planned or ongoing initiatives related to the recommendations, and any required or anticipated resources. Free-text responses were analysed with simple content analysis, collated by recommendation.
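A minimal sketch of the impact-feasibility matrix described above is shown below; the recommendation labels and median scores are placeholders, not the study's data.

```python
# Illustrative impact-feasibility matrix of median survey scores (placeholder data).
import matplotlib.pyplot as plt

medians = {"Rec 1": (4, 4), "Rec 2": (5, 3), "Rec 3": (4, 3)}   # (impact, feasibility)
for label, (impact, feasibility) in medians.items():
    plt.scatter(feasibility, impact)
    plt.annotate(label, (feasibility, impact), textcoords="offset points", xytext=(5, 5))
plt.xlabel("Median feasibility (1-5)")
plt.ylabel("Median anticipated impact (1-5)")
plt.xlim(0.5, 5.5)
plt.ylim(0.5, 5.5)
plt.show()
```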
Results
The survey was sent to 131 people, 94 (71.8%) of whom responded: 30 EMS managers/regulators (31.9%), 22 researchers (23.4%), 15 physicians (16.0%), 13 educators (13.8%), and 5 EMS providers (5.3%). Two recommendations (11%) had a median impact score of 4 (of 5) and feasibility score of 4 (of 5). Eight recommendations (42%) had an impact score of 5, with a feasibility score of 3. Nine recommendations (47%) had an impact score of 4 and a feasibility score of 3.
Conclusions
For most recommendations, participants scored the anticipated impact higher than the feasibility to implement. Ongoing or planned initiatives exist pertaining to all recommendations except one. All of the recommendations will require additional resources to implement.
The Bovine Respiratory Disease Coordinated Agricultural Project (BRD CAP) is a 5-year project funded by the United States Department of Agriculture (USDA), with an overriding objective to use the tools of modern genomics to identify cattle that are less susceptible to BRD. To do this, two large genome-wide association studies (GWAS) were conducted using a case-control design on preweaned Holstein dairy heifers and beef feedlot cattle. A health scoring system was used to identify BRD cases and controls. Heritability estimates for BRD susceptibility ranged from 19–21% in dairy calves to 29.2% in beef cattle when using numerical scores as a semi-quantitative definition of BRD. A GWAS analysis conducted on the dairy calf data showed that single nucleotide polymorphism (SNP) effects explained 20% of the variation in BRD incidence and 17–20% of the variation in clinical signs. These results represent a preliminary analysis of ongoing work to identify loci associated with BRD. Future work includes validation of the chromosomal regions and SNPs that have been identified as important for BRD susceptibility, fine mapping of chromosomes to identify causal SNPs, and integration of predictive markers for BRD susceptibility into genetic tests and national cattle genetic evaluations.
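The abstract does not detail the association testing, but a generic per-SNP case-control analysis of the kind used in GWAS can be sketched as a genotypic chi-square test on a 2 × 3 contingency table; the genotype counts below are purely illustrative.

```python
# Generic per-SNP case-control association test (genotypic chi-square), as
# commonly used in GWAS. Counts are illustrative, not BRD CAP data.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: BRD cases, controls; columns: genotype counts (AA, AB, BB) at one SNP.
table = np.array([[120, 230, 150],
                  [180, 240, 80]])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p:.2e}")
```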
Ground beetles common in temperate agroecosystems are predators of crop pests, including slugs (Gastropoda: Pulmonata). Salad green production in greenhouses during autumn and spring can be limited by damage due to slugs and other pests. Introducing ground beetles to greenhouses may help reduce damage and improve yields. In the laboratory, while arenas with only slugs produced nearly no harvestable leaves, the presence of Carabus nemoralis Müller (Coleoptera: Carabidae) increased the number and weight of harvestable leaves to 55% of the amount in control arenas (without slugs or beetles), in addition to reducing the number of slugs. In a second experiment, adult or second-instar Pterostichus melanarius (Illiger) (Coleoptera: Carabidae) were released into greenhouse mesocosms (75 cm diameter steel rings) containing salad greens and slugs. Neither adults nor larvae improved the number or weight of harvestable leaves at the first two harvests, and there was no evidence of slug consumption. Towards the end of the experiment, cutworms were common in the mesocosms and contributed to damage to the salad greens. Adult P. melanarius likely consumed some cutworms, resulting in small increases in salad green yields at the third harvest. Our results suggest that ground beetles should be further examined as part of an integrated approach to pest control in late- and early-season salad green production in greenhouses.
Ongoing intensification and specialisation of livestock production lead to increasing volumes of manure to be managed, which are a source of the greenhouse gases (GHGs) methane (CH4) and nitrous oxide (N2O). Net emissions of CH4 and N2O result from a multitude of microbial activities in the manure environment. Their relative importance depends not only on manure composition and local management practices with respect to treatment, storage and field application, but also on ambient climatic conditions. The diversity of livestock production systems, and their associated manure management, is discussed on the basis of four regional cases (Sub-Saharan Africa, Southeast Asia, China and Europe) with increasing levels of intensification and priorities with respect to nutrient management and environmental regulation. GHG mitigation options for production systems based on solid and liquid manure management are then presented, and potentials for positive and negative interactions between pollutants, and between management practices, are discussed. The diversity of manure properties and environmental conditions necessitates a modelling approach for improving estimates of GHG emissions and for predicting the effects of management changes for GHG mitigation, and requirements for such a model are discussed. Finally, we briefly discuss drivers for, and barriers against, the introduction of GHG mitigation measures for livestock production. There is no conflict between efforts to improve food and feed production and efforts to reduce GHG emissions from manure management. Growth in livestock populations is projected to occur mainly in intensive production systems where, for this and other reasons, the largest potentials for GHG mitigation may be found.
The functional properties of three Australian-grown faba bean genotypes (Nura, Rossa and TF(Ic*As)*483/13), including antioxidant and chemopreventative capacities as well as inhibitory effects on angiotensin-converting enzyme (ACE), α-glucosidase and pancreatic lipase, were investigated using an array of in vitro assays. Chromatograms from an on-line post-column derivatisation assay coupled with HPLC revealed active phenolics (a hump) in the coloured genotypes, which were lacking in the white-coloured breeding line, TF(Ic*As)*483/13. Roasting reduced the phenolic content and diminished antioxidant activity by 10–40 % as measured by the reagent-based assays (diphenylpicrylhydrazyl, 2,2′-azino-bis(3-ethylbenzthiazoline-6-sulphonic acid) and oxygen radical absorbance capacity) in all genotypes. A cell culture-based antioxidant activity assay (cellular antioxidant activity) showed an increase of activity in the coloured genotypes after roasting. Faba bean extracts demonstrated cellular protection against H2O2-induced DNA damage (assessed using RAW264.7 cells), and inhibited the proliferation of all human cancer cell lines (BL13, AGS, Hep G2 and HT-29) evaluated. However, the effect of faba bean extracts on the non-transformed human cells (CCD-18Co) was negligible. Flow cytometric analyses showed that faba bean extracts successfully induced apoptosis of HL-60 (acute promyelocytic leukaemia) cells. The faba bean extracts also exhibited ACE, α-glucosidase and pancreatic lipase inhibitory activities. Overall, extracts from Nura (buff-coloured) and Rossa (red-coloured) were comparable, while TF(Ic*As)*483/13 (white-coloured) contained the lowest phenolic content and exhibited the least antioxidant and enzyme inhibition activities. These results are important to promote the utilisation of faba beans in human diets for various health benefits.
Interventional cardiology procedures can involve potentially high doses of radiation to the patients. Stochastic effects of ionising radiation – radiation-induced cancers in the long term – may occur. We analysed clinical characteristics and dosimetric data in a population of patients undergoing interventional cardiology. In all, 1,591 patients who had undergone coronarography and/or angioplasty in the course of a year at the Saint-Gatien Clinic in Tours (France) were included. Information on patients’ individual clinical characteristics and Dose-Area Product values was collected. Organ doses to the lung, oesophagus, bone marrow and breast were mathematically evaluated. The median age of patients was 70 years. Their median cumulative dose-area product value was 48.4 Gy·cm2 for the whole year and the median effective dose was 9.7 mSv. The median organ doses were 41 mGy for the lung, 31 mGy for the oesophagus, 10 mGy for the bone marrow and 4 mGy for the breast. Levels of doses close to the heart appear to be rather high in the case of repeated interventional cardiology procedures. Clinical characteristics should be taken into account when planning epidemiological studies on potential radiation-induced cancers.
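As an illustrative check, the reported median effective dose is consistent with converting the median dose-area product using a generic coefficient of about 0.2 mSv per Gy·cm² for cardiac procedures; the abstract does not state its conversion method, so the coefficient below is an assumption.

```python
# Illustrative back-calculation of effective dose from dose-area product (DAP).
# The conversion coefficient is an assumed, procedure-dependent value, not the
# study's stated method.
dap_gy_cm2 = 48.4                  # median cumulative DAP over the year
coefficient = 0.20                 # assumed mSv per Gy·cm² for cardiac procedures
effective_dose_msv = dap_gy_cm2 * coefficient
print(f"{effective_dose_msv:.1f} mSv")   # about 9.7 mSv, matching the reported median
```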