Clostridioides difficile infection (CDI) may be misdiagnosed if testing is performed in the absence of signs or symptoms of disease. This study sought to support appropriate testing by estimating the impact of signs, symptoms, and healthcare exposures on pre-test likelihood of CDI.
Methods:
A panel of fifteen experts in infectious diseases participated in a modified UCLA/RAND Delphi study to estimate likelihood of CDI. Consensus, defined as agreement by >70% of panelists, was assessed via a REDCap survey. Items without consensus were discussed in a virtual meeting followed by a second survey.
Results:
All fifteen panelists completed both surveys (100% response rate). In the initial survey, consensus was present on 6 of 15 (40%) items related to risk of CDI. After panel discussion and clarification of questions, consensus (>70% agreement) was reached on all remaining items in the second survey. Antibiotics were identified as the primary risk factor for CDI and grouped into three categories: high-risk (likelihood ratio [LR] 7, 93% agreement among panelists in first survey), low-risk (LR 3, 87% agreement in first survey), and minimal-risk (LR 1, 71% agreement in first survey). Other major factors included new or unexplained severe diarrhea (e.g., ≥ 10 liquid bowel movements per day; LR 5, 100% agreement in second survey) and severe immunosuppression (LR 5, 87% agreement in second survey).
Conclusion:
Infectious disease experts concurred on the importance of signs, symptoms, and healthcare exposures for diagnosing CDI. The resulting risk estimates can be used by clinicians to optimize CDI testing and treatment.
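The likelihood ratios (LRs) above are panel point estimates; the sketch below shows the standard odds-form Bayesian update a clinician or decision-support tool could use to turn an LR into a post-test probability. The 5% pre-test probability and the helper function are illustrative assumptions, not values from the study.

```python
# Minimal sketch (not from the study): combining a pre-test probability of CDI
# with the panel's likelihood ratios (LRs). The 5% baseline is illustrative only.

def apply_likelihood_ratio(pretest_prob: float, lr: float) -> float:
    """Convert probability to odds, multiply by the LR, convert back to probability."""
    pretest_odds = pretest_prob / (1.0 - pretest_prob)
    posttest_odds = pretest_odds * lr
    return posttest_odds / (1.0 + posttest_odds)

if __name__ == "__main__":
    pretest = 0.05  # illustrative 5% baseline suspicion of CDI
    for factor, lr in [("high-risk antibiotics", 7),
                       ("new or unexplained severe diarrhea", 5),
                       ("severe immunosuppression", 5),
                       ("low-risk antibiotics", 3)]:
        print(f"{factor}: post-test probability "
              f"{apply_likelihood_ratio(pretest, lr):.0%}")
```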
This study aimed to assess the impact of clinical decision support (CDS) on improving the ordering of a multiplex gastrointestinal polymerase chain reaction (PCR) testing panel (“GI panel”).
Design:
Single-center, retrospective, before-after study.
Setting:
A tertiary care Veterans Affairs (VA) Medical Center providing inpatient, outpatient, and residential care.
Patients:
All patients tested with a GI panel between June 22, 2022 and April 20, 2023.
Intervention:
We designed a CDS questionnaire in the electronic medical record (EMR) to guide appropriate ordering of the GI panel. A “soft stop” reminder at the point of ordering prompted providers to confirm five appropriateness criteria: 1) documented diarrhea, 2) no recent receipt of laxatives, 3) C. difficile is not the leading suspected cause of diarrhea, 4) >14 days since a prior test, or >4 weeks since a prior positive test, and 5) duration of hospitalization <72 hours. The CDS was implemented in November 2022.
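As a hypothetical restatement of these five criteria in code (the actual CDS was an EMR questionnaire, not software written by us; the field names and data structure below are assumptions for illustration):

```python
# Hypothetical sketch of the five "soft stop" appropriateness criteria described
# above. Field names and the OrderContext structure are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class OrderContext:
    documented_diarrhea: bool
    recent_laxatives: bool
    cdiff_leading_suspect: bool
    days_since_prior_test: float | None       # None if never tested
    weeks_since_prior_positive: float | None  # None if no prior positive test
    hours_hospitalized: float

def gi_panel_appropriate(ctx: OrderContext) -> list[str]:
    """Return the list of unmet criteria; an empty list means the order passes."""
    unmet = []
    if not ctx.documented_diarrhea:
        unmet.append("no documented diarrhea")
    if ctx.recent_laxatives:
        unmet.append("recent laxative receipt")
    if ctx.cdiff_leading_suspect:
        unmet.append("C. difficile is the leading suspected cause")
    if ctx.days_since_prior_test is not None and ctx.days_since_prior_test <= 14:
        unmet.append("prior test within 14 days")
    if ctx.weeks_since_prior_positive is not None and ctx.weeks_since_prior_positive <= 4:
        unmet.append("prior positive test within 4 weeks")
    if ctx.hours_hospitalized >= 72:
        unmet.append("hospitalized 72 hours or longer")
    return unmet
```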
Results:
Compared to the pre-implementation period (n = 136), fewer tests were performed post-implementation (n = 92), with an incidence rate ratio (IRR) of 0.61 (p = 0.003). Inappropriate ordering based on laxative use or undocumented diarrhea decreased (IRR 0.37, p = 0.012 and IRR 0.25, p = 0.08, respectively). However, overall inappropriate ordering and outcome measures did not significantly differ before and after the intervention.
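For readers unfamiliar with the measure, the sketch below shows how an IRR is formed from test counts and observation time. The window lengths are placeholders, not the study's person-time; the reported IRR of 0.61 and its p-value come from the authors' own analysis, which the abstract does not specify.

```python
# Minimal sketch of an incidence rate ratio (IRR) from counts and observation time.
# Window lengths are placeholders; they are not the study's actual person-time.

def incidence_rate_ratio(events_post: int, time_post: float,
                         events_pre: int, time_pre: float) -> float:
    """Rate in the post-implementation period divided by rate in the pre period."""
    return (events_post / time_post) / (events_pre / time_pre)

# Placeholder example: equal-length windows, counts taken from the abstract.
print(incidence_rate_ratio(92, 150.0, 136, 150.0))  # ~0.68 with these placeholders
```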
Conclusions:
Implementation of CDS in the EMR decreased testing and inappropriate ordering based on use of laxatives or undocumented diarrhea. However, inappropriate ordering of tests overall remained high post-intervention, signaling the need for continued diagnostic stewardship efforts.
In total, 50 healthcare facilities completed a survey in 2021 to characterize changes in infection prevention and control and antibiotic stewardship practices. Notable findings include sustained surveillance for multidrug-resistant organisms but decreased use of human resource-intensive interventions compared to previous surveys in 2013 and 2018 conducted prior to the COVID-19 pandemic.
This SHEA white paper identifies knowledge gaps and challenges in healthcare epidemiology research related to coronavirus disease 2019 (COVID-19), with a focus on core principles of healthcare epidemiology. These gaps, revealed during the worst phases of the COVID-19 pandemic, are described in 10 sections: epidemiology, outbreak investigation, surveillance, isolation precaution practices, personal protective equipment (PPE), environmental contamination and disinfection, drug and supply shortages, antimicrobial stewardship, healthcare personnel (HCP) occupational safety, and return-to-work policies. Each section highlights three critical healthcare epidemiology research questions, with detailed descriptions provided in the supplementary materials. This research agenda calls for translational studies, from laboratory-based basic science research to well-designed, large-scale studies and health outcomes research. Research gaps and challenges related to nursing homes and social disparities are included. Collaboration across disciplines, areas of expertise, and geographic locations will be critical.
Introduction:
Acute aortic syndrome (AAS) is a time-sensitive aortic catastrophe that is often misdiagnosed. There are currently no Canadian guidelines to aid in diagnosis. Our goal was to adapt the existing American Heart Association (AHA) and European Society of Cardiology (ESC) diagnostic algorithms for AAS into a Canadian evidence-based best-practice algorithm targeted at emergency medicine physicians.
Methods:
We chose to adapt existing high-quality clinical practice guidelines (CPGs) previously developed by the AHA/ESC using the GRADE ADOLOPMENT approach. We created a National Advisory Committee of 21 members from across Canada, including academic, community, and remote/rural emergency physicians and nurses, cardiothoracic and cardiovascular surgeons, cardiac anesthesiologists, critical care physicians, cardiologists, radiologists, and patient representatives. The Advisory Committee communicated through multiple teleconference meetings, emails, and a one-day in-person meeting. The panel prioritized questions and outcomes, using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to assess evidence and make recommendations. The algorithm was prepared and revised through feedback and discussion in an iterative process until consensus was achieved.
Results:
The diagnostic algorithm comprises an updated pre-test probability assessment tool with further testing recommendations based on risk level. The updated tool incorporates the likelihood of an alternative diagnosis and point-of-care ultrasound. The final best-practice diagnostic algorithm defined three risk levels: Low (≤0.5%: no further testing), Moderate (0.6-5%: further testing required), and High (>5%: computed tomography, magnetic resonance imaging, or transesophageal echocardiography). During the consensus and feedback processes, we addressed a number of issues and concerns. D-dimer can be used to reduce the probability of AAS in the intermediate-risk group but should not be used in the low- or high-risk groups. Ultrasound was incorporated as a bedside clinical examination option in pre-test probability assessment for aortic insufficiency and abdominal/thoracic aortic aneurysms.
Conclusion:
We have created the first Canadian best-practice diagnostic algorithm for AAS. We hope this diagnostic algorithm will standardize and improve the diagnosis of AAS in all emergency departments across Canada.
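The sketch below restates the algorithm's risk tiers as simple triage logic. The thresholds are taken from the abstract; the function, its inputs, and the wording of the recommendations are illustrative assumptions, not the published algorithm.

```python
# Hypothetical restatement of the risk tiers described above. Thresholds come
# from the abstract; the function and messages are illustrative only.

def aas_triage(pretest_probability: float, d_dimer_negative: bool | None = None) -> str:
    """Map an estimated pre-test probability of AAS (0-1) to a testing recommendation."""
    if pretest_probability <= 0.005:                 # Low (<=0.5%)
        return "Low risk: no further testing for AAS"
    if pretest_probability <= 0.05:                  # Moderate (0.6-5%)
        # Per the algorithm, D-dimer is used only in this intermediate group.
        if d_dimer_negative:
            return "Moderate risk, negative D-dimer: probability of AAS reduced"
        return "Moderate risk: further testing required (e.g., D-dimer)"
    return ("High risk (>5%): advanced imaging "      # High (>5%)
            "(CT, MRI, or transesophageal echocardiography)")
```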
The contemporary relevance of archaeology would be greatly enhanced if archaeologists could develop theory that frames human societies of all scales in the same terms. We present evidence that an approach known as settlement scaling theory can contribute to such a framework. The theory proposes that a variety of aggregate socioeconomic properties of human networks emerge from individuals arranging themselves in space so as to balance the costs of movement with the benefits of social interactions. This balancing leads to settlements that concentrate human interactions and their products in space and time in an open-ended way. The parameters and processes embedded in settlement scaling models are very basic, and this suggests that scaling phenomena should be observable in the archaeological record of middle-range societies just as readily as they have been observed in contemporary first-world nations. In this paper, we show that quantitative scaling relationships observed for modern urban systems, and more recently for early civilizations, are also apparent in settlement data from the Central Mesa Verde and northern Middle Missouri regions of North America. These findings suggest that settlement scaling theory may help increase the practical relevance of archaeology for present-day concerns.
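Settlement scaling analyses of this kind typically test a power-law relation Y = Y0 * N^beta between settlement population N and an aggregate quantity Y (for example, settled area), fit as a straight line in log-log space. The sketch below illustrates that standard fit on synthetic data; the exponent, sample size, and noise level are arbitrary choices for the example, not the paper's datasets or results.

```python
# Illustrative fit of a scaling relation Y = Y0 * N**beta on synthetic data
# (not the Central Mesa Verde or Middle Missouri datasets used in the paper).
import numpy as np

rng = np.random.default_rng(0)
population = rng.uniform(20, 2000, size=100)          # synthetic settlement sizes
true_beta, true_Y0 = 5.0 / 6.0, 1.5                   # arbitrary values for the example
area = true_Y0 * population**true_beta * rng.lognormal(sigma=0.2, size=100)

# Ordinary least squares on log-transformed values: log Y = log Y0 + beta * log N
beta, log_Y0 = np.polyfit(np.log(population), np.log(area), deg=1)
print(f"estimated beta ~= {beta:.2f}, Y0 ~= {np.exp(log_Y0):.2f}")
```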
Excellent protection from injury by 2-chloro-4,6-bis(ethylamino)-s-triazine (simazine) applied at twice the dose needed for weed control was obtained by dipping the roots of strawberry plants (Fragaria grandiflora Ehrh.) in a 10% slurry of activated carbon before transplanting. Protection was greater when the roots were dipped in the slurry of activated carbon than when activated carbon was applied in the transplant water. Protection was obtained with three different activated carbons. Varying degrees of protection from injury were also observed with several other herbicides.
Parasites of the genera Plasmodium and Haemoproteus (Apicomplexa: Haemosporida) are a diverse group of pathogens that infect birds nearly worldwide. Despite their ubiquity, the ecological and evolutionary factors that shape the diversity and distribution of these protozoan parasites among avian communities and geographic regions are poorly understood. Based on a survey throughout the Neotropics of the haemosporidian parasites infecting manakins (Pipridae), a family of passerine birds endemic to this region, we asked whether host relatedness, ecological similarity, and geographic proximity structure parasite turnover between manakin species and among local manakin assemblages. We used molecular methods to screen 1343 individuals of 30 manakin species for the presence of parasites. We found no significant correlation between manakin parasite lineage turnover and either manakin species turnover or geographic distance. Climate differences, species turnover in the larger bird community, and parasite lineage turnover in non-manakin hosts also did not correlate with manakin parasite lineage turnover. We likewise found no evidence that manakin parasite lineage turnover among host species correlates with range overlap or genetic divergence among hosts. Our analyses indicate that host switching (turnover among host species) and dispersal (turnover among locations) of haemosporidian parasites in manakins are not constrained at this scale.
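Turnover-versus-distance comparisons of this kind are often evaluated with a Mantel-style permutation test on pairwise dissimilarity matrices. The sketch below shows that generic procedure; it is offered only as an illustration of the comparison being described and is not the authors' analysis pipeline or data.

```python
# Generic Mantel-style permutation test on two pairwise dissimilarity matrices
# (illustration only; not the manakin parasite dataset or the authors' code).
import numpy as np

def mantel(dist_a: np.ndarray, dist_b: np.ndarray, n_perm: int = 999, seed: int = 0):
    """Correlation between the upper triangles of two distance matrices, with a
    two-sided permutation p-value from shuffling rows/columns of dist_b."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(dist_a, k=1)
    observed = np.corrcoef(dist_a[iu], dist_b[iu])[0, 1]
    n = dist_a.shape[0]
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(n)
        permuted = dist_b[np.ix_(perm, perm)]
        if abs(np.corrcoef(dist_a[iu], permuted[iu])[0, 1]) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_perm + 1)
```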
The history of the feed industry is pertinent to understanding how and why certain practices have evolved. Some of these practices, for example the traditional use of antibiotics in feed, have been superseded by modern, more natural alternatives. In other cases, such as inorganic minerals, more natural versions akin to those found in plant and animal materials are available, although these newer initiatives are still being taken up globally. Research continues to increase our knowledge and understanding of nutrient balance and digestion, and in some species this is more advanced than in others. This paper presents the first complete history of the feed industry and its major milestones, and projects how the industry might continue to utilise new technological developments to improve animal feeding practices.
From the early 1600s, when tobacco exports literally saved the struggling Jamestown settlement, to the January 4, 1980, embargo of grain to the Soviet Union, food and agriculture have played varied roles in the international affairs of the U.S., that is, in the political, military, economic, and cultural exchanges that affect the power of the U.S. relative to other sovereign nations. Food donations have been used as a humanitarian gesture to avert starvation. Food export embargoes have been used as weapons against foreign adversaries and as responses to domestic scarcities. Food pledges have been used to promote international food aid conventions. Food import quotas have been reallocated to reward friendly nations and penalize unfriendly ones. U.S. food shipments have been used to feed Allied soldiers and to barter for strategic materials. Food exports have been used to bolster the domestic economy and strengthen the dollar.
The majority of decisions concerning the investment and allocation of public funds for agricultural research, extension, and teaching (RET) are made at the state level, while most quantitative RET evaluations are conducted on a national basis. This paper illustrates an approach for conducting a disaggregated state-level evaluation of agricultural research, extension, and teaching. Ridge regression is employed to handle multicollinearity problems.
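The ridge estimator has the standard closed form beta_hat = (X'X + kI)^-1 X'y, which shrinks coefficients to stabilize them when predictors are collinear. The sketch below shows that estimator on synthetic, deliberately collinear data; the variable names, data, and ridge parameter are illustrative, not taken from the paper.

```python
# Closed-form ridge regression on synthetic, deliberately collinear predictors
# (illustration only; not the paper's RET data or its chosen ridge parameter).
import numpy as np

rng = np.random.default_rng(1)
n = 60
research = rng.normal(size=n)
extension = 0.95 * research + 0.05 * rng.normal(size=n)   # nearly collinear with research
teaching = rng.normal(size=n)
X = np.column_stack([np.ones(n), research, extension, teaching])
y = 2.0 + 1.0 * research + 1.0 * extension + 0.5 * teaching + rng.normal(scale=0.5, size=n)

def ridge(X: np.ndarray, y: np.ndarray, k: float) -> np.ndarray:
    """beta_hat = (X'X + kI)^-1 X'y; k = 0 reduces to ordinary least squares.
    For simplicity this sketch penalizes the intercept column as well."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

print("OLS   :", ridge(X, y, k=0.0).round(2))   # unstable under collinearity
print("ridge :", ridge(X, y, k=1.0).round(2))   # shrunken, more stable estimates
```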
Tall fescue toxicosis adversely affects calving rate and weight gains, reducing returns to cow-calf producers in the south-central United States. This grazing study estimated animal and economic performance implications of endophyte-infected fescue and calving season. Establishing novel endophyte-infected tall fescue on 25% of pasture acres resulted in improved calving rates (87% vs. 70%), weaning weights (532 lbs vs. 513 lbs), and partial returns per acre ($257 vs. $217). Additionally, fall-calving cows had higher calving rates (91% vs. 67%), weaning weights (550 lbs vs. 496 lbs), and partial returns per acre ($269 vs. $199) than spring-calving cows.
Genetic improvement is easy when selecting for one heritable and well-recorded trait at a time. Many industrialised national dairy herds have overall breeding indices that incorporate a range of traits balanced by their known or estimated economic value. Future breeding goals will contain more non-production traits and, in the context of this paper, traits associated with human health and cow robustness. The definition of robustness and the traits used to predict it are currently fluid; however, the use of mid-infrared reflectance spectroscopic analysis of milk will help to create new phenotypes on a large scale that can be used to improve the human health characteristics of milk and the robustness of the cows producing it. This paper describes the state of the art in breeding strategies that include animal robustness (mainly energy status) and milk quality (as described by milk fatty acid profile), with particular emphasis on the research results generated by the FP7-funded RobustMilk project.