Mandatory thresholds for the accuracy of reported energy on food and beverage product labels do not exist in many countries. Accurate nutrition information is essential for ensuring nutritional adequacy among hospital patients. The aim of this study was to compare direct measures of energy of nutritional fluids provided in hospitals to values determined via manufacturers’ specifications. Nutritional fluids were identified as any liquid provided to hospital patients orally, enterally or parenterally, to deliver nutrition. These were categorised into six groups aligned to food/medical standards, including (1) local recipes, (2) pre-packaged general fluids, (3) supplementary fluids, (4) prescribed nutrition fluids – thickened, (5) prescribed nutrition fluids – oral/enteral and (6) prescribed medical nutrition – intravenous (IV) and parenteral. An equivalence testing statistical approach (±10 % thresholds) was used to compare energy values derived directly via bomb calorimetry against those obtained from manufacturer specifications. A total of sixty-nine fluids were measured. One-fifth (n 14) exhibited non-equivalent energy values, with the majority of these (n 11; 79 %) likely to contain less energy than that calculated from reported values. Almost all (34/35; 97 %) prescribed nutrition fluids (oral/enteral (20/20; 100 %), IV and parenteral (7/7; 100 %) and thickened fluid (7/8; 88 %) products were equivalent. In contrast, only 21/34 (62 %) non-prescribed fluids (local recipes (2/11; 18 %), supplementary fluids (4/5; 80 %) and pre-packaged general fluid (15/18; 83 %) products) demonstrated equivalence. Energy content of nutritional fluids prescribed to hospital patients typically aligns with manufacturers’ values. Consumption of non-prescribed fluids may result in lower energy intakes than expected.
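The equivalence-testing approach described above (±10 % thresholds) is commonly implemented as a two one-sided tests (TOST) procedure. A minimal sketch in Python, using hypothetical replicate calorimetry readings and a hypothetical label value rather than the study's data:

```python
import numpy as np
from scipy import stats

def tost_equivalence(measured, label_value, margin=0.10, alpha=0.05):
    """Two one-sided tests (TOST): is the mean measured energy
    statistically within +/- margin of the manufacturer label value?"""
    low, high = label_value * (1 - margin), label_value * (1 + margin)
    measured = np.asarray(measured, dtype=float)
    n = measured.size
    se = measured.std(ddof=1) / np.sqrt(n)
    # One-sided t-statistics against each equivalence bound
    t_low = (measured.mean() - low) / se    # H0: mean <= lower bound
    t_high = (measured.mean() - high) / se  # H0: mean >= upper bound
    p_low = 1 - stats.t.cdf(t_low, df=n - 1)
    p_high = stats.t.cdf(t_high, df=n - 1)
    # Equivalence is declared only if BOTH one-sided tests reject
    return bool(max(p_low, p_high) < alpha)

# Hypothetical replicate bomb-calorimetry readings (kJ/100 mL)
# compared against a label value of 420 kJ/100 mL
readings = [415.0, 422.0, 418.0, 419.5, 417.0]
print(tost_equivalence(readings, 420.0))  # True: within +/-10 % bounds
```

Note the design choice: failing to reject a standard difference test does not establish equivalence, which is why both one-sided tests must reject before a fluid is declared equivalent to its label.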
Background: During the COVID-19 pandemic, rates of central line-associated bloodstream infections (CLABSI) increased nationally. Pre-pandemic studies showed improved CLABSI rates with implementation of a standardized vascular access team (VAT). Varying VAT resources and coverage existed in our 10 acute care facilities (ACF) prior to and during the pandemic. VAT scope also varied in 1) the process for line selection during initial placement, 2) the ability to place a peripherally inserted central catheter (PICC), midline or ultrasound-guided peripheral IV in patients with difficult vascular access, 3) ownership of daily assessment of central line (CL) necessity, and 4) routine CL dressing changes. We aimed to define and implement the ideal VAT structure and evaluate the impact on CLABSI standardized infection ratios (SIR) and rates prior to and during the pandemic. Methods: A multidisciplinary workgroup including representatives from nursing, infection prevention, and vascular access was formed to understand the current state of VAT responsibilities across all ACFs. The group identified key responsibilities a VAT should conduct to aid in CLABSI prevention. Complete VAT coverage was defined as the ability to conduct the identified responsibilities daily. We compared SIR and CLABSI rates between hospitals with complete VAT (CVAT) coverage and hospitals with incomplete VAT (IVAT) coverage. Because this work occurred during the pandemic, we further stratified our analysis into a pre-pandemic time frame (1/2015 - 12/2019) and an intra-pandemic time frame (1/2020 - 12/2022). Results: The multidisciplinary team identified 6 key components of complete VAT coverage: assessment for appropriate line selection prior to insertion, ability to insert PICCs and midlines, daily CL and midline care and maintenance assessments, daily assessment of CL necessity, and weekly dressing changes for CLs and midlines.
A crosswalk of VAT scope (Figure 1) was performed in October 2022, which revealed two facilities (A and E) that met CVAT criteria. Pre-pandemic, although IVAT CLABSI rates and SIR were higher than in CVAT facilities, the difference was not statistically significant. During the pandemic, however, CLABSI rates and SIR were 40-50% higher in IVAT than in CVAT facilities (incidence rate ratio 1.5, 95% CI 1.1-2.0; SIR relative ratio 1.4, 95% CI 1.1-1.9, respectively) (Table 1). Conclusions: CLABSI rates were lower in facilities with complete VAT coverage prior to and during the COVID-19 pandemic, suggesting that a highly functioning VAT can aid in preventing CLABSIs, especially when a healthcare system is stressed and resources are limited.
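An incidence rate ratio of the kind reported above, with a Wald confidence interval computed on the log scale, can be sketched as follows. The counts and central-line days here are hypothetical, chosen only to illustrate the calculation, not the study's data:

```python
import math

def incidence_rate_ratio(cases_a, days_a, cases_b, days_b, z=1.96):
    """Incidence rate ratio (group A vs. group B) with a Wald 95% CI.
    The standard error of log(IRR) is sqrt(1/cases_a + 1/cases_b)."""
    irr = (cases_a / days_a) / (cases_b / days_b)
    se = math.sqrt(1 / cases_a + 1 / cases_b)
    lo = math.exp(math.log(irr) - z * se)
    hi = math.exp(math.log(irr) + z * se)
    return irr, lo, hi

# Hypothetical CLABSI counts per central-line days for two facility groups
irr, lo, hi = incidence_rate_ratio(90, 60_000, 50, 50_000)
print(f"IRR {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # IRR 1.50 (95% CI 1.06-2.12)
```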
Methicillin-resistant Staphylococcus aureus (MRSA) is a common etiology of hospital-acquired infections (HAIs). One strategy to reduce HAIs due to MRSA involves a multistep decolonization process, often including nasal application of mupirocin 2% ointment. In our institution, when individuals meet criteria for decolonization, we recommend 5 days of treatment given twice daily. High levels of mupirocin resistance have been reported in some hospital systems, with >80% of tested isolates being resistant. To better understand our resistance levels, we selected 238 MRSA isolates from blood cultures to be tested for mupirocin resistance and correlated the presence of resistance with use of mupirocin for decolonization. We chose to assess MRSA blood isolates rather than nasal swabs because the goal of decolonization is to prevent invasive MRSA infections, including bloodstream infections. The blood cultures were collected from 11 acute-care facilities within our system from March 2021 through June 2022. High-level resistance was defined as an MIC >1,024 μg/mL according to Clinical and Laboratory Standards Institute guidelines. Of the 238 isolates, 7.14% showed high-level resistance; 76.47% of these occurred in patients who had been exposed to mupirocin and 23.53% in those without mupirocin exposure (P = .0094). On average, those with high-level resistance had more recent exposure to mupirocin than those without resistance, a statistically significant difference. Those with high-level resistance also received, on average, more doses of mupirocin, although this difference was not statistically significant. Conclusions: More recent exposure and a higher number of doses of mupirocin were associated with the development of resistance, consistent with the pharmacodynamics of antibiotic resistance seen with other agents. These findings may be particularly important for patients who have frequent hospitalizations and often require decolonization.
Understanding baseline mupirocin resistance levels in an institution can assist with determining decolonization strategies.
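The association between mupirocin exposure and high-level resistance is the kind of 2x2 comparison typically assessed with Fisher's exact test. A sketch with hypothetical counts (the cell values below are illustrative assumptions, not the study's table or its reported P value):

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 contingency table (illustrative counts only):
# rows: mupirocin-exposed / not exposed
# cols: high-level resistant / not high-level resistant
table = [[13, 95],
         [4, 126]]

odds_ratio, p_value = fisher_exact(table)  # two-sided by default
print(f"OR {odds_ratio:.2f}, p = {p_value:.4f}")
```

Fisher's exact test is the usual choice here because some cells (high-level resistant isolates) are small enough that a chi-square approximation may be unreliable.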
An examination of invasive procedure cancellations found that, for children with congenital heart disease, the lack of pre-procedural oral screening was a preventable cause. The purpose of this study was to implement an oral screening tool within the paediatric cardiology clinic, with referral to paediatric dental providers for positive screens. The target population was children aged ≥6 months to <18 years old being referred for cardiac procedures.
Methods:
The quality implementation framework method was used for this study design. The multi-modal intervention included education, audit and feedback, screening guidelines, environmental support, and interdisciplinary collaboration. Baseline rates for oral screenings were determined by retrospective chart audit from January 2018 to January 2019 (n = 211). Provider adherence to the oral screening tool was the outcome measure. Positive oral screens, resulting in referral to the paediatric dental clinic, were measured as a secondary outcome. Provider adherence rates were used as a process measure.
Results:
Data collected over 14 weeks showed a 29% increase in documentation of oral screenings prior to referral, as compared to the retrospective chart audit. During the study period, 13% of completed screenings were positive (n = 5). Provider compliance for the period was averaged at 70% adherence.
Conclusion:
A substantial increase in pre-procedural oral screenings by paediatric cardiologists was achieved using the quality implementation framework and targeted interventions.
No evidence-based therapy for borderline personality disorder (BPD) has demonstrated clear superiority over the others. However, BPD is highly heterogeneous, and different patients may specifically benefit from the interventions of a particular treatment.
Methods
From a randomized trial comparing a year of dialectical behavior therapy (DBT) to general psychiatric management (GPM) for BPD, long-term (2-year-post) outcome data and patient baseline variables (n = 156) were used to examine individual and combined patient-level moderators of differential treatment response. A two-step bootstrapped and partially cross-validated moderator identification process was employed for 20 baseline variables. For identified moderators, 10-fold bootstrapped cross-validated models estimated response to each therapy, and long-term outcomes were compared for patients randomized to their model-predicted optimal v. non-optimal treatment.
Results
Significant moderators surviving the two-step process included psychiatric symptom severity, BPD impulsivity symptoms (both GPM > DBT), dependent personality traits, childhood emotional abuse, and social adjustment (all DBT > GPM). Patients randomized to their model-predicted optimal treatment had significantly better long-term outcomes (d = 0.36, p = 0.028), especially if the model had a relatively stronger (top 60%) prediction for that patient (d = 0.61, p = 0.004). Among patients with a stronger prediction, this advantage held even when applying a conservative statistical check (d = 0.46, p = 0.043).
Conclusions
Patient characteristics influence the degree to which they respond to two treatments for BPD. Combining information from multiple moderators may help inform providers and patients as to which treatment is the most likely to lead to long-term symptom relief. Further research on personalized medicine in BPD is needed.
Individuals with posttraumatic stress disorder (PTSD) are at increased risk of various chronic diseases. One hypothesized pathway is via changes in diet quality. This study evaluated whether PTSD was associated with deterioration in diet quality over time.
Methods
Data were from 51 965 women in the Nurses' Health Study II PTSD sub-study followed over 20 years. Diet, assessed at 4-year intervals, was characterized via the Alternative Healthy Eating Index-2010 (AHEI). Based on information from the Brief Trauma Questionnaire and Short Screening Scale for DSM-IV PTSD, trauma/PTSD status was classified as no trauma exposure, prevalent exposure (trauma/PTSD onset before study entry), or new-onset (trauma/PTSD onset during follow-up). We further categorized women with prevalent exposure as having trauma with no PTSD symptoms, trauma with low PTSD symptoms, and trauma with high PTSD symptoms, and created similar categories for women with new-onset exposure, resulting in seven comparison groups. Multivariable linear mixed-effects spline models tested differences in diet quality changes by trauma/PTSD status over follow-up.
Results
Overall, diet quality improved over time regardless of PTSD status. In age-adjusted models, compared to those with no trauma, women with prevalent high PTSD and women with new-onset high PTSD symptoms had 3.3% and 3.6% lower improvement in diet quality, respectively, during follow-up. Associations remained consistent after adjusting for health conditions, sociodemographics, and behavioral characteristics.
Conclusions
PTSD is associated with less healthy changes in overall diet quality over time. Poor diet quality may be one pathway linking PTSD with a higher risk of chronic disease development.
Alteplase is an effective treatment for ischaemic stroke patients, and it is widely available at all primary stroke centres. The effectiveness of alteplase is highly time-dependent. Large tertiary centres have reported significant improvements in their door-to-needle (DTN) times. However, these same improvements have not been reported at community hospitals.
Methods
Red Deer Regional Hospital Centre (RDRHC) is a 370-bed community hospital that serves approximately 150,000 people in its acute stroke catchment area. The RDRHC participated in a provincial DTN improvement initiative and implemented a streamlined algorithm for the treatment of stroke patients. During this intervention period, they implemented the following changes: early alert of an incoming acute stroke patient to the neurologist and care team, meeting the patient immediately upon arrival, parallel work processes, keeping the patient on the Emergency Medical Services stretcher for transport to the CT scanner, and administering alteplase in the imaging area. Door-to-needle data were collected from July 2007 to December 2017.
Results
A total of 289 patients were treated from July 2007 to December 2017. In the pre-intervention period, 165 patients received alteplase and the median DTN time was 77 minutes [interquartile range (IQR): 60–103 minutes]; in the post-intervention period, 104 patients received alteplase and the median DTN time was 30 minutes (IQR: 22–42 minutes) (p < 0.001). The annual number of patients that received alteplase increased from 9 to 29 in the pre-intervention period to annual numbers of 41 to 63 patients in the post-intervention period.
Conclusion
Community hospitals staffed with community neurologists can achieve median DTN times of 30 minutes or less.
To characterize the current state of Canadian emergency medicine (EM) resident research and develop recommendations to promote excellence in this area.
Methods
We performed a systematic review of MEDLINE, Embase, and ERIC using search terms relevant to EM resident research. We conducted an online survey of EM residency program directors from the Royal College of Physicians and Surgeons of Canada (RCPSC) and College of Family Physicians of Canada (CFPC). An expert panel reviewed these data, presented recommendations at the Canadian Association of Emergency Physicians 2014 Academic Symposium, and refined them based on feedback received.
Results
Of 654 potentially relevant citations, 35 articles were included. These were categorized into four themes: 1) expectations and requirements, 2) training and assessment, 3) infrastructure and support, and 4) dissemination. We received 31 responses from all 31 RCPSC-EM and CFPC-EM programs. The majority of EM programs reported requiring a resident scholarly project; however, we found wide-ranging expectations for the type of resident research performed and how results were disseminated, as well as the degree of completion expected. Although 93% of RCPSC-EM programs reported providing formal training on how to conduct research, only 53% of CFPC-EM programs reported doing so. Almost all programs (94%) reported having infrastructure in place to support resident research, but the nature of support was highly variable. Finally, there was marked variability regarding the number of resident-published abstracts and manuscripts.
Conclusions
Based on the literature, our national survey, and discussions with stakeholders, we offer 14 recommendations encompassing goals, expectations, training, assessment, infrastructure, and dissemination in order to improve Canadian EM resident research.
Determining which patients with ureterolithiasis are likely to require urologic intervention is a common challenge in the emergency department (ED). The objective was to determine whether a normal renal sonogram could identify low-risk renal colic patients, defined as those not requiring urologic intervention within 90 days of their initial ED visit, who could be managed conservatively.
Methods
This was a prospective cohort study involving adult patients presenting to the EDs of a tertiary care centre with suspected renal colic over a 20-month period. Renal ultrasonography (US) was performed in the diagnostic imaging department by trained ultrasonographers, and the results were categorized into four mutually exclusive groups: normal, suggestive of ureterolithiasis, visualized ureteric stone, or findings unrelated to urolithiasis. Electronic medical records were reviewed to determine if patients received urologic intervention within 90 days of their ED visit.
Results
Of 610 patients enrolled, 341 (55.9%) had US for suspected renal colic. Of those, 105 (30.8%) were classified as normal; none of these patients underwent urologic intervention within 90 days of their ED visit. Ninety (26.4%) US results were classified as suggestive, and nine (10%) patients received urologic intervention. A total of 139 (40.8%) US results were classified as visualized ureteric stone, and 34 (24.5%) patients had urologic intervention. Seven (2.1%) US results were classified as findings unrelated to urolithiasis, and none of these patients required urologic intervention. The rate of urologic intervention was significantly lower in those with normal US results (p<0.001) than in those with abnormal findings.
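The low-risk claim rests on zero interventions among the 105 patients with normal scans. Even with zero observed events, an exact one-sided upper confidence bound on the true intervention rate can be computed (the Clopper-Pearson bound for zero events, closely approximated by the "rule of three", 3/n). A minimal sketch:

```python
def zero_event_upper_bound(n, conf=0.95):
    """Exact one-sided upper confidence bound for a proportion when
    0 events are observed in n trials (Clopper-Pearson reduces to
    1 - alpha**(1/n) in the zero-event case)."""
    alpha = 1 - conf
    return 1 - alpha ** (1 / n)

# 0 urologic interventions among 105 patients with a normal scan
print(f"{zero_event_upper_bound(105):.3%}")  # ~2.8%, close to 3/105
```

So even a result of 0/105 is consistent, at the 95% level, with a true intervention rate as high as roughly 2.8%, which is worth keeping in mind when interpreting "none of these patients underwent urologic intervention."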
Conclusion
A normal renal sonogram predicts a low likelihood for urologic intervention within 90 days for adult ED patients with suspected renal colic.
This article is an executive summary of a report from the Centers for Disease Control and Prevention Ventilator-Associated Pneumonia Surveillance Definition Working Group, entitled “Developing a new, national approach to surveillance for ventilator-associated events” and published in Critical Care Medicine. The full report provides a comprehensive description of the Working Group process and outcome.
In September 2011, the Centers for Disease Control and Prevention (CDC) convened a Ventilator-Associated Pneumonia (VAP) Surveillance Definition Working Group to organize a formal process for leaders and experts of key stakeholder organizations to discuss the challenges of VAP surveillance definitions and to propose new approaches to VAP surveillance in adult patients (Table 1).
Woodland bird communities are immensely variable in the number and composition of species and in overall density. Some of this variation is essentially biogeographic. For example, the species pool in most taxonomic and ecological avian groups increases from Ireland through to central Europe (Fuller et al., 2007a). At more local scales, variation is driven mainly by environmental attributes that influence the resources available and consequently determine the fitness of individual birds within habitat patches (Holmes, 1990; Chapter 2).
The context for this chapter is long-established woodland in landscapes that have been heavily populated and modified by people for hundreds, even thousands, of years. These woods are predominantly broadleaved, often with a recently introduced coniferous element. Mountain and conifer forests lie outside the scope of the chapter, but for a discussion of northern conifer forests see Chapter 19. In western Europe, a long history of human-related disturbance has produced woodland that is far removed from any ‘natural’ state. Historical interactions between socio-economic processes and environmental factors have produced great diversity of woodland types of varying habitat quality for birds. Regional traditions, differences in management systems and markets, spatial variation in grazing pressure, even neglect, all contribute to this heterogeneity. Whilst some heavily wooded landscapes have persisted, much woodland exists merely as fragments in agricultural landscapes and its plant and animal communities are strongly affected by the surroundings (Chapters 4–6).
The objective of this study was to assess medical students' knowledge of and attitudes toward the two Canadian emergency medicine (EM) residency programs (Fellow of the Royal College of Physicians of Canada [FRCPC] and Certificant of the College of Family Physicians-Emergency Medicine [CCFP-EM]). Additionally, medical students interested in EM were asked to select factors affecting their preferred choice of residency training program and their intended future practice.
Methods:
Medical students enrolled at The University of Western Ontario for the 2008–2009 academic year were invited to complete an online 47-item questionnaire pertaining to their knowledge, opinions, and attitudes toward EM residency training.
Results:
Of the 563 students invited to participate, 406 (72.1%) completed the survey. Of the respondents, 178 (43.8%) expressed an interest in applying to an EM residency training program, with 85 (47.8%) most interested in applying to the CCFP-EM program.
The majority of respondents (54.1%) interested in EM believed that there should be two streams to EM certification, whereas 18.0% disagreed. Family life and control over work schedule were commonly cited as benefits of any career in EM. Other high-ranking factors influencing career choice differed between the groups interested in CCFP-EM and FRCPC. The majority of students interested in the CCFP-EM residency program (78%) reported that they intend to blend their EM practice with family medicine. Only 2% of students planned to practice EM exclusively, with no family medicine.
Conclusions:
This is the first survey of Canadian medical students to describe disparities in factors influencing choice of EM residency stream, perceptions of postgraduate work life, and anticipated practice environment.
Computed tomography (CT) is an imaging modality used to detect renal stones. However, there is concern about the lifetime cumulative radiation exposure attributed to CT. Ultrasonography (US) has been used to diagnose urolithiasis, thereby avoiding radiation exposure. The objective of this study was to determine the ability of US to identify renal colic patients with a low risk of requiring urologic intervention within 90 days of their initial emergency department (ED) visit.
Methods:
We completed a retrospective medical record review for all adult patients who underwent ED-ordered renal US for suspected urolithiasis over a 1-year period. Independent, double data extraction was performed for all imaging reports and US results were categorized as “normal,” “suggestive of ureterolithiasis,” “ureteric stone seen” or “disease unrelated to urolithiasis.” Charts were reviewed to determine how many patients underwent subsequent CT and urologic intervention.
Results:
Of the 817 renal US procedures ordered for suspected urolithiasis during the study period, the results of 352 (43.2%) were classified as normal, and only 2 (0.6%) of these patients required urologic intervention. The results of 177 (21.7%) renal US procedures were suggestive of ureterolithiasis. Of these, 12 (6.8%) patients required urologic intervention. Of the 241 (29.5%) patients who had a ureteric stone seen on US, 15 (6.2%) required urologic intervention. The rate of urologic intervention was significantly lower in those with normal results on US (p < 0.001) than in those with abnormal results on US.
Conclusion:
A normal result on renal US predicts a low likelihood for urologic intervention within 90 days for adult ED patients with suspected urolithiasis.
Application of tensile strain to the Si(100) lattice is known to enhance carrier mobility in field effect transistors through modification of the Si band structure. Si is conventionally placed under tensile strain using methods such as Si3N4 capping for strained channel devices, and epitaxial growth of Si on a strain graded SiGe substrate for large area strain. The latter case preserves and propagates threading dislocations, and both cases require use of a bulk rigid substrate, which prohibits the use of strained Si in applications such as flexible electronics, or indeed in any application where strained Si is desirable on a non-epitaxial substrate. Elastically strained, single-crystal, Si-based nanomembranes, in which the release of a Si/SiGe/Si heterostructure from its growth substrate allows elastic strain sharing between the layers, circumvent these issues. These nanomembranes are extremely flexible, virtually dislocation-free, and transferable to almost any other surface.