A 54-question survey about System Healthcare Infection Prevention Programs (SHIPPs) was sent out to SHEA Research Network participants in August 2023. Thirty-eight United States-based institutions responded (38/93, 41%), of which 23 have SHIPPs. We found heterogeneity in the structure, staffing, and resources for system infection prevention (IP) programs.
Background: Cerebral venous thrombosis (CVT) most commonly affects younger women. Diagnosis may be delayed due to its distinct presentation and demographic profile compared to other stroke types. Methods: We examined delays to diagnosis of CVT in the SECRET randomized trial and TOP-SECRET parallel registry. Adults diagnosed with symptomatic CVT within 14 days were included. We examined time to diagnosis, the number of healthcare encounters prior to diagnosis, and associations with demographics, clinical and radiologic features, and functional and patient-reported outcome measures (PROMs) at days 180 and 365. Results: Of 103 participants, 68.9% were female; median age was 45 (IQR 31.0-61.0). Median time from symptom onset to diagnosis was 4 (IQR 1-8) days. Diagnosis was made on first presentation to medical attention in 60.2%. Median time to diagnosis was 3 (1-7) days for those diagnosed at a single presentation versus 5 (2-11.75) days for those requiring multiple presentations (p=0.16). Women were more likely to have multiple presentations (OR 2.53; 95% CI 1.00-6.39; p=0.05) and had longer median times to diagnosis (5 [2-8] days vs. 2 [1-4.5] days; p=0.005). However, this was not associated with absolute or change in functional outcomes, or with any PROMs, at days 180 and 365. Conclusions: Diagnosis of CVT was commonly delayed; women were more likely to have multiple presentations. We found no association between delayed diagnosis and outcomes.
Background: Stroke simulation-based training has been associated with improved stroke quality metrics. The purpose of this study was to assess whether participation in high-fidelity acute stroke simulation led to better knowledge retention one month post-simulation in off-service residents. Methods: Off-service residents were provided with non-mandatory pre-simulation reading on stroke. Immediately before the stroke simulation, they completed a questionnaire of 8 stroke-related questions to test their knowledge. Immediately after the simulation, they received a debrief that included teaching on stroke. After the debrief, and again one month later, they completed the same questionnaire. Results: There were a total of 16 off-service resident participants. Wilcoxon signed-rank tests were performed. There was a significant difference between pre-simulation and immediate post-simulation scores on the knowledge retention questionnaire (p = 0.008), and between pre-simulation and one-month post-simulation scores (p = 0.007). There was no difference between immediate post-simulation and one-month post-simulation scores (p = 0.77). Conclusions: Participants performed better on the questionnaire after the simulation, and this improved performance was retained at one month. This is the first study to demonstrate delayed knowledge retention in the stroke simulation literature.
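As a hedged illustration of the paired analysis described above, the following Python sketch runs a Wilcoxon signed-rank test on invented pre- and post-simulation scores; the study's raw data are not given in the abstract, so all values are hypothetical.

```python
# A minimal sketch of the paired pre/post comparison (n=16, as in the study).
# All scores below are hypothetical, invented for illustration only.
from scipy.stats import wilcoxon

pre_scores  = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 5, 2, 3, 4, 3, 2]  # out of 8
post_scores = [6, 7, 5, 7, 6, 6, 5, 5, 7, 6, 8, 5, 6, 7, 6, 5]  # out of 8

# Non-parametric test on the paired differences
stat, p = wilcoxon(pre_scores, post_scores)
print(f"W = {stat:.1f}, p = {p:.4f}")
```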
The mineralogy of the clay fraction was studied for soils and saprolite on two Blue Ridge Front mountain slopes. The clay fraction contained the weathering products of primary minerals in the mica gneiss and schist parent rocks. Gibbsite is most abundant in the saprolite and residual soil horizons, where only chemical weathering has been operative. In colluvial soil horizons, where both physical and chemical weathering have occurred, the clay fraction consists largely of comminuted primary phyllosilicates (muscovite, chlorite, and possibly biotite) and their weathering products: vermiculite, interstratified biotite/vermiculite (B/V), and kaolinite. The clay-size chlorite contains Fe2+, as indicated by Mössbauer spectroscopy, and is more resistant to weathering than biotite. The vermiculite and B/V, both weathering products of biotite, contain Fe3+. Vermiculite in colluvial soils and, especially, surface horizons is weakly hydroxy-interlayered. The kaolinite in the clay fraction resulted at least partly from the comminution of kaolinized biotite in coarser fractions.
The hematite content ranged from 0 to 8% of the clay fraction and correlated strongly (r = 0.95) with dry clay redness, as measured by the redness rating: RR = (10 − YR hue) × chroma ÷ value. The hematite is largely a product of the weathering of almandine; thus, soil redness depends on the amount of almandine in the parent materials and its degree of weathering in the soils. Goethite (13–22% of the clay fraction) imparts a yellow-brown hue to soils derived from almandine-free parent rocks. The release of Fe in relatively low concentrations during the weathering of Fe-bearing primary minerals, particularly biotite, appears to have promoted the formation of goethite.
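For concreteness, here is a brief sketch of the redness-rating arithmetic quoted above, applied to a hypothetical Munsell colour; the example colour is invented for illustration, not taken from the study.

```python
# Redness rating RR = (10 - YR hue) * chroma / value, as quoted above.
def redness_rating(yr_hue: float, value: float, chroma: float) -> float:
    """RR for a Munsell colour in the YR hue range (e.g. 5YR 4/6)."""
    return (10.0 - yr_hue) * chroma / value

# Hypothetical dry soil colour 5YR 4/6: hue 5YR, value 4, chroma 6
print(redness_rating(yr_hue=5, value=4, chroma=6))  # -> 7.5
```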
A green, Lithic Torriorthent soil derived from a celadonite-rich, hydrothermally altered basalt immediately north of the Mojave Desert region in southern California was studied to investigate the fate of the celadonite in a pedogenic weathering environment. Celadonite was found to be disseminated in the highly altered rock matrix with cristobalite, chalcedony, and stilbite. X-ray powder diffraction (XRD) showed the soil material to contain celadonite having a d(060) value of 1.510 Å, indicative of its dioctahedral nature. Very little smectite was detected in the parent material, whereas Fe-rich smectite was found to be abundant in the soil. The Fe-smectite and celadonite were identified as the sole components of the green-colored clay fraction (<2 µm) of all soil horizons. The soil clay showed a single d(060) value of 1.507 Å, indicating that the smectite was also dioctahedral and that its b-dimension was the same as that of the celadonite. Mössbauer spectroscopy showed that the chemical environments of Fe in the rock-matrix celadonite and in the smectite-rich soil clay were also nearly identical. These data strongly suggest a simple transformation of the celadonite to an Fe-rich smectite during soil formation.
Supporting evidence for this transformation was obtained by artificial weathering of celadonite, using sodium tetraphenylboron to extract interlayer K. The intensity of the 001 XRD peak (at 10.1 Å) of celadonite was greatly reduced after the treatment, and a peak at 14.4 Å, absent in the pattern of the untreated material, appeared. On glycolation of the sample, this peak expanded to 17.4 Å, similar to the behavior of the soil smectite. The alteration of celadonite to smectite is a simple transformation requiring only the loss of interlayer K. The transformation is apparently possible under present-day conditions, inasmuch as the erosional landscape position, shallow depth, and lack of significant horizonation indicate that the soil is very young.
The weathering products of primary biotite, chlorite, magnetite, and almandine in mica gneiss and schist in the North Carolina Blue Ridge Front were determined. Sand-size grains of biotite, the most abundant, readily weathered mineral in the parent rock, have altered to interstratified biotite/vermiculite, vermiculite, kaolinite, and gibbsite in the saprolite and soil. Fe2+-chlorite in the parent rock was relatively resistant to chemical weathering, which appears to be confined to the external surfaces of particles. Magnetite grains in the saprolite are essentially unaltered, but those in the soil contain abundant crystallographically controlled etch pits and are coated with oxidation crusts. Almandine altered to goethite, hematite, and gibbsite as the rock weathered to saprolite. Extensively weathered almandine grains were found to contain etch pits and what appeared to be oxide coatings. Apparently, a rapid release of Fe during weathering produced hematite, whereas slower release of Fe favored the formation of goethite.
The Falkland Shelf is a highly productive ecosystem in the Southwest Atlantic Ocean. It is characterized by upwelling oceanographic dynamics and displays a wasp-waist structure, with few intermediate trophic-level species and many top predators that migrate onto the shelf to feed. One of these resident intermediate trophic-level species, the Patagonian longfin squid Doryteuthis gahi, is abundant and plays an important role in the ecosystem. We used two methods to estimate the trophic structure of the Falkland Shelf food web, focusing on the trophic niche of D. gahi and its impacts on other species and functional groups to highlight its importance in the ecosystem. First, stable isotope measurements served to calculate trophic levels based on an established nitrogen baseline. Second, an Ecopath model was built to corroborate the trophic levels derived from stable isotopes and to characterize the trophic interactions of D. gahi with other functional groups. Both methods placed D. gahi at the centre of the ecosystem, with a trophic level of ~3. The Ecopath model predicted high impacts, and therefore high keystoneness, for both seasonal cohorts of D. gahi. Our results show that the Falkland Shelf is not controlled solely by species feeding at the top and bottom of the trophic chain. The importance of species feeding at the third trophic level (e.g. D. gahi and Patagonotothen ramsayi) and the observed architecture of energy flows confirm the ecosystem's wasp-waist structure, with middle-out control mechanisms at play.
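The abstract does not give the trophic-level equation it used, but the conventional δ15N-baseline calculation (Post 2002) has the form sketched below; the baseline trophic level, enrichment factor, and δ15N values here are illustrative assumptions, not study data.

```python
# A minimal sketch of the standard delta-15N trophic-level formula:
#   TL = TL_baseline + (d15N_consumer - d15N_baseline) / enrichment
# All numeric values are illustrative assumptions, not study data.
def trophic_level(d15n_consumer: float, d15n_baseline: float,
                  baseline_tl: float = 2.0,          # assumed baseline TL
                  enrichment: float = 3.4) -> float:  # assumed permil per step
    return baseline_tl + (d15n_consumer - d15n_baseline) / enrichment

print(round(trophic_level(15.4, 12.0), 2))  # -> 3.0, consistent with TL ~3
```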
Background: Stroke is a leading cause of death and disability worldwide, including in Canada. Treatments for stroke are time-dependent, and IV tPA for acute ischemic stroke decreases the chance of disability at 90 days if given within 4.5 hours of symptom onset. The onset of the COVID-19 pandemic was initially associated with a decrease in acute stroke treatment with thrombolysis across North America. These decreases seemed transient, with a rebound in numbers seen in other provinces across Canada as widespread lockdown orders were lifted. However, a rebound in thrombolysis was not seen at Royal University Hospital (RUH) in Saskatoon, Saskatchewan during the same period. We aimed to analyze the documented reasons why thrombolysis was withheld. Methods: We conducted a retrospective chart review of adult patients with ischemic strokes presenting within 4.5 hours of symptom onset to the RUH from March 2019 to January 2021. We received a waiver of consent from the Research Ethics Board. Results: A total of 128 patients met the inclusion criteria. Statistical analysis is currently ongoing. Conclusions: Initial results suggest that the reasons for withholding tPA were similar before and after the onset of the COVID-19 pandemic. The main reasons include rapidly resolving/resolved symptoms and a documented tPA exclusion criterion.
The pace and trajectory of global and local environmental changes are jeopardizing our health in numerous ways, among them exacerbating the risk of disease emergence and spread in both the community and the healthcare setting via healthcare-associated infections (HAIs). Factors such as climate change, widespread land alteration, and biodiversity loss underlie changing human–animal–environment interactions that drive disease vectors, pathogen spillover, and cross-species transmission of zoonoses. Climate change–associated extreme weather events also threaten critical healthcare infrastructure, infection prevention and control (IPC) efforts, and treatment continuity, adding stress to already strained systems and creating new areas of vulnerability. These dynamics increase the likelihood of antimicrobial resistance (AMR) development, vulnerability to HAIs, and high-consequence hospital-based disease transmission. By applying a One Health approach to both human and animal health systems, we can become climate smart by re-examining our impacts on and relationships with the environment. We can then work collaboratively to reduce and respond to the growing threat and burden of infectious diseases.
To evaluate the impact of a diagnostic stewardship intervention on Clostridioides difficile healthcare-associated infections (HAI).
Design:
Quality improvement study.
Setting:
Two urban acute care hospitals.
Interventions:
All inpatient stool testing for C. difficile required review and approval prior to specimen processing in the laboratory. An infection preventionist reviewed all orders daily through chart review and conversations with nursing staff; orders meeting clinical criteria for testing were approved, and orders not meeting clinical criteria were discussed with the ordering provider. The proportion of completed tests meeting clinical criteria for testing and the primary outcome of C. difficile HAI were compared before and after the intervention.
Results:
The frequency of completed C. difficile orders not meeting criteria was lower [146 (7.5%) of 1,958] in the intervention period (January 10, 2022–October 14, 2022) than in the sampled 3-month preintervention period [26 (21.0%) of 124; P < .001]. C. difficile HAI rates were 8.80 per 10,000 patient days prior to the intervention (March 1, 2021–January 9, 2022) and 7.69 per 10,000 patient days during the intervention period (incidence rate ratio, 0.87; 95% confidence interval, 0.73–1.05; P = .13); a worked sketch of this rate-ratio calculation follows the abstract.
Conclusions:
A stringent order-approval process reduced clinically nonindicated testing for C. difficile but did not significantly decrease HAIs.
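As a hedged illustration of how an incidence rate ratio (IRR) such as the one reported above is derived with a Wald confidence interval, the sketch below uses hypothetical event counts and patient-days chosen only to approximate the reported rates; the study's actual denominators are not given in the abstract.

```python
# Incidence rate ratio (IRR) with a Wald 95% CI computed on the log scale.
# Event counts and patient-days below are hypothetical.
import math

def irr_with_ci(events_a, pt_a, events_b, pt_b, z=1.96):
    irr = (events_a / pt_a) / (events_b / pt_b)
    se = math.sqrt(1 / events_a + 1 / events_b)  # SE of log(IRR)
    return irr, irr * math.exp(-z * se), irr * math.exp(z * se)

# ~7.7 vs ~8.8 HAIs per 10,000 patient-days (hypothetical counts):
irr, lo, hi = irr_with_ci(events_a=154, pt_a=200_000,
                          events_b=176, pt_b=200_000)
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```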
Gestational diabetes mellitus (GDM) is the most common medical complication of pregnancy and a serious threat to the health of pregnant people and their offspring. The molecular origins of GDM, and in particular the placental responses, are not fully known. The present study aimed to perform a comprehensive characterisation of the lipid species in placentas from pregnancies complicated by GDM using high-resolution MS lipidomics, with a particular focus on sphingolipids and acylcarnitines in a semi-targeted approach. The results indicated that, despite no major disruption in lipid metabolism, placentas from GDM pregnancies showed significant alterations in sphingolipids, most notably a lower abundance of total ceramides. Additionally, very-long-chain ceramides and sphingomyelins with twenty-four carbons were lower, and glucosylceramides with sixteen carbons were higher, in placentas from GDM pregnancies. Semi-targeted lipidomics revealed the strong impact of GDM on the placental acylcarnitine profile, particularly lower contents of medium- and long-chain fatty-acyl carnitine species. The lower contents of sphingolipids may affect the secretory function of the placenta, and the lower contents of long-chain fatty acylcarnitines are suggestive of mitochondrial dysfunction. These alterations in placental lipid metabolism may have consequences for fetal growth and development.
Hospitals are increasingly consolidating into health systems. Some systems have appointed healthcare epidemiologists to lead system-level infection prevention programs. Ideal program infrastructure and support resources have not been described. We informally surveyed 7 healthcare epidemiologists with recent experience building and leading system-level infection prevention programs. Key facilitators and barriers for program structure and implementation are described.
Background: While mechanical thrombectomy (MT) has become broadly used, many nuances around its performance remain contentious. In particular, the optimal sedation strategy for MT is not clear in the literature. Methods: This was a single-center retrospective cohort study of a prospectively collected database. Age, gender, pre-treatment NIH Stroke Scale score (NIHSS), Alberta Stroke Program Early CT Score (ASPECTS), quality of collateralization, whether the patient underwent thrombectomy, tandem carotid occlusion, and thrombolysis in cerebral infarction (TICI) score were recorded in the database. Results: We identified 228 patients who underwent anterior circulation MT; 91 strokes were right-sided and 108 were left-sided. Collaterals were graded as good in 135 (71.4%), moderate in 44 (23.2%), and poor in 10 (5.3%). The average pre-MT ASPECTS was 8.1 (range). We found significant differences in age, baseline NIHSS, collateralization, and TICI revascularization score between all patients, patients with good outcome (mRS 0-2), and patients who died. Multivariate analysis showed significant associations of sidedness, collateralization, TICI score, and hemorrhage with neurological outcome: right-sided stroke, better collaterals, higher TICI score, and absence of hemorrhage were associated with better outcomes. Conclusions: We found outcomes comparable to those reported in the literature with the use of general anesthesia. We identified several factors that influence outcomes.
Background: Canadian Stroke Best Practice Recommendations recommend both cardiac monitoring and transthoracic echocardiography (TTE) to assess for cardioembolic sources of stroke. The diagnostic yield of TTE is historically low, at 5-10%. The goal of this project was to evaluate the practicality of a bedside, focused approach to TTE in ischemic stroke. Methods: We developed a cross-sectional study evaluating patients undergoing echocardiography for evidence of possible cardioembolic stroke, comparing the standard and focused TTE imaging approaches. Of the 61 patients in the study, data are currently available for 15 participants. Independent-samples t-tests were performed to compare measurements. Results: Mean time to complete image acquisition for the focused, bedside TTE was significantly shorter than for the complete TTE (12 min or less vs. 30 min or more; p<0.0001). No cardiac sources of stroke were found by either approach in this cohort, representing 100% agreement between the two modalities. Conclusions: Focused, bedside echocardiography studies are quicker to perform and employ more affordable, portable, digital TTE devices. The test is done at the bedside, reducing the need for patient transport. Image acquisition takes approximately half the time, potentially allowing more rapid clinical decision-making and facilitating discharge from hospital.
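As a hedged sketch of the timing comparison above, the following runs an independent-samples t-test on invented acquisition times; the abstract reports only the approximate group durations, so all values are hypothetical.

```python
# Independent-samples t-test on hypothetical TTE acquisition times (minutes).
from scipy.stats import ttest_ind

focused_min  = [10, 11, 12, 9, 10, 12, 11, 10]    # hypothetical focused TTE
complete_min = [31, 34, 30, 38, 33, 36, 32, 35]   # hypothetical complete TTE

t, p = ttest_ind(focused_min, complete_min)
print(f"t = {t:.2f}, p = {p:.1e}")
```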
Background: Resident physicians often observe stroke alerts before managing them alone, which exposes patients to potential harm from trainees’ lack of experience. Simulation training offers a low-risk environment for skill acquisition. This project assessed learners’ confidence in leading stroke codes before and after completing a stroke simulation training program during neurology rotations at the University of Saskatchewan. Methods: High-fidelity simulation cases were developed encompassing several diagnostic and therapeutic goals of acute stroke care. Standardized patients were trained to increase fidelity. A standardized debrief was given after each session. Pre- and post-simulation surveys captured learner confidence and cognitive load. Results: Pilot data reveal that learners’ confidence and comfort in providing acute stroke care, including thrombolysis treatment decisions, increased significantly after simulation training (n=8; p=0.0006-0.01). Learners also felt more prepared to provide future acute stroke care (p=0.009). Skills not directly addressed in the simulation did not show significant improvement (p=0.09-1.89). Learners consistently rated the session as requiring high mental effort. Conclusions: Implementation of high-fidelity simulation training leads to significant improvement in learner confidence. Future cases will capture additional objectives and ensure acceptable cognitive load. Data collection to explore residents’ experiences and knowledge improvement in stroke care and to assess local reductions in treatment delays is underway.
Background: Visual impairment can affect 70% of individuals who have experienced a stroke. Identification and remediation of visual impairments can improve overall function and perceived quality of life. Our project aimed to improve visual assessment and timely intervention for patients with post-stroke visual impairment (PSVI). Methods: We conducted a quality improvement initiative to create a standardized screening and referral process for patients with PSVI to access an orthoptist. Post-stroke visual impairment was identified using the Visual Screen Assessment (VISA) tool. Patients completed the VFQ-25 questionnaire before and after orthoptic assessment, and differences between scores were evaluated. Results: Eighteen patients completed the VFQ-25 both before and after orthoptic assessment. Among the vision-related constructs, there were significant improvements in reported outcomes for general vision (M=56.9, SD=30.7 vs. M=48.6, SD=16.0; p=0.002), peripheral vision (M=88.3, SD=16 vs. M=75, SD=23.1; p=0.027), ocular pain (M=97.2, SD=6.9 vs. M=87.5, SD=21.4; p=0.022), near activities (M=82.4, SD=24.1 vs. M=67.8, SD=25.6; p<0.001), social functioning (M=90.2, SD=19 vs. M=78.5, SD=29.3; p=0.019), mental health (M=84.0, SD=25.9 vs. M=70.5, SD=31.2; p=0.017), and role difficulties (M=84.7, SD=26.3 vs. M=67.4, SD=37.9; p=0.005). Conclusions: Orthoptic assessment for those with PSVI significantly improved perceived quality of life across numerous vision-related constructs, suggesting it is a valuable part of a patient’s post-stroke recovery.
To evaluate infectious pathogen transmission data visualizations in outbreak publications.
Design:
Scoping review.
Methods:
Medline was searched for outbreak investigations of infectious diseases within healthcare facilities that included ≥1 data visualization of transmission using data observable by an infection preventionist showing temporal and/or spatial relationships. Abstracted data included the nature of the cluster(s) (pathogen, scope of transmission, and individuals involved) and data visualization characteristics including visualization type, transmission elements, and software.
Results:
From 1,957 articles retrieved, we analyzed 30 articles including 37 data visualizations. The median cluster size was 20.5 individuals (range, 7–1,963), and clusters lasted a median of 214 days (range, 12–5,204). Among the data visualization types, 10 (27%) were floor-plan transmission maps, 6 (16%) were timelines, 11 (30%) were transmission networks, 3 (8%) were Gantt charts, 4 (11%) were cluster maps, and 4 (11%) were other types. In addition, 26 data visualizations (70%) contained spatial elements, 26 (70%) included person type, and 19 (51%) contained time elements. None of the data visualizations depicted contagious periods, and only 2 (5%) contained symptom-onset date.
Conclusions:
The data visualizations of healthcare-associated infectious disease outbreaks in this scoping review were diverse in type and in visualization elements, though no single data visualization contained all the elements important for deriving hypotheses about transmission pathways. These findings aid in understanding how transmission pathways are visualized by describing the essential elements of these data visualizations, and they will inform the creation of a standardized mapping tool to support earlier initiation of interventions to prevent transmission.