This article provides a retrospective on the formation of the Engaged Humanities Lab at Royal Holloway, University of London. It situates the lab’s development within the broader history of labs in the humanities, showing how the Engaged Humanities Lab aligns closely with Pawlicka-Deger’s description of an “infrastructure of engagement” rather than with the physical spaces that characterise science laboratories. We also explain why the sub-category of “engaged” humanities was selected over the broader and more established “public” humanities. The second half of the article reflects on the activities and achievements of the Engaged Humanities Lab, focusing on how intra-institutional collaboration between an academic school and the Research and Innovation Department supported the formation and governance of the lab, allowing for ongoing dialogue and co-creation between subject-area experts and research management professionals with expertise in research funding, policy, and knowledge exchange. The article also illuminates what is needed from university leaders to ensure the success and longevity of infrastructures of engagement like the Engaged Humanities Lab.
Herbaceous perennials must annually rebuild the aboveground photosynthetic architecture from carbohydrates stored in crowns, rhizomes, and roots. Knowledge of carbohydrate utilization and storage can inform management decisions and improve control outcomes for invasive perennials. We monitored the nonstructural carbohydrates in a population of the hybrid Bohemian knotweed [Polygonum ×bohemicum (J. Chrtek & Chrtková) Zika & Jacobson [cuspidatum × sachalinense]; syn.: Fallopia ×bohemica (Chrtek and Chrtková) J.P. Bailey] and in Japanese knotweed [Polygonum cuspidatum Siebold & Zucc.; syn.: Fallopia japonica (Houtt.) Ronse Decr.]. Carbohydrate storage in crowns followed seasonal patterns typical of perennial herbaceous dicots, corresponding to key phenological events. Starch was consistently the most abundant nonstructural carbohydrate present. Sucrose levels did not show a consistent inverse relationship with starch levels. Starch distributed laterally in rhizomes and, more broadly, total nonstructural carbohydrates sampled before dormancy break were higher in rhizomes than in crowns. Total nonstructural carbohydrate levels in crowns reached a seasonal low at an estimated 22.6% of crown dry weight after accumulation of 1,453.8 growing degree days (GDD) by the end of June, mainly due to depletion of stored starch, which reached an estimated minimum of 12.3% by 1,220.3 GDD, accumulated by mid-June. Depletion corresponded to rapid development of the vegetative canopy before the reproductive phase began in August. Maximum starch accumulation in crowns followed complete senescence of aboveground tissues by mid- to late October. Removal of aboveground shoot biomass in late June to early July, followed by removal of regrowth in early September before senescence, would optimize the use of time and labor to deplete carbohydrate reserves. Additionally, to maximize translocation of foliar-applied systemic herbicides to belowground tissue, applications should be made in late August through early fall, when downward movement of assimilates rebuilds underground storage reserves. Fall applications should be made before loss of healthy leaf tissue, with the window for control typically ending by late September in Minnesota.
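The growing degree day totals reported above follow the standard daily accumulation formula. The sketch below shows one common way to compute them, assuming the simple averaging method and a placeholder base temperature; the abstract does not state which base temperature or start date the study used.

```python
import numpy as np

def growing_degree_days(tmax, tmin, t_base=10.0):
    """Accumulate growing degree days (GDD) from daily max/min temperatures.

    Uses the simple averaging method: GDD_day = max(0, (Tmax + Tmin)/2 - Tbase).
    The base temperature here (10 C) is an illustrative assumption, not the
    value used in the knotweed study.
    """
    tmax = np.asarray(tmax, dtype=float)
    tmin = np.asarray(tmin, dtype=float)
    daily = np.maximum(0.0, (tmax + tmin) / 2.0 - t_base)
    return np.cumsum(daily)  # running GDD total from the chosen start date

# Example with made-up daily temperatures (degrees C)
tmax = [18.0, 21.5, 25.0, 27.0]
tmin = [7.0, 10.0, 13.5, 15.0]
print(growing_degree_days(tmax, tmin))  # cumulative GDD after each day
```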
Patients with cancer benefit greatly from palliative care (PC), which improves overall survival and quality of life. Despite its benefits, PC is underutilized among patients with hematologic malignancies (HMs), particularly among Black patients, who face higher symptom burdens and lower survival rates compared to White patients. The purpose of this review was to identify and describe what is known about PC use among Black HM patients in the United States.
Methods
This review was conducted using the Joanna Briggs Institute approach for scoping reviews and included a search of the databases MEDLINE (PubMed), Embase (Elsevier), Scopus, and Web of Science (Clarivate). The search was developed and conducted by a professional medical librarian in consultation with the author team and focused on keywords such as Black/African American patients, HM, and PC. Articles were screened and selected against predefined inclusion criteria, with screening carried out in Covidence software for systematic review management.
Results
Seven publications were included in the final sample; most used quantitative methods and data from large national databases such as the National Cancer Database. Four of the studies reported that Black patients with HM were less likely to receive or use PC compared to White patients. Access to PC was associated with better outcomes, such as lower hospital charges and a reduced likelihood of dying within 30 days of initiating palliative radiotherapy.
Significance of the results
This scoping review highlights ongoing inequities in the use of PC among Black patients with HM, mirroring trends among patients with solid cancers. Future studies should be conducted to understand the determinants of these disparities and to build testable interventions that improve PC use within this underserved population.
Objectives/Goals: The creatine (Cr) system is impaired in Alzheimer’s disease (AD). Data show that creatine monohydrate (CrM) supplementation may improve AD symptoms in AD mouse models, but no human studies have been reported. Thus, we investigated whether an eight-week CrM supplementation was feasible and associated with increased brain creatine in patients with AD. Methods/Study Population: Twenty participants with probable AD were allocated to an open-label, eight-week intervention of 20 g/day CrM. Fasting blood draws were taken at baseline, 4-, and 8-week visits to measure serum creatine (Quest Diagnostics). 1H magnetic resonance spectroscopy was performed at baseline and 8-week visits to measure brain Cr as a ratio to unsuppressed water. Self-reported compliance (with assistance from study partners) was assessed with daily CrM trackers. The mean compliance percentage across all participants was used to describe overall compliance with the intervention. We used paired t-tests to analyze the mean changes in serum Cr levels from baseline to 4- and 8-week visits and the mean change in brain Cr from baseline to 8-week visits. Statistical significance was set at p<0.05. Results/Anticipated Results: Participants were 65% male with a mean age of 73.1±6.3 years. All participants completed the study, with 19 out of 20 achieving the dose compliance target of ≥80%. The mean self-reported dose intake was 90%. Serum Cr levels were significantly increased at 4- and 8-week visits compared to baseline (0.6±0.4 mg/dL vs. 14.0±9.9 mg/dL and 15.0±13.6 mg/dL, respectively; p<0.001). Brain Cr levels also significantly increased (330.5±36.80 i.u. vs. 366.9±57.52 i.u., p<0.001). Discussion/Significance of Impact: We are the first to demonstrate that 20 g/day of CrM for eight weeks is feasible and associated with increased brain Cr in patients with AD. Our findings support further investigation of brain target engagement of CrM and its efficacy in AD. With AD cases expected to rise, CrM could serve as an effective, affordable therapeutic to slow AD progression.
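As a rough illustration of the paired analysis described above (baseline versus follow-up measurements within the same participants), the sketch below applies paired t-tests to hypothetical serum creatine values; the variable names and data are placeholders, not the study dataset.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical serum creatine values (mg/dL) for the same participants
# at baseline and after 4 and 8 weeks of supplementation.
baseline = np.array([0.5, 0.7, 0.6, 0.4, 0.8])
week4    = np.array([12.1, 15.3, 10.8, 14.0, 18.2])
week8    = np.array([13.5, 16.0, 11.2, 15.1, 19.0])

# Paired t-tests compare each follow-up visit with baseline within participants.
for label, follow_up in [("week 4", week4), ("week 8", week8)]:
    t_stat, p_value = ttest_rel(follow_up, baseline)
    print(f"baseline vs {label}: t = {t_stat:.2f}, p = {p_value:.4f}")
```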
Objectives/Goals: We hypothesized that bulk transcriptomic profiling of blood collected from within the ischemic vasculature during an acute ischemic stroke with large vessel occlusion (LVO) would reveal unique biomarkers that differ from the peripheral circulation and may provide much-needed insight into the underlying pathogenesis of LVO in humans. Methods/Study Population: The Transcriptomic Biomarkers of Inflammation in Large Vessel Ischemic Stroke pilot study prospectively enrolled patients ≥ 18 years of age with an anterior circulation LVO treated with endovascular thrombectomy (EVT). Two periprocedural arterial blood samples were obtained (DNA/RNA Shield™ tubes, Zymo Research): (1) proximal to the thrombus, from the internal carotid artery, and (2) immediately downstream from the thrombus, by puncturing through the thrombus with the microcatheter. Bulk RNA sequencing was performed, and differential gene expression was identified using the Wilcoxon signed rank test for paired data, adjusting for age, sex, use of thrombolytics, time from last known well to EVT, and thrombolysis in cerebral infarction score. Bioinformatic pathway analyses were computed using MCODE and Reactome. Results/Anticipated Results: From May to October 2022, 20 patients were screened and 13 were enrolled (median age 68 [SD 10.1], 47% male, 100% White). A total of 608 differentially expressed genes were found to be statistically significant. Discussion/Significance of Impact: These results provide evidence of significant gene expression changes occurring within the ischemic vasculature of the brain during LVO, which may correlate with larger ischemic infarct volumes and worse functional outcomes at 90 days. Future studies with larger sample sizes are supported by this work.
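For paired proximal/distal samples, per-gene testing along the lines described above could look like the sketch below, which applies the Wilcoxon signed-rank test to each gene and controls the false discovery rate. This is a simplified illustration on simulated counts; it omits the covariate adjustment (age, sex, thrombolytics, and so on) used in the actual analysis.

```python
import numpy as np
from scipy.stats import wilcoxon
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_genes, n_patients = 1000, 13

# Made-up normalized expression matrices: rows = genes, columns = patients.
proximal = rng.lognormal(mean=2.0, sigma=0.5, size=(n_genes, n_patients))
distal = proximal * rng.lognormal(mean=0.05, sigma=0.2, size=(n_genes, n_patients))

# Wilcoxon signed-rank test per gene on the paired proximal/distal values.
p_values = np.array([
    wilcoxon(proximal[g], distal[g]).pvalue for g in range(n_genes)
])

# Benjamini-Hochberg correction to control the false discovery rate.
significant, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
print(f"{significant.sum()} genes pass FDR < 0.05 in this simulated example")
```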
The Early Minimally Invasive Removal of Intracerebral Hemorrhage (ENRICH) trial demonstrated that minimally invasive surgery to treat spontaneous lobar intracerebral hemorrhage (ICH) improved functional outcomes. We aimed to explore current management trends for spontaneous lobar ICH in Canada to assess practice patterns and determine whether further randomized controlled trials are needed to clarify the role of surgical intervention.
Methods:
Neurologists, neurosurgeons, physiatrists and trainees in these specialties were invited to complete a 16-question survey exploring three areas: (1) current management for spontaneous lobar ICH at their institution, (2) perceived influence of ENRICH on their practice and (3) perceived need for additional clinical trial data. Standard descriptive statistics were used to report categorical variables. The χ² test was used to compare responses across specialties and career stages.
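A minimal sketch of the kind of comparison described above, assuming a tidy table with one categorical response per respondent and a specialty label; the variable names and data below are hypothetical, not the survey dataset.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical survey extract: one row per respondent.
responses = pd.DataFrame({
    "specialty": ["neurology", "neurosurgery", "neurosurgery", "physiatry",
                  "neurology", "neurosurgery", "physiatry", "neurology"],
    "would_offer_surgery": ["yes", "no", "yes", "no", "no", "yes", "no", "yes"],
})

# Cross-tabulate responses by specialty and test for independence.
table = pd.crosstab(responses["specialty"], responses["would_offer_surgery"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```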
Results:
The survey was sent to 433 physicians, and 101 (23.3%) responded. Sixty-eight percent of participants reported that prior to publication of the ENRICH trial, spontaneous lobar ICH was primarily managed conservatively, with surgery reserved for life-threatening situations. Forty-three percent of participants did not foresee a significant increase in surgical intervention at their institution. Of neurosurgical respondents, 33% remained hesitant to offer surgical intervention beyond lifesaving operations. Only 5% reported routinely using specifically designed technologies to evacuate ICH. Seventy percent reported that another randomized controlled trial comparing nonsurgical to surgical management for spontaneous lobar ICH is needed.
Conclusions:
There is significant practice variability in the management of spontaneous lobar ICH across Canadian institutions, stressing the need for additional clinical trial data to determine the role of surgical intervention.
A central goal in ecology is investigating the impact of major perturbations, such as invasion, on the structure of biological communities. One promising line of inquiry is using co-occurrence analyses to examine how species’ traits mediate coexistence and how major ecological, climatic, and environmental disturbances can affect this relationship and its underlying mechanisms. However, present communities are heavily influenced by anthropogenic activities and may exhibit greater or lesser resistance to invasion than communities that existed before human arrival. Therefore, to disentangle the impact of individual disturbances on mammalian communities, it is important to examine community dynamics before human arrival. Here, we use the North American fossil record to evaluate the co-occurrence structure of mammals across the Great American Biotic Interchange. We compiled 126 paleocommunities from the late Pliocene (4–2.5 Ma) and early Pleistocene (2.5–1 Ma). Genus-level co-occurrence was calculated to identify significantly aggregated (co-occurring more than expected) and segregated (co-occurring less than expected) genus pairs. A functional diversity analysis was used to calculate functional distance between genus pairs to evaluate the relationship between pair association strength and functional role. We found that the strength distribution of aggregating and segregating genus pairs did not significantly change from the late Pliocene to the early Pleistocene, even with different mammals forming the pairs, including immigrant mammals from South America. However, we did find that significant pairs, both aggregations and segregations, became more similar in their functional roles following the Plio-Pleistocene transition. Because different mammals and ecological roles formed the significant associations while co-occurrence structure remained stable across this interval, our study suggests that mammals assemble in fundamental ways that may have been altered by humans in the present.
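One common way to classify genus pairs as aggregated or segregated is the probabilistic (hypergeometric) co-occurrence model; the sketch below illustrates that general approach on a made-up presence/absence matrix. It is an illustrative assumption, not necessarily the exact null model used in this study.

```python
import numpy as np
from scipy.stats import hypergeom

def classify_pair(sites_a, sites_b, alpha=0.05):
    """Classify one genus pair as aggregated, segregated, or random.

    sites_a, sites_b: presence/absence vectors over the same sites.
    Under the hypergeometric null, the number of shared sites for a pair
    occupying n_a and n_b of N sites follows Hypergeom(N, n_a, n_b).
    """
    sites_a = np.asarray(sites_a, dtype=bool)
    sites_b = np.asarray(sites_b, dtype=bool)
    n_sites = sites_a.size
    n_a, n_b = sites_a.sum(), sites_b.sum()
    observed = np.logical_and(sites_a, sites_b).sum()

    null = hypergeom(n_sites, n_a, n_b)
    p_aggregated = null.sf(observed - 1)   # P(shared >= observed)
    p_segregated = null.cdf(observed)      # P(shared <= observed)

    if p_aggregated < alpha:
        return "aggregated", p_aggregated
    if p_segregated < alpha:
        return "segregated", p_segregated
    return "random", min(p_aggregated, p_segregated)

# Toy presence/absence data for two genera across 12 paleocommunities.
genus_x = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1]
genus_y = [1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 0, 1]
print(classify_pair(genus_x, genus_y))
```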
This study sought to assess undergraduate students’ knowledge and attitudes surrounding perceived self-efficacy and threats in various common emergencies in communities of higher education.
Methods
Self-reported perceptions of knowledge and skills, as well as attitudes and beliefs regarding education and training, obligation to respond, safety, psychological readiness, efficacy, personal preparedness, and willingness to respond were investigated through 3 representative scenarios via a web-based survey.
Results
Among 970 respondents, approximately 60% reported their university had adequately prepared them for various emergencies, while 84% reported the university should provide such training. Respondents with high self-efficacy were significantly more likely than those with low self-efficacy to be willing to respond in whatever capacity needed across all scenarios.
Conclusions
There is a gap between perceived student preparedness for emergencies and training received. Students with high self-efficacy were the most likely to be willing to respond, which may be useful for future training initiatives.
Clozapine is the antipsychotic medication with the greatest efficacy in treatment-resistant schizophrenia (TRS). Unfortunately, clozapine is discontinued in approximately 0.2% to 8.5% of people because of concerns about clozapine-associated myocarditis (CAM). The opportunity for clozapine rechallenge is important for people with TRS and CAM, given the limited alternative treatments. However, there is a lack of consensus regarding the optimal process, monitoring, and dose titration to achieve successful clozapine rechallenge. This study aimed to review the process, monitoring, and dose titration within cases of clozapine rechallenge after CAM, to identify features associated with successful rechallenge.
Methods
A systematic review of clozapine rechallenge cases following CAM was conducted. PubMed, EMBASE, CINAHL, and PsycINFO were searched for cases. Reference lists of retrieved articles were checked, and field experts were consulted, to identify additional studies.
Results
Forty-five cases describing clozapine rechallenge were identified, 31 of which were successful. Successful rechallenge cases generally used a slower dose titration regimen with more frequent monitoring than standard clozapine initiation protocols; however, these data were not always completely recorded within cases. Six cases referred to published rechallenge protocols to guide their rechallenge.
Conclusions
The process, monitoring, and dose titration of clozapine rechallenge are inconsistently reported in the literature. Despite this, 69% of case reports detailed a successful rechallenge after CAM, although reliance on case data is a noted limitation. Ensuring that published clozapine rechallenge cases report standardised data, including titration speed and monitoring frequency, is required to guide the development and validation of guidelines for clozapine rechallenge.
By coupling long-range polymerase chain reaction, wastewater-based epidemiology, and pathogen sequencing, we show that adenovirus type 41 hexon-sequence lineages, described in children with hepatitis of unknown origin in the United States in 2021, were already circulating within the country in 2019. We also observed other lineages in the wastewater, whose complete genomes have yet to be documented from clinical samples.
To compare rates of Clostridioides difficile infection (CDI) recurrence following an initial occurrence treated with tapered enteral vancomycin versus standard vancomycin.
Design:
Retrospective cohort study.
Setting:
Community health system.
Patients:
Adults ≥18 years of age hospitalized with a positive C. difficile polymerase chain reaction or toxin enzyme immunoassay result who were prescribed either standard 10–14 days of enteral vancomycin four times daily or a 12-week tapered vancomycin regimen.
Methods:
Retrospective propensity score pair-matched cohort study. Groups were matched based on age (< or ≥ 65 years) and receipt of non-C. difficile antibiotics during hospitalization or within 6 months post-discharge. Recurrence rates were analyzed via logistic regression conditioned on matched pairs and reported as conditional odds ratios. The primary outcome was the CDI recurrence rate compared between standard and tapered vancomycin for treatment of initial CDI.
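For 1:1 matched pairs with a binary exposure and a binary outcome, the conditional logistic regression estimate of the odds ratio reduces to the ratio of outcome-discordant pairs. The sketch below shows that calculation with a Wald confidence interval, using hypothetical pair counts rather than the study data.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical discordant-pair counts from 1:1 matched pairs:
#   b = pairs where only the standard-vancomycin patient recurred
#   c = pairs where only the tapered-vancomycin patient recurred
b, c = 18, 4

# The conditional (matched-pairs) odds ratio for taper vs. standard is c / b;
# equivalently, the conditional logit coefficient is log(c / b).
log_or = np.log(c / b)
se = np.sqrt(1 / b + 1 / c)          # standard error of the log odds ratio
z = norm.ppf(0.975)                  # 1.96 for a 95% interval
ci_low, ci_high = np.exp(log_or - z * se), np.exp(log_or + z * se)

print(f"conditional OR = {np.exp(log_or):.2f} "
      f"(95% CI {ci_low:.2f} to {ci_high:.2f})")
```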
Results:
The CDI recurrence rate at 6 months was 5.3% (4/75) in the taper cohort versus 28% (21/75) in the standard vancomycin cohort. The median time to CDI recurrence was 115 days versus 20 days in the taper and standard vancomycin cohorts, respectively. When adjusted for matching, patients in the taper arm were less likely to experience CDI recurrence at 6 months when compared to standard vancomycin (cOR = 0.19, 95% CI 0.07–0.56, p < 0.002).
Conclusions:
Larger prospective trials are needed to elucidate the clinical utility of tapered oral vancomycin as a treatment option to achieve sustained clinical cure in first occurrences of CDI.
Background: Environmental sampling and detection methods for fungi in healthcare settings are not well established. We previously refined methods for fungal sampling and detection in a controlled laboratory environment and aimed to validate them in a real-world healthcare setting. Methods: We performed a microbiological analysis of air and surfaces in three inpatient units at a tertiary care center. Surface samples were obtained with foam sponges from 3 locations in patient rooms (patient bedrails, bathroom floor, HVAC export) and 5 locations in units (HVAC exports 3x, clean linen storage, soiled linen storage). Air samples were taken with an active air sampler directly below HVAC exports. Sponges were processed using the stomacher technique. Samples underwent DNA extraction followed by qPCR with FungiQuant primers targeting the 18S rRNA gene. Amplicons from positive samples were sequenced (NextSeq 1000, 300 bp PE), and SmartGene databases were used to interpret sequence data. For comparison to culture methods, samples were also plated onto Sabouraud and HardyCHROM Candida + auris media. Fungal growth underwent DNA extraction, 18S PCR, and Sanger sequencing for genus and species identification. Results: A total of 85 samples were obtained from 15 patient rooms and three units, comprising 61 surface and 24 air samples. Patients in study rooms had a median age of 53, 9 (60%) were male, and no patients had an invasive fungal infection during their hospital encounter. In total, 44 (53%) and 39 (46%) samples were positive for fungi via qPCR and culture, respectively. Of the 44 positive qPCR samples, microbiome analyses identified at least one fungus to the species, genus, and family levels in 43 (98%), 28 (64%), and 18 (41%) samples, respectively (Table 1). A total of 114 isolates were identified, of which the most common were Malassezia restricta (30 [26%]), Malassezia globosa (29 [25%]), and Penicillium paradoxum (4 [4%]). Thirty-nine genera were identified, of which the most common were Mucor (19 [49%]) and Candida (8 [21%]). Of the 39 culture-positive samples, 90 total isolates were recovered. The most common species were Paradendryphiella arenariae (19 [21%]), Aspergillus niger (12 [13%]), and Penicillium commune (12 [13%]). Conclusion: These results demonstrate the presence of diverse fungal species in both air and surface samples across inpatient units. Higher sensitivity was noted with qPCR; however, the identified genera and species differed markedly between qPCR and culture methods. Larger studies are needed to assess the efficacy of qPCR for fungal detection in the healthcare environment.
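The abstract compares qPCR and culture positivity on the same set of samples. One way such a paired comparison could be summarized is sketched below (an agreement table plus McNemar's test); this is an illustrative analysis on made-up per-sample results, not the authors' reported workflow.

```python
import numpy as np
import pandas as pd
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical per-sample detection results (True = fungi detected).
rng = np.random.default_rng(1)
samples = pd.DataFrame({
    "qpcr_positive": rng.random(85) < 0.53,
    "culture_positive": rng.random(85) < 0.46,
})

# Positivity by method and the 2x2 agreement table for paired samples.
print(samples.mean())  # proportion positive by each method
agreement = pd.crosstab(samples["qpcr_positive"], samples["culture_positive"])
print(agreement)

# McNemar's test asks whether the two methods disagree symmetrically.
result = mcnemar(agreement.to_numpy(), exact=True)
print(f"McNemar p-value: {result.pvalue:.3f}")
```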
Biomedical research on advanced cryopreservation has spillover effects on innovation in the food and agricultural sector. Advanced biopreservation technology has three key domains of impact in the food system: (1) improved efficiencies in the storage and utilization of gametes and organoids for plant and animal breeding; (2) isochoric methods for the preservation of fresh food products; and (3) biorepositories for the storage of genetic resources for agriculturally significant plants and livestock species.
We evaluated sampling and detection methods for fungal contamination on healthcare surface materials, comparing the efficacy of foam sponges, flocked swabs, and Replicate Organism Detection And Counting (RODAC) plates alongside culture-based quantification and quantitative polymerase chain reaction (qPCR). Findings indicate that sponge sampling and qPCR detection performed best, providing a foundation for future studies seeking to inform surveillance practices for fungi.
Operative cancellations adversely affect patient health and impose resource strain on the healthcare system. Here, our objective was to describe neurosurgical cancellations at five Canadian academic institutions.
Methods:
The Canadian Neurosurgery Research Collaborative performed a retrospective cohort study capturing neurosurgical procedure cancellation data at five Canadian academic centres, during the period between January 1, 2014 and December 31, 2018. Demographics, procedure type, reason for cancellation, admission status and case acuity were collected. Cancellation rates were compared on the basis of demographic data, procedural data and between centres.
Results:
Overall, 7,734 cancellations were captured across five sites. Mean age of the aggregate cohort was 57.1 ± 17.2 years. The overall procedure cancellation rate was 18.2%. The five-year neurosurgical operative cancellation rate differed between Centres 1 and 2 (Centre 1: 25.9%; Centre 2: 13.0%; p = 0.008). Female patients less frequently experienced procedural cancellation. Elective, outpatient and spine procedures were more often cancelled. Reasons for cancellation included surgeon-related factors (28.2%), cancellation for a higher-acuity case (23.9%), patient condition (17.2%), other factors (17.0%), resource availability (7.0%), operating room running late (6.4%) and anaesthesia-related factors (0.3%). When clustered, the reason for cancellation was patient-related in 17.2%, staffing-related in 28.5% and operational or resource-related in 54.3% of cases.
Conclusions:
Neurosurgical operative cancellations were common and most often related to operational or resource-related factors. Elective, outpatient and spine procedures were more often cancelled. These findings highlight areas for optimizing efficiency and targeted quality improvement initiatives.