Spaceflight missions must limit biological contamination on both the outbound and return legs to comply with planetary protection requirements. Depending on the mission profile, contamination concerns may include the potential presence of bioactive molecules as defined by NASA’s Planetary Protection policies. Thus, the present study has examined the temperature and time requirements for sufficient inactivation/degradation of an infectious, heat-stable prion protein (Sup35NM), which serves as a model bioactive molecule. Bovine serum albumin was used to establish the method parameters and feasibility. Differential scanning calorimetry, Fourier transform infrared spectroscopy, analytical reversed-phase high-performance liquid chromatography, and mass spectrometry were utilized to analyze heat-treated samples, with non-treated samples serving as controls. Heat treatment at 400°C for 5 seconds was found to result in substantial decomposition of Sup35NM. In addition to the disruption of the protein backbone amide bonds, the side chain residues were also compromised. Fragments of molecular weight <4600 were observed by mass spectrometry, but the impact of treatment on both the backbone and side chains of Sup35NM suggested that these fragments would not self-associate to create potentially pathogenic entities. The present methodology provided insight into the protein degradation process and can be applied to a variety of treatment strategies (e.g., any form of sterilization or inactivation) to ensure a lack of protein-based contamination of isolated extraterrestrial specimens.
Blast injuries can occur by a multitude of mechanisms, including improvised explosive devices (IEDs), military munitions, and accidental detonation of chemical or petroleum stores. These injuries disproportionately affect people in low- and middle-income countries (LMICs), where there are often fewer resources to manage complex injuries and mass-casualty events.
Study Objective:
The aim of this systematic review is to describe the literature on the acute facility-based management of blast injuries in LMICs to aid hospitals and organizations preparing to respond to conflict- and non-conflict-related blast events.
Methods:
A search of Ovid MEDLINE, Scopus, Global Index Medicus, Web of Science, CINAHL, and Cochrane databases was used to identify relevant citations from January 1998 through July 2024. This systematic review was conducted in adherence with PRISMA guidelines. Data were extracted and analyzed descriptively. A meta-analysis calculated the pooled proportions of mortality, hospital admission, intensive care unit (ICU) admission, intubation and mechanical ventilation, and emergency surgery.
Results:
Reviewers screened 3,731 titles and abstracts and 173 full texts. Seventy-five articles from 22 countries were included for analysis. Only 14.7% of included articles came from low-income countries (LICs). Sixty percent of studies were conducted in tertiary care hospitals. The mean proportion of patients who were admitted was 52.1% (95% CI, 37.6% to 66.4%). Among all in-patients, 20.0% (95% CI, 12.4% to 28.8%) were admitted to an ICU. Overall, 38.0% (95% CI, 25.6% to 51.3%) of in-patients underwent emergency surgery and 13.8% (95% CI, 2.3% to 31.5%) were intubated. Pooled in-patient mortality was 9.5% (95% CI, 4.6% to 15.6%) and total hospital mortality (including emergency department [ED] mortality) was 7.4% (95% CI, 3.4% to 12.4%). There were no significant differences in mortality when stratified by country income level or hospital setting.
Conclusion:
Findings from this systematic review can be used to guide preparedness and resource allocation for acute care facilities. Pooled proportions for mortality and other outcomes described in the meta-analysis offer a metric by which future researchers can assess the impact of blast events. Under-representation of LICs and non-tertiary care medical facilities and significant heterogeneity in data reporting among published studies limited the analysis.
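For readers who want to reproduce the kind of pooling used in this review, the sketch below shows one common recipe for a random-effects meta-analysis of proportions: inverse-variance pooling of logit-transformed proportions with a DerSimonian–Laird estimate of between-study variance. The exact model used by the authors is not specified in the abstract, and the study counts in the code are placeholders, not data from the review.

```python
import math

# Hypothetical study-level data: (events, total) per study -- placeholders only.
studies = [(52, 110), (31, 48), (75, 160), (20, 95)]

# Logit-transform each proportion; variance of the logit is 1/events + 1/(total - events).
y = [math.log(e / (n - e)) for e, n in studies]
v = [1 / e + 1 / (n - e) for e, n in studies]

# Fixed-effect (inverse-variance) pooling, used to get Q for the DerSimonian-Laird tau^2.
w = [1 / vi for vi in v]
y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
df = len(studies) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)           # between-study variance
i2 = max(0.0, (q - df) / q) * 100       # heterogeneity (%)

# Random-effects weights and pooled logit with a 95% CI.
w_re = [1 / (vi + tau2) for vi in v]
y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
lo, hi = y_re - 1.96 * se_re, y_re + 1.96 * se_re

expit = lambda x: 1 / (1 + math.exp(-x))
print(f"pooled proportion = {expit(y_re):.3f} "
      f"(95% CI {expit(lo):.3f} to {expit(hi):.3f}), I^2 = {i2:.1f}%")
```

Back-transforming the pooled logit keeps the pooled proportion and its confidence limits between 0 and 1, which is why the logit scale is often preferred to pooling raw proportions.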
The maintenance of cross-cultural variation and arbitrary traditions in human populations is a key question in cultural evolution. Conformist transmission, the tendency to follow the majority, was previously considered central to this phenomenon. However, recent theory indicates that cognitive biases can greatly reduce its ability to maintain traditions. Therefore, we expanded prior models to investigate two other ways that cultural variation can be sustained: payoff-biased transmission and norm reinforcement. Our findings predict that both payoff-biased transmission and reinforcement can enhance conformist transmission's ability to maintain traditions. However, payoff-biased transmission can only sustain cultural variation if it is functionally related to environmental factors. In contrast, norm reinforcement readily generates and maintains arbitrary cultural variation. Furthermore, reinforcement results in path-dependent cultural dynamics, meaning that historical traditions influence current practices, even though group behaviours have changed. We conclude that environmental variation probably plays a role in functional cultural traditions, but arbitrary cultural variation is more plausibly due to the reinforcement of norm compliance.
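The abstract's claims rest on population-level transmission dynamics. As a purely illustrative aid (not the authors' model), the sketch below shows the textbook way conformist transmission can hold two groups at different trait frequencies despite migration, using the standard frequency-dependent update p' = p + D·p·(1 − p)·(2p − 1); the migration rate, conformity strength, and starting frequencies are arbitrary.

```python
# Minimal illustration (not the authors' model): conformist transmission in two
# groups linked by migration. D is conformity strength, m is the migration rate.
def simulate(D, m=0.05, generations=200):
    p = [0.9, 0.1]  # initial frequency of trait A in groups 1 and 2
    for _ in range(generations):
        # migration mixes the two groups
        p = [(1 - m) * p[0] + m * p[1], (1 - m) * p[1] + m * p[0]]
        # conformist (or unbiased, if D == 0) transmission within each group
        p = [pi + D * pi * (1 - pi) * (2 * pi - 1) for pi in p]
    return p

print("unbiased copying  :", [round(x, 3) for x in simulate(D=0.0)])
print("conformist copying:", [round(x, 3) for x in simulate(D=0.3)])
```

With D = 0 the two groups drift to the same 50/50 mixture, whereas a modest conformity strength keeps them polarized near opposite majorities, which is the sense in which conformity can maintain arbitrary between-group variation.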
Understanding characteristics of healthcare personnel (HCP) with SARS-CoV-2 infection supports the development and prioritization of interventions to protect this important workforce. We report detailed characteristics of HCP who tested positive for SARS-CoV-2 from April 20, 2020 through December 31, 2021.
Methods:
CDC collaborated with Emerging Infections Program sites in 10 states to interview HCP with SARS-CoV-2 infection (case-HCP) about their demographics, underlying medical conditions, healthcare roles, exposures, personal protective equipment (PPE) use, and COVID-19 vaccination status. We grouped case-HCP by healthcare role. To describe residential social vulnerability, we merged geocoded HCP residential addresses with CDC/ATSDR Social Vulnerability Index (SVI) values at the census tract level. We defined highest and lowest SVI quartiles as high and low social vulnerability, respectively.
Results:
Our analysis included 7,531 case-HCP. Most case-HCP with roles as certified nursing assistant (CNA) (444, 61.3%), medical assistant (252, 65.3%), or home healthcare worker (HHW) (225, 59.5%) reported their race and ethnicity as either non-Hispanic Black or Hispanic. More than one third of HHWs (166, 45.2%), CNAs (283, 41.7%), and medical assistants (138, 37.9%) reported a residential address in the high social vulnerability category. The proportion of case-HCP who reported using recommended PPE at all times when caring for patients with COVID-19 was lowest among HHWs compared with other roles.
Conclusions:
To mitigate SARS-CoV-2 infection risk in healthcare settings, infection prevention and control interventions should be specific to HCP roles and educational backgrounds. Additional interventions are needed to address high social vulnerability among HHWs, CNAs, and medical assistants.
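The residential social-vulnerability classification described in the Methods amounts to a tract-level join followed by a quartile cut. A minimal sketch is below; the column names and the use of the overall SVI percentile ranking with fixed 0.25/0.75 cut points are assumptions for illustration, not details taken from the study.

```python
import pandas as pd

# Hypothetical inputs: geocoded case-HCP records carrying a census-tract FIPS code,
# and tract-level SVI overall percentile rankings (0-1). Values are placeholders.
hcp = pd.DataFrame({"case_id": [1, 2, 3],
                    "tract_fips": ["01001020100", "06037206010", "36061010100"]})
svi = pd.DataFrame({"tract_fips": ["01001020100", "06037206010", "36061010100"],
                    "svi_percentile": [0.91, 0.12, 0.55]})

merged = hcp.merge(svi, on="tract_fips", how="left")

# Highest quartile -> high social vulnerability; lowest quartile -> low.
merged["vulnerability"] = pd.cut(merged["svi_percentile"],
                                 bins=[0.0, 0.25, 0.75, 1.0],
                                 labels=["low", "middle", "high"],
                                 include_lowest=True)
print(merged[["case_id", "svi_percentile", "vulnerability"]])
```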
The COVID-19 pandemic has disproportionally affected the mental health of health and social care workers (HSCWs), with many experiencing symptoms of depression, anxiety and post-traumatic stress disorder. Psychological interventions have been offered via mental health services and in-house psychology teams, but their effectiveness in this context is not well documented.
Aims
To evaluate a stepped-care psychological support pathway for HSCWs from Homerton Healthcare Foundation Trust in London, which offered psychological first aid, evidence-based psychological therapies and group-based well-being workshops.
Method
The service evaluation used a pre–post approach to assess depression, anxiety, functional impairment and post-traumatic stress disorder symptom change for those who attended sessions of psychological first aid, low- or high-intensity cognitive–behavioural therapy or a combination of these. In addition, the acceptability of the psychological first aid sessions and well-being workshops was explored via feedback data.
Results
Across all interventions, statistically significant reductions in depression (d = 1.33), anxiety (d = 1.37) and functional impairment (d = 0.93) were observed; these reductions were equivalent across the interventions and across demographic and occupational subgroups of HSCWs (ethnicity, staff group and redeployment status). HSCWs were highly satisfied with the psychological first aid and well-being workshops.
Conclusions
The evaluation supports the utility of evidence-based interventions delivered as part of a stepped-care pathway for HSCWs with common mental health problems in the context of the COVID-19 pandemic. Given the novel integration of psychological first aid within the stepped-care model as a step one intervention, replication and further testing in larger-scale studies are warranted.
The western Antarctic Peninsula harbours a diverse benthic marine community where dense canopies of macroalgae can dominate the shallow subtidal zone (0–40 m or greater). In the lower portion of this range (below 25–35 m depending on topography), invertebrates such as sponges and echinoderms can be found in greater abundance due to reduced competition for space from the algal species. Dendrilla antarctica (previously Dendrilla membranosa) is a common demosponge that thrives in both communities and is known for producing diterpene secondary metabolites as a defence against sympatric sea star and amphipod predators. Omnivorous mesograzers such as amphipods inhabit both communities; however, they are in greatest abundance within the macroalgal canopy. Due to the differences between habitats, it was hypothesized that specific amphipod species not susceptible to the defensive metabolites of D. antarctica would take refuge from predators in the chemically defended sponge. Analysis of the metabolome and amphipod communities from sponges in both habitats found correlations of metabolic profile to both abundance and habitat. These studies serve to inform our understanding of the complex ecosystem of the Antarctic benthos that stands to be dramatically altered by the rapidly changing climate in the years to come.
The aim of this study was to determine the in vitro digestibility and in sacco disappearance of dry matter (DM) and neutral detergent fibre (NDF) in total almond hulls (TAH), pure almond hulls (PAH) or Debris. The TAH were used because there are no data on the effect of debris (non-hull material) on the nutritional value of almond hulls. Twelve samples of commercial almond hulls were used, with one subsample representing the TAH and the other subsample hand sorted to separate the hulls (PAH) from Debris. Gas production and Ankom Daisy method were used to determine in vitro digestibility, while two rumen-fistulated cows were used to measure in sacco disappearance of PAH and TAH. For in vitro digestibility, both PAH and TAH were more digestible and had greater gas production than Debris. The PAH had greater in vitro true digestibility on a DM basis and NDF digestibility at 48 and 72 h compared with TAH. Nonpareil hulls provided greater metabolizable energy (ME) concentration when compared with other almond varieties, with PAH supplying numerically more ME than TAH for both varieties. For in sacco disappearance, PAH had greater DM and NDF disappearance along with a greater rate of disappearance for NDF compared with TAH. This research demonstrated that Debris is highly indigestible; therefore, hulling, agronomic and harvesting practices should be focused on reducing Debris in commercial almond hulls to improve their nutritive value as a feedstuff for livestock.
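The in sacco disappearance rates reported above are conventionally obtained by fitting an exponential degradation curve to disappearance measured at several incubation times. The sketch below fits the commonly used Ørskov–McDonald form p(t) = a + b(1 − e^(−ct)); the incubation times and disappearance values are invented placeholders, not data from this study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical incubation times (h) and DM disappearance (%) -- placeholders only.
t = np.array([0, 6, 12, 24, 48, 72], dtype=float)
disappearance = np.array([22.0, 41.0, 53.0, 66.0, 74.0, 76.0])

# Orskov-McDonald model: p(t) = a + b * (1 - exp(-c * t))
# a = soluble fraction, b = potentially degradable fraction, c = fractional rate.
def orskov(t, a, b, c):
    return a + b * (1 - np.exp(-c * t))

(a, b, c), _ = curve_fit(orskov, t, disappearance, p0=[20, 60, 0.05])
print(f"soluble fraction a = {a:.1f}%, degradable fraction b = {b:.1f}%, rate c = {c:.3f}/h")
```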
To estimate population-based rates and to describe clinical characteristics of hospital-acquired (HA) influenza.
Design:
Cross-sectional study.
Setting:
US Influenza Hospitalization Surveillance Network (FluSurv-NET) during 2011–2012 through 2018–2019 seasons.
Methods:
Patients were identified through provider-initiated or facility-based testing. HA influenza was defined as a positive influenza test date and respiratory symptom onset both occurring >3 days after admission. Patients with a positive test date >3 days after admission but a missing respiratory symptom onset date were classified as possible HA influenza.
Results:
Among 94,158 influenza-associated hospitalizations, 353 (0.4%) had HA influenza. The overall adjusted rate of HA influenza was 0.4 per 100,000 persons. Among HA influenza cases, 50.7% were 65 years of age or older, and 52.0% of children and 95.7% of adults had underlying conditions; 44.9% overall had received influenza vaccine prior to hospitalization. Overall, 34.5% of HA cases received ICU care during hospitalization, 19.8% required mechanical ventilation, and 6.7% died. After including possible HA cases, prevalence among all influenza-associated hospitalizations increased to 1.3% and the adjusted rate increased to 1.5 per 100,000 persons.
Conclusions:
Over 8 seasons, rates of HA influenza were low but were likely underestimated because testing was not systematic. A high proportion of patients with HA influenza were unvaccinated and had severe outcomes. Annual influenza vaccination and implementation of robust hospital infection control measures may help to prevent HA influenza and its impacts on patient outcomes and the healthcare system.
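The case definition in the Methods reduces to a simple date comparison, sketched below; the function and field names are illustrative, not the surveillance network's actual variables.

```python
from datetime import date

def classify_ha_influenza(admit, test_positive, symptom_onset):
    """Classify a hospitalization per the case definition above.

    admit and test_positive are dates; symptom_onset is a date or None if missing.
    """
    if (test_positive - admit).days <= 3:
        return "not hospital-acquired"
    if symptom_onset is None:
        return "possible HA influenza"   # positive test >3 days in, onset date missing
    if (symptom_onset - admit).days > 3:
        return "HA influenza"            # both test and onset >3 days after admission
    return "not hospital-acquired"

print(classify_ha_influenza(date(2019, 1, 1), date(2019, 1, 6), date(2019, 1, 5)))  # HA influenza
print(classify_ha_influenza(date(2019, 1, 1), date(2019, 1, 6), None))              # possible HA
```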
We described the epidemiology of bat intrusions into a hospital and subsequent management of exposures during 2018–2020. Most intrusions occurred in older buildings during the summer and fall months. Hospitals need bat intrusion surveillance systems and protocols for bat handling, exposure management, and intrusion mitigation.
Post-traumatic stress disorder (PTSD) has been identified as a potential risk factor for developing dementia. There are currently, however, no meta-analyses quantifying this risk.
Aims
To systematically review and quantify the risk of future dementia associated with PTSD across populations. PROSPERO registration number CRD42019130392.
Method
We searched nine electronic databases up to 25 October 2019 for longitudinal studies assessing PTSD and risk of dementia. We used random- and fixed-effects meta-analyses to pool estimates across studies.
Results
PTSD was associated with a significant risk for all-cause dementia: pooled hazard ratio (HR) = 1.61 (95% CI 1.43–1.81, I² = 85.8%, P < 0.001; n = 1 693 678; 8 studies). Pooled HR was 1.61 (95% CI 1.46–1.78; I² = 80.9%, P < 0.001; n = 905 896; 5 studies) in veterans, and 2.11 (95% CI 1.03–4.33, I² = 91.2%, P < 0.001; n = 787 782; 3 studies) in the general population. The association between PTSD and dementia remained significant after excluding studies with high risk of bias (HR = 1.55, 95% CI 1.39–1.73, I² = 83.9%, P < 0.001; n = 1 684 928; 7 studies). Most studies included were retrospective and there was evidence of high heterogeneity.
Conclusions
This is the first meta-analysis quantifying the association of PTSD and risk of dementia showing that PTSD is a strong and potentially modifiable risk factor for all-cause dementia. Future studies investigating potential causal mechanisms, and the protective value of treating PTSD are needed.
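The pooled hazard ratios above come from inverse-variance meta-analysis on the log scale. The sketch below shows a generic version of that calculation, recovering each study's standard error from its reported 95% CI and applying a DerSimonian–Laird random-effects model; the three (HR, CI) triples are placeholders, not the studies included in this review.

```python
import math

# Hypothetical (HR, lower 95% CI, upper 95% CI) triples -- placeholders only.
studies = [(1.5, 1.2, 1.9), (1.8, 1.4, 2.3), (2.4, 1.1, 5.2)]

# Work on the log scale; recover each SE from the reported 95% CI width.
y = [math.log(hr) for hr, _, _ in studies]
se = [(math.log(u) - math.log(l)) / (2 * 1.96) for _, l, u in studies]
w = [1 / s ** 2 for s in se]

# Fixed-effect pooled log HR, then DerSimonian-Laird tau^2 and I^2.
y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - y_fe) ** 2 for wi, yi in zip(w, y))
df = len(studies) - 1
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)
i2 = 0.0 if q == 0 else max(0.0, (q - df) / q) * 100

# Random-effects pooled HR with a 95% CI.
w_re = [1 / (s ** 2 + tau2) for s in se]
y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
print(f"pooled HR = {math.exp(y_re):.2f} "
      f"(95% CI {math.exp(y_re - 1.96 * se_re):.2f}-{math.exp(y_re + 1.96 * se_re):.2f}), "
      f"I^2 = {i2:.1f}%")
```

The I² printed by the sketch is the same heterogeneity statistic quoted in the Results above.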
Introduction: Emergency care serves as an important health resource for First Nations (FN) persons. Previous reporting shows that FN persons visit emergency departments at almost double the rate of non-FN persons. In collaboration with FN partners, academic researchers, and health authority staff, the objective of this study was to investigate FN emergency care patient visit statistics in Alberta over a five-year period. Methods: Through a population-based retrospective cohort study for the period from April 1, 2012 to March 31, 2017, patient demographics and emergency care visit characteristics for status FN patients in Alberta were analyzed and compared to non-FN statistics. Frequencies and percentages (%) describe patients and visits by categorical variables (e.g., Canadian Triage Acuity Scale (CTAS)). Means and standard deviations (medians and interquartile ranges (IQR)) describe continuous variables (e.g., distances) as appropriate for the data distribution. These descriptions are repeated for the FN and non-FN populations, separately. Results: The data set contains 11,686,288 emergency facility visits by 3,024,491 unique persons. FN people make up 4.8% of unique patients and 9.4% of emergency care visits. FN persons live farther from emergency facilities than their non-FN counterparts (FN median 6 km, IQR 1-24; vs. non-FN median 4 km, IQR 2-8). FN visits arrive more often by ground ambulance (15.3% vs. 10.0%). FN visits are more commonly triaged as less acute (59% CTAS levels 4 and 5, compared to non-FN 50.4%). More FN visits end in leaving without completing treatment (6.7% vs. 3.6%). FN visits occur more often in the evening, 4:01pm to 12:00am (43.6% vs. 38.1%). Conclusion: In a collaborative validation session, FN Elders and health directors contextualized emergency care presentation in evenings and receiving less acute triage scores as related to difficulties accessing primary care. They explained presentation in evenings, arrival by ambulance, and leaving without completing treatment in terms of issues accessing transport to and from emergency facilities. Many factors interact to determine FN patients’ emergency care visit characteristics and outcomes. Further research needs to separate the impact of FN identity from factors such as reasons for visiting emergency facilities, distance traveled to care, and the size of facility where care is provided.
Sir Ernest Henry Shackleton’s Imperial Trans-Antarctic Expedition (ITAE), 1914–1917, consisted of two parties – a Weddell Sea party led by Shackleton with Endurance, and a supporting Ross Sea depot-laying party, led by Captain Aeneas L.A. Mackintosh with Aurora. The purpose of this research paper is to consider why the Ross Sea party contracted scurvy and the Weddell Sea party did not. The authors suggest that for the Ross Sea shore party there was ineffectual leadership, insufficient medical care, sledging with excessive loads, and a diet inadequate for sledging in both energy and vitamin C content. In their second season, depletion of vitamin C was again evident, with one person dying. The Weddell Sea party, ably led by Shackleton, not only faced the arduous task of sledging heavy stores and moving camps in thick snow, but also had to haul three boats over pressure ridges before reaching open water and rowing to Elephant Island. Here, the men lived almost exclusively on a fresh meat diet and were not affected by scurvy. This is the final paper of the trilogy commemorating the Ross Sea party centenary (the others are Harrowfield, 2013, 2015).
In response to concerns about acetolactate synthase (ALS) inhibitor–resistant weeds in wheat production systems, we explored the efficacy of managing Bromus spp., downy and Japanese bromes, in a winter wheat system using alternative herbicide treatments applied in either fall or spring. Trials were established at Lethbridge and Kipp, Alberta, and Scott, Saskatchewan, Canada over three growing seasons (2012–2014) to compare the efficacy of pyroxasulfone (a soil-applied very-long-chain fatty acid elongase inhibitor; WSSA Group 15) and flumioxazin (a protoporphyrinogen oxidase inhibitor; WSSA Group 14) against industry-standard ALS-inhibiting herbicides for downy and Japanese brome control. Winter wheat injury from herbicide application was minor, with the exception of flucarbazone application at Scott. Bromus spp. control was greatest with pyroxsulam and all herbicide treatments containing pyroxasulfone. Downy and Japanese bromes were controlled least by thiencarbazone and flumioxazin, respectively, whereas Bromus spp. had intermediate responses to the other herbicides tested. Herbicides applied in fall resulted in reduced winter wheat yield relative to the spring applications. Overall, pyroxasulfone or pyroxsulam provided the most efficacious Bromus spp. control compared with the other herbicides and consistently maintained optimal winter wheat yields. Therefore, pyroxasulfone could facilitate management of Bromus spp. resistant to ALS inhibitors in winter wheat in the southern growing regions of western Canada. Improved weed control and delayed herbicide resistance may be achieved when pyroxasulfone is applied in combination with flumioxazin.
Introduction: Emergency Departments (EDs) are frequently the first point of entry to access health services for First Nation (FN) members. In Alberta, FN members visit EDs at almost double the rate of non-FN persons. Furthermore, preliminary evidence demonstrates differences in ED experience for FN members as compared to the general population. The Alberta First Nations Information Governance Centre, Maskwacis Health Services, Yellowhead Tribal Council, Treaty 8 First Nations of Alberta, and Alberta Health Services are working together to research FN members’ ED experiences and concerns. Methods: This is participatory research guided by a two-eyed seeing approach that acknowledges the equal value of both Western and Indigenous worldviews. FN and non-FN leaders and researchers are full partners in the development of the research project. Six sharing circles will be held in February 2018 across Alberta, with Elders, FN patients, FN and non-FN clinicians, and FN and non-FN administrators. Sharing circles are similar to focus groups, but emphasize everyone having a turn to speak and demonstrating respect among participants in accordance with FN protocols. Elders will select the questions for discussion based on topics that arose in initial team meetings. Sharing circle discussions will be audio recorded and transcribed. Analysts will include both Western and Indigenous worldview researchers, who will collaboratively interpret findings. Elders will review, discuss, contextualize and expand upon study findings. The research is also guided by FN principles of Ownership, Control, Access, and Possession of FN information. It is through these principles that First Nation research projects can truly be classified as FN led and driven. Results: Based on initial team meeting discussions, results of sharing circles are expected to provide insights on issues such as: healing, patient-provider communication (verbal and non-verbal), shared decision making, respect for patient preferences, experiences leading to trust or distrust, understandings of wait times and triage, times when multiple (repeat) ED presentations occur, distances travelled for care, choosing specific EDs when seeking care, impacts of stereotypes about FN patients, and racism and reconciliation. Conclusion: Understanding FN ED experience and bringing FN perspectives to Western conceptions of the goals and provision of ED care are important steps toward reconciliation.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
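As a small illustration of the programmatic access route mentioned above, the sketch below queries the public Neotoma web API from Python; the endpoint path, query parameters, and response fields shown are assumptions based on the v2.0 API layout and should be checked against the current API documentation (the neotoma R package offers an equivalent route for R users).

```python
import requests

# Assumed base URL and parameters for the Neotoma v2.0 API -- verify against the docs.
BASE = "https://api.neotomadb.org/v2.0/data"

resp = requests.get(f"{BASE}/sites", params={"sitename": "Devil", "limit": 5}, timeout=30)
resp.raise_for_status()

# The v2.0 responses are assumed to wrap results in a "data" list of site records.
for site in resp.json().get("data", []):
    print(site.get("siteid"), site.get("sitename"))
```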
A series of wave instruments was deployed on first-year Antarctic sea ice during SIPEX (Sea Ice Physics and Ecosystem Experiment) II. Here we describe the hardware and software design of these instruments and give an overview of the returned dataset. Each instrument consisted of a high-resolution accelerometer coupled with a tri-axis inertial measurement unit, which was located using GPS. The significant wave heights measured near the ice edge were predominantly between 1 and 2 m. During the 6 weeks of data capture, several large wave events were measured. We report here a selection of events, highlighting the complexities associated with measuring wave decay at individual frequencies.
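As context for how significant wave height is typically derived from accelerometer records like these, the sketch below converts a vertical-acceleration spectrum to a displacement spectrum and applies Hs = 4√m0; it is a generic illustration with synthetic data, not the processing chain used for the SIPEX II instruments.

```python
import numpy as np
from scipy.signal import welch

fs = 10.0                                    # sample rate (Hz), illustrative
t = np.arange(0, 1800, 1 / fs)               # 30-minute synthetic record
accel = 0.4 * np.sin(2 * np.pi * 0.1 * t)    # synthetic 10-s swell, placeholder data

# Acceleration spectral density, then restrict to the wave band to avoid
# amplifying low-frequency noise when dividing by omega^4.
f, s_aa = welch(accel, fs=fs, nperseg=2048)
band = (f >= 0.03) & (f <= 0.5)
omega = 2 * np.pi * f[band]
s_zz = s_aa[band] / omega ** 4               # displacement spectrum

df = f[1] - f[0]
m0 = np.sum(s_zz) * df                       # zeroth spectral moment
hs = 4 * np.sqrt(m0)
print(f"significant wave height ~ {hs:.2f} m")
```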
In field experiments, the Russian knapweed nematode formed no galls on artichoke, formed one sterile gall on American basketflower, and produced few nematodes in numerous small galls on southwestern basketflower, in comparison with much higher nematode production per gall on Russian knapweed.
Experiments were conducted from 1994 through 1996 to determine the response of the rice cultivars ‘Bengal,’ ‘Cypress,’ ‘Jodon,’ and ‘Kaybonnet’ to triclopyr at 0.42 (standard rate) and 0.84 kg ai/ha applied postemergence at the four-leaf and panicle initiation stages of growth. Applications at the four-leaf stage were made in close association with fertilization and flood establishment, which often increases the potential for triclopyr to injure rice. Visible injury from triclopyr was slightly higher for the cultivar Jodon than for the cultivars Bengal, Cypress, or Kaybonnet. Injury was 3% or less when triclopyr at 0.42 kg/ha was applied at panicle initiation regardless of the cultivar. Triclopyr at 0.42 and 0.84 kg/ha applied at the four-leaf growth stage injured rice 7% and 22%, respectively. Triclopyr at 0.84 kg/ha applied at the four-leaf stage of growth increased the number of days from seedling emergence to seed head emergence and reduced rice grain yield, irrespective of cultivar.
Field experiments were conducted in 1982 through 1986 to evaluate the effect of six postemergence grass herbicides, clethodim, the methyl ester of diclofop, the butyl ester of fluazifop-P, the methyl ester of haloxyfop, the methyl ester of quizalofop, and sethoxydim, on 31 seedling grass species. All herbicides except diclofop controlled most of the species tested. None of the herbicides controlled rattail fescue or red fescue. Only clethodim and haloxyfop controlled annual bluegrass.