Thermo-responsive hydrogels are smart materials that rapidly switch between hydrophilic (swollen) and hydrophobic (shrunken) states when heated past a threshold temperature, resulting in order-of-magnitude changes in gel volume. Modelling the dynamics of this switch is notoriously difficult and typically involves fitting a large number of microscopic material parameters to experimental data. In this paper, we present and validate an intuitive, macroscopic description of responsive gel dynamics and use it to explore the shrinking, swelling and pumping of responsive hydrogel displacement pumps for microfluidic devices. We finish with a discussion on how such tubular structures may be used to speed up the response times of larger hydrogel smart actuators and unlock new possibilities for dynamic shape change.
Residual herbicides are primarily degraded in the soil through microbial breakdown. Any practices that result in increased soil biological activity, such as cover cropping (between cash crop seasons), could lead to a reduced persistence of herbicides in the soil. Furthermore, cover crops can also interfere with herbicide fate by interception. Field trials were conducted between 2020 and 2023 in a corn (Zea mays L.)–soybean [Glycine max (L.) Merr.] rotation to investigate the influence of cover crop (cereal rye [Secale cereale L.] and crimson clover [Trifolium incarnatum L.]) use on soil enzyme activities (β-glucosidase [BG] and dehydrogenase [DHA]), its effect on the concentration of residual herbicides (sulfentrazone, S-metolachlor, cloransulam-methyl, atrazine, and mesotrione) in the soil, and the interception of herbicides by cover crop residue. The use of cover crops occasionally resulted in increased BG and DHA activities relative to the fallow treatment. However, even when there was an increase in the activity of these two enzymes, increased degradation of the residual herbicides was not observed. The initial concentrations of all residual herbicides in the soil were significantly reduced due to interception by cereal rye biomass. Nevertheless, significant reductions in early-season weed biomass were observed when residual herbicides were included in the tank mixture applied at cover crop termination relative to the application of glyphosate plus glufosinate. Results from this research suggest that the use of cereal rye or crimson clover as cover crops (between cash crop seasons) does not impact the persistence of residual herbicides in the soil or reduce their efficacy in controlling weeds early in the growing season.
This study introduces the prostate cancer linear energy transfer sensitivity index (PCLSI) as a novel method to predict relative biological effectiveness (RBE) in proton therapy for prostate cancer, using linear energy transfer (LET) and screening for DNA repair mutations.
Materials and Methods:
Five prostate cancer cell lines with DNA repair mutations known to cause sensitivity to LET and DNA repair inhibitors were examined using published data. Relative Du145 LET sensitivity data were leveraged to deduce the LET equivalent of olaparib doses. The PCLSI model was built using three of the prostate cancer cell lines (LNCaP, 22Rv1 and Du145) with DNA mutation frequency from patient cohorts. The PCLSI model was compared against two established RBE models, McNamara and McMahon, for LET-optimized prostate cancer treatment plans.
Results:
The PCLSI model relies on the presence of mutations in eight genes: AR, ATM, BRCA1, BRCA2, CDH1, ETV1, PTEN and TP53, which are most likely to predict increased LET sensitivity and RBE in proton therapy. In the LET-optimized plan, the PCLSI model indicates that prostate cancer cells with these DNA repair mutations are more sensitive to increased LET than predicted by the McNamara and McMahon RBE models, with expected RBE increases ranging from 11% to 33% at 2 keV/µm.
Conclusions:
The PCLSI model predicts increasing RBE as a function of LET in the presence of certain genetic mutations. The integration of LET-optimized proton therapy and genetic mutation profiling could be a significant step toward the use of individualized medicine to improve outcomes using RBE escalation without the potential toxicity of physical dose escalation.
To quantify the impact of patient- and unit-level risk adjustment on infant hospital-onset bacteremia (HOB) standardized infection ratio (SIR) ranking.
Design:
A retrospective, multicenter cohort study.
Setting and participants:
Infants admitted to 284 neonatal intensive care units (NICUs) in the United States between 2016 and 2021.
Methods:
Expected HOB rates and SIRs were calculated using four adjustment strategies: birthweight (model 1), birthweight and postnatal age (model 2), birthweight and NICU complexity (model 3), and birthweight, postnatal age, and NICU complexity (model 4). Sites were ranked according to the unadjusted HOB rate, and these rankings were compared to rankings based on the four adjusted SIR models.
Results:
Compared to unadjusted HOB rate ranking (smallest to largest), the number and proportion of NICUs that left the fourth quartile (worst-performing) following adjustments were as follows: adjusted for birthweight (16, 22.5%), birthweight and postnatal age (19, 26.8%), birthweight and NICU complexity (22, 31.0%), and birthweight, postnatal age and NICU complexity (23, 32.4%). Comparing NICUs that moved into the better-performing quartiles after birthweight adjustment to those that remained in the better-performing quartiles regardless of adjustment, the median percentage of low-birthweight infants was 17.1% (interquartile range (IQR): 15.8, 19.2) vs 8.7% (IQR: 4.8, 12.6), and the median percentage of infants who died was 2.2% (IQR: 1.8, 3.1) vs 0.5% (IQR: 0.01, 12.0), respectively.
Conclusion:
Adjusting for patient and unit-level complexity moved one-third of NICUs in the worst-performing quartile into a better-performing quartile. Risk adjustment may allow for a more accurate comparison across units with varying levels of patient acuity and complexity.
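The ranking comparison described above can be sketched numerically. The following is a minimal illustration with simulated data: all counts, rates, and the birthweight-based adjustment below are hypothetical stand-ins, not the study's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units = 284  # number of NICUs in the cohort

# Hypothetical unit-level data: observed HOB events and patient-days.
patient_days = rng.integers(5_000, 50_000, n_units)
observed = rng.poisson(patient_days * 0.0003)

# Stand-in for model 1: expected events scale with each unit's share of
# low-birthweight infants (a proxy for patient-level risk).
low_bw_share = rng.uniform(0.05, 0.20, n_units)
expected = patient_days * 0.0002 * (1 + 4 * low_bw_share)

unadjusted_rate = observed / patient_days
sir = observed / expected  # standardized infection ratio

def quartile(values):
    # Rank smallest to largest and split into quartiles 1 (best) to 4 (worst).
    ranks = values.argsort().argsort()
    return ranks * 4 // len(values) + 1

# Units in the worst quartile by raw rate that improve after adjustment.
moved_out = np.sum((quartile(unadjusted_rate) == 4) & (quartile(sir) < 4))
print(f"{moved_out} units left the worst-performing quartile after adjustment")
```

The same quartile-shift logic extends to the study's richer models by changing how `expected` is computed.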
The global population and status of Snowy Owls Bubo scandiacus are particularly challenging to assess because individuals are irruptive and nomadic, and the breeding range is restricted to the remote circumpolar Arctic tundra. The International Union for Conservation of Nature (IUCN) uplisted the Snowy Owl to “Vulnerable” in 2017 because the suggested population estimates appeared considerably lower than historical estimates, and it recommended actions to clarify the population size, structure, and trends. Here we present a broad review and status assessment, an effort led by the International Snowy Owl Working Group (ISOWG) and researchers from around the world, to estimate population trends and the current global status of the Snowy Owl. We use long-term breeding data, genetic studies, satellite-GPS tracking, and survival estimates to assess current population trends at several monitoring sites in the Arctic and we review the ecology and threats throughout the Snowy Owl range. An assessment of the available data suggests that current estimates of a worldwide population of 14,000–28,000 breeding adults are plausible. Our assessment of population trends at five long-term monitoring sites suggests that breeding populations of Snowy Owls in the Arctic have decreased by more than 30% over the past three generations and the species should continue to be categorised as Vulnerable under the IUCN Red List Criterion A2. We offer research recommendations to improve our understanding of Snowy Owl biology and future population assessments in a changing world.
The gut microbiome is impacted by certain types of dietary fibre. However, the type, duration and dose needed to elicit gut microbial changes, and whether these changes also influence microbial metabolites, remain unclear. This study investigated the effects of supplementing healthy participants with two types of non-digestible carbohydrates (resistant starch (RS) and polydextrose (PD)) on the stool microbiota and microbial metabolite concentrations in plasma, stool and urine, as secondary outcomes in the Dietary Intervention Stem Cells and Colorectal Cancer (DISC) Study. The DISC study was a double-blind, randomised controlled trial that supplemented healthy participants with RS and/or PD or placebo for 50 d in a 2 × 2 factorial design. DNA was extracted from stool samples collected pre- and post-intervention, and V4 16S rRNA gene sequencing was used to profile the gut microbiota. Metabolite concentrations were measured in stool, plasma and urine by high-performance liquid chromatography. A total of fifty-eight participants with paired samples available were included. After 50 d, no effects of RS or PD were detected on gut microbiota diversity (alpha- and beta-diversity), genus relative abundance or metabolite concentrations. However, a Dirichlet multinomial mixture clustering-based approach suggested that some participants changed microbial enterotype post-intervention. The gut microbiota and faecal, plasma and urinary microbial metabolites were stable in response to a 50-d fibre intervention in middle-aged adults. Larger and longer studies, including those which explore the effects of specific fibre sub-types, may be required to determine the relationships between fibre intake, the gut microbiome and host health.
COVID-19 changed the epidemiology of community-acquired respiratory viruses. We explored patterns of respiratory viral testing to understand which tests are most clinically useful in the postpandemic era.
Methods:
We conducted a retrospective observational study of discharge data from PINC-AI (formerly Premier), a large administrative database. Use of multiplex nucleic acid amplification respiratory panels in acute care, including small (2–5 targets), medium (6–11), and large panels (>11), was compared between the early pandemic (03/2020–10/2020), late pandemic (11/2020–04/2021), and prepandemic respiratory season (11/2019–02/2020) using ANOVA.
Results:
A median of 160.5 facilities contributed testing data per quarter (IQR 155.5–169.5). Prepandemic, facilities averaged 103 respiratory panels monthly (sd 138), including 79 large (sd 126), 7 medium (sd 31), and 16 small panels (sd 73). Relative to prepandemic, utilization decreased during the early pandemic (62 panels monthly/facility; sd 112) but returned to the prepandemic baseline by the late pandemic (107 panels monthly/facility; sd 211). Relative to prepandemic, late pandemic testing involved more small panel use (58 monthly/facility, sd 156) and less large panel use (47 monthly/facility, sd 116). Comparisons among periods demonstrated significant differences in overall testing (P < 0.0001), large panel use (P < 0.0001), and small panel use (P < 0.0001).
Conclusions:
Postpandemic, clinical use of respiratory panel testing shifted from predominantly large panels to predominantly small panels. Factors driving this change may include resource availability, costs, and the clinical utility of targeting important pathogenic viruses instead of testing “for everything.”
Major depressive disorder (MDD) imposes a tremendous global disease burden and is the leading cause of disability worldwide. Unfortunately, individuals diagnosed with MDD typically experience a delayed response to traditional antidepressants, and many do not adequately respond to pharmacotherapy, even after multiple trials. The critical need for novel antidepressant treatments has led to a recent resurgence in the clinical application of psychedelics and intravenous ketamine, which has been investigated as a rapid-acting treatment for treatment-resistant depression (TRD) as well as acute suicidal ideation and behavior. However, variations in the type and quality of experimental design, as well as a range of treatment outcomes in clinical trials of ketamine, make interpretation of this large body of literature challenging.
Objectives
This umbrella review aims to advance our understanding of the effectiveness of intravenous ketamine as a pharmacotherapy for TRD by providing a systematic, quantitative, large-scale synthesis of the empirical literature.
Methods
We performed a comprehensive PubMed search for peer-reviewed meta-analyses of primary studies of intravenous ketamine used in the treatment of TRD. Meta-analyses and primary studies were then screened by two independent coding teams according to pre-established inclusion criteria as well as PRISMA and METRICS guidelines. We then employed metaumbrella, a statistical package developed in R, to perform effect size calculations and conversions as well as statistical tests.
Results
In a large-scale analysis of 1,182 participants across 51 primary studies, repeated-dose administration of intravenous ketamine demonstrated statistically significant effects (p<0.05) compared to placebo-controlled as well as other experimental conditions in patients with TRD, as measured by standardized clinician-administered and self-report depression symptom severity scales.
Conclusions
This study provides large-scale, quantitative support for the effectiveness of intravenous, repeated-dose ketamine as a therapy for TRD and a report of the relative effectiveness of several treatment parameters across a large and rapidly growing literature. Future investigations should use similar analytic tools to examine evidence-stratified conditions and the comparative effectiveness of other routes of administration and treatment schedules as well as the moderating influence of other clinical and demographic variables on the effectiveness of ketamine on TRD and suicidal ideation and behavior.
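The effect size calculations and conversions that such a synthesis relies on are standard. Below is a minimal sketch of the standardized mean difference (Cohen's d) and its small-sample correction (Hedges' g); the trial-arm summary statistics are hypothetical, chosen only to illustrate the arithmetic.

```python
import math

def cohens_d(m1, s1, n1, m2, s2, n2):
    # Standardized mean difference using the pooled standard deviation.
    pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

def hedges_g(d, n1, n2):
    # Small-sample bias correction (approximate correction factor J).
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return j * d

# Hypothetical end-of-trial depression-scale means (ketamine vs placebo arms).
d = cohens_d(12.0, 6.0, 30, 18.0, 6.0, 30)
g = hedges_g(d, 30, 30)
print(f"d = {d:.2f}, g = {g:.2f}")  # prints: d = -1.00, g = -0.99
```

A negative value here favours the treatment arm because lower scale scores indicate less severe symptoms.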
There has been rapidly growing interest in understanding the pharmaceutical and clinical properties of psychedelic and dissociative drugs, with a particular focus on ketamine. This compound, long known for its anesthetic and dissociative properties, has garnered attention due to its potential to rapidly alleviate symptoms of depression, especially in individuals with treatment-resistant depression (TRD) or acute suicidal ideation or behavior. However, while ketamine’s psychopharmacological effects are increasingly well-documented, the specific patterns of its neural impact remain a subject of exploration and basic questions remain about its effects on functional activation in both clinical and healthy populations.
Objectives
This meta-analysis seeks to contribute to the evolving landscape of neuroscience research on dissociative drugs such as ketamine by comprehensively examining the effects of acute ketamine administration on neural activation, as measured by functional magnetic resonance imaging (fMRI), in healthy participants.
Methods
We conducted a meta-analysis of existing fMRI activation studies of ketamine using multilevel kernel density analysis (MKDA). Following a comprehensive PubMed search, we quantitatively synthesized all published primary fMRI whole-brain activation studies of the effects of ketamine in healthy subjects with no overlapping samples (N=18). This approach also incorporated ensemble thresholding (α=0.05-0.0001) to minimize cluster-size detection bias and Monte Carlo simulations to correct for multiple comparisons.
Results
Our meta-analysis revealed statistically significant (p<0.05-0.0001; FWE-corrected) alterations in neural activation in multiple cortical and subcortical regions following the administration of ketamine to healthy participants (N=306).
Conclusions
These results offer valuable insights into the functional neuroanatomical effects caused by acute ketamine administration. These findings may also inform development of therapeutic applications of ketamine for various psychiatric and neurological conditions. Future studies should investigate the neural effects of ketamine administration, including both short-term and long-term effects, in clinical populations and their relation to clinical and functional improvements.
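The Monte Carlo correction for multiple comparisons used in MKDA-style analyses can be illustrated in miniature. This 1-D sketch (map size and number of suprathreshold "voxels" are hypothetical; real analyses operate on 3-D brain volumes) builds a null distribution of maximum cluster sizes from random maps and derives a family-wise-error cluster-extent threshold:

```python
import numpy as np

rng = np.random.default_rng(42)
n_voxels, n_sims, alpha = 1000, 2000, 0.05

def max_cluster_size(binary_map):
    # Length of the longest contiguous run of suprathreshold voxels.
    best = run = 0
    for v in binary_map:
        run = run + 1 if v else 0
        best = max(best, run)
    return best

# Null distribution: scatter the same number of suprathreshold voxels at
# random and record the largest cluster that arises by chance.
n_active = 50
null_max = np.empty(n_sims, dtype=int)
for i in range(n_sims):
    m = np.zeros(n_voxels, dtype=bool)
    m[rng.choice(n_voxels, n_active, replace=False)] = True
    null_max[i] = max_cluster_size(m)

# Observed clusters must exceed this size to survive FWE correction at alpha.
threshold = int(np.quantile(null_max, 1 - alpha))
print("FWE cluster-size threshold:", threshold)
```

Ensemble thresholding repeats this procedure across a range of voxel-level alphas rather than committing to a single one.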
Bipolar I disorder (BD-I) is a chronic and recurrent mood disorder characterized by alternating episodes of depression and mania; it is also associated with substantial morbidity and mortality and with clinically significant functional impairments. While previous studies have used functional magnetic resonance imaging (fMRI) to examine neural abnormalities associated with BD-I, they have yielded mixed findings, perhaps due to differences in sampling and experimental design, including highly variable mood states at the time of scan.
Objectives
The purpose of this study is to advance our understanding of the neural basis of BD-I and mania, as measured by fMRI activation studies, and to inform the development of more effective brain-based diagnostic systems and clinical treatments.
Methods
We conducted a large-scale meta-analysis of whole-brain fMRI activation studies that compared participants with BD-I, assessed during a manic episode, to age-matched healthy controls. Following PRISMA guidelines, we conducted a comprehensive PubMed literature search using two independent coding teams to evaluate primary studies according to pre-established inclusion criteria. We then used multilevel kernel density analysis (MKDA), a well-established, voxel-wise, whole-brain, meta-analytic approach, to quantitatively synthesize all qualifying primary fMRI activation studies of mania. We used ensemble thresholding (p<0.05-0.0001) to minimize cluster size detection bias, and 10,000 Monte Carlo simulations to correct for multiple comparisons.
Results
We found that participants with BD-I (N=2,042), during an active episode of mania and relative to age-matched healthy controls (N=1,764), exhibit a pattern of significantly (p<0.05-0.0001; FWE-corrected) different activation in multiple brain regions of the cerebral cortex and basal ganglia across a variety of experimental tasks.
Conclusions
This study supports the formulation of a robust neural basis for BD-I during manic episodes and advances our understanding of the pattern of abnormal activation in this disorder. These results may inform the development of novel brain-based clinical tools for bipolar disorder such as diagnostic biomarkers, non-invasive brain stimulation, and treatment-matching protocols. Future studies should compare the neural signatures of BD-I to other related disorders to facilitate the development of protocols for differential diagnosis and improve treatment outcomes in patients with BD-I.
Attention-deficit/hyperactivity disorder (ADHD) is a highly prevalent psychiatric condition that frequently originates in early development and is associated with a variety of functional impairments. Despite a large functional neuroimaging literature on ADHD, our understanding of the neural basis of this disorder remains limited, and existing primary studies on the topic include somewhat divergent results.
Objectives
The present meta-analysis aims to advance our understanding of the neural basis of ADHD by identifying the most statistically robust patterns of abnormal neural activation throughout the whole-brain in individuals diagnosed with ADHD compared to age-matched healthy controls.
Methods
We conducted a meta-analysis of task-based functional magnetic resonance imaging (fMRI) activation studies of ADHD. Following PRISMA guidelines, this included a comprehensive PubMed search, predetermined inclusion criteria, and two independent coding teams who evaluated studies and included all task-based, whole-brain fMRI activation studies that compared participants diagnosed with ADHD to age-matched healthy controls. We then performed multilevel kernel density analysis (MKDA), a well-established, whole-brain, voxelwise approach that quantitatively combines existing primary fMRI studies, with ensemble thresholding (p<0.05-0.0001) and multiple comparisons correction.
Results
Participants diagnosed with ADHD (N=1,550), relative to age-matched healthy controls (N=1,340), exhibited statistically significant (p<0.05-0.0001; FWE-corrected) patterns of abnormal activation in multiple brain regions of the cerebral cortex and basal ganglia across a variety of cognitive control tasks.
Conclusions
This study advances our understanding of the neural basis of ADHD and may aid in the development of new brain-based clinical interventions as well as diagnostic tools and treatment matching protocols for patients with ADHD. Future studies should also investigate the similarities and differences in neural signatures between ADHD and other highly comorbid psychiatric disorders.
Fructose-containing sugars can exaggerate postprandial lipaemia and stimulate hepatic de novo lipogenesis (DNL) when compared to glucose-based carbohydrates(1). Galactose has recently been shown to increase postprandial lipaemia compared to glucose(2), but mechanisms remain uncharacterised. The aim of this study was to assess the effect and mechanisms of lactose-induced lipaemia.
Twenty-four non-obese adults (12 male and 12 female) completed three trials in a randomised, crossover design (28 ± 7-day washout). During trials, participants consumed test drinks containing 50 g fat with 100 g of carbohydrate. The control carbohydrate was a glucose polymer (maltodextrin), the experimental carbohydrate was a galactose-containing carbohydrate (lactose) and the active comparator was a fructose-containing carbohydrate (sucrose). Hepatic DNL was assessed by the 2H2O method and [U-13C]-palmitate was added to the test drink to trace the fate of the ingested fat. Blood and breath samples were taken to determine plasma metabolite and hormone concentrations, in addition to plasma and breath 2H and 13C enrichments. Data were converted into incremental area under the curve (iAUC) values and were checked for normality by visual inspection of residuals. Differences between trials were assessed by one-way ANOVA. Where a main effect of trial was detected, post-hoc t-tests were performed to determine which trials differed from lactose according to the principle of closed-loop testing.
The plasma triacylglycerol iAUC (mean ± SD) in response to maltodextrin was 51 ± 68 mmol/L*360 min. Following lactose ingestion, plasma triacylglycerol iAUC increased to 98 ± 88 mmol/L*360 min (p<0.001 vs maltodextrin), which was comparable to sucrose [90 ± 95 mmol/L*360 min (p=0.41 vs lactose)]. Hepatic DNL in response to maltodextrin was 6.6 ± 3.0%. Following ingestion of lactose, hepatic DNL increased to 12.4 ± 6.9% (p=0.02 vs maltodextrin), which was comparable to sucrose [12.2 ± 6.9% (p=0.96 vs lactose)]. Exhaled 13CO2 in response to maltodextrin was 10.4 ± 4.1 mmol/kgFFM*360 min. Following ingestion of lactose, exhaled 13CO2 was 8.8 ± 4.9 mmol/kgFFM*360 min (p=0.09 vs maltodextrin), which was lower than sucrose [11.1 ± 3.9 mmol/kgFFM*360 min (p=0.01 vs lactose)].
These data are consistent with the hypothesis that hepatic de novo lipogenesis contributes to both lactose and sucrose-induced lipaemia and provide a rationale to investigate the longer-term effects of lactose and sucrose on metabolism.
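The incremental area under the curve used to summarise the postprandial responses above is a trapezoidal integral of the rise above the baseline (time-zero) value. A minimal sketch with a hypothetical triacylglycerol curve over the study's 360-min window:

```python
import numpy as np

def incremental_auc(times, values):
    # Trapezoidal rule applied to the deviation from the baseline value;
    # dips below baseline contribute negative area.
    t = np.asarray(times, dtype=float)
    dv = np.asarray(values, dtype=float) - values[0]
    return float(np.sum((dv[1:] + dv[:-1]) / 2 * np.diff(t)))

# Hypothetical postprandial plasma triacylglycerol concentrations (mmol/L).
t = [0, 60, 120, 180, 240, 300, 360]
tag = [1.0, 1.3, 1.6, 1.5, 1.4, 1.2, 1.1]
print(f"iAUC = {incremental_auc(t, tag):.0f} mmol/L x min")  # prints: iAUC = 123 mmol/L x min
```

Because baseline is subtracted, iAUC isolates the meal-induced excursion from between-participant differences in fasting concentrations.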
Medical researchers are increasingly prioritizing the inclusion of underserved communities in clinical studies. However, mere inclusion is not enough. People from underserved communities frequently experience chronic stress that may lead to accelerated biological aging and early morbidity and mortality. It is our hope and intent that the medical community come together to engineer improved health outcomes for vulnerable populations. Here, we introduce Health Equity Engineering (HEE), a comprehensive scientific framework to guide research on the development of tools to identify individuals at risk of poor health outcomes due to chronic stress, the integration of these tools within existing healthcare system infrastructures, and a robust assessment of their effectiveness and sustainability. HEE is anchored in the premise that strategic intervention at the individual level, tailored to the needs of the most at-risk people, can pave the way for achieving equitable health standards at a broader population level. HEE provides a scientific framework guiding health equity research to equip the medical community with a robust set of tools to enhance health equity for current and future generations.
Diagnostic stewardship is increasingly recognized as a powerful tool to improve patient safety. Given the close relationship between diagnostic testing and antimicrobial misuse, antimicrobial stewardship (AMS) pharmacists should be key members of the diagnostic team. Pharmacists practicing in AMS already frequently engage with clinicians to improve the diagnostic process and have many skills needed for the implementation of diagnostic stewardship initiatives. As diagnostic stewardship becomes more broadly used, all infectious disease clinicians, including pharmacists, must collaborate to optimize patient care.
High-quality evidence is lacking for the impact on healthcare utilisation of short-stay alternatives to psychiatric inpatient services for people experiencing acute and/or complex mental health crises (known in England as psychiatric decision units [PDUs]). We assessed the extent to which changes in psychiatric hospital and emergency department (ED) activity were explained by implementation of PDUs in England using a quasi-experimental approach.
Methods
We conducted an interrupted time series (ITS) analysis of weekly aggregated data pre- and post-PDU implementation in one rural and two urban sites using segmented regression, adjusting for temporal and seasonal trends. Primary outcomes were changes in the number of voluntary inpatient admissions to (acute) adult psychiatric wards and number of ED adult mental health-related attendances in the 24 months post-PDU implementation compared to that in the 24 months pre-PDU implementation.
Results
The two PDUs (one urban and one rural) with longer (average) stays and high staff-to-patient ratios observed post-PDU decreases in the pattern of weekly voluntary psychiatric admissions relative to pre-PDU trend (Rural: −0.45%/week, 95% confidence interval [CI] = −0.78%, −0.12%; Urban: −0.49%/week, 95% CI = −0.73%, −0.25%); PDU implementation in each was associated with an estimated 35–38% reduction in total voluntary admissions in the post-PDU period. The (urban) PDU with the highest throughput, lowest staff-to-patient ratio and shortest average stay observed a 20% (−20.4%, 95% CI = −29.7%, −10.0%) level reduction in mental health-related ED attendances post-PDU, although there was little impact on long-term trend. Pooled analyses across sites indicated a significant reduction in the number of voluntary admissions following PDU implementation (−16.6%, 95% CI = −23.9%, −8.5%) but no significant (long-term) trend change (−0.20%/week, 95% CI = −0.74%, 0.34%) and no short- (−2.8%, 95% CI = −19.3%, 17.0%) or long-term (0.08%/week, 95% CI = −0.13%, 0.28%) effects on mental health-related ED attendances. Findings were largely unchanged in secondary (ITS) analyses that considered the introduction of other service initiatives in the study period.
Conclusions
The introduction of PDUs was associated with an immediate reduction of voluntary psychiatric inpatient admissions. The extent to which PDUs change long-term trends of voluntary psychiatric admissions or impact on psychiatric presentations at ED may be linked to their configuration. PDUs with a large capacity, short length of stay and low staff-to-patient ratio can positively impact ED mental health presentations, while PDUs with longer length of stay and higher staff-to-patient ratios have potential to reduce voluntary psychiatric admissions over an extended period. Taken as a whole, our analyses suggest that when establishing a PDU, consideration of the primary crisis-care need that underlies the creation of the unit is key.
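The segmented regression at the heart of an interrupted time series analysis estimates an immediate level change and a change in trend at the point of implementation. A minimal sketch on simulated weekly counts (all parameter values hypothetical, and without the seasonal adjustment used in the study):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical series: 104 weeks pre- and 104 weeks post-implementation.
weeks = np.arange(208)
post = (weeks >= 104).astype(float)          # level-change indicator
time_since = np.where(weeks >= 104, weeks - 104, 0.0)  # post-implementation slope

# Simulate a flat pre-implementation trend, an immediate level drop, and a
# gradual post-implementation decline, plus noise.
y = 100 - 15 * post - 0.3 * time_since + rng.normal(0, 5, weeks.size)

# Segmented regression: intercept, pre-trend, level change, trend change.
X = np.column_stack([np.ones(weeks.size), weeks.astype(float), post, time_since])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, pre_trend, level_change, trend_change = coef
print(f"level change: {level_change:.1f}, trend change: {trend_change:.2f}/week")
```

In practice the model would also include seasonal terms and an autocorrelation-robust variance estimator, but the level/trend decomposition above is the core of the approach.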
Transient acquisition of methicillin-resistant Staphylococcus aureus (MRSA) on healthcare personnel (HCP) gloves and gowns following patient care has been examined. However, the potential for transmission to the subsequent patient has not been studied. We explored the frequency of MRSA transmission from patient to HCP, and then in separate encounters from contaminated HCP gloves and gowns to a subsequent simulated patient as well as the factors associated with these 2 transmission pathways.
Methods:
We conducted a prospective cohort study with 2 parts. In objective 1, we studied MRSA transmission from random MRSA-positive patients to HCP gloves and gowns after specific routine patient care activities. In objective 2, we simulated subsequent transmission from random HCP gloves and gowns without hand hygiene to the next patient using a manikin proxy.
Results:
For the first objective, among 98 MRSA-positive patients with 333 randomly selected individual patient–HCP interactions, HCP gloves or gowns were contaminated in 54 interactions (16.2%). In a multivariable analysis, performing endotracheal tube care had the greatest odds of glove or gown contamination (OR, 4.06; 95% CI, 1.3–12.6 relative to physical examination). For the second objective, after 147 simulated HCP–patient interactions, the subsequent transmission of MRSA to the manikin proxy occurred 15 times (10.2%).
Conclusion:
After caring for a patient with MRSA, contamination of HCP gloves and gowns and transmission to subsequent patients following HCP–patient interactions occur frequently if contact precautions are not used. Proper infection control practices, including the use of gloves and gowns, can prevent this potential subsequent transmission.
Bentonites are readily available clays used in the livestock industry as feed additives to reduce aflatoxin (AF) exposure; their potential interaction with nutrients is the main concern limiting their use, however. The objective of the present study was to determine the safety of a dietary sodium-bentonite (Na-bentonite) supplement as a potential AF adsorbent, using juvenile Sprague Dawley (SD) rats as a research model. Animals were fed either a control diet or a diet containing Na-bentonite at 0.25% and 2% (w/w) inclusion rate. Growth, serum, and blood biochemical parameters, including selected serum vitamins (A and E) and elements such as calcium (Ca), potassium (K), iron (Fe), and zinc (Zn) were measured. The mineral characteristics and the aflatoxin B1 sorption capacity of Na-bentonite were also determined. By the end of the study, males gained more weight than females in control and Na-bentonite groups (p ≤ 0.0001); the interaction between treatment and sex was not significant (p = 0.6780), however. Some significant differences between the control group and bentonite treatments were observed in serum biochemistry and vitamin and minerals measurements; however, parameters fell within reference clinical values reported for SD rats and no evidence of dose-dependency was found. Serum Na and Na/K ratios were increased, while K levels were decreased in males and females from Na-bentonite groups. Serum Zn levels were decreased only in males from Na-bentonite treatments. Overall, results showed that inclusion of Na-bentonite at 0.25% and 2% did not cause any observable toxicity in a 3-month rodent study.
Acid-base titrations and attenuated total reflectance-infrared (ATR-IR) spectroscopy of solutions containing Zn(NO3)2 and the herbicide 3-amino-1,2,4-triazole suggested that soluble complexes ZnL2+ and Zn(OH)L+ form, where L represents aminotriazole. Sorption experiments and modeling in systems containing K-saturated Wyoming (SWy-K) montmorillonite suggest that at low concentrations the aminotriazole sorbs primarily in cationic form via an ion-exchange mechanism. Sorption isotherms for aminotriazole are S-shaped, indicating a co-operative sorption mechanism as the concentration of the molecule increases. At higher concentrations, ATR-IR spectroscopy indicated the presence of cationic and neutral triazole molecules on the surface, while X-ray diffraction data suggest interaction with interlayer regions of the clay. When the concentration of the herbicide was high, initial sorption of aminotriazole cations modified the clay to make the partitioning of neutral molecules to the surface more favorable. Experiments conducted in the presence of Zn(II) indicated that below pH 7, Zn(II) and aminotriazole compete for sorption sites, while above pH 7 the presence of Zn(II) enhances the uptake of aminotriazole. The enhancement was attributed to the formation of an inner-sphere ternary surface complex at hydroxyl sites (SOH) on crystal edges, having the form [(SOZn(OH)L)]0.