Objective:
To compare rates of Clostridioides difficile infection (CDI) recurrence after an initial occurrence treated with tapered enteral vancomycin versus standard vancomycin.
Design:
Retrospective cohort study.
Setting:
Community health system.
Patients:
Adults ≥18 years of age hospitalized with positive C. difficile polymerase chain reaction or toxin enzyme immunoassay who were prescribed either standard 10–14 days of enteral vancomycin four times daily or a 12-week tapered vancomycin regimen.
Methods:
Retrospective propensity score pair-matched cohort study. Groups were matched on age (<65 or ≥65 years) and receipt of non-C. difficile antibiotics during hospitalization or within 6 months post-discharge. Recurrence rates were analyzed via logistic regression conditioned on matched pairs and reported as conditional odds ratios. The primary outcome was the rate of CDI recurrence with tapered versus standard vancomycin for treatment of initial CDI.
Results:
The CDI recurrence rate at 6 months was 5.3% (4/75) in the taper cohort versus 28% (21/75) in the standard vancomycin cohort. The median time to CDI recurrence was 115 days versus 20 days in the taper and standard vancomycin cohorts, respectively. After adjusting for matching, patients in the taper arm were less likely to experience CDI recurrence at 6 months than those receiving standard vancomycin (cOR = 0.19, 95% CI 0.07–0.56, p < 0.002).
Conclusions:
Larger prospective trials are needed to elucidate the clinical utility of tapered oral vancomycin as a treatment option to achieve sustained clinical cure in first occurrences of CDI.
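For a 1:1 matched design like this one, the conditional odds ratio from pair-conditioned logistic regression reduces to the ratio of outcome-discordant pairs (a McNemar-type estimate). A minimal sketch with made-up pair counts — not the study's data — illustrates the calculation:

```python
# Conditional OR for 1:1 matched pairs reduces to the ratio of
# outcome-discordant pairs. Pair counts below are hypothetical.
# Each pair records (taper_recurred, standard_recurred).
pairs = [(0, 1)] * 18 + [(1, 0)] * 3 + [(0, 0)] * 52 + [(1, 1)] * 2

# Concordant pairs (0,0) and (1,1) contribute nothing to the
# conditional likelihood; only discordant pairs matter.
taper_only = sum(1 for t, s in pairs if t == 1 and s == 0)
standard_only = sum(1 for t, s in pairs if t == 0 and s == 1)

cor = taper_only / standard_only  # conditional odds ratio
print(f"cOR = {cor:.2f}")
```

A cOR below 1 favors the taper arm; in practice, confidence intervals would come from the exact binomial distribution of the discordant pairs or from conditional logistic regression software.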
While clozapine has risks, relative risk of fatality is overestimated. The UK pharmacovigilance programme is efficient, but comparisons with other drugs can mislead because of reporting variations. Clozapine actually lowers mortality, partly by reducing schizophrenia-related suicides, but preventable deaths still occur. Clozapine should be used earlier and more widely, but there should be better monitoring and better management of toxicity.
Seismic imaging in 3-D holds great potential for improving our understanding of ice sheet structure and dynamics. Conducting 3-D imaging in remote areas is simplified by using lightweight and logistically straightforward sources. We report results from controlled seismic source tests carried out near the West Antarctic Ice Sheet Divide investigating the characteristics of two types of surface seismic sources, Poulter shots and detonating cord, for use in both 2-D and 3-D seismic surveys on glaciers. Both source types produced strong basal P-wave and S-wave reflections and multiples recorded in three components. The Poulter shots had a higher amplitude for low frequencies (<10 Hz) and comparable amplitude at high frequencies (>50 Hz) relative to the detonating cord. Amplitudes, frequencies, speed of source set-up, and cost all suggested Poulter shots to be the preferred surface source compared to detonating cord for future 2-D and 3-D seismic surveys on glaciers.
The brain can be represented as a network, with nodes as brain regions and edges as region-to-region connections. Nodes with the most connections (hubs) are central to efficient brain function. Current findings on structural differences in Major Depressive Disorder (MDD) identified using network approaches remain inconsistent, potentially due to small sample sizes. It is still uncertain at what level of the connectome hierarchy differences may exist, and whether they are concentrated in hubs, disrupting fundamental brain connectivity.
Methods
We utilized two large cohorts, UK Biobank (UKB, N = 5104) and Generation Scotland (GS, N = 725), to investigate MDD case–control differences in brain network properties. Network analysis was done across four hierarchical levels: (1) global, (2) tier (nodes grouped into four tiers based on degree) and rich club (between-hub connections), (3) nodal, and (4) connection.
Results
In UKB, reductions in network efficiency were observed in MDD cases globally (d = −0.076, pFDR = 0.033), across all tiers (d = −0.069 to −0.079, pFDR = 0.020), and in hubs (d = −0.080 to −0.113, pFDR = 0.013–0.035). No differences in rich club organization and region-to-region connections were identified. The effect sizes and direction for these associations were generally consistent in GS, albeit not significant in our lower-N replication sample.
Conclusion
Our results suggest that the brain's fundamental rich club structure is similar in MDD cases and controls, but subtle topological differences exist across the brain. Consistent with recent large-scale neuroimaging studies, our findings offer a connectomic perspective on a similar scale and support the idea that differences between MDD cases and controls are minimal.
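Network efficiency, the metric driving the group differences reported above, is the average inverse shortest-path length over node pairs. A minimal sketch for a small unweighted, undirected graph (the study's connectomes are weighted, so this is illustrative only, and `adj` is a hypothetical toy network):

```python
from collections import deque

def global_efficiency(adj):
    """Mean of 1/shortest-path-length over all ordered node pairs."""
    n = len(adj)
    total = 0.0
    for src in adj:
        # BFS gives shortest path lengths in an unweighted graph.
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for node, d in dist.items() if node != src)
    return total / (n * (n - 1))

# Toy 4-node path graph 0-1-2-3: efficiency is 13/18 ≈ 0.72.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(global_efficiency(adj))
```

Lower efficiency, as observed in the MDD cases, corresponds to longer average shortest paths between brain regions.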
We argue that Madole & Harden's distinction between shallow versus deep genetic causes can bring some clarity to causal claims arising from genome-wide association studies (GWASs). However, the authors argue that GWAS only finds shallow genetic causes, making GWAS commensurate with the environmental studies they hope to supplant. We also assess whether their distinction applies best to explanations or causes.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. 
Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Tackling Scotland's drug-related deaths and improving outcomes from substance misuse treatments, including residential rehabilitation, is a national priority.
Aims
To analyse and report outcomes up to 4 years after attendance at a substance misuse residential rehabilitation programme (Lothians and Edinburgh Abstinence Programme).
Method
In total, 145 participants were recruited to this longitudinal quantitative cohort study of an abstinence-based residential rehabilitation programme based on the therapeutic community model; 87 of these participants were followed up at 4 years. Outcomes are reported for seven subsections of the Addiction Severity Index-X (ASI-X), together with frequency of alcohol use, heroin use, injecting drug use and rates of abstinence from substances of misuse.
Results
Significant improvements in most outcomes were found at 4 years compared with admission scores. Completing the programme was associated with greater rates of abstinence, reduced alcohol use and improvements in alcohol status score (Mann–Whitney U = 626, P = 0.013), work satisfaction score (U = 596, P = 0.016) and psychiatric status score (U = 562, P = 0.007) on the ASI-X, in comparison with non-completion. Abstinence rates improved from 12% at baseline to 48% at 4 years, with the rate for those completing the programme increasing from 14.5% to 60.7% (χ2(2, 87) = 9.738, P = 0.002). Remaining abstinent from substances at follow-up was associated with better outcomes in the medical (U = 540, P < 0.001), psychiatric (U = 273.5, P < 0.001) and alcohol (U = 322.5, P < 0.001) subsections of the ASI-X.
Conclusions
Attending this abstinence-based rehabilitation programme was associated with positive changes in psychological and social well-being and harm reduction from substance use at 4-year follow-up, with stability of change from years 1 to 4.
Delayed cerebral ischemia (DCI) is a complication of aneurysmal subarachnoid hemorrhage (aSAH) and is associated with significant morbidity and mortality. There is little high-quality evidence available to guide the management of DCI. The Canadian Neurosurgery Research Collaborative (CNRC) comprises resident physicians who are positioned to capture national, multi-site data. The objective of this study was to evaluate the practice patterns of Canadian physicians regarding the management of aSAH and DCI.
Methods:
We performed a cross-sectional survey of Canadian neurosurgeons, intensivists, and neurologists who manage aSAH. A 19-question electronic survey (Survey Monkey) was developed and validated by the CNRC following a DCI-related literature review (PubMed, Embase). The survey was distributed to members of the Canadian Neurosurgical Society and to Canadian members of the Neurocritical Care Society. Responses were analyzed using quantitative and qualitative methods.
Results:
The response rate was 129/340 (38%). Agreement among respondents was limited to the need for intensive care unit admission, the use of clinical and radiographic monitoring, and prophylaxis for the prevention of DCI. Several inconsistencies were identified. Indications for starting hyperdynamic therapy varied. There was discrepancy in the proportion of patients felt to require intravenous milrinone, intra-arterial vasodilators, or physical angioplasty for the treatment of DCI. Most respondents reported that their facility does not utilize a standardized definition of DCI.
Conclusion:
DCI is an important clinical entity whose management lacks homogeneity and standardization among Canadian practitioners. The CNRC calls for the development of national standards for the definition, identification, and treatment of DCI.
Animal and human data demonstrate independent relationships between fetal growth, hypothalamic-pituitary-adrenal axis (HPA-A) function, and adult cardiometabolic outcomes. While the association between fetal growth and adult cardiometabolic outcomes is well established, the role of the HPA-A in this relationship is unclear. This study aimed to determine whether HPA-A function mediates or moderates the relationship. Approximately 2,900 pregnant women were recruited between 1989 and 1991 in the Raine Study. Detailed anthropometric data were collected at birth (per cent optimal birthweight [POBW]). The Trier Social Stress Test was administered to the offspring (Generation 2; Gen2) at 18 years, and HPA-A responses were classified (reactive responders [RR], anticipatory responders [AR] and non-responders [NR]). Cardiometabolic parameters (BMI, systolic blood pressure [sBP] and LDL cholesterol) were measured at 20 years. Regression modelling demonstrated linear associations between POBW and both BMI and sBP; a quadratic association was observed for LDL cholesterol. For every 10% increase in POBW, there was a 0.54 unit increase in BMI (standard error [SE] 0.15) and a 0.65 unit decrease in sBP (SE 0.34). The interaction between participants' fetal growth and HPA-A phenotype was strongest for sBP in young adulthood; interactions for BMI and LDL cholesterol were non-significant. Decomposition of the total effect revealed no causal evidence of mediation or moderation.
Fluting is a technological and morphological hallmark of some of the most iconic North American Paleoindian stone points. Through decades of detailed artifact analyses and replication experiments, archaeologists have spent considerable effort reconstructing how flute removals were achieved, and they have explored possible explanations of why fluting was such an important aspect of early point technologies. However, the end of fluting has been less thoroughly researched. In southern North America, fluting is recognized as a diagnostic characteristic of Clovis points dating to approximately 13,000 cal yr BP, the earliest widespread use of fluting. One thousand years later, fluting occurs more variably in Dalton and is no longer useful as a diagnostic indicator. How did fluting change, and why did point makers eventually abandon fluting? In this article, we use traditional 2D measurements, geometric morphometric (GM) analysis of 3D models, and 2D GM of flute cross sections to compare Clovis and Dalton point flute and basal morphologies. The significant differences observed show that fluting in Clovis was highly standardized, suggesting that fluting may have functioned to improve projectile durability. Because Dalton points were used increasingly as knives and other types of tools, maximizing projectile functionality became less important. We propose that fluting in Dalton is a vestigial technological trait retained beyond its original functional usefulness.
Field studies were conducted to determine the effects of synthetic auxin herbicides at simulated exposure rates applied to ‘Covington’ sweetpotato propagation beds on the quality of nonrooted stem cuttings (slips). Treatments included diglycolamine salt of dicamba, 2,4-D choline plus nonionic surfactant (NIS), and 2,4-D choline plus glyphosate at 1/10, 1/33, or 1/66 of a 1X application rate (560 g ae ha−1 dicamba, 1,065 g ae ha−1 2,4-D choline, 1,130 g ae ha−1 glyphosate) applied at 2 or 4 wk after first slip harvest (WASH). Injury to sweetpotato 2 wk after treatment was greatest when herbicides were applied 2 WASH (21%) compared to 4 WASH (16%). More slip injury was caused by 2,4-D choline than by dicamba, and the addition of glyphosate did not increase injury over 2,4-D choline alone. Two weeks after the second application, sweetpotato slips were cut 2 cm above the soil surface and transplanted into production fields. In 2019, sweetpotato ground coverage 8 wk after transplanting was reduced 37% and 26% by the 1/10X rates of dicamba and 2,4-D choline plus NIS, respectively. Though dicamba caused less injury to propagation beds than 2,4-D choline with or without glyphosate, after transplanting, slips treated with 1/10X dicamba did not recover as quickly as those treated with 2,4-D choline. In 2020, sweetpotato ground coverage was 90% or greater for all treatments. Dicamba applied 2 WASH decreased marketable sweetpotato storage root yield by 59% compared to the nontreated check, whereas treatments including 2,4-D choline reduced marketable yield 22% to 29%. All herbicides applied at 4 WASH reduced marketable yield 31% to 36%. The addition of glyphosate to 2,4-D choline did not increase sweetpotato yield. Results indicate that caution should be taken when deciding whether to transplant sweetpotato slips that are suspected to have been exposed to dicamba or 2,4-D choline.
Field studies were conducted in North Carolina in 2018 and 2019 to determine sweetpotato tolerance to indaziflam and its effectiveness in controlling Palmer amaranth in sweetpotato. Treatments included indaziflam applied pretransplant, 7 d after transplanting (DATr), or 14 DATr at 29, 44, 58, or 73 g ai ha−1, plus weedy and weed-free checks. Indaziflam applied postemergence caused transient foliar injury to sweetpotato. Indaziflam applied pretransplant caused less injury to sweetpotato than the other application timings regardless of rate. Palmer amaranth control was greatest when indaziflam was applied pretransplant or 7 DATr. In a weed-free environment, sweetpotato marketable yield decreased as indaziflam application was delayed. No differences in storage root length-to-width ratio were observed.
The 2020 update of the Canadian Stroke Best Practice Recommendations (CSBPR) for the Secondary Prevention of Stroke includes current evidence-based recommendations and expert opinions intended for use by clinicians across a broad range of settings. They provide guidance for the prevention of ischemic stroke recurrence through the identification and management of modifiable vascular risk factors. Recommendations address triage, diagnostic testing, lifestyle behaviors, vaping, hypertension, hyperlipidemia, diabetes, atrial fibrillation, other cardiac conditions, antiplatelet and anticoagulant therapies, and carotid and vertebral artery disease. This update of the previous 2017 guideline contains several new or revised recommendations. Recommendations regarding triage and initial assessment of acute transient ischemic attack (TIA) and minor stroke have been simplified, and selected aspects of the etiological stroke workup are revised. Updated treatment recommendations based on new evidence have been made for dual antiplatelet therapy for TIA and minor stroke; anticoagulant therapy for atrial fibrillation; embolic strokes of undetermined source; low-density lipoprotein lowering; hypertriglyceridemia; diabetes treatment; and patent foramen ovale management. A new section has been added to provide practical guidance regarding temporary interruption of antithrombotic therapy for surgical procedures. Cancer-associated ischemic stroke is addressed. A section on virtual care delivery of secondary stroke prevention services is included to highlight a shifting paradigm of care delivery made more urgent by the global pandemic. In addition, where appropriate, sex differences as they pertain to treatments have been addressed. The CSBPR include supporting materials such as implementation resources to facilitate the adoption of evidence into practice and performance measures to enable monitoring of uptake and effectiveness of recommendations.
While negative affect reliably predicts binge eating, it is unknown how this association may decrease or ‘de-couple’ during treatment for binge eating disorder (BED), whether such change is greater in treatments targeting emotion regulation, or how such change predicts outcome. This study utilized multi-wave ecological momentary assessment (EMA) to assess changes in the momentary association between negative affect and subsequent binge-eating symptoms during Integrative Cognitive Affective Therapy (ICAT-BED) and Cognitive Behavior Therapy Guided Self-Help (CBTgsh). It was predicted that there would be stronger de-coupling effects in ICAT-BED compared to CBTgsh given the focus on emotion regulation skills in ICAT-BED and that greater de-coupling would predict outcomes.
Methods
Adults with BED were randomized to ICAT-BED or CBTgsh and completed 1-week EMA protocols and the Eating Disorder Examination (EDE) at pre-treatment, end-of-treatment, and 6-month follow-up (final N = 78). De-coupling was operationalized as a change in momentary associations between negative affect and binge-eating symptoms from pre-treatment to end-of-treatment.
Results
There was a significant de-coupling effect at follow-up but not at end-of-treatment, and de-coupling did not differ between ICAT-BED and CBTgsh. Less de-coupling was associated with higher EDE global scores at end-of-treatment and higher binge frequency at follow-up.
Conclusions
Both ICAT-BED and CBTgsh were associated with de-coupling of momentary negative affect and binge-eating symptoms, which in turn relate to cognitive and behavioral treatment outcomes. Future research is warranted to identify differential mechanisms of change across ICAT-BED and CBTgsh. Results also highlight the importance of developing momentary interventions to more effectively de-couple negative affect and binge eating.
Exposure to glucocorticoid levels higher than appropriate for the current developmental stage induces offspring metabolic dysfunction. Overfed/obese (OB) ewes and their fetuses display elevated blood cortisol, while fetal adrenocorticotropic hormone (ACTH) remains unchanged. We hypothesized that OB pregnancies would show increased placental 11β-hydroxysteroid dehydrogenase 2 (11β-HSD2), which converts maternal cortisol to fetal cortisone as it crosses the placenta, and increased abundance of the 11β-HSD system components responsible for peripheral tissue cortisol production, providing a mechanism for an ACTH-independent increase in circulating fetal cortisol. Control ewes ate 100% of National Research Council recommendations (CON) and OB ewes ate 150% of the CON diet from 60 days before conception until necropsy at day 135 of gestation. At necropsy, maternal jugular and umbilical venous blood, fetal liver, perirenal fat, and cotyledonary tissues were harvested. Maternal plasma cortisol and fetal plasma cortisol and cortisone were measured. Protein abundance of 11β-HSD1, hexose-6-phosphate dehydrogenase (H6PD), and 11β-HSD2 in fetal liver, perirenal fat, and cotyledonary tissue was determined by Western blot. Maternal plasma cortisol, fetal plasma cortisol, and fetal cortisone were higher in OB vs. CON (p < 0.01). 11β-HSD2 protein was greater (p < 0.05) in OB cotyledonary tissue than in CON. 11β-HSD1 abundance was increased (p < 0.05) in OB vs. CON fetal liver and perirenal fat. Fetal H6PD, an 11β-HSD1 cofactor, was also increased (p < 0.05) in OB vs. CON perirenal fat and tended to be elevated in OB liver (p < 0.10). Our data provide evidence for increased abundance of 11β-HSD system components responsible for peripheral tissue cortisol production in fetal liver and adipose tissue, thereby providing a mechanism for an ACTH-independent increase in circulating fetal cortisol in OB fetuses.
The Apolipoprotein (APOE) ε4 allele increases the risk for mild cognitive impairment (MCI) and dementia, but not all carriers develop MCI/dementia. The purpose of this exploratory study was to determine if early and subtle preclinical signs of cognitive dysfunction and medial temporal lobe atrophy are observed in cognitively intact ε4 carriers who subsequently develop MCI.
Methods:
Twenty-nine healthy, cognitively intact ε4 carriers (ε3/ε4 heterozygotes; ages 65–85) underwent neuropsychological testing and MRI-based measurements of medial temporal volumes over a 5-year follow-up interval; data were converted to z-scores based on a non-carrier group consisting of 17 ε3/ε3 homozygotes.
Results:
At follow-up, 11 ε4 carriers (38%) converted to a diagnosis of MCI. At study entry, the MCI converters had significantly lower scores on the Mini-Mental State Examination, Rey Auditory Verbal Learning Test (RAVLT) Trials 1–5, and RAVLT Immediate Recall compared to non-converters. MCI converters also had smaller MRI volumes in the left subiculum than non-converters. Follow-up logistic regressions revealed that left subiculum volumes and RAVLT Trials 1–5 scores were significant predictors of MCI conversion.
Conclusions:
Results from this exploratory study suggest that ε4 carriers who convert to MCI exhibit subtle cognitive and volumetric differences years prior to diagnosis.
Palmer amaranth is the most common and troublesome weed in North Carolina sweetpotato. Field studies were conducted in Clinton, NC, in 2016 and 2017 to determine the critical timing of Palmer amaranth removal in ‘Covington’ sweetpotato. Palmer amaranth was grown with sweetpotato from transplanting until 2, 3, 4, 5, 6, 7, 8, or 9 wk after transplanting (WAP), after which plots were maintained weed-free for the remainder of the season. Palmer amaranth height and shoot dry biomass increased as removal was delayed. Season-long Palmer amaranth interference reduced marketable yields by 85% and 95% in 2016 and 2017, respectively. Sweetpotato yield displayed a strong inverse linear relationship with Palmer amaranth height: a 0.6% and 0.4% decrease in yield was observed for every centimeter of Palmer amaranth growth in 2016 and 2017, respectively. The critical timing of Palmer amaranth removal, based on a 5% loss of marketable yield, was determined by fitting a log-logistic model to the relative yield data and was found to be 2 WAP. These results show that Palmer amaranth is highly competitive with sweetpotato and should be managed as early as possible in the season. The early critical timing of weed removal emphasizes the importance of early-season scouting and Palmer amaranth removal in sweetpotato fields; any delay can result in substantial yield reductions and fewer premium-quality roots.
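The critical-timing calculation described above can be sketched by inverting a fitted log-logistic yield curve. The parameters below are hypothetical placeholders, not the study's fitted values; with them, the 5%-loss threshold happens to fall near 2 WAP:

```python
# Log-logistic model of relative yield (%) vs. weeks of weed
# interference: Y(t) = C + (D - C) / (1 + (t / e)**b).
# All parameter values are hypothetical, not the study's fits.
C, D = 10.0, 100.0  # lower/upper asymptotes (season-long loss vs. weed-free)
e, b = 6.0, 2.5     # inflection point (wk) and slope

def relative_yield(t):
    return C + (D - C) / (1 + (t / e) ** b)

def critical_timing(threshold=95.0):
    # Invert Y(t) = threshold: the removal time giving 5% yield loss.
    ratio = (D - C) / (threshold - C) - 1
    return e * ratio ** (1 / b)

t_crit = critical_timing()
print(f"critical removal timing ≈ {t_crit:.1f} WAP")
```

In practice the four parameters would be estimated from the relative-yield data (e.g., with nonlinear least squares) before inverting the curve at the chosen yield-loss threshold.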
Field studies were conducted in 2016 and 2017 in Clinton, NC, to determine the interspecific and intraspecific interference of Palmer amaranth (Amaranthus palmeri S. Watson) or large crabgrass [Digitaria sanguinalis (L.) Scop.] in ‘Covington’ sweetpotato [Ipomoea batatas (L.) Lam.]. Amaranthus palmeri and D. sanguinalis were established 1 d after sweetpotato transplanting and maintained season-long at 0, 1, 2, 4, 8 and 0, 1, 2, 4, 16 plants m−1 of row in the presence and absence of sweetpotato, respectively. Predicted yield loss for sweetpotato was 35% to 76% for D. sanguinalis at 1 to 16 plants m−1 of row and 50% to 79% for A. palmeri at 1 to 8 plants m−1 of row. Weed dry biomass per meter of row increased linearly with increasing weed density. Individual dry biomass of A. palmeri and D. sanguinalis was not affected by weed density when grown in the presence of sweetpotato. When grown without sweetpotato, individual weed dry biomass decreased 71% and 62% from 1 to 4 plants m−1 row for A. palmeri and D. sanguinalis, respectively. Individual weed dry biomass was not affected above 4 plants m−1 row to the highest densities of 8 and 16 plants m−1 row for A. palmeri and D. sanguinalis, respectively.