Patients with posttraumatic stress disorder (PTSD) exhibit smaller volumes in commonly reported brain regions, including the amygdala and hippocampus, which are associated with fear and memory processing. In the current study, we conducted a voxel-based morphometry (VBM) meta-analysis using whole-brain statistical maps with neuroimaging data from the ENIGMA-PGC PTSD working group.
Methods
T1-weighted structural neuroimaging scans from 36 cohorts (PTSD n = 1309; controls n = 2198) were processed using a standardized VBM pipeline (ENIGMA-VBM tool). We meta-analyzed the resulting statistical maps for voxel-wise differences in gray matter (GM) and white matter (WM) volumes between PTSD patients and controls, performed subgroup analyses considering the trauma exposure of the controls, and examined associations between regional brain volumes and clinical variables including PTSD (CAPS-4/5, PCL-5) and depression severity (BDI-II, PHQ-9).
Results
PTSD patients exhibited smaller GM volumes across the frontal and temporal lobes and cerebellum, with the most significant effect in the left cerebellum (Hedges' g = 0.22, p_corrected = .001), and smaller cerebellar WM volume (peak Hedges' g = 0.14, p_corrected = .008). We observed similar regional differences when comparing patients to trauma-exposed controls, suggesting these structural abnormalities may be specific to PTSD. Regression analyses revealed PTSD severity was negatively associated with GM volumes within the cerebellum (p_corrected = .003), while depression severity was negatively associated with GM volumes within the cerebellum and superior frontal gyrus in patients (p_corrected = .001).
Conclusions
PTSD patients exhibited widespread regional differences in brain volumes, with greater regional deficits appearing to reflect more severe symptoms. Our findings add to the growing literature implicating the cerebellum in PTSD psychopathology.
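The effect sizes reported above can be illustrated with a small sketch. This is not the ENIGMA-VBM pipeline itself, only the standard Hedges' g formula (Cohen's d with a small-sample bias correction) applied to hypothetical group summary statistics; all numbers below are invented for illustration.

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g: Cohen's d with a small-sample bias correction."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    # Small-sample correction factor J (common approximation)
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return j * d

# Hypothetical regional GM volumes (mL): controls vs. PTSD patients,
# with group sizes matching the study's cohorts
g = hedges_g(mean1=7.90, sd1=0.90, n1=2198, mean2=7.70, sd2=0.92, n2=1309)
```

With these invented inputs the function returns an effect of roughly the magnitude reported for the left cerebellum; in the actual meta-analysis, g is computed voxel-wise and then pooled across cohorts.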
The macro-social and environmental conditions in which people live, such as the level of a country’s development or inequality, are associated with brain-related disorders. However, the relationship between these systemic environmental factors and the brain remains unclear. We aimed to determine the association between the level of development and inequality of a country and the brain structure of healthy adults.
Methods
We conducted a cross-sectional study pooling brain imaging (T1-based) data from 145 magnetic resonance imaging (MRI) studies in 7,962 healthy adults (4,110 women) in 29 different countries. We used a meta-regression approach to relate the brain structure to the country’s level of development and inequality.
Results
Higher human development was consistently associated with larger hippocampi and greater global cortical surface area, particularly in frontal areas. Increased inequality was most consistently associated with smaller hippocampal volume and reduced cortical thickness across the brain.
Conclusions
Our results suggest that the macro-economic conditions of a country are reflected in its inhabitants’ brains and may explain the different incidence of brain disorders across the world. The observed variability of brain structure in health across countries should be considered when developing tools in the field of personalized or precision medicine that are intended to be used across the world.
To evaluate the impact of a mobile-app-based central line-associated bloodstream infection (CLABSI) prevention program in oncology clinic patients with peripherally inserted central catheters (PICCs).
Design:
Pre-post prospective cohort study with baseline (July 2015–December 2016), phase-in (January 2017–April 2017), and intervention (May 2017–November 2018) periods. Generalized linear mixed models compared intervention with baseline frequency of localized inflammation/infection and dressing peeling. Cox proportional hazards models compared days-to-removal of lines with localized inflammation/infection. Chi-square test compared bacteremia rates before and after intervention.
Setting:
Oncology clinic at a large medical center.
Patients:
Oncology clinic adult patients with PICCs.
Intervention:
CLABSI prevention program consisting of an actionable scoring system for identifying insertion site infection/inflammation coupled with a mobile-app enabling photo-assessments and automated physician alerting for remote response.
Results:
We completed 5,343 assessments of 569 PICCs in 401 patients (baseline: 2,924 assessments, 300 PICCs, 216 patients; intervention: 2,419 assessments, 269 PICCs, 185 patients). The intervention was associated with a 92% lower likelihood of having a dressing with peeling (OR 0.08, 95% CI 0.04–0.17, P < .001), 53% lower local inflammation/infection (OR 0.47, 95% CI 0.27–0.84, P = .011), and a 24% (non-significant) reduction in CLABSI rates (P = .63). Physician mobile-app alerting and response was associated with an 80% lower risk of lines remaining in place after inflammation/infection was identified (HR 0.20, 95% CI 0.14–0.30, P < .001) and 85% faster removal of infected lines, from a mean (SD) of 11.1 (9.7) to 1.7 (2.4) days.
Conclusions:
A mobile-app-based CLABSI prevention program decreased frequency of inflamed/infected central line insertion sites and increased speed of removal when inflammation/infection was found.
In July 2022, a genetically linked and geographically dispersed cluster of 12 cases of Shiga toxin-producing Escherichia coli (STEC) O103:H2 was detected by the UK Health Security Agency using whole genome sequencing. Review of food history questionnaires identified cheese (particularly an unpasteurized brie-style cheese) and mixed salad leaves as potential vehicles. A case–control study was conducted to investigate exposure to these products. Case food history information was collected by telephone. Controls were recruited using a market research panel and self-completed an online questionnaire. Univariable and multivariable analyses were undertaken using Firth Logistic Regression. Eleven cases and 24 controls were included in the analysis. Consumption of the brie-style cheese of interest was associated with illness (OR 57.5, 95% confidence interval: 3.10–1,060). Concurrently, the production of the brie-style cheese was investigated. Microbiological sample results for the cheese products and implicated dairy herd did not identify the outbreak strain, but did identify the presence of stx genes and STEC, respectively. Together, epidemiological, microbiological, and environmental investigations provided evidence that the brie-style cheese was the vehicle for this outbreak. Production of unpasteurized dairy products was suspended by the business operator, and a review of practices was performed.
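The study's association measure can be illustrated in simplified form. The investigation used Firth (penalized-likelihood) logistic regression, which suits the small samples and sparse cells seen here; the sketch below instead shows only an unadjusted odds ratio with a Wald 95% CI from a 2×2 exposure table, with a Haldane–Anscombe correction for zero cells. All counts are invented for illustration.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with a Wald 95% CI from a 2x2 table.

    a: exposed cases, b: unexposed cases,
    c: exposed controls, d: unexposed controls.
    A Haldane-Anscombe 0.5 correction is applied if any cell is zero.
    """
    if 0 in (a, b, c, d):
        a, b, c, d = a + 0.5, b + 0.5, c + 0.5, d + 0.5
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: cheese consumption among 11 cases and 24 controls
or_, lo, hi = odds_ratio_ci(a=9, b=2, c=2, d=22)
```

With case–control studies this small, penalized approaches such as Firth regression give more stable estimates than the unadjusted Wald interval sketched here, which is one reason the study's reported interval is so wide.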
Conservation aquaculture, defined as cultivating aquatic organisms to manage or replenish natural populations, has been advocated as a strategy to enhance fisheries production and help restore declining populations. Culture is especially compelling for species in steep decline and for which there is established methodology. The queen conch Aliger gigas is an example of a species with widely overexploited populations, with attempts to culture the species commercially ongoing for >40 years. However, hatchery releases have shown low survival from post-settlement to near maturity, limiting the potential of conservation aquaculture. Viewed alongside large-scale fishery extractions, it is apparent that existing aquaculture approaches are neither commercially feasible for replacing wild harvest nor ecologically feasible for replenishing queen conch populations. An age-based mortality model estimates the magnitude of culture required to replace a single adult of reproductive age. Extrapolations from catch–weight relationships highlight the scale of facilities and costs required to partially offset the harvest in a typical Caribbean fishery. Estimates of the reproduction needed to achieve replacement suggest a greater yield from properly protecting natural breeding aggregations. Queen conch aquaculture is useful for scientific inquiry, community engagement and education, but not for stock enhancement or population restoration without more practical and cost-efficient options. Therefore, protecting breeding aggregations should be prioritized for the ecological viability of the species, as well as for its economic value to the people and industries that rely upon it.
It is critical to evaluate the psychometric rigor of unsupervised, smartphone-based assessments, and the factors that impact remote protocol engagement, prior to using such methods in clinical contexts. We evaluated the validity of a high-frequency, smartphone-based cognitive assessment protocol, including examining convergence and divergence with standard cognitive tests, and investigating factors that may impact adherence and performance (i.e., time of day and anticipated receipt of feedback vs. no feedback).
Methods:
Cognitively unimpaired participants (N = 120, M_age = 68.8 years, 68.3% female, 87% White, M_education = 16.5 years) completed 8 consecutive days of the Mobile Monitoring of Cognitive Change (M2C2), a mobile app-based testing platform, with brief morning, afternoon, and evening sessions. Tasks included measures of working memory, processing speed, and episodic memory. Traditional neuropsychological assessments included measures from the Preclinical Alzheimer's Cognitive Composite battery.
Results:
Findings showed overall high compliance (89.3%) across M2C2 sessions. Average compliance by time of day was 90.2% for morning sessions, 77.9% for afternoon sessions, and 84.4% for evening sessions. There was evidence of faster reaction time among participants who expected to receive performance feedback. We observed excellent convergent and divergent validity in our comparison of M2C2 tasks and traditional neuropsychological assessments.
Conclusions:
This study supports the validity and reliability of self-administered, high-frequency cognitive assessment via smartphones in older adults. Insights into factors affecting adherence, performance, and protocol implementation are discussed.
This Element represents the first systematic study of the risks borne by those who produced, commissioned, and purchased art across Renaissance Europe. It employs a new methodology, built around concepts from risk analysis and decision theory. The Element classifies scores of documented examples of losses into 'production risks', which arise from the conception of a work of art until its final placement, and 'reception risks', when a patron, a buyer, or viewer finds a work displeasing, inappropriate, or offensive. Significant risks must be tamed before players undertake transactions. The Element discusses risk-taming mechanisms operating society-wide: extensive communication flows, social capital, and trust, and the measures individual participants took to reduce the likelihood and consequences of losses. Those mechanisms were employed in both the patronage-based system and the modern open markets, which predominated respectively in Southern and Northern Europe.
Background: Oncology patients are at high risk for bloodstream infection (BSI) due to immunosuppression and frequent use of central venous catheters. Surveillance in this population is largely relegated to inpatient settings, and limited data are available describing community burden. We evaluated rates of BSI, clinic or emergency department (ED) visits, and hospitalizations in a large cohort of oncology outpatients with peripherally inserted central catheters (PICCs). Methods: In this prospective, observational study, we followed a convenience sample of adults (age >18 years) with PICCs at a large academic outpatient oncology clinic for 35 months between July 2015 and November 2018. We assessed demographics, malignancy type, PICC insertion and removal dates, history of prior PICC, and line duration. Outcomes included BSI events (defined as ≥1 positive blood culture, or ≥2 positive blood cultures if coagulase-negative Staphylococcus), ED visits (without hospitalization), and unplanned hospitalizations (excluding scheduled chemotherapy hospitalizations). We used χ2 analyses to compare the frequency of categorical outcomes, and unpaired t tests to assess differences in means of continuous variables in hematologic versus solid-tumor malignancy patients. We used generalized linear mixed-effects models to assess differences in BSI (clustered by patient) separately for gram-positive and gram-negative BSI outcomes. Results: Among 478 patients with 658 unique PICC lines and 64,190 line days, 271 patients (413 lines) had hematologic malignancy and 207 patients (232 lines) had solid-tumor malignancy. Cohort characteristics and outcomes stratified by malignancy type are shown in Table 1. Compared to those with hematologic malignancy, solid-tumor patients were older, had 47% fewer clinic visits, and had a 32% lower frequency of prior PICC lines. Overall, there were 75 BSI events (12%; 1.2 per 1,000 catheter days).
We detected no significant difference in BSI rates when comparing solid-tumor versus hematologic malignancies (P = 0.20); BSIs with gram-positive pathogen were 69% higher in patients with solid tumors. Gram-negative BSIs were 41% higher in patients with hematologic malignancy. Solid-tumor malignancy was associated with 4.5-fold higher odds of developing BSI with gram-positive pathogen (OR, 4.48; 95% CI, 1.60–12.60; P = .005) compared to those with hematologic malignancy, after adjusting for age, sex, history of prior PICC, and line duration. Differences in gram-negative BSI were not significant on multivariate analysis. Conclusions: The burden of all-cause BSIs in cancer clinic adults with PICC lines was 12% or 1.2 per 1,000 catheter days, as high as nationally reported inpatient BSI rates. Higher risk of gram-positive BSIs in solid-tumor patients suggests the need for targeted infection prevention activities in this population, such as improvements in central-line monitoring, outpatient care, and maintenance of lines and/or dressings, as well as chlorhexidine bathing to reduce skin bioburden.
Rabies virus (RABV) is a deadly zoonosis that circulates in wild carnivore populations in North America. Intensive management within the USA and Canada has been conducted to control the spread of the raccoon (Procyon lotor) variant of RABV and work towards elimination. We examined RABV occurrence across the northeastern USA and southeastern Québec, Canada during 2008–2018 using a multi-method, dynamic occupancy model. Using a 10 km × 10 km grid overlaid on the landscape, we examined the probability that a grid cell was occupied by RABV and relationships with management activities (oral rabies vaccination (ORV) and trap-vaccinate-release efforts), habitat, neighbour effects and temporal trends. We compared raccoon RABV detection probabilities between different surveillance samples (e.g. strange-acting animals, road-kill, public health samples). Management of RABV through ORV was the greatest driver in reducing the occurrence of rabies on the landscape, and RABV occupancy declined further with increasing duration of ORV baiting programmes. Grid cells north of ORV management were at or near elimination ($\hat{\psi}_{\rm north}$ = 0.00, s.e. = 0.15), managed areas had low RABV occupancy ($\hat{\psi}_{\rm managed}$ = 0.20, s.e. = 0.29) and enzootic areas had the highest RABV occupancy ($\hat{\psi}_{\rm south}$ = 0.83, s.e. = 0.06). These results provide evidence that past management actions have been successful in reducing and controlling the raccoon variant of RABV. At a finer scale, we also found that vaccine bait type and bait density affected RABV occupancy. Detection probabilities varied; samples from strange-acting animals and public health sources had the highest detection rates. Our results support moving the ORV zone south within the USA, given high elimination probabilities along the US border with Québec. Additional enhanced rabies surveillance is still needed to ensure elimination is maintained.
This report describes the official photographic archives of Idi Amin’s government held by the Uganda Broadcasting Corporation (UBC). During his reign from 1971 to 1979, Idi Amin embraced visual media as a tool for archiving the achievements of populist military rule as his government sought to reorient Ugandans’ relationship with the state. Only a handful of the resulting images were ever printed or seen, reflecting the regime’s archival impulse undergirded by paranoia of unauthorized ways of seeing. The UBC’s newly opened collection of over 60,000 negatives from Amin’s photographers, alongside files at the Uganda National Archives, offers the first comprehensive opportunity to study the Ugandan state under Amin’s dictatorship through the lens of its own documentarians.
The Core Elements of Outpatient Antibiotic Stewardship provides a framework to improve antibiotic use, but cost-effectiveness data on implementation of outpatient antibiotic stewardship interventions are limited. We evaluated the cost-effectiveness of Core Element implementation in the outpatient setting.
Methods:
An economic simulation model from the health-system perspective was developed for patients presenting to outpatient settings with uncomplicated acute respiratory tract infections (ARI). Effectiveness was measured as quality-adjusted life years (QALYs). Cost and utility parameters for antibiotic treatment, adverse drug events (ADEs), and healthcare utilization were obtained from the literature. Probabilities for antibiotic treatment and appropriateness, ADEs, hospitalization, and return ARI visits were estimated from 16,712 and 51,275 patient visits in intervention and control sites during the pre- and post-implementation periods, respectively. Data for materials and labor to perform the stewardship activities were used to estimate intervention cost. We performed one-way and probabilistic sensitivity analyses (PSA) using 1,000,000 second-order Monte Carlo simulations on input parameters.
Results:
The proportion of ARI patient-visits with antibiotics prescribed in intervention sites was lower (62% vs 74%) and appropriate treatment higher (51% vs 41%) after implementation, compared to control sites. The estimated intervention cost over a 2-year period was $133,604 (2018 US dollars). The intervention had lower mean costs ($528 vs $565) and similar mean QALYs (0.869 vs 0.868) per patient compared to usual care. In the PSA, the intervention was dominant in 63% of iterations.
Conclusions:
Implementation of the CDC Core Elements in the outpatient setting was a cost-effective strategy.
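The probabilistic sensitivity analysis described above can be sketched as a second-order Monte Carlo loop: draw per-patient cost and QALY inputs for each arm from distributions, then count the share of iterations in which the intervention dominates. The distributions and parameters below are illustrative placeholders, not the study's actual inputs.

```python
import random

random.seed(42)

def psa_dominance(n_iter=100_000):
    """Second-order Monte Carlo PSA: fraction of iterations in which the
    intervention dominates usual care (lower cost AND at least equal QALYs).
    All distribution parameters below are hypothetical placeholders."""
    dominant = 0
    for _ in range(n_iter):
        # Per-patient costs (normal draws around hypothetical means)
        cost_int = random.gauss(528, 40)
        cost_uc = random.gauss(565, 40)
        # Per-patient QALYs
        qaly_int = random.gauss(0.869, 0.004)
        qaly_uc = random.gauss(0.868, 0.004)
        if cost_int < cost_uc and qaly_int >= qaly_uc:
            dominant += 1
    return dominant / n_iter

share = psa_dominance()
```

Real PSAs typically use distributions matched to each parameter's type (e.g. beta for probabilities and utilities, gamma for costs) rather than the normal draws used in this sketch.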
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
In 2020 a group of U.S. healthcare leaders formed the National Organization to Prevent Hospital-Acquired Pneumonia (NOHAP) to issue a call to action to address non–ventilator-associated hospital-acquired pneumonia (NVHAP). NVHAP is one of the most common and morbid healthcare-associated infections, but it is not tracked, reported, or actively prevented by most hospitals. This national call to action includes (1) launching a national healthcare conversation about NVHAP prevention; (2) adding NVHAP prevention measures to education for patients, healthcare professionals, and students; (3) challenging healthcare systems and insurers to implement and support NVHAP prevention; and (4) encouraging researchers to develop new strategies for NVHAP surveillance and prevention. The purpose of this document is to outline research needs to support the NVHAP call to action. Primary needs include the development of better models to estimate the economic cost of NVHAP, to elucidate the pathophysiology of NVHAP and identify the most promising pathways for prevention, to develop objective and efficient surveillance methods to track NVHAP, to rigorously test the impact of prevention strategies proposed to prevent NVHAP, and to identify the policy levers that will best engage hospitals in NVHAP surveillance and prevention. A joint task force developed this document including stakeholders from the Veterans’ Health Administration (VHA), the U.S. Centers for Disease Control and Prevention (CDC), The Joint Commission, the American Dental Association, the Patient Safety Movement Foundation, Oral Health Nursing Education and Practice (OHNEP), Teaching Oral-Systemic Health (TOSH), industry partners and academia.
In May 2019 we launched a special exhibition at the Uganda Museum in Kampala titled “The Unseen Archive of Idi Amin.” It consisted of 150 images made by government photographers in the 1970s. In this essay we explore how political history has been delimited in the Museum, and how these limitations shaped the exhibition we curated. From the time of its creation, the Museum's disparate and multifarious collections were exhibited as ethnographic specimens, stripped of historical context. Spatially and organizationally, “The Unseen Archive of Idi Amin” turned its back on the ethnographic architecture of the Uganda Museum. The transformation of these vivid, evocative, aesthetically appealing photographs into historical evidence of atrocity was intensely discomfiting. We have been obliged to organize the exhibition around categories that did not correspond with the logic of the photographic archive, with the architecture of the Museum, or with the experiences of the people who lived through the 1970s. The exhibition has made history, but not entirely in ways that we chose.
Background: In June 2019, 3 people were diagnosed with Ebola virus disease (EVD) in Kasese district, Uganda, all of whom had come from the Democratic Republic of Congo (DRC). Although no secondary transmission of Ebola occurred, an assessment of infection prevention and control (IPC) using the WHO basic IPC facility assessment checklist revealed significant gaps. Robust IPC systems are critical for the prevention of healthcare-associated infections like EVD. A rapid intervention was developed and implemented in Kasese to strengthen IPC capacity in high-risk facilities. Methods: Of 117 healthcare facilities, 50 were considered at high risk of receiving suspected EVD cases from DRC based on population movement assessments. In August 2019, IPC mentors were selected from 25 high-risk facilities and assigned to support their facility and a second high-risk facility. Mentors ensured formation of IPC committees and implemented the national mentorship strategy for IPC preparedness in non-EVD treatment facilities. This effort focused on screening, isolation, and notification of suspect cases: 4 mentorship visits were conducted (1 per week for 1 month). Middle and terminal assessments were conducted using the WHO IPC checklist 2 and 4 weeks after the intervention commenced. Results were evaluated against baseline data. Results: Overall, 39 facilities had data from baseline, middle, and end assessments. Median scores in facility IPC standard precautions increased from baseline 50% (IQR, 39%–62%) to 73% (IQR, 67%–76%) at the terminal assessments. Scores increased for all measured parameters except for water source (access to running water). Greatest improvements were seen in formation of IPC committees (41% to 75%), hand hygiene compliance (47% to 86%), waste management (51% to 83%), and availability of dedicated isolation areas (16% to 42%) for suspect cases. 
Limited improvement was noted for training on management of suspect isolated cases and availability of personal protective equipment (PPE) (Fig. 1). No differences were noted in scores for facilities with nonresident mentors versus those with resident mentors at baseline (48% vs 50%) and end assessments (72% vs 74%). Conclusions: This intervention improved IPC capacity in health facilities while avoiding the cost and service disruption associated with large-scale classroom-based training of health workers. The greatest improvements were seen in activities relying on behavior change, such as hand hygiene, IPC committee, and waste management. Smaller changes were seen in areas requiring significant investments such as isolation areas, steady water source, and availability of personal protective equipment (PPE). Mentorship is ongoing in moderate- and lower-risk facilities in Kasese district.
Funding: None
Disclosures: Mohammed Lamorde reports contract research for Janssen Pharmaceutica, ViiV, Mylan.
Background: The Core Elements of Outpatient Antibiotic Stewardship provide a framework to improve antibiotic use, but cost-effectiveness data on interventions to improve antibiotic use are limited. Beginning in September 2017, an antibiotic stewardship intervention was launched within 10 outpatient Veterans Healthcare Administration clinics. The intervention was based on the Core Elements and used an academic detailing (AD) and audit-and-feedback (AF) approach to encourage appropriate use of antibiotics. The objective of this analysis was to evaluate the cost-effectiveness of the intervention among patients with uncomplicated acute respiratory tract infections (ARI). Methods: We developed an economic simulation model from the VA’s perspective for patients presenting for an index outpatient clinic visit with an ARI (Fig. 1). Effectiveness was measured as quality-adjusted life-years (QALYs). Cost and utility parameters for antibiotic treatment, adverse drug reactions (ADRs), and healthcare utilization were obtained from the published literature. Probability parameters for antibiotic treatment, appropriateness of treatment, antibiotic ADRs, hospitalization, and return ARI visits were estimated using VA Corporate Data Warehouse data from a total of 22,137 patients in the 10 clinics during 2014–2019 before and after the intervention. Detailed cost data on the development of the AD and AF materials and electronically captured time and effort for the National AD Service activities by specific providers from a national ARI campaign were used as a proxy for the cost estimate of similar activities conducted in this intervention. We performed 1-way and probabilistic sensitivity analyses (PSAs) using 10,000 second-order Monte Carlo simulations on costs and utility values using their means and standard deviations.
Results: The proportion of uncomplicated ARI visits with antibiotics prescribed was lower (59% before vs 40% after) and appropriate treatment was higher (24% before vs 32% after) following the intervention. The intervention was estimated to cost $110,846 (2018 USD) over a 2-year period. Compared to no intervention, the intervention had lower mean costs ($517 vs $880) and higher mean QALYs (0.863 vs 0.837) per patient because of reduced inappropriate treatment, ADRs, and subsequent healthcare utilization, including hospitalization. In threshold analyses, the antibiotic stewardship strategy was no longer dominant if intervention cost was >$64,415,000 or the number of patients cared for was <3,672. In the PSA, the antibiotic stewardship intervention was dominant in 100% of the 10,000 Monte Carlo iterations (Fig. 2). Conclusions: In every scenario, the VA outpatient AD and AF antibiotic stewardship intervention was a dominant strategy compared to no intervention.
To determine whether the Society for Healthcare Epidemiology of America (SHEA) and the Infectious Diseases Society of America (IDSA) Clostridioides difficile infection (CDI) severity criteria adequately predicts poor outcomes.
Design:
Retrospective validation study.
Setting and participants:
Patients with CDI in the Veterans’ Affairs Health System from January 1, 2006, to December 31, 2016.
Methods:
For the 2010 criteria, patients with leukocytosis or a serum creatinine (SCr) value ≥1.5 times the baseline were classified as severe. For the 2018 criteria, patients with leukocytosis or a SCr value ≥1.5 mg/dL were classified as severe. Poor outcomes were defined as hospital or intensive care admission within 7 days of diagnosis, colectomy within 14 days, or 30-day all-cause mortality; they were modeled as a function of the 2010 and 2018 criteria separately using logistic regression.
Results:
We analyzed data from 86,112 episodes of CDI. Severity was unclassifiable in a large proportion of episodes diagnosed in subacute care (2010, 58.8%; 2018, 49.2%). Sensitivity ranged from 0.48 for subacute care using 2010 criteria to 0.73 for acute care using 2018 criteria. Areas under the curve were poor and similar (0.60 for subacute care and 0.57 for acute care) for both versions, but negative predictive values were >0.80.
Conclusions:
Model performances across care settings and criteria versions were generally poor but had reasonably high negative predictive value. Many patients in the subacute-care setting, an increasing fraction of CDI cases, could not be classified. More work is needed to develop criteria to identify patients at risk of poor outcomes.
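The reported classification metrics can be reproduced mechanically from a 2×2 table of severity classification against poor outcomes. The sketch below computes sensitivity and negative predictive value; the counts are hypothetical, chosen only to mirror the magnitudes reported above.

```python
def sensitivity_npv(tp, fp, fn, tn):
    """Sensitivity and negative predictive value of a binary criterion.

    tp: classified severe, poor outcome;   fp: classified severe, no poor outcome;
    fn: classified non-severe, poor outcome; tn: classified non-severe, no poor outcome.
    """
    sensitivity = tp / (tp + fn)
    npv = tn / (tn + fn)
    return sensitivity, npv

# Hypothetical counts for an acute-care cohort under the 2018 criteria
sens, npv = sensitivity_npv(tp=730, fp=1500, fn=270, tn=2500)
```

A high NPV alongside modest sensitivity, as observed in the study, means a non-severe classification is reasonably reassuring even though the criteria miss many patients who go on to poor outcomes.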
Increased use of dicamba and/or glyphosate in dicamba/glyphosate-tolerant soybean might affect many sensitive crops, including potato. The objective of this study was to determine the growth and yield of ‘Russet Burbank’ potato grown from seed tubers (generation 2) from mother plants (generation 1) treated with dicamba (4, 20, and 99 g ae ha−1), glyphosate (8, 40, and 197 g ae ha−1), or a combination of dicamba and glyphosate during tuber initiation. Generation 2 tubers were planted near Oakes and Inkster, ND, in 2016 and 2017, at the same research farm where the generation 1 tubers were grown the previous year. Treatment with 99 g ha−1 dicamba, 197 g ha−1 glyphosate, or 99 g ha−1 dicamba + 197 g ha−1 glyphosate reduced emergence of generation 2 plants by up to 84%, 86%, and 87%, respectively, at 5 wk after planting. Total tuber yield of generation 2 was reduced by up to 67%, 55%, and 68% when 99 g ha−1 dicamba, 197 g ha−1 glyphosate, or 99 g ha−1 dicamba + 197 g ha−1 glyphosate was applied to generation 1 plants, respectively. In each site-year, 197 g ha−1 glyphosate reduced total and marketable yield, while 99 g ha−1 dicamba reduced total and marketable yield in some site-years. This study confirms that exposure of potato grown for seed tubers to glyphosate and dicamba can negatively affect the growth and yield potential of the subsequently grown daughter generation.