In the last decade, numerous film and print media treatments of the Second World War have focused on the “soldier's perspective.” Because the wartime generation is now rapidly disappearing, postwar generations have become especially eager to know the war. In particular, there is a hunger to learn how it was experienced by ordinary servicemen, a focus popularized most effectively by historian John Keegan (and film directors such as Clint Eastwood, Oliver Stone, and Francis Ford Coppola). Scholars and war history buffs alike seek “reliable” accounts of the war; presumably, the best histories of the war are those that rest on a firm foundation of the most “reliable” sources. Certainly, the larger historical narrative of the war is incomplete without adequate attention to the personal records that servicemen kept at the time, such as diaries and letters. Nevertheless, as with any document, it is essential to pay close attention to the assumptions shared by the authors of these documents and their audiences today, especially regarding the “truth” such texts putatively contain.
Commercial targeted sprayer systems allow producers to reduce herbicide inputs but risk leaving emerging weeds untreated. Currently, targeted applications with the John Deere system have five spray sensitivity settings, and no published literature discusses the effects of these settings on detecting and spraying weeds of varying species, sizes, and positions in crops. Research was conducted in Arkansas, Illinois, Indiana, Mississippi, and North Carolina on plantings of corn, cotton, and soybean to determine how various factors might influence the ability of targeted applications to treat weeds. These data included 21 weed species aggregated to six classes, with heights ranging from 0.25 to 25 cm, widths from 0.25 to 25 cm, and densities from 0.04 to 14.3 plants m−2. Crop and weed density did not influence the likelihood of treating the weeds. As expected, the sensitivity setting altered the ability to treat weeds. Targeted applications (across sensitivity settings, at the median weed height and width and a density of 2.4 plants m−2) resulted in a treatment success of 99.6% to 84.4% for Convolvulaceae, 99.1% to 68.8% for decumbent broadleaf weeds, 98.9% to 62.9% for Malvaceae, 99.1% to 70.3% for Poaceae, 98.0% to 48.3% for Amaranthaceae, and 98.5% to 55.8% for yellow nutsedge. Reducing the sensitivity setting reduced the ability to treat weeds. Weed size aided targeted application success: larger weeds were more easily detected and hence more readily treated. Based on these findings, various conditions can affect the outcome of targeted multinozzle applications, and the analyses highlight some of the parameters to consider when using these technologies.
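The abstract does not publish its statistical model, but the relationship it describes, treatment success as a function of sensitivity setting and weed size, can be sketched as a simple logistic regression. All data, names, and parameters below are invented for illustration:

```python
# Hypothetical sketch only: relating whether a weed was treated to the
# sprayer sensitivity setting and the weed's size. Data are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [sensitivity setting (1-5), weed height (cm), weed width (cm)]
X = np.array([
    [5, 12.0, 10.0], [4, 8.0, 6.0], [4, 0.5, 0.5], [3, 5.0, 4.0],
    [1, 3.0, 2.0],   [2, 6.0, 5.0], [2, 1.0, 1.0], [5, 2.0, 2.0],
    [3, 20.0, 18.0], [3, 0.5, 0.5], [4, 0.25, 0.25], [5, 15.0, 12.0],
])
y = np.array([1, 1, 0, 1, 0, 1, 0, 1, 1, 0, 1, 1])  # 1 = weed treated

clf = LogisticRegression().fit(X, y)

# Predicted probability of treating a 10 x 8 cm weed at each setting
for s in range(1, 6):
    p = clf.predict_proba([[s, 10.0, 8.0]])[0, 1]
    print(f"setting {s}: P(treated) = {p:.2f}")
```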
We draw a distinction between the traditional reference class problem, which describes an obstruction to estimating a single individual probability—which we rename the individual reference class problem—and what we call the reference class problem at scale, which can result when using tools from statistics and machine learning to systematically make predictions about many individual probabilities simultaneously. We argue that scale actually helps to mitigate the reference class problem, and purely statistical tools can be used to efficiently minimize the reference class problem at scale, even though they cannot be used to solve the individual reference class problem.
Interprofessional teams in the pediatric cardiac ICU consolidate their management plans in pre-family meeting huddles, a process that affects the course of family meetings but often lacks optimal communication and teamwork.
Methods:
Cardiac ICU clinicians participated in an interprofessional intervention to improve how they prepared for and conducted family meetings. We conducted a pretest–posttest study with clinicians participating in huddles before family meetings. We assessed the feasibility of clinician enrollment, clinician perceptions of the intervention's acceptability (via questionnaire and semi-structured interviews), and the impact on team performance (using a validated tool). A Wilcoxon rank sum test assessed the intervention's impact on team performance at the meeting level, comparing pre- and post-intervention data.
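For illustration only, a meeting-level Wilcoxon rank sum comparison of the kind described might look like the following sketch; the scores are invented, not the study's data:

```python
# Wilcoxon rank sum (Mann-Whitney U) test comparing meeting-level team
# performance before vs. after the intervention. Scores are made up.
from scipy.stats import mannwhitneyu

pre_scores = [2, 3, 3, 2, 4, 3, 3, 2, 3, 4]   # pre-intervention huddles
post_scores = [5, 4, 5, 5, 4, 5, 5, 4, 5, 5]  # post-intervention huddles

stat, p = mannwhitneyu(pre_scores, post_scores, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")
```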
Results:
In total, 24 clinicians enrolled in the intervention (92% retention), with 100% completion of training. All participants would recommend the Cardiac ICU Teams and Loved ones Communicating intervention to others, and 96% believed it improved their participation in family meetings. We exceeded an acceptable level of protocol fidelity (>75%). Team performance was significantly higher (p < 0.001) in post-intervention huddles (n = 30) than in pre-intervention huddles (n = 28) in all domains (median pre vs. post): Team Structure (2 vs. 5), Leadership (3 vs. 5), Situation Monitoring (3 vs. 5), Mutual Support (3 vs. 5), and Communication (3 vs. 5).
Conclusion:
Implementing an interprofessional team intervention in pre-family meeting huddles is feasible and acceptable, and it improves team function. Future research should further assess its impact on clinicians, patients, and families.
The opportunity to increase soybean yield has prompted Illinois farmers to plant soybean earlier than historical norms. Extending the growing season with an earlier planting date might alter the relationship between soybean growth and weed emergence timings, potentially altering the optimal herbicide application timings to minimize crop yield loss due to weed interference and ensure minimal weed seed production. The objective of this research was to examine various herbicide treatments applied at different timings and rates to assess their effect on weed control and yield in early-planted soybean. Field experiments were conducted in 2021 at three locations across central Illinois to determine effective chemical strategies for weed management in early-planted soybean. PRE treatments consisted of an S-metolachlor + metribuzin premix applied at planting or just prior to soybean emergence at 0.5X (883 + 210 g ai ha−1) or 1X (1,766 + 420 g ai ha−1) label-recommended rates. POST treatments were applied when weeds reached 10 cm tall and consisted of 1X rates of glufosinate (655 g ai ha−1) + glyphosate (1,260 g ae ha−1) + ammonium sulfate, without or with pyroxasulfone at a 0.5X (63 g ai ha−1) or 1X (126 g ai ha−1) rate. Treatments comprising a full-rate PRE followed by a POST application resulted in the greatest and most consistent weed control at the final evaluation timing. The addition of pyroxasulfone to POST treatments did not consistently reduce late-season weed emergence; this lack of a consistent effect could be attributed to suppression of weeds by soybean canopy closure due to earlier soybean development. The full rate of PRE extended the timing of POST application by 2 to 3 wk for all treatments at all locations except Urbana. Full-rate PRE treatments also reduced the time between the POST application and soybean canopy closure. Overall, a full-rate PRE reduced early-season weed interference and minimized soybean yield loss due to weed interference.
Foliar postemergence applications of glufosinate are often made to glufosinate-resistant crops to provide nonselective weed control without significant crop injury. Rainfall, air temperature, solar radiation, and relative humidity near the time of application have been reported to affect glufosinate efficacy. However, previous research may not have captured the full range of weather variability to which glufosinate may be exposed before or following application. Additionally, climate models suggest more extreme weather will become the norm, further expanding the weather range to which glufosinate can be exposed. The objective of this research was to quantify the probability of successful weed control (efficacy ≥85%) with glufosinate applied to some key weed species across a broad range of weather conditions. A database of >10,000 North American herbicide evaluation trials was used in this study. The database was filtered to include treatments with a single postemergence application of glufosinate applied to waterhemp [Amaranthus tuberculatus (Moq.) Sauer], morningglory species (Ipomoea spp.), and/or giant foxtail (Setaria faberi Herrm.) <15 cm in height. These species were chosen because they are well represented in the database and listed as common and troublesome weed species in both corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] (Van Wychen 2020, 2022). Individual random forest models were created. Low rainfall (≤20 mm) over the 5 d before glufosinate application was detrimental to the probability of successful control of A. tuberculatus and S. faberi. Lower relative humidity (≤70%) and solar radiation (≤23 MJ m−2 d−1) on the day of application reduced the probability of successful weed control in most cases. Additionally, the probability of successful control decreased for all species when average air temperature over the first 5 d after application was ≤25 C. As climate continues to change and become more variable, the risk of unacceptable control of several common species with glufosinate is likely to increase.
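The filtering and outcome coding implied here might look something like the following sketch; the file, column names, and species codes are assumptions, not the authors' actual pipeline:

```python
# Hypothetical sketch of filtering a large trial database and coding the
# binary "successful control" outcome (efficacy >= 85%).
import pandas as pd

trials = pd.read_csv("herbicide_trials.csv")  # hypothetical trial database

# Hypothetical codes: waterhemp, morningglory spp., giant foxtail
TARGET_SPECIES = ["AMATU", "IPOSS", "SETFA"]

subset = trials[
    (trials["herbicide"] == "glufosinate")
    & (trials["n_applications"] == 1)
    & (trials["timing"] == "POST")
    & (trials["weed_height_cm"] < 15)
    & (trials["species_code"].isin(TARGET_SPECIES))
].copy()

# Binary outcome: "successful" control is rated efficacy of at least 85%.
subset["success"] = (subset["efficacy_pct"] >= 85).astype(int)
```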
Foliar-applied postemergence herbicides are a critical component of corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] weed management programs in North America. Rainfall and air temperature around the time of application may affect the efficacy of herbicides applied postemergence in corn or soybean production fields. However, previous research utilized a limited number of site-years and may not capture the range of rainfall and air temperatures that these herbicides are exposed to throughout North America. The objective of this research was to model the probability of achieving successful weed control (≥85%) with commonly applied postemergence herbicides across a broad range of environments. A large database of more than 10,000 individual herbicide evaluation field trials conducted throughout North America was used in this study. The database was filtered to include only trials with a single postemergence application of fomesafen, glyphosate, mesotrione, or fomesafen + glyphosate. Waterhemp [Amaranthus tuberculatus (Moq.) Sauer], morningglory species (Ipomoea spp.), and giant foxtail (Setaria faberi Herrm.) were the weeds of focus. Separate random forest models were created for each weed species by herbicide combination. The probability of successful weed control deteriorated when the average air temperature within the first 10 d after application was <19 or >25 C for most of the herbicide by weed species models. Additionally, drier conditions before postemergence herbicide application reduced the probability of successful control for several of the herbicide by weed species models. As air temperatures increase and rainfall becomes more variable, weed control with many of the commonly used postemergence herbicides is likely to become less reliable.
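Continuing the hypothetical filtered subset sketched after the previous abstract, a minimal version of the per-species random forest models described in these two abstracts could be built as follows; the weather feature names are assumptions:

```python
# Illustrative random forest relating weather around application to the
# probability of successful (>=85%) control for one species x herbicide
# combination. Feature names are assumptions, not the authors' variables.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

FEATURES = [
    "rain_5d_before_mm",   # rainfall over the 5 d before application
    "rh_day_of_pct",       # relative humidity on the day of application
    "solar_day_of_mj",     # solar radiation on the day of application
    "temp_10d_after_c",    # mean air temperature over 10 d after application
]

X_train, X_test, y_train, y_test = train_test_split(
    subset[FEATURES], subset["success"], test_size=0.2, random_state=0
)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)

# Probability of successful control for held-out trials, plus a crude look
# at which weather variables the model leans on most.
print(rf.predict_proba(X_test)[:5, 1])
print(dict(zip(FEATURES, rf.feature_importances_.round(3))))
```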
In response to the COVID-19 pandemic, we rapidly implemented a plasma coordination center within two months to support transfusions for two outpatient randomized controlled trials. The center's design was based on an investigational drug services model and a Food and Drug Administration-compliant database to manage blood product inventory and trial safety.
Methods:
A core investigational team adapted a cloud-based platform to randomize patient assignments and track inventory distribution of control plasma and high-titer COVID-19 convalescent plasma of different blood groups from 29 donor collection centers directly to blood banks serving 26 transfusion sites.
Results:
We performed 1,351 transfusions in 16 months. The transparency of the digital inventory at each site was critical to facilitating qualification, randomization, and overnight shipments of blood group-compatible plasma for transfusion into trial participants. While inventory challenges were heightened with COVID-19 convalescent plasma, the cloud-based system and the flexible approach of the plasma coordination center staff across the blood bank network enabled decentralized procurement and distribution of investigational products to maintain inventory thresholds and overcome local supply chain restraints at the sites.
Conclusion:
The rapid creation of a plasma coordination center for outpatient transfusions is infrequent in the academic setting. Distributing more than 3,100 plasma units to blood banks charged with managing investigational inventory across the U.S. in a decentralized manner posed operational and regulatory challenges while providing opportunities for the plasma coordination center to contribute to research of global importance. This program can serve as a template in subsequent public health emergencies.
Staphylococcus aureus nasal carriers were randomized (1:1) to XF-73 or placebo nasal gel, administered five times over approximately 24 h before cardiac surgery. S. aureus burden decreased rapidly after two doses (−2.2 log10 CFU/mL, vs. −0.01 log10 CFU/mL for placebo), and the reduction was maintained to 6 days post-surgery. Among XF-73 patients, 46.5% received post-operative anti-staphylococcal antibiotics, versus 70% of placebo patients (P = 0.045).
Prevented planting payments reimburse crop producers for losses when they are unable to plant. These payments provide critical protection to producers; however, because they are determined using a nationwide, crop-specific coverage factor, they have been questioned for potentially inducing moral hazard. Depending on the region and crop insurance coverage, payments from this provision can exceed producers’ losses. This paper estimates the prevented planting coverage factor, by coverage level and region, that would equitably reimburse corn and soybean producers for their losses. We find that the equitable coverage factor varies significantly across coverage levels and locations within our study region, declining as the policy coverage level increases. The farther north in the study region, the higher the coverage factor, likely due to higher land rent expenses. The results provide a unique perspective on how these coverage factors would need to vary to equitably compensate producers for losses, addressing the moral hazard concerns with prevented planting.
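For context, a prevented planting payment is commonly computed as a fixed factor applied to the policy's insurance guarantee. A stylized sketch of that arithmetic follows; all numbers are invented and simplified, and actual policy terms differ by crop, region, and year:

```python
# Illustrative prevented planting (PP) payment arithmetic, all numbers made up.
aph_yield = 200.0        # producer's approved yield history (bu/ac)
projected_price = 4.50   # $/bu
coverage_level = 0.75    # 75% coverage policy
pp_factor = 0.55         # stylized nationwide, crop-specific PP factor

guarantee = coverage_level * aph_yield * projected_price   # $/ac
pp_payment = guarantee * pp_factor                         # $/ac
print(f"guarantee = ${guarantee:.2f}/ac, PP payment = ${pp_payment:.2f}/ac")

# The paper's question, in these terms: what pp_factor, varying by region
# and coverage_level, would set pp_payment equal to actual pre-plant losses?
```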
Suicidal thoughts and behaviors are elevated among active-duty service members (ADSM) and veterans compared to the general population. Hence, it is a priority to examine maintenance factors underlying suicidal ideation among ADSM and veterans to develop effective, targeted interventions. In particular, interpersonal risk factors, hopelessness, and overarousal have been robustly connected to suicidal ideation and intent.
Methods
To identify the suicidal ideation risk factors that are most relevant, we employed network analysis to examine between-subjects (cross-sectional), contemporaneous (within seconds), and temporal (across four hours) group-level networks of suicidal ideation and related risk factors in a sample of ADSM and veterans (participants n = 92, observations n = 10,650). Participants completed ecological momentary assessment (EMA) surveys four times a day for 30 days, where they answered questions related to suicidal ideation, interpersonal risk factors, hopelessness, and overarousal.
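The abstract does not give estimation details, but a common way to build a temporal network from EMA data is to regress each symptom at prompt t on all symptoms at prompt t−1 within person. The following is a deliberately crude sketch of that idea with assumed variable names; real analyses (e.g., multilevel VAR) handle person-mean centering, random effects, and overnight gaps:

```python
# Minimal lag-1 temporal network sketch: per-person regressions of each
# symptom on all lagged symptoms, averaged across participants.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

symptoms = ["ideation", "belonging", "hopelessness", "agitation", "ineffectiveness"]
ema = pd.read_csv("ema_long.csv")  # hypothetical: one row per participant x prompt

edges = np.zeros((len(symptoms), len(symptoms)))
for _, grp in ema.groupby("participant_id"):
    grp = grp.sort_values("prompt_time")
    lagged = grp[symptoms].iloc[:-1].to_numpy()   # symptoms at t-1
    current = grp[symptoms].iloc[1:].to_numpy()   # symptoms at t
    for j in range(len(symptoms)):
        edges[:, j] += LinearRegression().fit(lagged, current[:, j]).coef_

edges /= ema["participant_id"].nunique()  # crude average across participants
print(pd.DataFrame(edges, index=symptoms, columns=symptoms).round(2))
```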
Results
The between-subjects and contemporaneous networks identified agitation, not feeling close to others, and ineffectiveness as the most central symptoms. The temporal network revealed that feeling ineffective was most likely to influence other symptoms in the network over time.
Conclusion
Our findings suggest that ineffectiveness, low belongingness, and agitation are important drivers of moment-to-moment and longitudinal relations between risk factors for suicidal ideation in ADSM and veterans. Targeting these symptoms may disrupt suicidal ideation.
Autoimmune encephalitis is increasingly recognized as a neurologic cause of acute mental status changes, with a prevalence similar to that of infectious encephalitis. Despite rising awareness, approaches to diagnosis remain inconsistent and evidence for optimal treatment is limited. The following Canadian guidelines represent a consensus-based and, where available, evidence-based approach to both the diagnosis and treatment of adult patients with autoimmune encephalitis. The guidelines were developed using a modified RAND process and included input from specialists in autoimmune neurology, neuropsychiatry, and infectious diseases. They are targeted at frontline clinicians and were created to provide a pragmatic and practical approach to managing such patients in the acute setting.
Depression is characterized by abnormalities in emotional processing, but the specific drivers of such emotional abnormalities are unknown. Computational work indicates that both surprising outcomes (prediction errors; PEs) and outcomes (values) themselves drive emotional responses, but neither has been consistently linked to affective disturbances in depression. As a result, the computational mechanisms driving emotional abnormalities in depression remain unknown.
Methods
Here, in 687 individuals, one-third of whom qualify as depressed via a standard self-report measure (the PHQ-9), we use high-stakes, naturalistic events – the reveal of midterm exam grades – to test whether individuals with heightened depression display a specific reduction in emotional response to positive PEs.
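The core quantities here are simple to state: a prediction error is the revealed grade minus the expected grade, split into positive and negative components. The sketch below illustrates that decomposition with an assumed frequentist mixed model and invented variable names; the paper itself used Bayesian mixed effects models, so this is illustration only:

```python
# Sketch of the PE decomposition and an affect model, not the authors' exact
# specification. File, columns, and formula are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("exam_affect.csv")  # hypothetical long-format data

df["pe"] = df["grade"] - df["expected_grade"]
df["pos_pe"] = df["pe"].clip(lower=0)  # better-than-expected surprise
df["neg_pe"] = df["pe"].clip(upper=0)  # worse-than-expected surprise

# Does depression (PHQ-9) blunt affective responses to positive PEs
# specifically, controlling for the grade (value) itself?
m = smf.mixedlm(
    "affect ~ pos_pe * phq9 + neg_pe * phq9 + grade",
    data=df, groups=df["subject_id"],
).fit()
print(m.summary())
```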
Results
Using Bayesian mixed effects models, we find that individuals with heightened depression do not affectively benefit from surprising, good outcomes; that is, they display reduced affective responses to positive PEs. These results were highly specific: effects were not observed for negative PEs or for value signals (grades), and were not related to generalized anxiety. This suggests that the computational drivers of emotional abnormalities in depression may be specific to positive PE-based emotional responding.
Conclusions
Affective abnormalities are core depression symptoms, but the computational mechanisms underlying such differences are unknown. This work suggests that blunted affective reactions to positive PEs are likely mechanistic drivers of emotional dysregulation in depression.
Mild traumatic brain injury (mTBI), depression, and posttraumatic stress disorder (PTSD) are a notable triad in Operation Enduring Freedom, Operation Iraqi Freedom, and Operation New Dawn (OEF/OIF/OND) Veterans. With the comorbidity of depression and PTSD in Veterans with mTBI histories, and their role in exacerbating cognitive and emotional dysfunction, interventions addressing cognitive and psychiatric functioning are critical. Compensatory Cognitive Training (CCT) is associated with improvements in areas such as prospective memory, attention, and executive functioning and has also yielded small-to-medium treatment effects on PTSD and depressive symptom severity. Identifying predictors of psychiatric symptom change following CCT would further inform the interventional approach. We sought to examine neuropsychological predictors of PTSD and depressive symptom improvement in Veterans with a history of mTBI who received CCT.
Participants and Methods:
37 OEF/OIF/OND Veterans with mTBI history and cognitive complaints received 10 weekly 120-minute CCT group sessions as part of a clinical trial. Participants completed a baseline neuropsychological assessment, including tests of premorbid functioning, attention/working memory, processing speed, verbal learning/memory, and executive functioning, and completed psychiatric symptom measures (PTSD Checklist-Military Version; Beck Depression Inventory-II) at baseline, post-treatment, and 5-week follow-up. Paired-samples t-tests were used to examine statistically significant change in PTSD (total and symptom cluster scores) and depressive symptom scores over time. Pearson correlations were calculated between neuropsychological scores and PTSD and depressive symptom change scores at post-treatment and follow-up. Neuropsychological measures identified as significantly correlated with psychiatric symptom change scores (p < .05) were entered as independent variables in separate multiple linear regression analyses to predict symptom change at post-treatment and follow-up.
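Schematically, the three analysis steps described could be run along these lines; the file and variable names are invented for illustration:

```python
# Illustrative pipeline: paired t-tests on symptom change, correlations with
# baseline cognition, then regression on the significant correlates.
import pandas as pd
from scipy.stats import ttest_rel, pearsonr
import statsmodels.formula.api as smf

d = pd.read_csv("cct_outcomes.csv")  # hypothetical: one row per Veteran

# 1. Change in PTSD symptoms from baseline to post-treatment.
t, p = ttest_rel(d["pcl_baseline"], d["pcl_post"])

# 2. Do baseline cognitive scores track symptom improvement?
d["pcl_change"] = d["pcl_baseline"] - d["pcl_post"]  # positive = improvement
r, p_r = pearsonr(d["category_fluency"], d["pcl_change"])

# 3. Regress change scores on the correlates that passed p < .05.
m = smf.ols("pcl_change ~ category_fluency + category_switching", data=d).fit()
print(m.summary())
```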
Results:
Over 50% of CCT participants had clinically meaningful improvement in depressive symptoms (>17.5% score reduction) and over 20% had clinically meaningful improvement in PTSD symptoms (>10-point improvement) at post-treatment and follow-up. Examination of PTSD symptom cluster scores (re-experiencing, avoidance/numbing, and arousal) revealed a statistically significant improvement in avoidance/numbing at follow-up. Bivariate correlations indicated that worse baseline performance on D-KEFS Category Fluency was moderately associated with PTSD symptom improvement at post-treatment. Worse performance on both D-KEFS Category Fluency and Category Switching Accuracy was associated with improvement in depressive symptoms at post-treatment and follow-up. Worse performance on D-KEFS Trail Making Test Switching was associated with improvement in depressive symptoms at follow-up. Subsequent regression analyses revealed worse processing speed and worse aspects of executive functioning at baseline significantly predicted depressive symptom improvement at post-treatment and follow-up.
Conclusions:
Worse baseline performances on tests of processing speed and aspects of executive functioning were significantly associated with improvements in PTSD and depressive symptoms during the trial. Our results suggest that cognitive training may bolster skills that are helpful for PTSD and depressive symptom reduction and that those with worse baseline functioning may benefit more from treatment because they have more room to improve. Although CCT is not a primary treatment for PTSD or depressive symptoms, our results support consideration of including CCT in hybrid treatment approaches. Further research should examine these relationships in larger samples.
We report a case of hypoplastic left heart syndrome with subsequent aortopathy in a patient later found to have hereditary haemorrhagic telangiectasia/juvenile polyposis syndrome due to a germline SMAD4 pathogenic variant. The patient’s staged palliation was complicated by the development of neoaortic aneurysms, arteriovenous malformations, and gastrointestinal bleeding initially thought to be secondary to Fontan circulation, but workup revealed a SMAD4 variant consistent with hereditary haemorrhagic telangiectasia/juvenile polyposis syndrome. This case underscores the importance of genetic modifiers in CHD, especially in patients with Fontan physiology.
Understanding parents’ communication preferences and how parental and child characteristics impact satisfaction with communication is vital to mitigate communication challenges in the cardiac ICU.
Methods
This cross-sectional survey was conducted from January 2019 to March 2020 in a paediatric cardiac ICU with parents of patients admitted for at least two weeks. Family satisfaction with communication with the medical team was measured using the Communication Assessment Tool for Team settings. Clinical characteristics were collected via Epic, the Pediatric Cardiac Critical Care Consortium local entry, and the Society of Thoracic Surgeons Congenital Heart Surgery Database. Associations between communication score and parental mood, stress, perceptions of clinical care, and demographic characteristics, along with patient demographic and clinical characteristics, were examined. Multivariable ordinal models included characteristics significant in bivariate analyses.
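A multivariable ordinal model of the kind mentioned could be fit along these lines; the predictor names below are assumptions inferred from the abstract, not the study's variables:

```python
# Hedged sketch of an ordered logit for communication scores.
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

d = pd.read_csv("cicu_survey.csv")  # hypothetical: one row per parent

model = OrderedModel(
    d["communication_score"],  # ordinal outcome (CAT-T score category)
    d[["mortality_risk_score", "surgical_complication", "care_satisfaction"]],
    distr="logit",
)
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```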
Results
In total, 93 parents of 84 patients (86% of those approached) completed surveys. Parents were 63% female and 70% White. Seventy per cent of patients were <6 months old at admission, 25% had an extracardiac abnormality, and 80% underwent cardiac surgery during the admission. Communication scores were significantly higher among parents of children with higher pre-surgical risk-of-mortality scores (OR 2.875; 95% CI 1.076–7.678), among parents of children with surgical complications (median 72 [95% CI 63.0–75.0] vs. 64 [95% CI 54.6–73.0]; p = 0.0247), and among parents reporting greater satisfaction with care in the ICU (r = 0.939; p < 0.0001).
Conclusion
These findings can prepare providers for scenarios with a higher risk of communication challenges and demonstrate the need for further investigation into interventions that reduce parental anxiety and improve communication for patients with unexpected clinical trajectories.
We studied 83 cardiac-surgery patients with nasal S. aureus carriage who received four intranasal administrations of XF-73 nasal gel or placebo <24 hours before surgery. One hour before surgery, patients who received XF-73 exhibited a reduction in S. aureus nasal carriage of 2.5 log10 CFU/mL, compared with 0.4 log10 CFU/mL for those who received placebo (95% CI, −2.7 to −1.5; P < .0001).
The dissociative subtype of post-traumatic stress disorder (PTSD-DS) was introduced in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), and is characterised by symptoms of either depersonalisation or derealisation, in addition to a diagnosis of post-traumatic stress disorder (PTSD). This systematic review and meta-analysis sought to estimate the point prevalence of current PTSD-DS, and the extent to which method of assessment, demographic and trauma variables moderate this estimate, across different methods of prevalence estimation. Studies included were identified by searching MEDLINE (EBSCO), PsycInfo, CINAHL, Academic Search Complete and PTSDpubs, yielding 49 studies that met the inclusion criteria (N = 8214 participants). A random-effects meta-analysis estimated the prevalence of PTSD-DS as 38.1% (95% CI 31.5–45.0%) across all samples, 45.5% (95% CI 37.7–53.4%) across all diagnosis-based and clinical cut-off samples, 22.8% (95% CI 14.8–32.0%) across all latent class analysis (LCA) and latent profile analysis (LPA) samples and 48.1% (95% CI 35.0–61.3%) across samples which strictly used the DSM-5 PTSD criteria; all as a proportion of those already with a diagnosis of PTSD. All results were characterised by high levels of heterogeneity, limiting generalisability. Moderator analyses mostly failed to identify sources of heterogeneity. PTSD-DS was more prevalent in children compared to adults, and in diagnosis-based and clinical cut-off samples compared to LCA and LPA samples. Risk of bias was not significantly related to prevalence estimates. The implications of these results are discussed further.
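The random-effects pooling described is commonly done on logit-transformed prevalences with a DerSimonian-Laird estimate of between-study variance. The sketch below shows that arithmetic on five invented studies, not the 49 included here:

```python
# Illustrative DerSimonian-Laird random-effects pooling of prevalence on the
# logit scale. Study data are invented.
import numpy as np

cases = np.array([30, 12, 45, 8, 60])   # PTSD-DS cases per study
n = np.array([80, 50, 90, 40, 150])     # participants with PTSD per study

p = cases / n
y = np.log(p / (1 - p))                 # logit-transformed prevalence
v = 1 / cases + 1 / (n - cases)         # approximate variance of the logit

w = 1 / v                               # fixed-effect weights
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)      # Cochran's Q
tau2 = max(0.0, (q - (len(y) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (v + tau2)                   # random-effects weights
pooled_logit = np.sum(w_re * y) / np.sum(w_re)
pooled = 1 / (1 + np.exp(-pooled_logit))
print(f"pooled PTSD-DS prevalence = {pooled:.1%}")
```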
A clinical decision tree was developed using point-of-care characteristics to identify patients with culture-proven sepsis due to extended-spectrum β-lactamase–producing Enterobacterales (ESBL-PE). We compared its performance with the clinical gestalt of emergency department (ED) clinicians and hospital-based clinicians. The developed tree outperformed ED-based clinicians but was comparable to inpatient-based clinicians.
Sweet corn (Zea mays L.) tolerance to dicamba and several other herbicides is due to cytochrome P450 (CYP)-mediated metabolism and is conferred by a single gene (Nsf1). Tolerance varies by CYP genotypic class, with hybrids homozygous for functional CYP alleles (Nsf1Nsf1) being the most tolerant and hybrids homozygous for mutant CYP alleles (nsf1nsf1) being the least tolerant. The herbicide safener cyprosulfamide (CSA) increases tolerance to dicamba by stimulating the expression of several CYPs. However, the extent to which CSA improves the tolerance of different sweet corn CYP genotypic classes to dicamba is poorly understood. Additionally, the effect of growth stage on sweet corn sensitivity to dicamba is inadequately described. The objective of this work was to quantify the significance of application timing, formulation, and CYP genotypic class on sweet corn response to dicamba. Hybrids representing each of the three CYP genotypes (Nsf1Nsf1, Nsf1nsf1, nsf1nsf1) were treated with dicamba or dicamba + CSA at one of three growth stages: V3, V6, or V9. Across all timings, the nsf1nsf1 hybrid was the least tolerant to dicamba, displaying 16% higher crop injury levels 2 wk after treatment and 2,130 kg ha−1 lower ear mass yields compared with the Nsf1Nsf1 hybrid. The V9 growth stage was the most susceptible time for dicamba injury regardless of genotypic class, with 1,890 and 1,750 kg ha−1 lower ear mass yields compared with the V3 and V6 application timings, respectively. The addition of CSA to dicamba at the V9 timing reduced, but did not eliminate, dicamba injury for all three genotypic classes. The use of Nsf1Nsf1 or Nsf1nsf1 sweet corn hybrids along with herbicide safeners will reduce the frequency and severity of injury from dicamba and other CYP-metabolized herbicides.