Among the Líĺwat people of the Interior Plateau of British Columbia, an oral tradition relates how early ancestors used to ascend Qẃelqẃelústen, or Mount Meager. The account maintains that those climbers could see the ocean, which is not possible today because the mountain is surrounded by many other high peaks and the Strait of Georgia lies several mountain ridges to the west. However, the mountain is an active and volatile volcano, which last erupted circa 2360 cal BP, and it is the site of the largest landslide in Canadian history, which occurred in 2010. Given that it had been a high, glacier-capped mountain throughout the Holocene, much like other volcanoes along the coastal range, we surmise that a climber may reasonably have been afforded a view of the ocean from its former heights. We conducted viewshed analyses of the potential mountain height prior to its eruption and determined that one could indeed view the ocean if the mountain were at least 950 m higher than it is today. This aligns with the oral tradition, indicating that it may be over 2,400 years old, and plausibly in the range of 4,000 to 9,000 years old, when the mountain may have stood at such a height.
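The viewshed question reduces to a line-of-sight test: for a candidate summit elevation, check whether the sight line toward the ocean clears every intervening ridge after allowing for Earth curvature and refraction. The sketch below illustrates that test only; the summit elevation, ridge profile, distances, and refraction coefficient are placeholder assumptions, not the study's data.

```python
import numpy as np

def ocean_visible(summit_elev_m, ridge_elev_m, ridge_dist_m,
                  ocean_dist_m, earth_radius_m=6_371_000, refraction_k=0.13):
    """Return True if a sight line from the summit to sea level at the coast
    clears every intervening ridge (simplified curvature/refraction model)."""
    # Drop of the Earth's surface at distance d, reduced by atmospheric refraction.
    drop = ridge_dist_m**2 * (1 - refraction_k) / (2 * earth_radius_m)
    # Straight sight line from the summit (height H at d = 0) to sea level at the coast.
    sightline = summit_elev_m * (1 - ridge_dist_m / ocean_dist_m)
    return bool(np.all(sightline - drop >= ridge_elev_m))

# Hypothetical profile: three ridge crests between the summit and the Strait of Georgia.
ridges_elev = np.array([2400.0, 2100.0, 1800.0])   # m, made-up values
ridges_dist = np.array([30e3, 60e3, 90e3])         # m from the summit
# ~2,680 m is the approximate modern summit; +950 m is the threshold from the analysis.
print(ocean_visible(2680 + 950, ridges_elev, ridges_dist, ocean_dist_m=120e3))
```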
Clinical outcomes of repetitive transcranial magnetic stimulation (rTMS) for treatment-resistant depression (TRD) vary widely, and there is no standard mood rating scale for assessing rTMS outcome. It also remains unclear whether rTMS is as efficacious in older adults with late-life depression (LLD) as in younger adults with major depressive disorder (MDD). This study examined the effect of age on outcomes of rTMS treatment of adults with TRD. Self-report and observer mood ratings were measured weekly in 687 subjects ages 16–100 years undergoing rTMS treatment using the Inventory of Depressive Symptomatology 30-item Self-Report (IDS-SR), Patient Health Questionnaire 9-item (PHQ), Profile of Mood States 30-item, and Hamilton Depression Rating Scale 17-item (HDRS). All rating scales detected significant improvement with treatment; response and remission rates varied by scale but not by age (response/remission ≥ 60: 38%–57%/25%–33%; < 60: 32%–49%/18%–25%). Proportional hazards models showed early improvement predicted later improvement across ages, though early improvements in PHQ and HDRS were more predictive of remission in those < 60 years (relative to those ≥ 60), and greater baseline IDS symptom burden was more predictive of non-remission in those ≥ 60 years (relative to those < 60). These results indicate no significant effect of age on treatment outcomes in rTMS for TRD, though rating instruments may differ in their assessment of symptom burden between younger and older adults during treatment.
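As a rough illustration of the kind of proportional hazards analysis described, the sketch below fits a Cox model of time to remission on early improvement and age group. The column names, data, and model specification are hypothetical and simplified (the study also examined scale-specific and age-interaction effects), not the authors' exact analysis.

```python
import pandas as pd
from lifelines import CoxPHFitter  # standard proportional hazards implementation

# Hypothetical per-patient data: weeks to remission (or censoring), remission
# indicator, early improvement (% score change by week 2), and age group.
df = pd.DataFrame({
    "weeks":             [4, 6, 8, 10, 5, 12, 7, 9, 3, 11, 6, 8],
    "remitted":          [1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0, 1],
    "early_improvement": [40, 25, 5, 30, 45, 0, 35, 8, 50, 10, 12, 28],
    "age_ge_60":         [0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 0, 1],
})

cph = CoxPHFitter()
# All columns other than the duration and event columns enter as covariates.
cph.fit(df, duration_col="weeks", event_col="remitted")
cph.print_summary()
```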
The adsorption of 13C-labeled benzene on imogolite has been studied on samples that had been evacuated and then heated, below their decomposition point, to remove water. After adsorption of labeled benzene, the samples were studied by nuclear magnetic resonance using non-spinning techniques. The results show that benzene can occupy more than one pore type and that water does not displace benzene from the intra-tube pores at atmospheric pressure. A further finding is that there are at least two types of adsorbed benzene in the so-called inter-tube pores, one of which is more rigidly held than that in the intra-tube pores. The presence of disordered material at the edges of pores could also play a role in altering the pore mouth, thereby creating new types of pores. Moreover, where two tubes do not pack properly, space might be created in which an adsorbed molecule may bind more tightly than expected in a conventional pore.
In 2016, the National Center for Advancing Translational Science launched the Trial Innovation Network (TIN) to address barriers to efficient and informative multicenter trials. The TIN provides a national platform, working in partnership with 60+ Clinical and Translational Science Award (CTSA) hubs across the country to support the design and conduct of successful multicenter trials. A dedicated Hub Liaison Team (HLT) was established within each CTSA to facilitate connection between the hubs and the newly launched Trial and Recruitment Innovation Centers. Each HLT serves as an expert intermediary, connecting CTSA Hub investigators with TIN support, and connecting TIN research teams with potential multicenter trial site investigators. The cross-consortium Liaison Team network was developed during the first TIN funding cycle, and it is now a mature national network at the cutting edge of team science in clinical and translational research. The CTSA-based HLT structures and the external network structure have been developed in collaborative and iterative ways, with methods for shared learning and continuous process improvement. In this paper, we review the structure, function, and development of the Liaison Team network, discuss lessons learned during the first TIN funding cycle, and outline a path toward further network maturity.
Precision Medicine is an emerging approach for disease treatment and prevention that takes into account individual variability in genes, environment, and lifestyle. Autoimmune diseases are those in which the body’s natural defense system loses discriminating power between its own cells and foreign cells, causing the body to mistakenly attack healthy tissues. These conditions are very heterogeneous in their presentation and therefore difficult to diagnose and treat. Achieving precision medicine in autoimmune diseases has been challenging due to the complex etiologies of these conditions, involving an interplay between genetic, epigenetic, and environmental factors. However, recent technological and computational advances in molecular profiling have helped identify patient subtypes and molecular pathways which can be used to improve diagnostics and therapeutics. This review discusses the current understanding of the disease mechanisms, heterogeneity, and pathogenic autoantigens in autoimmune diseases gained from genomic and transcriptomic studies and highlights how these findings can be applied to better understand disease heterogeneity in the context of disease diagnostics and therapeutics.
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
Early in the COVID-19 pandemic, the World Health Organization stressed the importance of daily clinical assessments of infected patients, yet current approaches frequently consider cross-sectional timepoints, cumulative summary measures, or time-to-event analyses. Statistical methods are available that make use of the rich information content of longitudinal assessments. We demonstrate the use of a multistate transition model to assess the dynamic nature of COVID-19-associated critical illness using daily evaluations of COVID-19 patients from 9 academic hospitals. We describe the accessibility and utility of methods that consider the clinical trajectory of critically ill COVID-19 patients.
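For readers unfamiliar with the approach, a multistate model treats each patient-day as a transition between clinical states; a daily transition probability matrix can be estimated by simple counting, as in the sketch below. The states, labels, and trajectories are illustrative, not those used in the study.

```python
import numpy as np
import pandas as pd

# Illustrative ordinal daily states: 0 = on ward, 1 = ICU, 2 = discharged, 3 = dead.
STATES = ["ward", "ICU", "discharged", "dead"]

# Hypothetical daily state sequences for three patients.
trajectories = [
    [0, 0, 1, 1, 0, 2],
    [0, 1, 1, 1, 3],
    [0, 0, 0, 2],
]

# Count observed day-to-day transitions, then normalize each row to probabilities.
counts = np.zeros((len(STATES), len(STATES)))
for traj in trajectories:
    for a, b in zip(traj[:-1], traj[1:]):
        counts[a, b] += 1
row_sums = counts.sum(axis=1, keepdims=True)
transition_probs = np.divide(counts, row_sums, out=np.zeros_like(counts),
                             where=row_sums > 0)  # absorbing states keep zero rows
print(pd.DataFrame(transition_probs, index=STATES, columns=STATES).round(2))
```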
OBJECTIVES/GOALS: Using the covariate-rich Veteran Health Administration data, estimate the association between Proton Pump Inhibitor (PPI) use and severe COVID-19, rigorously adjusting for confounding using propensity score (PS)-weighting. METHODS/STUDY POPULATION: We assembled a national retrospective cohort of United States veterans who tested positive for SARS-CoV-2, with information on 33 covariates including comorbidity diagnoses, lab values, and medications. Current outpatient PPI use was compared to non-use (two or more fills and pills on hand at admission vs no PPI prescription fill in prior year). The primary composite outcome was mechanical ventilation use or death within 60 days; the secondary composite outcome included ICU admission. PS-weighting mimicked a 1:1 matched cohort, allowing inclusion of all patients while achieving good covariate balance. The weighted cohort was analyzed using logistic regression. RESULTS/ANTICIPATED RESULTS: Our analytic cohort included 97,674 veterans with SARS-CoV-2 testing, of whom 14,958 (15.3%) tested positive (6,262 [41.9%] current PPI users, 8,696 [58.1%] non-users). After weighting, all covariates were well-balanced with standardized mean differences less than a threshold of 0.1. Prior to PS-weighting (no covariate adjustment), we observed higher odds of the primary (9.3% vs 7.5%; OR 1.27, 95% CI 1.13-1.43) and secondary (25.8% vs 21.4%; OR 1.27, 95% CI 1.18-1.37) outcomes among PPI users vs non-users. After PS-weighting, PPI use vs non-use was not associated with the primary (8.2% vs 8.0%; OR 1.03, 95% CI 0.91-1.16) or secondary (23.4% vs 22.9%; OR 1.03, 95% CI 0.95-1.12) outcomes. DISCUSSION/SIGNIFICANCE: The associations between PPI use and severe COVID-19 outcomes that have been previously reported may be due to limitations in the covariates available for adjustment. With respect to COVID-19, our robust PS-weighted analysis provides patients and providers with further evidence for PPI safety.
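The propensity score workflow described (estimate the PS, apply weights that mimic a 1:1 matched cohort, check balance, then fit a weighted outcome model) can be sketched as below. The variable names and the use of "matching" weights are assumptions for illustration, not the study's exact specification, and inference in practice would use robust standard errors.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def ps_weighted_or(df, covariates, treatment="ppi_use", outcome="severe_covid"):
    """Estimate a PS-weighted odds ratio using matching weights, which target
    a population similar to a 1:1 matched cohort."""
    # 1. Propensity score: probability of treatment given covariates.
    X = sm.add_constant(df[covariates])
    ps = sm.Logit(df[treatment], X).fit(disp=0).predict(X)

    # 2. Matching weights: min(ps, 1-ps)/ps for treated, min(ps, 1-ps)/(1-ps) for controls.
    mw = np.where(df[treatment] == 1,
                  np.minimum(ps, 1 - ps) / ps,
                  np.minimum(ps, 1 - ps) / (1 - ps))

    # 3. Weighted logistic regression of the outcome on treatment.
    out = sm.GLM(df[outcome], sm.add_constant(df[[treatment]]),
                 family=sm.families.Binomial(), var_weights=mw).fit()
    return np.exp(out.params[treatment])  # odds ratio
```

Before interpreting the weighted estimate, covariate balance would be checked by computing standardized mean differences in the weighted sample, as the abstract describes (all below 0.1).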
Objective:
To examine the association between adherence to plant-based diets and mortality.
Design:
Prospective study. We calculated a plant-based diet index (PDI) by assigning positive scores to plant foods and reverse scores to animal foods. We also created a healthful PDI (hPDI) and an unhealthful PDI (uPDI) by further separating the healthy plant foods from less-healthy plant foods.
Setting:
The VA Million Veteran Program.
Participants:
315 919 men and women aged 19–104 years who completed a food frequency questionnaire (FFQ) at baseline.
Results:
We documented 31 136 deaths during the follow-up. A higher PDI was significantly associated with lower total mortality (hazard ratio (HR) comparing extreme deciles = 0·75, 95 % CI: 0·71, 0·79, Ptrend < 0·001). We observed an inverse association between hPDI and total mortality (HR comparing extreme deciles = 0·64, 95 % CI: 0·61, 0·68, Ptrend < 0·001), whereas uPDI was positively associated with total mortality (HR comparing extreme deciles = 1·41, 95 % CI: 1·33, 1·49, Ptrend < 0·001). Similar significant associations of PDI, hPDI and uPDI were also observed for CVD and cancer mortality. The associations between the PDI and total mortality were consistent among African American and European American participants, among participants free of CVD and cancer, and among those diagnosed with major chronic disease at baseline.
Conclusions:
A greater adherence to a plant-based diet was associated with substantially lower total mortality in this large population of veterans. These findings support recommending plant-rich dietary patterns for the prevention of major chronic diseases.
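As a schematic of the index construction described under Design above, the sketch below assigns quintile-based positive scores to plant food groups and reverse scores to animal food groups. The food-group lists and quintile scoring are common conventions for such indices and are assumptions here, not the study's exact specification; the hPDI and uPDI variants would apply the same logic after splitting plant foods into healthy and less-healthy groups.

```python
import pandas as pd

# Hypothetical FFQ-derived food-group intakes (servings/day), one row per participant.
PLANT_GROUPS = ["whole_grains", "fruits", "vegetables", "nuts", "legumes"]
ANIMAL_GROUPS = ["dairy", "eggs", "meat", "fish"]

def plant_based_diet_index(intake: pd.DataFrame) -> pd.Series:
    """Overall PDI: higher intake of plant groups scores higher (quintiles 1-5),
    higher intake of animal groups is reverse-scored (5-1)."""
    score = pd.Series(0, index=intake.index)
    for col in PLANT_GROUPS:
        score += pd.qcut(intake[col], 5, labels=False, duplicates="drop") + 1
    for col in ANIMAL_GROUPS:
        quintile = pd.qcut(intake[col], 5, labels=False, duplicates="drop") + 1
        score += 6 - quintile
    return score
```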
Objective:
To determine risk factors for carbapenemase-producing organisms (CPOs) and to assess their prognostic impact.
Design:
A retrospective matched case–control study.
Patients:
Inpatients across Scotland in 2010–2016 were included. Patients with a CPO were matched with 2 control groups by hospital, admission date, specimen type, and bacteria. One group comprised patients either infected or colonized with a non-CPO, and the other comprised general inpatients.
Methods:
Conditional logistic regression models were used to identify risk factors for CPO infection and colonization, respectively. Mortality rates and length of postisolation hospitalization were compared between CPO and non-CPO patients.
Results:
In total, 70 CPO infection cases (with 210 general inpatient controls and 121 non-CPO controls) and 34 CPO colonization cases (with 102 general inpatient controls and 60 non-CPO controls) were identified. Risk factors for CPO infection versus general inpatients were prior hospital stay (adjusted odds ratio [aOR], 4.05; 95% confidence interval [CI], 1.52–10.78; P = .005), longer hospitalization (aOR, 1.07; 95% CI, 1.04–1.10; P < .001), longer intensive care unit (ICU) stay (aOR, 1.41; 95% CI, 1.01–1.98; P = .045), and immunodeficiency (aOR, 3.68; 95% CI, 1.16–11.66; P = .027). Risk factors for CPO colonization were prior high-dependency unit (HDU) stay (aOR, 11.46; 95% CI, 1.27–103.09; P = .030) and endocrine, nutritional, and metabolic (ENM) diseases (aOR, 3.41; 95% CI, 1.02–11.33; P = .046). Risk factors for CPO infection versus non-CPO infection were prolonged hospitalization (aOR, 1.02; 95% CI, 1.00–1.03; P = .038) and HDU stay (aOR, 1.13; 95% CI, 1.02–1.26; P = .024). No differences in mortality rates were detected between CPO and non-CPO patients. CPO infection was associated with longer hospital stay than non-CPO infection (P = .041).
Conclusions:
A history of (prolonged) hospitalization, prolonged ICU or HDU stay, ENM diseases, and being immunocompromised increased risk for CPO. CPO infection was not associated with increased mortality but was associated with prolonged hospital stay.
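Because cases and controls were matched within strata (hospital, admission date, specimen type, and bacteria), the analysis conditions on the matched set. A minimal sketch using statsmodels' conditional logistic regression is below; the variable names and data are hypothetical, not the study's.

```python
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Hypothetical matched case-control data: each 'matched_set' holds one case and its controls.
df = pd.DataFrame({
    "matched_set":         [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "cpo_case":            [1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0],
    "prior_hospital_stay": [1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 1],
    "icu_days":            [5, 0, 2, 8, 1, 0, 0, 3, 1, 10, 2, 0],
})

model = ConditionalLogit(df["cpo_case"],
                         df[["prior_hospital_stay", "icu_days"]],
                         groups=df["matched_set"])
result = model.fit()
print(result.summary())  # exponentiated coefficients give adjusted odds ratios (aORs)
```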
Iron-rich meteorites are significantly underrepresented in collection statistics from Antarctica. This has led to the hypothesis that a sparse layer of iron-rich meteorites lies hidden below the surface of the ice, which would explain the apparent shortfall. As standard Antarctic meteorite collecting techniques rely upon a visual surface search approach, the need has thus arisen to develop a system that can detect iron objects under a few tens of centimetres of ice, where the expected number density is of the order of one per square kilometre. To help test this hypothesis, a large-scale pulse induction metal detector array has been constructed for deployment in Antarctica. The metal detector array is 6 m wide, is able to travel at 15 km h⁻¹ and can scan 1 km² in ~11 hours. This paper details the construction of the metal detector system with respect to the design criteria, notably the ruggedization of the system for Antarctic deployment. Some preliminary results from UK and Antarctic testing are presented. We show that the system performs as specified and should reach the pre-agreed target of detecting a 100 g iron meteorite at 300 mm when deployed in Antarctica.
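The quoted coverage rate follows directly from the array geometry and speed, assuming continuous operation with no overlap between passes:

```latex
6\,\mathrm{m} \times 15\,\mathrm{km\,h^{-1}}
  = 0.006\,\mathrm{km} \times 15\,\mathrm{km\,h^{-1}}
  = 0.09\,\mathrm{km^{2}\,h^{-1}},
\qquad
\frac{1\,\mathrm{km^{2}}}{0.09\,\mathrm{km^{2}\,h^{-1}}} \approx 11\,\mathrm{h}.
```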
An improved understanding of diagnostic and treatment practices for patients with rare primary mitochondrial disorders can support benchmarking against guidelines and establish priorities for evaluative research. We aimed to describe physician care for patients with mitochondrial diseases in Canada, including variation in care.
Methods:
We conducted a cross-sectional survey of Canadian physicians involved in the diagnosis and/or ongoing care of patients with mitochondrial diseases. We used snowball sampling to identify potentially eligible participants, who were contacted by mail up to five times and invited to complete a questionnaire by mail or internet. The questionnaire addressed: personal experience in providing care for mitochondrial disorders; diagnostic and treatment practices; challenges in accessing tests or treatments; and views regarding research priorities.
Results:
We received 58 survey responses (52% response rate). Most respondents (83%) reported spending 20% or less of their clinical practice time caring for patients with mitochondrial disorders. We identified important variation in diagnostic care, although assessments frequently reported as diagnostically helpful (e.g., brain magnetic resonance imaging, MRI/MR spectroscopy) were also recommended in published guidelines. Approximately half (49%) of participants would recommend “mitochondrial cocktails” for all or most patients, but we identified variation in responses regarding specific vitamins and cofactors. A majority of physicians recommended studies on the development of effective therapies as the top research priority.
Conclusions:
While Canadian physicians’ views about diagnostic care and disease management are aligned with published recommendations, important variations in care reflect persistent areas of uncertainty and a need for empirical evidence to support and update standard protocols.
Dementia is a leading cause of morbidity and mortality without pharmacologic prevention or cure. Mounting evidence suggests that adherence to a Mediterranean dietary pattern may slow cognitive decline, and is important to characterise in at-risk cohorts. Thus, we determined the reliability and validity of the Mediterranean Diet and Culinary Index (MediCul), a new tool, among community-dwelling individuals with mild cognitive impairment (MCI). A total of sixty-eight participants (66 % female) aged 75·9 (sd 6·6) years, from the MCI cohort of the Study of Mental and Resistance Training, completed the fifty-item MediCul at two time points, followed by a 3-d food record (FR). MediCul test–retest reliability was assessed using intra-class correlation coefficients (ICC), Bland–Altman plots and κ agreement within seventeen dietary element categories. Validity was assessed against the FR using the Bland–Altman method and nutrient trends across MediCul score tertiles. The mean MediCul score was 54·6/100·0, with few participants reaching thresholds for key Mediterranean foods. MediCul had very good test–retest reliability (ICC=0·93, 95 % CI 0·884, 0·954, P<0·0001) with fair-to-almost-perfect agreement for classifying elements within the same category. Validity was moderate with no systematic bias between methods of measurement, according to the regression coefficient (y=−2·30+0·17x) (95 % CI −0·027, 0·358; P=0·091). MediCul over-estimated the mean FR score by 6 %, with limits of agreement being under- and over-estimated by 11 and 23 %, respectively. Nutrient trends were significantly associated with increased MediCul scoring, consistent with a Mediterranean pattern. MediCul provides reliable and moderately valid information about Mediterranean diet adherence among older individuals with MCI, with potential application in future studies assessing relationships between diet and cognitive function.
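For readers less familiar with the agreement statistics used, Bland–Altman limits of agreement are simply the mean difference (bias) between the two methods plus or minus 1.96 standard deviations of the differences. A minimal sketch, with hypothetical paired scores, is below.

```python
import numpy as np

def bland_altman_limits(method_a, method_b):
    """Return the mean difference (bias) and 95% limits of agreement."""
    diffs = np.asarray(method_a, dtype=float) - np.asarray(method_b, dtype=float)
    bias = diffs.mean()
    sd = diffs.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired MediCul and food-record scores for five participants.
medicul = [58, 49, 62, 55, 60]
food_record = [54, 52, 58, 50, 57]
print(bland_altman_limits(medicul, food_record))
```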
Rapid identification of esophageal intubations is critical to avoid patient morbidity and mortality. Continuous waveform capnography remains the gold standard for endotracheal tube (ETT) confirmation, but it has limitations. Point-of-care ultrasound (POCUS) may be a useful alternative for confirming ETT placement. The objective of this study was to determine the accuracy of paramedic-performed POCUS identification of esophageal intubations with and without ETT manipulation.
Methods:
A prospective, observational study using a cadaver model was conducted. Local paramedics were recruited as subjects and each completed a survey of their demographics, employment history, intubation experience, and prior POCUS training. Subjects participated in a didactic session in which they learned POCUS identification of ETT location. During each study session, investigators randomly placed an ETT in either the trachea or esophagus of four cadavers, confirmed with direct laryngoscopy. Subjects then attempted to determine position using POCUS both without and with manipulation of the ETT. Manipulation was performed by twisting the tube. Descriptive statistics and logistic regression were used to assess the results and the effects of previous paramedic experience.
Results:
During 12 study sessions from March 2014 through December 2015, 57 subjects participated, evaluating a total of 228 intubations: 113 tracheal and 115 esophageal. Subjects were 84.0% male, with a mean age of 39 years (range: 22 - 62 years) and a median experience of seven years (range: 0.6 - 39 years). Paramedics correctly identified ETT location in 158 (69.3%) cases without ETT manipulation and in 194 (85.1%) cases with manipulation. Sensitivity for identifying esophageal location increased from 52.2% (95% confidence interval [CI], 43.0-61.0) without ETT manipulation to 87.0% (95% CI, 81.0-93.0) with manipulation (P<.0001), without a significant change in specificity (86.7% [95% CI, 81.0-93.0] vs 83.2% [95% CI, 76.0-90.0]; P=.45). Subjects correctly identified 41 previously incorrectly identified esophageal intubations. Paramedic experience, previous intubations, and POCUS experience did not correlate with ability to identify tube location.
Conclusion:
Paramedics can accurately identify esophageal intubations with POCUS, and manipulation improves identification. Further studies of paramedic use of dynamic POCUS to identify inadvertent esophageal intubations are needed.
Lema PC, O’Brien M, Wilson J, St. James E, Lindstrom H, DeAngelis J, Caldwell J, May P, Clemency B. Avoid the Goose! Paramedic Identification of Esophageal Intubation by Ultrasound. Prehosp Disaster Med. 2018;33(4):406–410.
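The sensitivity and specificity reported above reduce to simple 2x2-table ratios, treating esophageal placement as the positive finding. The counts in the sketch below are illustrative values chosen to be approximately consistent with the with-manipulation results, not the study's published table.

```python
def sensitivity_specificity(true_pos, false_neg, true_neg, false_pos):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    sens = true_pos / (true_pos + false_neg)
    spec = true_neg / (true_neg + false_pos)
    return sens, spec

# Roughly: 100 of 115 esophageal and 94 of 113 tracheal placements identified correctly.
print(sensitivity_specificity(true_pos=100, false_neg=15, true_neg=94, false_pos=19))
```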
Several herbicide-based weed management programs for glyphosate-tolerant cotton were compared in eight field studies across Alabama during 1996 and 1997. Weed management programs ranged from traditional, soil-applied residual herbicide programs to more recently developed total postemergence (POST) herbicide programs. Pitted morningglory and sicklepod control was best achieved with fluometuron applied preemergence (PRE) followed by (fb) a single POST over-the-top (POT) application of glyphosate fb a POST-directed application of glyphosate. Annual grass control was better with the preplant incorporated (PPI) programs at two of three locations in both years. Treatments that included at least one glyphosate POT application gave increased grass control over no glyphosate or pyrithiobac POT. Velvetleaf control was improved with the addition of glyphosate POT. A herbicide program using no POST herbicides yielded significantly less seed cotton than any program using POST herbicides at one location. PRE- and POST-only weed management programs at another location produced more seed cotton and gave greater net returns than PPI programs. Similarly, net returns at that same location were equivalent for both PRE- and POST-only programs, and less for PPI programs. POST-only programs yielded highest amounts of seed cotton and netted greater returns.
Doveweed is becoming more common in agronomic crops in North Carolina. Laboratory and greenhouse experiments were conducted to determine the effect of temperature and seed burial depth on doveweed germination and emergence. Germination of lightly scarified seed at constant temperature was well described by a Gaussian model, which estimated peak germination at 28 C. Similar maximum percentage of germination was observed for optimal treatments under both constant and alternating temperatures. Among alternating temperatures, a 35/25 C regime gave greatest germination (77%). In spite of similar average daily temperatures, germination was greater with alternating temperature regimes of 40/30 and 40/35 C (65 and 30%, respectively) than constant temperatures of 36 and 38 C (4 and 0%, respectively). No germination was observed at 38 C constant temperature or for alternating temperature regimes of 20/10 and 25/15 C. Light did not enhance germination. Greatest emergence occurred from 0 to 1 cm, with a reduction in emergence as depth increased to 4 cm. No emergence occurred from 6 cm or greater depth. This information on seedbank dynamics may aid in developing tools and strategies for management.
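The Gaussian germination response referred to above generally takes the form shown below, where G_max is maximum germination, T_opt the optimum temperature (estimated here at 28 C), and sigma controls the breadth of the response; this parameterization is the standard one and is an assumption, since the abstract does not give the fitted equation.

```latex
G(T) = G_{\max}\,\exp\!\left(-\,\frac{(T - T_{\mathrm{opt}})^{2}}{2\sigma^{2}}\right),
\qquad T_{\mathrm{opt}} \approx 28\,^{\circ}\mathrm{C}.
```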
Field studies were conducted to assess two sulfur-containing additives for use with glyphosate applied postemergence to glyphosate-resistant cotton for the control of sicklepod and yellow nutsedge. Neither diammonium sulfate (AMS) nor ammonium thiosulfate (ATS), both applied at 2.24 kg/ha, increased control of either species. Effective control of both species was dependent on glyphosate (isopropylamine salt) rate alone, with optimum control at 1.26 kg ae/ha. Plant-mapping data further indicated that sulfur-containing additives generally had no effect on either cotton fruiting patterns or yield. However, applying glyphosate at any rate did increase seed cotton yield in 2 of 3 yr vs. no glyphosate. In addition, applying glyphosate at any rate resulted in an increase in the number of bolls vs. no glyphosate in the following plant-mapping responses: total number of bolls per plant, number of abscised bolls per plant, bolls at the top five sympodial nodes, and bolls at positions 1 and 2 on the sympodia. Glyphosate absorption and subsequent translocation, as influenced by the addition of the sulfur-containing additives, was evaluated using radiotracer techniques. Glyphosate absorption after 48 h was 86, 63, and 37% of amount applied in cotton, sicklepod, and yellow nutsedge, respectively. Absorption by sicklepod and yellow nutsedge was not affected by the addition of either of the additives. Absorption by cotton was reduced by ATS but was not affected by AMS. In yellow nutsedge and cotton, glyphosate concentration in the treated area and adjacent tissue was not affected by either additive. A greater portion of glyphosate was translocated away from the treated area in sicklepod with glyphosate plus AMS (32%) than with glyphosate plus ATS (21%). AMS and ATS may be used in glyphosate-resistant cotton without the risk of either crop injury or yield reduction. However, their use for increased control of annual weed species, such as sicklepod and yellow nutsedge, may not be warranted.
The goal of this research was to develop herbicide programs for controlling acetolactate synthase (ALS)–, propanil-, quinclorac-, and clomazone-resistant barnyardgrass. Two applications of imazethapyr alone at 70 g ai ha−1 failed to control the ALS-resistant biotype more than 36%; however, when imazethapyr at 70 g ha−1 was applied early POST (EPOST) followed by imazethapyr at 70 g ha−1 plus fenoxaprop at 120 g ai ha−1 immediately prior to flooding (PREFLD), barnyardgrass control improved to 78% at 10 wk after planting. When imazethapyr was applied twice following PRE or delayed PRE applications of clomazone at 336 g ai ha−1, quinclorac at 560 g ai ha−1, pendimethalin at 1,120 g ai ha−1, or thiobencarb at 4,480 g ai ha−1, control was 92 to 100%. A single-pass program consisting of a delayed PRE application of clomazone at 336 g ha−1 plus quinclorac at 560 g ha−1 plus pendimethalin at 1,120 g ha−1 plus thiobencarb at 4,480 g ha−1 controlled all herbicide-resistant barnyardgrass biotypes at the same level as a standard multiple application program.
Field experiments were conducted in Alabama during 1999 and 2000 to test the hypothesis that any glyphosate-induced yield suppression in glyphosate-resistant cotton would be less with irrigation than without irrigation. Yield compensation was monitored by observing alterations in plant growth and fruiting patterns. Glyphosate treatments included a nontreated control, 1.12 kg ai/ha applied POST at the 4-leaf stage, 1.12 kg/ha applied postemergence-directed (DIR) at the prebloom stage, and 1.12 kg/ha applied POST at the 4-leaf stage plus DIR at the prebloom stage. The second variable, irrigation treatment, was established by irrigating plots individually with overhead sprinklers or maintaining them under dryland, nonirrigated conditions. Cotton yield and all measured parameters, including lint quality, were positively affected by irrigation. Irrigation increased yield 52% compared to nonirrigated cotton. Yield and fiber quality effects were independent of glyphosate treatments. Neither yield nor any of the measured variables that reflected whole-plant response were influenced by glyphosate treatment or by a glyphosate-by-irrigation interaction.