Fifty-three tests designed to measure aspects of creative thinking were administered to 410 air cadets and student officers. The scores were intercorrelated and 16 factors were extracted. Orthogonal rotations resulted in 14 identifiable factors, a doublet, and a residual. Nine previously identified factors were: verbal comprehension, numerical facility, perceptual speed, visualization, general reasoning, word fluency, associational fluency, ideational fluency, and a factor combining Thurstone's closure I and II. Five new factors were identified as originality, redefinition, adaptive flexibility, spontaneous flexibility, and sensitivity to problems.
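As a rough illustration of the analytic approach described above (intercorrelating test scores, extracting factors, and applying an orthogonal rotation), the sketch below uses scikit-learn's FactorAnalysis with a varimax rotation on simulated scores. The data, loadings, and dimensions are placeholders, not the original 53-test battery.

```python
# Minimal sketch of factor extraction with orthogonal (varimax) rotation.
# The data are simulated placeholders, not the original 53-test battery.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_subjects, n_tests, n_factors = 410, 53, 16

# Simulate test scores with some shared latent structure.
latent = rng.normal(size=(n_subjects, n_factors))
loadings = rng.normal(size=(n_factors, n_tests))
scores = latent @ loadings + rng.normal(scale=2.0, size=(n_subjects, n_tests))

# Extract 16 factors and apply an orthogonal (varimax) rotation.
fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
fa.fit(scores)

# Rotated loading matrix: tests loading highly on the same factor are
# candidates for a common interpretation (e.g., "ideational fluency").
print(fa.components_.shape)  # (16, 53)
```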
Transdisciplinary research knits together knowledge from diverse epistemic communities in addressing social-environmental challenges, such as biodiversity loss, climate crises, food insecurity, and public health. This article reflects on the roles of philosophy of science in transdisciplinary research while focusing on Indigenous and other subjugated forms of knowledge. We offer a critical assessment of demarcationist approaches in philosophy of science and outline a constructive alternative of transdisciplinary philosophy of science. While a focus on demarcation obscures the complex relations between epistemic communities, transdisciplinary philosophy of science provides resources for meeting epistemic and political challenges of collaborative knowledge production.
The quenching of cluster satellite galaxies is inextricably linked to the suppression of their cold interstellar medium (ISM) by environmental mechanisms. While the removal of neutral atomic hydrogen (H i) at large radii is well studied, how the environment impacts the remaining gas in the centres of galaxies, which are dominated by molecular gas, is less clear. Using new observations from the Virgo Environment traced in CO survey (VERTICO) and archival H i data, we study the H i and molecular gas within the optical discs of Virgo cluster galaxies on 1.2-kpc scales with spatially resolved scaling relations between stellar ($\Sigma_{\star}$), H i ($\Sigma_{\text{H}\,{\small\text{I}}}$), and molecular gas ($\Sigma_{\text{mol}}$) surface densities. Adopting H i deficiency as a measure of environmental impact, we find evidence that, in addition to removing the H i at large radii, the cluster processes also lower the average $\Sigma_{\text{H}\,{\small\text{I}}}$ of the remaining gas even in the central $1.2\,$kpc. The impact on molecular gas is comparatively weaker than on the H i, and we show that the lower $\Sigma_{\text{mol}}$ gas is removed first. In the most H i-deficient galaxies, however, we find evidence that environmental processes reduce the typical $\Sigma_{\text{mol}}$ of the remaining gas by nearly a factor of 3. We find no evidence for environment-driven elevation of $\Sigma_{\text{H}\,{\small\text{I}}}$ or $\Sigma_{\text{mol}}$ in H i-deficient galaxies. Using the ratio of $\Sigma_{\text{mol}}$-to-$\Sigma_{\text{H}\,{\small\text{I}}}$ in individual regions, we show that changes in the ISM physical conditions, estimated using the total gas surface density and midplane hydrostatic pressure, cannot explain the observed reduction in molecular gas content. Instead, we suggest that direct stripping of the molecular gas is required to explain our results.
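For reference, the two quantities used above to estimate ISM physical conditions can be written out explicitly. The pressure expression below is one commonly used hydrostatic-equilibrium estimate; it is included as an assumption about the general form adopted, not as the survey's exact calibration.

```latex
% Ratio of molecular to atomic gas surface density in each 1.2-kpc region:
R_{\mathrm{mol}} = \frac{\Sigma_{\mathrm{mol}}}{\Sigma_{\mathrm{HI}}}

% One common estimate of the midplane hydrostatic pressure, assuming the gas and
% stellar discs are in vertical equilibrium (\sigma_{g}, \sigma_{\star} are the
% gas and stellar vertical velocity dispersions):
P_{\mathrm{h}} \approx \frac{\pi G}{2}\, \Sigma_{\mathrm{gas}}
  \left( \Sigma_{\mathrm{gas}} + \frac{\sigma_{g}}{\sigma_{\star}}\, \Sigma_{\star} \right)
```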
Many decisions in everyday life involve a choice between exploring options that are currently unknown and exploiting options that are already known to be rewarding. Previous work has suggested that humans solve such “explore-exploit” dilemmas using a mixture of two strategies: directed exploration, in which information seeking drives exploration by choice, and random exploration, in which behavioral variability drives exploration by chance. One limitation of this previous work was that, like most studies on explore-exploit decision making, it focused exclusively on the domain of gains, where the goal was to maximize reward. In many real-world decisions, however, the goal is to minimize losses and it is well known from Prospect Theory that behavior can be quite different in this domain. In this study, we compared explore-exploit behavior of human subjects under conditions of gain and loss. We found that people use both directed and random exploration regardless of whether they are exploring to maximize gains or minimize losses and that there is quantitative agreement between the exploration parameters across domains. Our results also revealed an overall bias towards the more uncertain option in the domain of losses. While this bias towards uncertainty was qualitatively consistent with the predictions of Prospect Theory, quantitatively we found that the bias was better described by a Bayesian account, in which subjects had a prior that was optimistic for losses and pessimistic for gains. Taken together, our results suggest that explore-exploit decisions are driven by three independent processes: directed and random exploration, and a baseline uncertainty seeking that is driven by a prior.
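A minimal sketch of the kind of choice model that separates the two exploration strategies is shown below: an information bonus added to the value difference captures directed exploration, while a decision-noise term captures random exploration. The function, parameter names, and values are illustrative assumptions, not the fitted values from this study.

```python
# Sketch: logistic choice rule with an information bonus (directed exploration)
# and decision noise (random exploration). Illustrative parameters only.
import numpy as np

def p_choose_uncertain(mean_known, mean_uncertain, info_bonus=2.0, noise=1.5):
    """Probability of choosing the more uncertain option.

    mean_known / mean_uncertain: estimated payoffs of the two options
    info_bonus: value added to the uncertain option (directed exploration)
    noise: softmax temperature (random exploration)
    """
    dq = (mean_uncertain + info_bonus) - mean_known
    return 1.0 / (1.0 + np.exp(-dq / noise))

# Example: with equal estimated payoffs, the information bonus alone
# biases choice toward the uncertain option.
print(p_choose_uncertain(50.0, 50.0))  # > 0.5 because of the bonus
```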
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
OBJECTIVES/GOALS: Using the covariate-rich Veterans Health Administration data, estimate the association between proton pump inhibitor (PPI) use and severe COVID-19, rigorously adjusting for confounding using propensity score (PS)-weighting. METHODS/STUDY POPULATION: We assembled a national retrospective cohort of United States veterans who tested positive for SARS-CoV-2, with information on 33 covariates including comorbidity diagnoses, lab values, and medications. Current outpatient PPI use was compared to non-use (two or more fills and pills on hand at admission vs no PPI prescription fill in the prior year). The primary composite outcome was mechanical ventilation use or death within 60 days; the secondary composite outcome also included ICU admission. PS-weighting mimicked a 1:1 matched cohort, allowing inclusion of all patients while achieving good covariate balance. The weighted cohort was analyzed using logistic regression. RESULTS/ANTICIPATED RESULTS: Our analytic cohort included 97,674 veterans with SARS-CoV-2 testing, of whom 14,958 (15.3%) tested positive (6,262 [41.9%] current PPI users, 8,696 [58.1%] non-users). After weighting, all covariates were well balanced, with standardized mean differences below the 0.1 threshold. Prior to PS-weighting (no covariate adjustment), we observed higher odds of the primary (9.3% vs 7.5%; OR 1.27, 95% CI 1.13-1.43) and secondary (25.8% vs 21.4%; OR 1.27, 95% CI 1.18-1.37) outcomes among PPI users vs non-users. After PS-weighting, PPI use vs non-use was not associated with the primary (8.2% vs 8.0%; OR 1.03, 95% CI 0.91-1.16) or secondary (23.4% vs 22.9%; OR 1.03, 95% CI 0.95-1.12) outcomes. DISCUSSION/SIGNIFICANCE: The associations between PPI use and severe COVID-19 outcomes that have been previously reported may be due to limitations in the covariates available for adjustment. With respect to COVID-19, our robust PS-weighted analysis provides patients and providers with further evidence for PPI safety.
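The weighting scheme described above, weights that mimic a 1:1 matched cohort, can be sketched roughly as below using matching weights derived from a logistic propensity model. The data frame, column names, and outcome model are hypothetical placeholders illustrating the general technique, not the study's exact specification.

```python
# Sketch: propensity-score "matching weights" that mimic 1:1 matching,
# followed by a weighted logistic outcome model. Data and column names
# are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def matching_weights(df, treatment, covariates):
    # 1. Propensity score: probability of treatment given covariates.
    X = sm.add_constant(df[covariates])
    ps = sm.Logit(df[treatment], X).fit(disp=0).predict(X)
    # 2. Matching weights: min(ps, 1-ps)/ps for treated,
    #    min(ps, 1-ps)/(1-ps) for untreated.
    return np.where(df[treatment] == 1,
                    np.minimum(ps, 1 - ps) / ps,
                    np.minimum(ps, 1 - ps) / (1 - ps))

# Weighted outcome model (odds ratio for the composite outcome):
# df["w"] = matching_weights(df, "ppi_use", covariate_list)
# model = sm.GLM(df["outcome"], sm.add_constant(df[["ppi_use"]]),
#                family=sm.families.Binomial(), freq_weights=df["w"]).fit()
```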
Objective:
We examined the impact of microbiological results from respiratory samples on choice of antibiotic therapy in patients treated for hospital-acquired pneumonia (HAP) or ventilator-associated pneumonia (VAP).
Design:
Four-year retrospective study.
Setting:
Veterans Health Administration (VHA).
Patients:
VHA patients hospitalized with HAP or VAP and with respiratory cultures between October 1, 2014, and September 30, 2018.
Interventions:
We compared patients with positive and negative respiratory culture results, assessing changes in antibiotic class and Antibiotic Spectrum Index (ASI) from the day of sample collection (day 0) through day 7.
Results:
Between October 1, 2014, and September 30, 2018, we identified 5,086 patients with HAP/VAP: 2,952 with positive culture results and 2,134 with negative culture results. All-cause 30-day mortality was 21% for both groups. The mean time from respiratory sample receipt in the laboratory to final respiratory culture result was longer for those with positive (2.9 ± 1.3 days) compared to negative results (2.5 ± 1.3 days; P < .001). The most common pathogens were Staphylococcus aureus and Pseudomonas aeruginosa. Vancomycin and β-lactam/β-lactamase inhibitors were the most commonly prescribed agents. The decrease in the median ASI from 13 to 8 between days 0 and 6 was similar among patients with positive and negative respiratory cultures. Patients with negative cultures were more likely to be off antibiotics from day 3 onward.
Conclusions:
The results of respiratory cultures had only a small influence on antibiotics used during the treatment of HAP/VAP. The decrease in ASI for both groups suggests the integration of antibiotic stewardship principles, including de-escalation, into the care of patients with HAP/VAP.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, as well as perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased, and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (OR) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). In our cohort of 136 patients, 58 patients (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age was associated with decreased risk for delirium) (p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age. Peak delirium risk was for patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped risk of delirium after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
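A sketch of the modelling approach described in the methods, logistic regression of delirium on catatonia with effect-measure modification by age assessed via a likelihood ratio test, is shown below. The data frame and column names are hypothetical.

```python
# Sketch: logistic model of delirium on catatonia, testing effect-measure
# modification by age with a likelihood ratio test. Hypothetical data frame
# with columns: delirium (0/1), catatonia_signs (count), age (years).
import scipy.stats as stats
import statsmodels.formula.api as smf

def lr_test_age_modification(df):
    reduced = smf.logit("delirium ~ catatonia_signs + age", data=df).fit(disp=0)
    full = smf.logit("delirium ~ catatonia_signs * age", data=df).fit(disp=0)
    # Likelihood ratio statistic: twice the difference in log-likelihoods.
    lr_stat = 2 * (full.llf - reduced.llf)
    df_diff = full.df_model - reduced.df_model
    p_value = stats.chi2.sf(lr_stat, df_diff)
    return lr_stat, p_value
```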
Antineuronal antibodies are associated with psychosis, although their clinical significance in first episode of psychosis (FEP) is undetermined.
Aims
To examine all patients admitted for treatment of FEP for antineuronal antibodies and describe clinical presentations and treatment outcomes in those who were antibody positive.
Method
Individuals admitted for FEP to six mental health units in Queensland, Australia, were prospectively tested for serum antineuronal antibodies. Antibody-positive patients were referred for neurological and immunological assessment and therapy.
Results
Of 113 consenting participants, six had antineuronal antibodies (anti-N-methyl-D-aspartate receptor antibodies [n = 4], voltage-gated potassium channel antibodies [n = 1] and antibodies against uncharacterised antigen [n = 1]). Five received immunotherapy, which prompted resolution of psychosis in four.
Conclusions
A small subgroup of patients admitted to hospital with FEP have antineuronal antibodies detectable in serum and are responsive to immunotherapy. Early diagnosis and treatment are critical to optimise recovery.
Seeds of 41 economically important weed species of the Great Plains region of the United States were buried 20 cm deep in soil in eastern and western Nebraska in 1976. The 41 species consisted of 11 annual grass, 14 annual broadleaf, 4 biennial broadleaf, and 12 perennial broadleaf species. Weed seeds were exhumed annually for germination tests the first 9 yr, then after 12 and 17 yr. Germination percentages at the two burial locations averaged over 0, 1 to 4, 5 to 8, and 9 to 17 yr of burial were 57, 28, 9, and 4% for annual grass; 47, 26, 16, and 11% for annual broadleaf; 52, 49, 44, and 30% for biennial broadleaf; 36, 18, 13, and 8% for perennial broadleaf; and 47, 26, 16, and 10% for all 41 weed species, respectively. Biennial broadleaf weeds showed the greatest seed germination over years. Annual grass weeds showed less seed germinability over 17 yr of burial than annual broadleaf weeds, and perennial broadleaf weed species were intermediate. Weed seed germinability in soil was greater under the reduced rainfall and more moderate soil temperatures of western Nebraska than under the greater rainfall and more fluctuating soil temperatures of eastern Nebraska. The greatest seed survival among the 41 weed species was shown by common mullein, which had 95% germination after 17 yr of burial in western Nebraska. Decay rates of individual weed species in soil will be of most value to weed scientists, agriculturalists, and modelers evaluating past or designing future weed management systems.
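Since the abstract points to decay rates as the quantity of interest for modelers, the sketch below fits a simple exponential decay of germinability to burial time. The data points approximate the all-species averages reported above, placed at the midpoints of the burial-year bins, and the exponential model form is an assumption for illustration only.

```python
# Sketch: exponential decay of seed germinability with burial time.
# Points approximate the all-species averages reported above, placed at
# the midpoints of the burial-year bins; the model form is an assumption.
import numpy as np
from scipy.optimize import curve_fit

years = np.array([0.0, 2.5, 6.5, 13.0])           # bin midpoints (yr)
germination = np.array([47.0, 26.0, 16.0, 10.0])  # percent germinating

def decay(t, g0, k):
    return g0 * np.exp(-k * t)

(g0, k), _ = curve_fit(decay, years, germination, p0=(50.0, 0.2))
half_life = np.log(2) / k
print(f"initial germinability ~{g0:.0f}%, half-life ~{half_life:.1f} yr")
```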
Preplant applied herbicides were compared for their effect on three varieties of sugarbeets when seeds were planted at six depths during 1987 through 1989. More sugarbeet seedlings emerged and at a faster rate as the depth of seeding decreased from 4.5 to 1.6 cm. Herbicide injury to sugarbeet seedlings increased as depth of seeding increased from less than to greater than 2.5 cm. Herbicide treatments reduced sugarbeet stand and decreased early season sugarbeet height but had little effect on root yield or sucrose content.
An experiment was conducted at five locations in Nebraska to determine the extent of demise of weed seed in soil when seed production was eliminated from 1975 through 1979 in corn (Zea mays L.). Weed yields, weed seed production, and corn yields were determined under four weed management levels in 1980. Annual broadleaf weed seed were more prevalent than grass seed in cultivated soil throughout the study. The population of viable weed seed in soil declined 95% during the 5-yr period that weed seed production was eliminated. Weed seed buildup recovered to within 90% of the 1975 level during 1980 at Concord and Clay Center but remained low at Lincoln, North Platte, and Scottsbluff. Thus, seed longevity in soil was sometimes sufficient to withstand modern weed control methods and still reinfest a field after 5 yr of eliminating weed seed production. Corn yields were maintained 1 yr with minimum weed management effort following 5 yr of no weed seed production.
Management systems for direct-seeded and transplanted sugarbeets (Beta vulgaris L. ‘Mono Hy D2’) were compared for weed control and sugarbeet selectivity from 1983 through 1985 in western Nebraska. Broadleaf weed density was similar, but yellow foxtail [Setaria glauca (L.) Beauv. # SETLU] density was lower in transplanted compared to direct-seeded sugarbeets. Preplant soil-incorporated applications of cycloate (S-ethyl cyclohexylethylcarbamothioate) plus trifluralin [2,6-dinitro-N,N-dipropyl-4-(trifluoromethyl)benzenamine] at 3.3 plus 0.6 kg ai/ha or ethofumesate [(±)-2-ethoxy-2,3-dihydro-3,3-dimethyl-5-benzofuranyl methanesulfonate] plus trifluralin at 2.2 plus 0.6 kg/ha was noninjurious to transplanted sugarbeets but caused severe injury to direct-seeded sugarbeets. The combination of cycloate or ethofumesate with trifluralin improved weed control over that obtained when cycloate or ethofumesate was used alone. By combining the improved weed control obtained from cycloate plus trifluralin or ethofumesate plus trifluralin with the transplanting crop establishment technique, a superior sugarbeet weed control program was developed.
Laboratory experiments were conducted to determine the effect of moisture stress on the absorption and translocation of 14C-labeled picloram (4-amino-3,5,6-trichloropicolinic acid), dicamba (3,6-dichloro-o-anisic acid), and glyphosate [N-(phosphonomethyl)glycine] in Canada thistle [Cirsium arvense (L.) Scop. # CIRAR] plants. The absorption and translocation of picloram and dicamba were unaffected by moisture stress. Absorption and translocation of glyphosate to the roots and apical meristem of Canada thistle were reduced by increasing moisture stress. Weekly differential irrigation of Canada thistle field plots during the summers of 1980 and 1981 established three soil moisture regimes averaging −6.6, −11.3, and −15.0 bars at the time of herbicide treatment. When Canada thistle control was evaluated 1 year after application of glyphosate, dicamba, and picloram at 2.5, 1.1, and 0.6 kg/ha, respectively, no differences in Canada thistle shoot control were found between moisture stress treatments.
OBJECTIVE
To determine the impact of an environmental disinfection intervention on the incidence of healthcare-associated Clostridium difficile infection (CDI).
DESIGN
A multicenter randomized trial.
SETTING
In total, 16 acute-care hospitals in northeastern Ohio participated in the study.
INTERVENTION
We conducted a 12-month randomized trial to compare standard cleaning to enhanced cleaning that included monitoring of environmental services (EVS) personnel performance with feedback to EVS and infection control staff. We assessed the thoroughness of cleaning based on fluorescent marker removal from high-touch surfaces and the effectiveness of disinfection based on environmental cultures for C. difficile. A linear mixed model was used to compare CDI rates in the intervention and postintervention periods for control and intervention hospitals. The primary outcome was the incidence of healthcare-associated CDI.
RESULTS
Overall, 7 intervention hospitals and 8 control hospitals completed the study. The intervention resulted in significantly increased fluorescent marker removal in CDI and non-CDI rooms and decreased recovery of C. difficile from high-touch surfaces in CDI rooms. However, no reduction was observed in the incidence of healthcare-associated CDI in the intervention hospitals during the intervention and postintervention periods. Moreover, there was no correlation between the percentage of positive cultures after cleaning of CDI or non-CDI rooms and the incidence of healthcare-associated CDI.
CONCLUSIONS
An environmental disinfection intervention improved the thoroughness and effectiveness of cleaning but did not reduce the incidence of healthcare-associated CDI. Thus, interventions that focus only on improving cleaning may not be sufficient to control healthcare-associated CDI.
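The linear mixed model described in the intervention section, comparing CDI rates between intervention and control hospitals across study periods, could be sketched roughly as below. The data layout (one row per hospital-month with a CDI rate) and the variable names are assumptions, not the study's actual dataset.

```python
# Sketch: linear mixed model of monthly CDI rates with a random intercept
# per hospital, comparing intervention vs control hospitals across periods.
# Data layout and column names are hypothetical.
import statsmodels.formula.api as smf

def fit_cdi_model(df):
    """df columns: cdi_rate (e.g., cases per 10,000 patient-days),
    arm ('intervention'/'control'),
    period ('intervention'/'postintervention'),
    hospital (ID used for the random intercept)."""
    model = smf.mixedlm("cdi_rate ~ arm * period", data=df,
                        groups=df["hospital"])
    return model.fit()

# result = fit_cdi_model(df)
# print(result.summary())  # the arm:period interaction reflects the intervention effect
```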
Volunteer corn can affect dry bean by reducing yields; expanding the life cycle of insects, mites, and pathogens; interfering with harvest; and contaminating bean seed. Field studies were conducted at Lingle, WY, and Scottsbluff, NE, to determine the relationship between volunteer corn density and dry bean yield, establish the proper time of volunteer corn removal, and determine whether dry bean yield was affected by the method used to remove volunteer corn. Volunteer corn reduced dry bean yields, as recorded in other crops. Growing conditions for each location were different, as indicated by the accumulated growing degree days (GDD): Lingle 2008 (990), Lingle 2009 (780), and Scottsbluff 2009 (957). No difference in dry bean yields was observed between hand removal of volunteer corn and herbicide application. Dry bean yield loss increased with longer periods of volunteer corn competition and ranged from 1.2 to 1.8% yield loss for every 100 GDD that control was delayed. Control measures should be implemented 15 to 20 d after planting when volunteer corn densities are close to 1 plant m−2. Dry bean yield losses also increased as volunteer corn densities increased, with losses from 6.5 to 19.3% for 1 volunteer corn plant m−2. Based on 2015 prices, the cost of controlling volunteer corn would be equivalent to 102 kg ha−1 of dry bean; potential losses above 4% would justify control, which should not be delayed beyond 15 to 20 d after planting.
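As a rough worked example of the break-even logic above (control cost equivalent to 102 kg ha−1 of dry bean, with losses above roughly 4% justifying control), the snippet below computes the expected loss for a hypothetical weed-free yield at a density of 1 plant m−2; the 2,500 kg ha−1 yield is an assumed value for illustration only.

```python
# Sketch: break-even check for volunteer corn control in dry bean.
# The weed-free yield below is a hypothetical value; the loss range and
# control-cost equivalent come from the abstract.
weed_free_yield = 2500.0                               # kg/ha, assumed
loss_fraction_low, loss_fraction_high = 0.065, 0.193   # per 1 plant/m^2
control_cost_equivalent = 102.0                        # kg/ha of dry bean

loss_low = weed_free_yield * loss_fraction_low
loss_high = weed_free_yield * loss_fraction_high
print(f"expected loss at 1 plant/m^2: {loss_low:.0f}-{loss_high:.0f} kg/ha")
print("control pays:", loss_low > control_cost_equivalent)
```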
Field trials were conducted in 1999 and 2000 to determine the influence of weed size and the number of glyphosate or glufosinate applications on weed control and sugarbeet yield. Glyphosate at 840 g/ha or glufosinate at 390 g/ha was applied one, two, or three times, beginning when the average weed height was 3, 10, 15, or 25 cm. Two sequential applications of glyphosate applied to 10-cm weeds or three sequential applications of glufosinate applied to 3-cm weeds provided weed control comparable to three sequential applications of desmedipham plus phenmedipham plus triflusulfuron plus clopyralid. Weed control and sugarbeet root yield were optimal for two postemergence applications of glyphosate and for three applications of glufosinate. Glyphosate provided greater control of redroot pigweed and common lambsquarters than glufosinate. Sugarbeet sucrose yield with both glyphosate and glufosinate weed control programs was nearly 10,000 kg/ha. Compared with two sequential applications of glyphosate, sucrose yield of glyphosate-resistant sugarbeet was reduced 15% by three sequential applications of desmedipham plus phenmedipham plus triflusulfuron plus clopyralid. Sucrose yields were similar between three sequential applications of glufosinate and three applications of desmedipham plus phenmedipham plus triflusulfuron plus clopyralid.
Field trials were conducted at five sites from 2001 through 2003 to determine the influence on sugarbeet and weeds of repeated broadcast and banded reduced rates of desmedipham plus phenmedipham, triflusulfuron, and clopyralid in combination with either 1.5 or 3% v/v methylated seed oil (MSO). Desmedipham plus phenmedipham, triflusulfuron, and clopyralid were applied POST three times at 5 to 7 d intervals at either 25, 50, 75, or 100% of a 180 plus 180 plus 18 plus 100 g ai/ha dosage (full rate). When averaged over all herbicide rates, crop injury was 6% greater, but common lambsquarters control was 5% higher, and crop yield was 15% greater with broadcast compared with banded herbicide application. In most situations, adding MSO at 3% rather than 1.5% did not improve weed control. Sugarbeet injury was lowest (11%) and the average weed control was 86% when herbicide rates (with 1.5% MSO) were 25% of the full rate (microrate). Applying an herbicide rate (with 1.5% MSO) that was 50% of the full rate (half rate) increased crop injury from 11% with the microrate to 18% with the half rate and elevated average weed control from 86% with the microrate to 92% with the half rate. Common lambsquarters control increased from 81% with the microrate to 89% with the half rate. Sugarbeet root yield was 23 t/ha when no herbicide was used, 48 t/ha with the microrate, and 49 t/ha with the half rate compared with 54 t/ha when the full rate was applied without MSO. Increasing herbicide rates to 75% of the full rate (three-quarter rate) (with 1.5% MSO) increased crop injury to 27% and average weed control to 96%. Applying 1.5% MSO to the full rate increased crop injury to 35% with no improvement in average weed control over that achieved with the full rate without MSO.
A segment of the debate surrounding the commercialization of genetically engineered (GE) crops, such as glyphosate-resistant (GR) crops, focuses on the theory that implementation of these traits is an extension of the intensification of agriculture that will further erode the biodiversity of agricultural landscapes. A large field-scale study was conducted in 2006 in the United States on 156 different field sites with a minimum 3-yr history of GR corn, cotton, or soybean in the cropping system. The impact of cropping system, crop rotation, frequency of using the GR crop trait, and several categorical variables on emerged weed density and diversity was analyzed. Species richness, evenness, Shannon's H′, proportion of forbs, erect growth habit, and C3 species diversity were all greater in agricultural sites that lacked crop rotation or were in a continuous GR crop system. Rotating between two GR crops (e.g., corn and soybean) or rotating to a non-GR crop resulted in less weed diversity than a continuous GR crop. The composition of the weed flora was more strongly related to location (geography) than any other parameter. The diversity of weed flora in agricultural sites with a history of GR crop production can be influenced by several factors relating to the specific method in which the GR trait is integrated (cropping system, crop rotation, GR trait rotation), the specific weed species, and the geographical location. The finding that fields with continuous GR crops demonstrated greater weed diversity is contrary to arguments opposing the use of GE crops. These results justify further research to clarify the complexities of crops grown with herbicide-resistance traits, or more broadly, GE crops, to provide a more complete characterization of their culture and local adaptation.
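For readers unfamiliar with the diversity measures listed above (species richness, evenness, Shannon's H′), the short function below computes them from emerged weed counts; the example counts are placeholders, and evenness is computed here as Pielou's J, one common convention.

```python
# Sketch: species richness, Shannon's H', and Pielou's evenness from
# emerged weed counts. Example counts are placeholders.
import numpy as np

def diversity(counts):
    counts = np.asarray(counts, dtype=float)
    counts = counts[counts > 0]
    p = counts / counts.sum()
    richness = len(counts)
    shannon_h = -np.sum(p * np.log(p))          # Shannon's H'
    evenness = shannon_h / np.log(richness) if richness > 1 else 0.0
    return richness, shannon_h, evenness

print(diversity([120, 45, 30, 5]))  # e.g., four weed species at one site
```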
In 2010, a grower survey was administered to 1,299 growers in 22 states to determine changes in weed management in the United States from 2006 to 2009. The majority of growers had not changed weed management practices in the previous 3 yr; however, 75% reported using weed management practices targeted at glyphosate-resistant (GR) weeds. Growers were asked to rate their efforts at controlling GR weeds and rate the effectiveness of various practices for controlling/preventing GR weeds regardless of whether they were personally using them. Using the herbicide labeled rate, scouting fields, and rotating crops were among the practices considered by growers as most effective in managing GR weeds. Sixty-seven percent of growers reported effective management of GR weeds. Between the 2005 and 2010 Benchmark surveys, the frequency of growers using specific actions to manage GR weeds increased markedly. Although the relative effectiveness of practices, as perceived by growers, remained the same, the effectiveness rating of tillage and the use of residual and POST herbicides increased.