Health technology assessment (HTA) is a form of policy analysis that informs decisions about funding and scaling up health technologies to improve health outcomes. An equity-focused HTA recommendation explicitly addresses the impact of health technologies on individuals disadvantaged in society because of specific health needs or social conditions. However, more evidence is needed on the relationships between patient engagement processes and the development of equity-focused HTA recommendations.
Objectives
The objective of this study is to assess relationships between patient engagement processes and the development of equity-focused HTA recommendations.
Methods
We analyzed sixty HTA reports published between 2013 and 2021 from two Canadian organizations: Canada’s Drug Agency and Ontario Health.
Results
Quantitative analysis of the HTA reports showed that direct patient engagement (odds ratio (OR): 3.85; 95 percent confidence interval (CI): 2.40–6.20) and consensus in decision-making (OR: 2.27; 95 percent CI: 1.35–3.84) were positively associated with the development of equity-focused HTA recommendations, whereas indirect patient engagement (OR: 0.26; 95 percent CI: 0.16–0.41) and voting (OR: 0.44; 95 percent CI: 0.26–0.73) were negatively associated.
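As a rough illustration of how such associations can be quantified, the sketch below computes an odds ratio and its Wald 95 percent confidence interval from a 2×2 table of recommendation counts. The function name and count layout are hypothetical; the abstract does not report the underlying tabulations.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table (hypothetical counts):
    a/b = equity-focused recommendations with/without the engagement process,
    c/d = other recommendations with/without the engagement process."""
    or_est = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR), Woolf method
    lo = math.exp(math.log(or_est) - z * se_log_or)
    hi = math.exp(math.log(or_est) + z * se_log_or)
    return or_est, (lo, hi)
```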
Conclusion
The results can inform the development of patient engagement strategies in HTA and have implications for practice, research, and policy.
This paper reports an experiment designed to assess the effects of a rotation in the marginal cost curve on convergence in a repeated Cournot triopoly. Increasing the cost curve's slope both reduces the serially undominated set to the Nash prediction and increases the peakedness of earnings. We observe higher rates of Nash equilibrium play in the design with the steeper marginal cost schedule, but only when participants are also rematched after each decision. Examination of response patterns suggests that the treatment combining a steeper marginal cost curve with rematching of participants across periods induces the selection of Nash-consistent responses.
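For intuition, the sketch below works out the symmetric Nash equilibrium quantity for a Cournot oligopoly with linear inverse demand and a linear marginal cost schedule. It is a textbook construction under assumed parameter values, not the paper's experimental design.

```python
def cournot_symmetric_nash(a, b, c, s, n=3):
    """Symmetric Nash quantity for n firms facing inverse demand P = a - b*Q
    and marginal cost MC(q) = c + s*q (s is the marginal cost slope).
    Each firm's first-order condition, a - b*Q - b*q - (c + s*q) = 0,
    imposed symmetrically (Q = n*q), gives q* = (a - c) / ((n + 1)*b + s)."""
    return (a - c) / ((n + 1) * b + s)

# Rotating the marginal cost curve (raising s) shrinks q* and flattens the
# best-response map, which is what tightens the serially undominated set.
flat = cournot_symmetric_nash(100, 1, 10, s=0)   # 22.5
steep = cournot_symmetric_nash(100, 1, 10, s=4)  # 11.25
```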
The quenching of cluster satellite galaxies is inextricably linked to the suppression of their cold interstellar medium (ISM) by environmental mechanisms. While the removal of neutral atomic hydrogen (H i) at large radii is well studied, how the environment impacts the remaining gas in the centres of galaxies, which are dominated by molecular gas, is less clear. Using new observations from the Virgo Environment Traced in CO (VERTICO) survey and archival H i data, we study the H i and molecular gas within the optical discs of Virgo cluster galaxies on 1.2-kpc scales with spatially resolved scaling relations between stellar ($\Sigma_{\star}$), H i ($\Sigma_{\text{H}\,{\small\text{I}}}$), and molecular gas ($\Sigma_{\text{mol}}$) surface densities. Adopting H i deficiency as a measure of environmental impact, we find evidence that, in addition to removing the H i at large radii, the cluster processes also lower the average $\Sigma_{\text{H}\,{\small\text{I}}}$ of the remaining gas even in the central $1.2\,$kpc. The impact on the molecular gas is weaker than on the H i, and we show that the lower $\Sigma_{\text{mol}}$ gas is removed first. In the most H i-deficient galaxies, however, we find evidence that environmental processes reduce the typical $\Sigma_{\text{mol}}$ of the remaining gas by nearly a factor of 3. We find no evidence for environment-driven elevation of $\Sigma_{\text{H}\,{\small\text{I}}}$ or $\Sigma_{\text{mol}}$ in H i-deficient galaxies. Using the ratio of $\Sigma_{\text{mol}}$-to-$\Sigma_{\text{H}\,{\small\text{I}}}$ in individual regions, we show that changes in the ISM physical conditions, estimated using the total gas surface density and midplane hydrostatic pressure, cannot explain the observed reduction in molecular gas content. Instead, we suggest that direct stripping of the molecular gas is required to explain our results.
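One common way to estimate the midplane hydrostatic pressure from measured surface densities is the Elmegreen (1989)-style approximation sketched below. The velocity dispersion defaults are assumed fiducial values, and this is not necessarily the exact estimator used in the survey analysis.

```python
import numpy as np
import astropy.units as u
from astropy.constants import G, k_B

def midplane_pressure(sigma_gas, sigma_star,
                      vdisp_gas=11 * u.km / u.s, vdisp_star=20 * u.km / u.s):
    """Elmegreen (1989)-style midplane hydrostatic pressure estimate:
    P ~ (pi/2) G * Sigma_gas * (Sigma_gas + (sig_gas/sig_star) * Sigma_star).
    Velocity dispersions are assumed fiducial values. Returns P/k_B."""
    P = 0.5 * np.pi * G * sigma_gas * (
        sigma_gas + (vdisp_gas / vdisp_star) * sigma_star)
    return (P / k_B).to(u.K / u.cm**3)

# e.g. a region with Sigma_gas = 10 and Sigma_star = 100 Msun/pc^2:
print(midplane_pressure(10 * u.Msun / u.pc**2, 100 * u.Msun / u.pc**2))
```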
Many decisions in everyday life involve a choice between exploring options that are currently unknown and exploiting options that are already known to be rewarding. Previous work has suggested that humans solve such “explore-exploit” dilemmas using a mixture of two strategies: directed exploration, in which information seeking drives exploration by choice, and random exploration, in which behavioral variability drives exploration by chance. One limitation of this previous work was that, like most studies on explore-exploit decision making, it focused exclusively on the domain of gains, where the goal was to maximize reward. In many real-world decisions, however, the goal is to minimize losses and it is well known from Prospect Theory that behavior can be quite different in this domain. In this study, we compared explore-exploit behavior of human subjects under conditions of gain and loss. We found that people use both directed and random exploration regardless of whether they are exploring to maximize gains or minimize losses and that there is quantitative agreement between the exploration parameters across domains. Our results also revealed an overall bias towards the more uncertain option in the domain of losses. While this bias towards uncertainty was qualitatively consistent with the predictions of Prospect Theory, quantitatively we found that the bias was better described by a Bayesian account, in which subjects had a prior that was optimistic for losses and pessimistic for gains. Taken together, our results suggest that explore-exploit decisions are driven by three independent processes: directed and random exploration, and a baseline uncertainty seeking that is driven by a prior.
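A minimal way to formalize the two strategies is sketched below: an uncertainty bonus on estimated value captures directed exploration, while softmax temperature captures random exploration. This is a generic UCB-plus-softmax sketch under assumed parameter names, not the authors' fitted model.

```python
import numpy as np

def choice_probs(mu, sigma, info_bonus=1.0, temperature=1.0):
    """mu: posterior mean reward per option; sigma: posterior uncertainty.
    Directed exploration: info_bonus * sigma biases choice toward the unknown.
    Random exploration: temperature injects behavioral variability."""
    utility = np.asarray(mu, dtype=float) + info_bonus * np.asarray(sigma, dtype=float)
    z = utility / temperature
    z -= z.max()                      # numerical stability
    p = np.exp(z)
    return p / p.sum()

# A prior shift (e.g., optimistic mu in the loss domain) would move choice
# toward the uncertain option without changing info_bonus or temperature.
```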
The Hierarchical Taxonomy of Psychopathology (HiTOP) has emerged out of the quantitative approach to psychiatric nosology. This approach identifies psychopathology constructs based on patterns of co-variation among signs and symptoms. The initial HiTOP model, which was published in 2017, is based on a large literature that spans decades of research. HiTOP is a living model that undergoes revision as new data become available. Here we discuss advantages and practical considerations of using this system in psychiatric practice and research. We especially highlight limitations of HiTOP and ongoing efforts to address them. We describe differences and similarities between HiTOP and existing diagnostic systems. Next, we review the types of evidence that informed development of HiTOP, including populations in which it has been studied and data on its validity. The paper also describes how HiTOP can facilitate research on genetic and environmental causes of psychopathology as well as the search for neurobiologic mechanisms and novel treatments. Furthermore, we consider implications for public health programs and prevention of mental disorders. We also review data on clinical utility and illustrate clinical application of HiTOP. Importantly, the model is based on measures and practices that are already used widely in clinical settings. HiTOP offers a way to organize and formalize these techniques. This model already can contribute to progress in psychiatry and complement traditional nosologies. Moreover, HiTOP seeks to facilitate research on linkages between phenotypes and biological processes, which may enable construction of a system that encompasses both biomarkers and precise clinical description.
Open Strategy has drawn increasing attention in recent years. A growing number of studies have captured greater transparency and heightened inclusion in the strategic practices of contemporary organizations (e.g., Whittington et al., 2011; Hautz et al., 2017). Information Technology (IT) often facilitates the involvement of a wider range of stakeholders in the generation of strategic content and knowledge (Chesbrough & Appleyard, 2007; Wulf & Butel, 2016), and in the practice of strategy (Whittington et al., 2011; Whittington, 2014). However, despite the widely recognized role of technologies such as online platforms (Malhotra et al., 2017) and social media (Huang et al., 2013; Baptista et al., 2017) in enabling openness in strategy, literature with an explicit focus on IT has been surprisingly sparse to date (Tavakoli et al., 2015; 2017). Thus far, most papers have been published in Management and Strategic Management outlets (e.g., Whittington et al., 2011; Stieger et al., 2012; Seidl & Werle, 2017), including a special issue on Open Strategy in Long Range Planning (e.g., Hautz et al., 2017).
OBJECTIVES/SPECIFIC AIMS: Delirium, a form of acute brain dysfunction characterized by changes in attention and alertness, is a known independent predictor of mortality in the Intensive Care Unit (ICU). We sought to understand whether catatonia, a more recently recognized form of acute brain dysfunction, is associated with increased 30-day mortality in critically ill older adults. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Coma was defined as a Richmond Agitation-Sedation Scale score of −4 or −5. We used a Cox proportional hazards model predicting 30-day mortality after adjusting for delirium, coma, and catatonia status. RESULTS/ANTICIPATED RESULTS: We enrolled 335 medical, surgical, or trauma critically ill patients with 1,103 matched delirium and catatonia assessments. Median age was 58 years (IQR: 48–67). Main indications for admission to the ICU included airway disease or protection (32%; N=100) and sepsis and/or shock (25%; N=79). In the unadjusted analysis, regardless of the presence of catatonia, non-delirious individuals had the highest median survival times, while delirious patients had the lowest; comparing the absence and presence of catatonia, the presence of catatonia worsened survival (Figure 1). In a time-dependent Cox model, holding catatonia status constant, delirious individuals had 1.72 times the hazard of death (95% CI: 1.321, 2.231) and comatose individuals 5.48 times the hazard of death (95% CI: 4.298, 6.984) relative to non-delirious individuals. For DSM-5 catatonia scores, a 1-unit increase in the score was associated with 1.178 times the hazard of in-hospital mortality (95% CI: 1.086, 1.278); comparing two individuals with the same delirium status, an individual with 3 catatonia items present therefore has roughly 1.178³ ≈ 1.63 times the hazard of death of an individual with a score of 0 (no catatonia). DISCUSSION/SIGNIFICANCE OF IMPACT: Non-delirious individuals have the highest median survival times, while those who are comatose have the lowest median survival times after a critical illness, holding catatonia status constant. Comparing the absence and presence of catatonia, the presence of catatonia appears to worsen survival; individuals who are both comatose and catatonic have the lowest median survival time.
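A time-dependent Cox model of this kind can be fit on long (start-stop) data, for example with lifelines as sketched below. The data file and column names are hypothetical placeholders for the daily-assessment structure described above.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format data: one row per patient-day with interval
# boundaries and that day's brain-dysfunction status (illustrative columns:
# id, start, stop, died, delirium, coma, catatonia_score).
df = pd.read_csv("daily_assessments.csv")

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", start_col="start", stop_col="stop", event_col="died")
print(ctv.summary[["coef", "exp(coef)"]])  # exp(coef) is the hazard ratio
```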
Medical procedures and patient care activities may facilitate environmental dissemination of healthcare-associated pathogens such as methicillin-resistant Staphylococcus aureus (MRSA).
Design:
Observational cohort study of MRSA-colonized patients to determine the frequency of and risk factors for environmental shedding of MRSA during procedures and care activities in carriers with positive nares and/or wound cultures. Bivariate analyses were performed to identify factors associated with environmental shedding.
Setting:
A Veterans Affairs hospital.
Participants:
This study included 75 patients in contact precautions for MRSA colonization or infection.
Results:
Of 75 patients in contact precautions for MRSA, 55 (73%) had MRSA in nares and/or wounds and 25 (33%) had positive skin cultures. For the 52 patients with MRSA in nares and/or wounds and at least 1 observed procedure, environmental shedding of MRSA occurred more frequently during procedures and care activities than in the absence of a procedure (59 of 138, 43% vs 8 of 83, 10%; P < .001). During procedures, shedding occurred more frequently ≤0.9 m than >0.9 m from the patient (52 of 138, 38% vs 25 of 138, 18%; P = .0004). Contamination occurred frequently on surfaces touched by personnel (12 of 38, 32%) and on portable equipment used for procedures (25 of 101, 25%). By bivariate analysis, the presence of a wound with MRSA was associated with shedding (17 of 29, 59% vs 6 of 23, 26%; P = .04).
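The procedure-versus-no-procedure comparison can be reproduced from the reported counts with a simple 2×2 test, as sketched below. The abstract does not state which test was used, so Fisher's exact test is an assumption.

```python
from scipy.stats import fisher_exact

# Counts from the abstract: shedding during 59 of 138 procedure observations
# vs. 8 of 83 observations without a procedure.
table = [[59, 138 - 59],
         [8,  83 - 8]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.6f}")  # p is well below .001
```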
Conclusions:
Environmental shedding of MRSA occurs frequently during medical procedures and patient care activities. There is a need for effective strategies to disinfect surfaces and equipment after procedures.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, as well as perceptual disturbances (e.g., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased, and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (odds ratios, OR) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). In our cohort of 136 patients, 58 patients (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age associated with decreased risk for delirium) (p=0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p<0.0001) after adjusting for age. Peak delirium risk was for patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped relationship between age and delirium risk after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
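The regression and likelihood ratio test described above can be sketched as follows. The data frame and column names are hypothetical, and the quadratic age term is one plausible way to encode the inverted U-shape; the authors' exact model specification is not given in the abstract.

```python
import pandas as pd
import scipy.stats as st
import statsmodels.formula.api as smf

# Hypothetical per-assessment data: delirium (0/1), catatonia (sign count), age
df = pd.read_csv("assessments.csv")

# Base model vs. a model allowing catatonia's effect to vary with age
base = smf.logit("delirium ~ catatonia + age + I(age**2)", data=df).fit()
full = smf.logit("delirium ~ catatonia * (age + I(age**2))", data=df).fit()

# Likelihood ratio test for effect-measure modification by age
lr_stat = 2 * (full.llf - base.llf)
df_diff = full.df_model - base.df_model
p_value = st.chi2.sf(lr_stat, df_diff)
```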
Timing of weed emergence and seed persistence in the soil influence the ability to implement timely and effective control practices. Emergence patterns and seed persistence of kochia populations were monitored in 2010 and 2011 at sites in Kansas, Colorado, Wyoming, Nebraska, and South Dakota. Weekly observations of emergence were initiated in March and continued until no new emergence occurred. Seed was harvested from each site, placed into 100-seed mesh packets, and buried at depths of 0, 2.5, and 10 cm in fall of 2010 and 2011. Packets were exhumed at 6-mo intervals over 2 yr. Viability of exhumed seeds was evaluated. Nonlinear mixed-effects Weibull models were fit to cumulative emergence (%) across growing degree days (GDD) and to viable seed (%) across burial time to describe their fixed and random effects across site-years. Final emergence densities varied among site-years and ranged from as few as 4 to almost 380,000 seedlings m−2. Across 11 site-years in Kansas, cumulative GDD needed for 10% emergence were 168, while across 6 site-years in Wyoming and Nebraska, only 90 GDD were needed; on the calendar, this date shifted from early to late March. The majority (>95%) of kochia seed did not persist for more than 2 yr. Remaining seed viability was generally >80% when seeds were exhumed within 6 mo after burial in March, and declined to <5% by October of the first year after burial. Burial did not appear to increase or decrease seed viability over time but placed seed in a position from which seedling emergence would not be possible. High seedling emergence that occurs very early in the spring emphasizes the need for fall or early spring PRE weed control such as tillage, herbicides, and cover crops, while continued emergence into midsummer emphasizes the need for extended periods of kochia management.
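A fixed-effects version of the Weibull emergence curve can be fit as sketched below; the study itself used nonlinear mixed-effects models with random site-year effects, and the data points here are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_emergence(gdd, asym, lrc, pwr):
    """Cumulative emergence (%) as a Weibull function of growing degree days."""
    return asym * (1.0 - np.exp(-np.exp(lrc) * gdd**pwr))

# Hypothetical observations: GDD accumulated since March 1 vs. % emergence
gdd = np.array([50, 100, 150, 200, 300, 450, 600, 800], dtype=float)
emerged = np.array([2, 11, 28, 47, 71, 88, 95, 98], dtype=float)

(asym, lrc, pwr), _ = curve_fit(weibull_emergence, gdd, emerged,
                                p0=[100.0, -10.0, 2.0], maxfev=10000)
```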
Field research was conducted during the summers of 1981 and 1982 to determine relative infection and population increase of lesion nematodes (Pratylenchus spp.) on seven weed species that commonly occur in fieldbean (Phaseolus vulgaris L.) fields in western Nebraska. Weeds were grown at three densities with and without fieldbeans. A representative sample of the root systems from plants in each plot was removed in August, and the nematodes were extracted and counted. No difference in nematode infection rate was found among weed population levels. Nematodes per gram of dry root did not differ in weeds grown with or without fieldbeans. Weeds grown with fieldbeans had smaller root systems, and consequently total nematodes per root system were lower than in weeds grown in the absence of fieldbeans. There were significant differences among most weed species in nematodes per gram of dry root. Hairy nightshade (Solanum sarachoides Sendt. ♯ SOLSA) and barnyardgrass [Echinochloa crus-galli (L.) Beauv. ♯ ECHCG] supported the highest numbers of nematodes per gram of oven-dry root, redroot pigweed (Amaranthus retroflexus L. ♯ AMARE) and common cocklebur (Xanthium pensylvanicum Wallr. ♯ XANPE) had the lowest, and infestation levels on the other weed species were variable but generally intermediate.
Management systems for direct-seeded and transplanted sugarbeets (Beta vulgaris L. ‘Mono Hy D2’) were compared for weed control and sugarbeet selectivity from 1983 through 1985 in western Nebraska. Broadleaf weed density was similar, but yellow foxtail [Setaria glauca (L.) Beauv. ♯ SETLU] density was lower in transplanted compared to direct-seeded sugarbeets. Preplant soil-incorporated applications of cycloate (S-ethyl cyclohexylethylcarbamothioate) plus trifluralin [2,6-dinitro-N,N-dipropyl-4-(trifluoromethyl)benzenamine] at 3.3 plus 0.6 kg ai/ha or ethofumesate [(±)-2-ethoxy-2,3-dihydro-3,3-dimethyl-5-benzofuranyl methanesulfonate] plus trifluralin at 2.2 plus 0.6 kg/ha was noninjurious to transplanted sugarbeets but caused severe injury to direct-seeded sugarbeets. The combination of cycloate or ethofumesate with trifluralin improved weed control over that obtained when cycloate or ethofumesate was used alone. By combining the improved weed control obtained from cycloate plus trifluralin or ethofumesate plus trifluralin with the transplanting crop establishment technique, a superior sugarbeet weed control program was developed.
The Chemical Movement through Layered Soils (CMLS) model was modified and combined with the USDA-SCS State Soil Geographic Data Base (STATSGO) and Montana Agricultural Potentials System (MAPS) digital databases to assess the likelihood of groundwater contamination from selected herbicides in Teton County, MT. The STATSGO and MAPS databases were overlaid to produce polygons with unique soil and climate characteristics and attribute tables containing only those data needed by the CMLS model. The Weather Generator (WGEN) computer simulation model was modified and used to generate daily precipitation and evapotranspiration values. A new algorithm was developed to estimate soil carbon as a function of soil depth. The depth of movement of the applied chemicals at the end of the growing season was estimated with CMLS for each of the soil series in the STATSGO soil mapping units and the results were entered into ARC/INFO to produce the final hazard maps showing best, weighted average, and worst case results for every unique combination (polygon) of soil mapping unit and climate. County weed infestation maps for leafy spurge and spotted knapweed were digitized and overlaid in ARC/INFO with the CMLS model results for picloram to illustrate how the results might be used to evaluate the threat to groundwater posed by current herbicide applications.
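The core of CMLS-type leaching models is a piston-flow depth update with linear-sorption retardation, sketched below under assumed parameter names; the full model adds layer-by-layer soil properties and chemical degradation.

```python
def depth_increment(drainage_cm, theta_fc, bulk_density, koc, oc_frac):
    """Downward movement (cm) of a chemical front after a drainage event,
    assuming piston flow at field capacity with linear sorption.
    koc: organic-carbon partition coefficient (cm^3/g); oc_frac: soil
    organic-carbon fraction; bulk_density in g/cm^3; theta_fc is the
    volumetric water content at field capacity."""
    kd = koc * oc_frac                          # sorption coefficient (cm^3/g)
    R = 1.0 + bulk_density * kd / theta_fc      # retardation factor
    return drainage_cm / (theta_fc * R)

# A weakly sorbed chemical (low koc, e.g. picloram-like) moves much deeper
# per cm of drainage than a strongly sorbed one in the same soil layer.
```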
The seed composition in the upper 15-cm soil horizon was determined and correlated with weed seedlings growing with fieldbeans (Phaseolus vulgaris L. ‘Valley’). The total seed reservoir averaged 250 seed/kg of soil, and 19 species were represented. Seeds occurring most frequently were redroot pigweed (Amaranthus retroflexus L. ♯ AMARE), common lambsquarters (Chenopodium album L. ♯ CHEAL), and common purslane (Portulaca oleracea L. ♯ POROL); seed from these species accounted for over 85% of the seed found. The number of barnyardgrass [Echinochloa crus-galli (L.) Beauv. ♯ ECHCG], buffalobur (Solanum rostratum Dunal ♯ SOLCU), common lambsquarters, common purslane, and common sunflower (Helianthus annuus L. ♯ HELAN) seed in the soil was correlated with the number of plants growing in the field with fieldbeans. Numbers of redroot pigweed, yellow foxtail [Setaria lutescens (Weigel.) Hubb. ♯ SETLU], and barnyardgrass plants growing in corn (Zea mays L.) fields in the fall were also correlated with plants growing with fieldbeans in those fields the following year.
An experiment was conducted near Scottsbluff, NE, to assess three techniques for establishing perennial grasses in pasture sites and to evaluate the effectiveness of five perennial grasses compared with herbicide or mowing for Canada thistle control. Perennial grass density 9 mo after seeding and perennial grass biomass 12 mo after seeding both followed the same trend, indicating that preplant rototilling improved perennial grass establishment. After 3 yr, Canada thistle control was greater than 90% in plots where perennial grasses had been established utilizing preplant rototilling, and competitive grasses were as effective as yearly applications of clopyralid at 0.55 kg/ha for controlling Canada thistle. Averaged across two studies conducted for 3 yr, hybrid wheatgrass, intermediate wheatgrass, Russian wildrye, tall fescue, and western wheatgrass provided 85, 74, 76, 78, and 66% Canada thistle control, respectively.
Field experiments, conducted from 1991 to 1994, generated information on weed seedbank emergence for 22 site-years from Ohio to Colorado and Minnesota to Missouri. Early spring seedbank densities were estimated through direct extraction of viable seeds from soil cores. Emerged seedlings were recorded periodically, as were daily values for air and soil temperature, and precipitation. Percentages of weed seedbanks that emerged as seedlings were calculated from seedbank and seedling data for each species, and relationships between seedbank emergence and microclimatic variables were sought. Fifteen species were found in 3 or more site-years. Average emergence percentages (and coefficients of variation) of these species were as follows: giant foxtail, 31.2 (84); velvetleaf, 28.2 (66); kochia, 25.7 (79); Pennsylvania smartweed, 25.1 (65); common purslane, 15.4 (135); common ragweed, 15.0 (110); green foxtail, 8.5 (72); wild proso millet, 6.6 (104); hairy nightshade, 5.2 (62); common sunflower, 5.0 (26); yellow foxtail, 3.4 (67); pigweed species, 3.3 (103); common lambsquarters, 2.7 (111); wild buckwheat, 2.5 (63); and prostrate knotweed, 0.6 (79). Variation among site-years, for some species, could be attributed to microclimate variables thought to induce secondary dormancy in spring. For example, total seasonal emergence percentage of giant foxtail was related positively to the first date at which average daily soil temperature at 5 to 10 cm soil depth reached 16 C. Thus, if soil warmed before mid-April, secondary dormancy was induced and few seedlings emerged, whereas many seedlings emerged if soil remained cool until June.
Imazethapyr at 0.07 and 0.10 kg ai ha−1 applied preplant incorporated (PPI), preemergence (PRE), and postemergence (POST) was evaluated at two locations in 1988 and 1989 for safety to dry edible beans. Bean stunting, leaf crinkling, and interveinal chlorosis were evident from imazethapyr and varied from 1 to 57%. Imazethapyr significantly reduced bean height and delayed maturity. PPI and POST applications of imazethapyr at 0.07 and 0.10 kg ha−1 did not reduce bean seed yields compared with yields of the untreated control. Bean cultivar by herbicide interactions were significant for bean injury but varied with year and location.
Seedbanks and seedling emergence of annual weeds were examined in arable fields at eight locations in the Corn Belt. Seed densities were estimated by direct seed extraction from each of several soil cores in each sampled plot. Average total seedbank densities ranged from 600 to 162,000 viable seed m−2 among locations. Coefficients of variation (CV) typically exceeded 50%. CV for seed densities of individual species usually exceeded 100%, indicating strongly aggregated distributions. CV were lower for species with dense seed populations than those with sparse seed populations. Variance of total seedbank densities was unstable when < 10 cores were examined per plot, but stabilized at all locations when ≥ 15 cores were analyzed, despite a 12-fold difference in plot size and 270-fold difference in seed density among locations. Percentage viable seed that emerged as seedlings in field plots ranged from < 1% for yellow rocket to 30% for giant foxtail. Redroot pigweed and common lambsquarters were the most frequently encountered species. Emergence percentages of these species were related inversely to rainfall or air temperatures in April or May, presumably because anoxia and/or high temperatures induced secondary dormancy in nondormant seed. From 50 to 90% of total seed in the seedbank were dead. This information can be employed by bioeconomic weed management models, which currently use coarse estimates of emergence percentages to customize recommendations for weed control.
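The core-number result can be illustrated with a bootstrap on hypothetical aggregated counts, as sketched below: the coefficient of variation of the plot-mean density stabilizes once roughly 15 cores are drawn. The count distribution and sample sizes are assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

def cv_of_plot_mean(core_counts, n_cores, n_boot=2000):
    """Bootstrap CV of the mean seed density for a given number of cores."""
    means = np.array([rng.choice(core_counts, size=n_cores, replace=True).mean()
                      for _ in range(n_boot)])
    return means.std(ddof=1) / means.mean()

# Hypothetical strongly aggregated per-core viable-seed counts for one plot
counts = rng.negative_binomial(n=0.5, p=0.02, size=60)
for k in (5, 10, 15, 20, 30):
    print(k, round(cv_of_plot_mean(counts, k), 3))
```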
Roots of Canada thistle were excavated from the soil monthly from 1999 to 2001 near Scottsbluff, NE, to quantify the influence of changing soil temperature on free sugars and fructans in roots. Sucrose concentrations were low from May through August, increased in the fall, remained high during winter, and then declined in April as plants initiated spring growth. Changes in sucrose, 1-kestose (DP 3), and 1-nystose (DP 4) were closely associated with changes in soil temperature. During the second year of the study, average soil temperatures during the winter were colder than in the first year, resulting in an increase of sucrose in Canada thistle roots. Experiments were conducted from 2001 to 2004 to determine whether herbicide efficacy was correlated with time of herbicide application and the resulting herbicide effect on root carbohydrates and Canada thistle control. Clopyralid applied in the fall reduced Canada thistle density 92% 8 months after treatment (MAT), whereas spring treatment reduced plant density 33% 11 MAT. Fall application of clopyralid increased the activity of fructan 1-exohydrolase (1-FEH) in roots and was associated with a decline in sucrose, DP 4, and 1-fructofuranosyl-nystose (DP 5) 35 d after treatment (DAT). Spring application of clopyralid also resulted in a decrease of the same carbohydrates 35 DAT, but by 98 DAT, or early October, the sucrose level in roots had recovered and was similar to that of nontreated plants. Fall application of 2,4-D or clopyralid reduced Canada thistle density 39% and 92%, respectively, 8 MAT, but only clopyralid resulted in a reduction of sucrose, DP 4, DP 5, and total sugar and an increase of 1-FEH compared with nontreated plants.
Riparian habitats are important components of an ecosystem; however, their hydrology combined with anthropogenic effects facilitates the establishment and spread of invasive plant species. We used a maximum-entropy predictive habitat model, MAXENT, to predict the distributions of five invasive plant species (Canada thistle, musk thistle, Russian olive, phragmites, and saltcedar) along the North Platte River in Nebraska. Projections for each species were highly accurate. Elevation and distance from the river were the most important variables for each species. Saltcedar and phragmites appear to have restricted distributions in the study area, whereas Russian olive and the thistle species were broadly distributed. Results from this study hold promise for the development of proactive management approaches to identify and control areas of high abundance and prevent further spread of invasive plants along the North Platte River.