This paper documents trends over the last two decades in retirement behavior and retirement income choices of participants in TIAA, a large and mature defined contribution plan. From 2000 to 2018, the average age at which TIAA participants stopped contributing to their accounts, which is a lower bound on their retirement age, rose by 1.2 years for female and 2.0 years for male participants. There is considerable variation in the elapsed time between the last contribution to and the first income draw from plan accounts. Only 40% of participants take an initial income payment within 48 months of their last contribution. Later retirement and lags between retirement and the first retirement income payout led to a growing fraction of participants reaching the required minimum distribution (RMD) age before starting income draws. Between 2000 and 2018, the fraction of first-time income recipients who took no income until their RMD rose from 10% to 52%, while the fraction of these recipients who selected a life-contingent annuitized payout stream declined from 61% to 18%. Among those who began receiving income before age 70, annuitization rates were significantly higher than among those who did so at older ages. Aggregating across all income-receiving beneficiaries at TIAA, not just new income recipients, the proportion with a life annuity as part of their payout strategy fell from 52% in 2008 to 31% in 2018. By comparison, the proportion of all income recipients taking an RMD payment rose from 16% to 29%. About one-fifth of retirees received more than one type of income; the most common pairing was an RMD and a life annuity. In the later years of our sample, the RMD was becoming the de facto default distribution option for newly retired TIAA participants.
The impact of the coronavirus disease 2019 (COVID-19) pandemic on mental health is still being unravelled. It is important to identify which individuals are at greatest risk of worsening symptoms. This study aimed to examine changes in depression, anxiety and post-traumatic stress disorder (PTSD) symptoms using prospective and retrospective symptom change assessments, and to identify and examine the effects of key risk factors.
Method
Online questionnaires were administered to 34 465 individuals (aged 16 years or above) in April/May 2020 in the UK, recruited from existing cohorts or via social media. Around one-third (n = 12 718) of included participants had prior diagnoses of depression or anxiety and had completed pre-pandemic mental health assessments (between September 2018 and February 2020), allowing prospective investigation of symptom change.
Results
Prospective symptom analyses showed small decreases in depression (PHQ-9: −0.43 points) and anxiety [generalised anxiety disorder scale – 7 items (GAD-7): −0.33 points] and small increases in PTSD symptoms (PCL-6: +0.22 points). Conversely, retrospective symptom analyses demonstrated significant large increases (PHQ-9: +2.40; GAD-7: +1.97), with 55% reporting worsening mental health since the beginning of the pandemic on a global change rating. Across both prospective and retrospective measures of symptom change, worsening depression, anxiety and PTSD symptoms were associated with prior mental health diagnoses, female gender, young age and unemployed/student status.
Conclusions
We highlight the effect of prior mental health diagnoses on worsening mental health during the pandemic and confirm previously reported sociodemographic risk factors. Discrepancies between prospective and retrospective measures of changes in mental health may be related to recall bias-related underestimation of prior symptom severity.
The COVID-19 pandemic has shone a spotlight on how health outcomes are unequally distributed among different population groups, with disadvantaged communities and individuals being disproportionately affected in terms of infection, morbidity and mortality, as well as vaccine access. Recently, there has been considerable debate about how social disadvantage and inequality intersect with developmental processes to result in a heightened susceptibility to environmental stressors, economic shocks and large-scale health emergencies. We argue that members of the Developmental Origins of Health and Disease (DOHaD) Society can make important contributions to addressing issues of inequality and improving community resilience in response to COVID-19. In order to do so, it is beneficial to engage with and adopt a social justice framework. We detail how DOHaD can align its research and policy recommendations with a social justice perspective to ensure that we contribute to improving the health of present and future generations in an equitable and socially just way.
This paper presents the current state of mathematical modelling of the electrochemical behaviour of lithium-ion batteries (LIBs) as they are charged and discharged. It reviews the models developed by Newman and co-workers, both in the cases of dilute and moderately concentrated electrolytes and indicates the modelling assumptions required for their development. Particular attention is paid to the interface conditions imposed between the electrolyte and the active electrode material; necessary conditions are derived for one of these, the Butler–Volmer relation, in order to ensure physically realistic solutions. Insight into the origin of the differences between various models found in the literature is revealed by considering formulations obtained by using different measures of the electric potential. Materials commonly used for electrodes in LIBs are considered and the various mathematical models used to describe lithium transport in them discussed. The problem of upscaling from models of behaviour at the single electrode particle scale to the cell scale is addressed using homogenisation techniques resulting in the pseudo-2D model commonly used to describe charge transport and discharge behaviour in lithium-ion cells. Numerical solution to this model is discussed and illustrative results for a common device are computed.
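For reference, the Butler–Volmer relation named above is commonly quoted in the following form (the review's exact notation, sign conventions and concentration dependence of the exchange current density may differ):

```latex
j = j_0 \left[ \exp\!\left(\frac{\alpha_a F \eta}{R T}\right)
             - \exp\!\left(-\frac{\alpha_c F \eta}{R T}\right) \right],
\qquad \eta = \Phi_s - \Phi_e - U ,
```

where \(j\) is the interfacial current density, \(j_0\) the exchange current density, \(\alpha_a\) and \(\alpha_c\) the anodic and cathodic transfer coefficients, \(\eta\) the surface overpotential (solid-phase potential \(\Phi_s\) minus electrolyte potential \(\Phi_e\) minus open-circuit potential \(U\)), \(F\) Faraday's constant, \(R\) the gas constant and \(T\) the temperature.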
This project will work closely with existing service partners involved in street-level services and will focus on testing and evaluating three approaches to street-level interventions for youth who are homeless and who have moderate or severe mental illness. Youth will be asked to choose their preferred service approach:
1. Housing First: initiatives focused on interventions designed to move youth into appropriate and available housing, with ongoing housing supports;
2. Treatment First: initiatives providing mental health/addiction supports and treatment solutions; and
3. Housing and Treatment Together: simultaneous attention to both housing and treatment.
Our primary objective is to understand the service delivery preferences of homeless youth and understand the outcomes of these choices. Our research questions include:
1. Which approaches to service are chosen by youth?
2. What are the differences and similarities between groups choosing each approach?
3. What are the critical ingredients needed to effectively implement services for homeless youth from the perspectives of youth, families and service providers?
Focus groups with staff and family members will be held to assist in understanding the nature of each service approach, changes that evolve within services, and facilitators and barriers to service delivery. This work will be important in determining which approach is chosen by youth and why. Evaluating the outcomes associated with each choice will provide valuable information about the service options chosen by youth. This will assist in identifying weaknesses in the services offered and inform further development of treatment options that youth will accept.
There is increasing evidence for a neurobiological basis of antisocial personality disorder (ASPD), including genetic liability, aberrant serotonergic function, neuropsychological deficits and structural and functional brain abnormalities. However, few functional brain imaging studies have been conducted using tasks of clinically relevant functions such as impulse control and reinforcement processing. Here we report on a study investigating the neural basis of behavioural inhibition and reward sensitivity in ASPD using functional magnetic resonance imaging (fMRI).
Methods
17 medication-free male individuals with DSM-IV ASPD and 14 healthy controls were included. All subjects were screened for Axis I pathology and substance misuse. Scanner tasks included two block-design tasks: a Go/No-Go task and a reward task. Scanning was carried out on a 1.5 T Philips system. Whole-brain coverage was achieved using 40 axial slices with 3.5 mm spacing and a TR of 5 s. Data were analysed in SPM5 using random-effects models.
Results
Results of the Go/No-Go task confirmed brain activation previously described in the processing of impulse inhibition, namely in the orbitofrontal and dorsolateral prefrontal cortex and the anterior cingulate, and these responses were enhanced in the ASPD group. The reward task was associated with BOLD response changes in the reward network in both groups. However, these BOLD responses were reduced in the ASPD group, particularly in prefrontal areas.
Conclusions
Our results further support the notion of prefrontal dysfunction in ASPD. However, contrary to previous studies suggesting “hypofrontality” in this disorder, we found task specific increased and decreased BOLD responses.
Alcohol-induced liver disease (ALD) is the predominant cause of alcohol-related mortality in the UK. Helping patients with ALD to stop drinking is therefore a primary treatment goal.
Aims/Objectives
The primary aim of this study was to measure the effectiveness and tolerability of Baclofen in maintaining abstinence, and to determine if this resulted in a reduction in standard measures of liver damage.
Methods
An observational prospective clinical audit was performed. Patients with ALD were commenced on Baclofen, titrated according to tolerability and response up to 30 mg TDS. Primary outcome measures were severity of physical dependence (SADQ score) and markers of liver damage (GGT, ALT, bilirubin and fibroelastography). These were compared at baseline and at 1 year.
Results
Of the 243 patients commenced on Baclofen, 151 (85 female, 66 male) have completed 1 year of follow-up (F/U), of whom 130 (86%) have remained engaged. 10 have died. Comparison of baseline (B/L) and 1-year biochemical markers showed a reduction in GGT (χ² = 66.8, P < 0.0001) and bilirubin (χ² = 82.6, P < 0.0001). There was a significant reduction in alcohol consumption (P < 0.0001, 95% CI 10 to 22) and in the presence of physical dependence (χ² = 77.4, P < 0.0001) as categorised by SADQ.
Conclusion
Baclofen is well tolerated in this very difficult-to-treat, high-risk patient group. It has a positive impact on alcohol consumption and on overall measures of liver function and harm. An RCT is needed to confirm the benefit of Baclofen in this patient group.
Feed represents a substantial proportion of production costs in the dairy industry and is a useful target for improving overall system efficiency and sustainability. The objective of this study was to develop methodology to estimate the economic value for a feed efficiency trait and the associated methane production relevant to Canada. The approach quantifies the level of economic savings achieved by selecting animals that convert consumed feed into product while minimizing the feed energy used for inefficient metabolism, maintenance and digestion. We define a selection criterion trait called Feed Performance (FP) as a 1 kg increase in more efficiently used feed in a first parity lactating cow. The impact of a change in this trait on the total lifetime value of more efficiently used feed via correlated selection responses in other life stages is then quantified. The resulting improved conversion of feed was also applied to determine the resulting reduction in output of emissions (and their relative value based on a national emissions value) under an assumption of constant methane yield, where methane yield is defined as kg methane/kg dry matter intake (DMI). Overall, increasing the FP estimated breeding value by one unit (i.e. 1 kg of more efficiently converted DMI during the cow’s first lactation) translates to a total lifetime saving of 3.23 kg in DMI and 0.055 kg in methane with the economic values of CAD $0.82 and CAD $0.07, respectively. Therefore, the estimated total economic value for FP is CAD $0.89/unit. The proposed model is robust and could also be applied to determine the economic value for feed efficiency traits within a selection index in other production systems and countries.
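The headline arithmetic of the abstract can be laid out explicitly. All input figures below are taken directly from the abstract; the implied per-kilogram prices are back-calculated for illustration only and are not stated in the text:

```python
# Figures quoted in the abstract for a one-unit increase in the
# Feed Performance (FP) estimated breeding value.
dmi_saved_kg = 3.23        # lifetime dry matter intake saved (kg)
methane_saved_kg = 0.055   # lifetime methane reduction (kg)
feed_value_cad = 0.82      # economic value of the DMI saving (CAD)
methane_value_cad = 0.07   # economic value of the methane reduction (CAD)

# Total economic value of FP, as reported: CAD $0.89 per unit.
total_value_cad = feed_value_cad + methane_value_cad

# Back-calculated implied unit prices (approximate, not from the abstract).
price_per_kg_dmi = feed_value_cad / dmi_saved_kg             # ~ CAD 0.25/kg DMI
price_per_kg_methane = methane_value_cad / methane_saved_kg  # ~ CAD 1.27/kg CH4
```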
Introduction: Although use of point of care ultrasound (PoCUS) protocols for patients with undifferentiated hypotension in the Emergency Department (ED) is widespread, our previously reported SHoC-ED study showed no clear survival or length of stay benefit for patients assessed with PoCUS. In this analysis, we examine whether the use of PoCUS changed fluid administration and rates of other emergency interventions between patients with different shock types. The primary comparison was between cardiogenic and non-cardiogenic shock types. Methods: A post-hoc analysis was completed on the database from an RCT of 273 patients who presented to the ED with undifferentiated hypotension (SBP < 100 or shock index > 1) and who had been randomized to receive standard care with or without PoCUS in 6 centres in Canada and South Africa. PoCUS-trained physicians performed scans after initial assessment. Shock categories and diagnoses recorded at 60 minutes after ED presentation were used to allocate patients into subcategories of shock for analysis of treatment. We analyzed actual care delivered including initial IV fluid bolus volumes (mL), rates of inotrope use and major procedures. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: Although there were expected differences in the mean fluid bolus volume between patients with non-cardiogenic and cardiogenic shock, there was no difference in fluid bolus volume between the control and PoCUS groups (non-cardiogenic control 1878 mL (95% CI 1550 – 2206 mL) vs. non-cardiogenic PoCUS 1687 mL (1458 – 1916 mL); and cardiogenic control 768 mL (194 – 1341 mL) vs. cardiogenic PoCUS 981 mL (341 – 1620 mL)). Likewise, there were no differences in rates of inotrope administration, or major procedures for any of the subcategories of shock between the control group and PoCUS group patients. The most common subcategory of shock was distributive.
Conclusion: Despite differences in care delivered by subcategory of shock, we did not find any significant difference in actual care delivered between patients who were examined using PoCUS and those who were not. This may help to explain the previously reported lack of outcome difference between groups.
Introduction: Point of care ultrasound has been reported to improve diagnosis in non-traumatic hypotensive ED patients. We compared diagnostic performance of physicians with and without PoCUS in undifferentiated hypotensive patients as part of an international prospective randomized controlled study. The primary outcome was diagnostic performance of PoCUS for cardiogenic vs. non-cardiogenic shock. Methods: SHoC-ED recruited hypotensive patients (SBP < 100 mmHg or shock index > 1) in 6 centres in Canada and South Africa. We describe previously unreported secondary outcomes relating to diagnostic accuracy. Patients were randomized to standard clinical assessment (No PoCUS) or PoCUS groups. PoCUS-trained physicians performed scans after initial assessment. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses including shock category were recorded at 0 and 60 minutes. Final diagnosis was determined by independent blinded chart review. Standard statistical tests were employed. Sample size was powered at 0.80 (α:0.05) for a moderate difference. Results: 273 patients were enrolled with follow-up for primary outcome completed for 270. Baseline demographics and perceived category of shock were similar between groups. 11% of patients were determined to have cardiogenic shock. PoCUS had a sensitivity of 80.0% (95% CI 54.8 to 93.0%), specificity 95.5% (90.0 to 98.1%), LR+ve 17.9 (7.34 to 43.8), LR-ve 0.21 (0.08 to 0.58), diagnostic OR 85.6 (18.2 to 403.6) and accuracy 93.7% (88.0 to 97.2%) for cardiogenic shock. Standard assessment without PoCUS had a sensitivity of 91.7% (64.6 to 98.5%), specificity 93.8% (87.8 to 97.0%), LR+ve 14.8 (7.1 to 30.9), LR-ve 0.09 (0.01 to 0.58), diagnostic OR 166.6 (18.7 to 1481) and accuracy of 93.6% (87.8 to 97.2%). There was no significant difference in sensitivity (-11.7% (-37.8 to 18.3%)) or specificity (1.73% (-4.67 to 8.29%)).
Diagnostic performance was also similar between other shock subcategories. Conclusion: As reported in other studies, PoCUS based assessment performed well diagnostically in undifferentiated hypotensive patients, especially as a rule-in test. However performance was similar to standard (non-PoCUS) assessment, which was excellent in this study.
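The summary statistics above are linked by standard formulas, so the likelihood ratios and diagnostic odds ratio can be recovered from sensitivity and specificity alone. A minimal sketch using the PoCUS-group point estimates from this abstract:

```python
def likelihood_ratios(sensitivity: float, specificity: float):
    """Positive/negative likelihood ratios and diagnostic odds ratio."""
    lr_pos = sensitivity / (1 - specificity)   # LR+ = sens / (1 - spec)
    lr_neg = (1 - sensitivity) / specificity   # LR- = (1 - sens) / spec
    dor = lr_pos / lr_neg                      # diagnostic odds ratio
    return lr_pos, lr_neg, dor

# PoCUS point estimates for cardiogenic shock, as reported above.
lr_pos, lr_neg, dor = likelihood_ratios(0.800, 0.955)
# lr_pos and lr_neg match the reported 17.9 and 0.21 to rounding; the
# reported DOR (85.6) was presumably computed from the raw 2x2 counts,
# so this reconstruction (~85) can differ slightly.
```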
This study aimed to compare the quality of beef from suckler bulls raised on a high-energy concentrate ration and slaughtered at different carcass weights (CW)/ages. In total, 42 spring-born, Charolais and Limousin-sired, weaned suckler bulls were provided with a finishing diet of ad libitum concentrates and grass silage until they reached target CW of 340, 380 and 420 kg. Intramuscular fat (IMF) content tended (P<0.06) to be higher for 420 kg CW than for 380 and 340 kg CW. Sensory tenderness was lower (P<0.001) for 420 kg CW than for 380 and 340 kg CW. Juiciness was higher (P<0.05) for 420 kg CW than for 380 kg CW. Flavour liking was higher (P<0.05) for 420 and 380 kg CW (which did not differ) than for 340 kg CW. Overall, an increase in CW resulted in a slight increase in IMF content which could be responsible for the increase in juiciness and flavour liking of the beef. An increase in CW led to a decrease in the tenderness of the beef even though the overall liking of the beef was not affected.
Plant canopy reflectance over the 0.45- to 1.25-μm wavelength (WL) of weed species and crops was recorded with a field spectroradiometer to evaluate the possible use of remote sensing to distinguish weeds from crops. Weed and weed-crop species reflectance differences were generally greater at the 0.85 μm WL in the near-infrared spectral region than at the 0.55 μm WL in the visible region, indicating that color infrared (CIR) aerial photography may be useful to detect weed populations in crops. Canopy reflectance data were more directly related to photographic differences in weed-crop images than were single leaf or inflorescence reflectance data. Aerial photography at altitudes of 610 to 3050 m distinguished climbing milkweed (Sarcostemma cynanchoides ♯ SAZCY) in orange [Citrus sinensis (L.) Osbeck ‘Valencia’] trees; ragweed parthenium (Parthenium hysterophorus L. ♯ PTNHY) in carrot (Daucus carota L., var. sativa ‘Long Imperator’); johnsongrass [Sorghum halepense (L.) Pers. ♯ SORHA] in cotton (Gossypium hirsutum L. ‘CP 3774’) and in sorghum [Sorghum bicolor (L.) Moench ‘Oro’]; London rocket (Sisymbrium irio L. ♯ SSYIR) in cabbage; and Palmer amaranth (Amaranthus palmeri S. Wats. ♯ AMAPA) in cotton. Johnsongrass was also detectable with CIR film in maturing grain sorghum from 18 290 m. Detection of weed species in crops was aided by differential stages of inflorescence and senescence, and by the chlorophyll content, color, area, intercellular space, and surface characteristics of the leaves. Discrete plant community areas were determined by computer-based image analyses from a 1:8000-scale positive transparency with efficiencies of 82, 81, 68, and 100% for Palmer amaranth, johnsongrass, sorghum, and cotton, respectively. The computer analyses should permit discrete aerial surveys of weed-crop communities that are necessary for integrated crop management systems.
Introduction: Point of care ultrasound (PoCUS) is an established tool in the initial management of patients with undifferentiated hypotension in the emergency department (ED). While PoCUS protocols have been shown to improve early diagnostic accuracy, there is little published evidence for any mortality benefit. We report the findings from our international multicenter randomized controlled trial, assessing the impact of a PoCUS protocol on survival and key clinical outcomes. Methods: Recruitment occurred at 7 centres in North America (4) and South Africa (3). Scans were performed by PoCUS-trained physicians. Screening at triage identified patients (SBP < 100 or shock index > 1), randomized to PoCUS or control (standard care and no PoCUS) groups. Demographics, clinical details and study findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. The primary outcome measure was 30-day/discharge mortality. Secondary outcome measures included diagnostic accuracy, changes in vital signs, acid-base status, and length of stay. Categorical data were analyzed using Fisher's exact test, and continuous data by Student's t-test and multi-level log-regression testing (GraphPad/SPSS). Final chart review was blinded to initial impressions and PoCUS findings. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no difference between groups for the primary outcome of mortality; PoCUS 32/129 (24.8%; 95% CI 14.3-35.3%) vs. Control 32/129 (24.8%; 95% CI 14.3-35.3%); RR 1.00 (95% CI 0.869 to 1.15; p=1.00). There were no differences in the secondary outcomes of ICU and total length of stay. Our sample size has a power of 0.80 (α:0.05) for a moderate effect size. Other secondary outcomes are reported separately.
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We did not find any mortality or length of stay benefits with the use of a PoCUS protocol, though a larger study is required to confirm these findings. While PoCUS may have diagnostic benefits, these may not translate into a survival benefit effect.
Introduction: Point of Care Ultrasound (PoCUS) protocols are commonly used to guide resuscitation for emergency department (ED) patients with undifferentiated non-traumatic hypotension. While PoCUS has been shown to improve early diagnosis, there is minimal evidence for any outcome benefit. We completed an international multicenter randomized controlled trial (RCT) to assess the impact of a PoCUS protocol on key resuscitation markers in this group. We report diagnostic impact and mortality elsewhere. Methods: The SHoC-ED1 study compared the addition of PoCUS to standard care within the first hour in the treatment of adult patients presenting with undifferentiated hypotension (SBP < 100 mmHg or a shock index > 1.0) with a control group that did not receive PoCUS. Scans were performed by PoCUS-trained physicians. 4 North American and 3 South African sites participated in the study. Resuscitation outcomes analyzed included volume of fluid administered in the ED, changes in shock index (SI), modified early warning score (MEWS), venous acid-base balance, and lactate, at one and four hours. Comparisons utilized a t-test as well as stratified binomial log-regression to assess for any significant improvement in resuscitation among the outcomes. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. There was no significant difference in mean total volume of fluid received between the control (1658 ml; 95%CI 1365-1950) and PoCUS groups (1609 ml; 1385-1832; p=0.79). Significant improvements were seen in SI, MEWS, lactate and bicarbonate with resuscitation in both the PoCUS and control groups; however, there was no difference between groups. Conclusion: SHoC-ED1 is the first RCT to compare PoCUS to standard of care in hypotensive ED patients.
No significant difference in fluid used, or markers of resuscitation was found when comparing the use of a PoCUS protocol to that of standard of care in the resuscitation of patients with undifferentiated hypotension.
Introduction: Point of care ultrasonography (PoCUS) is an established tool in the initial management of hypotensive patients in the emergency department (ED). It has been shown to rule out certain shock etiologies and to improve diagnostic certainty; however, evidence of benefit in the management of hypotensive patients is limited. We report the findings from our international multicenter RCT assessing the impact of a PoCUS protocol on diagnostic accuracy, as well as other key outcomes including mortality, which are reported elsewhere. Methods: Recruitment occurred at 4 North American and 3 Southern African sites. Screening at triage identified patients (SBP < 100 mmHg or shock index > 1) who were randomized to either PoCUS or control groups. Scans were performed by PoCUS-trained physicians. Demographics, clinical details and findings were collected prospectively. Initial and secondary diagnoses were recorded at 0 and 60 minutes, with ultrasound performed in the PoCUS group prior to secondary assessment. Final chart review was blinded to initial impressions and PoCUS findings. Categorical data were analyzed using Fisher's two-tailed exact test. Our sample size was powered at 0.80 (α:0.05) for a moderate effect size. Results: 258 patients were enrolled with follow-up fully completed. Baseline comparisons confirmed effective randomization. The perceived shock category changed more frequently in the PoCUS group 20/127 (15.7%) vs. control 7/125 (5.6%); RR 2.81 (95% CI 1.23 to 6.42; p=0.0134). There was no significant difference in change of diagnostic impression between groups PoCUS 39/123 (31.7%) vs control 34/124 (27.4%); RR 1.16 (95% CI 0.786 to 1.70; p=0.4879). There was no significant difference in the rate of correct category of shock between PoCUS (118/127; 93%) and control (113/122; 93%); RR 1.00 (95% CI 0.936 to 1.08; p=1.00), or for correct diagnosis; PoCUS 90/127 (70%) vs control 86/122 (70%); RR 0.987 (95% CI 0.671 to 1.45; p=1.00).
Conclusion: This is the first RCT to compare PoCUS to standard care for undifferentiated hypotensive ED patients. We found that the use of PoCUS did change physicians’ perceived shock category. PoCUS did not improve diagnostic accuracy for category of shock or diagnosis.
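The relative-risk figures in the results above follow from the reported counts. A small sketch using a standard log-scale (Katz) interval; the abstract does not state its exact CI method, so end-points may differ in the last digit:

```python
from math import exp, log, sqrt

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk with an approximate log-scale (Katz) 95% CI."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Standard error of log(RR) for two independent binomial proportions.
    se = sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lower, upper = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lower, upper

# Change in perceived shock category: PoCUS 20/127 vs. control 7/125.
rr, lower, upper = relative_risk(20, 127, 7, 125)
# Reproduces the reported RR 2.81 (95% CI 1.23 to 6.42) to rounding.
```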
Field and greenhouse studies were conducted to evaluate mesotrione alone and in combinations with low rates of atrazine and bentazon for control of yellow and purple nutsedge. Mesotrione alone at rates of 105 to 210 g ai/ha controlled yellow nutsedge 43 to 70%. Mixtures of mesotrione with atrazine at 280 g ai/ha did not always improve yellow nutsedge control over that by mesotrione alone, but increasing atrazine to 560 g ai/ha in these mixtures generally provided more consistent control of yellow nutsedge. Mesotrione at 105 g ai/ha mixed with bentazon at 280 or 560 g ai/ha controlled yellow nutsedge 88% or greater, which was similar to control from the standard, halosulfuron at 36 g ai/ha. Mesotrione, atrazine, and bentazon alone did not control purple nutsedge. Mixtures of mesotrione plus bentazon, however, did improve control of purple nutsedge over either herbicide applied alone, but this control was not considered commercially acceptable.
Bushkiller, an aggressive perennial vine native to Southeast Asia, has invaded several sites in Alabama, North Carolina, Texas, Louisiana, and Mississippi. Bushkiller has only recently been discovered in North Carolina. The potential economic and environmental consequences associated with established exotic invasive perennial vines and the lack of published control measures for bushkiller prompted research to be conducted at North Carolina State University that may be used in an early-detection rapid-response program. Field and greenhouse studies were conducted to determine bushkiller response to selected foliar-applied herbicides. Field study 1 evaluated efficacy of glyphosate, triclopyr, triclopyr plus 2,4-D, triclopyr plus aminopyralid, and triclopyr plus glyphosate applied postemergence to bushkiller. No control was evident from any treatment at 10 mo after application. In a separate experiment, aminocyclopyrachlor, imazapyr, metsulfuron, sulfometuron, and sulfometuron plus metsulfuron were applied postemergence to bushkiller. Control with aminocyclopyrachlor, imazapyr, sulfometuron, and sulfometuron plus metsulfuron was 88 to 99% at 10 mo after application. Each treatment was also applied to bushkiller in a greenhouse trial. Aminocyclopyrachlor and triclopyr-containing treatments generally resulted in the greatest control, lowest dry weights, and shortest vine lengths among the treatments. These results indicate that several herbicides may be employed initially in an early-detection, rapid-response program for bushkiller. Additional research is needed to determine how effective these herbicides would be in multiple-season treatments that may be required at well established bushkiller infestation sites.
Studies were conducted in 2000 and 2001 to investigate responses of glyphosate-resistant cotton, glyphosate-resistant soybean, and selected weed species to postemergence applications of isopropylamine (Ipa) and diammonium (Dia) salts of glyphosate at selected rates ranging from 0.42 to 3.36 kg ae/ha. No differences were detected between either glyphosate salts or application timings for cotton injury, cotton lint yield, micronaire, fiber length, fiber strength, or fiber uniformity. In a weed-free soybean study, no differences in soybean injury occurred between early-postemergence treatments of the two glyphosate salts. Injury from late-postemergence treatments did not exceed 12% with glyphosate-Ipa or 9% with glyphosate-Dia at 3.36 kg/ha. Soybean treated with glyphosate-Ipa yielded 3,050 kg/ha, whereas soybean treated with glyphosate-Dia yielded 2,880 kg/ha, when averaged over glyphosate rate and application timing. In a soybean study that included weed control as a variable, weed control at 14 d after treatment (DAT) and soybean yield were independent of glyphosate salt. Control of common ragweed, ivyleaf morningglory, pitted morningglory, and large crabgrass at 28 DAT was similar at 0.84 kg/ha of either glyphosate salt.