Objectives/Goals: To explore caregivers’ lived experiences of facilitators of and barriers to effective primary care or neurology follow-up for children discharged from the pediatric emergency department (PED) with headaches. Methods/Study Population: We used a descriptive phenomenological qualitative study design to ascertain caregivers’ lived experiences with making follow-up appointments after their child’s PED visit. We conducted semi-structured interviews with caregivers of children with headaches from 4 large urban PEDs over a HIPAA-compliant Zoom conferencing platform. A facilitator/co-facilitator team (JH and SL) guided all interviews, and the audio was transcribed using TRINT software. Conventional content analysis was performed by two coders (JH and AS) to generate new themes, and coding disputes were resolved by team members using Atlas.ti (version 24). Results/Anticipated Results: We interviewed a total of 11 caregivers (9 mothers, 1 grandmother, and 1 father). Among interviewees, 45% identified as White non-Hispanic, 45% as Hispanic, and 9% as African-American; 37% were publicly insured. Participants described similar experiences in obtaining follow-up care, including long waits for neurology appointments. Participants also described opportunities to overcome wait times, including offering alternative healthcare provider types as well as telehealth options. Lastly, participants described desired actions while awaiting neurology appointments, such as obtaining testing and setting treatment plans. Discussion/Significance of Impact: Caregivers perceived time to appointment as too long and identified practical solutions to ease frustrations while waiting. Future research should explore sharing caregiver experiences with primary care providers, PED physicians, and neurologists while developing plans to implement caregiver-informed interventions.
Placebo and nocebo effects are widely reported across psychiatric conditions, yet have seldom been examined in the context of gambling disorder. Through meta-analysis, we examined placebo effects, their moderating factors, and nocebo effects, from available randomised, controlled pharmacological clinical trials in gambling disorder.
Methods:
We searched a broad range of databases, up to 19 February 2024, for double-blind randomised controlled trials (RCTs) of medications for gambling disorder. Outcomes were gambling symptom severity and quality of life (for efficacy), and dropouts due to medication side effects in the placebo arms.
Results:
We included 16 RCTs (n = 833) in the meta-analysis. The overall effect size for gambling severity reduction in the placebo arms was 1.18 (95% CI 0.91–1.46) and for quality of life improvement was 0.63 (95% CI 0.42–0.83). Medication class, study sponsorship, trial duration, baseline severity of gambling and publication year significantly moderated effect sizes for at least some of these outcome measures. Author conflict of interest, placebo run-in, gender split, severity scale choice, age of participants and unbalanced randomisation did not moderate effect sizes. Nocebo effects leading to drop-out from the trial were observed in 6% of participants in trials involving antipsychotics, and were lower for other medication classes.
Conclusion:
Placebo effects in trials of pharmacological treatment of gambling disorder are large, and there are several moderators of this effect. Nocebo effects were measurable and may be influenced by the medication class being studied. Practical implications of these new findings for the field are discussed, along with recommendations for future clinical trials.
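Pooled effect sizes of the kind reported above are conventionally obtained by inverse-variance weighting under a random-effects model. A minimal sketch of the DerSimonian–Laird procedure follows; the per-study effect sizes and variances are invented for illustration, not taken from the trials in this meta-analysis.

```python
import numpy as np

def pool_random_effects(effects, variances):
    """Pool study effect sizes with a DerSimonian-Laird random-effects model."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                              # fixed-effect (inverse-variance) weights
    theta_fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - theta_fixed) ** 2)     # Cochran's Q heterogeneity statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance estimate
    w_re = 1.0 / (variances + tau2)                  # random-effects weights
    theta = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return theta, (theta - 1.96 * se, theta + 1.96 * se)

# Illustrative (made-up) pre-post effect sizes and their variances for four studies
est, ci = pool_random_effects([1.0, 1.3, 0.9, 1.5], [0.04, 0.09, 0.05, 0.12])
```

When between-study heterogeneity (Q) is no larger than its degrees of freedom, tau² collapses to zero and the estimate reduces to the fixed-effect pooled value.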
Repetitive transcranial magnetic stimulation (TMS) is an evidence-based treatment for adults with treatment-resistant depression (TRD). The standard clinical protocol for TMS is to stimulate the left dorsolateral prefrontal cortex (DLPFC). Although the DLPFC is a defining region in the brain’s cognitive control network and is implicated in executive functions such as attention and working memory, we lack knowledge about whether TMS improves cognitive function independent of depression symptoms. This exploratory analysis sought to address this gap by assessing changes in attention before and after completion of a standard course of TMS in Veterans with TRD.
Participants and Methods:
Participants consisted of 7 Veterans (14.3% female; age M = 46.14, SD = 7.15; years of education M = 16.86, SD = 3.02) who completed a full 30-session course of TMS treatment and had significant depressive symptoms at baseline (Patient Health Questionnaire-9; PHQ-9 score >5). Participants were given neurocognitive assessments measuring aspects of attention [Wechsler Adult Intelligence Scale 4th Edition (WAIS-IV) subtests: Digits Forward, Digits Backward, and Number Sequencing] at baseline and again after completion of TMS treatment. The relationships between pre- and post-treatment scores were examined using paired-samples t-tests for continuous variables and linear regressions to covary for depression and posttraumatic stress disorder (PTSD), which is often comorbid with depression in Veteran populations.
Results:
There was a significant improvement in Digit Span Forward (p = .01, d = -.53), but not Digit Span Backward (p = .06) or Number Sequencing (p = .54), post-TMS treatment. Depression severity was not a significant predictor of performance on Digit Span Forward (F(1,5) = .29, p = .61) after TMS treatment. PTSD severity was also not a significant predictor of performance on Digit Span Forward (F(1,5) = 1.31, p = .32).
Conclusions:
Findings suggest that a standard course of TMS improves performance on less demanding measures of working memory, but possibly not on more demanding aspects of working memory. This improvement in cognitive function was independent of improvements in depression and PTSD symptoms. Further investigation in a larger sample and with direct neuroimaging measures of cognitive function is warranted.
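The pre/post contrast described above can be sketched with a paired-samples t-test and a paired Cohen's d. The scores below are invented for illustration (not the study's data), and the p < .05 decision uses the two-tailed critical value for df = 6.

```python
import numpy as np

# Hypothetical pre/post Digit Span Forward scores for 7 participants
# (invented for illustration; not the study's data).
pre  = np.array([8, 7, 9, 6, 8, 7, 9], dtype=float)
post = np.array([9, 8, 10, 8, 9, 8, 10], dtype=float)

diff = pre - post
n = len(diff)
t = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))   # paired-samples t statistic
d = diff.mean() / diff.std(ddof=1)                  # Cohen's d for paired scores

# With df = n - 1 = 6, |t| > 2.447 corresponds to two-tailed p < .05
significant = abs(t) > 2.447
```

A negative t (and d) here indicates higher post-treatment scores, matching the sign convention of the d = -.53 reported for Digit Span Forward.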
There are few evidence-based interventions to support caregiver mental health developed for low- and middle-income countries. Nae Umeed is a community-based group intervention developed collaboratively with local community health workers in Uttarakhand, India, primarily to promote mental wellbeing for caregivers and others. This pre–post study aimed to evaluate whether Nae Umeed improved mental health and social participation for people with mental distress, including caregivers. The intervention consisted of 14 structured group sessions facilitated by community health workers. Among 115 adult participants, 20% were caregivers and 80% were people with disability and other vulnerable community members; 62% had no formal education and 92% were female. Substantial and statistically significant improvements occurred in validated psychometric measures of mental health (12-Item General Health Questionnaire, Patient Health Questionnaire-9) and social participation (Participation Scale). Improvements occurred regardless of caregiver status. The intervention reached marginalised groups that typically lack access to formal mental health care, and the findings suggest Nae Umeed improved mental health and social participation; however, a controlled community trial would be required to establish causation. Community-based group interventions are a promising approach to improving the mental health of vulnerable groups in South Asia.
Information about population sizes, trends, and habitat use is key for species conservation and management. The Buff-breasted Sandpiper Calidris subruficollis (BBSA) is a long-distance migratory shorebird that breeds in the Arctic and migrates to south-eastern South America, wintering in the grasslands of southern Brazil, Uruguay, and Argentina. Most studies of Nearctic migratory species occur in the Northern Hemisphere, but monitoring these species at non-breeding areas is crucial for conservation during this phase of the annual cycle. Our first objective was to estimate trends of BBSA at four key areas in southern Brazil during the non-breeding season. We surveyed for BBSA and measured vegetation height in most years from 2008/09 to 2019/20. We used hierarchical distance sampling models in which BBSA abundance and density were modelled as a function of vegetation height and corrected for detectability. Next, we used on-the-ground surveys combined with satellite imagery and habitat classification models to estimate BBSA population size in 2019/20 at two major non-breeding areas. We found that abundance and density were negatively affected by increasing vegetation height. Abundance fluctuated five- to eight-fold over the study period, with peaks in the middle of the study (2014/15). We estimated the BBSA wintering population size as 1,201 (95% credible interval [CI]: 637–1,946) birds in Torotama Island and 2,232 (95% CI: 1,199–3,584) in Lagoa do Peixe National Park during the 2019/20 austral summer. Although no pronounced trend was detected, BBSA abundance fluctuated greatly from year to year. Our results demonstrate that only two of the four key areas hold high densities of BBSA and highlight the positive effect of short grass on BBSA numbers. Short-grass coastal habitats used by BBSA are strongly influenced by livestock grazing and climate, and are expected to shrink in size with future development and climatic changes.
Despite the availability of remote consultation and the evidence for its effectiveness, its adoption has been relatively limited (Hashiguchi, 2020). In light of COVID-19 social distancing measures, there was an immediate requirement to adopt this technology into routine practice.
Objectives
The objective of this evaluation was to examine clinicians’ experiences of the urgent adoption of digital technology in an NHS provider of mental health and community physical health services.
Methods
From a staff survey (n = 234) of experiences of working during a period of significant COVID-related restrictions, data were extracted and subjected to thematic analysis by a research team made up of clinicians, academics, and quality improvement specialists.
Results
Five key themes relevant to the urgent adoption of digital technology were identified (figure 1): (1) Availability of staff for patient contact was generally felt to be improved; (2) Quality of contact was reported to be variable (e.g. some respondents reported better rapport with patients, whereas others found remote contact interfered with rapport building); (3) Safeguarding concerns were reported to be more difficult to identify through remote consultation; (4) Contingency plans were recommended for vulnerable patients for whom remote consultation was problematic; (5) Multi-agency working was reported to be strengthened.
Conclusions
The findings from this evaluation allow for an informed approach to future adoption of remote consultation in routine practice.
Gravitational waves from coalescing neutron stars encode information about nuclear matter at extreme densities, inaccessible by laboratory experiments. The late inspiral is influenced by the presence of tides, which depend on the neutron star equation of state. Neutron star mergers are expected to often produce rapidly rotating remnant neutron stars that emit gravitational waves. These will provide clues to the extremely hot post-merger environment. This signature of nuclear matter in gravitational waves contains most information in the 2–4 kHz frequency band, which is outside of the most sensitive band of current detectors. We present the design concept and science case for a Neutron Star Extreme Matter Observatory (NEMO): a gravitational-wave interferometer optimised to study nuclear physics with merging neutron stars. The concept uses high-circulating laser power, quantum squeezing, and a detector topology specifically designed to achieve the high-frequency sensitivity necessary to probe nuclear matter using gravitational waves. Above 1 kHz, the proposed strain sensitivity is comparable to full third-generation detectors at a fraction of the cost. Such sensitivity changes expected event rates for detection of post-merger remnants from approximately one per few decades with two A+ detectors to a few per year and potentially allow for the first gravitational-wave observations of supernovae, isolated neutron stars, and other exotica.
Emergency Medical Services (EMS) systems have developed protocols for prehospital activation of the cardiac catheterization laboratory for patients with suspected ST-elevation myocardial infarction (STEMI) to decrease first-medical-contact-to-balloon time (FMC2B). The rate of “false positive” prehospital activations is high. In order to decrease this rate and expedite care for patients with true STEMI, the American Heart Association (AHA; Dallas, Texas USA) developed the Mission Lifeline PreAct STEMI algorithm, which was implemented in Los Angeles County (LAC; California USA) in 2015. The hypothesis of this study was that implementation of the PreAct algorithm would increase the positive predictive value (PPV) of prehospital activation.
Methods:
This is an observational pre-/post-study of the effect of the implementation of the PreAct algorithm for patients with suspected STEMI transported to one of five STEMI Receiving Centers (SRCs) within the LAC Regional System. The primary outcome was the PPV of cardiac catheterization laboratory activation for percutaneous coronary intervention (PCI) or coronary artery bypass graft (CABG). The secondary outcome was FMC2B.
Results:
A total of 1,877 patients were analyzed for the primary outcome in the pre-intervention period and 405 patients in the post-intervention period. There was an overall decrease in cardiac catheterization laboratory activations, from 67% in the pre-intervention period to 49% in the post-intervention period (95% CI for the difference, -22% to -14%). The overall rate of cardiac catheterization declined in the post-intervention period compared with the pre-intervention period, from 34% to 30% (95% CI for the difference, -7.6% to 0.4%), but increased for subjects who had activation (48% versus 58%; 95% CI, 4.6%-15.0%). Implementation of the PreAct algorithm was associated with an increase in the PPV of activation for PCI or CABG from 37.9% to 48.6%. The overall odds ratio (OR) associated with the intervention was 1.4 (95% CI, 1.1-1.8). The effect of the intervention was to decrease variability between medical centers. There was no associated change in average FMC2B.
Conclusions:
The implementation of the PreAct algorithm in the LAC EMS system was associated with an overall increase in the PPV of cardiac catheterization laboratory activation.
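The PPV and odds ratio in this study come directly from 2 × 2 activation-versus-outcome counts. A sketch with illustrative counts (chosen to roughly match the reported percentages; these are not the study's actual tables):

```python
# Illustrative 2x2 reasoning for prehospital cath-lab activation
# (counts are invented to approximate the reported percentages).

def ppv(true_pos, false_pos):
    """PPV: fraction of cath-lab activations confirmed by PCI or CABG."""
    return true_pos / (true_pos + false_pos)

def odds_ratio(tp_post, fp_post, tp_pre, fp_pre):
    """Odds of a confirmed activation, post- vs. pre-intervention."""
    return (tp_post / fp_post) / (tp_pre / fp_pre)

ppv_pre  = ppv(477, 781)   # roughly 37.9% with these illustrative counts
ppv_post = ppv(35, 37)     # roughly 48.6%
or_est   = odds_ratio(35, 37, 477, 781)
```

With any counts reproducing the reported PPVs, the post/pre odds ratio exceeds 1, consistent with the reported OR of 1.4.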
Registry-based trials have emerged as a potentially cost-saving study methodology. Early estimates of cost savings, however, conflated the benefits associated with registry utilisation and those associated with other aspects of pragmatic trial designs, which might not all be as broadly applicable. In this study, we sought to build a practical tool that investigators could use across disciplines to estimate the ranges of potential cost differences associated with implementing registry-based trials versus standard clinical trials.
Methods:
We built simulation Markov models to compare unique costs associated with data acquisition, cleaning, and linkage under a registry-based trial design versus a standard clinical trial. We conducted one-way, two-way, and probabilistic sensitivity analyses, varying study characteristics over broad ranges, to determine thresholds at which investigators might optimally select each trial design.
Results:
Registry-based trials were more cost-effective than standard clinical trials 98.6% of the time. Data-related cost savings ranged from $4,300 to $600,000 with variation in study characteristics. Cost differences were most sensitive to the number of patients in a study, the number of data elements per patient available in a registry, and the speed with which research coordinators could manually abstract data. Registry incorporation resulted in cost savings when as few as 3,768 independent data elements were available and when manual data abstraction took as little as 3.4 seconds per data field.
Conclusions:
Registries offer important resources for investigators. When available, their broad incorporation may help the scientific community reduce the costs of clinical investigation. We offer here a practical tool for investigators to assess potential costs savings.
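The model's core logic can be sketched as a toy Monte Carlo comparison: each simulated study draws a size, a per-patient data-element count, and per-field costs, then compares a registry design (higher fixed setup, cheap automated extraction) against a standard design (lower setup, expensive manual abstraction). All parameters below are assumptions for illustration, not the study's calibrated inputs.

```python
import random

def trial_cost(n_patients, fields_per_patient, cost_per_field, fixed_setup):
    """Total data-handling cost: setup plus per-field acquisition."""
    return fixed_setup + n_patients * fields_per_patient * cost_per_field

def simulate(n_sims=10_000, seed=1):
    """Fraction of simulated studies in which the registry design is cheaper."""
    rng = random.Random(seed)
    registry_cheaper = 0
    for _ in range(n_sims):
        n = rng.randint(200, 5000)            # study size (assumed range)
        fields = rng.randint(20, 200)         # data elements per patient (assumed)
        abstraction = rng.uniform(0.5, 3.0)   # $ per manually abstracted field (assumed)
        extraction = rng.uniform(0.01, 0.10)  # $ per registry-extracted field (assumed)
        standard = trial_cost(n, fields, abstraction, fixed_setup=5_000)
        registry = trial_cost(n, fields, extraction, fixed_setup=40_000)
        registry_cheaper += registry < standard
    return registry_cheaper / n_sims

share = simulate()
```

The registry design wins whenever the per-field savings outgrow its extra setup cost, which is why the advantage grows with study size and data-element count, mirroring the sensitivity pattern reported above.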
We investigate experimentally the turbulent flow through a two-dimensional contraction. Using a water tunnel with an active grid we generate turbulence at Taylor microscale Reynolds number $Re_{\lambda}\sim 250$ which is advected through a 2.5:1 contraction. Volumetric and time-resolved tomographic particle image velocimetry and shake-the-box velocity measurements are used to characterize the evolution of coherent vortical structures at three streamwise locations upstream of and within the contraction. We confirm the conceptual picture of coherent large-scale vortices being stretched and aligned with the mean rate of strain. This alignment of the vortices with the tunnel centreline is stronger compared to the alignment of vorticity with the large-scale strain observed in numerical simulations of homogeneous turbulence. We judge this by the peak probability magnitudes of these alignments. This result is robust and independent of the grid-rotation protocols. On the other hand, while the pointwise vorticity vector also, to a lesser extent, aligns with the mean strain, it principally remains aligned with the intermediate eigenvector of the local instantaneous strain-rate tensor, as is known in other turbulent flows. These results persist when the distance from the grid to the entrance of the contraction is doubled, showing that modest transverse inhomogeneities do not significantly affect these vortical-orientation results.
Early behaviors that differentiate later biomarkers for psychopathology can guide preventive efforts while also facilitating pathophysiological research. We tested whether the error-related negativity (ERN) moderates the link between early behavior and later psychopathology for two early childhood phenotypes: behavioral inhibition and irritability. From ages 2 to 7 years, children (n = 291) were assessed longitudinally for behavioral inhibition (BI) and irritability. Behavioral inhibition was assessed via maternal report and behavioral responses to novelty. Childhood irritability was assessed using the Child Behavior Checklist. At age 12, an electroencephalogram (EEG) was recorded while children performed a flanker task to measure the ERN, a neural indicator of error monitoring. Clinical assessments of anxiety and irritability were conducted using questionnaires (i.e., the Screen for Child Anxiety Related Disorders and the Affective Reactivity Index) and clinical interviews. Error monitoring interacted with early BI and early irritability to predict later psychopathology. Among children with high BI, an enhanced ERN predicted greater social anxiety at age 12. In contrast, among children with high early irritability, a blunted ERN predicted greater irritability at age 12. These findings converge with previous work and provide novel insight into the specificity of pathways associated with psychopathology.
Using existing data from clinical registries to support clinical trials and other prospective studies has the potential to improve research efficiency. However, little has been reported about staff experiences and lessons learned from implementation of this method in pediatric cardiology.
Objectives:
We describe the process of using existing registry data in the Pediatric Heart Network Residual Lesion Score Study, report stakeholders’ perspectives, and provide recommendations to guide future studies using this methodology.
Methods:
The Residual Lesion Score Study, a 17-site prospective, observational study, piloted the use of existing local surgical registry data (collected for submission to the Society of Thoracic Surgeons-Congenital Heart Surgery Database) to supplement manual data collection. A survey regarding processes and perceptions was administered to study site and data coordinating center staff.
Results:
Survey response rate was 98% (54/55). Overall, 57% perceived that using registry data saved research staff time in the current study, and 74% perceived that it would save time in future studies; 55% noted significant upfront time in developing a methodology for extracting registry data. Survey recommendations included simplifying data extraction processes and tailoring to the needs of the study, understanding registry characteristics to maximise data quality and security, and involving all stakeholders in design and implementation processes.
Conclusions:
Use of existing registry data was perceived to save time and promote efficiency. Consideration must be given to the upfront investment of time and resources needed. Ongoing efforts focussed on automating and centralising data management may aid in further optimising this methodology for future studies.
Despite established clinical associations among major depression (MD), alcohol dependence (AD), and alcohol consumption (AC), the nature of the causal relationship between them is not completely understood. We leveraged genome-wide data from the Psychiatric Genomics Consortium (PGC) and UK Biobank to test for the presence of shared genetic mechanisms and causal relationships among MD, AD, and AC.
Methods
Linkage disequilibrium score regression and Mendelian randomization (MR) were performed using genome-wide data from the PGC (MD: 135 458 cases and 344 901 controls; AD: 10 206 cases and 28 480 controls) and UK Biobank (AC-frequency: 438 308 individuals; AC-quantity: 307 098 individuals).
Results
Positive genetic correlation was observed between MD and AD (rg(MD–AD) = +0.47, P = 6.6 × 10⁻¹⁰). AC-quantity showed positive genetic correlation with both AD (rg(AD–AC quantity) = +0.75, P = 1.8 × 10⁻¹⁴) and MD (rg(MD–AC quantity) = +0.14, P = 2.9 × 10⁻⁷), while AC-frequency was negatively correlated with MD (rg(MD–AC frequency) = −0.17, P = 1.5 × 10⁻¹⁰) and not significantly correlated with AD. MR analyses confirmed the presence of pleiotropy among these four traits. However, the MD–AD results reflect a mediated-pleiotropy mechanism (i.e. a causal relationship) with an effect of MD on AD (beta = 0.28, P = 1.29 × 10⁻⁶). There was no evidence for reverse causation.
Conclusion
This study supports a causal role for genetic liability of MD on AD based on genetic datasets including thousands of individuals. Understanding mechanisms underlying MD-AD comorbidity addresses important public health concerns and has the potential to facilitate prevention and intervention efforts.
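A standard workhorse for the MR step described above is the inverse-variance-weighted (IVW) estimator, which combines per-SNP effects of the exposure and outcome into a single causal estimate. A minimal sketch with fabricated summary statistics (not PGC or UK Biobank data):

```python
import numpy as np

def ivw_mr(beta_exp, beta_out, se_out):
    """Inverse-variance-weighted MR estimate of the causal effect of an
    exposure on an outcome, from per-SNP GWAS summary statistics."""
    bx, by, se = map(np.asarray, (beta_exp, beta_out, se_out))
    w = 1.0 / se ** 2                                 # weight SNPs by outcome precision
    beta = np.sum(w * bx * by) / np.sum(w * bx ** 2)  # weighted regression through origin
    se_beta = np.sqrt(1.0 / np.sum(w * bx ** 2))
    return beta, se_beta

# Fabricated SNP-level effects: exposure betas, outcome betas, outcome SEs
beta, se = ivw_mr([0.10, 0.20, 0.15], [0.030, 0.060, 0.045], [0.01, 0.01, 0.01])
```

Because the fabricated outcome betas here are exactly 0.3 times the exposure betas, the estimator recovers a causal effect of 0.3; in practice, detecting mediated pleiotropy versus confounding additionally requires sensitivity analyses such as MR-Egger.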
The Numeniini is a tribe of 13 wader species (Scolopacidae, Charadriiformes) of which seven are Near Threatened or globally threatened, including two Critically Endangered. To help inform conservation management and policy responses, we present the results of an expert assessment of the threats that members of this taxonomic group face across migratory flyways. Most threats are increasing in intensity, particularly in non-breeding areas, where habitat loss resulting from residential and commercial development, aquaculture, mining, transport, disturbance, problematic invasive species, pollution and climate change were regarded as having the greatest detrimental impact. Fewer threats (mining, disturbance, problematic native species and climate change) were identified as widely affecting breeding areas. Numeniini populations face the greatest number of non-breeding threats in the East Asian-Australasian Flyway, especially those associated with coastal reclamation; related threats were also identified across the Central and Atlantic Americas, and East Atlantic flyways. Threats on the breeding grounds were greatest in the Central and Atlantic Americas, East Atlantic and West Asian flyways. Three priority actions were associated with monitoring and research: to monitor breeding population trends (which for species breeding in remote areas may best be achieved through surveys at key non-breeding sites), to deploy tracking technologies to identify migratory connectivity, and to monitor land-cover change across breeding and non-breeding areas. Two priority actions were focused on conservation and policy responses: to identify and effectively protect key non-breeding sites across all flyways (particularly in the East Asian-Australasian Flyway), and to implement successful conservation interventions at a sufficient scale across human-dominated landscapes for species’ recovery to be achieved.
If implemented urgently, these measures in combination have the potential to halt the current population declines of many Numeniini species and provide a template for the conservation of other groups of threatened species.
Few records in the alpine landscape of western North America document the geomorphic and glaciologic response to climate change during the Pleistocene–Holocene transition. While moraines can provide snapshots of glacier extent, high-resolution records of environmental response to the end of the Last Glacial Maximum, Younger Dryas cooling, and subsequent warming into the stable Holocene are rare. We describe the transition from the late Pleistocene to the Holocene using a ~ 17,000-yr sediment record from Swiftcurrent Lake in eastern Glacier National Park, MT, with a focus on the period from ~ 17 to 11 ka. Total organic and inorganic carbon, grain size, and carbon/nitrogen data provide evidence for glacial retreat from the late Pleistocene into the Holocene, with the exception of a well-constrained advance during the Younger Dryas from 12.75 to 11.5 ka. Increased detrital carbonate concentration in Swiftcurrent Lake sediment reflects enhanced glacial erosion and sediment transport, likely a result of a more proximal ice terminus position and a reduction in the number of alpine lakes acting as sediment sinks in the valley.
An experiment was conducted in 1998 and 1999 in southeastern Pennsylvania to examine the effect of tillage and soybean row spacing on burcucumber emergence and growth. A second experiment evaluated postemergence (POST) soybean herbicides for burcucumber control. In the tillage and row spacing study, a glyphosate-resistant soybean variety was planted in no-till and reduced-tillage systems in 38- and 76-cm row spacings. In the POST herbicide experiment, chlorimuron, glyphosate, CGA-277476, thifensulfuron, and several combinations of these herbicides were applied at two different POST application timings in 38-cm-row soybean planted in a reduced-tillage system. In the tillage and row spacing study, burcucumber emergence was greatest from late May through mid-June and mostly ceased by early July, regardless of tillage system or row spacing. Although there was no difference in germination period in either tillage system, preplant tillage increased the number of emerged plants by 110% in 1997 and 70% in 1998 compared to the no-till system. Row spacing had no effect on burcucumber emergence or biomass production. In general, most POST herbicide programs controlled burcucumber, and there was no difference between early POST and mid-POST application timings. Chlorimuron at 13 g ai/ha, chlorimuron plus thifensulfuron, glyphosate, glyphosate plus chlorimuron, and glyphosate plus CGA-277476 provided 87% or greater control of burcucumber 12 wk after planting. These herbicides reduced burcucumber density and biomass by more than 56% in 1997 and 96% in 1998.
Field experiments were conducted in grain sorghum at five locations in Kansas in 2009 and 2010, to evaluate the efficacy and crop safety of early- to mid-POST (EMPOST) and late-POST (LPOST) applications of premixed pyrasulfotole and bromoxynil (PYRA&BROM) in tank mix combinations with atrazine or atrazine plus 2,4-D ester or dicamba compared to bromoxynil plus atrazine. PYRA&BROM at 244 or 300 g ai ha−1 plus atrazine at 560 g ai ha−1 applied EMPOST controlled pigweed species (Palmer amaranth, tumble pigweed, and redroot pigweed), kochia, velvetleaf, common sunflower, ivyleaf morningglory, and common lambsquarters 93% or greater. Puncturevine control among three locations ranged from 85 to 99%. Control of most weed species was not improved by increasing PYRA&BROM rate from 244 to 300 g ha−1 or by tank mixing 2,4-D or dicamba with PYRA&BROM plus atrazine. However, ivyleaf morningglory control was improved at the LPOST timing by adding 2,4-D or dicamba at 140 g ae ha−1. In no instance did any PYRA&BROM treatment provide greater weed control than bromoxynil plus atrazine at 281 + 560 g ha−1 when applied EMPOST, but in most instances PYRA&BROM treatments were more effective than bromoxynil plus atrazine when applied LPOST. Generally, PYRA&BROM treatments were more effective when applied EMPOST than LPOST, especially when 2,4-D or dicamba was added. PYRA&BROM plus atrazine treatments caused foliar bleaching in sorghum within 7 ± 3 d after treatment, but recovery was complete within 3 to 4 wk and grain yields were not reduced. Tank mixing dicamba with PYRA&BROM and atrazine occasionally reduced visible crop response compared to PYRA&BROM plus atrazine. Our results indicate that PYRA&BROM plus atrazine with or without 2,4-D or dicamba selectively controls several troublesome broadleaf weeds in grain sorghum. Foliar bleaching of sorghum leaves can occur but the symptoms are transient, and grain yields are not likely to be reduced.
Volunteer horseradish plants that emerge from root segments remaining after harvest can reduce yields of rotational crops and provide a host for pathogens and insects, thus reducing the benefits of crop rotation. POST applications of halosulfuron in corn can be an effective component of volunteer horseradish management, but the replant interval from application to safe planting of commercial horseradish has not been determined. Fall herbicide applications are another possible volunteer horseradish management strategy that can be implemented once crops are harvested. Therefore, field experiments were conducted to evaluate the safe replant interval of horseradish following halosulfuron applications and to determine the efficacy of fall herbicide applications for volunteer horseradish control. Visual estimates of horseradish injury were greatest (85%) in plantings made zero months after halosulfuron applied at two times the approved rate; moreover, for all rates, injury decreased as the time after halosulfuron application increased. No herbicide injury or root biomass reduction occurred at any halosulfuron rate when horseradish was replanted more than 4 mo after application. Control of volunteer horseradish was 91% or greater for all fall herbicide applications that included 2,4-D. Furthermore, volunteer horseradish shoot density was lowest following combinations of 2,4-D tank-mixed with halosulfuron or rimsulfuron:thifensulfuron (0.2 and 0.4 shoots m−2, respectively) compared with the nontreated control (5.1 shoots m−2). This research demonstrates the effectiveness of both halosulfuron and 2,4-D as components of an integrated management strategy for volunteer horseradish control and indicates that halosulfuron soil persistence beyond 4 mo is unlikely to affect subsequent commercial horseradish production.
Experiments examining burcucumber management in glufosinate-resistant (GR) and imidazolinone-resistant (IMI) corn were conducted in 1997 and 1998 in southeastern Pennsylvania. GR corn was planted in 38- and 76-cm rows, and postemergence (POST) treatments of glufosinate and glufosinate plus atrazine were applied to corn at the V4 or V5 growth stage. In a second study, IMI corn was planted in 76-cm rows, and 15 preemergence (PRE) and POST herbicide programs were evaluated. Herbicide treatments included RPA-201772, CGA 152005, simazine, imazethapyr plus imazapyr, imazamox, chlorimuron plus thifensulfuron, nicosulfuron plus rimsulfuron plus atrazine, CGA 152005 plus primisulfuron, and combinations with atrazine. Burcucumber germinated throughout the growing season, with the greatest emergence occurring in early June, gradually decreasing to minimal emergence by mid-July. Glufosinate alone controlled burcucumber 79 to 90% 7 weeks after planting (WAP), regardless of application timing or row spacing. By 10 to 13 WAP, control was 82% or less due to lack of residual control and new burcucumber emergence. Row spacing had little effect on burcucumber emergence or control and appears to have little impact on burcucumber management in corn. In general, PRE herbicide programs were less effective than POST programs, although PRE treatments containing atrazine equaled some POST herbicides. POST-applied chlorimuron plus thifensulfuron, nicosulfuron plus rimsulfuron plus atrazine, and CGA 152005 plus primisulfuron controlled burcucumber more than 80% and 90% in 1997 and 1998, respectively. Imazethapyr plus imazapyr and imazamox applied POST controlled burcucumber 66% 10 WAP. Adding atrazine to POST herbicide programs did not increase control, with the exception of imazethapyr plus imazapyr.