Major depressive disorder (MDD) imposes a tremendous global disease burden and is the leading cause of disability worldwide. Unfortunately, individuals diagnosed with MDD typically experience a delayed response to traditional antidepressants, and many do not respond adequately to pharmacotherapy, even after multiple trials. The critical need for novel antidepressant treatments has led to a recent resurgence in the clinical application of psychedelics and of intravenous ketamine, which has been investigated as a rapid-acting treatment for treatment-resistant depression (TRD) as well as acute suicidal ideation and behavior. However, variations in the type and quality of experimental design, as well as a range of treatment outcomes in clinical trials of ketamine, make interpretation of this large body of literature challenging.
Objectives
This umbrella review aims to advance our understanding of the effectiveness of intravenous ketamine as a pharmacotherapy for TRD by providing a systematic, quantitative, large-scale synthesis of the empirical literature.
Methods
We performed a comprehensive PubMed search for peer-reviewed meta-analyses of primary studies of intravenous ketamine used in the treatment of TRD. Meta-analyses and primary studies were then screened by two independent coding teams according to pre-established inclusion criteria as well as PRISMA and METRICS guidelines. We then employed metaumbrella, a statistical package developed in R, to perform effect size calculations and conversions as well as statistical tests.
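The effect size arithmetic underlying such an umbrella review can be illustrated with a small sketch. The analyses themselves were run with the metaumbrella package in R; the Python fragment below is not that package's API but a minimal, self-contained illustration of computing a standardized mean difference (Hedges' g) from summary statistics and pooling study estimates under a DerSimonian–Laird random-effects model. All numbers and function names here are hypothetical.

```python
import math

def hedges_g(mean_tx, mean_ctl, sd_tx, sd_ctl, n_tx, n_ctl):
    """Standardized mean difference (Hedges' g) with its sampling variance."""
    # Pooled standard deviation across the two arms
    sp = math.sqrt(((n_tx - 1) * sd_tx**2 + (n_ctl - 1) * sd_ctl**2) / (n_tx + n_ctl - 2))
    d = (mean_tx - mean_ctl) / sp
    j = 1 - 3 / (4 * (n_tx + n_ctl) - 9)      # small-sample correction
    g = j * d
    var_g = j**2 * ((n_tx + n_ctl) / (n_tx * n_ctl) + d**2 / (2 * (n_tx + n_ctl)))
    return g, var_g

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooled estimate and its standard error."""
    w = [1 / v for v in variances]
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    q = sum(wi * (ei - fixed)**2 for wi, ei in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * ei for wi, ei in zip(w_star, effects)) / sum(w_star)
    return pooled, math.sqrt(1 / sum(w_star))

# Three hypothetical ketamine-vs-control studies (change scores on a symptom scale)
studies = [hedges_g(-12.0, -6.0, 8.0, 8.5, 30, 30),
           hedges_g(-10.0, -5.5, 7.5, 7.0, 25, 27),
           hedges_g(-11.0, -7.0, 9.0, 8.0, 40, 38)]
pooled, se = random_effects_pool([g for g, _ in studies], [v for _, v in studies])
print(f"pooled g = {pooled:.2f} (SE {se:.2f})")
```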
Results
In a large-scale analysis of 1,182 participants across 51 primary studies, repeated-dose administration of intravenous ketamine demonstrated statistically significant effects (p<0.05) relative to placebo and other experimental control conditions in patients with TRD, as measured by standardized clinician-administered and self-report depression symptom severity scales.
Conclusions
This study provides large-scale, quantitative support for the effectiveness of intravenous, repeated-dose ketamine as a therapy for TRD and a report of the relative effectiveness of several treatment parameters across a large and rapidly growing literature. Future investigations should use similar analytic tools to examine evidence-stratified conditions and the comparative effectiveness of other routes of administration and treatment schedules as well as the moderating influence of other clinical and demographic variables on the effectiveness of ketamine on TRD and suicidal ideation and behavior.
There has been rapidly growing interest in understanding the pharmaceutical and clinical properties of psychedelic and dissociative drugs, with a particular focus on ketamine. This compound, long known for its anesthetic and dissociative properties, has garnered attention due to its potential to rapidly alleviate symptoms of depression, especially in individuals with treatment-resistant depression (TRD) or acute suicidal ideation or behavior. However, while ketamine’s psychopharmacological effects are increasingly well-documented, the specific patterns of its neural impact remain a subject of exploration, and basic questions persist about its effects on functional activation in both clinical and healthy populations.
Objectives
This meta-analysis seeks to contribute to the evolving landscape of neuroscience research on dissociative drugs such as ketamine by comprehensively examining the effects of acute ketamine administration on neural activation, as measured by functional magnetic resonance imaging (fMRI), in healthy participants.
Methods
We conducted a meta-analysis of existing fMRI activation studies of ketamine using multilevel kernel density analysis (MKDA). Following a comprehensive PubMed search, we quantitatively synthesized all published primary fMRI whole-brain activation studies of the effects of ketamine in healthy subjects with no overlapping samples (N=18). This approach also incorporated ensemble thresholding (α=0.05-0.0001) to minimize cluster-size detection bias and Monte Carlo simulations to correct for multiple comparisons.
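As a rough illustration of the MKDA logic described above (not the published MKDA implementation), the sketch below builds a binary indicator map around each study's reported activation peaks, averages the maps into a density map, and derives a family-wise error threshold from Monte Carlo randomization of peak locations. The grid size, kernel radius, and peak coordinates are placeholders, and real MKDA additionally weights studies by sample size.

```python
import numpy as np

rng = np.random.default_rng(0)

GRID = (40, 48, 40)   # coarse voxel grid standing in for standard brain space
RADIUS = 2            # spherical kernel radius in voxels (~10 mm in real MKDA)

def indicator_map(peaks, shape=GRID, radius=RADIUS):
    """Binary map marking voxels within `radius` of any peak from one study contrast."""
    zz, yy, xx = np.indices(shape)
    out = np.zeros(shape, dtype=bool)
    for p in peaks:
        out |= (zz - p[0])**2 + (yy - p[1])**2 + (xx - p[2])**2 <= radius**2
    return out

def density_map(all_study_peaks):
    """Proportion of study contrasts reporting a peak near each voxel."""
    maps = np.stack([indicator_map(p) for p in all_study_peaks]).astype(float)
    return maps.mean(axis=0)

def monte_carlo_threshold(all_study_peaks, n_iter=200, alpha=0.05):
    """FWE threshold: maximum density expected when each study's peaks fall at random."""
    n_peaks = [len(p) for p in all_study_peaks]
    maxima = np.empty(n_iter)
    for i in range(n_iter):
        null_peaks = [rng.integers(0, GRID, size=(k, 3)) for k in n_peaks]
        maxima[i] = density_map(null_peaks).max()
    return np.quantile(maxima, 1 - alpha)

# Hypothetical peak coordinates (voxel indices) from three studies
studies = [np.array([[20, 24, 20], [10, 30, 15]]),
           np.array([[21, 23, 20]]),
           np.array([[19, 25, 21], [30, 10, 30]])]
dens = density_map(studies)
thr = monte_carlo_threshold(studies)
print("voxels exceeding the FWE threshold:", int((dens > thr).sum()))
```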
Results
Our meta-analysis revealed statistically significant (p<0.05-0.0001; FWE-corrected) alterations in neural activation in multiple cortical and subcortical regions following the administration of ketamine to healthy participants (N=306).
Conclusions
These results offer valuable insights into the functional neuroanatomical effects caused by acute ketamine administration. These findings may also inform development of therapeutic applications of ketamine for various psychiatric and neurological conditions. Future studies should investigate the neural effects of ketamine administration, including both short-term and long-term effects, in clinical populations and their relation to clinical and functional improvements.
Bipolar I disorder (BD-I) is a chronic and recurrent mood disorder characterized by alternating episodes of depression and mania; it is also associated with substantial morbidity and mortality and with clinically significant functional impairments. While previous studies have used functional magnetic resonance imaging (fMRI) to examine neural abnormalities associated with BD-I, they have yielded mixed findings, perhaps due to differences in sampling and experimental design, including highly variable mood states at the time of scan.
Objectives
The purpose of this study is to advance our understanding of the neural basis of BD-I and mania, as measured by fMRI activation studies, and to inform the development of more effective brain-based diagnostic systems and clinical treatments.
Methods
We conducted a large-scale meta-analysis of whole-brain fMRI activation studies that compared participants with BD-I, assessed during a manic episode, to age-matched healthy controls. Following PRISMA guidelines, we conducted a comprehensive PubMed literature search using two independent coding teams to evaluate primary studies according to pre-established inclusion criteria. We then used multilevel kernel density analysis (MKDA), a well-established, voxel-wise, whole-brain, meta-analytic approach, to quantitatively synthesize all qualifying primary fMRI activation studies of mania. We used ensemble thresholding (p<0.05-0.0001) to minimize cluster size detection bias, and 10,000 Monte Carlo simulations to correct for multiple comparisons.
Results
We found that participants with BD-I (N=2,042), during an active episode of mania and relative to age-matched healthy controls (N=1,764), exhibit a pattern of significantly (p<0.05-0.0001; FWE-corrected) different activation in multiple brain regions of the cerebral cortex and basal ganglia across a variety of experimental tasks.
Conclusions
This study supports the formulation of a robust neural basis for BD-I during manic episodes and advances our understanding of the pattern of abnormal activation in this disorder. These results may inform the development of novel brain-based clinical tools for bipolar disorder such as diagnostic biomarkers, non-invasive brain stimulation, and treatment-matching protocols. Future studies should compare the neural signatures of BD-I to other related disorders to facilitate the development of protocols for differential diagnosis and improve treatment outcomes in patients with BD-I.
Attention-deficit/hyperactivity disorder (ADHD) is a highly prevalent psychiatric condition that frequently originates in early development and is associated with a variety of functional impairments. Despite a large functional neuroimaging literature on ADHD, our understanding of the neural basis of this disorder remains limited, and existing primary studies on the topic include somewhat divergent results.
Objectives
The present meta-analysis aims to advance our understanding of the neural basis of ADHD by identifying the most statistically robust patterns of abnormal neural activation throughout the whole-brain in individuals diagnosed with ADHD compared to age-matched healthy controls.
Methods
We conducted a meta-analysis of task-based functional magnetic resonance imaging (fMRI) activation studies of ADHD. Following PRISMA guidelines, this included a comprehensive PubMed search, predetermined inclusion criteria, and two independent coding teams who evaluated studies and included all task-based, whole-brain, fMRI activation studies that compared participants diagnosed with ADHD to age-matched healthy controls. We then performed multilevel kernel density analysis (MKDA), a well-established, whole-brain, voxel-wise approach that quantitatively combines existing primary fMRI studies, with ensemble thresholding (p<0.05-0.0001) and multiple-comparisons correction.
Results
Participants diagnosed with ADHD (N=1,550), relative to age-matched healthy controls (N=1,340), exhibited statistically significant (p<0.05-0.0001; FWE-corrected) patterns of abnormal activation in multiple brain regions of the cerebral cortex and basal ganglia across a variety of cognitive control tasks.
Conclusions
This study advances our understanding of the neural basis of ADHD and may aid in the development of new brain-based clinical interventions as well as diagnostic tools and treatment matching protocols for patients with ADHD. Future studies should also investigate the similarities and differences in neural signatures between ADHD and other highly comorbid psychiatric disorders.
Medical researchers are increasingly prioritizing the inclusion of underserved communities in clinical studies. However, mere inclusion is not enough. People from underserved communities frequently experience chronic stress that may lead to accelerated biological aging and early morbidity and mortality. It is our hope and intent that the medical community come together to engineer improved health outcomes for vulnerable populations. Here, we introduce Health Equity Engineering (HEE), a comprehensive scientific framework to guide research on the development of tools to identify individuals at risk of poor health outcomes due to chronic stress, the integration of these tools within existing healthcare system infrastructures, and a robust assessment of their effectiveness and sustainability. HEE is anchored in the premise that strategic intervention at the individual level, tailored to the needs of the most at-risk people, can pave the way for achieving equitable health standards at a broader population level. HEE provides a scientific framework guiding health equity research to equip the medical community with a robust set of tools to enhance health equity for current and future generations.
Objective. The efficacy of individualized, community-based physical activity as an adjunctive smoking cessation treatment to enhance long-term smoking cessation rates was evaluated in the Lifestyle Enhancement Program (LEAP). Methods. The study was a two-arm, parallel-group, randomized controlled trial. All participants (n = 392) received cessation counseling and a nicotine patch and were randomized to physical activity (n = 199; YMCA membership and personalized exercise programming from a health coach) or an equal contact frequency wellness curriculum (n = 193). Physical activity treatment was individualized and flexible (with each participant selecting types of activities and intensity levels and being encouraged to exercise at the YMCA and at home, as well as to use “lifestyle” activity). The primary outcome (biochemically verified prolonged abstinence at 7 weeks (end of treatment) and at 6 and 12 months postcessation) and secondary outcomes (7-day point prevalent tobacco abstinence (PPA), total minutes per week of leisure-time physical activity and strength training) were assessed at baseline, 7 weeks, 6 months, and 12 months. Results. Prolonged abstinence in the physical activity and wellness groups was 19.6% and 25.4%, respectively, at 7 weeks, 15.1% and 16.6% at 6 months, and 14.1% and 17.1% at 12 months (all between-group P values >0.18). Similarly, PPA rates did not differ significantly between groups at any follow-up. Change from baseline leisure-time activity plus strength training increased significantly in the physical activity group at 7 weeks (P = 0.04). Across treatment groups, an increase in the number of minutes per week in strength training from baseline to 7 weeks predicted prolonged abstinence at 12 months (P ≤ 0.001). Further analyses revealed that social support, fewer years smoked, and less temptation to smoke were associated with prolonged abstinence over 12 months in both groups. Conclusions. Community-based physical activity programming, delivered as adjunctive treatment with behavioral/pharmacological cessation treatment, did not improve long-term quit rates compared to adjunctive wellness counseling plus behavioral/pharmacological cessation treatment. This trial is registered with https://beta.clinicaltrials.gov/study/NCT00403312, registration no. NCT00403312.
We aimed to understand which non-household activities increased infection odds and contributed most to SARS-CoV-2 infections following the lifting of public health restrictions in England and Wales.
Procedures
We undertook multivariable logistic regressions assessing the contribution to infections of activities reported by adult Virus Watch Community Cohort Study participants. We calculated adjusted weighted population attributable fractions (aPAF) estimating which activity contributed most to infections.
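For readers unfamiliar with attributable fractions, the core calculation can be sketched as follows. This is a simplified, unweighted illustration using Miettinen's case-based formula with the adjusted odds ratio standing in for a relative risk; it is not the weighted procedure used in the study, and the exposure prevalence among cases shown below is hypothetical.

```python
def attributable_fraction(aor, prop_cases_exposed):
    """Population attributable fraction via Miettinen's case-based formula,
    treating the adjusted odds ratio as an approximation of the relative risk."""
    return prop_cases_exposed * (aor - 1.0) / aor

# Hypothetical inputs: aOR for leaving home for work and the share of infected
# participants reporting that activity (the prevalence value is illustrative only).
aor_work = 1.35
prop_exposed_cases = 0.66
print(f"aPAF ~ {attributable_fraction(aor_work, prop_exposed_cases):.0%}")
```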
Findings
Among 11 413 participants (493 infections), infection was associated with: leaving home for work (aOR 1.35 (1.11–1.64), aPAF 17%), public transport (aOR 1.27 (1.04–1.57), aPAF 12%), shopping once (aOR 1.83 (1.36–2.45)) vs. more than three times a week, indoor leisure (aOR 1.24 (1.02–1.51), aPAF 10%) and indoor hospitality (aOR 1.21 (0.98–1.48), aPAF 7%). We found no association for outdoor hospitality (1.14 (0.94–1.39), aPAF 5%) or outdoor leisure (1.14 (0.82–1.59), aPAF 1%).
Conclusion
Essential activities (work and public transport) carried the greatest risk and were the dominant contributors to infections. Non-essential indoor activities (hospitality and leisure) increased risk but contributed less. Outdoor activities carried no statistically significant risk and contributed to fewer infections. As countries aim to ‘live with COVID’, mitigating transmission in essential and indoor venues becomes increasingly relevant.
Serious illness conversations (SICs) can improve the experience and well-being of patients with advanced cancer. A structured Serious Illness Conversation Guide (SICG) has been shown to improve oncology patient outcomes but was developed and tested in a predominantly White population. To help address disparities in advanced cancer care, we aimed to assess the acceptability of the SICG among African Americans with advanced cancer and their clinicians.
Methods
A two-phase study conducted in Charleston, SC, included focus groups to gather perspectives on the SICG in Black Americans and a single-arm pilot study of a revised SICG with surveys and qualitative exit interviews to evaluate patient and clinician perspectives. We used descriptive analysis of survey results and thematic analysis of qualitative data.
Results
Community-based and patient focus group participants (N = 20) reported that a simulated conversation using an adapted SICG built connection, promoted control, and fostered consideration of religious faith and family. Black patients with advanced cancer (N = 23) reported that SICG-guided conversations were acceptable, helpful, and promoted conversations with loved ones. Oncologists found conversations feasible to implement and skill-building, and also identified opportunities for training and implementation that could support meeting the needs of their patients with low health literacy. An adapted SICG includes language to assess the strength and affirm the clinician–patient relationship.
Significance of results
An adapted structured communication tool to facilitate SIC, the SICG, appears acceptable to Black Americans with advanced cancer and seems feasible for use by oncology clinicians working with this population. Further testing in other marginalized populations may address disparities in advanced cancer care.
Policies that promote conversion of antibiotics from the intravenous to the oral route of administration are considered “low-hanging fruit” for hospital antimicrobial stewardship programs. We developed a simple metric based on digestive days of therapy divided by total days of therapy for targeted agents and a method for hospital comparisons. External comparisons may help identify opportunities for improving prospective implementation.
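A minimal sketch of the proposed metric, assuming days of therapy (DOT) have already been tallied by route for the targeted agents; the hospital names and counts below are hypothetical, and the real method includes additional adjustments for inter-hospital comparison.

```python
def digestive_dot_ratio(digestive_days_of_therapy, total_days_of_therapy):
    """Share of targeted-agent days of therapy given by the digestive (oral/enteral) route."""
    if total_days_of_therapy == 0:
        return float("nan")
    return digestive_days_of_therapy / total_days_of_therapy

# Hypothetical monthly tallies for two hospitals (illustrative numbers only)
hospitals = {"Hospital A": (320, 500), "Hospital B": (210, 480)}
for name, (digestive, total) in hospitals.items():
    print(f"{name}: {digestive_dot_ratio(digestive, total):.1%} of targeted DOT by digestive route")
```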
Pompe disease results from lysosomal acid α-glucosidase deficiency, which leads to cardiomyopathy in all infantile-onset patients and occasionally in late-onset patients. Cardiac assessment is important for its diagnosis and management. This article presents unpublished cardiac findings, concomitant medications, and cardiac efficacy and safety outcomes from the ADVANCE study; trajectories of patients with abnormal left ventricular mass z score at enrolment; and post hoc analyses of on-treatment left ventricular mass and systolic blood pressure z scores by disease phenotype, GAA genotype, and “fraction of life” (defined as the fraction of life on pre-study 160 L production-scale alglucosidase alfa). ADVANCE evaluated 52 weeks’ treatment with 4000 L production-scale alglucosidase alfa in United States patients aged ≥1 year with Pompe disease who had previously received 160 L production-scale alglucosidase alfa. M-mode echocardiography and 12-lead electrocardiography were performed at enrolment and Week 52. Sixty-seven patients had complete left ventricular mass z scores, which decreased by Week 52 (infantile-onset patients, change −0.8 ± 1.83; 95% confidence interval −1.3 to −0.2; all patients, change −0.5 ± 1.71; 95% confidence interval −1.0 to −0.1). Patients with “fraction of life” <0.79 had decreasing left ventricular mass z scores (enrolment: +0.1 ± 3.0; Week 52: −1.1 ± 2.0); those with “fraction of life” ≥0.79 remained stable (enrolment: −0.9 ± 1.5; Week 52: −0.9 ± 1.4). Systolic blood pressure z scores were stable from enrolment to Week 52, and no cohort developed systemic hypertension. Eight patients had Wolff–Parkinson–White syndrome. Cardiac hypertrophy and dysrhythmia in ADVANCE patients at or before enrolment were typical of Pompe disease. Four-thousand L alglucosidase alfa therapy maintained fractional shortening and left ventricular posterior and septal end-diastolic thicknesses, and improved left ventricular mass z score.
Social Media Statement: Post hoc analyses of the ADVANCE study cohort of 113 children support ongoing cardiac monitoring and concomitant management of children with Pompe disease on long-term alglucosidase alfa to functionally improve cardiomyopathy and/or dysrhythmia.
Objective:
Evaluation of a mandatory immunization program to increase and sustain high immunization coverage for healthcare personnel (HCP).
Design:
Descriptive study with before-and-after analysis.
Setting:
Tertiary-care academic medical center.
Participants:
Medical center HCP.
Methods:
A comprehensive mandatory immunization initiative was implemented in 2 phases, starting in July 2014. Key facets of the initiative included a formalized exemption review process, incorporation into institutional quality goals, data feedback, and accountability to support compliance.
Results:
Both immunization and overall compliance rates with targeted immunizations increased significantly in the years after the implementation period. The influenza immunization rate increased from 80% the year prior to the initiative to >97% for the 3 subsequent influenza seasons (P < .0001). Mumps, measles, and varicella vaccination compliance increased from 94% in January 2014 to >99% by January 2017; rubella vaccination compliance increased from 93% to 99.5%; and hepatitis B vaccination compliance increased from 95% to 99% (P < .0001 for all comparisons). An associated positive effect on TB testing compliance, which was not included in the mandatory program, was also noted; it increased from 76% to 92% over the same period (P < .0001).
Conclusions:
Thoughtful, step-wise implementation of a mandatory immunization program linked to professional accountability can be successful in increasing immunization rates as well as overall compliance with policy requirements to cover all recommended HCP immunizations.
The 2017 solar eclipse was associated with mass gatherings in many of the 14 states along the path of totality. The Kentucky Department for Public Health implemented an enhanced syndromic surveillance system to detect increases in emergency department (ED) visits and other health care needs near Hopkinsville, Kentucky, where the point of greatest eclipse occurred.
Methods:
EDs flagged visits of patients who participated in eclipse events from August 17–22. Data from 14 area emergency medical services and 26 first-aid stations were also monitored to detect health-related events occurring during the eclipse period.
Results:
Forty-four potential eclipse event-related visits were identified, primarily injuries, gastrointestinal illness, and heat-related illness. First-aid stations and emergency medical services commonly attended to patients with pain and heat-related illness.
Conclusions:
Kentucky’s experience during the eclipse demonstrated the value of patient visit flagging to describe the disease burden during a mass gathering and to investigate epidemiological links between cases. A close collaboration between public health authorities within and across jurisdictions, health information exchanges, hospitals, and other first-response care providers will optimize health surveillance activities before, during, and after mass gatherings.
Laser-based compact MeV X-ray sources are useful for a variety of applications such as radiography and active interrogation of nuclear materials. MeV X rays are typically generated by impinging an intense laser onto a ~mm-thick high-Z foil. Here, we have characterized such a MeV X-ray source from 120 TW (80 J, 650 fs) laser interaction with a 1 mm-thick tantalum foil. Our measurements show an X-ray temperature of 2.5 MeV, a flux of 3 × 10¹² photons/sr/shot, a beam divergence of ~0.1 sr, a conversion efficiency of ~1% (that is, ~1 J of MeV X rays out of 80 J of incident laser energy), and a source size of 80 μm. Our measurements also show that MeV X-ray yield and temperature are largely insensitive to nanosecond laser contrasts up to 10⁻⁵. Also, preliminary measurements of a similar MeV X-ray source using a double-foil scheme, where laser-driven hot electrons from a thin foil undergoing relativistic transparency impinge onto a second high-Z converter foil separated by 50–400 μm, show a MeV X-ray yield more than an order of magnitude lower compared with the single-foil results.
Spotted hyenas (Crocuta crocuta) are mammalian carnivores that occur throughout sub-Saharan Africa in a diverse array of habitats. Spotted hyenas primarily obtain food by hunting ungulates but also scavenge from carcasses using powerful jaws. They have extended juvenile periods and live in complex societies characterized by fission-fusion dynamics. Experimental assessments have been done using a variety of olfactory, visual, physical, and auditory stimuli. Studies suggest that spotted hyenas exhibit high levels of social intelligence, including recognition of third-party relationships. Innovation has been assessed in hyenas using a novel extractive foraging task, and numerosity using vocalization playback experiments. Major challenges during experimentation include controlling olfactory, visual, and auditory cues, building robust apparatuses, and controlling motivation and neophobia. In the wild, cognitive assessment of individuals is influenced by complex group interactions as well as by specific testing conditions. However, testing in both captive and wild environments offers exciting opportunities to understand the evolution, mechanisms, and adaptive functions of cognition in this species.
We aimed to explore multiple perspectives regarding barriers to and facilitators of advance care planning (ACP) among African Americans to identify similarities or differences that might have clinical implications.
Method
Qualitative study with health disparities experts (n = 5), community members (n = 9), and seriously ill African American patients and caregivers (n = 11). Using template analysis, interviews were coded to identify intrapersonal, interpersonal, and systems-level themes in accordance with a social ecological framework.
Result
Participants identified seven primary factors that influence ACP for African Americans: religion and spirituality; trust and mistrust; family relationships and experiences; patient-clinician relationships; prognostic communication; care preferences; and preparation and control. These influences echo those described in the existing literature; however, our data highlight consistent differences by group in the degree to which these factors positively or negatively affect ACP. Expert participants reinforced common themes from the literature, for example, that African Americans were not interested in prognostic information because of mistrust and religion. Seriously ill patients were more likely to express trust in their clinicians and to desire prognostic communication; they and community members expressed a desire to prepare for and control the end of life. Religious belief did not appear to negate these desires.
Significance of results
The literature on ACP in African Americans may not accurately reflect the experience of seriously ill African Americans. What are commonly understood as barriers to ACP may in fact not be. We propose reframing stereotypical barriers to ACP, such as religion and spirituality, or family, as cultural assets that should be engaged to enhance ACP. Although further research can inform best practices for engaging African American patients in ACP, findings suggest that respectful, rapport-building communication may facilitate ACP. Clinicians are encouraged to engage in early ACP using respectful and rapport-building communication practices, including open-ended questions.
A 3-yr study was initiated in 1982 to determine the effects of herbicides and crop rotations on large crabgrass [Digitaria sanguinalis (L.) Scop. # DIGSA] and broadleaf signalgrass [Brachiaria platyphylla (Griseb.) Nash # BRAPP] population dynamics. Regardless of the crop rotation sequence, broadleaf signalgrass immediately became the predominant weed where standard herbicide programs were used. Large crabgrass became the predominant species after two growing seasons if no herbicides were applied. Domination by large crabgrass appeared to be due to greater seed production. The domination by broadleaf signalgrass in plots treated with herbicides was attributed to its tolerance to the primary grass herbicide alachlor [2-chloro-N-(2,6-diethylphenyl)-N-(methoxymethyl)acetamide]. Broadleaf signalgrass emergence from soil treated with alachlor at 2.2 kg ai/ha was not statistically different from that in untreated soil, while large crabgrass and fall panicum [Panicum dichotomiflorum (L.) Michx. # PANDI] emergence was significantly reduced at the same rate.
Previous reports have suggested that bentazon [3-(1-methylethyl)-(1H)-2,1,3-benzothiadiazin-4(3H)-one 2,2-dioxide] tolerance among soybean genotypes is the result of differential translocation or metabolism. The basis for tolerance was reexamined using susceptible and tolerant genotypes. Tolerant genotypes (‘Hill’ and ‘Clark 63’) were found to tolerate 100- to 300-fold more bentazon than susceptible genotypes (‘L78–3263’, ‘Hurrelbrink’, and ‘PI 229.342’). Minor differences in absorption and translocation occurred among the genotypes but they did not correlate with tolerance. Tolerant genotypes metabolized 80 to 90% of absorbed bentazon within 24 h, while susceptible genotypes metabolized only 10 to 15%. Two major metabolites, the glycosyl conjugates of 6- and 8-hydroxybentazon, were formed in tolerant genotypes. Susceptible genotypes did not form the hydroxybentazon conjugates but instead produced relatively low levels of two unidentified metabolites. It is concluded that differential bentazon tolerance among soybean genotypes is linked to the ability to form both the 6- and 8-hydroxybentazon conjugates.
Six multiple-cropping systems composed of: a) turnip (Brassica campestris spp. rapifera), corn (Zea mays L.), and snapbean (Phaseolus vulgaris L.); b) turnip, peanut (Arachis hypogaea L.), and snapbean; c) turnip, corn, and turnip; d) turnip, peanut, and turnip; e) snapbean, soybean [Glycine max (L.) Merr.], and cabbage (Brassica oleracea L.); and f) turnip, cucumber (Cucumis sativus L.), cowpea [Vigna unguiculata (L.) Walp.], and turnip were subjected to nematicide and weed control programs of cultivation or herbicides. Herbicide programs were superior to cultivation in control of weeds. Weeds remaining in the row following cultivation competed severely with crops. Weed species remaining were altered depending on the method of control and crop. Yellow nutsedge (Cyperus esculentus L. ♯ CYPES) increased rapidly in all herbicide programs but not in cultivated plots. Pigweeds (Amaranthus spp.) were controlled by herbicides but increased in cultivated plots. Corn, peanut, soybean, and spring snapbean yields were higher in herbicide treatments than in cultivated treatments. Cucumber was the only crop that had increased yields for both main effects, herbicide and nematicide. Turnip was consistently injured in herbicide treatments, which was believed to be caused by residues from previous crops interacting with pathogens and possible allelopathic effects of decaying organic matter.
Broadleaf signalgrass [Brachiaria platyphylla (Griseb.) Nash # BRAPP] has recently become the dominant annual grass in certain fields of the North Carolina Coastal Plains. Previously, fall panicum (Panicum dichotomiflorum Michx. # PANDI) and large crabgrass [Digitaria sanguinalis (L.) Scop. # DIGSA] were the dominant annual grasses in the region. One of the possible reasons for the observed population shift could be production of inhibitors or stimulators by one species that affects the population dynamics of the other species. Studies were initiated to evaluate the effects of broadleaf signalgrass, large crabgrass, and fall panicum residue, applied as a mulch or soil incorporated, on five indicator species: the three weeds themselves, corn (Zea mays L.), and soybean [Glycine max (L.) Merr.]. At expected residue levels, the degree of inhibition or stimulation from fall panicum and broadleaf signalgrass was determined to be significant for some indicator species. When such responses were seen, the amount of residue necessary to produce these results was usually within the concentrations normally observed in field situations. Based on these results, it appears that the observed population shift is partially mediated by the production of inhibitors or stimulators through plant residue. Other factors such as differential herbicide selectivity and crop rotation are being investigated.