Novel management strategies for controlling smutgrass have the potential to influence sward dynamics in bahiagrass forage systems. This experiment evaluated population shifts in bahiagrass forage following implementation of integrated herbicide and fertilizer management plans for controlling smutgrass. Herbicide treatments included indaziflam applied PRE, hexazinone applied POST, a combination of PRE + POST herbicides, and a nonsprayed control. Fertilizer treatments included nitrogen, nitrogen + potassium, and an unfertilized control. The POST treatment reduced smutgrass coverage by the end of the first season regardless of PRE or fertilizer application, and coverage remained low for the 3-yr duration of the experiment (P < 0.01). All treatments, including nontreated controls, reduced smutgrass coverage during year 3 (P < 0.05), indicating that routine harvesting to remove biomass reduced smutgrass coverage. Bahiagrass cover increased at the end of year 1 with the POST treatment (P < 0.01), but only the POST + fertilizer treatment maintained greater bahiagrass coverage than the nontreated control by the end of year 3 (P < 0.05). Expenses associated with the POST + fertilizer treatment totaled US$348 ha−1 across the 3-yr experiment. Other smutgrass control options include complete removal of biomass (hay production) and pasture renovation, which can cost three times or more as much as the POST + fertilizer treatment. Complete removal of biomass may reduce smutgrass coverage by removing mature seedheads, but at a much greater expense of US$2,835 to US$5,825 ha−1, depending on herbicide and fertilizer inputs. Bahiagrass renovation costs US$826 ha−1 in establishment alone; when pasture production expenses are included for two seasons postrenovation, the total increases to US$1,120 ha−1 across three seasons. This study confirmed the importance of hexazinone and fertilizer as components of smutgrass control in bahiagrass forage.
Future research should focus on the biology of smutgrass and the role of a PRE treatment in a long-term, larger-scale forage system.
OBJECTIVES/GOALS: Obesity is associated with increased incidence of breast cancer (BC), yet is not included in many lifetime-risk calculators. Obesity may impact breast cancer screening sensitivity. Retrospective studies show that bariatric surgery is associated with a lower risk of BC, but the effects of surgical weight loss on breast tissue are poorly understood. METHODS/STUDY POPULATION: We proposed a mixed-methods before-and-after study design to investigate the effects of surgical weight loss on breast tissue via pre- and post-weight loss breast tissue biopsies and imaging. In addition, we aimed to better understand barriers to BC screening for patients with obesity by conducting qualitative interviews. With institutional review board approval, we have begun recruiting 14 cisgender women who plan to undergo Roux-en-Y gastric bypass or sleeve gastrectomy. Participants must be at least 40 years old, with no prior history of breast biopsy or breast cancer, and will undergo comprehensive breast cancer screening including mammography with quantitative density assessment, breast MRI, and breast core biopsies. RESULTS/ANTICIPATED RESULTS: We hypothesize that obesity and its associated metabolic changes lead to altered breast stroma, including increased inflammation and tissue stiffness, with subsequent risk of carcinogenesis. If true, we expect to find that obese women will have measurably increased inflammatory markers in their breast tissue, which are reduced after bariatric surgery. We expect that change in mammographic density may correlate with fibroglandular volume change on MRI; there are limited data on change in background parenchymal enhancement in the setting of obesity and weight change, and quantifying this will provide preliminary data for future work. Last, we expect that undergoing BC screening will be easier for patients after weight loss, given the physical constraints of imaging equipment and potential bias in the screening process.
DISCUSSION/SIGNIFICANCE: Screening for BC is paramount to improving outcomes, yet people with obesity are screened less often and have worse outcomes. Studying the effects of weight loss on the breast may improve interpretation of breast imaging in the setting of obesity and identify markers of risk. Understanding barriers to screening may help us develop strategies to improve screening.
The current coronavirus disease (COVID-19) pandemic has placed unprecedented strain on underfunded public health resources in the Southeastern United States. The Memphis, TN, metropolitan region has lacked infrastructure for health data exchange.
This manuscript describes a multidisciplinary initiative to create a community-focused COVID-19 data registry, the Memphis Pandemic Health Informatics System (MEMPHI-SYS). MEMPHI-SYS leverages test result data updated directly from community-based testing sites, as well as a full complement of public health data sets and knowledge-based informatics. It has been guided by relationships with community stakeholders and is managed alongside the largest publicly funded community-based COVID-19 testing response in the Mid-South. MEMPHI-SYS has supported interactive Web-based analytic resources and informs federally funded COVID-19 outreach directed toward neighborhoods most in need of pandemic support.
MEMPHI-SYS provides an instructive case study of how to collaboratively establish the technical scaffolding and human relationships necessary for data-driven, health equity-focused pandemic surveillance and policy interventions.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Weather conditions had no consistent impact on weed seed shatter; however, greater individual plant biomass was associated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that escaped early-season management practices and retained seed through harvest. However, smaller plants within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
Recent guidelines and recommendations from government prevention advisory groups endorsing shared clinical decision-making reflect an emerging trend among public health bodies.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
We aimed to investigate the heterogeneity of seasonal suicide patterns among multiple geographically, demographically and socioeconomically diverse populations.
Methods
Weekly time-series data of suicide counts for 354 communities in 12 countries during 1986–2016 were analysed. A two-stage analysis was performed. In the first stage, a generalised linear model, including cyclic splines, was used to estimate seasonal patterns of suicide for each community. In the second stage, the community-specific seasonal patterns were combined for each country using meta-regression. In addition, the community-specific seasonal patterns were regressed onto community-level socioeconomic, demographic and environmental indicators using meta-regression.
Results
We observed seasonal patterns in suicide, with the counts peaking in spring and declining to a trough in winter in most of the countries. However, the shape of seasonal patterns varied among countries from bimodal to unimodal seasonality. The amplitude of seasonal patterns (i.e. the peak/trough relative risk) also varied from 1.47 (95% confidence interval [CI]: 1.33–1.62) to 1.05 (95% CI: 1.01–1.1) among the 12 countries. The subgroup difference in the seasonal pattern also varied across countries. In some countries, larger amplitude was shown for females and for the elderly population (≥65 years of age) than for males and for younger people, respectively. The subperiod difference also varied: some countries showed increasing seasonality, while others showed a decrease or little change. Finally, the amplitude was larger for communities with colder climates, higher proportions of elderly people and lower unemployment rates (p-values < 0.05).
Conclusions
Despite the common features of a spring peak and a winter trough, seasonal suicide patterns were largely heterogeneous in shape, amplitude, subgroup differences and temporal changes among different populations, as influenced by climate, demographic and socioeconomic conditions. Our findings may help elucidate the underlying mechanisms of seasonal suicide patterns and aid in improving the design of population-specific suicide prevention programmes based on these patterns.
The present study examined the effect of ingredient bundles (i.e. measured ingredients with recipes) and recipe tastings as strategies to increase the selection of healthy, target foods (kale, brown rice and whole-wheat pasta).
Design
Each of the three conditions was tested once per week for three weeks. The conditions were: Treatment 1 (T1), recipe tastings only; Treatment 2 (T2), ingredient bundle plus recipe tastings; and Control, no intervention.
Setting
A food pantry in Bridgeport, CT, USA.
Participants
Food pantry clients.
Results
Controlling for family size and intervention week, the likelihood of clients in T2 (n 160) selecting at least one target item compared with the Control group (n 160) was 3·20 times higher for kale, 4·76 times higher for brown rice and 7·25 times higher for whole-wheat pasta. Compared with T1 (n 128), T2 clients were 2·67 times more likely to select kale, 7·67 times more likely to select brown rice and 11·43 times more likely to select whole-wheat pasta. No differences between T1 and the Control group were found.
Conclusions
Findings suggest that innovative, nudging strategies such as ingredient bundles may increase appeal of foods and encourage pantry clients to select healthier options.
A segment of the debate surrounding the commercialization and use of glyphosate-resistant (GR) crops focuses on the theory that the implementation of these traits is an extension of the intensification of agriculture that will further erode the biodiversity of agricultural landscapes. A large field-scale study was initiated in 2006 in the United States on 156 different field sites with a minimum 3-yr history of GR-corn, -cotton or -soybean in the cropping system. The impact of cropping system, crop rotation, frequency of using the GR crop trait, and several categorical variables on seedbank weed population density and diversity was analyzed. The parameters of total weed population density of all species in the seedbank, species richness, Shannon's H′ and evenness were not affected by any management treatment. The similarity between the seedbank and aboveground weed community was related more strongly to location than to management; the previous year's crop and the cropping system were also important, whereas GR trait rotation was not. The composition of the weed flora was related more strongly to location (geography) than to any other parameter. The diversity of weed flora in agricultural sites with a history of GR crop production can be influenced by several factors relating to the specific method in which the GR trait is integrated (cropping system, crop rotation, GR trait rotation), the specific weed species, and the geographical location. Fields in continuous GR crops showed greater species diversity (species richness) only for certain life forms, i.e., biennials, winter annuals, and prostrate weeds, compared with fields under other cropping systems. Overall diversity was related to geography, not cropping system. These results justify further research to clarify the complexities of crops grown with herbicide-resistance traits and to more completely characterize their culture and local adaptation to the weed seedbank.
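The diversity measures named above have standard definitions: species richness S is a count of species, Shannon's H′ = −Σ pᵢ ln pᵢ over species proportions pᵢ, and Pielou's evenness J = H′/ln S. A brief sketch, using invented seedbank counts purely for illustration:

```python
import math

# Hypothetical seedbank counts for four weed species at one site
counts = [120, 45, 30, 5]
total = sum(counts)
p = [c / total for c in counts]  # species proportions

richness = len(counts)                           # S: number of species
shannon_h = -sum(pi * math.log(pi) for pi in p)  # H' = -sum p_i ln p_i
evenness = shannon_h / math.log(richness)        # Pielou's J = H'/ln S

print(richness, round(shannon_h, 3), round(evenness, 3))
# → 4 1.019 0.735
```

A perfectly even community (all pᵢ equal) gives J = 1; the dominance of the first species here pulls evenness well below that.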
Plutonium metal is a very unusual element, exhibiting six allotropes at ambient pressure between room temperature and its melting point, a complicated phase diagram, and a complex electronic structure. Many phases of plutonium metal are unstable with changes in temperature, pressure, chemical additions, or time. This strongly affects structure and properties, and becomes highly important, particularly when considering effects on structural integrity over long periods of time [1]. This paper presents a time-dependent neutron total scattering study of the local and average structure of naturally aging δ-phase 239Pu-Ga alloys, together with preliminary results on neutron tomography characterization.
The amygdala and subgenual anterior cingulate cortex (sACC) are key brain regions for the generation of negative affect. In this longitudinal fMRI study of adolescents we investigated how amygdala–sACC connectivity was correlated with negative affectivity (NA) both cross-sectionally and longitudinally, and examined its relationship to the onset of first-episode depression.
Method.
Fifty-six adolescents who were part of a larger longitudinal study of adolescent development were included. They had no history of mental illness at the time of their baseline scan (mean age 16.5 years) and had a follow-up scan 2 years later (mean age 18.8 years). We used resting-state functional-connectivity MRI to investigate whether cross-sectional and change measures of amygdala–sACC connectivity were (i) correlated with NA and its change over time, and (ii) related to the onset of first-episode depression.
Results.
The magnitude of amygdala connectivity with sACC showed significant positive correlation with NA at both time-points. Further analysis confirmed that change in amygdala–sACC connectivity between assessments was correlated with change in NA. Eight participants developed a first episode of depression between the baseline and follow-up assessments: they showed increased amygdala–sACC connectivity at follow-up.
Conclusions.
Amygdala–sACC connectivity is associated with NA in adolescence, with change in connectivity between these regions showing positive correlation with change in NA. Our observation that the onset of depression was associated with an increase in connectivity between the regions provides support for the neurobiological ‘scar’ hypothesis of depression.
Mass casualty triage is the process of prioritizing multiple victims when resources are not sufficient to treat everyone immediately. No national guideline for mass casualty triage exists in the United States. The lack of a national guideline has resulted in variability in triage processes, tags, and nomenclature. This variability has the potential to inject confusion and miscommunication into the disaster incident, particularly when multiple jurisdictions are involved. The Model Uniform Core Criteria for Mass Casualty Triage were developed to be a national guideline for mass casualty triage to ensure interoperability and standardization when responding to a mass casualty incident. The Core Criteria consist of 4 categories: general considerations, global sorting, lifesaving interventions, and individual assessment of triage category. The criteria within each of these categories were developed by a workgroup of experts representing national stakeholder organizations who used the best available science and, when necessary, consensus opinion. This article describes how the Model Uniform Core Criteria for Mass Casualty Triage were developed.
(Disaster Med Public Health Preparedness. 2011;5:129-137)
Mass casualty triage is a critical skill. Although many systems exist to guide providers in making triage decisions, there is little scientific evidence available to demonstrate that any of the available systems have been validated. Furthermore, in the United States there is little consistency from one jurisdiction to the next in the application of mass casualty triage methodology. There are no nationally agreed upon categories or color designations. This review reports on a consensus committee process used to evaluate and compare commonly used triage systems, and to develop a proposed national mass casualty triage guideline. The proposed guideline, entitled SALT (sort, assess, life-saving interventions, treatment and/or transport) triage, was developed based on the best available science and consensus opinion. It incorporates aspects from all of the existing triage systems to create a single overarching guide for unifying the mass casualty triage process across the United States. (Disaster Med Public Health Preparedness. 2008;2(Suppl 1):S25–S34)