Cults have captivated the public imagination, gained visibility in the media, and become a popular topic of discourse. While anecdotal and journalistic accounts offer compelling insights, systematic studies of the structure, psychological predispositions, and clinical and legal relevance of these groups are comparatively scarce. This disparity highlights a crucial need for rigorous scholarly inquiry that moves beyond media portrayals to uncover the foundational mechanisms that sustain and shape these enigmatic groups. Authored by experts in forensic psychiatry and psychology, this book consolidates the extant literature in reviewing the theoretical, sociocultural, clinical, and forensic issues surrounding cultic groups. The text applies evidence-based study to identify group subtypes and to explore mediators and moderators that may be relevant in clinical and legal contexts. The authors address issues as they relate to a variety of subpopulations, comorbid mental disorders, mind-altering substances, treatment, and the legal implications inherent to cults and persuasive leadership. This book may be especially pertinent to mental health professionals and those working in the criminal justice system.
Resolvent analysis provides a framework to predict coherent spatio-temporal structures of the largest linear energy amplification, through a singular value decomposition (SVD) of the resolvent operator, obtained by linearising the Navier–Stokes equations about a known turbulent mean velocity profile. Resolvent analysis utilises a Fourier decomposition in time, which has thus far limited its application to statistically stationary or time-periodic flows. This work develops a variant of resolvent analysis applicable to time-evolving flows, and proposes a variant that identifies spatio-temporally sparse structures, applicable to either stationary or time-varying mean velocity profiles. Spatio-temporal resolvent analysis is formulated through the incorporation of the temporal dimension to the numerical domain via a discrete time-differentiation operator. Sparsity (which manifests in localisation) is achieved through the addition of an $l_1$-norm penalisation term to the optimisation associated with the SVD. This modified optimisation problem can be formulated as a nonlinear eigenproblem and solved via an inverse power method. We first showcase the implementation of the sparse analysis on a statistically stationary turbulent channel flow, and demonstrate that the sparse variant can identify aspects of the physics not directly evident from standard resolvent analysis. This is followed by applying the sparse space–time formulation on systems that are time varying: a time-periodic turbulent Stokes boundary layer and then a turbulent channel flow with a sudden imposition of a lateral pressure gradient, with the original streamwise pressure gradient unchanged. We present results demonstrating how the sparsity-promoting variant can either change the quantitative structure of the leading space–time modes to increase their sparsity, or identify entirely different linear amplification mechanisms compared with non-sparse resolvent analysis.
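The idea of an $l_1$-penalised SVD solved by a power-style iteration can be illustrated with a small numerical sketch. This is not the authors' implementation; it folds an $l_1$ proximal step (elementwise soft-thresholding) into power iteration for the leading singular pair, with the threshold scaled to the current iterate so it never zeroes everything out. The matrix and parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, lam):
    """Elementwise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_leading_pair(A, lam=0.5, n_iter=200, seed=0):
    """Approximate the leading singular pair of A, promoting sparsity in the
    right singular vector via soft-thresholding inside power iteration."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[1])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        u = A @ v
        u /= np.linalg.norm(u)
        w = A.T @ u
        v = soft_threshold(w, lam * np.abs(w).max())  # relative threshold
        v /= np.linalg.norm(v)
    return u, v

# Toy operator: strong amplification confined to a small block, plus weak noise.
A = 0.01 * np.random.default_rng(1).standard_normal((50, 50))
A[10:14, 20:24] += 5.0
u, v = sparse_leading_pair(A)
print(np.count_nonzero(np.abs(v) > 1e-8))  # only a handful of active entries
```

The thresholding localises the mode onto the few columns where the amplification actually lives, which is the qualitative behaviour the sparse variant exploits.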
Weeds are one of the greatest challenges to snap bean (Phaseolus vulgaris L.) production. Anecdotal observation posits certain species frequently escape the weed management system by the time of crop harvest, hereafter called residual weeds. The objectives of this work were to (1) quantify the residual weed community in snap bean grown for processing across the major growing regions in the United States and (2) investigate linkages between the density of residual weeds and their contributions to weed canopy cover. In surveys of 358 fields across the Northwest (NW), Midwest (MW), and Northeast (NE), residual weeds were observed in 95% of the fields. While a total of 109 species or species-groups were identified, one to three species dominated the residual weed community of individual fields in most cases. It was not uncommon to have >10 weeds m⁻² with a weed canopy covering >5% of the field’s surface area. Some of the most abundant and problematic species or species-groups escaping control included amaranth species such as smooth pigweed (Amaranthus hybridus L.), Palmer amaranth (Amaranthus palmeri S. Watson), redroot pigweed (Amaranthus retroflexus L.), and waterhemp [Amaranthus tuberculatus (Moq.) Sauer]; common lambsquarters (Chenopodium album L.); large crabgrass [Digitaria sanguinalis (L.) Scop.]; and ivyleaf morningglory (Ipomoea hederacea Jacq.). Emerging threats include hophornbeam copperleaf (Acalypha ostryifolia Riddell) in the MW and sharppoint fluvellin [Kickxia elatine (L.) Dumort.] in the NW. Beyond crop losses due to weed interference, the weed canopy at harvest poses a risk of contaminating snap bean products with foreign material. Random forest modeling predicts the residual weed canopy is dominated by C. album, D. sanguinalis, carpetweed (Mollugo verticillata L.), I. hederacea, amaranth species, and A. ostryifolia. This is the first quantitative report on the weed community escaping control in U.S. snap bean production.
This work introduces a formulation of resolvent analysis that uses wavelet transforms rather than Fourier transforms in time. Under this formulation, resolvent analysis may extend to turbulent flows with non-stationary mean states. The optimal resolvent modes are augmented with a temporal dimension and are able to encode the time-transient trajectories that are most amplified by the linearised Navier–Stokes equations. We first show that the wavelet- and Fourier-based resolvent analyses give equivalent results for statistically stationary flow by applying them to turbulent channel flow. We then use wavelet-based resolvent analysis to study the transient growth mechanism in the near-wall region of a turbulent channel flow by windowing the resolvent operator in time and frequency. The computed principal resolvent response mode, i.e. the velocity field optimally amplified by the linearised dynamics of the flow, exhibits characteristics of the Orr mechanism, which supports the claim that this mechanism is key to linear transient energy growth. We also apply this method to non-stationary parallel shear flows such as an oscillating boundary layer, and three-dimensional channel flow in which a sudden spanwise pressure gradient perturbs a fully developed turbulent channel flow. In both cases, wavelet-based resolvent analysis yields modes that are sensitive to the changing mean profile of the flow. For the oscillating boundary layer, wavelet-based resolvent analysis produces oscillating principal forcing and response modes that peak at times and wall-normal locations associated with high turbulent activity. For the turbulent channel flow under a sudden spanwise pressure gradient, the resolvent modes gradually realign themselves with the mean flow as the latter deviates. Wavelet-based resolvent analysis thus captures the changes in the transient linear growth mechanisms caused by a time-varying turbulent mean profile.
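The motivation for swapping Fourier for wavelet transforms can be seen in a small, self-contained demonstration (illustrative only, not the paper's method): a temporally localised transient is captured by a few wavelet coefficients but spreads across many Fourier coefficients. The hand-rolled orthonormal Haar transform and the signal below are assumptions for the sketch.

```python
import numpy as np

def haar_transform(x):
    """Orthonormal discrete Haar wavelet transform (length a power of 2)."""
    x = x.astype(float)
    coeffs = []
    while len(x) > 1:
        avg = (x[0::2] + x[1::2]) / np.sqrt(2)  # smooth (scaling) part
        det = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail (wavelet) part
        coeffs.append(det)
        x = avg
    coeffs.append(x)
    return np.concatenate(coeffs[::-1])

n = 256
signal = np.zeros(n)
signal[100:108] = 1.0  # a short-lived transient event

fourier = np.fft.rfft(signal)
wavelet = haar_transform(signal)

def n_coeffs_for_energy(c, frac=0.99):
    """Number of largest-magnitude coefficients holding `frac` of the energy."""
    e = np.sort(np.abs(c) ** 2)[::-1]
    return int(np.searchsorted(np.cumsum(e), frac * e.sum()) + 1)

print(n_coeffs_for_energy(wavelet), "Haar coefficients vs",
      n_coeffs_for_energy(fourier), "Fourier coefficients")
```

Because each wavelet coefficient is localised in both time and scale, the transient excites only the few basis functions overlapping it, which is why wavelet-based modes can track non-stationary mean states.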
Cohort studies demonstrate that people who later develop schizophrenia, on average, present with mild cognitive deficits in childhood and endure a decline in adolescence and adulthood. Yet tremendous heterogeneity exists during the course of psychotic disorders, including the prodromal period. Individuals identified as being in this period (known as CHR-P) are at heightened risk for developing psychosis (~35%) and begin to exhibit cognitive deficits. Cognitive impairments in CHR-P (as a singular group) appear to be relatively stable or to ameliorate over time, although a sizeable proportion has been described as declining on measures related to processing speed or verbal learning. The purpose of this analysis is to use data-driven approaches to identify latent subgroups among CHR-P based on cognitive trajectories. This will yield a clearer understanding of the timing and presentation of both general and domain-specific deficits.
Participants and Methods:
Participants included 684 young people at CHR-P (ages 12–35) from the second cohort of the North American Prodrome Longitudinal Study. Performance on the MATRICS Consensus Cognitive Battery (MCCB) and the Wechsler Abbreviated Scale of Intelligence (WASI-I) was assessed at baseline, 12 months, and 24 months. Tested MCCB domains included verbal learning, speed of processing, working memory, and reasoning and problem-solving. Sex- and age-based norms were utilized. The Oral Reading subtest of the Wide Range Achievement Test (WRAT4) indexed premorbid IQ at baseline. Latent class mixture models were used to identify distinct trajectories of cognitive performance across two years. One- to five-class solutions were compared to determine the best solution, based on goodness-of-fit metrics, interpretability of latent trajectories, and proportion of subgroup membership (>5%).
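Latent class mixture models are typically fitted with specialised software (e.g. the R package lcmm). As a loose, numpy-only stand-in for the idea of grouping individuals by trajectory shape, one can estimate a per-person intercept and slope over the three visits and cluster on those features. All data below are simulated and hypothetical, not NAPLS data.

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.array([0.0, 12.0, 24.0])  # months: baseline, 12, 24

# Simulate two hypothetical trajectory classes: stable-average vs low-persistent.
n_a, n_b = 150, 50
scores_a = 50 + 1.0 * t / 24 + rng.normal(0, 3, size=(n_a, 3))
scores_b = 30 + 0.0 * t + rng.normal(0, 3, size=(n_b, 3))
scores = np.vstack([scores_a, scores_b])

# Per-person least-squares intercept and slope over the three visits.
design = np.column_stack([np.ones_like(t), t])
coef, *_ = np.linalg.lstsq(design, scores.T, rcond=None)
features = coef.T  # shape (n, 2): intercept, slope

# Two-means clustering on (intercept, slope) as a crude stand-in for
# latent-class assignment.
centers = features[[0, -1]].copy()
for _ in range(50):
    d = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    centers = np.array([features[labels == k].mean(axis=0) for k in (0, 1)])

print(np.bincount(labels))  # class sizes near the simulated 150/50 split
```

Real latent class mixture modelling additionally estimates class-specific trajectory parameters and posterior membership probabilities by maximum likelihood; this sketch only conveys the grouping-by-trajectory intuition.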
Results:
A one-class solution was found for WASI-I Full-Scale IQ, as people at CHR-P predominantly demonstrated an average IQ that increased gradually over time. For the individual domains, one-class solutions also best fit the trajectories for speed of processing, verbal learning, and working memory. Two distinct subgroups were identified on one of the executive functioning domains, reasoning and problem-solving (NAB Mazes): a subgroup with unimpaired performance and mild improvement over time (Class I, 74%) and a subgroup with persistent performance two standard deviations below average (Class II, 26%). Between these classes, no significant differences were found for biological sex, age, years of education, or likelihood of conversion to psychosis (OR = 1.68, 95% CI 0.86 to 3.14). Individuals assigned to Class II did, however, demonstrate a lower WASI-I IQ at baseline (96.3 vs. 106.3) and a lower premorbid IQ (100.8 vs. 106.2).
Conclusions:
Youth at CHR-P demonstrate relatively homogeneous trajectories across time in general cognition and most individual domains. In contrast, two distinct subgroups emerged for higher-order cognitive skills involving planning and foresight, and notably these subgroups exist independent of conversion outcome. Overall, these findings replicate and extend results from a recently published latent class analysis that examined 12-month trajectories among CHR-P using a different cognitive battery (Allott et al., 2022). The findings inform which individuals at CHR-P may be most likely to benefit from cognitive remediation and can shed light on the substrates of deficits by establishing meaningful subtypes.
Clinical implementation of risk calculator models in the clinical high-risk for psychosis (CHR-P) population has been hindered by heterogeneous risk distributions across study cohorts, which could be attributed to pre-ascertainment illness progression. To examine this, we tested whether the duration of attenuated psychotic symptom (APS) worsening prior to baseline moderated performance of the North American Prodrome Longitudinal Study 2 (NAPLS2) risk calculator. We also examined whether rates of cortical thinning, another marker of illness progression, bolstered clinical prediction models.
Methods
Participants from both the NAPLS2 and NAPLS3 samples were classified as either ‘long’ or ‘short’ symptom duration based on time since APS increase prior to baseline. The NAPLS2 risk calculator model was applied to each of these groups. In a subset of NAPLS3 participants who completed follow-up magnetic resonance imaging scans, change in cortical thickness was combined with the individual risk score to predict conversion to psychosis.
Results
The risk calculator models achieved similar performance across the combined NAPLS2/NAPLS3 sample [area under the curve (AUC) = 0.69], the long duration group (AUC = 0.71), and the short duration group (AUC = 0.71). The shorter duration group was younger and had higher baseline APS than the longer duration group. The addition of cortical thinning improved the prediction of conversion significantly for the short duration group (AUC = 0.84), with a moderate improvement in prediction for the longer duration group (AUC = 0.78).
Conclusions
These results suggest that early illness progression differs among CHR-P patients, is detectable with both clinical and neuroimaging measures, and could play an essential role in the prediction of clinical outcomes.
This systematic literature review aimed to provide an overview of the characteristics and methods of studies applying the disability-adjusted life years (DALY) concept to infectious diseases within European Union (EU)/European Economic Area (EEA)/European Free Trade Association (EFTA) countries and the United Kingdom. Electronic databases and grey literature were searched for articles reporting the assessment of DALYs and their components. We considered studies in which researchers performed DALY calculations using primary epidemiological data input sources. We screened 3053 studies, of which 2948 were excluded; 105 studies met our inclusion criteria. Of these, 22 were multi-country studies and 83 were single-country studies, of which 46 were from the Netherlands. Food- and water-borne diseases were the most frequently studied infectious diseases. Between 2015 and 2022, the number of burden of infectious disease studies published was 1.6 times higher than between 2000 and 2014. Almost all studies (97%) estimated DALYs using the incidence- and pathogen-based approach and without social weighting functions; however, there was less methodological consensus with regard to the disability weights and life tables that were applied. The number of burden of infectious disease studies undertaken across Europe has increased over time. The development and use of guidelines will promote the conduct of such studies and facilitate comparability of their results.
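For readers unfamiliar with the DALY concept, the arithmetic behind an incidence-based estimate without social weighting is simple: DALY = YLL + YLD, where YLL counts years of life lost to premature mortality and YLD counts years lived with disability. A minimal sketch with hypothetical figures:

```python
def yll(deaths, life_expectancy_at_death):
    """Years of Life Lost: deaths x remaining life expectancy at age of death."""
    return deaths * life_expectancy_at_death

def yld(cases, disability_weight, duration_years):
    """Years Lived with Disability: incident cases x disability weight x duration."""
    return cases * disability_weight * duration_years

def daly(deaths, life_expectancy_at_death, cases, disability_weight, duration_years):
    """Incidence-based DALY without social weighting: DALY = YLL + YLD."""
    return (yll(deaths, life_expectancy_at_death)
            + yld(cases, disability_weight, duration_years))

# Hypothetical figures for a single pathogen and health state: 1,000 incident
# cases (disability weight 0.2, mean illness duration 0.05 years) and 10 deaths
# with 30 years of remaining life expectancy each.
total = daly(10, 30, 1000, 0.2, 0.05)
print(total)  # 300 YLL + 10 YLD = 310 DALYs
```

The methodological disagreements the review flags enter precisely through these inputs: the choice of disability weights and of the life table supplying remaining life expectancy.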
Response to lithium in patients with bipolar disorder is associated with clinical and transdiagnostic genetic factors. The predictive combination of these variables might help clinicians better predict which patients will respond to lithium treatment.
Aims
To use a combination of transdiagnostic genetic and clinical factors to predict lithium response in patients with bipolar disorder.
Method
This study utilised genetic and clinical data (n = 1034) collected as part of the International Consortium on Lithium Genetics (ConLi+Gen) project. Polygenic risk scores (PRS) were computed for schizophrenia and major depressive disorder, and then combined with clinical variables using a cross-validated machine-learning regression approach. Unimodal, multimodal and genetically stratified models were trained and validated using ridge, elastic net and random forest regression on 692 patients with bipolar disorder from ten study sites using leave-site-out cross-validation. All models were then tested on an independent test set of 342 patients. The best performing models were then tested in a classification framework.
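Leave-site-out cross-validation trains on all sites but one and evaluates on the held-out site, so performance estimates reflect generalisation across recruitment sites rather than within-site fit. A minimal numpy sketch with closed-form ridge regression and synthetic data (not the ConLi+Gen pipeline; all variables below are simulated stand-ins):

```python
import numpy as np

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: w = (X'X + alpha*I)^(-1) X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(p), X.T @ y)

def leave_site_out_r2(X, y, sites, alpha=1.0):
    """Train on all sites but one, predict the held-out site, pool the
    predictions, and return variance explained (R^2) over the full sample."""
    pred = np.empty_like(y)
    for s in np.unique(sites):
        held_out = sites == s
        w = ridge_fit(X[~held_out], y[~held_out], alpha)
        pred[held_out] = X[held_out] @ w
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Synthetic stand-in: 12 predictors (clinical features plus PRS columns)
# for 700 patients spread over 10 recruitment sites.
rng = np.random.default_rng(0)
n, p = 700, 12
X = rng.standard_normal((n, p))
w_true = rng.standard_normal(p)
y = X @ w_true + 3.0 * rng.standard_normal(n)  # noisy simulated outcome
sites = rng.integers(0, 10, size=n)

r2 = leave_site_out_r2(X, y, sites, alpha=10.0)
print(round(r2, 3))
```

Genomic stratification as described in the study would amount to partitioning patients by their PRS loadings first and fitting a model of this kind within each stratum.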
Results
The best performing linear model explained 5.1% (P = 0.0001) of variance in lithium response and was composed of clinical variables, PRS variables and interaction terms between them. The best performing non-linear model used only clinical variables and explained 8.1% (P = 0.0001) of variance in lithium response. A priori genomic stratification improved non-linear model performance to 13.7% (P = 0.0001) and improved the binary classification of lithium response. This model stratified patients based on their meta-polygenic loadings for major depressive disorder and schizophrenia and was then trained using clinical data.
Conclusions
Using PRS to first stratify patients genetically and then train machine-learning models with clinical predictors led to large improvements in lithium response prediction. When used with other PRS and biological markers in the future this approach may help inform which patients are most likely to respond to lithium treatment.
While the comorbidity of clinical high-risk for psychosis (CHR-P) status and social anxiety is well established, it remains unclear how social anxiety and positive symptoms covary over time in this population. The present study aimed to determine whether there is more than one covariant trajectory of social anxiety and positive symptoms in the North American Prodrome Longitudinal Study cohort (NAPLS 2) and, if so, to test whether the trajectory subgroups differ in genetic and environmental risk factors for psychotic disorders and in general functional outcome.
Methods
In total, 764 CHR individuals were evaluated at baseline for social anxiety and psychosis risk symptom severity and followed up every 6 months for 2 years. Application of group-based multi-trajectory modeling discerned three subgroups based on the covariant trajectories of social anxiety and positive symptoms over 2 years.
Results
One of the subgroups showed sustained social anxiety over time despite moderate recovery in positive symptoms, while the other two showed recovery of social anxiety below clinically significant thresholds, along with modest to moderate recovery in positive symptom severity. The trajectory group with sustained social anxiety had poorer long-term global functional outcomes than the other trajectory groups. In addition, compared with the other two trajectory groups, membership in the group with sustained social anxiety was predicted by higher levels of polygenic risk for schizophrenia and environmental stress exposures.
Conclusions
Together, these analyses indicate differential relevance of sustained v. remitting social anxiety symptoms in the CHR-P population, which in turn may carry implications for differential intervention strategies.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Functional changes in the brain during ageing can alter learning and memory, gait and balance – in some cases leading to early cognitive decline, disability or injurious falls among older adults. Dietary interventions with strawberry (SB) have been associated with improvements in neuronal, psychomotor and cognitive functions in rodent models of ageing. We hypothesised that dietary supplementation with SB would improve mobility and cognition among older adults. In this study, twenty-two men and fifteen women, between the ages of 60 and 75 years, were recruited into a randomised, double-blind, placebo-controlled trial in which they consumed either freeze-dried SB (24 g/d, equivalent to two cups of fresh SB) or an SB placebo for 90 d. Participants completed a battery of balance, gait and cognitive tests at baseline and again at 45 and 90 d of intervention. Significant supplement group by study visit interactions were observed on tests of learning and memory. Participants in the SB group showed significantly shorter latencies in a virtual spatial navigation task (P = 0·020, ηp² = 0·106) and increased word recognition in the California Verbal Learning Test (P = 0·014, ηp² = 0·159) across study visits relative to controls. However, no improvement in gait or balance was observed. These findings show that the addition of SB to the diets of healthy older adults can improve some aspects of cognition, but not gait or balance, although more studies with a larger sample size and longer follow-up are needed to confirm this finding.
In recent years, a variety of efforts have been made in political science to enable, encourage, or require scholars to be more open and explicit about the bases of their empirical claims and, in turn, make those claims more readily evaluable by others. While qualitative scholars have long taken an interest in making their research open, reflexive, and systematic, the recent push for overarching transparency norms and requirements has provoked serious concern within qualitative research communities and raised fundamental questions about the meaning, value, costs, and intellectual relevance of transparency for qualitative inquiry. In this Perspectives Reflection, we crystallize the central findings of a three-year deliberative process—the Qualitative Transparency Deliberations (QTD)—involving hundreds of political scientists in a broad discussion of these issues. Following an overview of the process and the key insights that emerged, we present summaries of the QTD Working Groups’ final reports. Drawing on a series of public, online conversations that unfolded at www.qualtd.net, the reports unpack transparency’s promise, practicalities, risks, and limitations in relation to different qualitative methodologies, forms of evidence, and research contexts. Taken as a whole, these reports—the full versions of which can be found in the Supplementary Materials—offer practical guidance to scholars designing and implementing qualitative research, and to editors, reviewers, and funders seeking to develop criteria of evaluation that are appropriate—as understood by relevant research communities—to the forms of inquiry being assessed. We dedicate this Reflection to the memory of our coauthor and QTD working group leader Kendra Koivu.
The study used naturalistic data on the production of nominal prefixes in the Otopamean language Northern Pame (autonym: Xi'iuy) to test Whole Word (constructivist) and Minimal Word (prosodic) theories for the acquisition of inflection. Whole Word theories assume that children store words in their entirety; Minimal Word theories assume that children produce words as binary feet. Northern Pame uses obligatory portmanteaux prefixes to inflect nouns for class, number, animacy and possessor. Singular nouns constitute 90 percent of the nouns that the children hear and yet all five two-year-old children frequently omitted the singular noun prefixes, but produced the low frequency noun suffixes for dual and animate plural. Neither the children's production of the noun-class prefixes nor their prefix overextensions correlated with the adult type and token frequencies of production. Northern Pame children constructed Minimal Words that contain binary feet and disfavor the production of initial, extrametrical prefixes.
Antimicrobial resistance is an urgent public health threat. Identifying trends in antimicrobial susceptibility can inform public health policy at the state and local levels.
Objective:
To determine the ability of statewide antibiogram aggregation for public health surveillance to identify changes in antimicrobial resistance trends.
Design:
Facility-level trend analysis.
Methods:
Crude and adjusted trend analyses of the susceptibility of Escherichia coli and Klebsiella pneumoniae to particular antibiotics, as reported by aggregated antibiograms, were examined from 2008 through 2018. Multivariable regression analyses via generalized linear mixed models were used to examine associations between hospital characteristics and trends of E. coli and K. pneumoniae susceptibility to ciprofloxacin and ceftriaxone.
Results:
E. coli and K. pneumoniae showed inverse trends in drug susceptibility over time. K. pneumoniae susceptibility to fluoroquinolones increased by 5% between 2008 and 2018 (P < .05). In contrast, E. coli susceptibility declined during the same period to ceftriaxone (6%), gentamicin (4%), and fluoroquinolones (4%) (P < .05). When compared to Boston hospitals, E. coli isolates from hospitals in other regions had a >4% higher proportion of susceptibility to ciprofloxacin and a >3% higher proportion of susceptibility to ceftriaxone (P < .05). Isolates of K. pneumoniae had higher susceptibility to ciprofloxacin (>3%) and ceftriaxone (>1.5%) in all regions when compared to Boston hospitals (P < .05).
Conclusions:
Cumulative antibiograms can be used to monitor antimicrobial resistance, to discern regional and facility differences, and to detect changes in trends. Furthermore, because the number of years that hospitals contributed reports to the state-level aggregate had no significant influence on susceptibility trends, other states should not be discouraged by incomplete hospital compliance.
OBJECTIVES/GOALS: Between 2014 and 2019, the National Institutes of Health (NIH), through the National Center for Advancing Translational Sciences (NCATS), awarded about $2.7 billion to U.S. academic medical centers to build a national network of clinical and translational science program hubs that serve NCATS's key goals and initiatives. Today there are about 60 Clinical and Translational Science Award (CTSA) program hubs. Each CTSA program hub has a corresponding website highlighting its clinical and translational science programs and activities. These websites are a critical communication gateway for promoting NCATS goals and initiatives. The objective of this research is to evaluate the CTSA program hub websites for NCATS goal and initiative content alignment, navigability, and interactivity. METHODS/STUDY POPULATION: Each CTSA program hub website was systematically evaluated for information or tools that align with the five NCATS/CTSA goals and eight nationally identified CTSA program initiatives. Each goal and initiative was subsequently ranked by information diversity level (text, tool, interactivity) and navigation level (click distance from the home page). RESULTS/ANTICIPATED RESULTS: Four of the five NCATS goals are thoroughly and consistently represented across the CTSA Consortium, with workforce development, patient and community engagement, and quality and efficiency of research being the top three. Informatics is thoroughly and consistently represented, but not always clearly identified on the home page. The most underrepresented goal is integration of special and underserved populations, which was identified on only 60% of CTSA program hub websites. Among the eight CTSA program initiatives, the Trial Innovation Network is the most common focus on CTSA program hub websites; SMART IRB comes in a distant second. The remaining six initiatives are severely underrepresented. DISCUSSION/SIGNIFICANCE OF IMPACT: The identification of these gaps among the CTSA program hubs informs content management and website functionality across the consortium from three principal perspectives. First, it creates an understanding of CTSA program hub content alignment with the goals and initiatives of the funding source; such an understanding presents an opportunity for the funding source to promote a better-aligned consortium with improved collaboration pathways through program hub website content standards. Second, it creates an opportunity for program hubs to understand and respond to the messaging their websites present in relation to the funding source. Third, it identifies which program initiatives and goals CTSA institutions independently chose to highlight, which can open a dialogue toward better understanding the value of the program initiatives relative to the needs of CTSA program hubs. Ultimately, content alignment on CTSA websites should lead to an improved user experience.
Intensified cover-cropping practices are increasingly viewed as a herbicide-resistance management tool, but a clear distinction between reactive and proactive resistance management performance targets is needed. We evaluated two proactive performance targets for integrating cover-cropping tactics: (1) facilitation of reduced herbicide inputs and (2) reduced herbicide selection pressure. We conducted corn (Zea mays L.) and soybean [Glycine max (L.) Merr.] field experiments in Pennsylvania and Delaware using synthetic weed seedbanks of horseweed [Conyza canadensis (L.) Cronquist] and smooth pigweed (Amaranthus hybridus L.) to assess winter and summer annual population dynamics, respectively. The effect of alternative cover crops was evaluated across a range of herbicide inputs. Cover crop biomass production ranged from 2,000 to 8,500 kg ha⁻¹ in corn and 3,000 to 5,500 kg ha⁻¹ in soybean. Experimental results demonstrated that herbicide-based tactics were the primary drivers of total weed biomass production, with cover-cropping tactics providing an additive weed-suppression benefit. Substitution of cover crops for PRE or POST herbicide programs did not reduce total weed control levels or cash crop yields, but did result in lower net returns due to higher input costs. Cover-cropping tactics significantly reduced C. canadensis populations in three of four cover crop treatments and decreased the number of large rosettes (>7.6-cm diameter) at the time of preplant herbicide exposure. Substitution of cover crops for PRE herbicides resulted in increased selection pressure on POST herbicides, but reduced the number of large individuals (>10 cm) at POST applications. Collectively, our findings suggest that cover crops can reduce the intensity of selection pressure on POST herbicides, but the magnitude of the effect varies with weed life-history traits.
Additional work is needed to describe proactive resistance management concepts and performance targets for integrating cover crops so producers can apply these concepts in site-specific, within-field management practices.
Interesterified (IE) fats are widely used to replace partially hydrogenated fats as hard fats with the functional and sensory properties needed for spreads/margarines, baked goods, and confectionery, while avoiding the health hazards of trans fats. Detailed mechanistic work to determine the metabolic effects of interesterification of commonly consumed hard fats has not yet been done. Earlier studies using less commonly consumed fats have shown either a neutral or a lowering effect on postprandial lipaemia. We investigated postprandial lipaemia, lipoprotein remodelling, and triacylglycerol-rich lipoprotein (TRL) fraction apolipoprotein concentrations following a common IE blend of palm oil/kernel fractions versus its non-IE counterpart, alongside a reference monounsaturated (MUFA) oil. A three-armed, double-blind, randomized controlled trial (clinicaltrials.gov NCT03191513) in healthy adults (n = 20; 10 men, 10 women) aged 45–75 y assessed the effects of single meals (897 kcal, 50 g fat, 16 g protein, 88 g carbohydrate) on postprandial plasma triacylglycerol (TAG) concentrations, lipoprotein profiles, and TRL fraction apolipoprotein B48 and TAG concentrations. Test fats were IE 80:20 palm stearin/palm kernel fat, the equivalent non-IE fat, and a high-MUFA reference oil (rapeseed oil, RO). Blood was collected at baseline and hourly for 8 h. Linear mixed modelling was performed, adjusting for treatment order and baseline values (ver. 24.0; SPSS Inc., Chicago, IL, USA). Total 8 h incremental areas under the curve (iAUC) for plasma TAG concentrations were lower following IE and non-IE compared with RO (mean difference in iAUC: non-IE vs. RO -1.8 mmol/L.h (95% CI -3.3, -0.2); IE vs. RO -2.6 mmol/L.h (95% CI -5.3, 0.0)), but iAUCs for IE and non-IE were not significantly different. There were no differences between IE and non-IE for chylomicron fraction apoB48 concentrations or the TAG:apoB48 ratio.
No differences were observed between IE and non-IE for lipoprotein (VLDL, HDL, LDL) particle size or sub-class particle concentrations. However, LDL particle diameters were reduced at 5 and 6 h following IE vs. RO (P < 0.05). XXL- (including chylomicron remnants and VLDL particles), XL-, and L-VLDL particle concentrations (average diameters > 75, 64, and 53.6 nm, respectively) were higher following IE and non-IE vs. RO at 6 h (P < 0.05) and 8 h postprandially (P < 0.005–0.05). In conclusion, both IE and non-IE palmitic acid-rich fats generated a greater preponderance of pro-atherogenic large TRL remnant particles in the late postprandial phase relative to an oleic acid-rich oil. However, the process of interesterification did not modify the postprandial TAG response or lipoprotein metabolism.
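The incremental area under the curve (iAUC) used in the abstract above can be illustrated with a short sketch. This is not the study's actual analysis code (the authors report using SPSS); it is a minimal, hypothetical example of the standard trapezoidal iAUC calculation on baseline-subtracted concentrations, with all data values invented for illustration. Note that iAUC conventions vary; this sketch clips negative increments to zero, one common choice.

```python
def iauc(times_h, conc, baseline=None):
    """Incremental AUC: trapezoidal area of concentration above baseline.

    times_h  -- sampling times in hours (e.g. hourly from 0 to 8 h)
    conc     -- concentrations at those times (e.g. plasma TAG, mmol/L)
    baseline -- reference value; defaults to the first (fasting) sample
    Negative increments are clipped to zero (one common iAUC convention).
    Result is in concentration-units x hours, e.g. mmol/L.h.
    """
    if baseline is None:
        baseline = conc[0]
    incr = [max(c - baseline, 0.0) for c in conc]
    area = 0.0
    for i in range(1, len(times_h)):
        dt = times_h[i] - times_h[i - 1]
        area += dt * (incr[i] + incr[i - 1]) / 2.0  # trapezoid for this interval
    return area

# Illustrative (invented) postprandial TAG values sampled hourly over 8 h:
times = [0, 1, 2, 3, 4, 5, 6, 7, 8]
tag = [1.0, 1.4, 2.0, 2.4, 2.2, 1.8, 1.5, 1.2, 1.1]
print(round(iauc(times, tag), 2))  # -> 5.55 mmol/L.h for these made-up values
```

Each participant's 8 h iAUC is computed this way per test meal, and the treatment contrasts (e.g. non-IE vs. RO) are then differences between those per-meal iAUCs.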
Weeds can cause significant yield loss in watermelon production systems. Commercially acceptable weed control is difficult to achieve, even with heavy reliance on herbicides. A study was conducted to evaluate a spring-seeded cereal rye cover crop with different herbicide application timings for weed management between row middles in watermelon production systems. Common lambsquarters and pigweed species (namely, Palmer amaranth and smooth pigweed) densities and biomasses were often lower with cereal rye compared with no cereal rye, regardless of herbicide treatment. The presence of cereal rye did not negatively influence the number of marketable watermelon fruit, but average marketable fruit weight in cereal rye versus no cereal rye treatments varied by location. These results demonstrate that a spring-seeded cereal rye cover crop can help reduce weed density and weed biomass, and potentially enhance overall weed control. Cereal rye alone did not provide full-season weed control, so additional research is needed to determine the best methods to integrate spring cover cropping with other weed management tactics in watermelon for effective, full-season control.