In his 2019 essay, Arthur Kleinman laments that medicine has become ever more competent at managing illness, yet increasingly out of practice at caring for those who are ill. He contends that the language of ‘the soul’ is helpful to those practicing medicine, providing an important counterbalance to a technical rationality that avoids the existential and spiritual domains of human life. His accusation that medicine has become soulless merits consideration, yet we believe this is the wrong description of contemporary medicine. Where medicine is disciplined by technological and informational rationalities that risk coercing attention away from corporealities and toward an impersonal, digital order, the resulting practices expose medicine to becoming not soulless but excarnated. Here we engage Kleinman in conversation with Franco Berardi, Charles Taylor, and others to ask: Have we left behind the body for senseless purposes? Perhaps medicine is proving itself to be not soulless but senseless, bodyless – the any-occupation of excarnated souls. If so, contesting excarnation and recovering a purpose rooted in touch seem to us pressing needs within the contemporary, increasingly digitally managed and informationally ordered medical milieu.
Cognitive decline is expected in normative aging (Cabeza et al., 2018; Salthouse, 2019) and can lead to impairments in adaptive functioning (Yam et al., 2014). Several cognitive domains have been associated with adaptive functioning in older adult samples, including processing speed and executive functioning (e.g., Nguyen et al., 2019; Vaughn & Giovanello, 2010). A recent study examining a mixed clinical sample of older adults demonstrated that processing speed was more predictive of functional decline than other cognitive domains, including aspects of executive functioning (Roye et al., 2022). Therefore, this study attempted to build on these findings by further examining the relationships among processing speed, executive functioning, and adaptive functioning. Specifically, it investigated the extent to which processing speed mediated the associations between executive functioning and adaptive functioning.
Participants and Methods:
Participants (N = 239) were selected from a clinical database of neuropsychological evaluations. Inclusion criteria were age 60+ (M = 74.0, SD = 6.9) and completion of relevant study measures. Participants were majority White (93%) women (53.1%). Three cognitive diagnosis groups were coded: No Diagnosis (N = 82), Mild Neurocognitive Disorder (NCD; N = 78), and Major NCD (N = 79). The Texas Functional Living Scale (TFLS) was used as a performance-based measure of adaptive functioning. Processing speed was measured using the Coding subtest from the Repeatable Battery for the Assessment of Neuropsychological Status. Executive functioning performance was quantified using Part B of the Trail Making Test, the Controlled Oral Word Association Test, and the Similarities and Matrix Reasoning subtests from the WAIS-IV and WASI-II. Mediation models included age and years of education as covariates, and indirect effects were assessed with bootstrapped confidence intervals (Hayes, 2020).
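To make the bootstrapped-mediation logic concrete, here is a minimal Python sketch of a percentile-bootstrap indirect effect with covariates. The data and variable names are synthetic stand-ins, not the study's materials; the study used Hayes-style bootstrapped confidence intervals, for which this is only a hedged approximation.

```python
# Minimal sketch of a percentile-bootstrap indirect effect (X -> M -> Y)
# with covariates. Synthetic data; illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 239
exec_fn = rng.normal(size=n)                        # executive functioning (X)
speed = 0.5 * exec_fn + rng.normal(size=n)          # processing speed (mediator)
adaptive = 0.3 * speed + 0.2 * exec_fn + rng.normal(size=n)  # outcome (Y)
age = rng.normal(74, 7, n)                          # covariates
edu = rng.normal(14, 3, n)

def indirect(idx):
    """a*b indirect effect estimated on one bootstrap resample."""
    Xm = sm.add_constant(np.column_stack([exec_fn[idx], age[idx], edu[idx]]))
    a = sm.OLS(speed[idx], Xm).fit().params[1]      # X -> M path
    Xy = sm.add_constant(np.column_stack([speed[idx], exec_fn[idx],
                                          age[idx], edu[idx]]))
    b = sm.OLS(adaptive[idx], Xy).fit().params[1]   # M -> Y path
    return a * b

boot = [indirect(rng.integers(0, n, n)) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrapped indirect effect, 95% CI: [{lo:.3f}, {hi:.3f}]")
```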
Results:
Processing speed mediated the association between each executive functioning measure and adaptive functioning. The pattern was consistent across executive functioning measures: poorer executive functioning was associated with poorer processing speed, which was in turn associated with poorer adaptive functioning. Direct effects were significant in all models (ps < .03), suggesting that executive functioning maintained unique associations with adaptive functioning. Follow-up analyses indicated no evidence of moderation of the mediation models by diagnostic group.
Conclusions:
These results highlight the importance of processing speed for understanding the real-world implications of pathological and non-pathological cognitive aging. Processing speed mediated all relationships between executive functioning and adaptive functioning. There was no evidence of moderation of these effects, supporting generalizability across neurocognitive disorder status and etiologic subtype. Further investigation is warranted into the role of processing speed in explaining associations of other cognitive domains with adaptive functioning.
The presence of cognitive impairment corresponds with declines in adaptive functioning (Cahn-Weiner, Ready, & Malloy, 2003). Although memory loss is often highlighted as a key deficit in neurodegenerative diseases (Arvanitakis et al., 2018), research indicates that processing speed may be equally important when predicting functional outcomes in atypical cognitive decline (Roye et al., 2022). Additionally, the development of performance-based measures of adaptive functioning offers a quantifiable depiction of functional deficits within a clinical setting. This study investigated the degree to which processing speed explains the relationship between immediate/delayed memory and adaptive functioning in patients diagnosed with mild and major neurocognitive disorders using an objective measure of adaptive functioning.
Participants and Methods:
Participants (N = 115) were selected from a clinical database of neuropsychological evaluations. Included participants were aged 65+ (M = 74.7, SD = 5.15), completed all relevant study measures, and were diagnosed with Mild Neurocognitive Disorder (NCD; N = 69) or Major NCD (N = 46). They were majority White (87.8%) women (53.0%). The Texas Functional Living Scale was used as a performance-based measure of adaptive functioning. The Coding subtest from the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS-CD) was used to measure information processing speed. Composite memory measures for Immediate Recall and Delayed Recall were created from subtests of the RBANS (List Learning, Story Memory, and Figure Recall) and the Wechsler Memory Scale-IV (Logical Memory and Visual Reproduction). Multiple regressions were conducted to evaluate the importance of memory and information processing speed in understanding adaptive functioning. Age and years of education were added as covariates in regression analyses.
Results:
Significant correlations (p < .001) were found between adaptive functioning and processing speed (PS; r = .52), immediate memory (IM; r = .43), and delayed memory (DM; r = .32). In a regression model with IM and DM predicting daily functioning, only IM significantly explained daily functioning (rsp = .24, p = .009). A multiple regression revealed daily functioning was significantly and uniquely associated with IM (rsp = .28, p < .001) and PS (rsp = .41, p < .001). This was qualified by a significant interaction effect (rsp = -.29, p = .001), revealing that IM was only associated with adaptive functioning at PS scores lower than the RBANS normative 20th percentile.
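As a hedged illustration of how such an interaction can be probed, the sketch below estimates simple slopes of immediate memory (IM) at selected processing-speed (PS) values. The data are synthetic and the cutoff of z ≈ −0.84 merely approximates a normative 20th percentile; this is not the study's analysis code.

```python
# Probing an IM x PS interaction via simple slopes. Synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 115
df = pd.DataFrame({"PS": rng.normal(size=n), "IM": rng.normal(size=n)})
df["TFLS"] = 0.4*df.PS + 0.2*df.IM - 0.25*df.PS*df.IM + rng.normal(size=n)

m = smf.ols("TFLS ~ IM * PS", data=df).fit()
b_im, b_int = m.params["IM"], m.params["IM:PS"]
for ps in (-0.84, 0.0):  # roughly the 20th percentile and the mean of PS
    print(f"simple slope of IM at PS={ps:+.2f}: {b_im + b_int*ps:.3f}")
```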
Conclusions:
Results suggest that processing speed may be a more sensitive predictor of functional decline than memory among older adults with cognitive disorders. These findings support further investigation into the clinical utility of processing speed tests for predicting functional decline in older adults.
We recently reported on the radio-frequency attenuation length of cold polar ice at Summit Station, Greenland, based on bi-static radar measurements of radio-frequency bedrock echo strengths taken during the summer of 2021. Those data also allow studies of (a) the relative contributions of coherent (such as discrete internal conducting layers with sub-centimeter transverse scale) vs incoherent (e.g. bulk volumetric) scattering, (b) the magnitude of internal layer reflection coefficients, (c) limits on signal propagation velocity asymmetries (‘birefringence’) and (d) limits on in-ice signal dispersion over a bandwidth of ~100 MHz. We find that (1) attenuation lengths approach 1 km in our band, (2) after averaging 10 000 echo triggers, reflected signals observable over the thermal floor (to depths of ~1500 m) are consistent with being entirely coherent, (3) internal layer reflectivities are ≈ −60 to −70 dB, (4) birefringent effects for vertically propagating signals are smaller by an order of magnitude relative to South Pole and (5) within our experimental limits, glacial ice is non-dispersive over the frequency band relevant for neutrino detection experiments.
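For orientation, the sketch below shows the arithmetic behind two of the quantities quoted above: converting an echo amplitude ratio into a reflectivity in dB, and translating a ~1 km field-attenuation length into a one-way propagation loss. The specific inputs are round illustrations, not the measured Summit Station values.

```python
# Reflectivity in dB and one-way attenuation loss. Illustrative numbers.
import numpy as np

def reflectivity_db(v_reflected, v_incident):
    # 20*log10 of a voltage (field) ratio gives the power ratio in dB
    return 20 * np.log10(v_reflected / v_incident)

def one_way_loss_db(depth_m, L_atten_m=1000.0):
    # Field amplitude decays as exp(-z/L); in dB that is ~8.686 * z / L
    return 20 * np.log10(np.e) * depth_m / L_atten_m

print(reflectivity_db(1e-3, 1.0))  # -60 dB, matching the quoted layer range
print(one_way_loss_db(1500.0))     # ~13 dB one-way loss at 1500 m depth
```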
Depression and anxiety are common and highly comorbid, and their comorbidity is associated with poorer outcomes, posing clinical and public health concerns. We evaluated the polygenic contribution to comorbid depression and anxiety, and to each condition in isolation.
Methods
Diagnostic codes were extracted from electronic health records for four biobanks [N = 177 865 including 138 632 European (77.9%), 25 612 African (14.4%), and 13 621 Hispanic (7.7%) ancestry participants]. The outcome was a four-level variable representing the depression/anxiety diagnosis group: neither, depression-only, anxiety-only, and comorbid. Multinomial regression was used to test for association of depression and anxiety polygenic risk scores (PRSs) with the outcome while adjusting for principal components of ancestry.
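A minimal sketch of the analysis described above, on synthetic data: a multinomial logit of the four-level outcome on two polygenic risk scores plus ancestry principal components. statsmodels' MNLogit stands in for the biobank-specific pipelines actually used, so treat the details as assumptions.

```python
# Multinomial regression of a 4-level depression/anxiety outcome on PRSs.
# Synthetic data; illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 5000
dep_prs, anx_prs = rng.normal(size=n), rng.normal(size=n)
pcs = rng.normal(size=(n, 4))  # stand-in ancestry principal components

# 0=neither, 1=depression-only, 2=anxiety-only, 3=comorbid
logits = np.column_stack([np.zeros(n), 0.2*dep_prs, 0.1*anx_prs,
                          0.2*dep_prs + 0.1*anx_prs])
p = np.exp(logits) / np.exp(logits).sum(1, keepdims=True)
y = np.array([rng.choice(4, p=pi) for pi in p])

X = sm.add_constant(np.column_stack([dep_prs, anx_prs, pcs]))
fit = sm.MNLogit(y, X).fit(disp=0)
print(np.exp(fit.params))  # odds ratios per SD of PRS vs. the 'neither' group
```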
Results
In total, 132 960 patients had neither diagnosis (74.8%), 16 092 depression-only (9.0%), 13 098 anxiety-only (7.4%), and 16 584 comorbid (9.3%). In the European meta-analysis across biobanks, both PRSs were higher in each diagnosis group compared to controls. Notably, depression-PRS (OR 1.20 per s.d. increase in PRS; 95% CI 1.18–1.23) and anxiety-PRS (OR 1.07; 95% CI 1.05–1.09) had the largest effect when the comorbid group was compared with controls. Furthermore, the depression-PRS was significantly higher in the comorbid group than the depression-only group (OR 1.09; 95% CI 1.06–1.12) and the anxiety-only group (OR 1.15; 95% CI 1.11–1.19) and was significantly higher in the depression-only group than the anxiety-only group (OR 1.06; 95% CI 1.02–1.09), showing a genetic risk gradient across the conditions and the comorbidity.
Conclusions
This study suggests that depression and anxiety have partially independent genetic liabilities and that these genetic vulnerabilities make distinct contributions to comorbid depression and anxiety.
In the transition zone, turfgrass managers generally use the dormancy period of warm-season turfgrass to apply herbicides for managing winter annual weeds. Although this weed control strategy is common in bermudagrass [Cynodon dactylon (L.) Pers.], it has been less widely adopted in zoysiagrass (Zoysia spp.) because of variable turfgrass injury during post-dormancy transition. Previous research reported that air temperature can affect weed control and crop safety from herbicides. Growth-chamber studies were conducted to evaluate zoysiagrass response to glyphosate and glufosinate as influenced by three temperature regimes during and after treatment. A field study was conducted over four site-years to assess the influence of variable heat-unit accumulation on zoysiagrass response to seven herbicides. In the growth-chamber study, glufosinate injured zoysiagrass more than glyphosate and reduced the time to 50% green cover reduction, regardless of rate, when plants were incubated for 7 d under the different temperature levels. When green zoysiagrass sprigs were incubated for 7 d at 10 C, green cover declined slowly for both herbicides; under 27 C, green cover was rapidly reduced. After treated zoysiagrass plugs with 5% green cover were incubated at 10 C for 14 d, glyphosate-treated plugs reached 50% green cover in 22 d, similar to nontreated plugs but faster than the 70 d required by glufosinate-treated plugs. Zoysiagrass response to glyphosate was temperature dependent, but glufosinate injured zoysiagrass unacceptably regardless of temperature regime. In the field study, diquat, flumioxazin, glufosinate, and metsulfuron + rimsulfuron injured zoysiagrass at application timings of 200 or 300 growing degree days at base 5 C (GDD5C), whereas foramsulfuron and oxadiazon did not injure zoysiagrass at any GDD5C timing. The relationship of leaf density to green turf cover depends on zoysiagrass mowing height, and both metrics are reduced by injurious herbicides. This research indicates that glufosinate injures zoysiagrass more than glyphosate and that the speed and magnitude of herbicide injury generally increase with temperature.
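Since application timings above are expressed in growing degree days at base 5 C (GDD5C), a short sketch of that accumulation may help. The daily temperatures below are invented; only the 200 and 300 GDD5C thresholds come from the text.

```python
# Accumulating growing degree days at base 5 C from daily min/max temps.
# Hypothetical temperature series; illustrative only.
import numpy as np

def gdd5c(tmax, tmin, base=5.0):
    daily = np.maximum((np.asarray(tmax) + np.asarray(tmin)) / 2 - base, 0)
    return daily.cumsum()

tmax = np.linspace(10, 24, 60)  # hypothetical 60-day spring warm-up (C)
tmin = tmax - 8
acc = gdd5c(tmax, tmin)
print(int(np.argmax(acc >= 200)) + 1, "days to reach 200 GDD5C")
```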
This paper proposes a framework for comprehensive, collaborative, and community-based care (C4) for accessible mental health services in low-resource settings. Because mental health conditions have many causes, this framework includes social, public health, wellness and clinical services. It accommodates integration of stand-alone mental health programs with health and non-health community-based services. It addresses gaps in previous models, including the lack of community-based psychotherapeutic and social services, the difficulty of addressing comorbid mental and physical conditions, and the question of how workers interact with respect to referral and coordination of care. The framework is based on task-shifting of services to non-specialized workers. While the framework draws on the World Health Organization’s Mental Health Gap Action Programme and other global mental health models, there are important differences. The C4 Framework delineates types of workers based on their skills. Separate workers focus on: basic psychoeducation and information sharing; community-level, evidence-based psychotherapeutic counseling; and primary medical care and more advanced, specialized mental health services for more severe or complex cases. This paper is intended for individuals, organizations and governments interested in implementing mental health services. The primary aim is to provide a framework for the provision of widely accessible mental health care and services.
As the title indicates, this review of research in Latin America covers a wide variety of topics. It can, however, be subdivided into three main divisions: (a) counseling, guidance, and student personnel work; (b) research dealing with disabled and/or handicapped persons; and (c) studies dealing with cultural-attitudinal-value influences within Latin American education as they affect counseling and guidance and special education-rehabilitation.
Hospitalizations among skilled nursing facility (SNF) residents in Detroit increased in mid-March 2020 due to the coronavirus disease 2019 (COVID-19) pandemic. Outbreak response teams were deployed from local healthcare systems, the Centers for Disease Control and Prevention (CDC), and the Detroit Health Department (DHD) to understand the infection prevention and control (IPC) gaps in SNFs that may have accelerated the outbreak.
Methods:
We conducted 2 point-prevalence surveys (PPS-1 and PPS-2) at 13 Detroit SNFs from April 8 to May 8, 2020. The DHD and partners conducted facility-wide severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) testing of all residents and staff and collected information regarding resident cohorting, staff cohorting, and personal protective equipment (PPE) utilized during that time.
Results:
Resident cohorting had been implemented in 7 of 13 (53.8%) SNFs prior to PPS-1, and other facilities initiated cohorting after obtaining PPS-1 results. Cohorting protocols for healthcare practitioners and environmental services staff were not established in 4 of 13 facilities (31%), and in 3 facilities (23.1%) ancillary staff were not assigned to cohorts. Also, 2 SNFs (15%) had an observation unit prior to PPS-1, 2 (15%) established one after PPS-1, 4 (31%) could not establish an observation unit due to inadequate space, and 5 (38.4%) created an observation unit after PPS-2.
Conclusion:
On-site consultations identified gaps in IPC knowledge and cohorting that may have contributed to ongoing transmission of SARS-CoV-2 among SNF residents despite aggressive testing measures. Infection preventionists (IPs) are critical in guiding ongoing IPC practices in SNFs to reduce spread of COVID-19 through response and prevention.
U.S. veterans report high rates of traumatic experiences and mental health symptomology [e.g. posttraumatic stress disorder (PTSD)]. The stress sensitization hypothesis posits that experiences of adversity sensitize individuals to stress reactions, which can lead to greater psychiatric problems. We extend this hypothesis by exploring how multiple adversities, such as early childhood adversity, combat-related trauma, and military sexual trauma, relate to heterogeneity in stress over time and, subsequently, to greater risk for PTSD.
Methods
In total, 1230 veterans were recruited for an observational, longitudinal study. Veterans responded to questionnaires on PTSD, stress, and traumatic experiences five times over an 18-month study period. We used latent transition analysis to understand how heterogeneity in adverse experiences relates to transitions into stress trajectory classes. We also explored how transition patterns related to PTSD symptomology.
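A full latent transition analysis jointly estimates latent statuses and transition probabilities; the sketch below is only a simplified analogue under stated assumptions, fitting a two-class mixture at two waves of synthetic stress scores and cross-tabulating the class assignments into an empirical transition matrix.

```python
# Simplified stand-in for latent transition analysis: per-wave mixture
# classes plus an empirical transition matrix. Synthetic data.
import numpy as np
import pandas as pd
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
n = 1230
stress_t1 = np.concatenate([rng.normal(0, 1, n//2), rng.normal(3, 1, n - n//2)])
stress_t2 = stress_t1 + rng.normal(0.2, 1, n)  # mild drift between waves

gm1 = GaussianMixture(2, random_state=0).fit(stress_t1.reshape(-1, 1))
gm2 = GaussianMixture(2, random_state=0).fit(stress_t2.reshape(-1, 1))
c1 = gm1.predict(stress_t1.reshape(-1, 1))
c2 = gm2.predict(stress_t2.reshape(-1, 1))

trans = pd.crosstab(c1, c2, normalize="index")  # P(class at t2 | class at t1)
print(trans.round(2))
```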
Results
Across all models, we found support for stress sensitization. In general, combat trauma in combination with other types of adverse experiences, namely early childhood adversity and military sexual trauma, imposed a greater probability of transitioning into higher-risk stress profiles. We also found differential effects of early childhood and military-specific adversity on PTSD symptomology.
Conclusion
The present study rigorously integrates both military-specific and early-life adversity into analyses of stress sensitivity and is the first to examine how sensitivity might affect trajectories of stress over time. Our study provides a nuanced and specific look at who is at risk of sensitization to stress based on previous traumatic experiences, as well as which transition patterns are associated with greater PTSD symptomology.
A hedonic model was employed to examine factors that influence the resale price of row crop planters on the used machinery market. Planter sale data from 2016 to 2018 were used in the analysis. Results suggested that the primary factors affecting planter resale prices were make, age, condition, planter configuration, row number, and row spacing. Planter values were generally found to depreciate with age at a decreasing rate. Finally, a significant interaction between make and age indicated that different planter makes depreciate at different rates.
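One conventional way to capture depreciation at a decreasing rate, and make-specific depreciation, is a log-price hedonic regression with a quadratic age term and a make-by-age interaction. The sketch below, on synthetic data, is an illustrative specification rather than the authors' estimated model.

```python
# Hedonic log-price regression with make-specific depreciation.
# Synthetic data; illustrative specification only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 800
df = pd.DataFrame({
    "age":  rng.integers(1, 20, n),
    "make": rng.choice(["A", "B"], n),
    "rows": rng.choice([8, 12, 16, 24], n),
})
rate = np.where(df["make"].eq("A"), 0.10, 0.14)  # make-specific depreciation
df["log_price"] = (12 - rate*df["age"] + 0.002*df["age"]**2
                   + 0.05*df["rows"] + rng.normal(0, 0.2, n))

m = smf.ols("log_price ~ make * age + I(age**2) + rows", data=df).fit()
print(m.params.round(3))  # positive age^2: value falls at a decreasing rate
```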
Immediate posttreatment irrigation has been proposed as a method to reduce hybrid bermudagrass [Cynodon dactylon (L.) Pers. × Cynodon transvaalensis Burtt Davy] phytotoxicity from topramezone. Immediate irrigation is impractical because it would take a turfgrass sprayer 10 to 15 min to cover an average golf course fairway or athletic field. There is also insufficient evidence regarding how posttreatment irrigation, immediate or otherwise, influences mature goosegrass [Eleusine indica (L.) Gaertn.] control from topramezone or low-dose topramezone plus metribuzin programs. We investigated bermudagrass and E. indica response to immediate, 15-min, and 30-min posttreatment irrigation, compared with no irrigation, following topramezone at 12.3 g ae ha−1 (the lowest labeled rate) or topramezone at 6.1 g ha−1 plus metribuzin at 210 g ai ha−1. We also evaluated placement of each herbicide and their combination on soil, foliage, or soil plus foliage to help elucidate the mechanisms behind differential responses between species and herbicide mixtures. Responses varied by trial: immediate irrigation nearly eliminated bermudagrass injury from high-dose topramezone in one trial but only slightly reduced it in another. When posttreatment irrigation was postponed for 15 or 30 min, topramezone alone injured bermudagrass unacceptably in both trials. Bermudagrass was injured less by low-dose topramezone plus metribuzin than by high-dose topramezone. All posttreatment irrigation timings reduced E. indica control compared with no posttreatment irrigation. The herbicide placement study suggested that topramezone control of E. indica depends strongly on foliar uptake and that topramezone is more phytotoxic than metribuzin to both bermudagrass and E. indica. Thus, posttreatment irrigation likely reduces the effective topramezone dose, with a concomitant effect on phytotoxicity in both species. Metribuzin reduced 21-d cumulative clipping weight and tiller production, which may be a mechanism by which it reduces the foliar white discoloration caused by topramezone.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated the seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Weather conditions had no consistent effect on weed seed shatter; however, greater individual plant biomass was associated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs from plants that escaped early-season management and retained seed through harvest. However, smaller plants within the same population that shatter seed before harvest risk escaping both early-season management and HWSC.
This study assessed the cost-effectiveness of the Centers for Disease Control and Prevention’s (CDC’s) Sodium Reduction in Communities Program (SRCP).
Design:
We collected implementation costs and performance measure indicators from SRCP recipients and their partner food service organisations. We estimated the cost per person and per food service organisation reached and the cost per menu item impacted. We estimated the short-term effectiveness of SRCP in reducing sodium consumption and used it as an input in the Prevention Impact Simulation Model to project the long-term impact on medical cost savings and quality-adjusted life-years gained due to a reduction in CVD and estimate the cost-effectiveness of SRCP if sustained through 2025 and 2040.
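The unit-cost metrics described above reduce to simple ratios. The sketch below computes them from hypothetical totals, chosen only so the outputs line up with the averages reported in the Results; the underlying cost inputs are not given in this abstract.

```python
# Unit-cost ratios for a program reaching people, organisations, and
# menu items. Totals below are hypothetical, not published figures.
def unit_costs(total_cost, persons_reached, orgs_reached, items_impacted):
    return {
        "cost_per_person": total_cost / persons_reached,
        "cost_per_org":    total_cost / orgs_reached,
        "cost_per_item":   total_cost / items_impacted,
    }

# e.g., a recipient spending $343,336 to reach 8 partner organisations
# and ~34,000 people reproduces the $42,917 and ~$10 averages below.
print(unit_costs(343_336, 34_000, 8, 500))
```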
Setting:
CDC funded eight recipients as part of the 2016–2021 round of the SRCP to work with food service organisations in eight settings to increase the availability and purchase of lower-sodium food options.
Participants:
Eight SRCP recipients and twenty of their partners.
Results:
At the recipient level, average cost per person reached was $10, and average cost per food service organisation reached was $42 917. At the food service organisation level, median monthly cost per food item impacted by recipe modification or product substitution was $684. Cost-effectiveness analyses showed that, if sustained, the programme is cost saving (i.e. the reduction in medical costs is greater than the implementation costs) in the target population by $1·82 through 2025 and $2·09 through 2040.
Conclusions:
By providing evidence of the cost-effectiveness of a real-world sodium reduction initiative, this study can help inform decisions by public health organisations about related CVD prevention interventions.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
Aims
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Method
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
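As a hedged sketch of the PGS analyses, the code below regresses age at onset on a standardized polygenic score with ancestry covariates, reporting the effect in years per standard deviation of the score, the units used in the Results. Data are synthetic and the covariate set is trimmed for brevity.

```python
# PGS association with age at onset (AAO): beta in years per SD of score.
# Synthetic data; illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 12977
pgs = rng.normal(size=n)
pcs = rng.normal(size=(n, 2))               # stand-in ancestry PCs
aao = 25 - 0.34 * pgs + rng.normal(0, 8, n)  # earlier onset with higher PGS

z = (pgs - pgs.mean()) / pgs.std()          # standardize: beta per SD
X = sm.add_constant(np.column_stack([z, pcs]))
fit = sm.OLS(aao, X).fit()
print(f"beta = {fit.params[1]:.2f} years per SD (s.e. {fit.bse[1]:.2f})")
```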
Results
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
Conclusions
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
ABSTRACT IMPACT: This project seeks to identify unique host responses that are biomarkers for specific urethral pathogens and that can be used in the development of point-of-care (POC) STI diagnostics. OBJECTIVES/GOALS: How Chlamydia trachomatis (CT) and other common STIs, e.g. Neisseria gonorrhoeae, evade immunity and elicit pathology in the male urethra is poorly understood. Our objective is to determine how STI-infected urethral epithelial cells, as well as the uninfected ‘bystander’ cells with which infected cells communicate, respond to CT and other STIs. METHODS/STUDY POPULATION: We evaluated how immortalized urethral cell lines, including transduced human urethral epithelial cells (THUECs), respond to increasing doses of CT infectious particles using in vitro one-step progeny assays performed in the presence or absence of cycloheximide, a drug that inhibits eukaryotic protein synthesis. We will perform concurrent single-cell RNA sequencing (scRNA-seq) and multiplex cytokine analyses to determine how different CT doses impact the transcriptomes of infected and bystander urethral epithelial cells and modulate cytokine production of the overall monolayer. Results of these experiments will inform the feasibility of performing similar analyses in situ using urethral swabs from men with clinically diagnosed urethritis. RESULTS/ANTICIPATED RESULTS: Our results demonstrate that immune-competent urethral cell monolayers strongly resist CT infection unless most of the cells are simultaneously infected. This suggests that uninfected bystander cells sense CT-infected cells and secrete soluble factors that may act to limit CT proliferation in infected cells and to inform remaining uninfected cells that a potential pathogen is present. We anticipate that our scRNA-seq and cytokine analyses will identify both specific effector pathways that protect against CT and intracellular signals that modulate them. We speculate that these pathways and signals may differ during infection with CT and other STIs. Importantly, we anticipate that our in vitro model of CT infection will be highly representative of in situ immune responses observed in urethras of infected men. DISCUSSION/SIGNIFICANCE OF FINDINGS: In men, common STIs including CT are usually managed syndromically due to a lack of POC diagnostics. By determining how STIs elicit urethral inflammation and identifying countermeasures that STIs use to evade urethral immunity, we can identify host responses that serve as biomarkers for urethritis generally and for specific urethral pathogens.
Impairment in reciprocal social behavior (RSB), an essential component of early social competence, clinically defines autism spectrum disorder (ASD). However, the behavioral and genetic architecture of RSB in toddlerhood, when ASD first emerges, has not been fully characterized. We analyzed data from a quantitative video-referenced rating of RSB (vrRSB) in two toddler samples: a community-based volunteer research registry (n = 1,563) and an ethnically diverse, longitudinal twin sample ascertained from two state birth registries (n = 714). Variation in RSB was continuously distributed, temporally stable, significantly associated with ASD risk at age 18 months, and only modestly explained by sociodemographic and medical factors (r² = 9.4%). Five latent RSB factors were identified and corresponded to aspects of social communication or restricted repetitive behaviors, the two core ASD symptom domains. Quantitative genetic analyses indicated substantial heritability for all factors at age 24 months (h² ≥ .61). Genetic influences strongly overlapped across all factors, with a social motivation factor showing evidence of newly emerging genetic influences between the ages of 18 and 24 months. RSB constitutes a heritable, trait-like competency whose factorial and genetic structure is generalized across diverse populations, demonstrating its role as an early, enduring dimension of inherited variation in human social behavior. Substantially overlapping RSB domains, measurable when core ASD features arise and consolidate, may serve as markers of specific pathways to autism and anchors to inform determinants of autism's heterogeneity.
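For intuition about the twin-based heritability figure, the classic Falconer decomposition can be sketched as below. The twin correlations are invented, and the study's actual quantitative genetic models are more elaborate, so this is orientation only.

```python
# Falconer's twin estimates: h^2 = 2*(r_MZ - r_DZ), c^2 = 2*r_DZ - r_MZ.
# Correlations are hypothetical; for orientation only.
def falconer(r_mz, r_dz):
    h2 = 2 * (r_mz - r_dz)  # additive genetic share of variance
    c2 = 2 * r_dz - r_mz    # shared-environment share
    e2 = 1 - r_mz           # nonshared environment plus error
    return h2, c2, e2

print(falconer(r_mz=0.70, r_dz=0.38))  # -> (0.64, 0.06, 0.30): h2 in the .6+ range
```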
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds in southern latitudes and shatter rate increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.