Several Miscanthus species are cultivated in the U.S. Midwest and Northeast, and feral populations can displace the native plant community and potentially negatively affect ecosystem processes. The monetary cost of eradicating feral Miscanthus populations is unknown, but quantifying eradication costs will inform decisions on whether eradication is a feasible goal and should be considered when totaling the economic damage of invasive species. We managed experimental populations of eulaliagrass (Miscanthus sinensis Andersson) and the giant Miscanthus hybrid (Miscanthus × giganteus J.M. Greef & Deuter ex Hodkinson & Renvoize) in three floodplain forest and three old field sites in central Illinois with the goal of eradication. We recorded the time invested in eradication efforts and tracked survival of Miscanthus plants over a 5-yr period, then estimated the costs associated with eradicating these Miscanthus populations. Finally, we used these estimates to predict the total monetary costs of eradicating existing M. sinensis populations reported on EDDMapS. Miscanthus populations in the old field sites were harder to eradicate, with estimated eradication costs averaging 290% greater than those in the floodplain forest sites. However, the cost and time needed to eradicate Miscanthus populations were similar between Miscanthus species. On-site eradication costs ranged from $390 to $3,316 per site (or $1.3 to $11 m⁻²) in the old field sites, compared with only $85 to $547 (or $0.92 to $1.82 m⁻²) to eradicate populations within the floodplain forests, with labor comprising the largest share of these costs. Using our M. sinensis eradication cost estimates in Illinois, we predict that the potential costs to eradicate populations reported on EDDMapS would range from $10 to $37 million, with a median predicted cost of $22 million. The monetary costs of eradicating feral Miscanthus populations should be weighed against the benefits of cultivating these species to provide a comprehensive picture of the relative costs and benefits of adding these species to our landscapes.
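For readers interested in how such an extrapolation can be framed, the sketch below shows a minimal Monte Carlo version in Python. The per-site cost ranges are those quoted above; the number of EDDMapS populations, the 50/50 habitat split, and the uniform cost distributions are placeholder assumptions for illustration, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative extrapolation only: per-site cost ranges come from the
# abstract; N_POPULATIONS is a placeholder, not the actual EDDMapS count,
# so the printed total depends entirely on this assumption.
N_POPULATIONS = 10_000
N_SIMULATIONS = 5_000

# Per-site eradication cost ranges (USD) reported in the abstract.
OLD_FIELD = (390.0, 3316.0)
FLOODPLAIN = (85.0, 547.0)

totals = np.empty(N_SIMULATIONS)
for i in range(N_SIMULATIONS):
    # Assume (arbitrarily) each population is equally likely to resemble an
    # old-field or a floodplain-forest site, with uniform cost in each range.
    is_old_field = rng.random(N_POPULATIONS) < 0.5
    costs = np.where(
        is_old_field,
        rng.uniform(*OLD_FIELD, N_POPULATIONS),
        rng.uniform(*FLOODPLAIN, N_POPULATIONS),
    )
    totals[i] = costs.sum()

lo, med, hi = np.percentile(totals, [2.5, 50, 97.5])
print(f"Total cost: ${med/1e6:.1f}M (95% interval ${lo/1e6:.1f}M-${hi/1e6:.1f}M)")
```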
Copy number variants (CNVs) have been associated with the risk of schizophrenia, autism and intellectual disability. However, little is known about their spectrum of psychopathology in adulthood.
We investigated the psychiatric phenotypes of adult CNV carriers and compared probands, who were ascertained through clinical genetics services, with carriers who were not. One hundred twenty-four adult participants (age 18–76), each bearing one of 15 rare CNVs, were recruited through a variety of sources including clinical genetics services, charities for carriers of genetic variants, and online advertising. A battery of psychiatric assessments was used to determine psychopathology.
The frequencies of psychopathology were consistently higher for the CNV group compared to general population rates. We found particularly high rates of neurodevelopmental disorders (NDDs) (48%), mood disorders (42%), anxiety disorders (47%) and personality disorders (73%), as well as high rates of psychiatric multimorbidity (median number of diagnoses: 2 in non-probands, 3 in probands). NDDs (odds ratio (OR) = 4.67, 95% confidence interval (CI) 1.32–16.51; p = 0.017) and psychotic disorders (OR = 6.8, 95% CI 1.3–36.3; p = 0.025) occurred significantly more frequently in probands (N = 45; NDD: 39 [87%]; psychosis: 8 [18%]) than non-probands (N = 79; NDD: 20 [25%]; psychosis: 3 [4%]). Participants also had somatic diagnoses pertaining to all organ systems, particularly conotruncal cardiac malformations (in individuals with 22q11.2 deletion syndrome specifically), musculoskeletal, immunological, and endocrine diseases.
Adult CNV carriers had a markedly increased rate of anxiety and personality disorders not previously reported and high rates of psychiatric multimorbidity. Our findings support in-depth psychiatric and medical assessments of carriers of CNVs and the establishment of multidisciplinary clinical services.
Seed retention, and ultimately seed shatter, are extremely important for the efficacy of harvest weed seed control (HWSC) and are likely influenced by various agroecological and environmental factors. Field studies investigated seed-shattering phenology of 22 weed species across three soybean [Glycine max (L.) Merr.]-producing regions in the United States. We further evaluated the potential drivers of seed shatter in terms of weather conditions, growing degree days, and plant biomass. Based on the results, weather conditions had no consistent impact on weed seed shatter. However, greater individual plant biomass was positively correlated with delayed seed shatter at harvest. This work demonstrates that HWSC can potentially reduce weed seedbank inputs of plants that have escaped early-season management practices and retained seed through harvest. However, smaller individuals within the same population that shatter seed before harvest may escape both early-season management and HWSC.
To estimate prior severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection among skilled nursing facility (SNF) staff in the state of Georgia and to identify risk factors for seropositivity as of fall 2020.
Baseline survey and seroprevalence of the ongoing longitudinal coronavirus disease 2019 (COVID-19) Prevention in Nursing Homes study.
The study included 14 SNFs in the state of Georgia.
In total, 792 SNF staff employed by or contracted with participating SNFs were included in this study. The analysis included 749 participants with SARS-CoV-2 serostatus results who provided age, sex, and complete survey information.
We estimated unadjusted odds ratios (ORs) and 95% confidence intervals (95% CIs) for potential risk factors and SARS-CoV-2 serostatus. We estimated adjusted ORs using a logistic regression model including age, sex, community case rate, SNF resident infection rate, working at other facilities, and job role.
Staff working in high-infection SNFs were twice as likely (unadjusted OR, 2.08; 95% CI, 1.45–3.00) to be seropositive as those in low-infection SNFs. Certified nursing assistants (unadjusted OR, 2.93; 95% CI, 1.58–5.78) and nurses (unadjusted OR, 3.08; 95% CI, 1.66–6.07) were roughly 3 times as likely to be seropositive as administrative, pharmacy, or nonresident care staff. Logistic regression yielded similar adjusted ORs.
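A minimal Python sketch of the two estimation steps described above, an unadjusted odds ratio from a 2 × 2 table followed by adjusted odds ratios from a logistic regression, is given below. The data are simulated; the variable names echo the study's covariates, but none of the values are real, and the binary `clinical_role` flag is a simplification of the study's job-role categories.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 749  # analytic sample size reported in the abstract

# Simulated stand-in data: every value here is synthetic.
df = pd.DataFrame({
    "age": rng.integers(18, 75, n),
    "female": rng.integers(0, 2, n),
    "high_infection_snf": rng.integers(0, 2, n),
    "clinical_role": rng.integers(0, 2, n),  # CNA/nurse vs. other staff
})
logit_p = -2.0 + 0.7 * df["high_infection_snf"] + 1.1 * df["clinical_role"]
df["seropositive"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Unadjusted OR from the 2x2 table of facility infection level vs. serostatus.
tab = pd.crosstab(df["high_infection_snf"], df["seropositive"])
or_unadj = (tab.loc[1, 1] * tab.loc[0, 0]) / (tab.loc[1, 0] * tab.loc[0, 1])
print(f"Unadjusted OR (high- vs. low-infection SNF): {or_unadj:.2f}")

# Adjusted ORs: exponentiated logistic regression coefficients.
fit = smf.logit("seropositive ~ age + female + high_infection_snf + clinical_role",
                data=df).fit(disp=0)
print(np.exp(fit.params).round(2))
```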
Working at high-infection SNFs was a risk factor for SARS-CoV-2 seropositivity. Even after accounting for resident infections, certified nursing assistants and nurses had a 3-fold higher risk of SARS-CoV-2 seropositivity than nonclinical staff. This knowledge can guide prioritized implementation of safer ways for caregivers to provide necessary care to SNF residents.
Neuropsychopharmacologic effects of long-term opioid therapy (LTOT) in the context of chronic pain may result in subjective anhedonia coupled with decreased attention to natural rewards. Yet, there are no known efficacious treatments for anhedonia and reward deficits associated with chronic opioid use. Mindfulness-Oriented Recovery Enhancement (MORE), a novel behavioral intervention combining training in mindfulness with savoring of natural rewards, may hold promise for treating anhedonia in LTOT.
Veterans receiving LTOT (N = 63) for chronic pain were randomized to 8 weeks of MORE or a supportive group (SG) psychotherapy control. Before and after the 8-week treatment period, we assessed the effects of MORE on the late positive potential (LPP) of the electroencephalogram and skin conductance level (SCL) during viewing and up-regulating responses (i.e. savoring) to natural reward cues. We then examined whether these neurophysiological effects were associated with reductions in subjective anhedonia by 4-month follow-up.
Patients treated with MORE demonstrated significantly increased LPP and SCL to natural reward cues and greater decreases in subjective anhedonia relative to those in the SG. The effect of MORE on reducing anhedonia was statistically mediated by increases in LPP response during savoring.
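Mediation claims of this kind are commonly tested with a product-of-coefficients approach: the treatment-to-mediator path multiplied by the mediator-to-outcome path. The sketch below illustrates the basic calculation on simulated data; all effect sizes are invented, and a real analysis would typically bootstrap the indirect effect.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 63  # trial sample size from the abstract

# Simulated illustration of product-of-coefficients mediation:
# treatment -> change in LPP (mediator) -> change in anhedonia (outcome).
treat = rng.integers(0, 2, n)                      # 1 = MORE, 0 = SG
lpp_change = 0.8 * treat + rng.standard_normal(n)  # path a (invented)
anhedonia_change = (-0.6 * lpp_change - 0.2 * treat
                    + rng.standard_normal(n))      # paths b and c' (invented)

# Path a: effect of treatment on the mediator.
a = sm.OLS(lpp_change, sm.add_constant(treat)).fit().params[1]
# Path b: effect of the mediator on the outcome, controlling for treatment.
Xb = sm.add_constant(np.column_stack([lpp_change, treat]))
b = sm.OLS(anhedonia_change, Xb).fit().params[1]
print(f"Indirect (mediated) effect a*b = {a * b:.2f}")
```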
MORE enhances motivated attention to natural reward cues among chronic pain patients on LTOT, as evidenced by increased electrocortical and sympathetic nervous system responses. Given neurophysiological evidence of clinical target engagement, MORE may be an efficacious treatment for anhedonia among chronic opioid users, people with chronic pain, and those at risk for opioid use disorder.
Studying phenotypic and genetic characteristics of age at onset (AAO) and polarity at onset (PAO) in bipolar disorder can provide new insights into disease pathology and facilitate the development of screening tools.
To examine the genetic architecture of AAO and PAO and their association with bipolar disorder disease characteristics.
Genome-wide association studies (GWASs) and polygenic score (PGS) analyses of AAO (n = 12 977) and PAO (n = 6773) were conducted in patients with bipolar disorder from 34 cohorts and a replication sample (n = 2237). The association of onset with disease characteristics was investigated in two of these cohorts.
Earlier AAO was associated with a higher probability of psychotic symptoms, suicidality, lower educational attainment, not living together and fewer episodes. Depressive onset correlated with suicidality and manic onset correlated with delusions and manic episodes. Systematic differences in AAO between cohorts and continents of origin were observed. This was also reflected in single-nucleotide variant-based heritability estimates, with higher heritabilities for stricter onset definitions. Increased PGS for autism spectrum disorder (β = −0.34 years, s.e. = 0.08), major depression (β = −0.34 years, s.e. = 0.08), schizophrenia (β = −0.39 years, s.e. = 0.08), and educational attainment (β = −0.31 years, s.e. = 0.08) were associated with an earlier AAO. The AAO GWAS identified one significant locus, but this finding did not replicate. Neither GWAS nor PGS analyses yielded significant associations with PAO.
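The PGS associations reported above are, in essence, regressions of age at onset on a standardized polygenic score. The sketch below illustrates that calculation on simulated data; the effect size and residual variance are arbitrary, chosen only to echo the reported direction (higher schizophrenia PGS, earlier onset), and a real analysis would include ancestry and cohort covariates.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 12_977  # AAO GWAS sample size from the abstract

# Simulated data: standardized schizophrenia PGS and age at onset in years.
pgs_scz = rng.standard_normal(n)
aao = 25 + (-0.39) * pgs_scz + rng.normal(0, 9, n)

# Regress AAO on the PGS; the slope is interpreted as years per SD of PGS.
fit = sm.OLS(aao, sm.add_constant(pgs_scz)).fit()
beta, se = fit.params[1], fit.bse[1]
print(f"beta = {beta:.2f} years per SD of PGS (s.e. = {se:.2f})")
```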
AAO and PAO are associated with indicators of bipolar disorder severity. Individuals with an earlier onset show an increased polygenic liability for a broad spectrum of psychiatric traits. Systematic differences in AAO across cohorts, continents and phenotype definitions introduce significant heterogeneity, affecting analyses.
Research into the relationship between ecosystem services and human well-being, including poverty alleviation, has blossomed. However, little is known about who has produced this knowledge, what collaborative patterns and institutional and funding conditions have underpinned it, or what implications these matters may have. To investigate the potential implications of such knowledge production for conservation science and practice, we developed a social network analysis of the most prolific authors in the field of ecosystem services and poverty alleviation. We show that 70% of these authors are men, most are trained in either the biological sciences or economics and almost none in the humanities. Eighty per cent of authors obtained their PhD from universities in the EU or the USA, and they are currently employed in these regions. The co-authorship network is strongly collaborative, without dominant authors, and with the top 30 most cited scholars being based in the USA and co-authoring frequently. These findings suggest, firstly, that the production of knowledge on ecosystem services and poverty alleviation has the same geographical and gender biases that characterize knowledge production in other scientific areas and, secondly, that there is an expertise bias that also characterizes other environmental matters. This is despite the fact that the research field of ecosystem services and poverty alleviation, by its nature, requires a multidisciplinary lens. This could be overcome through promoting more extensive collaboration and knowledge co-production.
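Co-authorship networks of the kind analysed above are typically built by linking every pair of authors who share a paper. The Python sketch below shows the basic construction with invented author names; the real analysis would be built from the bibliographic records behind the study.

```python
import itertools
import networkx as nx

# Toy bibliography: author names and papers are invented for illustration.
papers = [
    ["Author A", "Author B", "Author C"],
    ["Author B", "Author D"],
    ["Author A", "Author C", "Author D", "Author E"],
]

G = nx.Graph()
for authors in papers:
    # Every pair of co-authors on a paper gets an edge; repeated
    # collaborations increment the edge weight.
    for u, v in itertools.combinations(authors, 2):
        w = G.get_edge_data(u, v, default={"weight": 0})["weight"]
        G.add_edge(u, v, weight=w + 1)

# Simple collaboration metrics of the kind used in co-authorship studies.
print("Degree centrality:", nx.degree_centrality(G))
print("Network density:", nx.density(G))
```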
The impacts of the COVID-19 pandemic extend to global biodiversity and its conservation. Although short-term beneficial or adverse impacts on biodiversity have been widely discussed, there is less attention to the likely political and economic responses to the crisis and their implications for conservation. Here we describe four possible alternative future policy responses: (1) restoration of the previous economy, (2) removal of obstacles to economic growth, (3) green recovery and (4) transformative economic reconstruction. Each alternative offers opportunities and risks for conservation. They differ in the agents they emphasize to mobilize change (e.g. markets or states) and in the extent to which they prioritize or downplay the protection of nature. We analyse the advantages and disadvantages of these four options from a conservation perspective. We argue that the choice of post-COVID-19 recovery strategy has huge significance for the future of biodiversity, and that conservationists of all persuasions must not shrink from engagement in the debates to come.
Potential effectiveness of harvest weed seed control (HWSC) systems depends upon seed shatter of the target weed species at crop maturity, enabling its collection and processing at crop harvest. However, seed retention likely is influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed-shatter phenology in 13 economically important broadleaf weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after physiological maturity at multiple sites spread across 14 states in the southern, northern, and mid-Atlantic United States. Greater proportions of seeds were retained by weeds at southern latitudes, and shatter rates increased at northern latitudes. Amaranthus spp. seed shatter was low (0% to 2%), whereas shatter varied widely in common ragweed (Ambrosia artemisiifolia L.) (2% to 90%) over the weeks following soybean physiological maturity. Overall, the broadleaf species studied shattered less than 10% of their seeds by soybean harvest. Our results suggest that some of the broadleaf species with greater seed retention rates in the weeks following soybean physiological maturity may be good candidates for HWSC.
Seed shatter is an important weediness trait on which the efficacy of harvest weed seed control (HWSC) depends. The level of seed shatter in a species is likely influenced by agroecological and environmental factors. In 2016 and 2017, we assessed seed shatter of eight economically important grass weed species in soybean [Glycine max (L.) Merr.] from crop physiological maturity to 4 wk after maturity at multiple sites spread across 11 states in the southern, northern, and mid-Atlantic United States. From soybean maturity to 4 wk after maturity, cumulative percent seed shatter was lowest in the southern U.S. regions and increased moving north through the states. At soybean maturity, the percent of seed shatter ranged from 1% to 70%. That range had shifted to 5% to 100% (mean: 42%) by 25 d after soybean maturity. There were considerable differences in seed-shatter onset and rate of progression between sites and years in some species that could impact their susceptibility to HWSC. Our results suggest that many summer annual grass species are likely not ideal candidates for HWSC, although HWSC could substantially reduce their seed output during certain years.
Radiocarbon (¹⁴C) ages cannot provide absolutely dated chronologies for archaeological or paleoenvironmental studies directly but must be converted to calendar age equivalents using a calibration curve compensating for fluctuations in atmospheric ¹⁴C concentration. Although calibration curves are constructed from independently dated archives, they invariably require revision as new data become available and our understanding of the Earth system improves. In this volume the international ¹⁴C calibration curves for both the Northern and Southern Hemispheres, as well as for the ocean surface layer, have been updated to include a wealth of new data and extended to 55,000 cal BP. Based on tree rings, IntCal20 now extends as a fully atmospheric record to ca. 13,900 cal BP. For the older part of the timescale, IntCal20 comprises statistically integrated evidence from floating tree-ring chronologies, lacustrine and marine sediments, speleothems, and corals. We utilized improved evaluation of the timescales and location-variable ¹⁴C offsets from the atmosphere (reservoir age, dead carbon fraction) for each dataset. New statistical methods have refined the structure of the calibration curves while maintaining a robust treatment of uncertainties in the ¹⁴C ages, the calendar ages and other corrections. The inclusion of modeled marine reservoir ages derived from a three-dimensional ocean circulation model has allowed us to apply more appropriate reservoir corrections to the marine ¹⁴C data rather than the previous use of constant regional offsets from the atmosphere. Here we provide an overview of the new and revised datasets and the associated methods used for the construction of the IntCal20 curve and explore potential regional offsets for tree-ring data. We discuss the main differences with respect to the previous calibration curve, IntCal13, and some of the implications for archaeology and geosciences ranging from the recent past to the time of the extinction of the Neanderthals.
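The calibration step itself reduces to a simple error model: a measured ¹⁴C age is compared against the curve's mean and uncertainty at each candidate calendar age. The sketch below illustrates that arithmetic with an invented toy curve segment; it is not the IntCal20 construction method, and real work should use the published curve tables and established calibration software.

```python
import numpy as np

# Toy calibration: given a measured 14C age and the curve's mean/error at
# each calendar year, the likelihood of each calendar age follows from a
# normal error model. The curve values below are invented placeholders.
cal_ages = np.arange(3000, 3501)            # calendar years BP (toy range)
curve_c14 = 2800 + 0.5 * (cal_ages - 3000)  # invented curve mean, 14C yr BP
curve_err = np.full_like(cal_ages, 15.0, dtype=float)

measured, meas_err = 2950.0, 30.0           # hypothetical lab measurement

sigma2 = meas_err**2 + curve_err**2
like = np.exp(-0.5 * (measured - curve_c14) ** 2 / sigma2) / np.sqrt(sigma2)
posterior = like / like.sum()

# Crude 95% highest-density region: take calendar ages in order of
# decreasing posterior mass until 95% is covered.
order = np.argsort(posterior)[::-1]
k = np.searchsorted(np.cumsum(posterior[order]), 0.95) + 1
hdi = np.sort(cal_ages[order][:k])
print(f"Calibrated age: {cal_ages[np.argmax(posterior)]} cal BP "
      f"(95% HDI ~ {hdi.min()}-{hdi.max()} cal BP)")
```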
UK Biobank is a well-characterised cohort of over 500 000 participants that includes genetic, environmental and imaging data. An online mental health questionnaire was designed for UK Biobank participants to expand its potential.
Describe the development, implementation and results of this questionnaire.
An expert working group designed the questionnaire, using established measures where possible, and consulting a patient group. Operational criteria were agreed for defining likely disorder and risk states, including lifetime depression, mania/hypomania, generalised anxiety disorder, unusual experiences and self-harm, and current post-traumatic stress and hazardous/harmful alcohol use.
A total of 157 366 completed online questionnaires were available by August 2017. Participants were aged 45–82 years (53% were ≥65 years), and 57% were women. Comparison of self-reported diagnosed mental disorder with a contemporary study shows a similar prevalence, despite respondents being of higher average socioeconomic status. Lifetime depression was a common finding, with 24% (37 434) of participants meeting criteria; current hazardous/harmful alcohol use criteria were met by 21% (32 602), whereas each of the other criteria was met by less than 8% of participants. There was extensive comorbidity among the syndromes. Mental disorders were associated with a high neuroticism score, adverse life events and long-term illness; addiction and bipolar affective disorder in particular were associated with measures of deprivation.
The UK Biobank questionnaire represents a very large mental health survey in itself, and the results presented here show high face validity, although caution is needed because of selection bias. Built into UK Biobank, these data intersect with other health data to offer unparalleled potential for cross-cutting biomedical research involving mental health.
When Hurricane Harvey struck the Texas coastline on August 25, 2017, it resulted in 88 fatalities and more than US $125 billion in damage to infrastructure. The floods associated with the storm created a toxic mix of chemicals, sewage and other biohazards, and over 6 million cubic meters of garbage in Houston alone. Biohazard exposure and injuries from trauma among persons residing in affected areas were widespread and likely contributed to increases in emergency department (ED) visits in Houston and in cities receiving hurricane evacuees. We investigated the medical surge resulting from these evacuations in Dallas–Fort Worth (DFW) metroplex EDs.
We used data from the North Texas Syndromic Surveillance Region 2/3 in ESSENCE to investigate the ED visit surge following the storm in DFW hospitals, because this area received evacuees from the 60 counties with disaster declarations due to the storm. We used interrupted time series (ITS) analysis to estimate the magnitude and duration of the ED surge. ITS was applied to all ED visits in DFW and to visits made by patients residing in any of the 60 counties with disaster declarations due to the storm. The DFW metropolitan statistical area included 55 hospitals. Time series analyses examined data from March 1, 2017–January 6, 2018, with a focus on the storm impact period, August 14–September 15, 2017. Data from before, during, and after the storm were visualized spatially and temporally to characterize the magnitude, duration, and spatial variation of medical surge attributable to Hurricane Harvey.
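For illustration, an interrupted time series of this kind can be fit as a segmented regression with a level-change term over the impact window. The Python sketch below uses simulated daily counts; the dates echo the study period, but the visit numbers and effect size are invented, and the study's actual ITS specification may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Simulated daily ED visit counts with a temporary jump standing in for
# the post-storm surge. All magnitudes are invented.
dates = pd.date_range("2017-03-01", "2018-01-06", freq="D")
t = np.arange(len(dates))
surge = ((dates >= "2017-08-25") & (dates <= "2017-09-21")).astype(int)
visits = 5000 + 0.5 * t + 550 * surge + rng.normal(0, 120, len(dates))

df = pd.DataFrame({"visits": visits, "t": t, "surge": surge})

# Segmented regression: baseline trend plus a level-change term during
# the impact window.
fit = smf.ols("visits ~ t + surge", data=df).fit()
excess = fit.params["surge"]
print(f"Estimated excess visits/day during surge: {excess:.0f}")
print(f"Approximate total excess: {excess * df['surge'].sum():.0f}")
```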
During the study period overall, ED visits in the DFW area rose immediately by about 11% (95% CI: 9%, 13%), amounting to ~16 500 excess total visits before returning to baseline on September 21, 2017. Visits to DFW hospitals by patients identified as residing in disaster declaration counties rose immediately by 127% (95% CI: 125%, 129%), amounting to 654 excess visits by September 29, 2017, when visits returned to baseline. A spatial analysis revealed that evacuated patients were strongly clustered (Moran’s I = 0.35, P < 0.0001) among 5 of the counties with disaster declarations during the 11-day window of the storm surge.
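Moran's I, the clustering statistic reported above, measures whether similar values sit in neighboring spatial units. A minimal hand-rolled version is sketched below; the adjacency matrix and evacuee counts are invented for illustration, and production analyses would use a dedicated spatial statistics library.

```python
import numpy as np

def morans_i(x: np.ndarray, w: np.ndarray) -> float:
    """Moran's I spatial autocorrelation of values x under weight matrix w."""
    z = np.asarray(x, dtype=float) - np.mean(x)
    n, s0 = len(z), w.sum()
    # I = (n / S0) * sum_ij w_ij z_i z_j / sum_i z_i^2
    return (n / s0) * (z @ w @ z) / (z @ z)

# Toy example: 5 "counties" on a line; neighbors share a border.
w = np.array([[0, 1, 0, 0, 0],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
evacuee_counts = np.array([120, 110, 90, 20, 10], dtype=float)
print(f"Moran's I = {morans_i(evacuee_counts, w):.2f}")  # > 0 => clustering
```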
The observed increase in ED visits in DFW due to Hurricane Harvey and ensuing evacuation was significant. Anticipating medical surge following large-scale hurricanes is critical for community preparedness planning. Coordinated planning across stakeholders is necessary to safeguard the population and for a skillful response to medical surge needs. Plans that address hurricane response, in particular, should have contingencies for support beyond the expected disaster areas.
Hurricane Maria caused catastrophic damage in Puerto Rico, increasing the risk for morbidity and mortality in the post-impact period. We aimed to establish a syndromic surveillance system to describe the number and type of visits at 2 emergency health-care settings in the same hospital system in Ponce, Puerto Rico.
We implemented a hurricane surveillance system by interviewing patients with a short questionnaire about the reason for their visit to a hospital emergency department and an associated urgent care clinic in the 6 mo after Hurricane Maria. We then evaluated the system by comparing findings with data from the electronic medical record (EMR) system for the same time period.
The hurricane surveillance system captured information from 5116 participants across the 2 sites, representing 17% of all visits captured in the EMR for the same period. Most visits were associated with acute illness/symptoms (79%), followed by injury (11%). The hurricane surveillance and EMR data were similar, proportionally, by sex, age, and visit category.
The hurricane surveillance system provided timely and representative data about the number and type of visits at 2 sites. This system, or an adapted version using available electronic data, should be considered in future disaster settings.
As demonstrated by neuroimaging data, the human brain contains systems that control responses to threat. The revised Reinforcement Sensitivity Theory of personality predicts that individual differences in the reactivity of these brain systems produce anxiety and fear-related personality traits. Here we discuss some of the challenges in testing this theory and, as an example, present a pilot study that aimed to dissociate brain activity during pursuit by threat and goal conflict. We did this by translating the Mouse Defense Test Battery for human fMRI use. In this version, dubbed the Joystick Operated Runway Task (JORT), we repeatedly exposed 24 participants to pursuit and goal conflict, with and without threat of electric shock. The runway design of JORT allowed the effect of threat distance on brain activation to be evaluated independently of context. Goal conflict plus threat of electric shock caused deactivation in a network of brain areas that included the fusiform and middle temporal gyri, as well as the default mode network core, including medial frontal regions, precuneus and posterior cingulate gyrus, and laterally the inferior parietal and angular gyri. Consistent with earlier research, we also found that imminent threat activated the midbrain and that this effect was significantly stronger during the simple pursuit condition than during goal conflict. Also consistent with earlier research, we found significantly greater hippocampal activation during goal conflict than pursuit by imminent threat. In conclusion, our results contribute knowledge to theories linking anxiety disorders to altered functioning in defensive brain systems and also highlight challenges in this research domain.
As the IAU heads towards its second century, many changes have simultaneously transformed Astronomy and the human condition world-wide. Amid the amazing recent discoveries of exoplanets, primeval galaxies, and gravitational radiation, the human condition on Earth has become blazingly interconnected, yet beset with ever-increasing problems of over-population, pollution, and never-ending wars. Fossil-fueled global climate change has begun to yield perilous consequences. And the displacement of people from war-torn nations has reached levels not seen since World War II.