Personality traits (e.g. neuroticism) and the social environment predict risk for internalizing disorders and suicidal behavior. Studying these characteristics together and prospectively within a population confronted with high stressor exposure (e.g. U.S. Army soldiers) has not been done, yet could uncover unique and interactive predictive effects that may inform prevention and early intervention efforts.
Methods
Five broad personality traits and social network size were assessed via self-administered questionnaires among experienced soldiers preparing for deployment (N = 4645) and new soldiers reporting for basic training (N = 6216). Predictive models examined associations of baseline personality and social network variables with recent distress disorders or suicidal behaviors assessed 3 and 9 months post-deployment and approximately 5 years following enlistment.
Results
Among the personality traits, elevated neuroticism was consistently associated with increased mental health risk following deployment. Small social networks were also associated with increased mental health risk following deployment, beyond the variance accounted for by personality. Limited support was found for social network size moderating the association between personality and mental health outcomes. Small social networks also predicted distress disorders and suicidal behavior 5 years following enlistment, whereas unique effects of personality traits on these more distal outcomes were rare.
Conclusions
Heightened neuroticism and small social networks predict a greater risk for negative mental health sequelae, especially following deployment. Social ties may mitigate adverse impacts of personality traits on psychopathology in some contexts. Early identification and targeted intervention for these distinct, modifiable factors may decrease the risk of distress disorders and suicidal behavior.
The North American Waterfowl Management Plan highlights the importance of enhancing waterfowl habitat for productivity and resilience. Many forms of land management are conducted in wetlands to support the diverse communities of waterfowl and other species. Primary burrowing crayfish are also abundant and important in these environments, but little research is available assessing the effects of waterfowl land management on primary burrowers. We examined the response of the digger crayfish, Creaserinus fodiens, to the common vegetation management practices of mowing and disking at waterfowl conservation areas in south-eastern Missouri. Our results demonstrated that, at a fine scale, crayfish density was affected only by canopy cover. We also highlighted distributional effects of landscape-level environmental variables and suggested that habitat generalists were tolerant of vegetation management, responding more to vegetation composition and broader landscape effects. We discuss wetlands conservation practices and suggest that burrowing crayfish management would integrate well with some current management strategies for waterfowl.
Impaired olfaction may be a biomarker for early Lewy body disease, but its value in mild cognitive impairment with Lewy bodies (MCI-LB) is unknown. We compared olfaction in MCI-LB with MCI due to Alzheimer’s disease (MCI-AD) and healthy older adults. We hypothesized that olfactory function would be worse in probable MCI-LB than in both MCI-AD and healthy comparison subjects (HC).
Design:
Cross-sectional study assessing olfaction using Sniffin’ Sticks 16 (SS-16) in MCI-LB, MCI-AD, and HC with longitudinal follow-up. Differences were adjusted for age, and receiver operating characteristic (ROC) curves were used for discriminating MCI-LB from MCI-AD and HC.
Setting:
Participants were recruited from Memory Services in the North East of England.
Participants:
Thirty-eight probable MCI-LB, 33 MCI-AD, 19 possible MCI-LB, and 32 HC.
Measurements:
Olfaction was assessed using SS-16 and a questionnaire.
Results:
Participants with probable MCI-LB had worse olfaction than both MCI-AD (age-adjusted mean difference (B) = 2.05, 95% CI: 0.62–3.49, p = 0.005) and HC (B = 3.96, 95% CI: 2.51–5.40, p < 0.001). The previously identified cutoff score for the SS-16 of ≤ 10 had 84% sensitivity for probable MCI-LB (95% CI: 69–94%), but 30% specificity versus MCI-AD. ROC analysis found a lower cutoff of ≤ 7 was better (63% sensitivity for MCI-LB, with 73% specificity vs MCI-AD and 97% vs HC). Asking about olfactory impairments was not useful in identifying them.
Conclusions:
MCI-LB had worse olfaction than MCI-AD and normal aging. A lower cutoff score of ≤ 7 is required when using SS-16 in such patients. Olfactory testing may have value in identifying early LB disease in memory services.
The present study aimed to clarify the neuropsychological profile of the emergent diagnostic category of Mild Cognitive Impairment with Lewy bodies (MCI-LB) and determine whether domain-specific impairments such as in memory were related to deficits in domain-general cognitive processes (executive function or processing speed).
Method:
Patients (n = 83) and healthy age- and sex-matched controls (n = 34) underwent clinical and imaging assessments. Probable MCI-LB (n = 44) and MCI-Alzheimer’s disease (AD) (n = 39) were diagnosed following National Institute on Aging-Alzheimer’s Association (NIA-AA) and dementia with Lewy bodies (DLB) consortium criteria. Neuropsychological measures included cognitive and psychomotor speed, executive function, working memory, and verbal and visuospatial recall.
Results:
MCI-LB scored significantly lower than MCI-AD on processing speed [Trail Making Test B: p = .03, g = .45; Digit Symbol Substitution Test (DSST): p = .04, g = .47; DSST Error Check: p < .001, g = .68] and executive function [Trail Making Test Ratio (A/B): p = .04, g = .52] tasks. MCI-AD performed worse than MCI-LB on memory tasks, specifically visuospatial (Modified Taylor Complex Figure: p = .01, g = .46) and verbal (Rey Auditory Verbal Learning Test: p = .04, g = .42) delayed recall measures. Stepwise discriminant analysis correctly classified the subtype in 65.1% of MCI patients (72.7% specificity, 56.4% sensitivity). Processing speed accounted for more group-associated variance in visuospatial and verbal memory in both MCI subtypes than executive function, while no significant relationships between measures were observed in controls (all ps > .05).
Conclusions:
MCI-LB was characterized by executive dysfunction and slowed processing speed but did not show the visuospatial dysfunction expected, while MCI-AD displayed an amnestic profile. However, there was considerable neuropsychological profile overlap and processing speed mediated performance in both MCI subtypes.
Electroencephalographic (EEG) abnormalities are greater in mild cognitive impairment (MCI) with Lewy bodies (MCI-LB) than in MCI due to Alzheimer’s disease (MCI-AD) and may anticipate the onset of dementia. We aimed to assess whether quantitative EEG (qEEG) slowing would predict a higher annual hazard of dementia in MCI across these etiologies. MCI patients (n = 92) and healthy comparators (n = 31) provided qEEG recordings and underwent longitudinal clinical and cognitive follow-up. Associations between qEEG slowing, measured by increased theta/alpha ratio, and clinical progression from MCI to dementia were estimated with a multistate transition model to account for death as a competing risk, while controlling for age, cognitive function, and etiology classified by an expert consensus panel.
Over a mean follow-up of 1.5 years (SD = 0.5), 14 cases of incident dementia and 5 deaths were observed. Increased theta/alpha ratio on qEEG was associated with increased annual hazard of dementia (hazard ratio = 1.84, 95% CI: 1.01–3.35). This extends previous findings that MCI-LB features early functional changes, showing that qEEG slowing may anticipate the onset of dementia in prospectively identified MCI.
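The abstract does not describe the study's exact spectral pipeline, but the theta/alpha ratio it uses is a standard qEEG measure: band power in the theta range divided by band power in the alpha range, estimated from a power spectral density. A minimal sketch, assuming conventional band edges (theta 4–8 Hz, alpha 8–13 Hz) and Welch's method, which may differ from the authors' implementation:

```python
import numpy as np
from scipy.signal import welch

def theta_alpha_ratio(eeg, fs, theta=(4.0, 8.0), alpha=(8.0, 13.0)):
    """Theta/alpha power ratio for one EEG channel.

    Band edges are conventional defaults, not necessarily the
    study's exact choices; `eeg` is a 1-D array sampled at `fs` Hz.
    """
    # Estimate the power spectral density with Welch's method.
    freqs, psd = welch(eeg, fs=fs, nperseg=min(len(eeg), 2 * int(fs)))

    def band_power(lo, hi):
        # Integrate the PSD over the band [lo, hi).
        mask = (freqs >= lo) & (freqs < hi)
        return np.trapz(psd[mask], freqs[mask])

    return band_power(*theta) / band_power(*alpha)

# Synthetic check: a dominant 10 Hz (alpha) rhythm should give a
# ratio well below 1; a dominant 6 Hz (theta) rhythm, well above 1.
fs = 250
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
alpha_signal = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
theta_signal = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(t.size)
print(theta_alpha_ratio(alpha_signal, fs))
print(theta_alpha_ratio(theta_signal, fs))
```

"qEEG slowing" then corresponds to this ratio increasing, as oscillatory power shifts from the alpha band into the slower theta band.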
The objectives of this study were to develop and refine EMPOWER (Enhancing and Mobilizing the POtential for Wellness and Resilience), a brief manualized cognitive-behavioral, acceptance-based intervention for surrogate decision-makers of critically ill patients and to evaluate its preliminary feasibility, acceptability, and promise in improving surrogates’ mental health and patient outcomes.
Method
Part 1 involved obtaining qualitative stakeholder feedback from 5 bereaved surrogates and 10 critical care and mental health clinicians. Stakeholders were provided with the manual and prompted for feedback on its content, format, and language. Feedback was organized and incorporated into the manual, which was then re-circulated until consensus. In Part 2, surrogates of critically ill patients admitted to an intensive care unit (ICU) reporting moderate anxiety or close attachment were enrolled in an open trial of EMPOWER. Surrogates completed six, 15–20 min modules, totaling 1.5–2 h. Surrogates were administered measures of peritraumatic distress, experiential avoidance, prolonged grief, distress tolerance, anxiety, and depression at pre-intervention, post-intervention, and at 1-month and 3-month follow-up assessments.
Results
Part 1 resulted in changes to the EMPOWER manual, including reducing jargon, improving navigability, making EMPOWER applicable for a range of illness scenarios, rearranging the modules, and adding further instructions and psychoeducation. Part 2 findings suggested that EMPOWER is feasible, with 100% of participants completing all modules. The acceptability of EMPOWER appeared strong, with high ratings of effectiveness and helpfulness (M = 8/10). Results showed immediate post-intervention improvements in anxiety (d = −0.41), peritraumatic distress (d = −0.24), and experiential avoidance (d = −0.23). At the 3-month follow-up assessments, surrogates exhibited improvements in prolonged grief symptoms (d = −0.94), depression (d = −0.23), anxiety (d = −0.29), and experiential avoidance (d = −0.30).
Significance of results
Preliminary data suggest that EMPOWER is feasible, acceptable, and associated with notable improvements in psychological symptoms among surrogates. Future research should examine EMPOWER with a larger sample in a randomized controlled trial.
The first demonstration of laser action in ruby was made in 1960 by T. H. Maiman of Hughes Research Laboratories, USA. Many laboratories worldwide began the search for lasers using different materials, operating at different wavelengths. In the UK, academia, industry and the central laboratories took up the challenge from the earliest days to develop these systems for a broad range of applications. This historical review looks at the contribution the UK has made to the advancement of the technology, the development of systems and components and their exploitation over the last 60 years.
Dopaminergic imaging is an established biomarker for dementia with Lewy bodies, but its diagnostic accuracy at the mild cognitive impairment (MCI) stage remains uncertain.
Aims
To provide robust prospective evidence of the diagnostic accuracy of dopaminergic imaging at the MCI stage to either support or refute its inclusion as a biomarker for the diagnosis of MCI with Lewy bodies.
Method
We conducted a prospective diagnostic accuracy study of baseline dopaminergic imaging with [123I]N-ω-fluoropropyl-2β-carbomethoxy-3β-(4-iodophenyl)nortropane single-photon emission computerised tomography (123I-FP-CIT SPECT) in 144 patients with MCI. Images were rated as normal or abnormal by a panel of experts with access to striatal binding ratio results. Follow-up consensus diagnosis based on the presence of core features of Lewy body disease was used as the reference standard.
Results
At the latest assessment (mean 2 years), 61 patients had probable MCI with Lewy bodies, 26 possible MCI with Lewy bodies and 57 MCI due to Alzheimer's disease. The sensitivity of baseline FP-CIT visual rating for probable MCI with Lewy bodies was 66% (95% CI 52–77%), specificity 88% (76–95%) and accuracy 76% (68–84%), with positive likelihood ratio 5.3.
Conclusions
An abnormal scan is over five times as likely to be found in probable MCI with Lewy bodies as in MCI due to Alzheimer's disease. Dopaminergic imaging appears to be useful at the MCI stage in cases where Lewy body disease is suspected clinically.
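The reported accuracy figures follow from simple counts. As a hedged illustration, the cell counts below are inferred from the reported percentages (61 probable MCI-LB and 57 MCI-AD; sensitivity 66%, specificity 88%) and are not taken from the paper's tables:

```python
# Diagnostic-accuracy arithmetic behind the reported figures.
# Counts inferred from the reported percentages; illustrative only.
true_positives = 40    # abnormal scans among 61 probable MCI-LB
false_negatives = 21   # normal scans among 61 probable MCI-LB
true_negatives = 50    # normal scans among 57 MCI-AD
false_positives = 7    # abnormal scans among 57 MCI-AD

sensitivity = true_positives / (true_positives + false_negatives)
specificity = true_negatives / (true_negatives + false_positives)
accuracy = (true_positives + true_negatives) / (61 + 57)
# Positive likelihood ratio: how much more likely an abnormal scan
# is in probable MCI-LB than in MCI-AD.
lr_positive = sensitivity / (1 - specificity)

print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}, "
      f"accuracy={accuracy:.2f}, LR+={lr_positive:.1f}")
```

With these counts the computation reproduces the abstract's values (sensitivity 0.66, specificity 0.88, accuracy 0.76, LR+ 5.3), and the "over five times as likely" conclusion is exactly the positive likelihood ratio.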
The updated Common Rule for human subjects research requires that consents “begin with a ‘concise and focused’ presentation of the key information that will most likely help someone make a decision about whether to participate in a study” (Menikoff, Kaneshiro, Pritchard. The New England Journal of Medicine. 2017; 376(7): 613–615.). We utilized a community-engaged technology development approach to inform feature options within the REDCap software platform centered around collection and storage of electronic consent (eConsent) to address issues of transparency, clinical trial efficiency, and regulatory compliance for informed consent (Harris, et al. Journal of Biomedical Informatics 2009; 42(2): 377–381.). eConsent may also improve recruitment and retention in clinical research studies by addressing: (1) barriers to accessing rural populations by facilitating remote consent and (2) cultural and literacy barriers by including optional explanatory material (e.g., defining terms by hovering over them with the cursor) or the choice of displaying different videos/images based on a participant’s race, ethnicity, or educational level (Phillippi, et al. Journal of Obstetric, Gynecologic, & Neonatal Nursing. 2018; 47(4): 529–534.).
Methods:
We developed and pilot tested our eConsent framework to provide a personalized consent experience whereby users are guided through a consent document that utilizes avatars, contextual glossary information supplements, and videos, to facilitate communication of information.
Results:
The eConsent framework includes a portfolio of eight features, reviewed by community stakeholders, and tested at two academic medical centers.
Conclusions:
Early adoption and utilization of this eConsent framework have demonstrated acceptability. Next steps will emphasize testing efficacy of features to improve participant engagement with the consent process.
The Holocene portion of the Siple Dome (Antarctica) ice core was dated by interpreting the electrical, visual and chemical properties of the core. The data were interpreted manually and with a computer algorithm. The algorithm interpretation was adjusted to be consistent with atmospheric methane stratigraphic ties to the GISP2 (Greenland Ice Sheet Project 2) ice core, 10Be stratigraphic ties to the dendrochronologically dated 14C record, and the dated volcanic stratigraphy. The algorithm interpretation is more consistent and better quantified than the tedious and subjective manual interpretation.
We measured vertical strain in the firn at Siple Dome, Antarctica, using two systems, both of which measure relative displacements over time of metal markers placed in an air-filled borehole. One system uses a metal-detecting tuned coil, and the other uses a video camera to locate the markers. We compare the merits of the two systems. We combine steady-state calculations and a measured density profile to estimate the true vertical-velocity profile. This allows us to calculate a depth-age scale for the firn at Siple Dome. Our steady-state depth-age scale has ages ≈10-15% younger at any given depth when compared to depth-age scales derived by layer counting in a core 40 m away. The age of a visible ash layer at 97 m in the core is 665 ± 30 years, in agreement with a similar analysis conducted at Taylor Dome, Antarctica, where the same ash is also seen, providing an additional dated tie point between the two cores.
Surgical experiments were conducted on cultured five-node apical rhizome segments of quackgrass. Removal of scale leaves promoted an initial burst of growth within the axillary buds but did not support the continued growth of buds as effectively as removal of the rhizome apex. Replacement of detached scale leaves over denuded buds temporarily repressed the promotive effect of scale leaf removal. Aqueous extracts of scale leaf material inhibited apical growth in rhizome segments but did not inhibit bud growth. Anatomical sections revealed that removal of scale leaves promoted development of buds: cells enlarged, vascular tissues differentiated, and new nodes began to form within 4 days of the removal of scale leaves. It is suggested that scale leaves contribute to apical dominance by inhibiting the initial development of axillary buds.
A growing segment of society is concerned about a myriad of health and environmental issues related to the use of pesticides and other agricultural chemicals. Despite the leveling-off of agricultural chemical use in the 1980s, chemical use in agriculture has come to be seen as a two-edged sword. On the positive side, agricultural chemicals have become the engine for world-wide productivity gains. These chemicals have contributed to increased yields per acre and have reduced waste in storage and distribution. On the negative side, agricultural chemicals are perceived by many to present risks to the safety of the food we eat, to the quality of our drinking water, to the wildlife population, to applicators and to people who inadvertently come into contact with them.
Four soil chronosequences in the southern Great Basin were examined in order to study and quantify soil development during the Quaternary. Soils of all four areas are developed in gravelly alluvial fans in semiarid climates with 8 to 40 cm mean annual precipitation. Lithologies of alluvium are granite-gneiss at Silver Lake, granite and basalt at Cima Volcanic Field, limestone at Kyle Canyon, and siliceous volcanic rocks at Fortymile Wash. Ages of the soils are approximated from several radiometric and experimental techniques, and rates are assessed using a conservative mathematical approach. Average rates for Holocene soils at Silver Lake are about 10 times higher than for Pleistocene soils at Kyle Canyon and Fortymile Wash, based on limited age control. Holocene soils in all four areas appear to develop at similar rates, and Pleistocene soils at Kyle Canyon and Fortymile Wash may differ by only a factor of 2 to 4. Over time spans of several millennia, a preferred model for the age curves is not linear but may be exponential or parabolic, in which rates decrease with increasing age. These preliminary results imply that the geographical variation in rates within the southern Great Basin-Mojave region may be much less significant than temporal variation in rates of soil development. The reasons for temporal variation in rates and processes of soil development are complexly linked to climatic change and related changes in water and dust, erosional history, and internally driven chemical and physical processes.
Dynamic programming techniques were used to evaluate the effects of alternative levels of normal flex acreage requirements on a Midwestern corn-soybean farm and a Southeastern cotton farm. Results indicate that increasing normal flex acres from the current level of 15 percent to 35 percent would provide inducement for farmers in both regions to plant more soybeans. In general, the cotton farm incurs considerably higher expected losses from the change. Thus, there are unequal regional consequences of such a policy change.
Understanding the distribution of gas in and around galaxies is vital for our interpretation of galaxy formation and evolution. As part of the Arecibo Galaxy Environment Survey (AGES) we have observed the neutral hydrogen (HI) gas in and around the nearby Local Group galaxy M33 to a greater depth than previous observations. As part of this project we investigated the absence of optically detected dwarf galaxies in its neighbourhood, which is contrary to predictions of galaxy formation models. We observed 22 discrete clouds, 11 of which were previously undetected and none of which have optically detected counterparts. We find one particularly interesting hydrogen cloud, which has many similar characteristics to hydrogen distributed in the disk of a galaxy. This cloud, if it is at the distance of M33, has an HI mass of around 10^7 M⊙ and a diameter of 18 kpc, making it larger in size than M33 itself.
Recent studies suggest that sand can serve as a vehicle for exposure of humans to pathogens at beach sites, resulting in increased health risks. Sampling for microorganisms in sand should therefore be considered for inclusion in regulatory programmes aimed at protecting recreational beach users from infectious disease. Here, we review the literature on pathogen levels in beach sand, and their potential for affecting human health. In an effort to provide specific recommendations for sand sampling programmes, we outline published guidelines for beach monitoring programmes, which are currently focused exclusively on measuring microbial levels in water. We also provide background on spatial distribution and temporal characteristics of microbes in sand, as these factors influence sampling programmes. First steps toward establishing a sand sampling programme include identifying appropriate beach sites and use of initial sanitary assessments to refine site selection. A tiered approach is recommended for monitoring. This approach would include the analysis of samples from many sites for faecal indicator organisms and other conventional analytes, while testing for specific pathogens and unconventional indicators is reserved for high-risk sites. Given the diversity of microbes found in sand, studies are urgently needed to identify the most significant aetiological agent of disease and to relate microbial measurements in sand to human health risk.
A long-season (160–180 days) cotton variety with a conventional production system was formerly grown in the Texas Coastal Bend Region. Cotton producers in the region used intensive insecticide applications throughout the growing season and harvested in August or September, and occasionally in October. In general, intensive insecticide applications for boll weevil and fleahopper control destroyed the beneficial insects and spiders. Late-season tobacco budworm infestations were thereby aggravated. These late-season insect infestations were a result of the relatively high rainfall during August and September. Moreover, high rainfall during this time not only interfered with harvest, but also reduced both the yield and quality of cotton (Lacewell et al.).
Throughout the southern states and at the federal level, much attention is being focused on the appropriate strategy for controlling cotton insect pests, particularly the boll weevil. This paper presents estimated economic impacts to farmers, regions and consumers of implementing three alternative boll weevil control strategies. One strategy evaluated is a proposed boll weevil eradication program which involves integrating many controls including insecticides, reproduction-diapause control by early season stalk destruction, pheromone-baited traps, trap crops, early season control with insecticide, and massive releases of sterile boll weevils. The plan is to eradicate the boll weevil in the U.S., and then indefinitely maintain a barrier at the U.S.-Mexico border to prevent future weevil immigration to the U.S.