Previous findings suggest that time setting errors (TSEs) in the Clock Drawing Test (CDT) may be related mainly to impairments in semantic and executive function. Recent attempts to dissociate the classic stimulus-bound error (setting the time to “10 to 11” instead of “10 past 11”) from other TSEs did not support hypotheses that this error is primarily executive in nature or that it differs from other TSEs in its neurocognitive correlates. This study aimed to further investigate the cognitive correlates of stimulus-bound errors and other TSEs in order to trace possible underlying cognitive deficits.
Methods:
We examined cognitive test performance of participants with preliminary diagnoses associated with mild cognitive impairment. Among 490 participants, we identified clocks with stimulus-bound errors (n = 78), other TSEs (n = 41), other errors not related to time settings (n = 176), or errorless clocks (n = 195).
Results:
No differences were found on any dependent measure between the stimulus-bound and other-TSE groups. Group comparisons suggested that TSEs in general are associated with lower performance on various cognitive measures, especially semantic and working memory measures. Regression analysis further highlighted semantic and verbal working memory difficulties as the most prominent deficits associated with these errors.
Conclusion:
TSEs in the CDT may indicate underlying deficits in semantic function and working memory. In addition, results support previous findings related to the diagnostic value of TSEs in detecting cognitive impairment.
Quantifying the marine radiocarbon reservoir effect, offsets (ΔR), and ΔR variability over time is critical to improving dating estimates of marine samples while also providing a proxy of water mass dynamics. In the northeastern Pacific, where no high-resolution time series of ΔR has yet been established, we sampled radiocarbon (14C) from exactly dated growth increments in a multicentennial chronology of the long-lived bivalve, Pacific geoduck (Panopea generosa), at the Tree Nob site, coastal British Columbia, Canada. Samples were taken at approximately decadal intervals from 1725 CE to 1920 CE and indicate average ΔR values of 256 ± 22 years (1σ), consistent with existing discrete estimates. Temporal variability in ΔR is small relative to analogous Atlantic records except for an unusually old-water event in 1802–1812. The correlation between ΔR and sea surface temperature (SST) reconstructed from geoduck increment width is weakly significant (r² = .29, p = .03) when the 1802–1812 interval is excluded, indicating that warm water is generally old. The oldest (–2.1σ) ΔR anomaly, which falls within this interval, coincides with the coldest (–2.7σ) anomaly of the temperature reconstruction. An additional 32 14C values spanning 1952–1980 were detrended using a northeastern Pacific bomb pulse curve. Significant positive correlations were identified between the detrended 14C data and annual El Niño Southern Oscillation (ENSO) and summer SST such that cooler conditions are associated with older water. Thus, 14C is generally relatively stable, with weak and potentially inconsistent associations to climate variables, but capable of infrequent excursions, as illustrated by the unusually cold, old-water 1802–1812 interval.
Recent excavations by the Ancient Southwest Texas Project of Texas State University sampled a previously undocumented Younger Dryas component from Eagle Cave in the Lower Pecos Canyonlands of Texas. This stratified assemblage consists of bison (Bison antiquus) bones in association with lithic artifacts and a hearth. Bayesian modeling yields an age of 12,660–12,480 cal BP, and analyses indicate behaviors associated with the processing of a juvenile bison and the manufacture and maintenance of lithic tools. This article presents spatial, faunal, macrobotanical, chronometric, geoarchaeological, and lithic analyses relating to the Younger Dryas component within Eagle Cave. The identification of the Younger Dryas occupation in Eagle Cave should encourage archaeologists to revisit previously excavated rockshelter sites in the Lower Pecos and beyond to evaluate deposits for unrecognized, older occupations.
Background: Carbapenem-resistant Enterobacteriaceae (CRE) are endemic in the Chicago region. We assessed the regional impact of a CRE control intervention targeting high-prevalence facilities, that is, long-term acute-care hospitals (LTACHs) and ventilator-capable skilled nursing facilities (vSNFs). Methods: In July 2017, an academic–public health partnership launched a regional CRE prevention bundle: (1) identifying patient CRE status by querying Illinois’ XDRO registry and by periodic point-prevalence surveys reported to public health; (2) cohorting or private rooms with contact precautions for CRE patients; (3) hand hygiene adherence monitoring combined with general infection control education and guidance from project coordinators and public health; and (4) daily chlorhexidine gluconate (CHG) bathing. Informed by epidemiology and modeling, we targeted LTACHs and vSNFs within a 13-mile radius of the coordinating center. Illinois mandates CRE reporting to the XDRO registry, which can also be manually queried or generate automated alerts to facilitate interfacility communication. The regional intervention promoted increased automation of alerts to hospitals. The prespecified primary outcome was incident clinical CRE cultures reported to the XDRO registry in Cook County by month, analyzed by segmented regression modeling. A secondary outcome was colonization prevalence, measured by serial point-prevalence surveys for carbapenemase-producing organism colonization in LTACHs and vSNFs. Results: All eligible LTACHs (n = 6) and vSNFs (n = 9) participated in the intervention. One vSNF declined CHG bathing. vSNFs that implemented CHG bathing typically bathed residents 2–3 times per week instead of daily. Overall, there were significant gaps in infection control practices, especially in vSNFs. Also, 75 Illinois hospitals adopted automated alerts (56 during the intervention period).
Mean CRE incidence in Cook County decreased from 59.0 cases per month during baseline to 40.6 cases per month during the intervention (P < .001). In a segmented regression model, there was an average reduction of 10.56 cases per month during the 24-month intervention period (P = .02) (Fig. 1), and an estimated 253 incident CRE cases were averted. Mean CRE incidence also decreased among the stratum of vSNF/LTACH intervention facilities (P = .03). However, evidence of ongoing CRE transmission, particularly in vSNFs, persisted, and CRE colonization prevalence remained high at intervention facilities (Table 1). Conclusions: A resource-intensive regional public health CRE intervention that included enhanced interfacility communication and targeted infection prevention was implemented. There was a significant decline in incident clinical CRE cases in Cook County, despite persistently high CRE colonization prevalence in intervention facilities. vSNFs, where understaffing and underresourcing were common and lengths of stay ranged from months to years, faced a major prevalence challenge, underscoring the need for aggressive infection control improvements in these facilities.
Funding: The Centers for Disease Control and Prevention (SHEPheRD Contract No. 200-2011-42037)
Disclosures: M.Y.L. has received research support in the form of contributed product from OpGen and Sage Products (now part of Stryker Corporation), and has received an investigator-initiated grant from CareFusion Foundation (now part of BD).
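The segmented regression analysis of monthly incidence described above can be illustrated with a minimal interrupted time-series fit. This sketch uses ordinary least squares on hypothetical monthly counts; the study's actual model, covariates, and error structure are not specified here, so all numbers are illustrative.

```python
import numpy as np

def segmented_fit(counts, start):
    """Interrupted time-series OLS:
    counts ~ b0 + b1*month + b2*post + b3*months_since_intervention."""
    t = np.arange(len(counts), dtype=float)
    post = (t >= start).astype(float)           # indicator for intervention period
    since = post * (t - start)                  # months elapsed since intervention
    X = np.column_stack([np.ones_like(t), t, post, since])
    beta, *_ = np.linalg.lstsq(X, np.asarray(counts, dtype=float), rcond=None)
    return beta  # [intercept, baseline slope, level change, slope change]

# Hypothetical series: flat baseline near 59 cases/month, then a drop and decline
counts = [59.0] * 12 + [55.0 - 1.5 * m for m in range(12)]
b0, b1, b2, b3 = segmented_fit(counts, start=12)
```

Here b2 captures the immediate level change at the start of the intervention and b3 the change in monthly trend; a real analysis would also report P values and account for autocorrelation.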
Background: Candida auris and carbapenemase-producing organisms (CPO) are multidrug-resistant organisms that can colonize people for prolonged periods and can cause invasive infections and spread in healthcare settings, particularly in high-acuity long-term care facilities. Point-prevalence surveys (PPSs) conducted in long-term acute-care hospitals in the Chicago region identified median prevalence of colonization to be 31% for C. auris and 24% for CPO. Prevalence of C. auris colonization has not been described in pediatric populations in the United States, and limited data exist on CPO colonization in children outside intensive care units. The Chicago Department of Public Health (CDPH) conducted a PPS to assess C. auris and CPO colonization in a pediatric hospital serving high-acuity patients with extended lengths of stay (LOS). Methods: CDPH conducted a PPS in August 2019 in a pediatric hospital with extended LOS to screen for C. auris and CPO colonization. Medical devices (ie, gastrostomy tubes, tracheostomies, mechanical ventilators, and central venous catheters [CVC]) and LOS were documented. Screening specimens consisted of composite bilateral axillae and groin swabs for C. auris and rectal swabs for CPO testing. The Wisconsin State Laboratory of Hygiene tested all specimens. Real-time polymerase chain reaction (PCR) assays were used to detect C. auris DNA and carbapenemase genes: blaKPC, blaNDM, blaVIM, blaOXA-48, and blaIMP (Xpert Carba-R Assay, Cepheid, Sunnyvale, CA). All axillae and groin swabs were processed by PCR and culture to identify C. auris. For CPO, culture was performed only on PCR-positive specimens. Results: Of the 29 patients hospitalized, 26 (90%) had gastrostomy tubes, 24 (83%) had tracheostomies, 20 (69%) required mechanical ventilation, and 3 (10%) had CVCs. Also, 25 (86%) were screened for C. auris and CPO; 4 (14%) lacked parental consent and were not swabbed. Two rectal specimens were unsatisfactory, producing invalid CPO test results.
Median LOS was 35 days (range, 1–300 days). No patients were positive for C. auris. From CPO screening, blaOXA-48 was detected in 1 patient sample, yielding a CPO prevalence of 3.4% (1 of 29). No organism was recovered from the blaOXA-48 positive specimen. Conclusions: This is the first documented screening of C. auris colonization in a pediatric hospital with extended LOS. Despite a high prevalence of C. auris and CPOs in adult healthcare settings of similar acuity in the region, C. auris was not identified and CPOs were rare at this pediatric facility. Additional evaluations in pediatric hospitals should be conducted to further understand C. auris and CPO prevalence in this population.
Mindfulness-based interventions (MBIs), founded on the meditation practices outlined in the Mindfulness-Based Stress Reduction (MBSR) program and historically rooted in contemplative traditions, offer one mental framework to address the unique needs of individuals suffering from the causes and consequences of substance and behavioral addictions. MBIs are considered a third wave of empirically tested psychotherapies, following behavioral therapy and cognitive-behavioral therapy, respectively. MBI-proposed targets of change include self-regulation, self-exploration, and self-liberation; together, these constitute an important set of mental capacities or skills for breaking the cycle of addiction. In this chapter, we describe the development of MBIs adapted for a variety of addictions. We focus on MBIs for substance use disorders (SUD) and binge-eating disorder (BED) because of similarities in their addictive and neurobiological processes (both may be considered substance addictions, with BED as a proxy for food addiction), though other behavioral addictions are also discussed. We then critically review leading experimental trials that test the efficacy of MBIs on mechanisms of addiction and substance use behavior among people diagnosed with SUD and BED. Based on results available to date, treatment effects from MBIs are on par with those of other clinically accepted treatments. However, several methodological limitations make the internal validity and reliability of these results difficult to assess. We discuss strengths and limitations of the evidence to date and provide suggestions for future research, with an emphasis on treatment fidelity and its role in improving the validity of future study findings. We expect our synthesis to inform the public on the value of applying MBIs to remediate the causes and consequences of addictive behavior.
The National Institute of Standards and Technology (NIST) certifies a suite of Standard Reference Materials (SRMs) to be used to evaluate specific aspects of the instrument performance of both X-ray and neutron powder diffractometers. This report describes SRM 640f, the seventh generation of this powder diffraction SRM, which is designed to be used primarily for calibrating powder diffractometers with respect to line position; it also can be used for the determination of the instrument profile function. It is certified with respect to the lattice parameter and consists of approximately 7.5 g of silicon powder prepared to minimize line broadening. A NIST-built diffractometer, incorporating many advanced design features, was used to certify the lattice parameter of the Si powder. Both statistical and systematic uncertainties have been assigned to yield a certified value for the lattice parameter at 22.5 °C of a = 0.5431144 ± 0.000008 nm.
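A line-position standard like this is typically used by comparing peak positions computed from the certified lattice parameter with measured ones. As a sketch, the expected 2θ angles for a cubic material follow from Bragg's law; the Cu Kα1 wavelength below is an assumption for illustration and is not part of the certificate.

```python
import math

CU_KA1_NM = 0.15405929  # assumed Cu K-alpha1 wavelength, not from the certificate

def two_theta_deg(a_nm, hkl, wavelength_nm=CU_KA1_NM):
    """Expected 2-theta (degrees) for a cubic lattice:
    d = a / sqrt(h^2 + k^2 + l^2); Bragg's law: lambda = 2 d sin(theta)."""
    h, k, l = hkl
    d = a_nm / math.sqrt(h * h + k * k + l * l)
    return 2.0 * math.degrees(math.asin(wavelength_nm / (2.0 * d)))

a_si = 0.5431144  # certified lattice parameter of SRM 640f at 22.5 C (nm)
for hkl in [(1, 1, 1), (2, 2, 0), (3, 1, 1)]:
    print(hkl, round(two_theta_deg(a_si, hkl), 3))
```

Calibration then amounts to comparing the observed peak centroids against these reference angles and fitting an instrument correction.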
Large prospective observational studies have cast doubt on the common assumption that endovascular thrombectomy (EVT) is superior to intravenous thrombolysis for patients with acute basilar artery occlusion (BAO). The purpose of this study was to retrospectively review our experience for patients with BAO undergoing EVT with modern endovascular devices.
Methods:
All consecutive patients undergoing EVT with either a second-generation stent retriever or direct aspiration thrombectomy for BAO at our regional stroke center from January 1, 2013 to March 1, 2019 were included. The primary outcome measure was functional outcome at 1 month using the modified Rankin Scale (mRS) score. Multivariable logistic regression was used to assess the association between patient characteristics and dichotomized mRS.
Results:
A total of 43 consecutive patients underwent EVT for BAO. The average age was 67 years, and 61% of patients were male. Overall, 37% (16/43) of patients achieved a good functional outcome. Successful reperfusion was achieved in 72% (31/43) of cases. The median (interquartile range) stroke onset-to-treatment time was 420 (270–639) minutes (7 hours) for all patients. The procedure-related complication rate was 9% (4/43). On multivariable analysis, the posterior circulation Alberta Stroke Program Early Computed Tomography Score and the Basilar Artery on Computed Tomography Angiography score were associated with improved functional outcome.
Conclusion:
EVT appears to be safe and feasible in patients with BAO. Our finding that time to treatment and successful reperfusion were not associated with improved outcome is likely due to including patients with established infarcts. Given the variability of collaterals in the posterior circulation, the paradigm of utilizing a tissue window may assist in patient selection for EVT. Magnetic resonance imaging may be a reasonable option to determine the extent of ischemia in certain situations.
Recent investigations suggest that cerebrovascular reactivity (CVR) is impaired in Alzheimer’s disease (AD) and may underpin part of the disease’s neurovascular component. However, our understanding of the relationship between the magnitude of CVR, the speed of the cerebrovascular response, and the progression of AD is still limited. This is especially true in patients with mild cognitive impairment (MCI), which is recognized as an intermediate stage between normal aging and dementia. The purpose of this study was to investigate AD and MCI patients by mapping repeatable and accurate measures of cerebrovascular function, namely the magnitude and speed of the cerebrovascular response (τ) to a vasoactive stimulus, in key predilection sites for vascular dysfunction in AD.
Methods:
Thirty-three subjects (age range: 52–83 years, 20 males) were prospectively recruited. CVR and τ were assessed using blood oxygen level-dependent MRI during a standardized carbon dioxide stimulus. Temporal and parietal cortical regions of interest (ROIs) were generated from anatomical images using the FreeSurfer image analysis suite.
Results:
Of the 33 subjects recruited, 3 were excluded, leaving 30 for analysis: 6 individuals with early AD, 11 with MCI, and 13 older healthy controls (HCs). τ was significantly higher in the AD group than in the HC group in both the temporal (p = 0.03) and parietal (p = 0.01) cortices, following a one-way ANCOVA correcting for age and microangiopathy score with Bonferroni post hoc correction.
Conclusion:
The study findings suggest that AD is associated with a slowing of the cerebrovascular response in the temporal and parietal cortices.
The National Institute of Standards and Technology (NIST) certifies a suite of Standard Reference Materials (SRMs) to evaluate specific aspects of instrument performance of both X-ray and neutron powder diffractometers. This report describes SRM 660c, the fourth generation of this powder diffraction SRM, which is used primarily for calibrating powder diffractometers with respect to line position and line shape for the determination of the instrument profile function (IPF). It is certified with respect to lattice parameter and consists of approximately 6 g of lanthanum hexaboride (LaB6) powder. To make this SRM applicable to the neutron diffraction community, the powder was prepared from an isotopically enriched 11B precursor material. The microstructure of the LaB6 powder was engineered specifically to yield a crystallite size above that where size broadening is typically observed and to minimize the crystallographic defects that lead to strain broadening. A NIST-built diffractometer, incorporating many advanced design features, was used to certify the lattice parameter of the LaB6 powder. Both Type A, statistical, and Type B, systematic, uncertainties have been assigned to yield a certified value for the lattice parameter at 22.5 °C of a = 0.415 682 6 ± 0.000 008 nm (95% confidence).
Apolipoprotein E (APOE) E4 is the main genetic risk factor for Alzheimer’s disease (AD). Given this consistent association, there is interest in whether E4 also influences the risk of other neurodegenerative diseases. Further, there is an ongoing search for other genetic biomarkers contributing to these phenotypes, such as microtubule-associated protein tau (MAPT) haplotypes. Here, participants from the Ontario Neurodegenerative Disease Research Initiative were genotyped to investigate whether the APOE E4 allele or MAPT H1 haplotype is associated with five neurodegenerative diseases: (1) AD and mild cognitive impairment (MCI), (2) amyotrophic lateral sclerosis, (3) frontotemporal dementia (FTD), (4) Parkinson’s disease, and (5) vascular cognitive impairment.
Methods:
Genotypes were defined for their respective APOE allele and MAPT haplotype calls for each participant, and logistic regression analyses were performed to identify the associations with the presentations of neurodegenerative diseases.
Results:
Our work confirmed the association of the E4 allele with a dose-dependent increased presentation of AD, and an association between the E4 allele alone and MCI; however, the other four diseases were not associated with E4. Further, the APOE E2 allele was associated with decreased presentation of both AD and MCI. No associations were identified between MAPT haplotype and the neurodegenerative disease cohorts, but following subtyping of the FTD cohort, the H1 haplotype was significantly associated with progressive supranuclear palsy.
Conclusion:
This is the first study to concurrently analyze the association of APOE isoforms and MAPT haplotypes with five neurodegenerative diseases using consistent enrollment criteria and broad phenotypic analysis.
The Comprehensive Assessment of Neurodegeneration and Dementia (COMPASS-ND) cohort study of the Canadian Consortium on Neurodegeneration in Aging (CCNA) is a national initiative to catalyze research on dementia, set up to support the research agendas of CCNA teams. This cross-country longitudinal cohort of 2310 deeply phenotyped subjects with various forms of dementia and mild memory loss or concerns, along with cognitively intact elderly subjects, will test hypotheses generated by these teams.
Methods:
The authors drew on the COMPASS-ND protocol, the initial grant proposal for funding, the fifth semi-annual CCNA Progress Report submitted to the Canadian Institutes of Health Research in December 2017, and other documents, supplemented by modifications made and lessons learned after implementation, to create the description of the study provided here.
Results:
The CCNA COMPASS-ND cohort includes participants from across Canada with various cognitive conditions associated with, or at risk of, neurodegenerative diseases. They will undergo a wide range of experimental, clinical, imaging, and genetic investigations to specifically address the causes, diagnosis, treatment, and prevention of these conditions in the aging population. Data derived from clinical and cognitive assessments, biospecimens, brain imaging, genetics, and brain donations will be used to test hypotheses generated by CCNA research teams and other Canadian researchers. It is the most comprehensive and ambitious Canadian study of dementia to date. Initial data posting occurred in 2018, with the full cohort to be accrued by 2020.
Conclusion:
Availability of data from the COMPASS-ND study will provide a major stimulus for dementia research in Canada in the coming years.
To evaluate the psychometric properties of HEARTSMAP, an emergency psychosocial assessment and management tool, and its impact on patient care and flow measures.
Methods
We conducted the study in two phases: first validating the tool using extracted information from a retrospective cohort, then evaluating implementation in a prospective cohort of youth presenting with mental health complaints to a tertiary pediatric emergency department (PED). In phase 1, six PED clinicians applied HEARTSMAP to extracted narratives, and we calculated inter-rater agreement for referral recommendations using Cohen’s kappa, as well as the sensitivity and specificity for identifying youth requiring psychiatric consultation and hospitalization. In phase 2, PED clinicians prospectively used HEARTSMAP, and we assessed the impact of the tool’s implementation on patient-related outcomes and emergency department (ED) flow measures.
Results
We found substantial agreement (κ = 0.7) for cases requiring emergent psychiatric consultation and moderate agreement for cases requiring urgent and non-urgent community follow-up (κ = 0.4 each). The sensitivity was 76% (95% CI: 63%–90%) and the specificity was 65% (95% CI: 55%–71%) using retrospective cases. During pilot implementation, 62 patients received HEARTSMAP assessments: 46 (74%) of the assessments triggered a recommendation for ED psychiatry assessment, 39 (63%) of patients were evaluated by psychiatry, and 13 (21%) were admitted. At follow-up, all patients for whom HEARTSMAP had triggered recommendations had accessed community resources. Of those hospitalized for further psychiatric care at their index or return visit within 30 days, 100% had been identified by HEARTSMAP at the index visit as requiring ED psychiatric consultation.
Conclusions
HEARTSMAP has strong reliability and, when applied prospectively, is a safe and effective management tool.
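For readers unfamiliar with the agreement statistics reported above, Cohen's kappa and sensitivity/specificity can be computed as below. This is a generic sketch with made-up ratings and counts; it is not the study's data or analysis code.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n        # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)  # chance agreement
    return (p_o - p_e) / (1 - p_e)

def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative only: two raters' yes/no referral calls on four cases
kappa = cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0])
```

Landis and Koch's conventional labels (moderate for 0.41–0.60, substantial for 0.61–0.80) are what terms like "substantial agreement" in the abstract refer to.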
The World Alzheimer Report 2016 estimated that 47 million people are living with dementia worldwide (Alzheimer's Disease International, 2016). In the inaugural World Health Organization Ministerial Conference on Global Action against Dementia, six of the top ten research priorities were focused on prevention, identification, and reduction of dementia risk, and on delivery and quality of care for people with dementia and their carers (Shah et al., 2016). While the Lancet Neurology Commission has suggested that even minor advances to delay progression or ameliorate symptoms might have substantial financial and societal benefits (Winblad et al., 2016), advances have been slow.
Increasingly, ambulance services offer alternatives to transfer to the emergency department (ED), when this is better for patients. The introduction of electronic health records (EHR) in ambulance services is encouraged by national policy across the United Kingdom (UK) but roll-out has been variable and complex.
Electronic Records in Ambulances (ERA) is a two-year study that aims to investigate and describe the opportunities and challenges of implementing EHR and associated technology in ambulances to support a safe and effective shift to out-of-hospital care, including the implications for the workforce in terms of training, roles, and clinical decision-making skills.
METHODS:
Our study includes a scoping review of relevant issues and a baseline assessment of progress in all UK ambulance services in implementing EHR. These will inform four in-depth case studies of services at different stages of implementation, assessing current usage, and examining context.
RESULTS:
The scoping review identified themes including: there are many perceived potential benefits of EHR, such as improved safety and remote diagnostics, but as yet little evidence of them; technical challenges to implementation may inhibit uptake and lead to increased workload in the short term; staff implementing EHR may do so selectively or devise workarounds; and EHR may be perceived as a tool of staff surveillance.
CONCLUSIONS:
Our scoping review identified some complex issues around the implementation of EHR and the relevant challenges, opportunities and workforce implications. These will help to inform our fieldwork and subsequent data analysis in the case study sites, to begin early in 2017. Lessons learned from the experience of implementing EHR so far should inform future development of information technology in ambulance services, and help service providers to understand how best to maximize the opportunities offered by EHR to redesign care.
By Jacqui Ala, senior lecturer in International Relations at the University of the Witwatersrand, Johannesburg, and David Black, Lester B. Pearson professor of International Development Studies in the Department of Political Science at Dalhousie University, Canada.
From a constructivist perspective the causes and manifestations of inequality are multidimensional. Different populations affected by systemic inequality are disadvantaged through a convergence of socio-economic, cultural and political factors that are also historically and contextually bound. Moreover, inequality is not experienced the same way either horizontally between different groups or vertically among people within the same group. Constructivism allows for a more nuanced understanding of the causes and consequences of inequality for particular groups within society.
Relatively little attention has been paid by development studies to issues concerning people with disabilities, in South Africa and elsewhere. In the transitional and immediately post-apartheid years of the 1990s, some progress was made by government in addressing the development needs of people with disabilities – indeed, South Africa was regarded internationally as a leader in addressing the rights of the disabled. However, this focus has waned. We will argue in this chapter that the accommodation of people with disabilities in the South African political economy was critically compromised by the country's post-apartheid embrace of several key policy choices. Most fundamentally, South Africa's initial vision of a society built on social democratic principles has been eroded by the awkward marriage of neoliberal economics to these ideals – a step initially marked by the abrupt abandonment of the Reconstruction and Development Programme (RDP) for the Growth, Employment and Redistribution (Gear) strategy in 1996 (see Naidoo 2010; Marais 2011). In this context, a development framework based on human rights and social justice could not be effectively implemented, and the government's commitments to people with disabilities concomitantly waned. The steady spread of corruption and maladministration has only served to make the situation worse.
DISABILITY AND INEQUALITY IN SOUTH AFRICA
Despite variations and nuances in the manifestations of inequality, it can be firmly observed in South Africa – as elsewhere – that disabled people are typically the most disadvantaged within the various social categories of difference and inequity (Yeo and Moore 2003; Graham et al. 2013; Loeb et al. 2008; Leibbrandt et al. 2010). Despite wide-ranging constitutional and legislative provisions aiming to guarantee the rights of people with disabilities in South Africa, the disabled remain marginalised socially and economically.
The aim of this analysis was to test whether changes in insomnia symptoms and global sleep quality are associated with coinciding changes in depressed mood among older adults. We report results from a secondary analysis of longitudinal data from a clinical trial of older adults (N = 49) aged 55 to 80 years who reported at least moderate levels of sleep problems. All measures were collected at baseline and again at the end of the trial ten weeks later. We computed change scores for two separate measures of disturbed sleep, the Athens Insomnia Scale (AIS) and the Pittsburgh Sleep Quality Index (PSQI), and tested their association with change in depressed mood (Beck Depression Inventory-II; BDI-II) in two separate linear regression models adjusted for covariates related to sleep (sex, age, body mass index, and NF-κB as a biological marker previously correlated with insomnia and depression). Change in AIS scores was associated with change in BDI-II scores (β = 0.38, p < 0.01). Change in PSQI scores was not significantly associated with change in BDI-II scores (β = 0.17, p = 0.26). Our findings suggest that improvements in insomnia symptoms, rather than in global sleep quality, coincide with improvement in depressed mood among older adults over ten weeks.
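The change-score models described above amount to ordinary least squares on difference scores. This is a minimal sketch on simulated data; the coefficient value, covariates, and noise level are invented for illustration and only echo the structure of the analysis (the actual models also adjusted for sex, BMI, and NF-κB).

```python
import numpy as np

def ols(y, columns):
    """OLS fit; returns coefficients with the intercept first."""
    X = np.column_stack([np.ones(len(y))] + list(columns))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

rng = np.random.default_rng(0)
n = 49
d_sleep = rng.normal(size=n)          # simulated change in insomnia score
age = rng.normal(65.0, 7.0, size=n)   # simulated covariate
# Simulated change in mood, built with a known sleep-change coefficient of 0.38
d_mood = 0.38 * d_sleep + 0.01 * age + rng.normal(scale=0.1, size=n)
beta = ols(d_mood, [d_sleep, age])    # beta[1] estimates the sleep-change effect
```

With low noise the fitted beta[1] recovers the simulated coefficient; in the real analysis this coefficient is the standardized β reported for each sleep measure.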
Greenhouse studies were conducted to determine the host status of weed species for Rhizoctonia solani AG-1, which causes Rhizoctonia foliar blight (RFB) of soybean. Weed species were barnyardgrass, broadleaf signalgrass, common cocklebur, entireleaf morningglory, hemp sesbania, itchgrass, johnsongrass, large crabgrass, northern jointvetch, prickly sida, purple nutsedge, redweed, sicklepod, and smooth pigweed. Seedling weeds were inoculated with suspensions containing intraspecific group IA and IB isolates of the fungus. In the first study, sclerotia of IA were recovered from tissue of all weeds except smooth pigweed, and mycelia of IA were recovered from all except smooth pigweed and redweed. In that study, neither microsclerotia nor mycelia of IB were recovered from sicklepod, barnyardgrass, or large crabgrass, and only microsclerotia were recovered from itchgrass and purple nutsedge. In the second study, sclerotia of IA, microsclerotia of IB, and mycelia of each isolate were recovered from all weed species. In other studies, R. solani spread from at least six of seven weed species to a noninfected soybean plant growing in close proximity. These studies emphasize the importance of weed control, not only for reducing plant competition and increasing yield, but also for its potential impact on development of RFB.
Field studies evaluated the response of soybean to Rhizoctonia foliar blight (RFB) in combination with varying densities of common cocklebur, hemp sesbania, or johnsongrass. Soybean plants at both V10 and R1 growth stages were either left noninoculated or were inoculated with suspensions containing equal concentrations of Rhizoctonia solani AG-1 IA and IB mycelia. Intensity of RFB was rated weekly beginning at the V1 soybean growth stage, and data were used to determine area under disease progress curves. Intensity of RFB was greater in 1993 than in 1994. When averaged across weed species and weed densities, soybean yield in 1993 was reduced 18% in plots inoculated with R. solani compared with those not inoculated. Intensity of RFB, however, did not differ between inoculated and noninoculated plots in 1994. Interactions between R. solani and weed density for RFB intensity and yield were not significant in either year. Soybean yields in 1994, however, were reduced by hemp sesbania and johnsongrass in inoculated plots. Soybean maturity was delayed in both years when hemp sesbania was present.
Acifluorfen, alachlor, glufosinate, glyphosate, paraquat, and pendimethalin were evaluated for their effects on mycelial growth and sclerotia/microsclerotia production by Rhizoctonia solani AG-1 IA and IB in culture. All of these herbicides except glufosinate and glyphosate were evaluated for effects on severity of Rhizoctonia foliar blight of soybean in the field. In laboratory studies, all herbicides reduced colony radius of R. solani. Growth reductions for IB were greater than for IA in the presence of pendimethalin, alachlor, and acifluorfen, but glufosinate reduced growth of IA more than IB. Sclerotia production by both isolates was prevented by paraquat, greatly reduced by glufosinate, but markedly less affected by the other herbicides tested. In field studies, all tested herbicides influenced severity of Rhizoctonia foliar blight when disease pressure was low, but only paraquat reduced severity when disease pressure was high.