Diagnosing HIV-Associated Neurocognitive Disorders (HAND) requires attributing neurocognitive impairment and functional decline at least partly to HIV-related brain effects. Depressive symptom severity, whether attributable to HIV or not, may influence self-reported functioning. We examined longitudinal relationships among objective global cognition, depressive symptom severity, and self-reported everyday functioning in people with HIV (PWH).
Methods:
Longitudinal data from 894 PWH were collected at a university-based research center (2002–2016). Participants completed self-report measures of everyday functioning to assess both dependence in instrumental activities of daily living (IADL) and subjective cognitive difficulties at each visit, along with depressive symptom severity (BDI-II). Multilevel modeling examined within- and between-person predictors of self-reported everyday functioning outcomes.
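As an illustrative sketch only (not the authors' code), a multilevel logistic model of the kind described, with cognition split into between-person means and within-person deviations and a cross-level interaction with mean depressive symptom severity, might be specified in R as follows; the data and all variable names are hypothetical.

```r
library(dplyr)
library(lme4)

# Hypothetical long-format data: one row per visit.
set.seed(1)
d <- data.frame(
  id          = rep(1:200, each = 6),
  visit_years = rep(0:5, times = 200),
  global_T    = rnorm(1200, 45, 8),     # visit-specific global cognition (T-score)
  bdi_total   = rpois(1200, 10)         # visit-specific BDI-II total
)
d$iadl_dep <- rbinom(1200, 1, plogis(-1 - 0.03 * (d$global_T - 45)))

d <- d %>%
  group_by(id) %>%
  mutate(cog_between = mean(global_T),            # person-mean (between-person) cognition
         cog_within  = global_T - cog_between,    # visit-level deviation (within-person)
         bdi_between = mean(bdi_total)) %>%       # mean depressive symptom severity
  ungroup()

# Cross-level interaction: does mean depression moderate the within-person
# association between cognition and IADL dependence?
fit <- glmer(iadl_dep ~ cog_within * bdi_between + cog_between + visit_years +
               (1 | id),
             data = d, family = binomial)
summary(fit)
```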
Results:
Participants averaged 6 visits over 5 years. Multilevel regression showed a significant interaction between visit-specific global cognitive performance and mean depressive symptom severity on the likelihood of IADL dependence (p = 0.04), such that the within-person association between worse cognition and greater likelihood of IADL dependence was strongest among individuals with lower mean depressive symptom severity. In contrast, participants with higher mean depressive symptom severity had a higher likelihood of IADL dependence regardless of cognition. Multilevel modeling of subjective cognitive difficulties showed no significant interaction between global cognition and mean depressive symptom severity (p > 0.05).
Conclusions:
The findings indicate a link between cognitive abilities and IADL dependence in PWH with low to moderate depressive symptoms. However, those with higher depressive symptom severity report IADL dependence regardless of cognitive status. This is clinically significant because everyday functioning is measured through self-report rather than performance-based assessments.
Identifying persons with HIV (PWH) at increased risk for Alzheimer’s disease (AD) is complicated because memory deficits are common in HIV-associated neurocognitive disorders (HAND) and a defining feature of amnestic mild cognitive impairment (aMCI; a precursor to AD). Recognition memory deficits may be useful in differentiating these etiologies. Therefore, neuroimaging correlates of different memory deficits (i.e., recall, recognition) and their longitudinal trajectories in PWH were examined.
Design:
We examined 92 PWH from the CHARTER Program, ages 45–68, without severe comorbid conditions, who received baseline structural MRI and baseline and longitudinal neuropsychological testing. Linear and logistic regression examined neuroanatomical correlates (i.e., cortical thickness and volumes of regions associated with HAND and/or AD) of memory performance at baseline, and multilevel modeling examined neuroanatomical correlates of memory decline (average follow-up = 6.5 years).
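As an illustrative sketch (not the study's code), the longitudinal part of such an analysis, with baseline cortical thickness predicting the slope of memory change, could be set up with a multilevel model in R as below; the data and variable names are hypothetical.

```r
library(lme4)

# Hypothetical long-format data: repeated memory T-scores per participant,
# with a single baseline thickness value carried across visits.
set.seed(2)
long <- data.frame(
  id        = rep(1:92, each = 4),
  years     = rep(c(0, 2, 4, 6.5), times = 92),
  thick_ifg = rep(rnorm(92, 2.5, 0.2), each = 4)   # e.g., baseline pars opercularis thickness
)
long$recall_T <- 45 + 0.1 * long$years + rnorm(nrow(long), 0, 5)

# The years x thickness term asks whether baseline thickness predicts the
# rate of change in delayed recall; random intercepts and slopes per person.
fit <- lmer(recall_T ~ years * thick_ifg + (1 + years | id), data = long)
summary(fit)
```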
Results:
At baseline, thinner pars opercularis cortex was associated with impaired recognition (p = 0.012; p = 0.060 after correcting for multiple comparisons). Worse delayed recall was associated with thinner pars opercularis (p = 0.001) and thinner rostral middle frontal cortex (p = 0.006) cross-sectionally, even after correcting for multiple comparisons. Delayed recall and recognition were not associated with medial temporal lobe (MTL), basal ganglia, or other prefrontal structures. Recognition impairment was variable over time, and there was little decline in delayed recall. Baseline MTL and prefrontal structures were not associated with delayed recall.
Conclusions:
Episodic memory was associated with prefrontal structures, and MTL and prefrontal structures did not predict memory decline. There was relative stability in memory over time. Findings suggest that episodic memory is more related to frontal structures than to encroaching AD pathology in middle-aged PWH. Additional research should clarify whether recognition is clinically useful for differentiating aMCI and HAND.
Translatability of preclinical results remains a major obstacle in neuropsychiatric research. Even when cognitive tests in preclinical models show translational validity for human testing, with sensitivity to clinical deficits, there remains the issue of heterogeneity among human participants. Norming of performance on cognitive tasks enables corrections for any differences in performance that may arise from the influence of socioeconomic factors, and thus allows a more direct comparison with preclinical testing results. The 5-choice continuous performance task (5C-CPT) is a test sensitive to changes in sustained attention and cognitive control in rodent manipulations and in clinical populations, including schizophrenia and bipolar disorder. Herein, we present normed results of 5C-CPT data from a cohort of human participants, enabling greater comparison with future clinical and rodent testing.
Participants and Methods:
5C-CPT data were generated from participants in the Translational Methamphetamine AIDS Research Center (n=82) and a study of bipolar disorder (n=45). Participant demographics were as follows: age M=38.5, SD=16.7; education M=14.5, SD=1.9; 45% female; 10% Asian, 17% African American, 27% Hispanic, and 46% non-Hispanic White. We used the test2norm R package to create norms for each of the major outcomes of the 5C-CPT. Non-normally distributed raw scores were transformed to generate the more normally distributed data needed for the norming process. Raw scores were first converted into uniform scaled scores ranging from 0 to 20, where a higher score indicates better performance. We then generated T-score formulas; T-scores are standardized residuals scaled to have a mean of 50 and a standard deviation of 10. The residuals are obtained from regressions of scaled scores on the demographic variables a user wishes to control for (gender, age, education, ethnicity, etc.), modeled using the multiple fractional polynomial (MFP) method. MFP models can fit non-linear effects for numeric demographic factors (e.g., age), if such effects exist.
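To make the norming pipeline concrete, here is a simplified sketch of the scaled-score and T-score logic described above; it is not the test2norm implementation, it substitutes an ordinary linear regression for the MFP step, and the data and variable names are hypothetical.

```r
set.seed(3)
d <- data.frame(
  dprime = rnorm(127, 2, 0.6),              # raw 5C-CPT outcome
  age    = round(runif(127, 18, 70)),
  female = rbinom(127, 1, 0.45)
)

# Rank-based normalization of raw scores to scaled scores
# (mean 10, SD 3, higher = better).
z_rank   <- qnorm((rank(d$dprime) - 0.5) / nrow(d))
d$scaled <- 10 + 3 * z_rank

# Regress scaled scores on the demographics to be corrected for (the actual
# pipeline fits multiple fractional polynomial models here), then convert the
# standardized residuals to T-scores (mean 50, SD 10).
fit       <- lm(scaled ~ age + female, data = d)
res       <- d$scaled - predict(fit)
d$t_score <- 50 + 10 * res / sd(res)
```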
Results:
New, demographically corrected T-score formulas were calculated for each major outcome of the 5C-CPT: reaction time (MCL), reaction time variability (VarRT), dprime, hit rate (HR) and false-alarm rate (FAR). MFP models showed that age had a significant effect on MCL, VarRT, dprime, and HR (all p<0.01), while gender only showed a significant effect for MCL and VarRT (all p<0.05). Interestingly, education and ethnicity did not show a significant effect for any MFP model and none of the demographic factors (age, education, gender, ethnicity) were significant in the model for FAR. As defined in the test2norm package, all scaled scores had a mean of 10 and SD of 3 and all T-scores had a mean of 50 and SD of 10.
Conclusions:
The 5C-CPT is a test of attention and cognitive control available for human testing, reverse-translated from rodent studies. The normative data generated here will enable future comparisons of data without the need for additional control studies. Furthermore, applying these norms to data from experimental manipulations will enable more direct comparison with rodent testing, making changes relative to baseline more meaningful. Thus, the 5C-CPT is a viable tool for conducting cross-species translational research toward developing novel therapeutics that treat deficits in attention and cognitive control.
Among people with HIV (PWH), the apolipoprotein e4 (APOE-e4) allele, a genetic marker associated with Alzheimer’s disease (AD), and self-reported family history of dementia (FHD), considered a proxy for higher AD genetic risk, are independently associated with worse neurocognition. However, research has not addressed the potential additive effect of FHD and APOE-e4 on global and domain-specific neurocognition among PWH. Thus, the aim of the current investigation was to examine the associations between FHD, APOE-e4, and neurocognition among PWH.
Participants and Methods:
283 PWH (Mage=50.9; SDage=5.6) from the CNS HIV Anti-Retroviral Therapy Effects Research (CHARTER) study completed comprehensive neuropsychological and neuromedical evaluations and underwent APOE genotyping. APOE status was dichotomized into APOE-e4+ and APOE-e4-; APOE-e4+ status included heterozygous and homozygous carriers. Participants completed a free-response question capturing FHD of a first- or second-degree relative (i.e., biological parent, sibling, child, grandparent, grandchild, uncle, aunt, nephew, niece, half-sibling). A dichotomized (yes/no) FHD variable was used in analyses. Neurocognition was measured using global and domain-specific demographically corrected (i.e., age, education, sex, race/ethnicity) T-scores. t-tests were used to compare global and domain-specific demographically corrected T-scores by FHD status and APOE-e4 status. A 2x2 factorial analysis of variance (ANOVA) was used to model the interactive effects of FHD and APOE-e4 status. Tukey’s HSD test was used to follow up on significant ANOVAs.
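As an illustrative sketch only (hypothetical data and variable names, not the study's code), the group comparisons described could be run in R along these lines:

```r
set.seed(4)
d <- data.frame(
  exec_T = rnorm(283, 48, 9),                                      # e.g., executive functioning T-score
  fhd    = factor(rbinom(283, 1, 0.4), labels = c("FHD-", "FHD+")),
  apoe4  = factor(rbinom(283, 1, 0.3), labels = c("e4-", "e4+"))
)

t.test(exec_T ~ fhd,   data = d)   # domain T-scores by family history of dementia
t.test(exec_T ~ apoe4, data = d)   # domain T-scores by APOE-e4 carrier status

# 2x2 factorial ANOVA with Tukey HSD follow-up on the four FHD/APOE-e4 groups.
fit <- aov(exec_T ~ fhd * apoe4, data = d)
summary(fit)
TukeyHSD(fit, "fhd:apoe4")
```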
Results:
Results revealed significant differences by FHD status in executive functioning (t(281)=-2.3, p=0.03) and motor skills (t(278)=-2.0, p=0.03), such that FHD+ participants performed worse than FHD- participants. Differences in global neurocognition by FHD status approached significance (t(281)=-1.8, p=0.069). Global and domain-specific neurocognitive performance was comparable between APOE-e4 carriers and noncarriers (ps>0.05). Results evaluating the interactive effects of FHD and APOE-e4 showed a significant difference in motor skills (F(3)=2.7, p=0.04) between the FHD-/APOE-e4+ and FHD+/APOE-e4- groups, such that the FHD+/APOE-e4- group performed worse than the FHD-/APOE-e4+ group (p=0.02).
Conclusions:
PWH with FHD exhibited worse neurocognitive performance in the domains of executive functioning and motor skills; however, there were no significant differences in neurocognition between APOE-e4 carriers and noncarriers. Furthermore, global neurocognitive performance was comparable across FHD/APOE-e4 groups. Differences between the FHD-/APOE-e4+ and FHD+/APOE-e4- groups in motor skills were likely driven by FHD status, considering there were no independent effects of APOE-e4 status. This suggests that FHD may be a predispositional risk factor for poor neurocognitive performance among PWH. Considering that FHD is more easily captured through self-report than blood-based APOE-e4 status, PWH with FHD should be more closely monitored. Future research is warranted to address the potential additive effect of FHD and APOE-e4 on rates of global and domain-specific neurocognitive decline and impairment over time in an older cohort of PWH, where APOE-e4 status may have stronger effects.
Many people with HIV (PWH) are at risk for age-related neurodegenerative disorders such as Alzheimer’s disease (AD). Studies on the association between cognition, neuroimaging outcomes, and the Apolipoprotein E4 (APOE4) genotype, which is associated with greater risk of AD, have yielded mixed results in PWH; however, many of these studies have examined a wide age range of PWH and have not examined APOE by race interactions that are observed in HIV-negative older adults. Thus, we examined how APOE status relates to cognition and medial temporal lobe (MTL) structures (implicated in AD pathogenesis) in mid- to older-aged PWH. In exploratory analyses, we also examined race (African American (AA)/Black and non-Hispanic (NH) White) by APOE status interactions on cognition and MTL structures.
Participants and Methods:
The analysis included 88 PWH between the ages of 45 and 68 (mean age=51±5.9 years; 86% male; 51% AA/Black, 38% NH-White, 9% Hispanic/Latinx, 2% other) from the CNS HIV Antiretroviral Therapy Effects Research multi-site study. Participants underwent APOE genotyping, neuropsychological testing, and structural MRI; APOE groups were defined as APOE4+ (at least one APOE4 allele) and APOE4- (no APOE4 alleles). Eighty-nine percent of participants were on antiretroviral therapy, 74% had undetectable plasma HIV RNA (<50 copies/ml), and 25% were APOE4+ (32% of AA/Black and 15% of NH-White participants). Neuropsychological testing assessed seven domains, and demographically corrected T-scores were calculated. FreeSurfer 7.1.1 was used to measure MTL structures (hippocampal volume, entorhinal cortex thickness, and parahippocampal thickness), and the effect of scanner was regressed out prior to analyses. Multivariable linear regressions tested the association between APOE status and cognitive and imaging outcomes. Models examining cognition covaried for comorbid conditions and HIV disease characteristics related to global cognition (i.e., AIDS status, lifetime methamphetamine use disorder). Models examining the MTL covaried for age, sex, and relevant imaging covariates (i.e., intracranial volume or mean cortical thickness).
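A minimal sketch of the imaging analysis described, with scanner effects regressed out of the MTL measure before a covariate-adjusted regression on APOE status, is shown below; the data and variable names are hypothetical, not the study's code.

```r
set.seed(5)
d <- data.frame(
  entorhinal = rnorm(88, 3.4, 0.3),                       # entorhinal cortex thickness (mm)
  scanner    = factor(sample(1:3, 88, replace = TRUE)),
  apoe4      = rbinom(88, 1, 0.25),
  age        = round(runif(88, 45, 68)),
  male       = rbinom(88, 1, 0.86),
  mean_thick = rnorm(88, 2.5, 0.1)
)

# Residualize the imaging outcome on scanner prior to analysis.
d$entorhinal_adj <- resid(lm(entorhinal ~ scanner, data = d)) + mean(d$entorhinal)

# Multivariable regression of the adjusted outcome on APOE status and covariates.
fit <- lm(entorhinal_adj ~ apoe4 + age + male + mean_thick, data = d)
summary(fit)
```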
Results:
The APOE4+ group had worse learning (β=-0.27, p=.01) and delayed recall (β=-0.25, p=.02) than the APOE4- group, but APOE status was not significantly associated with any other domain (ps>0.24). APOE4+ status was also associated with thinner entorhinal cortex (β=-0.24, p=.02). APOE status was not significantly associated with hippocampal volume (β=-0.08, p=.32) or parahippocampal thickness (β=-0.18, p=.08). Lastly, race interacted with APOE status such that the negative association between APOE4+ status and cognition was stronger in NH-White PWH than in AA/Black PWH for learning, delayed recall, and verbal fluency (ps<0.05). There were no APOE by race interactions for any MTL structures (ps>0.10).
Conclusions:
Findings suggest that APOE4 carrier status is associated with worse episodic memory and thinner entorhinal cortex in mid- to older-aged PWH. While APOE4+ groups were small, we found that APOE4 carrier status had a larger association with cognition in NH-White PWH as compared to AA/Black PWH, consistent with studies demonstrating an attenuated effect of APOE4 in AA/Black HIV-negative older adults. These findings further highlight the importance of recruiting diverse samples and suggest exploring other genetic markers (e.g., ABCA7) that may be more predictive of AD in some races to better understand AD risk in diverse groups of PWH.
The Mormon cricket (MC), Anabrus simplex Haldeman, 1852 (Orthoptera: Tettigoniidae), has a long and negative history with agriculture in Utah and other western states of the USA. Most A. simplex populations migrate in large groups, and their feeding can cause significant damage to forage plants and cultivated crops. Chemical pesticides are often applied, but some settings (e.g. habitats of threatened and endangered species) call for non-chemical control measures. Studies in Africa, South America, and Australia have assessed certain isolates of Metarhizium acridum as very promising pathogens for Orthoptera: Acrididae (locust) biocontrol. In the current study, two isolates of Metarhizium robertsii, one isolate of Metarhizium brunneum, one isolate of Metarhizium guizhouense, and three isolates of M. acridum were tested for infectivity to MC nymphs and adults of either sex. Based on the speed of mortality, M. robertsii (ARSEF 23 and ARSEF 2575) and M. brunneum (ARSEF 7711) were the most virulent to second- through fifth-instar MC nymphs. M. guizhouense (ARSEF 7847) from Arizona was intermediate, and the M. acridum isolates (ARSEF 324, 3341, and 3609) were the slowest killers. ARSEF 2575 was also the most virulent to sixth- and seventh-instar nymphs and adult MC. All of the isolates, at a conidial concentration of 1 × 10⁷ conidia ml⁻¹, induced approximately 100% mortality by 6 days post application of fungal conidia. In conclusion, isolates ARSEF 23, ARSEF 2575, and ARSEF 7711 acted most rapidly to kill MC under laboratory conditions. The M. acridum isolates, however, have much higher tolerance to heat and UV-B radiation, which may be critical to their successful use in field applications.
Balloon valvuloplasty and surgical aortic valvotomy have been the treatment mainstays for congenital aortic stenosis in children. Choice of intervention often depends on centre bias, and relevant comparative literature is limited.
Objectives:
This study aims to provide an unbiased, contemporary matched comparison of these balloon and surgical approaches.
Methods:
Retrospective analysis of patients with congenital aortic valve stenosis who underwent balloon valvuloplasty (Queensland Children’s Hospital, Brisbane) or surgical valvotomy (Royal Children’s Hospital, Melbourne) between 2005 and 2016. Patients were excluded if pre-intervention assessment indicated ineligibility for either intervention. Propensity score matching was performed based on age, weight, and valve morphology.
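The abstract does not name the matching software; as a hedged sketch, propensity score matching on age, weight, and valve morphology could be carried out with the R MatchIt package roughly as follows (the data and all variable names are hypothetical).

```r
library(MatchIt)

set.seed(6)
d <- data.frame(
  balloon   = rep(c(1, 0), c(65, 77)),            # 1 = balloon valvuloplasty, 0 = surgical valvotomy
  age_days  = exp(runif(142, log(2), log(6800))),
  weight_kg = runif(142, 2.5, 60),
  unicuspid = rbinom(142, 1, 0.3)                 # simplified stand-in for valve morphology
)

# Nearest-neighbour matching on the estimated propensity score.
m <- matchit(balloon ~ age_days + weight_kg + unicuspid,
             data = d, method = "nearest")
summary(m)                  # covariate balance diagnostics
matched <- match.data(m)    # matched cohort for outcome comparisons
```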
Results:
Sixty-five balloon patients and seventy-seven surgical patients were included. Overall, the groups were well matched with 18 neonates/25 infants in the balloon group and 17 neonates/28 infants in the surgical group. Median age at balloon was 92 days (range 2 days – 18.8 years) compared to 167 days (range 0 days – 18.1 years) for surgery (rank-sum p = 0.08). Mean follow-up was 5.3 years. There was one late balloon death and two early surgical deaths due to left ventricular failure. There was no significant difference in freedom from reintervention at latest follow-up (69% in the balloon group and 70% in the surgical group, p = 1.0).
Conclusions:
Contemporary analysis of balloon aortic valvuloplasty and surgical aortic valvotomy shows no difference in overall reintervention rates in the medium term. Balloon valvuloplasty performs well across all age groups, achieving delay or avoidance of surgical intervention.
Objectives:
Studies of neurocognitively elite older adults, termed SuperAgers, have identified clinical predictors and neurobiological indicators of resilience against age-related neurocognitive decline. Despite rising rates of older persons living with HIV (PLWH), SuperAging (SA) in PLWH remains undefined. We aimed to establish neuropsychological criteria for SA in PLWH and examined clinically relevant correlates of SA.
Methods:
734 PLWH and 123 HIV-uninfected participants between 50 and 64 years of age underwent neuropsychological and neuromedical evaluations. SA was defined as demographically corrected (i.e., sex, race/ethnicity, education) global neurocognitive performance within normal range for 25-year-olds. Remaining participants were labeled cognitively normal (CN) or impaired (CI) based on actual age. Chi-square and analysis of variance tests examined HIV group differences on neurocognitive status and demographics. Within PLWH, neurocognitive status differences were tested on HIV disease characteristics, medical comorbidities, and everyday functioning. Multinomial logistic regression explored independent predictors of neurocognitive status.
Results:
Neurocognitive status rates and demographic characteristics differed between PLWH (SA=17%; CN=38%; CI=45%) and HIV-uninfected participants (SA=35%; CN=55%; CI=11%). In PLWH, neurocognitive groups were comparable on demographic and HIV disease characteristics. Younger age, higher verbal IQ, absence of diabetes, fewer depressive symptoms, and lifetime cannabis use disorder increased the likelihood of SA. SA participants reported greater independence in everyday functioning, employment, and health-related quality of life than non-SA participants.
Conclusions:
Despite the combined neurological risk of aging and HIV, youthful neurocognitive performance is possible for older PLWH. SA relates to improved real-world functioning and may be better explained by cognitive reserve and maintenance of cardiometabolic and mental health than by HIV disease severity. Future research investigating biomarker and lifestyle (e.g., physical activity) correlates of SA may help identify modifiable neuroprotective factors against HIV-related neurobiological aging. (JINS, 2019, 25, 507–519)
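As an illustrative sketch (hypothetical data and variable names, not the study's code), the multinomial model of neurocognitive status described above might look like this in R:

```r
library(nnet)

set.seed(7)
d <- data.frame(
  status   = factor(sample(c("CI", "CN", "SA"), 734, replace = TRUE,
                           prob = c(0.45, 0.38, 0.17)),
                    levels = c("CI", "CN", "SA")),
  age      = round(runif(734, 50, 64)),
  verb_iq  = rnorm(734, 100, 15),
  diabetes = rbinom(734, 1, 0.2),
  bdi      = rpois(734, 8),                      # depressive symptom severity
  cannabis = rbinom(734, 1, 0.25)                # lifetime cannabis use disorder
)

# Multinomial logistic regression: predictors of SA and CN relative to CI.
fit <- multinom(status ~ age + verb_iq + diabetes + bdi + cannabis, data = d)
exp(coef(fit))   # odds ratios relative to the reference (impaired) group
```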
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
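As a hedged sketch of the programmatic access route mentioned above, data can be retrieved with the neotoma R package roughly as follows; the specific arguments shown are assumptions based on typical usage, so consult the package documentation for current syntax.

```r
library(neotoma)

# Search for pollen datasets in a given age window, then download sample
# data for the first hit. (Argument names here are assumptions.)
ds <- get_dataset(datasettype = "pollen", ageold = 15000, ageyoung = 0)
dl <- get_download(ds[[1]])
str(dl, max.level = 1)
```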
A standard weed management system (system I) had a higher return above variable costs than did an intensive weed management system (system II) for two eastern Colorado cropping rotations. For continuous corn (Zea mays L.), the return above variable costs averaged $18.85/ha more under system I than under system II. For a barley (Hordeum vulgare L.)-corn-sugarbeet (Beta vulgaris L.) rotation, the return above variable costs averaged $20.48/ha more under system I than under system II. Based on alternative input (herbicide) and product prices, higher herbicide costs favored the standard weed management system, whereas higher crop prices favored the weed management system with the higher yields adjusted for quality. The probability that returns above variable costs differed between the two weed management systems depended upon the level of product prices and herbicide costs.
The persistence and germinability of jointed goatgrass (Aegilops cylindrica Host. # AEGCY) seed (caryopses) were studied over a five-year period between 1979 and 1984 at five locations in Colorado, Kansas, and Nebraska. Seed were buried in open-mesh packets and the sites were not disturbed for the duration of the study. Seed survival at burial depths of 5, 15, or 30 cm decreased rapidly over the first three years at all locations. By the third year, less than 7.4 ± 6.5% (mean ± SD) of the seed remained at 5 cm in the soil at all locations. By the third, fourth, and fifth year there was total loss of seed at one, two, and four sites at the 5-cm burial depth. Only three of five sites had total loss of seed at 30 cm after five years. The proportion of surviving seed that were germinable increased with time and was almost 100% after three years of burial. Because jointed goatgrass seed were relatively transient in the soil profile, fallowing of infested areas for a three-year period may significantly reduce populations of this weed or eradicate it, depending upon location.
Grass and broadleaf weed densities and seed numbers, weed control practices, and grain yields were included in a bioeconomic model that evaluates alternative weed management strategies for continuous corn (Zea mays L.). Weed seed numbers in soil and herbicide carry-over provided intertemporal links. Four weed management strategies – two fixed, one mixed, and one flexible – were evaluated with annualized net returns as the performance indicator. The flexible strategy (weed control based on observed conditions) had the largest annualized net return for high and low initial weed seed numbers. The fixed weed management strategy (weed control predetermined) of an annual application of only a preemergence herbicide ranked second in terms of annualized net returns for high weed seed numbers. The mixed weed management strategy of alternative year applications of preemergence herbicide and “as needed” applications of postemergence herbicide ranked second for low initial weed seed numbers. The fixed weed management strategy of alternate year application of preemergence herbicide only generated the lowest annualized net return, regardless of initial weed seed numbers.
A fixed (conventional) weed management strategy in corn was compared to three other strategies (two mixed and one flexible) in terms of weed control, grain yield, gross margin (gross income minus herbicide treatment costs), and herbicide use under furrow irrigation for four consecutive years. The fixed strategy prespecified preplanting, preemergence, postemergence, and layby herbicides. The flexible strategy herbicide treatments were specified by a computer bioeconomic model. Model decisions were based on weed seed in soil before planting, weed densities after corn emergence, herbicide costs, expected corn grain yield and selling price, and other parameters. The two mixed strategies were a combination of fixed and flexible strategies and designated either specified soil-applied herbicides (mixed/soil) or no soil-applied herbicide (mixed/no soil); postemergence treatments were determined by the model. Average corn grain yield was 10 280 kg ha⁻¹ and gross income was $920 ha⁻¹; neither differed among strategies. Total weed density and gross margin were significantly higher for the mixed/no soil and flexible strategies compared to the mixed/soil and fixed strategies. Total weed density averaged 28 720, 28 100, 10 910, and 680 plants ha⁻¹ for the mixed/no soil, flexible, mixed/soil, and fixed strategies, respectively. Annual gross margins for the four strategies averaged $885, $875, $845, and $810 ha⁻¹, respectively. Herbicide use over the 4-yr period for these four strategies averaged 3.8, 5.3, 20.5, and 26.9 kg ha⁻¹, respectively, and each value differed from the others. Thus, weeds can be managed in corn, gross margins increased, and herbicide use decreased by employing a bioeconomic weed-corn model to make weed management decisions.
An economic analysis of four weed management systems employed on four crop sequences in a barley-corn-pinto bean-sugarbeet rotation in eastern Colorado was conducted. Weeds were controlled in each crop with only conventional tillage or conventional tillage plus minimum levels of herbicides (systems 3 and 4), moderate levels of herbicides (system 1), or intensive levels of herbicides (system 2). Adjusted gross returns were higher for systems 3 and 4, where herbicide use was lower per year and decreased over 4 yr, than for systems 1 and 2, where herbicide use was higher per year and constant. When the four crop sequences were aggregated using yield and sucrose indices, the least herbicide-intensive weed management system had a $440/ha higher indexed adjusted gross return over 4 yr than the most herbicide-intensive weed management system. An income risk analysis showed that the herbicide-intensive weed management system was not risk efficient and that producers would select one of the other three less herbicide-intensive weed management systems depending upon their risk preferences.
The impact of four weed management systems on weed seed reserves in soil, yearly weed problems, and production of barley, corn, pinto bean, and sugarbeet was assessed where these crops were grown in rotation for 4 consecutive years in four cropping sequences. Weeds were controlled in each crop with only conventional tillage or conventional tillage plus minimum, moderate (system 1), or intensive (system 2) levels of herbicides. Seed of annual weeds from 11 genera were identified, with barnyardgrass and redroot pigweed comprising 66 and 19%, respectively, of the initial 90 million weed seed/ha present in the upper 25 cm of the soil profile. After the fourth cropping year, the overall decline in total number of weed seed in soil was 53% when averaged over four cropping sequences and four weed management systems. Over the 4-yr period, about 10 times more weeds escaped control in system 1 than in system 2, and within a crop, the fewest weeds escaped control annually in barley. System 2 had the highest herbicide use in each cropping sequence, the fewest weeds at harvest, and the smallest adjusted gross return over the 4-yr period in three of four cropping sequences.
OBJECTIVE
To identify modifiable risk factors for acquisition of Klebsiella pneumoniae carbapenemase-producing Enterobacteriaceae (KPC) colonization among long-term acute-care hospital (LTACH) patients.
DESIGN
Multicenter, matched case-control study.
SETTING
Four LTACHs in Chicago, Illinois.
PARTICIPANTS
Each case patient included in this study had a KPC-negative rectal surveillance culture on admission followed by a KPC-positive surveillance culture later in the hospital stay. Each matched control patient had a KPC-negative rectal surveillance culture on admission and no KPC isolated during the hospital stay.
RESULTS
From June 2012 to June 2013, 2,575 patients were admitted to 4 LTACHs; 217 of 2,144 KPC-negative patients (10.1%) acquired KPC. In total, 100 of these patients were selected at random and matched to 100 controls by LTACH facility, admission date, and censored length of stay. Acquisitions occurred a median of 16.5 days after admission. On multivariate analysis, we found that exposure to higher colonization pressure (OR, 1.02; 95% CI, 1.01–1.04; P=.002), exposure to a carbapenem (OR, 2.25; 95% CI, 1.06–4.77; P=.04), and higher Charlson comorbidity index (OR, 1.14; 95% CI, 1.01–1.29; P=.04) were independent risk factors for KPC acquisition; the odds of KPC acquisition increased by 2% for each 1% increase in colonization pressure.
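As an illustrative sketch (hypothetical data and variable names), a matched case-control analysis of this kind is typically fit with conditional logistic regression stratified on the matched pair, e.g. in R:

```r
library(survival)

set.seed(9)
d <- data.frame(
  pair       = rep(1:100, each = 2),
  case       = rep(c(1, 0), times = 100),   # KPC acquisition vs. matched control
  col_press  = rpois(200, 30),              # colonization pressure (%)
  carbapenem = rbinom(200, 1, 0.3),         # carbapenem exposure
  charlson   = rpois(200, 4)                # Charlson comorbidity index
)

# Conditional logistic regression conditioning on the matched pair.
fit <- clogit(case ~ col_press + carbapenem + charlson + strata(pair), data = d)
summary(fit)   # exponentiated coefficients give odds ratios
```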
CONCLUSIONS
Higher colonization pressure, exposure to carbapenems, and a higher Charlson comorbidity index independently increased the odds of KPC acquisition among LTACH patients. Reducing colonization pressure (through separation of KPC-positive patients from KPC-negative patients using strict cohorts or private rooms) and reducing carbapenem exposure may prevent KPC cross transmission in this high-risk patient population.
East-west glacial striations and grooves, which parallel the present direction of shelf-ice movement to the east and south-east of the Hut Point peninsula, were discovered on the summit ridge of Observation Hill. This evidence suggests that Observation Hill may have been glaciated by a thickened McMurdo lobe of the adjacent Ross Ice Shelf, moving west as it does today.
Field experiments were conducted in 1996, 1997, and 1998 at Ste. Anne de Bellevue, Quebec, Canada, and in 1996 at Ottawa, Ontario, Canada, to quantify the impact of corn hybrids, differing in canopy architecture and plant spacing (plant population density and row spacing), on biomass production by transplanted and naturally occurring weeds. The treatments consisted of a factorial combination of corn type (leafy reduced stature [LRS], late-maturing big leaf [LMBL], a conventional Pioneer 3979 [P3979], and, as a control, a corn-free condition [weed monoculture]), two weed levels (low density [transplanted weeds: common lambsquarters and redroot pigweed] and high density [weedy: plots with naturally occurring weeds]), two corn population densities (normal and high), and two row spacings (38 and 76 cm). At all site-years under both weed levels, the decrease in biomass production by both transplanted and naturally occurring weeds was greater due to the narrow row spacing than due to the high plant population density. The combination of narrower rows and higher population densities increased corn canopy light interception by 3 to 5%. Biomass produced by both transplanted and naturally occurring weeds was five to eight times less under the corn canopy than in the weed monoculture treatment. Generally, weed biomass production was reduced more by early-maturing hybrids (LRS and P3979) than by LMBL. Thus, hybrid selection and plant spacing could be used as important components of integrated pest management (weed control) for sustainable agriculture.
Coapplication of herbicides and insecticides affords growers an opportunity to control multiple pests with one application given that efficacy is not compromised. Trifloxysulfuron was applied at 5.3 g ai/ha both alone and in combination with the insecticides acephate (370 g ai/ha), oxamyl (370 g ai/ha), lambda-cyhalothrin (34 g ai/ha), acetamiprid (45 g ai/ha), thiamethoxam (45 g ai/ha), endosulfan (379 g ai/ha), indoxacarb (123 g ai/ha), emamectin benzoate (11 g ai/ha), methoxyfenozide (67 g ai/ha), spinosad (75 g ai/ha), and pyridalyl (112 g ai/ha) to determine the effects of coapplication on control of some of the more common and/or troublesome broadleaf weeds infesting cotton. In addition, the insecticides acephate, oxamyl, lambda-cyhalothrin, thiamethoxam, and endosulfan, at the rates listed above, were applied either alone or in combination with trifloxysulfuron at 7.9 g/ha to assess the effects of coapplication on thrips control. Control of hemp sesbania (insecticides oxamyl and lambda-cyhalothrin), sicklepod (insecticides methoxyfenozide and pyridalyl), redroot pigweed (insecticides thiamethoxam, methoxyfenozide, spinosad, and pyridalyl), and smooth pigweed, Palmer amaranth, and common lambsquarters (all insecticides) with trifloxysulfuron may be reduced when coapplied with the indicated insecticides for each species. Control of pitted, tall, ivyleaf, and entireleaf morningglory with trifloxysulfuron was not affected by the insecticides evaluated. Coapplication of trifloxysulfuron with the insecticides evaluated also resulted in no negative effects on thrips control.
A synthesis of previous work and new data on the stratigraphy of high terraces of the Ohio and Monongahela Rivers upstream from Parkersburg, West Virginia, indicates a correspondence between terrace histories in the ancient Teays and Pittsburgh drainage basins. Four terraces are identified in each. Sediments of the lower three alluvial and slackwater terraces, correlated with Illinoian, early Wisconsin, and late Wisconsin glacial deposits, have been traced along the modern Ohio River through the former divide between the Teays and Pittsburgh basins. Sediments in the fourth terrace, the highest well-defined terrace in each basin, were deposited in two ice-dammed lakes, separated by a divide near New Martinsville, West Virginia. Some deposits of the highest slackwater terrace in both the Teays and Pittsburgh basins have reversed remanent magnetic polarity. This, and the stratigraphic succession in the two basins, suggests that both were ponded during the same glaciation. Reversed polarity in these terrace sediments restricts the age of the first ice-damming event for which stratigraphic evidence is well-preserved to a pre-Illinoian, early Pleistocene glaciation prior to 788,000 yr ago. In contrast, slackwater sediments in the Monongahela River valley, upstream from an outwash gravel dam at the Allegheny-Monongahela confluence, have normal remanent magnetic polarity, corroborating correlation with an Illinoian ponding event.