Background: Feedback reports summarizing clinician performance are effective tools to improve antibiotic stewardship in the ambulatory setting, but few studies have evaluated their effectiveness for pediatric inpatients. We developed and implemented feedback reports reflecting electronically derived measures of appropriate antibiotic choice and duration for community-acquired pneumonia (CAP) and measured their impact on appropriate antibiotic use in children hospitalized for CAP. Methods: We performed a single-center quasi-experimental study including children 6 months to 17 years hospitalized for CAP between 12/1/2021 and 11/30/2023. Children with chronic medical conditions, ICU stays >48 hours, and outside transfers were excluded. The intervention occurred in 11/2022 and included clinician education, a monthly group-level feedback report disseminated by email (Figure 1), and a monthly review of clinician performance during a virtual quality improvement meeting. Patient characteristics were compared using chi-square or Wilcoxon rank-sum tests. Interrupted time series analysis (ITSA) was used to measure the immediate change in the proportion of CAP encounters receiving both the appropriate antibiotic choice and duration, as well as the change in slope from the preintervention to the postintervention period. Choice and duration were analyzed separately using ITSA as a secondary analysis. Results: There were 817 CAP encounters, including 420 preintervention and 397 postintervention. Patients admitted in the postintervention period were older (median age 2 years preintervention vs 3 years postintervention, P=0.03), but otherwise there were no differences in race, ethnicity, sex, ICU admission, or complicated pneumonia. Preintervention, 52% of encounters received both the appropriate antibiotic choice and duration; 96% of encounters received the appropriate antibiotic choice and 54% received the appropriate duration. The ITSA demonstrated an immediate 16% increase in the proportion of patients receiving both appropriate antibiotic choice and duration (95% confidence interval, 1-31%; P = 0.047) and no significant further increase over time following the intervention (P = 0.84) (Figure 2). When antibiotic choice was analyzed separately by ITSA, there was no immediate change or change over time in the proportion of patients receiving the appropriate antibiotic choice. In the ITSA of duration alone, there was an immediate 17% increase in the proportion receiving the appropriate duration (95% confidence interval, 2-33%; P = 0.03) and no change over time. Conclusion: Feedback reports generated from electronically derived metrics of antibiotic choice and duration, combined with ongoing clinician education, increased the proportion of children with CAP treated with the appropriate antibiotic duration. Electronic feedback reports are a scalable and impactful intervention to improve antibiotic use in children hospitalized with CAP.
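The ITSA described above is, in essence, a segmented regression with terms for an immediate level change and a change in slope at the intervention month. The sketch below shows one minimal way such a model might be fit to monthly aggregated proportions; the file name, column names, and the intervention month index are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch of a segmented (interrupted time series) regression on monthly proportions
# of CAP encounters receiving both appropriate antibiotic choice and duration.
# The file, column names, and intervention month are hypothetical, not study data.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("monthly_cap_metrics.csv")                 # hypothetical: one row per month
df["post"] = (df["month_index"] >= 12).astype(int)          # 1 after the assumed intervention month
df["months_post"] = (df["month_index"] - 12).clip(lower=0)  # time elapsed since the intervention

# 'post' estimates the immediate level change; 'months_post' estimates the change in slope.
model = smf.ols("prop_appropriate ~ month_index + post + months_post", data=df).fit()
print(model.summary())
```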
Background: Community-acquired pneumonia (CAP) is a common indication for antibiotic use in hospitalized children and is a key target for pediatric antimicrobial stewardship programs (ASPs). Building upon prior work, we developed and refined an electronic algorithm to identify children hospitalized with CAP and to evaluate the appropriateness of initial antibiotic choice and duration. Methods: We performed a cross-sectional study including children 6 months to 17 years hospitalized for CAP between January 1, 2019, and October 31, 2022, at a tertiary-care children’s hospital. CAP was defined electronically as an International Classification of Diseases, Tenth Revision (ICD-10) code for pneumonia, a chest radiograph or chest computed tomography (CT) scan performed within 48 hours of admission, and systemic antibiotics administered within the first 48 hours of hospitalization and continued for at least 2 days. We applied the following exclusion criteria: patients transferred from another healthcare setting, those who died within 48 hours of hospitalization, children with complex chronic conditions, and those with intensive care unit stays >48 hours. Criteria for appropriate antibiotic choice and duration were defined based on established guidelines. Two physicians performed independent medical record reviews of 80 randomly selected patients (10% sample) to evaluate the performance of the electronic algorithm in (1) identifying patients treated for clinician-diagnosed CAP and (2) classifying antibiotic choice and duration as appropriate. A third physician resolved discrepancies. The electronic algorithm was compared to this medical record review, which served as the reference standard. Results: Of 80 children identified by the electronic algorithm, 79 (99%) were diagnosed with CAP based on medical record review. Antibiotic use was classified as the appropriate choice in 75 (94%) of 80 cases, and appropriate duration in 16 (20%) of 80 cases. The sensitivity of the electronic algorithm for identifying appropriate initial antibiotic choice was 94%; specificity could not be calculated because no events of inappropriate antibiotic choice were identified based on chart review. The sensitivity and specificity for determining appropriate duration were 88% and 97%, respectively (Table 1).
Conclusions: The electronic algorithm accurately identified children hospitalized with CAP and demonstrated acceptable performance for identifying appropriate antibiotic choice and duration. Use of this electronic algorithm may improve the efficiency of stewardship activities and could facilitate alignment with updated accreditation standards. Future studies validating this algorithm at other centers are needed.
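For readers interested in how such an algorithm-versus-chart-review comparison is tabulated, the sketch below computes sensitivity and specificity from paired labels; the input file and column names are hypothetical, not the study's data or code.

```python
# Hedged sketch of validating an electronic classification against chart review (the
# reference standard). Input file and column names are hypothetical.
import pandas as pd

reviews = pd.read_csv("algorithm_vs_chart_review.csv")      # hypothetical: one row per reviewed case
algo = reviews["algo_appropriate_duration"].astype(bool)    # electronic algorithm's call
chart = reviews["chart_appropriate_duration"].astype(bool)  # reference standard from chart review

tp = (algo & chart).sum()     # both call duration appropriate
tn = (~algo & ~chart).sum()   # both call duration inappropriate
fp = (algo & ~chart).sum()
fn = (~algo & chart).sum()

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")
```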
Blood biomarkers of Alzheimer's disease (AD) may allow for the early detection of AD pathology in mild cognitive impairment (MCI) due to AD (MCI-AD) and as a co-pathology in MCI with Lewy bodies (MCI-LB). However, not all cases of MCI-LB will feature AD pathology. Disease-general biomarkers of neurodegeneration, such as glial fibrillary acidic protein (GFAP) or neurofilament light (NfL), may therefore provide a useful supplement to AD biomarkers. We aimed to compare the relative utility of plasma Aβ42/40, p-tau181, GFAP and NfL in differentiating MCI-AD and MCI-LB from cognitively healthy older adults, and from one another.
Methods
Plasma samples were analysed for 172 participants (31 healthy controls, 48 MCI-AD, 28 possible MCI-LB and 65 probable MCI-LB) at baseline, and for a subset (n = 55) who provided repeated samples after ≥1 year. Samples were analysed with a Simoa 4-plex assay for Aβ42, Aβ40, GFAP and NfL; previously collected p-tau181 measurements from the same cohort were also incorporated.
Results
Probable MCI-LB had elevated GFAP (p < 0.001) and NfL (p = 0.012) relative to controls, but not significantly lower Aβ42/40 (p = 0.06). GFAP and p-tau181 were higher in MCI-AD than MCI-LB. GFAP discriminated all MCI subgroups from controls (AUC 0.75), but no plasma-based marker effectively differentiated MCI-AD from MCI-LB. NfL correlated with disease severity and increased with MCI progression over time (p = 0.011).
Conclusion
Markers of AD and astrocytosis/neurodegeneration are elevated in MCI-LB. GFAP offered similar utility to p-tau181 in distinguishing MCI overall, and its subgroups, from healthy controls.
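As an illustration of the discrimination analysis reported above, the sketch below computes a receiver operating characteristic area under the curve for a single plasma marker; the data file, column names, and group labels are hypothetical stand-ins, not the study dataset.

```python
# Sketch of ROC analysis for one plasma marker (e.g. GFAP) discriminating MCI from controls.
# The data file, column names, and group labels are hypothetical.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("plasma_biomarkers.csv")             # hypothetical: one row per participant
subset = df[df["group"].isin(["MCI", "control"])]
y = (subset["group"] == "MCI").astype(int)            # 1 = MCI, 0 = healthy control

auc = roc_auc_score(y, subset["gfap_pg_ml"])          # higher GFAP assumed to indicate MCI
print(f"GFAP AUC (MCI vs controls): {auc:.2f}")
```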
A penicillin allergy testing service (PATS) assessed penicillin allergy in patients with hematologic malignancies; 17 patients who met criteria had negative skin testing. Patients who underwent penicillin challenge passed and were delabeled. Of delabeled patients, 87% received and tolerated β-lactams during follow-up. Providers found the PATS valuable.
Impaired olfaction may be a biomarker for early Lewy body disease, but its value in mild cognitive impairment with Lewy bodies (MCI-LB) is unknown. We compared olfaction in MCI-LB with MCI due to Alzheimer’s disease (MCI-AD) and healthy older adults. We hypothesized that olfactory function would be worse in probable MCI-LB than in both MCI-AD and healthy comparison subjects (HC).
Design:
Cross-sectional study assessing olfaction using Sniffin’ Sticks 16 (SS-16) in MCI-LB, MCI-AD, and HC with longitudinal follow-up. Differences were adjusted for age, and receiver operating characteristic (ROC) curves were used for discriminating MCI-LB from MCI-AD and HC.
Setting:
Participants were recruited from Memory Services in the North East of England.
Participants:
Thirty-eight probable MCI-LB, 33 MCI-AD, 19 possible MCI-LB, and 32 HC.
Measurements:
Olfaction was assessed using SS-16 and a questionnaire.
Results:
Participants with probable MCI-LB had worse olfaction than both MCI-AD (age-adjusted mean difference (B) = 2.05, 95% CI: 0.62–3.49, p = 0.005) and HC (B = 3.96, 95% CI: 2.51–5.40, p < 0.001). The previously identified cutoff score for the SS-16 of ≤ 10 had 84% sensitivity for probable MCI-LB (95% CI: 69–94%), but 30% specificity versus MCI-AD. ROC analysis found a lower cutoff of ≤ 7 was better (63% sensitivity for MCI-LB, with 73% specificity vs MCI-AD and 97% vs HC). Asking about olfactory impairments was not useful in identifying them.
Conclusions:
MCI-LB had worse olfaction than MCI-AD and normal aging. A lower cutoff score of ≤ 7 is required when using SS-16 in such patients. Olfactory testing may have value in identifying early LB disease in memory services.
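The cutoff comparison reported above can be made concrete with a small sketch that scores candidate SS-16 thresholds; the score and label arrays below are illustrative dummies, not study data, and only the two cutoffs (≤10 and ≤7) come from the abstract.

```python
# Sketch of checking sensitivity and specificity of SS-16 cutoffs for probable MCI-LB vs MCI-AD.
# Scores and labels below are hypothetical dummies for illustration only.
import numpy as np

ss16_scores = np.array([5, 6, 7, 8, 9, 10, 11, 12, 13, 14])        # hypothetical SS-16 totals
is_mci_lb = np.array([1, 1, 1, 1, 0, 1, 0, 0, 0, 0], dtype=bool)   # hypothetical diagnoses

def cutoff_performance(scores, labels, cutoff):
    """Sensitivity/specificity when a score <= cutoff is read as 'probable MCI-LB'."""
    called_lb = scores <= cutoff
    sensitivity = called_lb[labels].mean()        # proportion of MCI-LB at or below the cutoff
    specificity = (~called_lb[~labels]).mean()    # proportion of MCI-AD above the cutoff
    return sensitivity, specificity

for cutoff in (10, 7):
    sens, spec = cutoff_performance(ss16_scores, is_mci_lb, cutoff)
    print(f"SS-16 <= {cutoff}: sensitivity {sens:.0%}, specificity {spec:.0%}")
```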
The present study aimed to clarify the neuropsychological profile of the emergent diagnostic category of Mild Cognitive Impairment with Lewy bodies (MCI-LB) and determine whether domain-specific impairments such as in memory were related to deficits in domain-general cognitive processes (executive function or processing speed).
Method:
Patients (n = 83) and healthy age- and sex-matched controls (n = 34) underwent clinical and imaging assessments. Probable MCI-LB (n = 44) and MCI-Alzheimer’s disease (AD) (n = 39) were diagnosed following National Institute on Aging-Alzheimer’s Association (NIA-AA) and dementia with Lewy bodies (DLB) consortium criteria. Neuropsychological measures included cognitive and psychomotor speed, executive function, working memory, and verbal and visuospatial recall.
Results:
MCI-LB scored significantly lower than MCI-AD on processing speed [Trail Making Test B: p = .03, g = .45; Digit Symbol Substitution Test (DSST): p = .04, g = .47; DSST Error Check: p < .001, g = .68] and executive function [Trail Making Test Ratio (A/B): p = .04, g = .52] tasks. MCI-AD performed worse than MCI-LB on memory tasks, specifically visuospatial (Modified Taylor Complex Figure: p = .01, g = .46) and verbal (Rey Auditory Verbal Learning Test: p = .04, g = .42) delayed recall measures. Stepwise discriminant analysis correctly classified the subtype in 65.1% of MCI patients (72.7% specificity, 56.4% sensitivity). Processing speed accounted for more group-associated variance in visuospatial and verbal memory in both MCI subtypes than executive function, while no significant relationships between measures were observed in controls (all ps > .05).
Conclusions:
MCI-LB was characterized by executive dysfunction and slowed processing speed but did not show the expected visuospatial dysfunction, while MCI-AD displayed an amnestic profile. However, there was considerable overlap between the neuropsychological profiles, and processing speed mediated performance in both MCI subtypes.
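For readers unfamiliar with discriminant classification of MCI subtype, the sketch below fits a plain linear discriminant analysis to neuropsychological scores; it is a simplified stand-in for the stepwise procedure reported above, and the file and feature names are hypothetical.

```python
# Simplified sketch of discriminant classification of MCI subtype from neuropsychological scores.
# Plain LDA is used here rather than the stepwise procedure; file and column names are hypothetical.
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

df = pd.read_csv("mci_neuropsych.csv")                          # hypothetical: one row per patient
features = ["tmt_b", "dsst", "taylor_recall", "ravlt_recall"]   # hypothetical test-score columns
X, y = df[features], df["subtype"]                              # y: "MCI-LB" or "MCI-AD"

lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, y, cv=5).mean()              # cross-validated classification accuracy
print(f"LDA cross-validated accuracy: {accuracy:.1%}")
```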
Electroencephalographic (EEG) abnormalities are greater in mild cognitive impairment (MCI) with Lewy bodies (MCI-LB) than in MCI due to Alzheimer’s disease (MCI-AD) and may anticipate the onset of dementia. We aimed to assess whether quantitative EEG (qEEG) slowing would predict a higher annual hazard of dementia in MCI across these etiologies. MCI patients (n = 92) and healthy comparators (n = 31) provided qEEG recordings and underwent longitudinal clinical and cognitive follow-up. Associations between qEEG slowing, measured by increased theta/alpha ratio, and clinical progression from MCI to dementia were estimated with a multistate transition model to account for death as a competing risk, while controlling for age, cognitive function, and etiology classified by an expert consensus panel.
Over a mean follow-up of 1.5 years (SD = 0.5), 14 cases of incident dementia and 5 deaths were observed. Increased theta/alpha ratio on qEEG was associated with increased annual hazard of dementia (hazard ratio = 1.84, 95% CI: 1.01–3.35). This extends previous findings that MCI-LB features early functional changes, showing that qEEG slowing may anticipate the onset of dementia in prospectively identified MCI.
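The qEEG slowing measure used above, the theta/alpha ratio, can be illustrated with a short sketch that estimates band power from a single channel; the sampling rate, band limits, and synthetic signal are assumptions for illustration, not study parameters.

```python
# Sketch of computing a theta/alpha power ratio from one EEG channel, the qEEG slowing
# measure referred to in the abstract. Sampling rate, band limits, and the synthetic
# signal are illustrative assumptions, not study parameters.
import numpy as np
from scipy.signal import welch

fs = 256                                        # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 60)              # stand-in for 60 s of one EEG channel

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)  # power spectral density estimate

def band_power(freqs, psd, lo, hi):
    band = (freqs >= lo) & (freqs < hi)
    return psd[band].sum()                      # summed PSD as a simple band-power proxy

theta = band_power(freqs, psd, 4, 8)            # theta band: 4-8 Hz (assumed limits)
alpha = band_power(freqs, psd, 8, 13)           # alpha band: 8-13 Hz (assumed limits)
print(f"theta/alpha ratio: {theta / alpha:.2f}")
```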
Archaeological investigations at Bucklers Park in Crowthorne have revealed a window onto a significant later prehistoric place, which was used and revisited over 1700 years between the Early Bronze Age and later Iron Age (c. 1800–100 BC). Activity on site was based around the heating of water using fire-heated flint, producing three mounds of fire-cracked flint and burnt organic material. These ‘burnt mounds’ are known across later prehistoric Britain and Ireland, but the ways they may have been formed are uncertain, and they are arguably under-discussed in southern Britain. Whilst water was initially drawn from a stream, a series of wells were established at the site between the Middle Bronze Age and Early Iron Age, one of which contained a well-preserved log ladder. These wells were revisited and recut over long periods of time, and during the Middle Iron Age the site’s function shifted dramatically when a roundhouse was constructed. The site’s long-term use, excellent organic preservation, dating, and location in a remote area on the Bagshot Heath make it significant. This paper summarises the findings from the excavations, discussing the formation of the site in the context of wider research on later prehistoric burnt mounds.
Dopaminergic imaging is an established biomarker for dementia with Lewy bodies, but its diagnostic accuracy at the mild cognitive impairment (MCI) stage remains uncertain.
Aims
To provide robust prospective evidence of the diagnostic accuracy of dopaminergic imaging at the MCI stage to either support or refute its inclusion as a biomarker for the diagnosis of MCI with Lewy bodies.
Method
We conducted a prospective diagnostic accuracy study of baseline dopaminergic imaging with [123I]N-ω-fluoropropyl-2β-carbomethoxy-3β-(4-iodophenyl)nortropane single-photon emission computerised tomography (123I-FP-CIT SPECT) in 144 patients with MCI. Images were rated as normal or abnormal by a panel of experts with access to striatal binding ratio results. Follow-up consensus diagnosis based on the presence of core features of Lewy body disease was used as the reference standard.
Results
At latest assessment (mean 2 years), 61 patients had probable MCI with Lewy bodies, 26 possible MCI with Lewy bodies and 57 MCI due to Alzheimer's disease. The sensitivity of baseline FP-CIT visual rating for probable MCI with Lewy bodies was 66% (95% CI 52–77%), specificity 88% (76–95%) and accuracy 76% (68–84%), with positive likelihood ratio 5.3.
Conclusions
An abnormal scan is over five times more likely to be found in probable MCI with Lewy bodies than in MCI due to Alzheimer's disease. Dopaminergic imaging appears to be useful at the MCI stage in cases where Lewy body disease is suspected clinically.
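The reported positive likelihood ratio follows directly from the sensitivity and specificity above; the short sketch below reproduces the arithmetic with the rounded values (the small difference from the reported 5.3 reflects rounding of the underlying counts).

```python
# Positive likelihood ratio from the sensitivity and specificity reported in the abstract:
# LR+ = sensitivity / (1 - specificity). Rounded inputs give a value close to the reported 5.3.
sensitivity = 0.66   # reported for probable MCI with Lewy bodies
specificity = 0.88
lr_positive = sensitivity / (1 - specificity)
print(f"LR+ = {lr_positive:.1f}")   # ~5.5 with rounded inputs, vs 5.3 reported from raw counts
```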
Background: Antibiotic overuse contributes to antibiotic resistance and unnecessary adverse drug effects. Antibiotic stewardship interventions have primarily focused on acute-care settings. Most antibiotic use, however, occurs in outpatients with acute respiratory tract infections such as pharyngitis. The electronic health record (EHR) might provide an effective and efficient tool for outpatient antibiotic stewardship. We aimed to develop and validate an electronic algorithm to identify inappropriate antibiotic use for pediatric outpatients with pharyngitis. Methods: This study was conducted within the Children’s Hospital of Philadelphia (CHOP) Care Network, including 31 pediatric primary care practices and 3 urgent care centers with a shared EHR serving >250,000 children. We used International Classification of Diseases, Tenth Revision (ICD-10) codes to identify encounters for pharyngitis at any CHOP practice from March 15, 2017, to March 14, 2018, excluding those with concurrent infections (eg, otitis media, sinusitis), immunocompromising conditions, or other comorbidities that might influence the need for antibiotics. We randomly selected 450 encounters for detailed chart abstraction assessing patient demographics as well as practice and prescriber characteristics. Appropriateness of antibiotic use based on chart review served as the gold standard for evaluating the electronic algorithm. Criteria for appropriate use included streptococcal testing, use of penicillin or amoxicillin (absent β-lactam allergy), and a 10-day duration of therapy. Results: In 450 patients, the median age was 8.4 years (IQR, 5.5–9.0) and 54% were female. On chart review, 149 patients (33%) received an antibiotic, of whom 126 had a positive rapid strep result. Thus, based on chart review, 23 subjects (5%) diagnosed with pharyngitis received antibiotics inappropriately. Amoxicillin or penicillin was prescribed for 100 of the 126 children (79%) with a positive rapid strep test. Of the 126 children with a positive test, 114 (90%) received the correct antibiotic: amoxicillin, penicillin, or an appropriate alternative antibiotic due to β-lactam allergy. Duration of treatment was correct for all 126 children. Using the electronic algorithm, the proportion of inappropriate prescribing was 28 of 450 (6%). The test characteristics of the electronic algorithm (compared to gold standard chart review) for identification of inappropriate antibiotic prescribing were sensitivity (100%, 23 of 23); specificity (99%, 422 of 427); positive predictive value (82%, 23 of 28); and negative predictive value (100%, 422 of 422). Conclusions: For children with pharyngitis, an electronic algorithm for identification of inappropriate antibiotic prescribing is highly accurate. Future work should validate this approach in other settings and develop and evaluate the impact of an audit and feedback intervention based on this tool.
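As an illustration of the kind of rule-based logic such an electronic algorithm applies, the sketch below flags a pharyngitis encounter as inappropriate prescribing unless there is a positive streptococcal test, a first-line agent (or documented β-lactam allergy), and a 10-day course; the field names and helper function are hypothetical, not the study's implementation.

```python
# Hedged sketch of rule-based appropriateness logic for a pharyngitis encounter, mirroring the
# criteria named in the abstract. Field names and this helper are illustrative assumptions.
FIRST_LINE = {"amoxicillin", "penicillin"}

def prescribing_inappropriate(encounter: dict) -> bool:
    """Flags an antibiotic prescription that fails the appropriateness criteria."""
    if not encounter["antibiotic_prescribed"]:
        return False                                   # no prescription to flag
    if not encounter["positive_strep_test"]:
        return True                                    # antibiotic without a positive strep test
    drug_ok = (encounter["drug"] in FIRST_LINE) or encounter["beta_lactam_allergy"]
    return not (drug_ok and encounter["duration_days"] == 10)

example = {"antibiotic_prescribed": True, "positive_strep_test": True,
           "drug": "amoxicillin", "beta_lactam_allergy": False, "duration_days": 10}
print(prescribing_inappropriate(example))   # False: meets all three criteria
```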
Background: Antibiotic resistance has increased at alarming rates, driven predominantly by antibiotic overuse. Although most antibiotic use occurs in outpatients, antimicrobial stewardship programs have primarily focused on inpatient settings. A major challenge for outpatient stewardship is the lack of accurate and accessible electronic data to target interventions. We sought to develop and validate an electronic algorithm to identify inappropriate antibiotic use for outpatients with acute bronchitis. Methods: This study was conducted within the University of Pennsylvania Health System (UPHS). We used ICD-10 diagnostic codes to identify encounters for acute bronchitis at any outpatient UPHS practice between March 15, 2017, and March 14, 2018. Exclusion criteria included underlying immunocompromising condition, other comorbidity influencing the need for antibiotics (eg, emphysema), or ICD-10 code at the same visit for a concurrent infection (eg, sinusitis). We randomly selected 300 (150 from academic practices and 150 from nonacademic practices) eligible subjects for detailed chart abstraction that assessed patient demographics and practice and prescriber characteristics. Appropriateness of antibiotic use based on chart review served as the gold standard for assessment of the electronic algorithm. Because antibiotic use is not indicated for this study population, appropriateness was assessed based upon whether an antibiotic was prescribed or not. Results: Of 300 subjects, median age was 61 years (interquartile range, 50–68), 62% were women, 74% were seen in internal medicine (vs family medicine) practices, and 75% were seen by a physician (vs an advanced practice provider). On chart review, 167 (56%) subjects received an antibiotic. Of these subjects, 1 had documented concern for pertussis and 4 had excluding conditions for which there were no ICD-10 codes. One received an antibiotic prescription for a planned dental procedure. Thus, based on chart review, 161 (54%) subjects received antibiotics inappropriately. Using the electronic algorithm based on diagnostic codes, underlying and concurrent conditions, and prescribing data, the number of subjects with inappropriate prescribing was 170 (56%) because 3 subjects had antibiotic prescribing not noted based on chart review. The test characteristics of the electronic algorithm (compared to gold standard chart review) for identification of inappropriate antibiotic prescribing were the following: sensitivity, 100% (161 of 161); specificity, 94% (130 of 139); positive predictive value, 95% (161 of 170); and negative predictive value, 100% (130 of 130). Conclusions: For outpatients with acute bronchitis, an electronic algorithm for identification of inappropriate antibiotic prescribing is highly accurate. This algorithm could be used to efficiently assess prescribing among practices and individual clinicians. The impact of interventions based on this algorithm should be tested in future studies.
Short-term hunter-gatherer residential camps have been a central feature of human settlement patterns and social structure for most of human evolutionary history. Recent analyses of ethnohistoric hunter-gatherer data show that across different environments, the average size of hunter-gatherer bands is remarkably constant and that bands are commonly formed by a small number of coresident families. Using ethnoarchaeological data, we examine the relationship between the physical infrastructure of camps and their social organization. We compiled a dataset of 263 ethnoarchaeologically observed hunter-gatherer camps from 13 studies in the literature. We focus on both the scale of camps, or their average size, structure, and composition, and the dynamics that governed their variation. Using a combination of inferential statistics and linear models, we show that the physical infrastructure of camps, measured by the number of household features, reflects the internal social organization of hunter-gatherer bands. Using scaling analyses, we then show that the variation among individual camps is related to a predictable set of dynamics between camp area, infrastructure, the number of occupants, and residence time. Moreover, the scale and dynamics that set the statistical variance in camp sizes are similar across different environments and have important implications for reconstructing prehistoric hunter-gatherer social organization and behavior from the archaeological record.
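Where the abstract refers to scaling analyses and linear models, a typical approach is a log-log regression of camp area on the number of occupants, whose slope is the scaling exponent; the sketch below shows that form with a hypothetical data file and column names.

```python
# Sketch of a power-law scaling analysis relating camp area to the number of occupants,
# fit as a log-log linear model; the data file and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

camps = pd.read_csv("ethnoarchaeological_camps.csv")    # hypothetical: one row per camp
camps["log_area"] = np.log(camps["camp_area_m2"])
camps["log_occupants"] = np.log(camps["n_occupants"])

# Slope of log(area) on log(occupants) is the scaling exponent of camp area with group size.
fit = smf.ols("log_area ~ log_occupants", data=camps).fit()
print(fit.params)
```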
From its original formulation in 1990, the International Trans-Antarctic Scientific Expedition (ITASE) has had as its primary aim the collection and interpretation of a continent-wide array of environmental parameters assembled through the coordinated efforts of scientists from several nations. ITASE offers the ground-based opportunities of traditional-style traverse travel coupled with the modern technology of GPS, crevasse-detecting radar, satellite communications and multidisciplinary research. By operating predominantly in the mode of an oversnow traverse, ITASE offers scientists the opportunity to experience the dynamic range of the Antarctic environment. ITASE also offers an important interactive venue for research similar to that afforded by oceanographic research vessels and large polar field camps, without the cost of the former or the lack of mobility of the latter. More importantly, the combination of disciplines represented by ITASE provides a unique, multidimensional (space and time) view of the ice sheet and its history. ITASE has now collected >20 000 km of snow radar, recovered more than 240 firn/ice cores (total length 7000 m), remotely penetrated to ~4000 m into the ice sheet, and sampled the atmosphere to heights of >20 km.
Complex unconformable englacial stratigraphy, including a segment of distinctive cosets of bed sequences, occurs throughout the thickness of a 3.2 MHz ice-sheet radar profile we acquired across the upper Byrd Glacier (East Antarctica) catchment. Some cosets span >10 km, are >100 m thick and are delineated by distinct horizons. At 40-90 m depth in firn, comparisons between 200 MHz and specially processed 3.2 MHz profiles reveal that the delineating horizons result from density-modified layers produced by decades to millennia of subaerial exposure, as detailed in our related paper (Part I). These comparisons, together with reflected waveforms at depth, also reveal that the modified layers retain their chemical stratification, and therefore the original unconformable surface. Two profile segments show high-amplitude transverse folds spanning much of the ice-sheet thickness. The parallel nature of most of them suggests basal sliding beneath long-term up-ice-flow accumulation zones, which we identify in satellite images as the likely sources for the cosets. The unconformable stratigraphy at depths greater than 2000 m shows that antidunal deposition and intense firn recrystallization zones have persisted for tens of thousands of years in this region of East Antarctica.
Unconformable firn stratigraphy exists throughout a 650 km long radar profile that we recorded down-flow of megadune fields in the Byrd Glacier (East Antarctica) catchment. Profile segments reveal cosets of prograding bedding sequences up to 90 m thick and with lateral, along-crest dimensions up to tens of kilometers. We profiled them in oblique section and nearly parallel to the prevailing wind. The prograding snow accumulates on broad, low windward slopes located above ice-bed depressions, which implies long-term slope stability. The apparent subglacial control implies that the accumulation progrades in balance with ice velocity, which we measured at ~30 m a⁻¹. The sequences prograde over intensely modified and recrystallized wind-glaze firn, visible in the profiles as unstratified layers and zones up to several tens of meters thick. The intense recrystallization eliminates density stratification, and the altered layers appear to thicken into a connected network. Modeling of coset formation using wind and ice flow reproduces their dimensions and morphology. However, accumulation rates well above current regional estimates and existing data for megadunes are required because of the measured ice speed and required slope stability. The consistent unconformable strata along our traverse show that coset and recrystallized morphology extend far beyond the megadune fields.
Seedlings of wheat (Triticum aestivum L.), corn (Zea mays L.), and sorghum (Sorghum vulgare Pers.) were exposed to 2-chloro-4-ethylamino-6-isopropylamino-s-triazine (atrazine) in solution culture. Following prolonged treatment, the tolerant species, corn and sorghum, accumulated leaf concentrations of unaltered atrazine which were comparable to those found in the sensitive species at the point of acute toxicity. In the sensitive species, wheat, there did not seem to be a critical leaf concentration of atrazine which was necessary to bring about acute toxicity. The leaf concentration in wheat could be raised by increasing the concentration of atrazine in the nutrient solution or by lowering the temperature at which the plants were grown. The loss of chlorophyll in sensitive species was closely related to and preceded acute toxicity symptoms.
Deposits beneath Mubwindi Swamp provide a partial record of vegetation history since at least 43,000 yr ago. We studied pollen from two cores and obtained nine radiocarbon ages from one of these cores and three radiocarbon ages from the other. Pollen deposited before and soon after the last glacial maximum represents vegetation very different from the modern vegetation of the Mubwindi Swamp catchment. Although species now associated with higher altitudes were dominant, some elements of moist lower montane forest persisted, possibly because of favorable soils or topography. The pollen data provide evidence for a late glacial montane forest refuge near Mubwindi Swamp. Moist lower montane forest became much more widespread soon after the glacial maximum. The only irrefutably Holocene sediments from Mubwindi Swamp date to the past 2500 yr. During this time, a combination of climatic and human-induced changes in vegetation can be seen in the pollen records.
Field experiments were conducted in 1996, 1997, and 1998 at Ste. Anne de Bellevue, Quebec, Canada, and in 1996 at Ottawa, Ontario, Canada, to quantify the impact of corn hybrids, differing in canopy architecture and plant spacing (plant population density and row spacing), on biomass production by transplanted and naturally occurring weeds. The treatments consisted of a factorial combination of corn type (leafy reduced stature [LRS], late-maturing big leaf [LMBL], a conventional Pioneer 3979 [P3979], and, as a control, a corn-free condition [weed monoculture]), two weed levels (low density [transplanted weeds: common lambsquarters and redroot pigweed] and high density [weedy: plots with naturally occurring weeds]), two corn population densities (normal and high), and row spacings (38 and 76 cm). At all site-years under both weed levels, the decrease in biomass production by both transplanted and naturally occurring weeds was greater due to the narrow row spacing than due to the high plant population density. The combination of narrower rows and higher population densities increased corn canopy light interception by 3 to 5%. Biomass produced by both transplanted and naturally occurring weeds was five to eight times less under the corn canopy than in the weed monoculture treatment. Generally, weed biomass production was reduced more by early-maturing hybrids (LRS and P3979) than by LMBL. Thus, hybrid selection and plant spacing could be used as important components of integrated pest management (weed control) for sustainable agriculture.
The Dry Creek archeologic site contains a stratified record of late Pleistocene human occupation in central Alaska. Four archeologic components occur within a sequence of multiple loess and sand layers which together form a 2-m cap above weathered glacial outwash. The two oldest components appear to be of late Pleistocene age and occur with the bones of extinct game animals. Geologic mapping, stratigraphic correlations, radiocarbon dating, and sediment analyses indicate that the basal loess units formed part of a widespread blanket that was associated with an arctic steppe environment and with stream aggradation during waning phases of the last major glaciation of the Alaska Range. These basal loess beds contain artifacts for which radiocarbon dates and typologic correlations suggest a time range of perhaps 12,000–9000 yr ago. A long subsequent episode of cultural sterility was associated with waning loess deposition and development of a cryoturbated tundra soil above shallow permafrost. Sand deposition from local source areas predominated during the middle and late Holocene, and buried Subarctic Brown Soils indicate that a forest fringe developed on bluff-edge sand sheets along Dry Creek. The youngest archeologic component, which is associated with the deepest forest soil, indicates intermittent human occupation of the site between about 4700 and 3400 ¹⁴C yr BP.