Geoarchaeological research as part of the AHRC-funded Living with Monuments (LwM) project investigated the upper Kennet river system across the Avebury World Heritage landscape. The results demonstrate that in the early–mid-Holocene (c. 9500–1000 bc) there was very low erosion of disturbed soils into the floodplain, with floodplain deposits confined to a naturally forming bedload fluvial deposit aggrading in a shallow channel of inter-linked deeper pools. At the time of Neolithic monument building in the 4th–early 3rd millennium bc, the river was wide and shallow with areas of presumed braid plain. Between c. 4000 and 1000 bc, a human-induced signature of soil erosion became a minor component of fluvial sedimentation in the Kennet palaeo-channel, but it was small in scale and localised. This suggests that there was little widespread woodland removal associated with Neolithic farming and monument building, despite the evidently large timber requirements for Neolithic sites like the West Kennet palisade enclosures. Consequently, human disturbance of the hinterland and valley slopes remained relatively light over the longue durée until the later Bronze Age/Early Iron Age, with a predominance of pasture over arable land. Rather than large Neolithic monument complexes being constructed within woodland clearings representing ancestral and sacred spaces, the substantially more open landscape provided a suitable setting in which areas of sarsen spreads were potentially easily visible. During the period c. 3000–1000 bc, the sediment load within the channel slowly increased, with alluvial deposition of increasingly humic silty clays across the valley floor. However, this represents only small-scale landscape disturbance. It is from the Late Bronze Age–Early Iron Age that the anthropogenic signal of human-driven alluviation becomes dominant and overtakes the bedload fluvial signal across the floodplain, with localised colluvial deposits on the floodplain margins. Subsequently, the alluvial archive describes more extensive human impact across this landscape, including the disturbance of loessic-rich soils in the catchment. The deposition of floodplain-wide alluvium continues throughout the Roman, medieval, and post-medieval periods, correlating with the development of a low-flow, single channel, with alluvial sediments describing decreasing energy in the depositional environment.
Few studies have examined the impact of late-life depression trajectories on specific domains of cognitive function. This study aims to delineate how different depressive symptom trajectories specifically affect cognitive function in older adults.
Design:
Prospective longitudinal cohort study
Setting:
Australia and the United States of America
Participants:
In total, 11,035 community-dwelling older adults with a mean age of 75 years
Measurements:
Depressive trajectories were modelled from depressive symptoms according to annual Centre for Epidemiological Studies Depression Scale 10 (CES-D-10) surveys. Four trajectories of depressive symptoms were identified: low (“nondepressed”), consistently mild (“subthreshold depression”), consistently moderate (“persistent depression”), and initially low but increasing (“emerging depression”). Global cognition (Modified Mini-Mental State Examination [3MS]), verbal fluency (Controlled Oral Word Association Test [COWAT]), processing speed (Symbol Digit Modalities Test [SDMT]), episodic memory (Hopkins Verbal Learning Test – Revised [HVLT-R]), and a composite z-score were assessed over a subsequent median of 2 years.
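The abstract does not specify how the composite z-score was constructed; a common approach, and the assumption in the sketch below, is to standardise each test against a reference mean and SD and average the results with equal weights. All values and norms in the example are hypothetical and for illustration only.

```python
import numpy as np

def composite_z(scores, norm_means, norm_sds):
    """Equal-weight composite z-score across cognitive tests.

    `scores`, `norm_means`, and `norm_sds` are dicts keyed by test name.
    Equal weighting of standardised scores is an assumption; the abstract
    does not state how the composite was formed.
    """
    z = [(scores[t] - norm_means[t]) / norm_sds[t] for t in scores]
    return float(np.mean(z))

# Hypothetical values for illustration only (not study data or norms)
scores = {"3MS": 94, "COWAT": 38, "SDMT": 42, "HVLT-R": 25}
norm_means = {"3MS": 91, "COWAT": 36, "SDMT": 45, "HVLT-R": 26}
norm_sds = {"3MS": 6, "COWAT": 10, "SDMT": 9, "HVLT-R": 5}
print(composite_z(scores, norm_means, norm_sds))
```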
Results:
Subthreshold depression predicted impaired performance on the SDMT (Cohen’s d −0.04) and composite score (−0.03); emerging depression predicted impaired performance on the SDMT (−0.13), HVLT-R (−0.09), 3MS (−0.08), and composite score (−0.09); and persistent depression predicted impaired performance on the SDMT (−0.08), 3MS (−0.11), and composite score (−0.09).
Conclusions:
Depressive symptoms are associated with later impaired processing speed. These effects are small. Diverse depression trajectories have different impacts on cognitive function.
The personalised oncology paradigm remains challenging to deliver despite technological advances in genomics-based identification of actionable variants, combined with the increasing focus of drug development on these specific targets. To ensure we continue to build concerted momentum to improve outcomes across all cancer types, financial, technological and operational barriers need to be addressed. For example, complete integration and certification of the ‘molecular tumour board’ into ‘standard of care’ ensures a unified clinical decision pathway that both counteracts fragmentation and is the cornerstone of evidence-based delivery inside and outside of a research setting. Generally, integrated delivery has been restricted to specific (common) cancer types, either within major cancer centres or small regional networks. Here, we focus on solutions for real-world integration of genomics, pathology, surgery, oncological treatments, data from clinical source systems and analysis of whole-body imaging as digital data that can facilitate cost-effectiveness analysis, clinical trial recruitment, and outcome assessment. This urgent imperative for cancer also extends across early diagnosis and adjuvant treatment interventions, individualised cancer vaccines, immune cell therapies, personalised synthetic lethal therapeutics, and cancer screening and prevention. Oncology care systems worldwide require proactive step-changes in solutions, including interoperable digital working, that can solve patient-centred challenges and ensure inclusive, quality, sustainable, fair and cost-effective adoption and efficient delivery. Here we highlight the workforce, technical, clinical, regulatory and economic challenges that prevent the implementation of precision oncology at scale, and offer a systematic roadmap of integrated solutions for standard of care based on minimal essential digital tools. These include unified decision support tools, quality control, data flows within an ethical and legal data framework, training and certification, monitoring and feedback. Bridging the technical, operational, regulatory and economic gaps demands joint action from public and industry stakeholders across national and global boundaries.
White matter hyperintensity (WMH) burden is greater, has a frontal-temporal distribution, and is associated with proxies of exposure to repetitive head impacts (RHI) in former American football players. These findings suggest that in the context of RHI, WMH might have unique etiologies that extend beyond those of vascular risk factors and normal aging processes. The objective of this study was to evaluate the correlates of WMH in former elite American football players. We examined markers of amyloid, tau, neurodegeneration, inflammation, axonal injury, and vascular health and their relationships to WMH. A group of age-matched asymptomatic men without a history of RHI was included to determine the specificity of the relationships observed in the former football players.
Participants and Methods:
240 male participants aged 45-74 (60 unexposed asymptomatic men, 60 male former college football players, 120 male former professional football players) underwent semi-structured clinical interviews, magnetic resonance imaging (structural T1, T2 FLAIR, and diffusion tensor imaging), and lumbar puncture to collect cerebrospinal fluid (CSF) biomarkers as part of the DIAGNOSE CTE Research Project. Total WMH lesion volumes (TLV) were estimated using the Lesion Prediction Algorithm from the Lesion Segmentation Toolbox. Structural equation modeling, using Full-Information Maximum Likelihood (FIML) to account for missing values, examined the associations between log-TLV and the following variables: total cortical thickness, whole-brain average fractional anisotropy (FA), CSF amyloid β42, CSF p-tau181, CSF sTREM2 (a marker of microglial activation), CSF neurofilament light (NfL), and the modified Framingham stroke risk profile (rFSRP). Covariates included age, race, education, APOE ε4 carrier status, and evaluation site. Bootstrapped 95% confidence intervals assessed statistical significance. Models were performed separately for football players (college and professional players pooled; n=180) and the unexposed men (n=60). Due to differences in sample size, estimates were compared and were considered different if the percent change in the estimates exceeded 10%.
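The Methods state that group estimates were compared and considered different when the percent change exceeded 10%. The minimal sketch below illustrates that rule, assuming the percent change is computed relative to the unexposed-group estimate; recomputing from the rounded coefficients reported in the abstract therefore only approximates the published percentages.

```python
def percent_change(b_control, b_football):
    """Percent change in a path estimate relative to the unexposed (control)
    group estimate. The reference group is an assumption consistent with the
    percentages reported in the Results."""
    return (b_football - b_control) / b_control * 100.0

def estimates_differ(b_control, b_football, threshold=10.0):
    """Apply the >10% percent-change rule described in the Methods."""
    return abs(percent_change(b_control, b_football)) > threshold

# Rounded coefficients from the abstract (average FA and cortical thickness);
# results approximate the published 802% and 7% figures.
for name, (bc, bf) in {"average FA": (-0.03, -0.27),
                       "cortical thickness": (-0.27, -0.25)}.items():
    print(name, round(percent_change(bc, bf), 1), estimates_differ(bc, bf))
```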
Results:
In the former football players (mean age=57.2, 34% Black, 29% APOE ε4 carriers), reduced cortical thickness (B=-0.25, 95% CI [-0.45, -0.08]), lower average FA (B=-0.27, 95% CI [-0.41, -0.12]), higher p-tau181 (B=0.17, 95% CI [0.02, 0.43]), and higher rFSRP score (B=0.27, 95% CI [0.08, 0.42]) were associated with greater log-TLV. Compared to the unexposed men, substantial differences in estimates were observed for rFSRP (Bcontrol=0.02, Bfootball=0.27, 994% difference), average FA (Bcontrol=-0.03, Bfootball=-0.27, 802% difference), and p-tau181 (Bcontrol=-0.31, Bfootball=0.17, -155% difference). In the former football players, rFSRP showed a stronger positive association and average FA showed a stronger negative association with WMH compared to unexposed men. The effect of WMH on cortical thickness was similar between the two groups (Bcontrol=-0.27, Bfootball=-0.25, 7% difference).
Conclusions:
These results suggest that the risk factors and biological correlates of WMH differ between former American football players and asymptomatic individuals unexposed to RHI. In addition to vascular risk factors, white matter integrity on DTI showed a stronger relationship with WMH burden in the former football players. FLAIR WMH serves as a promising measure to further investigate the late multifactorial pathologies of RHI.
Individuals living with HIV may experience cognitive difficulties or marked declines known as HIV-Associated Neurocognitive Disorder (HAND). Cognitive difficulties have been associated with worse outcomes for people living with HIV; therefore, accurate cognitive screening and identification is critical. One potentially sensitive but underutilized marker of cognitive impairment is intra-individual variability (IIV). Cognitive IIV is the dispersion of scores across tasks in a neuropsychological assessment. In individuals living with HIV, greater cognitive IIV has been associated with cortical atrophy, poorer cognitive functioning, more rapid cognitive decline, and greater difficulties in daily functioning. Studies examining the use of IIV in clinical neuropsychological testing are limited, and few have examined IIV in the context of a single neuropsychological battery designed for culturally diverse or at-risk populations. To address these gaps, this study aimed to examine IIV profiles of individuals living with HIV and/or who inject drugs, utilizing the Neuropsi, a standardized neuropsychological instrument for Spanish-speaking populations.
Participants and Methods:
Spanish-speaking adults residing in Puerto Rico (n=90) who are HIV positive and who inject drugs (HIV+I), HIV negative and who inject drugs (HIV-I), HIV positive and who do not inject drugs (HIV+), or healthy controls (HC) completed the Neuropsi battery as part of a larger research protocol. The Neuropsi produces 3 index scores representing the cognitive domains of memory, attention/memory, and attention/executive functioning. Total battery and within-index IIV were calculated by dividing the standard deviation of T-scores by mean performance, resulting in a coefficient of variation (CoV). Group differences in overall test battery mean CoV (OTBMCoV) were investigated. To examine unique profiles of index-specific IIV, a cluster analysis was performed for each group.
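Because the CoV computation is stated directly (the standard deviation of T-scores divided by the mean), a minimal sketch is given below; the T-scores are hypothetical and only illustrate the dispersion calculation for a single participant.

```python
import statistics

def cov(t_scores):
    """Coefficient of variation: SD of subtest T-scores divided by their
    mean, the IIV (dispersion) measure described in the Methods."""
    return statistics.stdev(t_scores) / statistics.mean(t_scores)

# Hypothetical T-scores across Neuropsi subtests for one participant
battery_t_scores = [48, 55, 39, 62, 50, 44]
print(round(cov(battery_t_scores), 3))  # total-battery CoV for this participant
```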
Results:
Results of a one-way ANOVA indicated significant between group differences on OTBMCoV (F[3,86]=6.54, p<.001). Post-hoc analyses revealed that HIV+I (M=.55, SE=.07, p=.003), HIV-I (M=.50, SE=.03, p=.001), and HIV+ (M=.48, SE=.02, p=.002) had greater OTBMCoV than the HC group (M=.30, SE=.02). To better understand sources of IIV within each group, cluster analysis of index specific IIV was conducted. For the HIV+ group, 3 distinct clusters were extracted: 1. High IIV in attention/memory and attention/executive functioning (n=3, 8%); 2. Elevated memory IIV (n=21, 52%); 3. Low IIV across all indices (n=16, 40%). For the HIV-I group, 2 distinct clusters were extracted: 1. High IIV across all 3 indices (n=7, 24%) and 2. Low IIV across all 3 indices (n=22, 76%). For the HC group, 3 distinct clusters were extracted: 1. Very low IIV across all 3 indices (n=5, 36%); 2. Elevated memory IIV (n=6, 43%); 3. Elevated attention/executive functioning IIV with very low attention/memory and memory IIV (n=3, 21%). Sample size of the HIV+I group was insufficient to extract clusters.
Conclusions:
Current findings support IIV in the Neuropsi test battery as a clinically sensitive marker of cognitive impairment in Spanish-speaking individuals living with HIV or who inject drugs. Furthermore, the distinct IIV cluster types identified between groups can help to better understand specific sources of variability. Implications for clinical assessment, prognosis, and etiological considerations are discussed.
Injection drug use is a significant public health crisis with adverse health outcomes, including increased risk of human immunodeficiency virus (HIV) infection. Comorbidity of HIV and injection drug use is highly prevalent in the United States and disproportionately elevated in surrounding territories such as Puerto Rico. While both HIV status and injection drug use are independently known to be associated with cognitive deficits, the interaction of these effects remains largely unknown. The aim of this study was to determine how HIV status and injection drug use are related to cognitive functioning in a group of Puerto Rican participants. Additionally, we investigated the degree to which type and frequency of substance use predict cognitive abilities.
Participants and Methods:
96 Puerto Rican adults completed the Neuropsi Attention and Memory-3rd Edition battery for Spanish-speaking participants. Injection substance use over the previous 12 months was also obtained via clinical interview. Participants were categorized into four groups based on HIV status and injection substance use in the last 30 days (HIV+/injector, HIV+/non-injector, HIV-/injector, HIV-/non-injector). One-way analysis of variance (ANOVA) was conducted to determine differences between groups on each index of the Neuropsi battery (Attention and Executive Function; Memory; Attention and Memory). Multiple linear regression was used to determine whether type and frequency of substance use predicted performance on these indices while considering HIV status.
Results:
The one-way ANOVAs revealed significant differences (p’s < 0.01) between the healthy control group and all other groups across all indices. No significant differences were observed between the other groups. Injection drug use, regardless of the substance, was associated with lower combined attention and memory performance compared to those who inject less than monthly (Monthly: p = 0.04; 2-3x daily: p < 0.01; 4-7x daily: p = 0.02; 8+ times daily: p < 0.01). Both minimal and heavy daily use predicted poorer memory performance (p = 0.02 and p = 0.01, respectively). Heavy heroin use predicted poorer attention and executive functioning (p = 0.04). Heroin use also predicted lower performance on tests of memory when used monthly (p = 0.049), and daily or almost daily (2-6x weekly: p = 0.04; 4-7x daily: p = 0.04). Finally, moderate injection of heroin predicted lower scores on attention and memory (Weekly: p = 0.04; 2-6x weekly: p = 0.048). Heavy combined heroin and cocaine use predicted worse memory performance (p = 0.03) and combined attention and memory (p = 0.046). HIV status was not a moderating factor in any circumstance.
Conclusions:
As predicted, residents of Puerto Rico who do not inject substances and are HIV-negative performed better in the domains of memory, attention, and executive function than those living with HIV and/or who inject substances. There was no significant difference among the affected groups in cognitive ability. As expected, daily injection of substances predicted worse performance on tasks of memory. Heavy heroin use predicted worse performance on executive function and memory tasks, while heroin-only and combined heroin and cocaine use predicted worse memory performance. Overall, the type and frequency of substance use is more predictive of cognitive functioning than HIV status.
Now that racism has been officially recognized in Brazil, and some universities have adopted affirmative-action admission policies, measures of the magnitude of racial inequality and analyses that identify the factors associated with changes in racial disparities over time assume particular relevance to the conduct of public debate. This study uses census data from 1950 to 2000 to estimate the probability of death in the early years of life, a robust indicator of the standard of living among the white and Afro-Brazilian populations. Associated estimates of the average number of years of life expectancy at birth show that the 6.6-year advantage that the white population enjoyed in the 1950s remained virtually unchanged throughout the second half of the twentieth century, despite the significant improvements that accrued to both racial groups. The application of multivariate techniques to samples selected from the 1960, 1980, and 2000 census enumerations further shows that, controlling for key determinants of child survival, the white mortality advantage persisted and even increased somewhat in 2000. The article discusses evidence of continued racial inequality during an era of deep transformation in social structure, with reference to the challenges of skin color classification in a multiracial society and the evolution of debates about color, class, and discrimination in Brazil.
Novel approaches are needed to understand and disrupt Mycobacterium tuberculosis transmission. In this proof-of-concept study, we investigated the use of environmental air sampling to detect and quantify M. tuberculosis in different clinic settings in a high-burden area.
Design:
Cross-sectional, environmental sampling.
Setting:
Primary-care clinic.
Methods:
A portable, high-flow dry filter unit (DFU) was used to draw air through polyester felt filters for 2 hours. Samples were collected in the waiting area and TB room of a primary care clinic. Controls included sterile filters placed directly into collection tubes at the DFU sampling site, and filter samplings performed outdoors. DNA was extracted from the filters, and droplet digital polymerase chain reaction (ddPCR) was used to quantify M. tuberculosis DNA copies. Carbon dioxide (CO2) data loggers captured CO2 concentrations in the sampled areas.
Results:
The median sampling time was 123 minutes (interquartile range [IQR], 121–126). A median of 121 (IQR, 35–243) M. tuberculosis DNA copies were obtained from 74 clinic samplings, compared to a median of 3 (IQR, 1–33; P < .001) obtained from 47 controls. At a threshold of 320 DNA copies, specificity was 100%, and 18% of clinic samples would be classified as positive.
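The Results report specificity and the fraction of clinic samples classified as positive at a 320-copy threshold. The sketch below shows how such a rule could be evaluated; the copy counts are hypothetical and are not the study data.

```python
def evaluate_threshold(clinic_copies, control_copies, threshold=320):
    """Specificity among controls and the fraction of clinic samples called
    positive at a given M. tuberculosis DNA copy threshold. Counting samples
    at or above the threshold as positive is an assumption."""
    specificity = sum(c < threshold for c in control_copies) / len(control_copies)
    positive_fraction = sum(c >= threshold for c in clinic_copies) / len(clinic_copies)
    return specificity, positive_fraction

# Hypothetical copy counts for illustration only
clinic = [121, 35, 243, 410, 980, 15, 330, 72, 60, 12]
controls = [3, 1, 33, 0, 8, 5, 2, 10]
print(evaluate_threshold(clinic, controls))
```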
Conclusions:
This proof-of-concept study suggests the potential for airborne M. tuberculosis detection, based on M. tuberculosis DNA copy yield, to enable the identification of high-risk transmission locations. Further optimization of the M. tuberculosis extraction technique and ddPCR data analysis would improve detection and enable robust interpretation of these data.
Background: Healthcare-associated infections (HAIs) represent an ongoing problem for all clinics. Children’s clinics have waiting rooms that include toys and activities to entertain children, possibly representing reservoirs for HAIs. This study focuses on a newly constructed children’s outpatient clinic associated with a teaching hospital. We studied waiting room bacterial colonization of floors and play devices from the last phase of construction through 6 months of clinical use. Methods: Waiting room areas on the first 2 floors of the facility were studied due to high patient volume in those areas. In total, 16 locations were sampled: 11 on floors and 5 on play items. Using sterile double-transport swabs, all locations were sampled on 5 separate occasions over 2 months during the last phase of construction and 13 times over 6 months after the clinic was opened. After collection, swabs were placed on ice, transported to a microbiology lab, and used to inoculate Hardy Diagnostics Cdiff Banana Broth (for Clostridium difficile - Cdiff), CHROM MRSA agar (for methicillin-resistant Staphylococcus aureus - MRSA), Pseudomonas isolation agar (for Pseudomonas spp and P. aeruginosa), and tryptic soy agar to detect Bacillus spp. Media were incubated for 48 hours at 37°C and were scored for bacterial presence based on observation of colonies or change in the medium. Results: During the construction phase, waiting-room-floor bacterial colonies were dominated by Bacillus spp, and first-floor waiting rooms had nearly 7 times more colonies than those on the second floor (P < .05). A similar pattern was observed for C. difficile and MRSA. No Pseudomonas spp were observed during construction. Once patients were present, Bacillus spp contamination dropped for the first floor but increased for the second floor. All other bacterial types (C. difficile, MRSA, Pseudomonas spp, and P. aeruginosa) increased on the second floor after the clinic opened (eg, from 23% to 42% for C. difficile and from 7% to 46% for MRSA; P < .05). The play devices showed small increases in bacterial load after clinic opening, most notably Pseudomonas spp. Conclusions: This study provides evidence that a shift from bacterial species associated with soil (eg, Bacillus spp) toward species commonly associated with humans occurred in waiting rooms after construction in this children’s outpatient clinic. Increases for MRSA, Pseudomonas spp, and P. aeruginosa were linked to patient presence. These data suggest that patients, their families, and clinic staff transport bacteria into clinic waiting rooms. This outpatient clinic environmental contamination may increase the potential for HAIs and may represent a target for intervention.
Background: The bacteria that inhabit outpatient healthcare facilities influence patient outcomes and recovery, although the diversity and quantity of these bacterial communities is largely unknown. Whether differences in bacterial presence exist in individual medical specialty units of an outpatient clinic is also largely unknown. The purpose of this study was to compare bacterial species found in the general medicine and pulmonary units of an outpatient children’s clinic associated with a teaching hospital. Methods: In total, 6 locations (4 floor sites, counters, air ducts) were sampled in 3 rooms in the pulmonary (PUL) unit and 3 rooms in the general medicine (GM) unit on 13 days over a 6-month period. Sterile double transport swabs were utilized, transported on ice to a microbiology lab, and used to inoculate Hardy Diagnostics Cdiff Banana Broth (for Clostridium difficile), CHROM MRSA agar (for methicillin-resistant Staphylococcus aureus [MRSA]), eosin methylene blue (Levine-type, for Lac+ gram negatives [GN]), and Pseudomonas isolation agar (for Pseudomonas spp and P. aeruginosa [PS and PSA]). Media were incubated for 48 hours at 37°C and were scored for bacterial presence based on colonial observation. Results: The presence of bacteria isolated from GM and PUL units differed by species and location. Based on the percentage of positive swabs, the presence of GN was widespread in both units (Fig 1). Additionally, bacterial presence was greatest on the floors (GN ranged from 72% to 85% on floors in the 2 units), whereas counters had fewer positive swabs (GN ranged from 23% to 38% on counters), and swabs from return air ducts rarely led to bacterial growth. The 1 case in which swabs from the PUL unit resulted in higher levels of bacterial growth than for the GM unit was for PSA (GM, 8%; PUL, 13%). C. difficile detection was the same on both units (ie, 35% of floor samples showed contamination). Conclusions: The levels of environmental bacterial presence observed for these clinic units differed in some cases by unit and ranged from not detectable to very high levels. Detection of C. difficile on 35% of floor samples in both units could be problematic. Additionally, for the PUL unit, contamination of 13% of floor samples by PSA should raise concerns because many patients in this clinic have cystic fibrosis (CF). Although many CF patients are colonized by PSA, others may potentially contract an infection by this pathogen from the clinical environment. This observation supports current infection control recommendations for CF patients in outpatient settings.
Good education requires student experiences that deliver lessons about practice as well as theory and that encourage students to work for the public good—especially in the operation of democratic institutions (Dewey 1923; Dewey 1938). We report on an evaluation of the pedagogical value of a research project involving 23 colleges and universities across the country. Faculty trained and supervised students who observed polling places in the 2016 General Election. Our findings indicate that this was a valuable learning experience in both the short and long terms. Students found their experiences to be valuable and reported learning both generally and specifically related to course material. Postelection, they also felt more knowledgeable about election science topics, voting behavior, and research methods. Students reported interest in participating in similar research in the future, would recommend that other students do so, and expressed interest in more learning and research about the topics central to their experience. Our results suggest that participants appreciated the importance of elections and their study. Collectively, the participating students are engaged and efficacious—essential qualities of citizens in a democracy.
The use of monthly intranasal mupirocin was associated with a significant reduction in the rate of methicillin-resistant Staphylococcus aureus transmission and Staphylococcus aureus invasive infection in a large neonatal intensive care unit. Resistance to mupirocin emerged over time, but it was rare and was not associated with adverse clinical outcomes.
Field experiments were conducted in Alabama during 1999 and 2000 to test the hypothesis that any glyphosate-induced yield suppression in glyphosate-resistant cotton would be less with irrigation than without irrigation. Yield compensation was monitored by observing alterations in plant growth and fruiting patterns. Glyphosate treatments included a nontreated control, 1.12 kg ai/ha applied postemergence (POST) at the 4-leaf stage, 1.12 kg/ha applied postemergence-directed (DIR) at the prebloom stage, and 1.12 kg/ha applied POST at the 4-leaf stage plus DIR at the prebloom stage. The second variable, irrigation treatment, was established by irrigating plots individually with overhead sprinklers or maintaining them under dryland, nonirrigated conditions. Cotton yield and all measured parameters, including lint quality, were positively affected by irrigation. Irrigation increased yield 52% compared to nonirrigated cotton. Yield and fiber quality effects were independent of glyphosate treatments. Neither yield nor any of the measured variables that reflected whole-plant response were influenced by glyphosate treatment or by a glyphosate by irrigation interaction.
The American experience with income taxes began with the Act of 1913, but it was not until 1921 that capital gains were identified separately and taxed differently from other sources of income. This fundamental revision of the income tax was justified on equity grounds; proponents of the change argued that it was unfair to tax income accrued over many years in the year that income was realized [6, p. 192].
The present capital gains provisions have been, and are now, strongly attacked alternately as being too lenient and too strict [5, p. 184]. Unfortunately, this symmetry in the opposition to present capital gains treatment does not imply that the present provisions are near the optimum.
Produce growers in Kentucky, North Carolina, and Tennessee were surveyed in 2002 to gather information about their decision making in the areas of planting, postharvest handling, marketing, and expected changes. North Carolina had proportionately more respondents with large operations, while Kentucky and Tennessee were more similar to each other, with respondents concentrated on smaller farms. Tennessee and Kentucky respondents were less likely to have engaged in activities associated with the commercial distribution system. Greater reliance on the commercial distribution system on the part of North Carolina growers is consistent with more produce export activity.
Among the familiar sights crowding the landscape of English history from the dooms of Ine to that crown plucked from a hawthorn bush at Bosworth, none is more deeply cherished than the crisis of 1297 and the “Confirmation of the Charters” to which it gave rise. For, despite all the sharp differences over detail that the documentation for this crisis has engendered, scholars have shown remarkable agreement in seeing it as the one defeat suffered by Edward I in a long and notably successful reign. And to that defeat they have attributed great constitutional significance. Stubbs set the pattern, calling the result “singularly in harmony with what seems from history and experience to be the natural direction of English progress,” and Wilkinson is only one among the many who have recently elaborated on that theme:
The crisis of 1297 … placed a definite check on the tendencies which Edward I had shown, to ignore the deep principles of the constitution under stress of the necessities which confronted the nation … It was a landmark in the advance of the knights … toward political maturity. It helped to establish the tradition of co-operation and political alliance between the knights and the magnates, on which a good deal of the political future of England was to depend …. What the opposition achieved, in 1297, was a great vindication of the ancient political principle of government by consent ….