War, captives, and human sacrifice were part of Late Postclassic (AD 1250–1524) Maya culture in highland Guatemala. Las Casas (1958:152) wrote that the supreme lord “put the heads of the sacrificed on some poles on a certain altar dedicated only to this, where they had these for some time, after which they buried them.” These practices are reflected in human remains excavated at Iximche’, the Kaqchikel Maya capital. Here, we integrate previously published and unpublished results of stable isotope analyses and explore their implications for the diets and geographic origins of individuals buried at the site on the eve of the Spanish conquest. Data from Iximche’ are compared with available results from other ancient Maya sites.
Diagnosis in psychiatry faces familiar challenges. Validity and utility remain elusive, and confusion regarding the fluid and arbitrary border between mental health and illness is increasing. The mainstream strategy has been conservative and iterative, retaining current nosology until something better emerges. However, this has led to stagnation. New conceptual frameworks are urgently required to catalyze a genuine paradigm shift.
Methods
We outline candidate strategies that could pave the way for such a paradigm shift. These include the Research Domain Criteria (RDoC), the Hierarchical Taxonomy of Psychopathology (HiTOP), and Clinical Staging, which all promote a blend of dimensional and categorical approaches.
Results
These alternative, still-heuristic transdiagnostic models provide varying levels of clinical and research utility. RDoC was intended to provide a framework to reorient research beyond the constraints of DSM. HiTOP began as a nosology derived from statistical methods and is now pursuing clinical utility. Clinical Staging aims both to expand the scope and to refine the utility of diagnosis by including the dimension of timing. None is yet fit for purpose, but they are relatively complementary, and it may be possible for them to operate as an ecosystem. Time will tell whether they have the capacity, singly or jointly, to deliver a paradigm shift.
Conclusions
Several heuristic models have been developed that separately or synergistically build infrastructure to enable new transdiagnostic research to define the structure, development, and mechanisms of mental disorders, to guide treatment and better meet the needs of patients, policymakers, and society.
Changing practice patterns caused by the pandemic have created an urgent need for guidance on prescribing stimulants via telepsychiatry for attention-deficit hyperactivity disorder (ADHD). A notable spike in stimulant prescribing accompanied the suspension of the Ryan Haight Act, which allowed stimulants to be prescribed without a face-to-face meeting. Competing forces both for and against prescribing ADHD stimulants by telepsychiatry have emerged, requiring guidelines to balance these factors. On the one hand, factors weighing in favor of increasing the availability of treatment for ADHD via telepsychiatry include enhanced access to care, reduction in the large number of untreated cases, and prevention of the known adverse outcomes of untreated ADHD. On the other hand, factors in favor of limiting telepsychiatry for ADHD include mitigating the possibility of exploiting telepsychiatry for profit or for misuse, abuse, and diversion of stimulants. This Expert Consensus Group has developed numerous specific guidelines and advocates for some flexibility so that telepsychiatry evaluation and treatment may continue without an in-person evaluation. These guidelines also recognize the need for greater scrutiny of certain subpopulations, such as young adults without a prior diagnosis or treatment of ADHD who request immediate-release stimulants, a presentation that should raise suspicion of possible medication diversion, misuse, or abuse. In such cases, nonstimulants, controlled-release stimulants, or psychosocial interventions should be prioritized. We encourage the use of outside informants to support the history, the use of rating scales, and access to a hybrid model of both in-person and remote treatment.
Anterior temporal lobectomy is a common surgical approach for medication-resistant temporal lobe epilepsy (TLE). Prior studies have shown inconsistent findings regarding the utility of presurgical intracarotid sodium amobarbital testing (IAT; also known as the Wada test) and neuroimaging in predicting postoperative seizure control. In the present study, we evaluated the predictive utility of IAT, as well as structural magnetic resonance imaging (MRI) and positron emission tomography (PET), on long-term (3-year) seizure outcome following surgery for TLE.
Participants and Methods:
Patients consisted of 107 adults (mean age = 38.6, SD = 12.2; mean education = 13.3 years, SD = 2.0; female = 47.7%; White = 100%) with TLE (mean epilepsy duration = 23.0 years, SD = 15.7; left TLE surgery = 50.5%). We examined whether demographic, clinical (side of resection, resection type [selective vs. non-selective], hemisphere of language dominance, epilepsy duration), and presurgical studies (normal vs. abnormal MRI, normal vs. abnormal PET, correctly vs. incorrectly lateralizing IAT) were associated with absolute (cross-sectional) seizure outcome (i.e., freedom vs. recurrence), using a series of chi-squared and t-tests. Additionally, we determined whether presurgical evaluations predicted time to seizure recurrence (longitudinal outcome) over a three-year period with univariate Cox regression models, and we compared survival curves with Mantel-Cox (log-rank) tests.
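As an illustration of the longitudinal analysis described above, the minimal sketch below fits a univariate Cox model and a Mantel-Cox (log-rank) test on made-up data; the column names, the tiny example dataset, and the use of the pandas/scipy/lifelines packages are assumptions for demonstration, not the authors' code.

    # Minimal sketch (not the authors' code): a univariate Cox model and a
    # Mantel-Cox (log-rank) test for time to seizure recurrence. The column
    # names and the tiny dataset are hypothetical.
    import pandas as pd
    from scipy.stats import chi2_contingency
    from lifelines import CoxPHFitter
    from lifelines.statistics import logrank_test

    df = pd.DataFrame({
        "months_to_recurrence": [36, 8, 36, 14, 5, 36, 22, 3],  # follow-up capped at 36 months
        "recurred":             [0,  1,  0,  1, 1,  0,  1, 1],  # 1 = seizure recurrence observed
        "pet_normal":           [0,  1,  1,  0, 1,  0,  0, 1],  # 1 = normal presurgical PET
    })

    # Cross-sectional association: chi-squared test on the 2x2 table.
    chi2, p_chi2, _, _ = chi2_contingency(pd.crosstab(df["pet_normal"], df["recurred"]))

    # Longitudinal outcome: univariate Cox regression for time to recurrence.
    cph = CoxPHFitter()
    cph.fit(df, duration_col="months_to_recurrence", event_col="recurred")
    print(cph.summary[["exp(coef)", "p"]])  # hazard ratio and p-value for pet_normal

    # Mantel-Cox (log-rank) comparison of the two survival curves.
    g1, g0 = df[df["pet_normal"] == 1], df[df["pet_normal"] == 0]
    lr = logrank_test(g1["months_to_recurrence"], g0["months_to_recurrence"],
                      event_observed_A=g1["recurred"], event_observed_B=g0["recurred"])
    print(p_chi2, lr.p_value)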
Results:
Demographic and clinical variables (including type [selective vs. whole lobectomy] and side of resection) were not associated with seizure outcome. No associations were found among the presurgical variables. Presurgical MRI was not associated with cross-sectional (OR = 1.5, p = .557, 95% CI = 0.4–5.7) or longitudinal (HR = 1.2, p = .641, 95% CI = 0.4–3.9) seizure outcome. A normal PET scan (OR = 4.8, p = .045, 95% CI = 1.0–24.3) and an IAT incorrectly lateralizing to the seizure focus (OR = 3.9, p = .018, 95% CI = 1.2–12.9) were associated with higher odds of seizure recurrence. Furthermore, a normal PET scan (HR = 3.6, p = .028, 95% CI = 1.0–13.5) and an incorrectly lateralized IAT (HR = 2.8, p = .012, 95% CI = 1.2–7.0) were presurgical predictors of earlier seizure recurrence within three years of TLE surgery. Log-rank tests indicated that survival functions differed significantly between patients with normal vs. abnormal PET and incorrectly vs. correctly lateralizing IAT, with these groups relapsing five and seven months earlier on average, respectively.
Conclusions:
A normal presurgical PET scan and an incorrectly lateralizing IAT were associated with increased risk of post-surgical seizure recurrence and a shorter time to seizure relapse.
The U.S. Department of Agriculture–Agricultural Research Service (USDA-ARS) has been a leader in weed science research covering topics ranging from the development and use of integrated weed management (IWM) tactics to basic mechanistic studies, including biotic resistance of desirable plant communities and herbicide resistance. ARS weed scientists have worked in agricultural and natural ecosystems, including agronomic and horticultural crops, pastures, forests, wild lands, aquatic habitats, wetlands, and riparian areas. Through strong partnerships with academia, state agencies, private industry, and numerous federal programs, ARS weed scientists have made contributions to discoveries in the newest fields of robotics and genetics, as well as the traditional and fundamental subjects of weed–crop competition and physiology and integration of weed control tactics and practices. Weed science at ARS is often overshadowed by other research topics; thus, few are aware of the long history of ARS weed science and its important contributions. This review is the result of a symposium held at the Weed Science Society of America’s 62nd Annual Meeting in 2022 that included 10 separate presentations in a virtual Weed Science Webinar Series. The overarching themes of management tactics (IWM, biological control, and automation), basic mechanisms (competition, invasive plant genetics, and herbicide resistance), and ecosystem impacts (invasive plant spread, climate change, conservation, and restoration) represent core ARS weed science research that is dynamic and efficacious and has been a significant component of the agency’s national and international efforts. This review highlights current studies and future directions that exemplify the science and collaborative relationships both within and outside ARS. Given the constraints of weeds and invasive plants on all aspects of food, feed, and fiber systems, there is an acknowledged need to face new challenges, including agriculture and natural resources sustainability, economic resilience and reliability, and societal health and well-being.
Modern psychometric methods make it possible to eliminate nonperforming items and reduce measurement error. Application of these methods to existing outcome measures can reduce variability in scores, and may increase treatment effect sizes in depression treatment trials.
Aims
We aim to determine whether using confirmatory factor analysis techniques can provide better estimates of the true effects of treatments, by conducting secondary analyses of individual patient data from randomised trials of antidepressant therapies.
Method
We will access individual patient data from antidepressant treatment trials through Clinicalstudydatarequest.com and Vivli.org, specifically targeting studies that used the Hamilton Rating Scale for Depression (HRSD) as the outcome measure. Exploratory and confirmatory factor analytic approaches will be used to determine pre-treatment (baseline) and post-treatment models of depression, in terms of the number of factors and weighted scores of each item. Differences in the derived factor scores between baseline and outcome measurements will yield an effect size for factor-informed depression change. The difference between the factor-informed effect size and each original trial effect size, calculated with total HRSD-17 scores, will be determined, and the differences modelled with meta-analytic approaches. Risk differences for proportions of patients who achieved remission will also be evaluated. Furthermore, measurement invariance methods will be used to assess potential gender differences.
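To make the factor-scoring step concrete, the sketch below fits an exploratory factor model to simulated item-level HRSD data and compares a factor-informed effect size with one based on raw totals; the simulated data, the two-factor choice, and the use of scikit-learn are illustrative assumptions, and the protocol's confirmatory models would require a dedicated SEM package.

    # Illustrative sketch only (not the protocol's code): an exploratory factor
    # model of simulated item-level HRSD scores and a factor-informed effect size.
    # Confirmatory models would need a dedicated SEM package.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    n_patients, n_items = 200, 17
    baseline_items = rng.integers(0, 5, size=(n_patients, n_items)).astype(float)
    endpoint_items = np.clip(baseline_items - rng.integers(0, 3, size=(n_patients, n_items)), 0, 4)

    # Fit a two-factor exploratory model to baseline items; score both time points.
    fa = FactorAnalysis(n_components=2, random_state=0).fit(baseline_items)
    baseline_scores = fa.transform(baseline_items)[:, 0]  # first-factor scores (sign is arbitrary)
    endpoint_scores = fa.transform(endpoint_items)[:, 0]

    # Effect size for factor-informed depression change (paired Cohen's d).
    change = baseline_scores - endpoint_scores
    d_factor = change.mean() / change.std(ddof=1)

    # Comparison effect size from raw HRSD-17 total scores.
    total_change = baseline_items.sum(axis=1) - endpoint_items.sum(axis=1)
    d_total = total_change.mean() / total_change.std(ddof=1)
    print(round(d_factor, 2), round(d_total, 2))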
Conclusions
Our approach will determine whether adopting advanced psychometric analyses can improve precision and better estimate effect sizes in antidepressant treatment trials. The proposed methods could have implications for future trials and other types of studies that use patient-reported outcome measures.
It is uncertain if long-term levels of low-density lipoprotein-cholesterol (LDL-C) affect cognition in middle age. We examined the association of LDL-C levels over 25 years with cognitive function in a prospective cohort of black and white US adults.
Methods:
Lipids were measured at baseline (1985–1986; age: 18–30 years) and at serial examinations conducted over 25 years. Time-averaged cumulative LDL-C was calculated using the area under the curve for 3,328 participants with ≥3 LDL-C measurements and a cognitive function assessment. Cognitive function was assessed at the Year 25 examination with the Digit Symbol Substitution Test [DSST], Rey Auditory Verbal Learning Test [RAVLT], and Stroop Test. A brain magnetic resonance imaging (MRI) sub-study (N = 707) was also completed at Year 25 to assess abnormal white matter tissue volume (AWMV) and gray matter cerebral blood flow volume (GM-CBFV) as secondary outcomes.
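The time-averaged cumulative exposure calculation can be illustrated with a short sketch: the area under the curve of serial measurements, divided by the length of follow-up. The values below and the use of scipy are hypothetical; they are not the study data.

    # Minimal sketch with made-up values: time-averaged cumulative LDL-C as the
    # area under the curve of serial measurements divided by the follow-up time.
    from scipy.integrate import trapezoid

    exam_years = [0, 5, 10, 15, 20, 25]          # years since baseline examination
    ldl_mg_dl  = [110, 118, 125, 131, 128, 135]  # hypothetical LDL-C at each examination

    auc = trapezoid(ldl_mg_dl, exam_years)                      # (mg/dL) x years
    time_averaged_ldl = auc / (exam_years[-1] - exam_years[0])  # back to mg/dL
    print(round(time_averaged_ldl, 1))                          # 124.9 in this example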
Results:
Time-averaged cumulative LDL-C was <100 mg/dL in 15.6% of participants, 101–129 mg/dL in 32.9%, 130–159 mg/dL in 28.9%, and ≥160 mg/dL in 22.6%. Standardized differences in all cognitive function test scores ranged from 0.16 SD lower to 0.09 SD higher across time-averaged LDL-C categories in comparison with those with LDL-C <100 mg/dL. After covariate adjustment, participants with higher versus lower time-averaged LDL-C had a lower RAVLT score (p-trend = 0.02), but no differences were present for the DSST, Stroop Test, AWMV, or GM-CBFV.
Conclusion:
Cumulative LDL-C was associated with small differences in memory, as assessed by RAVLT scores, but not other cognitive or brain MRI measures over 25 years of follow-up.
A recent genome-wide association study (GWAS) identified 12 independent loci significantly associated with attention-deficit/hyperactivity disorder (ADHD). Polygenic risk scores (PRS), derived from the GWAS, can be used to assess genetic overlap between ADHD and other traits. Using ADHD samples from several international sites, we derived PRS for ADHD from the recent GWAS to test whether genetic variants that contribute to ADHD also influence two cognitive functions that show strong association with ADHD: attention regulation and response inhibition, captured by reaction time variability (RTV) and commission errors (CE).
Methods
The discovery GWAS included 19 099 ADHD cases and 34 194 control participants. The combined target sample included 845 people with ADHD (age: 8–40 years). RTV and CE were available from reaction time and response inhibition tasks. ADHD PRS were calculated from the GWAS using a leave-one-study-out approach. Regression analyses were run to investigate whether ADHD PRS were associated with CE and RTV. Results across sites were combined via random effect meta-analyses.
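To illustrate how per-site estimates can be combined, the sketch below implements a DerSimonian-Laird random-effects meta-analysis on made-up per-site betas and standard errors; it is a generic illustration, not the study's analysis code.

    # Generic DerSimonian-Laird random-effects meta-analysis on made-up per-site
    # estimates of the PRS-RTV association; not the study's analysis code.
    import numpy as np
    from scipy.stats import norm

    betas = np.array([0.10, 0.05, 0.12, 0.07])  # hypothetical per-site standardized betas
    ses   = np.array([0.06, 0.05, 0.08, 0.04])  # hypothetical standard errors
    v, w = ses**2, 1.0 / ses**2

    # Fixed-effect estimate and Cochran's Q.
    beta_fe = np.sum(w * betas) / np.sum(w)
    Q = np.sum(w * (betas - beta_fe) ** 2)

    # Between-study variance (DerSimonian-Laird estimator).
    k = len(betas)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    # Random-effects estimate, standard error, and two-sided p-value.
    w_re = 1.0 / (v + tau2)
    beta_re = np.sum(w_re * betas) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    p_value = 2 * norm.sf(abs(beta_re / se_re))
    print(round(beta_re, 3), round(p_value, 3))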
Results
When combining the studies in meta-analyses, results were significant for RTV (R2 = 0.011, β = 0.088, p = 0.02) but not for CE (R2 = 0.011, β = 0.013, p = 0.732). No significant association was found between ADHD PRS and RTV or CE in any sample individually (p > 0.10).
Conclusions
We detected a significant association between PRS for ADHD and RTV (but not CE) in individuals with ADHD, suggesting that common genetic risk variants for ADHD influence attention regulation.
According to the stress inoculation hypothesis, successfully navigating life stressors may improve one's ability to cope with subsequent stressors, thereby increasing psychiatric resilience.
Aims
Among individuals with no baseline history of post-traumatic stress disorder (PTSD) and/or major depressive disorder (MDD), to determine whether a history of a stressful life event protected participants against the development of PTSD and/or MDD after a natural disaster.
Method
Analyses utilised data from a multiwave, prospective cohort study of adult Chilean primary care attendees (years 2003–2011; n = 1160). At baseline, participants completed the Composite International Diagnostic Interview (CIDI), a comprehensive psychiatric diagnostic instrument, and the List of Threatening Experiences, a 12-item questionnaire that measures major stressful life events. During the study (2010), the sixth most powerful earthquake on record struck Chile. One year later (2011), the CIDI was re-administered to assess post-disaster PTSD and/or MDD.
Results
Marginal structural logistic regressions indicated that for every one-unit increase in the number of pre-disaster stressors, the odds of developing post-disaster PTSD or MDD increased (OR = 1.21, 95% CI 1.08–1.37, and OR = 1.16, 95% CI 1.06–1.27 respectively). When categorising pre-disaster stressors, individuals with four or more stressors (compared with no stressors) had higher odds of developing post-disaster PTSD (OR = 2.77, 95% CI 1.52–5.04), and a dose–response relationship between pre-disaster stressors and post-disaster MDD was found.
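A simplified sketch of a marginal structural logistic regression with stabilized inverse probability weights is given below; the exposure is collapsed to a binary indicator, the data and variable names are invented, and the use of statsmodels is an assumption, so this is a schematic of the general approach rather than the published analysis.

    # Simplified sketch of a marginal structural logistic regression with
    # stabilized inverse probability weights. The binary exposure, variable
    # names, and simulated data are assumptions for illustration only.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 500
    df = pd.DataFrame({
        "age": rng.integers(18, 75, n),
        "female": rng.integers(0, 2, n),
        "high_stressors": rng.integers(0, 2, n),  # 1 = four or more pre-disaster stressors
        "ptsd": rng.integers(0, 2, n),            # 1 = post-disaster PTSD
    })

    # Exposure model: probability of high stressor burden given baseline covariates.
    ps = smf.glm("high_stressors ~ age + female", data=df,
                 family=sm.families.Binomial()).fit().predict(df)

    # Stabilized inverse probability weights.
    p_exposed = df["high_stressors"].mean()
    df["ipw"] = np.where(df["high_stressors"] == 1, p_exposed / ps, (1 - p_exposed) / (1 - ps))

    # Weighted outcome model with robust (sandwich) standard errors.
    msm = smf.glm("ptsd ~ high_stressors", data=df, family=sm.families.Binomial(),
                  freq_weights=df["ipw"]).fit(cov_type="HC0")
    print(np.exp(msm.params["high_stressors"]))  # odds ratio for post-disaster PTSD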
Conclusions
In contrast to the stress inoculation hypothesis, results indicated that experiencing multiple stressors increased the vulnerability to developing PTSD and/or MDD after a natural disaster. Increased knowledge regarding the individual variations of these disorders is essential to inform targeted mental health interventions after a natural disaster, especially in under-studied populations.
This article traces the ascent of new digital surveillance practices for European health security in an era of heightened global pandemic vigilance. It demonstrates how the confluence of evolving processes of digitisation and the production of new digital data sources has enabled EU health security agents in recent years to enhance infectious disease surveillance through novel digitised practices of epidemic intelligence. The article then argues that placing these new epidemic intelligence technologies at the core of EU health security initiatives has been foundational to the ascent of a new blended health surveillance practice operating across the EU, one which amalgamates the digitised surface alerts of these new big data surveillance technologies with the long-established and traditional disease surveillance legacies of EU Member States. By applying the concept of surface knowledge to the ascent of these European epidemic intelligence practices, the article demonstrates the key epistemic and methodological shifts that occur in the production of knowledge, alerts and signals for accelerated infectious disease surveillance and the governing of public health risks within the EU.
Self-reported activity restriction is an established correlate of depression in dementia caregivers (dCGs). It is plausible that the daily distribution of objectively measured activity is also altered in dCGs with depression symptoms; if so, such activity characteristics could provide a passively measurable marker of depression or specific times to target preventive interventions. We therefore investigated how levels of activity throughout the day differed in dCGs with and without depression symptoms, then tested whether any such differences predicted changes in symptoms 6 months later.
Design, setting, participants, and measurements:
We examined 56 dCGs (mean age = 71, standard deviation (SD) = 6.7; 68% female) and used clustering to identify subgroups with distinct depression symptom levels, leveraging baseline Center for Epidemiologic Studies Depression Scale–Revised (CESD-R) and Patient Health Questionnaire-9 (PHQ-9) measures, as well as a PHQ-9 score from 6 months later. Using wrist activity recordings (mean length = 12.9 days, minimum = 6 days), we calculated average hourly activity levels and then assessed when during the day activity levels related to depression symptoms and to changes in symptoms 6 months later.
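The clustering step can be sketched as follows with simulated symptom scores; the scale columns, the choice of two-cluster k-means, and the fabricated activity values are assumptions for illustration only.

    # Minimal sketch of the clustering step on simulated symptom scores; column
    # order, the two-cluster k-means choice, and the fabricated activity values
    # are illustrative assumptions.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    # Hypothetical CESD-R, baseline PHQ-9, and 6-month PHQ-9 scores for 56 caregivers.
    symptoms = np.column_stack([
        rng.integers(0, 40, 56),
        rng.integers(0, 20, 56),
        rng.integers(0, 20, 56),
    ]).astype(float)

    z = StandardScaler().fit_transform(symptoms)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(z)

    # Compare mean 8-10 AM activity (hypothetical counts/hour) between the groups.
    morning_activity = rng.normal(300, 80, 56)
    print([round(morning_activity[labels == g].mean(), 1) for g in (0, 1)])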
Results:
Clustering identified subgroups characterized by: (1) no/minimal symptoms (36%) and (2) depression symptoms (64%). After multiple comparison correction, the group of dCGs with depression symptoms was less active from 8 to 10 AM (Cohen’s d ≤ −0.9). These morning activity levels predicted the degree of symptom change on the PHQ-9 6 months later (per SD unit β = −0.8, 95% confidence interval: −1.6, −0.1, p = 0.03) independent of self-reported activity restriction and other key factors.
Conclusions:
These novel findings suggest that morning activity may protect dCGs from depression symptoms. Future studies should test whether helping dCGs get active in the morning influences the other features of depression in this population (i.e. insomnia, intrusive thoughts, and perceived activity restriction).
This article investigates the rise of algorithmic disease surveillance systems as novel technologies of risk analysis utilised to regulate pandemic outbreaks in an era of big data. Critically, the article demonstrates how intensified efforts to harness big data and apply algorithmic processing techniques to enhance the real-time surveillance and regulation of infectious disease outbreaks significantly transform practices of global infectious disease surveillance, as observed through the advent of novel risk rationalities that underpin the deployment of intensifying algorithmic practices which increasingly colonise and patrol emergent topographies of data in order to identify and govern the emergence of exceptional pathogenic risks. Conceptually, the article further asserts how the rise of these novel risk-regulating technologies within a context of big data transforms the government and forecasting of epidemics and pandemics, as illustrated by the rise of emergent algorithmic governmentalities of risk within contemporary contexts of big data, disease surveillance and the regulation of pandemics.
Radiocarbon (14C or carbon-14, half-life 5730 yr) is a key radionuclide in the assessment of the safety of a geological disposal facility (GDF) for radioactive waste. In particular, the radiological impact of gaseous carbon-14 bearing species has been recognized as a potential issue. Irradiated steels are one of the main sources of carbon-14 in the United Kingdom’s radioactive waste inventory. However, there is considerable uncertainty about the chemical form(s) in which the carbon-14 will be released. The objective of the work was to measure the rate and speciation of carbon-14 release from irradiated 316L(N) stainless steel on leaching under high-pH anoxic conditions, representative of a cement-based near field for low-heat generating wastes. Periodic measurements of carbon-14 releases to both the gas phase and to solution were made in duplicate experiments over a period of up to 417 days. An initial fast release of carbon-14 from the surface of the steel is observed during the first week of leaching, followed by a drop in the rate of release at longer times. Carbon-14 is released primarily to the solution phase with differing fractions released to the gas phase in the two experiments: about 1% of the total release in one and 6% in the other. The predominant dissolved carbon-14 releases are in inorganic form (as 14C-carbonate) but also include organic species. The predominant gas-phase species are hydrocarbons with a smaller fraction of 14CO (which may include some volatile oxygen-containing carbon-species). The experiments are continuing, with final sampling and termination planned after leaching for a total of two years.
OBJECTIVES/SPECIFIC AIMS: Delirium is a well-described form of acute brain organ dysfunction characterized by decreased or increased movement, changes in attention and concentration, as well as perceptual disturbances (i.e., hallucinations) and delusions. Catatonia, a neuropsychiatric syndrome traditionally described in patients with severe psychiatric illness, can present as phenotypically similar to delirium and is characterized by increased, decreased, and/or abnormal movements, staring, rigidity, and mutism. Delirium and catatonia can co-occur in the setting of medical illness, but no studies have explored this relationship by age. Our objective was to assess whether advancing age and the presence of catatonia are associated with delirium. METHODS/STUDY POPULATION: We prospectively enrolled critically ill patients at a single institution who were on a ventilator or in shock and evaluated them daily for delirium using the Confusion Assessment Method for the ICU and for catatonia using the Bush-Francis Catatonia Rating Scale. Measures of association (OR) were assessed with a simple logistic regression model with catatonia as the independent variable and delirium as the dependent variable. Effect measure modification by age was assessed using a likelihood ratio test. RESULTS/ANTICIPATED RESULTS: We enrolled 136 medical and surgical critically ill patients with 452 matched (concomitant) delirium and catatonia assessments. Median age was 59 years (IQR: 52–68). In our cohort of 136 patients, 58 patients (43%) had delirium only, 4 (3%) had catatonia only, 42 (31%) had both delirium and catatonia, and 32 (24%) had neither. Age was significantly associated with prevalent delirium (i.e., increasing age was associated with decreased risk for delirium) (p = 0.04) after adjusting for catatonia severity. Catatonia was significantly associated with prevalent delirium (p < 0.0001) after adjusting for age. Peak delirium risk was for patients aged 55 years with 3 or more catatonic signs, who had 53.4 times the odds of delirium (95% CI: 16.06, 176.75) compared with those with no catatonic signs. Patients 70 years and older with 3 or more catatonia features had half this risk. DISCUSSION/SIGNIFICANCE OF IMPACT: Catatonia is significantly associated with prevalent delirium even after controlling for age. These data support an inverted U-shaped risk of delirium after adjusting for catatonia. This relationship and its clinical ramifications need to be examined in a larger sample, including patients with dementia. Additionally, we need to assess which acute brain syndrome (delirium or catatonia) develops first.
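A minimal sketch of the modelling described above (logistic regression of delirium on catatonia, with a likelihood ratio test for effect-measure modification by age) is shown below using simulated data; the variable names and the statsmodels implementation are illustrative assumptions.

    # Minimal sketch (simulated data): logistic regression of delirium on
    # catatonia signs and age, with a likelihood ratio test for an
    # age-by-catatonia interaction (effect-measure modification).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from scipy.stats import chi2

    rng = np.random.default_rng(3)
    n = 452  # assessment-level observations
    df = pd.DataFrame({
        "delirium": rng.integers(0, 2, n),
        "catatonia_signs": rng.integers(0, 6, n),
        "age": rng.integers(40, 85, n),
    })

    reduced = smf.logit("delirium ~ catatonia_signs + age", data=df).fit(disp=False)
    full = smf.logit("delirium ~ catatonia_signs * age", data=df).fit(disp=False)

    lr_stat = 2 * (full.llf - reduced.llf)
    p_interaction = chi2.sf(lr_stat, df=full.df_model - reduced.df_model)
    print(np.exp(reduced.params["catatonia_signs"]), p_interaction)  # OR per sign; interaction p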
To evaluate probiotics for the primary prevention of Clostridium difficile infection (CDI) among hospital inpatients.
DESIGN
A before-and-after quality improvement intervention comparing 12-month baseline and intervention periods.
SETTING
A 694-bed teaching hospital.
INTERVENTION
We administered a multispecies probiotic comprising L. acidophilus (CL1285), L. casei (LBC80R), and L. rhamnosus (CLR2) to eligible antibiotic recipients within 12 hours of initial antibiotic receipt through 5 days after the final dose. We excluded (1) all patients on neonatal, pediatric, and oncology wards; (2) all recipients of perioperative prophylactic antibiotics; (3) all those restricted from oral intake; and (4) those with pancreatitis, leukopenia, or post-transplant status. We defined CDI by symptoms plus C. difficile toxin detection by polymerase chain reaction (PCR). Our primary outcome was hospital-onset CDI incidence on eligible hospital units, analyzed using segmented regression.
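As a sketch of the segmented-regression analysis of hospital-onset CDI incidence, the example below fits an interrupted time series Poisson model to simulated monthly counts with patient-days as an offset; the data, monthly aggregation, and model terms are assumptions, not the study's code.

    # Sketch of a segmented (interrupted time series) Poisson regression for
    # monthly hospital-onset CDI counts with patient-days as an offset; the
    # monthly data below are simulated, not the study's data.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    month = np.arange(24)  # 12 baseline + 12 intervention months
    df = pd.DataFrame({
        "month": month,
        "post": (month >= 12).astype(int),                          # level change at the intervention
        "months_since_intervention": np.clip(month - 11, 0, None),  # slope change term
        "patient_days": rng.integers(14000, 16000, 24),
        "cdi_cases": rng.poisson(10, 24),
    })

    model = smf.glm("cdi_cases ~ month + post + months_since_intervention", data=df,
                    family=sm.families.Poisson(), offset=np.log(df["patient_days"])).fit()
    print(np.exp(model.params))  # rate ratios: baseline trend, level change, slope change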
RESULTS
The study included 251 CDI episodes among 360,016 patient days during the baseline and intervention periods, and the incidence rate was 7.0 per 10,000 patient days. The incidence rate was similar during baseline and intervention periods (6.9 vs 7.0 per 10,000 patient days; P=.95). However, compared to the first 6 months of the intervention, we detected a significant decrease in CDI during the final 6 months (incidence rate ratio, 0.6; 95% confidence interval, 0.4–0.9; P=.009). Testing intensity remained stable between the baseline and intervention periods: 19% versus 20% of stools tested were C. difficile positive by PCR, respectively. From medical record reviews, only 26% of eligible patients received a probiotic per the protocol.
CONCLUSIONS
Despite poor adherence to the protocol, CDI incidence declined during the intervention, although the reduction was delayed by approximately 6 months after the probiotic was introduced for primary prevention.
The Neotoma Paleoecology Database is a community-curated data resource that supports interdisciplinary global change research by enabling broad-scale studies of taxon and community diversity, distributions, and dynamics during the large environmental changes of the past. By consolidating many kinds of data into a common repository, Neotoma lowers costs of paleodata management, makes paleoecological data openly available, and offers a high-quality, curated resource. Neotoma’s distributed scientific governance model is flexible and scalable, with many open pathways for participation by new members, data contributors, stewards, and research communities. The Neotoma data model supports, or can be extended to support, any kind of paleoecological or paleoenvironmental data from sedimentary archives. Data additions to Neotoma are growing and now include >3.8 million observations, >17,000 datasets, and >9200 sites. Dataset types currently include fossil pollen, vertebrates, diatoms, ostracodes, macroinvertebrates, plant macrofossils, insects, testate amoebae, geochronological data, and the recently added organic biomarkers, stable isotopes, and specimen-level data. Multiple avenues exist to obtain Neotoma data, including the Explorer map-based interface, an application programming interface, the neotoma R package, and digital object identifiers. As the volume and variety of scientific data grow, community-curated data resources such as Neotoma have become foundational infrastructure for big data science.
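As one illustration of programmatic access to Neotoma, the sketch below queries the database's application programming interface over plain HTTP; the endpoint path, query parameters, and response field names are assumptions that should be checked against the current API documentation, and the neotoma R package offers an equivalent, supported route.

    # Illustrative HTTP query to the Neotoma API; the endpoint path, query
    # parameters, and response field names are assumptions and should be checked
    # against the current API documentation.
    import requests

    BASE = "https://api.neotomadb.org/v2.0/data"  # assumed API base URL

    resp = requests.get(f"{BASE}/sites", params={"sitename": "%Lake%", "limit": 5}, timeout=30)
    resp.raise_for_status()
    payload = resp.json()

    for site in payload.get("data", []):          # field names assumed
        print(site.get("sitename"), site.get("geography"))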
An internationally approved and globally used classification scheme for the diagnosis of CHD has long been sought. The International Paediatric and Congenital Cardiac Code (IPCCC), which was produced and has been maintained by the International Society for Nomenclature of Paediatric and Congenital Heart Disease (the International Nomenclature Society), is used widely, but has spawned many “short list” versions that differ in content depending on the user. Thus, efforts to have a uniform identification of patients with CHD using a single up-to-date and coordinated nomenclature system continue to be thwarted, even if a common nomenclature has been used as a basis for composing various “short lists”. In an attempt to solve this problem, the International Nomenclature Society has linked its efforts with those of the World Health Organization to obtain a globally accepted nomenclature tree for CHD within the 11th iteration of the International Classification of Diseases (ICD-11). The International Nomenclature Society has submitted a hierarchical nomenclature tree for CHD to the World Health Organization that is expected to serve increasingly as the “short list” for all communities interested in coding for congenital cardiology. This article reviews the history of the International Classification of Diseases and of the IPCCC, and outlines the process used in developing the ICD-11 congenital cardiac disease diagnostic list and the definitions for each term on the list. An overview of the content of the congenital heart anomaly section of the Foundation Component of ICD-11, published herein in its entirety, is also included. Future plans for the International Nomenclature Society include linking again with the World Health Organization to tackle procedural nomenclature as it relates to cardiac malformations. By doing so, the Society will continue its role in standardising nomenclature for CHD across the globe, thereby promoting research and better outcomes for fetuses, children, and adults with congenital heart anomalies.
By Stephen J. Rassenti, Economic Science Institute, Chapman University; Vernon L. Smith, Economic Science Institute, Chapman University; and Robert L. Bulfin, Department of Industrial and Systems Engineering, Auburn University
In 1968 the FAA adopted a high-density rule for the allocation of scarce landing and take-off slots at four major airports (La Guardia, Washington National, Kennedy International, and O'Hare International). This rule establishes slot quotas for the control of airspace congestion at these airports.
Airport runway slots, regulated by these quotas, have a distinguishing feature which any proposed allocation procedure must accommodate: an airline's demand for a takeoff slot at a flight's originating airport is not independent of its demand for a landing slot at the flight's destination airport. Indeed, a given flight may take off and land in a sequence of several connected, demand-interdependent legs. For economic efficiency, it is desirable to develop an airport slot allocation procedure that allocates individual slots to those airline flights for which the demand (willingness to pay) is greatest.
Grether, Isaac, and Plott (hereafter, GIP) (1979, 1981) have proposed a practical market procedure for achieving this goal. Their procedure is based upon the growing body of experimental evidence on the performance of (1) the competitive (uniform price) sealed-bid auction and (2) the oral double auction such as is used on the organized stock and commodity exchanges. Under their proposal, an independent primary market for slots at each airport would be organized as a sealed-bid competitive auction held at timely intervals. Since the primary market allocation does not make provision for slot demand interdependence, a computerized form of the oral double auction (with block transaction capabilities) is proposed as an “after market” to allow airlines to buy and sell primary market slots freely among themselves. This continuous after market exchange would provide the institutional means by which individual airlines would acquire those slot packages which support their individual flight schedules. Thus, an airline that acquired slots at Washington National which did not flight-match the slots acquired at O'Hare could either buy additional O'Hare slots or sell its excess Washington slots in the after market.
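To make the primary-market mechanism concrete, the sketch below implements a toy uniform-price sealed-bid auction for a single airport's slot quota, with the clearing price set at the highest rejected bid; the bids, the quota, and that particular pricing rule are illustrative assumptions, and the after market and demand interdependencies are not modelled.

    # Toy uniform-price sealed-bid auction for one airport's slot quota; bids,
    # the quota, and the highest-rejected-bid pricing rule are illustrative
    # assumptions, and the after market is not modelled.
    def uniform_price_auction(bids, quota):
        """Return (winning airlines, clearing price) for a fixed slot quota.

        bids: dict mapping airline -> bid for one slot
        quota: number of slots available
        Every winner pays the same clearing price, set here at the highest
        rejected bid (0.0 if no bid is rejected).
        """
        ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
        winners = [airline for airline, _ in ranked[:quota]]
        price = ranked[quota][1] if len(ranked) > quota else 0.0
        return winners, price

    bids = {"AirA": 900.0, "AirB": 750.0, "AirC": 600.0, "AirD": 400.0}
    print(uniform_price_auction(bids, quota=2))  # (['AirA', 'AirB'], 600.0)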
The Protectorate is arguably the Cinderella of Interregnum studies: it lacks the immediate drama of the Regicide, the Republic or the Restoration, and is often dismissed as a 'retreat from revolution', a short period of conservative rule before the inevitable return of the Stuarts. The essays in this volume present new research that challenges this view. They argue instead that the Protectorate was dynamic and progressive, even if the policies put forward were not always successful, and often created further tensions within the government and between Whitehall and the localities. Particular topics include studies of Oliver Cromwell and his relationship with Parliament, and the awkward position inherited by his son, Richard; the role of art and architecture in creating a splendid protectoral court; and the important part played by the council, as a law-making body, as a political cockpit, and as part of a hierarchy of government covering not just England but also Ireland and Scotland. There are also investigations of the reactions to Cromwellian rule in Wales, in the towns and cities of the Severn/Avon basin, and in the local communities of England faced with a far-reaching programme of religious reform. PATRICK LITTLE is Senior Research Fellow at the History of Parliament Trust. Contributors: BARRY COWARD, DAVID L. SMITH, JASON PEACEY, PAUL HUNNEYBALL, BLAIR WORDEN, PETER GAUNT, LLOYD BOWEN, STEPHEN K. ROBERTS, CHRISTOPHER DURSTON.
The transition from inland- to streaming-style ice flow near to and upstream from the onset to Ice Stream D, West Antarctica, is investigated using the force-balance technique. Basal drag provides the majority of the flow resistance over the study area but is substantially modified by non-local stress gradients. Lateral drag increases with distance downstream, balancing ∼50–100% of the driving stress at the onset. Longitudinal stress gradients (LSG) are also found to be significant, an observation that distinguishes ice flow in this region from the inland- and streaming-flow regimes that bound it, in which LSG are usually negligible. LSG decrease the spatial variability in basal drag and sliding speed and increase the area of the bed over which frictional melting occurs. Overall, LSG decrease the resistive influence of basal stress concentrations and increase the spatial uniformity of basal sliding. These observations suggest that streaming flow develops as an integrated response to the physical interaction between the ice and its bed over an extended region upstream from the onset, rather than being solely due to changes in basal characteristics at the onset. An implication is that non-steady-flow behavior upstream from the onset may ultimately propagate downstream and result in non-steady behavior at the onset.
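For readers unfamiliar with the force-balance technique, a schematic depth-averaged statement of the balance is given below in LaTeX; the sign conventions and resistive-stress notation follow standard force-budget treatments and are illustrative rather than a reproduction of this study's equations (here ρ is ice density, g gravitational acceleration, H ice thickness, α surface slope, and R̄ the depth-averaged resistive stresses).

    \tau_{d} \;=\; \tau_{b} \;+\; \tau_{\mathrm{lat}} \;+\; \tau_{\mathrm{lon}},
    \qquad \tau_{d} = \rho\, g\, H \sin\alpha,
    \qquad \tau_{\mathrm{lat}} = -\,\frac{\partial\!\left(H\,\bar{R}_{xy}\right)}{\partial y},
    \qquad \tau_{\mathrm{lon}} = -\,\frac{\partial\!\left(2H\,\bar{R}_{xx} + H\,\bar{R}_{yy}\right)}{\partial x}

In this schematic, the driving stress is balanced by basal drag, lateral drag (cross-flow gradients of shear stress), and longitudinal stress gradients; the study's finding is that the last two terms are substantial near the onset, so basal drag alone does not balance the driving stress.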