Most people with mental illness in low and middle-income countries (LMICs) do not receive biomedical treatment, though many seek care from traditional healers and faith healers. We conducted a qualitative study in Buyende District, Uganda, using framework analysis. Data collection included interviews with 24 traditional healers, 20 faith healers, and 23 biomedical providers, plus 4 focus group discussions. Interviews explored treatment approaches, provider relationships, and collaboration potential until theoretical saturation was reached. Three main themes emerged: (1) Biomedical providers’ perspectives on traditional and faith healers; (2) Traditional and faith healers’ views on biomedical providers; and (3) Collaboration opportunities and barriers. Biomedical providers viewed faith healers positively but traditional healers as potentially harmful. Traditional and faith healers valued biomedical approaches while feeling variably accepted. Interest in collaboration existed across groups but was complicated by power dynamics, economic concerns, and differing mental illness conceptualizations. Traditional healers and faith healers routinely referred patients to biomedical providers, though reciprocal referrals were rare. The study reveals distinct dynamics among providers in rural Uganda, with historical colonial influences continuing to shape relationships and highlighting the need for integrated, contextually appropriate mental healthcare systems.
The cause of most congenital heart disease (CHD) is unknown and considered complex, implicating genetic and environmental factors in disease causation. The Kids Heart BioBank was established in 2003 to accelerate genetic investigations into CHD.
Methods:
Recruitment includes patients undergoing interventions for CHD at The Children’s Hospital at Westmead. Informed consent is obtained from parents/guardians, and blood is collected at the time of cardiac intervention from which DNA is extracted and stored. Associated detailed clinical information and a family history are stored in the purpose-designed database.
Results:
To date, the Kids Heart BioBank contains biospecimens and associated clinical information from over 4,900 patients with CHD and their families. Two-thirds (64.1%) of probands have been included in research studies with 28.9% of participants who underwent genomic sequencing receiving a molecular diagnosis with direct clinical utility. The value of this resource to patients and families is highlighted by the high consent rate (94.6%) and the low withdrawal of consent rate (0.4%). The Kids Heart BioBank has supported many large national and international collaborations and contributed significantly to CHD research.
Conclusions:
The Kids Heart BioBank is an invaluable resource and, together with other similar resources, the resulting research has paved the way for clinical genetic testing options for CHD patients, previously not possible. With research in the field moving away from diagnosing monogenic disease, the Kids Heart BioBank is ideally placed to support the next chapter of research efforts into complex disease mechanisms, requiring large patient cohorts with detailed phenotypic information.
In this article, I take up the case of runic writing to reflect upon James Scott’s view of the nexus between writing and various forms of domination in early states, especially the use of literacy for taxation in cereal-growing societies. Scott’s theses provide interesting matter “to think with,” even when his grasp of historical detail has been found wanting. It is not controversial to grant Scott that cuneiform writing was a remarkable tool for statecraft, and exploitation, in the first states of Mesopotamia, around 3500 BC. The same is true of writing in other early states. But in the first states of Scandinavia, particularly Denmark ca. AD 500–800, writing had a more troubled relationship with the state. No evidence survives that runic writing was used to administer taxation or much else, as it was in other agrarian civilisations. It is true that the runic script was used to commemorate kings, most famously by Haraldr Blátǫnn (r. ca. 958–ca. 986). But, statistically speaking, it was more often used to aggrandize the sort of local big men who usually resisted centralized power. In this article, I survey the relationship between runic writing and administration. I consider what the Danish situation suggests about the relationship between states and writing and offer a tentative hypothesis of a short-lived attempt at runic bureaucracy around 800, which created—and quickly lost control of—a shortened variety of the runic script (the Younger Futhark).
The variants of frontotemporal dementia (FTD) require careful differentiation from primary psychiatric disorders as the neuropsychiatric manifestations can overshadow the unique cognitive deficits. The language variants of FTD are less readily recognised by trainees despite making up around 43% of cases.1 This educational article presents an anonymised case of one of the language variants: semantic dementia. The cognitive deficits and neuropsychiatric manifestations (delusions and hyperreligiosity) are explored in terms of aetiology and management. By the end of the article, readers should be able to differentiate FTD from Alzheimer's disease, understand the principles of management and associated risks, and develop a multifaceted approach to hyperreligiosity in dementia.
Prisons are susceptible to outbreaks. Control measures focusing on isolation and cohorting negatively affect wellbeing. We present an outbreak of coronavirus disease 2019 (COVID-19) in a large male prison in Wales, UK, October 2020 to April 2021, and discuss control measures.
We gathered case information, including demographics, staff-residence postcode, resident cell number, work areas/dates, test results, staff interview dates/notes and resident prison-transfer dates. Epidemiological curves were mapped by prison location. Control measures included isolation (exclusion from work or cell-isolation), cohorting (new admissions and work-area groups), asymptomatic testing (case-finding), removal of communal dining and movement restrictions. Facemask use and enhanced hygiene were already in place. Whole-genome sequencing (WGS) and interviews determined the genetic relationship between cases and the plausibility of transmission.
Of 453 cases, 53% (n = 242) were staff; most were aged 25–34 years (11.5% females, 27.15% males) and symptomatic (64%). Crude attack rate was higher in staff (29%, 95% CI 26–64%) than in residents (12%, 95% CI 9–15%).
Whole-genome sequencing can help differentiate multiple introductions from person-to-person transmission in prisons. It should be introduced alongside asymptomatic testing as soon as possible to control prison outbreaks. Timely epidemiological investigation, including data visualisation, allowed dynamic risk assessment and proportionate control measures, minimising the reduction in resident welfare.
We describe 14 yr of public data from the Parkes Pulsar Timing Array (PPTA), an ongoing project that is producing precise measurements of pulse times of arrival from 26 millisecond pulsars using the 64-m Parkes radio telescope with a cadence of approximately 3 weeks in three observing bands. A comprehensive description of the pulsar observing systems employed at the telescope since 2004 is provided, including the calibration methodology and an analysis of the stability of system components. We attempt to provide full accounting of the reduction from the raw measured Stokes parameters to pulse times of arrival to aid third parties in reproducing our results. This conversion is encapsulated in a processing pipeline designed to track provenance. Our data products include pulse times of arrival for each of the pulsars along with an initial set of pulsar parameters and noise models. The calibrated pulse profiles and timing template profiles are also available. These data represent almost 21 000 h of recorded data spanning over 14 yr. After accounting for processes that induce time-correlated noise, 22 of the pulsars have weighted root-mean-square timing residuals of $<\!\!1\,\mu\text{s}$ in at least one radio band. The data should allow end users to quickly undertake their own gravitational wave analyses, for example, without having to understand the intricacies of pulsar polarisation calibration or attain a mastery of radio frequency interference mitigation as is required when analysing raw data files.
Even gods are not always above bureaucracy. Societies very different from each other have entertained the idea that the heavens might be arranged much like an earthly bureaucracy, or that mythological beings might exercise their power in a way that makes them resemble bureaucrats. The best-known case is the Chinese “celestial bureaucracy,” but the idea is also found in (to take nearly random examples) Ancient Near Eastern cosmology, the Hebrew Bible, Late Antiquity, and modern popular culture. The primary sources discussed in this essay pertain to an area of history where bureaucracy was historically underdeveloped, namely medieval Scandinavia. Beginning with the Glavendrup runestone from the 900s, I examine a way of thinking about divine power that seems blissfully bureaucracy-free. Moving forwards in time to Adam of Bremen’s description of the temple at Uppsala (1040s–1070s), I find traces of a tentative, half-formed bureaucracy in the fading embers of Scandinavian paganism. In the 1220s, well into the Christian era, I find Snorri Sturluson concocting a version of Old Norse myth which proposes a novel resolution between the non-bureaucratic origins of his mythological corpus and the burgeoning bureaucratization of High Medieval Norway. Although my focus is on medieval Scandinavia, transhistorical comparisons are frequently drawn with mythological bureaucrats from other times and places. In closing, I synthesise this comparative material with historical and anthropological theories of the relationship between bureaucracy and the divine.
The WAIS (West Antarctic Ice Sheet) Divide deep ice core was recently completed to a total depth of 3405 m, ending 50 m above the bed. Investigation of the visual stratigraphy and grain characteristics indicates that the ice column at the drilling location is undisturbed by any large-scale overturning or discontinuity. The climate record developed from this core is therefore likely to be continuous and robust. Measured grain-growth rates, recrystallization characteristics, and grain-size response at climate transitions fit within current understanding. Significant impurity control on grain size is indicated from correlation analysis between impurity loading and grain size. Bubble-number densities and bubble sizes and shapes are presented through the full extent of the bubbly ice. Where bubble elongation is observed, the direction of elongation is preferentially parallel to the trace of the basal (0001) plane. Preferred crystallographic orientation of grains is present in the shallowest samples measured, and increases with depth, progressing to a vertical-girdle pattern that tightens to a vertical single-maximum fabric. This single-maximum fabric switches into multiple maxima as the grain size increases rapidly in the deepest, warmest ice. A strong dependence of the fabric on the impurity-mediated grain size is apparent in the deepest samples.
Timing of weed emergence and seed persistence in the soil influence the ability to implement timely and effective control practices. Emergence patterns and seed persistence of kochia populations were monitored in 2010 and 2011 at sites in Kansas, Colorado, Wyoming, Nebraska, and South Dakota. Weekly observations of emergence were initiated in March and continued until no new emergence occurred. Seed was harvested from each site, placed into 100-seed mesh packets, and buried at depths of 0, 2.5, and 10 cm in fall of 2010 and 2011. Packets were exhumed at 6-mo intervals over 2 yr. Viability of exhumed seeds was evaluated. Nonlinear mixed-effects Weibull models were fit to cumulative emergence (%) across growing degree days (GDD) and to viable seed (%) across burial time to describe their fixed and random effects across site-years. Final emergence densities varied among site-years and ranged from as few as 4 to almost 380,000 seedlings m⁻². Across 11 site-years in Kansas, cumulative GDD needed for 10% emergence were 168, while across 6 site-years in Wyoming and Nebraska, only 90 GDD were needed; on the calendar, this date shifted from early to late March. The majority (>95%) of kochia seed did not persist for more than 2 yr. Remaining seed viability was generally >80% when seeds were exhumed within 6 mo after burial in March, and declined to <5% by October of the first year after burial. Burial did not appear to increase or decrease seed viability over time but placed seed in a position from which seedling emergence would not be possible. High seedling emergence that occurs very early in the spring emphasizes the need for fall or early spring PRE weed control such as tillage, herbicides, and cover crops, while continued emergence into midsummer emphasizes the need for extended periods of kochia management.
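The Weibull emergence model described above can be illustrated with a simplified fixed-effects fit (the study itself used nonlinear mixed-effects models across site-years; everything below, including the data, is a hypothetical sketch):

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_emergence(gdd, m, lam, k):
    """Cumulative emergence (%) as a Weibull CDF of growing degree days.

    m   : asymptotic maximum emergence (%)
    lam : scale (GDD at which ~63% of m has emerged)
    k   : shape (steepness of the emergence flush)
    """
    return m * (1.0 - np.exp(-(gdd / lam) ** k))

# Synthetic weekly observations standing in for one site-year
gdd = np.linspace(0, 800, 25)
rng = np.random.default_rng(1)
obs = np.clip(weibull_emergence(gdd, 100, 250, 2.0)
              + rng.normal(0, 3, gdd.size), 0, 100)

params, _ = curve_fit(weibull_emergence, gdd, obs, p0=[100, 300, 1.5])
m, lam, k = params

# Invert the fitted curve to get GDD at 10% cumulative emergence,
# the quantity the abstract reports (168 GDD in Kansas, 90 in WY/NE)
gdd10 = lam * (-np.log(1.0 - 10.0 / m)) ** (1.0 / k)
```

Inverting the fitted curve for the 10%-emergence point is what lets site-years with different scales and shapes be compared on a single biologically meaningful axis.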
The effects of topsoil thickness on winter annual weed growth and nutrient concentration were assessed for three consecutive years in soybean plots. The topsoil treatments had high fertility levels, uniform textures, and no herbicides were used in the study. Common chickweed composed 75% of the winter annual weed species. Weed biomass production decreased as topsoil thickness decreased from 22.5 cm to 0. Topsoil thickness of 22.5 cm produced 800 kg ha⁻¹ more weed growth than 0 cm topsoil. The weed biomass grown in thicker topsoil had higher total amounts of N, K, Mg, and Ca.
Annual ryegrass has been proposed as a cover crop in the corn–soybean cropping systems of the U.S. Midwest because of its low seed cost, rapid establishment, contribution to soil quality, weed suppressive abilities, and susceptibility to common broad-spectrum herbicides. However, cover crops can reduce the subsequent main crop yield by creating unfavorable germination and emergence conditions, harboring pests, and if not controlled, competing with the main crop. This study, conducted in Illinois, Oregon, and Tennessee, investigated the efficacy of glyphosate for annual ryegrass winter cover crop removal. Glyphosate at 415, 830, and 1,660 g ae/ha was applied to annual ryegrass at late tiller, second node, boot, and early flowering stages. Annual ryegrass control was consistently maximized with the highest glyphosate rate applied at the boot or early flower stage. Annual ryegrass biomass was generally the lowest with the highest rate of glyphosate applied at the earlier stages. Overall, no single application timing at any glyphosate rate provided complete control or biomass reduction of the annual ryegrass cover crop. A sequential herbicide program or a glyphosate plus a graminicide tank-mix probably will be needed for adequate annual ryegrass stand removal.
Research was conducted from 2011 to 2014 to determine weed population dynamics and frequency of glyphosate-resistant (GR) Palmer amaranth with herbicide programs consisting of glyphosate, dicamba, and residual herbicides in dicamba-tolerant cotton. Five treatments were maintained in the same plots over the duration of the experiment: three sequential POST applications of glyphosate with or without pendimethalin plus diuron PRE; three sequential POST applications of glyphosate plus dicamba with and without the PRE herbicides; and a POST application of glyphosate plus dicamba plus acetochlor followed by one or two POST applications of glyphosate plus dicamba without PRE herbicides. Additional treatments included alternating years with three sequential POST applications of glyphosate only and glyphosate plus dicamba POST with and without PRE herbicides. The greatest population of Palmer amaranth was observed when glyphosate was the only POST herbicide throughout the experiment. Although diuron plus pendimethalin PRE in a program with only glyphosate POST improved control during the first 2 yr, these herbicides were ineffective by the final 2 yr on the basis of weed counts from soil cores. The lowest population of Palmer amaranth was observed when glyphosate plus dicamba were applied regardless of PRE herbicides or inclusion of acetochlor POST. Frequency of GR Palmer amaranth was 8% or less when the experiment was initiated. Frequency of GR Palmer amaranth varied by herbicide program during 2012 but was similar among all herbicide programs in 2013 and 2014. Similar frequency of GR Palmer amaranth across all treatments at the end of the experiment most likely resulted from pollen movement from Palmer amaranth treated with glyphosate only to any surviving female plants regardless of PRE or POST treatment. These data suggest that GR Palmer amaranth can be controlled by dicamba and that dicamba is an effective alternative mode of action to glyphosate in fields where GR Palmer amaranth exists.
We studied neuroinflammation in individuals with late-life depression, as a risk factor for dementia, using [11C]PK11195 positron emission tomography (PET). Five older participants with major depression and 13 controls underwent PET and multimodal 3T magnetic resonance imaging (MRI), with blood taken to measure C-reactive protein (CRP). We found significantly higher CRP levels in those with late-life depression and raised [11C]PK11195 binding compared with controls in brain regions associated with depression, including the subgenual anterior cingulate cortex, and significant hippocampal subfield atrophy in cornu ammonis 1 and the subiculum. Our findings suggest neuroinflammation requires further investigation in late-life depression, both as a possible aetiological factor and a potential therapeutic target.
This work provides new insights into human responses to and perceptions of sea-level rise at a time when the landscapes of north-west Europe were radically changing. These issues are investigated through a case study focused on the Channel Islands. We report on the excavation of two sites, Canal du Squez in Jersey and Lihou (GU582) in Guernsey, and the study of museum collections across the Channel Islands. We argue that people were drawn to this area as a result of the dynamic environmental processes occurring and the opportunities these created. The evidence suggests that the area was a particular focus during the Middle Mesolithic, when Guernsey and Alderney were already islands and while Jersey was a peninsula of northern France. Insularisation does not appear to have created a barrier to occupation during either the Middle or Final Mesolithic, indicating the appearance of lifeways increasingly focused on maritime voyaging and marine resources from the second half of the 9th millennium BC onwards.
It is widely accepted that between the beginning of the Early Neolithic period and the end of the Early Bronze Age different regions of Britain were connected to one another by sea, but little is known about the nature of maritime contacts before plank-built boats developed during the 2nd millennium BC. This paper considers a series of coastal sites, some of which were first settled from Mesolithic times. From the early 4th millennium they were also associated with artefact production and the use of imported objects and raw materials. Their distribution focuses on the region of isostatic uplift in northern Britain where the ancient shoreline still survives. It is considered in relation to a new model of coastal change which suggests that these locations were characterised by natural havens sheltered behind islands or bars. The sites can be compared with the ‘landing places’ and ‘beach markets’ discussed by historical archaeologists in recent years.
Objectives: The headroom approach to medical device development relies on the estimation of a value-based price ceiling at different stages of the development cycle. Such price-ceilings delineate the commercial opportunities for new products in many healthcare systems. We apply a simple model to obtain critical business information as the product proceeds along a development pathway, and indicate some future directions for the development of the approach.
Methods: Health economic modelling in the supply-side development cycle for new products.
Results: The headroom can be used: initially as a ‘reality check’ on the viability of the device in the healthcare market; to support product development decisions using a real options approach; and to contribute to a pricing policy which respects uncertainties in the reimbursement outlook.
Conclusions: The headroom provides a unifying thread for business decisions along the development cycle for a new product. Over the course of the cycle attitudes to uncertainty will evolve, based on the timing and manner in which new information accrues. Within this framework the developmental value of new information can justify the costs of clinical trials and other evidence-gathering activities. Headroom can function as a simple shared tool to parties in commercial negotiations around individual products or groups of products. The development of similar approaches in other contexts holds promise for more rational planning of service provision.
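The headroom calculation the abstract describes is commonly expressed as the value of the expected health gain at the payer's willingness-to-pay threshold, plus any savings in other care costs. A minimal sketch with illustrative numbers (the figures and parameter names below are assumptions, not drawn from the article):

```python
def headroom(delta_qaly, wtp_threshold, delta_other_costs):
    """Value-based price ceiling for a new medical device.

    delta_qaly        : expected QALY gain per patient vs. current care
    wtp_threshold     : payer willingness-to-pay per QALY (e.g. GBP 20,000)
    delta_other_costs : savings in other care costs per patient
                        (negative if the device adds downstream costs)
    """
    return delta_qaly * wtp_threshold + delta_other_costs

# Illustrative early-stage estimate: 0.05 QALY gain per patient,
# a GBP 20,000/QALY threshold, and GBP 150 saved in other care
ceiling = headroom(0.05, 20_000, 150)  # -> 1150.0 per patient
```

A device whose anticipated unit price exceeds this ceiling fails the 'reality check' the abstract mentions; re-running the calculation as evidence accrues is what lets the same formula support later pricing and trial-investment decisions.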
Where better to begin a consideration of fear and the modern politics of representation than with the self-identified ‘shock’ Charles Baudelaire experienced sometime between ‘1846 and ‘47’ when he first discovered ‘a few fragments by Edgar Poe’? (1986: 148). To underscore the importance of his discovery, Baudelaire's letter divides his reaction to Poe's work into two categories. On an aesthetic level, the jarring power of Poe's fragmentary narrative exposition enabled a new mode of representation to reveal shocking moments of emotional uncertainty. The ‘poems and short stories’ Baudelaire happened upon in Paris were organised ‘in a vague, confused, disorderly way’, enacting a metaphorics of correspondences ‘that Poe had been able to bring together to perfection’ to expose a defensive struggle for self-expression and emotional coherence in the face of a fleeting and contingent view of everyday modern life (ibid.). But there is a second and unapologetically political response worthy of discussion here. It was Poe's dark vision of a democratic and progressive history that galvanised Baudelaire's commitments to develop one of the most unsettling features in his poetry. A mob fear haunts Les Fleurs du mal. And although my critical examination will resist the reductive assertion that Baudelaire's complicated politics were categorically anti-democratic, his fear of the court of public opinion as the immanent ordering force in modern life does offer a remarkably lucid, at times shockingly contentious, point of re-entry into his work. The issue can be rephrased thus: it was precisely Poe's terrifying portrayals of the escalating conflict between the will of the supposedly free individual and the mass public that resonated with Baudelaire's concerns about the widening gap between individual liberty and democratic rule. A fear of the unchecked power of the multitude was therefore difficult for Baudelaire to endure but more intolerable to surrender.
Baudelaire recognised non-rational impulses were guiding the political motivations of an emerging mass public.
Federalism is a core principle of American government; yet, how much attention is given to federalism beyond introductory courses? A 1969 study described American federalism as the “dark continent” of political science teaching. Based on surveys of chairs of US departments of political science and members of the APSA’s section on federalism and intergovernmental relations in 2013, the authors found that these course offerings have increased markedly since 1969, that the courses cover a range of topics, and that many department chairs are interested in offering these courses in the future. However, the teaching of comparative federalism lags far behind American federalism. Thus, comparative federalism remains a “dark continent” of federalism teaching.
In the coming decades, the broad outline of Kenyan development is quite likely to duplicate the Mexican experience of industrial and commercial progress for a minority combined with economic stagnation for the majority. While there are obvious differences in culture, history, and geography, there are basic similarities in industrialization strategy, agricultural structure, urban expansion, and population growth rate. A continuation of the current industrial, financial, and agricultural strategies, and an extrapolation of other variables along certain key paths, will almost surely lead Kenya to the same form of dualistic transformation that has gradually engulfed Mexico over the course of three decades. A rather important point which emerges from the following analysis is that only by a significant reduction in the population growth rate can that result be avoided.
An economic transformation is many faceted, involving technological, institutional, and cultural changes as well as significant shifts in the product mix. From an economic standpoint, an essential element of transformation is a significant growth in labor productivity which, in turn, is translated into a rising level of income per capita. This improvement in average product per worker largely involves the transfer of labor from traditional, low productivity sectors into those which spearhead modernization. This facet of economic transformation is referred to as labor force transformation and is generally understood to be a goal of economic development.