The ecological fecundity of the northern shore of Lake Victoria was vital to Buganda’s dominance of the interlacustrine region during the pre-colonial period. Despite this, protein-energy malnutrition was notoriously common throughout the twentieth century. This paper charts changes in nutritional illness in a relatively wealthy, food-secure area of Africa during a time of vast social, economic and medical change. In Buganda at least, it appears that both the causation and epidemiology of malnutrition moved away from the endemic societal causes described by early colonial doctors and became instead more defined by individual position within a rapidly modernising economy.
Juries, committees and expert panels commonly appraise things of one kind or another on the basis of grades awarded by several people. When everybody's grading thresholds are known to be the same, the results can sometimes be counted on to reflect the graders' opinions. Otherwise, they often cannot. Under certain conditions, Arrow's ‘impossibility’ theorem entails that judgements reached by aggregating grades do not reliably track any collective sense of better and worse at all. These claims are made by adapting the Arrow–Sen framework for social choice to study grading in groups.
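The claim that aggregated grades need not track any collective sense of better and worse can be illustrated with a small numerical sketch (the graders, candidates and scores below are invented for illustration and are not drawn from the paper): when grading thresholds differ, one grader who uses the scale expansively can outweigh a majority.

# Illustrative sketch: three graders score two candidates on a 1-5 scale.
# Graders A and B both judge X better than Y; grader C, whose thresholds
# differ, spreads grades across the whole scale.
grades = {
    "A": {"X": 3, "Y": 2},  # A ranks X above Y
    "B": {"X": 3, "Y": 2},  # B ranks X above Y
    "C": {"X": 1, "Y": 5},  # C ranks Y above X, grading far more expansively
}

# Aggregate by summing grades across graders.
totals = {c: sum(g[c] for g in grades.values()) for c in ("X", "Y")}
print(totals)  # {'X': 7, 'Y': 9}: Y wins on aggregate grades,
               # although two of the three graders rank X above Y.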
Demographic trends, and the disproportionate occupation of psychiatric hospital beds by people aged over 65, pointed to the increasing clinical needs of older people. Clinical work with older people often required different skills from work with younger people. ‘General psychiatrists’, nominally working with adults of all ages, usually had little interest in working with older people. By 1977, it was clear to clinical leaders in the field of psychogeriatrics that official recognition of their specialty by the government was essential to ensure service development. Official recognition would provide the means to collect data to identify gaps in services, to obtain information on the implementation of government guidance and to advocate for resources, including ensuring high-quality training posts for doctors wanting to specialise in the field. Doctors have traditionally taken the lead in creating new medical specialties, and psychogeriatrics was no exception. However, support for the specialty from the leadership of the Royal College of Psychiatrists fluctuated. Health service leaders who did not themselves work with older people were incredulous that others wished to do so. Negotiations between the Royal College of Psychiatrists and the Department of Health and Social Security about recognising psychogeriatrics were convoluted and prolonged. Recognition was achieved in 1989, following intervention by the Royal College of Physicians of London.
The Pioneer Health Centre, based in South London before and after the Second World War, remains a source of interest for advocates of a positive approach to health promotion in contrast with the treatment of those already ill. Its closure in 1950 for lack of funds has been blamed on the then recently established National Health Service, but this article argues that such an explanation is over-simplified and ignores a number of other factors. The Centre had struggled financially during the 1930s and tried to gain support from the Medical Research Council. The Council appeared interested in the Centre before the war, but was less sympathetic in the 1940s. Around the time of its closure and afterwards, the Centre was also involved in negotiations with London County Council; these failed because the Centre’s directors would not accept the changes which the Council would have needed to make. Unpublished documents reveal that the Centre’s directors were uncompromising and that their approach to the situation antagonised their colleagues. Changes in medical science also worked against the Centre. The success of sulphonamide drugs appeared to render preventive medicine less significant, while the development of statistical techniques cast doubt on the Centre’s experimental methods. The Centre was at the heart of the nascent organic farming movement, which opposed the rapid growth of chemical cultivation. But what might be termed ‘chemical triumphalism’ was on the march in both medicine and agriculture, and the Centre was out of tune with the mood of the times.
This article examines constructions of national musical identity in early twentieth-century Britain by exploring and contextualizing hitherto neglected discourses and practices concerning the production of an ‘English’ singing voice. Tracing the origins and development of ideas surrounding native vocal performance and pedagogy, I reconstruct a culture of English singing as a backdrop against which to offer, by way of conclusion, a reading of the ‘English voice’ performed in Ralph Vaughan Williams's song ‘Silent Noon’. By drawing upon perspectives derived from recent studies of song, vocal production, and national and aesthetic identity, I demonstrate that ‘song’ became a place in which the literal and figurative voices of performers and composers were drawn together in the making of a national music. As such, I advance a series of new historical perspectives through which to rethink notions of an English musical renaissance.
This paper analyses how research on antibiotic resistance has been a driving force in the development of new antibiotics. Drug resistance, while a problem for physicians and patients, offers attractive prospects for those who research and develop new medicines. It imposes limits on the usability of older medicines and simultaneously modifies pathologies in a way that opens markets for new treatments. Studying resistance can thus be an important part of developing and marketing antibiotics.
The chosen example is that of the German pharmaceutical company Bayer. Before World War Two, Bayer had pioneered the development of anti-infective chemotherapy, sulpha drugs in particular, but had missed the boat when it came to fungal antibiotics. Bayer’s world market presence, which had been considerable prior to the war, had plummeted, a decline exacerbated by the effects of the war. In this critical situation, the company opted for a development strategy that tried to capitalise on the problems created by the use of first-generation antibiotics. Part and parcel of this strategy was monitoring what can be called the structural change of infectious disease. In practice, this meant focusing on pathologies resulting from resistance and on hospital infections. In addition, Bayer also focused on lifestyle pathologies such as athlete’s foot. This paper follows drug development and marketing at Bayer from 1945 to about 1980. In this period, Bayer managed to regain some of its previous market standing but could not escape the overall crisis of anti-infective drug development from the 1970s onwards.