The bequest for the Church Peace Union—the predecessor of today's Carnegie Council for Ethics in International Affairs (and the publisher of this journal)—was given by Andrew Carnegie in February 1914. The Church Peace Union subsequently sponsored the first worldwide gathering of religious leaders, which was held in Constance, Germany, on August 2, 1914. The gathering was convened under the shadow of an impending war, and not all delegates managed to reach it. Six months previously, Carnegie had stipulated that the Church Peace Union devote its funds to the deserving poor “after the arbitration of international disputes is established and war abolished, as it certainly will be some day.” This could happen, he noted, “sooner than expected, probably by the Teutonic nations, Germany, Britain, and the United States first deciding to act in unison, the others joining later.” The outbreak of war was a catastrophic blow to such hopes, as the very nations expected to be at the core of this civilized project descended into an orgy of destruction the likes of which the world had never seen.
Peace is normally understood as the absence of war among nations. But that definition presupposes the overarching importance of nations as the key units of human association. There are, however, many other non-national entities, such as races, ethnic communities, religions, cultures, and civilizations. These entities, too, engage in conflict from time to time, as exemplified by the interracial violence and religious antagonisms in various parts of the world today and, of course, in the past. Yet why do we reserve the terms “war” and “peace” for interstate relations alone? This is a very limited perspective, inasmuch as war is a phenomenon whose appearance long preceded the formation of nations in the modern centuries; besides, a presumed state of peace among countries can conceal serious hostilities between races or religions within and across national boundaries. Nazi Germany was technically at peace with all countries until 1939, and yet violent acts were committed domestically against groups of people who were not considered racially acceptable. In today's world there are no large-scale international wars, but domestic tensions and physical assaults occur daily within many countries. Terrorists wage war against states and their citizens alike, but they are not nations. To counter their threat, war preparedness in the traditional sense may perhaps be useful, but it is much less effective than the coming together of individuals and groups to create a condition of interdependence and mutual trust. World peace must fundamentally be founded on a sense of shared humanity, regardless of which country people happen to live in. To consider war and peace purely in the context of international relations, therefore, is insufficient, even anachronistic. What we need is less an international than a transnational idea of peace.
The Arab Spring of 2011 is widely viewed today as one of the great historical moments of political transformation. Comparisons have been made to the European revolutions of 1848 and the post–cold war democratic transitions in Eastern Europe, while some have spoken of a possible “fourth wave” of democratization. These analogies make sense given that longstanding dictators who seemed impervious to political change, in a region known for persistent authoritarianism, were suddenly toppled by largely nonviolent protesters invoking the universal themes of political freedom, dignity, and social justice. From the outset, however, the Arab Spring was met by a small chorus of criticism and contempt from prominent intellectuals, writers, and politicians.
Broad comparisons of international relations across time—of the prospects for peace and of the possibilities for a new ethics for a connected world—typically focus on two dimensions: economic globalization and integration on the one hand, and the character of major interstate relations on the other. One of the most striking features of the pre-1914 world was precisely the coincidence of intensified globalization with a dramatic deterioration in major power relations, the downfall of concert-style approaches to international order, and the descent into total war and ideological confrontation—what T. S. Eliot termed “the panorama of futility and anarchy which is contemporary history.” Today's optimists stress the degree to which globalization appears much more firmly institutionalized than it was a hundred years ago, the rather striking success of global economic governance in responding to the financial crisis of 2007–2008 (compared to, say, the Great Depression), and the longer-term trend within international society away from major-power war. Pessimists are less sure. They worry that we have had to relearn just how unstable global capitalism can be, both in the wrenching societal changes produced by economic success and in the political strains produced by slowdown and recession. And they point to the abiding or resurgent power of nationalism in all of the core countries in the system, the return of balance-of-power thinking (above all in Asia), and the renewed salience of major power politics.
The war in Iraq is over. U.S. troops have withdrawn. Saddam Hussein has been overthrown and replaced with a government perceived to be more democratic and more just to the Iraqi people. In late 2011, concurrent with the U.S. withdrawal, strategists suggested that there was “peace at last” in Iraq, a cause for celebration.
Significant attention has been paid to the history of public health in England during the final part of the twentieth century. Within this, however, the field that came to be known as specialist health promotion (SHP) has been relatively neglected. Between 1980 and 2000 those working in this field, generally known as health promotion specialists (HPSs), enjoyed a relative rise in policy and practice prominence before SHP was effectively abandoned by government and others charged with developing and sustaining public-health structures. This paper seeks to explain why the fall of SHP is important; to move towards an explanation of its rise and decline; and to argue that greater historical attention should be paid to an important but neglected field within health and health care. Essentially, SHP emerged from a set of loose and contingent practices known as health education. A range of important social, economic, organisational and political influences contributed to the slow construction of a putative specialism in health promotion, accompanied by a desire on the part of some (but not all) HPSs to ‘professionalise’ their role. Finally, the projects of both specialisation and professionalisation failed, again as a result of the then-prevailing organisational and political influences. The importance of such a failure in a so-called era of public health is discussed. In the light of this, the paper concludes by briefly setting out an agenda for further research related to the history of SHP.
This article explores the ways in which risk and responsibility were conceptualised in the late nineteenth and early twentieth centuries by surgeons, their patients and the lay public. By this point surgery could be seen, simultaneously, as safe (due to developments in surgical science) and increasingly risky (because such progress allowed for greater experimentation). With the glorification of the heroic surgeon in the late Victorian and early Edwardian period came a corresponding, if grudging, recognition that successful surgery was supported by a team of ancillary professionals. In theory, therefore, blame for mistakes could be shared amongst the team; in practice, this was not always the case. After opening with an examination of the May Thorne negligence case of 1904, I will, in the latter third of this piece, focus on surgical risks encountered by women surgeons, themselves still relatively new and, therefore, potentially risky individuals. A brief case study of the ways in which one female-run institution, the New Hospital for Women, dealt with debates surrounding risk and responsibility concludes this article. The origin of the risks perceived and the ways in which responsibility was taken (or not) for risky procedures will provide ways of conceptualising what ‘surgical anxiety’ meant in the 1890s and 1900s.
Historical accounts of colonial science and medicine have failed to engage with the Colonial Office’s shift in focus towards the support of research after 1940. A large new fund was created in 1940 to expand activities described as fundamental research in the colonies. With this new funding came a qualitative shift in the type of personnel and activity sought for colonial development and, as a result, a diverse group of medical and technical officers existed in Britain’s colonies by the 1950s. The variety that existed amongst British officers in terms of their qualifications, institutional locations and relationships with colonial and metropolitan governments makes the use of the term ‘expert’ in much existing historical scholarship on the scientific and medical aspects of empire problematic. This article will consider how the Colonial Office achieved this expansion of research activities and personnel after 1940. Specifically, it will focus on the reasons officials sought to engage individuals drawn from the British research councils to administer this work, and on the consequences of their involvement for the new apparatus established for colonial research after 1940. An understanding of the implications of applying the research council system to the Colonial Empire requires engagement with the ideology promoted by the Agricultural Research Council (ARC) and the Medical Research Council (MRC), which placed emphasis on the distinct and higher status of fundamental research and which privileged freedom for researchers.