As an old man, Divie Bethune McCartee (1820–1900) would write down his life story as a history of the foreign missions movement in America, beginning decades before his birth with the work of his grandparents in New York City. There, his grandfather had been an honorary member of the London Missionary Society, a founder of the New York Missionary Society, and a host to missionaries on their way to foreign fields. In that household, his mother had grown up surrounded by the print culture of foreign missions, learning about the world and anxious to go herself as a missionary, but instead making peace with her role as a vigorous supporter of benevolent causes at home in New York. When talking to a later generation of women missionaries, McCartee said that when he became a medical missionary to China and Japan, he went in her place.
Humanitarian intervention seeks to stop mass atrocities – killings, rapes, ethnic cleansing – within countries. From 1945 until the formal end of the Cold War in 1991, many more noncombatants were killed by violence within countries than by war among them. The examples, since 1991 alone, include Rwanda, the Congo, Bosnia, East Timor, Kosovo, Libya, and Syria. In the Congo alone, violence – and especially the perilous conditions it created – claimed between 3 and 5 million lives between 1996 and 2003. Thomas Hobbes averred that the state alone could prevent people’s lives from being “nasty, brutish and short”; but in these and other instances, governments were the principal perpetrators of atrocities, which they methodically planned and committed to achieve political ends.
In the early modern period, extractive industries were the vanguard of European colonization in America. Whether involving the removal of minerals, flora, fauna, or other organic or inorganic materials, these ventures attracted enterprising Europeans hoping to profit from bringing natural resources out of newly accessible lands and into the expanding currents of international trade. Establishing a viable extractive industry – such as mining, logging, fishing, hunting animals, or collecting plants – proved a critical preliminary component of many settlement schemes by helping to generate the capital needed to underwrite their initial development and, ideally, by contributing to their ongoing productivity. Although the results were uneven, European powers, especially Spain, England, France, and the Netherlands, sought to parlay their subjects’ involvement in various American extractive industries to produce national wealth, claim sovereignty over new lands, justify the exploitation of subaltern populations, and lay the groundwork for more extensive imperial expansion. Whether as the foundations of new commodity frontiers or as the precursors to other forms of colonial development, extractive industries reshaped many regions in the Americas, often with dire outcomes for their Indigenous inhabitants and natural environments.
In 1903, the United States federal government narrowed the parameters of indigenous sovereignty in the formative Supreme Court case, Lone Wolf v. Hitchcock. The ruling followed in the wake of the Dawes Act of 1887, also known as the General Allotment Act, which authorized the disaggregation of communally held lands belonging to Native American nations. Although officials justified allotment on many grounds, including those framed as a humanitarian desire to prevent white settler violence, the policy opened up indigenous lands deemed surplus to the tribal population – a number generated by the US census – to sale to white settlers. Secretary of the Interior Ethan Allen Hitchcock, who served under President William McKinley and President Theodore Roosevelt from 1899 to 1907, had come to superintend the bureaucratic machinery discharging this legislative mandate. Hitchcock drew fire from Native Americans, who protested this unequivocal land grab he oversaw through a variety of means, including legal appeals. The Kiowa leader Lone Wolf sued Hitchcock, claiming that allotment violated the Treaty of Medicine Lodge Creek of 1867, one of hundreds of treaties defining US–Indian relations that specifically granted the Kiowa nation the right to reject sales of their land with a three-fourths majority vote. Eventually, the Supreme Court ruled in favor of Hitchcock, effectively ignoring the legal precedent of recognizing treaties with indigenous nations as well as their status as semi-sovereign “domestic dependent nations” – a status conferred in 1831 by Chief Justice John Marshall in earlier Supreme Court rulings. The dismissal of these legal precedents in 1903 with Lone Wolf v. Hitchcock further empowered Congress, instilled with newly articulated plenary power, to abrogate treaties and mobilize a variety of legal instruments to separate indigenous peoples from their ancestral and reservation lands.
In the aftermath of World War II, the United States almost singlehandedly created the web of international institutions and organizations that comprise what is often referred to as the “liberal world order.” In the economic realm, American policymakers, working most closely with the British, designed what would become the International Monetary Fund and the World Bank. On trade matters, the United States considered plans for an International Trade Organization, although it ultimately designed and backed the less ambitious General Agreement on Tariffs and Trade. These efforts put in place international regimes and organizations that remain consequential today. But by far the highest-level American attention was reserved for the United Nations (UN), a new global organization designed to ensure future world peace and security.
On June 14, 1944, over two hundred women, representing seventy-five organizations, gathered at the White House for a conference on “How Women May Share in Post-War Policy Making.” Sponsored by Eleanor Roosevelt, the conference centered on sharing arguments and strategies for ensuring women’s involvement in all aspects of the anticipated peace process. “The tasks of war, of peace, of nation-planning,” the attendees resolved, “must be shared by men and women alike … Women have been called upon to share the burdens of war, to stand side by side with men on the production line and to complement men in the fighting services. So women must share in the building of a post-war world fit for all citizens – men and women – to live and work in freely side by side.”1 These women demanded a place at the peace tables not only because they saw themselves as equal citizens, not only because they knew they had something to contribute to the peace process, but also because they felt themselves obligated to help secure the postwar world.
On June 2, 1897, Assistant Secretary of the Navy Theodore Roosevelt addressed the Naval War College in Newport, Rhode Island. Citing international competition and economic interests overseas, he called for the immediate expansion and modernization of the US military. New armaments, particularly battleships, were necessary for securing the vast amounts of territory, capital, and influence that the United States had accumulated within and beyond the continent of North America. “This nation cannot stand still if it is to retain its self-respect,” he pronounced.
For American policymakers, the end of the Cold War was, above all, a self-affirming experience. The four decades of global competition with Soviet communism had cast doubt on whether the United States was the most powerful or most righteous country in the world, but the peaceful and precipitous collapse of communism in the late 1980s appeared to confirm that it was indeed both. Looking to the future in the early 1990s, American policymakers were guided by two steadfast beliefs. First, they believed that the United States should remain the most powerful country in the world, and that American primacy in world affairs would receive the consent of the vast majority of other countries for the foreseeable future. Second, they believed that the United States’ form of political and economic organization – liberal democratic capitalism – was destined to benignly conquer the globe, and that it was the job of the US government to accelerate its expansion.
Many standard practices of US foreign policy around the globe today were forged in Latin America before the United States became a global superpower. It was in Latin America that the United States first trained sovereign military forces, used development as a counterinsurgency technique, experimented with cultural diplomacy, led in the creation of an international governmental organization, and signed a regional security pact. But Latin America was not a sterile laboratory for the unilateral development of US foreign policy. Rather, policymakers crafted these various strategies for advancing US interests in response to Latin American engagements with US power and changing political circumstances.
Americans had traditionally prided themselves on being able to maintain a small army. Protected by two oceans and following George Washington’s farewell counsel to steer clear of permanent alliances with the Europeans, they had fought their wars with volunteers. The first half of the twentieth century forced a radical revision of those traditional beliefs. Between 1900 and 1945 the United States built a powerful military capable of fighting in almost any corner of the globe. It also joined coalitions and alliances with European powers, forming the foundation for building the most powerful military alliance the world had yet known. The massive changes in the nature of American military power from 1900 to 1945 thus represent not an evolutionary shift, but a revolutionary one.
The United States was born in an imperial world, and grew to economic preeminence in an era of intense globalization. At first its citizens experienced empire as the struggle with the motherland and among the contending European powers that fought over North America and in the North Atlantic from 1750 to 1815. From 1815, this inter-imperial entanglement became increasingly focused on the relationship with the British empire. The American empire as both aspiration and practice emerged from and was heavily influenced by this relationship. Having developed its own imperial agenda under the hegemony of British naval and economic power on a global level, the United States gradually came to challenge this unequal relationship. Both informal and formal empire are relevant to such trans-imperial entanglements, as are non-state actors such as missionaries and entrepreneurs.
Frederick Douglass spoke for many when he told a Glasgow audience, during his first visit to Britain, that the aim of the antislavery international was the construction of a cordon of antislavery feeling around the United States. It was bounded by “Canada on the North, Mexico in the west, and England, Scotland and Ireland on the east, so that wherever a slaveholder went, he might hear nothing but denunciation of slavery, that he might be looked down upon as a man-stealing, cradle-robbing, and woman-stripping monster, and that he might see reproof and detestation on every hand.”1 The movement relied on the cooperation of people of good faith, men and women, black and white, committed to the destruction of slavery wherever it existed. It appealed to all classes and races, from the aristocracy of Stafford House, as J. Sella Martin, a fugitive slave, put it, to the “hard-handed working men of the great railway works of Stratford” in London’s East End.
Slavery connected North American colonies to a wider world in ways both obvious and subtle. Most obviously, forcing African people to toil in the New World introduced African knowledge, cultures, and languages into American societies, enriching not only slaveholders, but also American culture. African labor, skills, and traditions built American economies, shaped systems of production, and transformed American cuisine, music, and speech. Meanwhile, North American colonists adapted slavery’s caste system from precedents set elsewhere around the Atlantic. Slaveholding North American societies codified and elaborated systems to control the enslaved to suit their own ends and circumstances, articulating a racial division of rights and labor that uniquely constrained African American lives and subjected enslaved people to myriad abuses, but the basics of property in persons and hereditary servile status were imported.
Infused with common transnational sensibilities, this novel scholarship has taken a variety of interpretative paths. Some of these new histories were pioneered by diplomatic historians who increasingly placed the perceptions and policies of presidents, diplomats, and generals on a global stage or employed new, and sometimes non-American, archives to illuminate the perspectives of non-state actors from the worlds of business, activism, religion, and what we now call nongovernmental organizations. Other scholars have crafted social and cultural histories, offering a wider vision of American engagement in the world by exploring how the construction of American state and society has intersected with global forces and contestations over identity abroad. At the same time, this work shares many of the convictions that have animated new work on the Atlantic World, slavery, borderlands, migration and the environment, and in critical race and queer studies. In all these ways, historians have embraced multiple transnational optics to reimagine how US history was made.
A dinner party attended by high-level dignitaries in the prosperous Washington suburb of McLean, Virginia, is hardly an unusual affair. But this one was rather special. It was the evening of January 28, 1979, and Vice Premier Deng Xiaoping of the People’s Republic of China (PRC) sat between US Secretary of State Cyrus Vance and National Security Advisor Zbigniew Brzezinski. The atmosphere “was lively and friendly. Several toasts were given expressing hope for the future and pride for what had been accomplished.” Brzezinski, whose house provided the setting for the dinner, would remember it as one of the highlights of his career in the Carter Administration. And for good reason. For this was not a mere courtesy meeting. Deng, Brzezinski, Vance, and the others present were celebrating the full normalization of Sino-American relations.
A growing number of scholars have begun to use settler colonialism as a lens for US history, especially in the nineteenth century as the United States built its continental empire by dispossessing hundreds of Native nations. Most scholars of North America who have engaged settler colonialism have done so through the work of Australian historian Patrick Wolfe, who analyzed settler colonialism as a distinctive form of colonialism. In contrast to labor-extractive forms of colonialism in which colonizers use indigenous labor to reap profits, Wolfe argued, settler colonialists seek to eliminate indigenous people and take their lands. Wolfe’s notion of settler colonialism’s “logic of elimination” allows for the identification of a range of policies and practices – including outright genocide, removal, and assimilation – all with the goal of taking indigenous lands and replacing Native inhabitants with settlers.
Writing in 1917, the Seneca intellectual Arthur C. Parker claimed the following about the “American Indian”: “Loving liberty as he does, he will fight for it. Knowing the tragedy of ‘broken treaties,’ he will fight that there not be more treaties broken. Challenged, the Indian has responded and shown himself a citizen of the world and an exponent of an ethical civilization wherein human liberty is assured.” In a different essay, entitled “The American Indian in the World Crisis,” Parker called the American Indian “a world patriot.”1 During World War I, American Indians such as Parker asserted a global outlook as they sought citizenship at home and democracy for small nations abroad, an early form of the Double V campaign.