Truthmaking is the metaphysical exploration of the idea that what is true depends upon what exists. Truthmaker theorists argue about what the truthmaking relation involves, which truths require truthmakers, and what those truthmakers are. This Element covers the dominant views on these core issues in truthmaking. It also explores some key metaphysical topics and debates that are usefully approached by employing the tools of truthmaker theory: the debate between presentists and eternalists over the existence of entities from the past, and the debate between actualists and possibilists over merely possible states of affairs. In the final section, the Element explores how to think about truthmakers for truths involving social constructions.
Anti-Racist Shakespeare argues that Shakespeare is a productive site to cultivate an anti-racist pedagogy. Our study outlines the necessary theoretical foundations for educators to develop a critical understanding of the longue durée of racial formation so that they can implement anti-racist pedagogical strategies and interventions in their classrooms. This Element advances teaching Shakespeare through race and anti-racism in order to expose students to the unequal structures of power and domination that are systemically reproduced within society, culture, academic disciplines, and classrooms. We contend that this approach to teaching Shakespeare and race empowers students not only to see these paradigms but also to take action by challenging and overturning them. This title is also available as Open Access on Cambridge Core.
Normalizing flows, diffusion normalizing flows and variational autoencoders are powerful generative models. This Element provides a unified framework that handles these approaches via Markov chains. The authors consider stochastic normalizing flows as a pair of Markov chains fulfilling some properties, and show how many state-of-the-art models for data generation fit into this framework. Indeed, numerical simulations show that including stochastic layers improves the expressivity of the network and allows for generating multimodal distributions from unimodal ones. The Markov chain point of view enables the coupling of both deterministic layers, such as invertible neural networks, and stochastic layers, such as Metropolis-Hastings layers, Langevin layers, variational autoencoders and diffusion normalizing flows, in a mathematically sound way. The authors' framework establishes a useful mathematical tool for combining the various approaches.
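The coupling described in this abstract, deterministic invertible layers alternated with stochastic Markov-chain layers, can be sketched in a toy form. Everything below (the affine layer standing in for an invertible network, the bimodal target, the step sizes) is a hypothetical illustration assumed for this sketch, not code from the Element:

```python
import numpy as np

# Toy "stochastic normalizing flow" on 1-D samples: a Markov chain that
# alternates a deterministic invertible layer with stochastic Langevin layers.
# Target: a bimodal mixture of two Gaussians (hypothetical example).

rng = np.random.default_rng(0)

def log_target(x):
    # log density (up to a constant) of 0.5*N(-2, 0.5^2) + 0.5*N(2, 0.5^2)
    return np.logaddexp(-0.5 * ((x + 2) / 0.5) ** 2,
                        -0.5 * ((x - 2) / 0.5) ** 2)

def deterministic_layer(x, scale=1.2, shift=0.0):
    # invertible affine map, a stand-in for an invertible neural network layer
    return scale * x + shift

def langevin_layer(x, step=0.05):
    # one unadjusted Langevin step toward the target density
    eps = 1e-4
    grad = (log_target(x + eps) - log_target(x - eps)) / (2 * eps)
    return x + step * grad + np.sqrt(2 * step) * rng.standard_normal(x.shape)

# start from a unimodal latent distribution and run the chain
x = rng.standard_normal(5000)
x = deterministic_layer(x)
for _ in range(200):
    x = langevin_layer(x)

# the stochastic layers split the unimodal input into two modes near +/-2
left = np.mean(x < 0)
print("fraction in left mode:", round(float(left), 2))
```

The point of the sketch is the abstract's claim about expressivity: a purely deterministic invertible map cannot turn a unimodal density into a bimodal one without extreme distortion, while interleaving stochastic (here Langevin) layers does so naturally.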
Theory is not a set of texts; it is a style of approach. It is to engage in the act of speculation: gestures of abstraction that re-imagine and dramatize the crises of living. This Element is both a primer for understanding some of the more predominant strands of critical theory in the study of religion in late antiquity, and a history of speculative leaps in the field. It is a history of dilemmas that the field has tried to work out again and again - questions about subjectivity, the body, agency, violence, and power. This Element additionally presses us on the ethical stakes of our uses of theory, and asks how the field's interests in theory help us understand what's going on, half-spoken, in the disciplinary unconscious.
Ironic language is a salient reminder that speakers of all languages do not always mean what they say. While ironic language has captured the attention of theorists and scholars for centuries, it is only since the 1980s that psycholinguistic methods have been employed to investigate how readers and hearers detect, process, and comprehend ironic language. This Element reviews the foundational definitions, theories, and psycholinguistic models of ironic language, covering key questions such as the distinction between literal and ironic meaning, the role of contextual information during irony processing, and the cognitive mechanisms involved. These key questions continue to motivate new studies and methodological innovations, providing ample opportunity for future researchers who wish to continue exploring how ironic language is processed and understood.
Our minds are severely limited in how much information they can extensively process, despite being massively parallel at the visual end. When people attempt to track moving objects, only a limited number can be tracked, and that number varies with display parameters. Associated experiments indicate that spatial selection and updating have higher capacity than selection and updating of features such as color and shape, and are mediated by processes specific to each cerebral hemisphere, such that each hemifield has its own spatial tracking limit. These spatial selection processes act as a bottleneck that gates subsequent processing. To improve our understanding of this bottleneck, future work should strive to avoid contamination of tracking tasks by high-level cognition. While we are far from fully understanding how attention keeps up with multiple moving objects, what we already know illuminates the architecture of visual processing and offers promising directions for new discoveries.
Wittgenstein published next to nothing on the philosophy of religion and yet his conception of religious belief has been both enormously influential and hotly contested. In the contemporary literature, Wittgenstein has variously been labelled a fideist, a non-cognitivist and a relativist of sorts. This Element shows that all of these readings are misguided and seriously at odds, not just with what Wittgenstein says about religious belief, but with his entire later philosophy. This Element also argues that Wittgenstein presents us with an important 'third way' of understanding religious belief – one that does not fall into the trap of either assimilating religious beliefs to ordinary empirical or scientific beliefs or seeking to reduce them to the expression of certain attitudes.
This Element contributes to existing research with an analysis of public understandings of democracy based on original surveys fielded in Indonesia, Malaysia, the Philippines, Singapore and Thailand. It conceptualises democracy as consisting of liberal, egalitarian and participatory ideals, and investigates the structure of public understandings of democracy in the five countries. It then proceeds to identify important relationships between conceptions of democracy and other attitudes, such as satisfaction with democracy, support for democracy, trust in institutions, policy preferences and political behaviour. The findings suggest that a comprehensive analysis of understandings of democracy is essential to understand political attitudes and behaviours.
Suffering is ubiquitous. Quests to make sense of it in relation to the existence of God – and to find meaning in our lives in the face of it – are significant aspects of the human experience. Evil and Theodicy motivates the project of theodicy by examining arguments rooted in evil against God's existence and by critically assessing the response of skeptical theism. Ekstrom explores eight different lines of theodicy. She argues that, even if the prospects for theodicy are dim with respect to defending the rationality of theistic belief in light of suffering, work on theodicy is nonetheless practically useful.
Mary Brazelton argues that the territories and peoples associated with China have played vital roles in the emergence of modern international health. In the early twentieth century, repeated epidemic outbreaks in China justified interventions by transnational organizations; these projects shaped strategies for international health. China has also served as a space of creativity and reinvention, in which administrators developed new models of health care during decades of war and revolution, even as traditional practitioners presented alternatives to Western biomedicine. The 1949 establishment of the People's Republic of China introduced a new era of socialist internationalism, as well as new initiatives to establish connections across the non-aligned world using medical diplomacy. After 1978, the post-socialist transition gave rise to new configurations of health governance. The rich and varied history of Chinese involvement in global health offers a means to make sense of present-day crises.
Deontology is a theory about how we should act, morally speaking. It comes in several varieties, but all share certain doctrines, many of which are close to those found in the so-called 'common-sense morality' of the Western world. And all varieties are united in their opposition to consequentialism, a theory that, in its simplest form, tells us that we should always act so as to maximize impersonal value by bringing about the best consequences. This Element presents some of the different versions of deontology, including the views of W. D. Ross, and, to a lesser extent, Immanuel Kant. It defends certain deontological tenets, while challenging others, and contrasts them with consequentialism. Deontology and consequentialism are two of the main contenders in ethical theory, but virtue ethics is another, and it too is addressed (briefly), with an attempt to see it, in its most plausible form, as part of deontology.
For many years, suggestions to 'geoengineer' the climate occupied a marginal role in climate change science and politics. Today, visions of massive carbon drawdown and sunlight reflection have become reasonable additions to conventional mitigation and adaptation. Why did researchers start engaging with ideas that were, for a long time, considered highly controversial? And how did some of these ideas come to be perceived as worthy of research funding and in need of international governance? This Element provides an analysis of the recent history and evolution of geoengineering as a governance object. It explains how geoengineering evolved from a thought shared by a small network into a governance object that is likely to shape the future of climate politics. In the process, it generates a theory on the earliest phase of the policy cycle and sheds light on the question of why we govern the things we govern in the first place.
In line with a profound theological understanding of liturgy as the Church at prayer (ecclesia orans), the focus of this Element is the variegated ways in which Christians address, turn to, and worship God in their central rituals and celebrations. Surveying a representative sample of official liturgical sources from different Christian Churches, it asks how 'pure' the monotheism expressed in them is. For one could argue that there is some ambiguity involved, especially with respect to (i) the peculiar position of Christ, the Son of God, and God the Father in liturgical prayers, and (ii) the veneration of the saints. The essential key to unlocking this complex and multi-layered reality is a meticulous study of the essentially doxological nature of Christian liturgy, both from a phenomenological point of view and on the basis of fine-grained textual analyses.
This Element outlines the recent understanding of ensemble representations in perception in a holistic way, aimed at engaging a general audience, novice and expert alike. The Element highlights the ubiquitous nature of this summary process, paving the way for a discussion of its theoretical and cortical underpinnings, and of why ensemble encoding should be considered a basic, inherently necessary component of human perception. Following an overview of the topic, including a brief history of the field, the Element introduces overarching themes and a corresponding outline of the present work.
Recent advances in molecular diagnostic techniques have led to the identification of targetable alterations, prompting a paradigm shift in the management of non-small cell lung cancer (NSCLC) and an era of precision oncology. This Element highlights the most clinically relevant oncogenic drivers other than EGFR, their management, and current advances in treatment. It also examines the challenges of resistance to targeted therapies, the diagnostic dilemmas for each oncogenic driver, and the future direction of NSCLC management.
This Element presents and discusses the main trajectories in the evolution of the concept of ambiguity and the most relevant theoretical contributions developed around it. It specifically elaborates on both the intrinsic perspectives on ambiguity as an inherent part of organizational decision-making processes and the more recent strategic perspectives on discursively constructed strategic ambiguity. It helps illuminate the path ahead for organizational scholars and offers new avenues for future research. This is important given the ever more pervasive presence of ambiguity in and around organizations and societies.
Behavioral economics provides a rich set of explicit models of non-classical preferences and belief formation that can be used to estimate structural models of decision making. At the same time, experimental approaches allow the researcher to exogenously vary components of the decision-making environment. The synergies between behavioral and experimental economics provide a natural setting for the estimation of structural models. This Element covers examples supporting the following arguments: (1) experimental data allow the researcher to estimate structural models under weaker assumptions and can simplify their estimation; (2) many popular models in behavioral economics can be estimated without any programming skills using existing software; (3) experimental methods are useful for validating structural models. This Element aims to facilitate the adoption of structural modelling by providing Stata code to replicate some of the empirical illustrations presented. Examples covered include the estimation of outcome-based preferences, belief-dependent preferences and risk preferences.
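The kind of structural estimation the abstract describes can be illustrated with a minimal sketch: recovering a risk-preference parameter by maximum likelihood from binary lottery choices. This is a hypothetical Python illustration, not the Element's Stata code; the CRRA utility with a logit choice rule is a standard modelling choice assumed here, and all lottery values and parameters are invented:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Structural estimation sketch: recover a CRRA risk-aversion parameter r
# from 500 binary choices between a safe amount and a two-outcome lottery,
# assuming a logit choice rule. All values are hypothetical.

rng = np.random.default_rng(1)

def crra(x, r):
    # CRRA utility u(x) = x^(1-r)/(1-r); r = 1 (log utility) is avoided
    # here by bounding the search away from 1
    return x ** (1 - r) / (1 - r)

# hypothetical experimental design: safe amount vs. lottery (b_hi w.p. p, else b_lo)
safe = rng.uniform(5, 15, size=500)
b_lo = rng.uniform(1, 5, size=500)
b_hi = rng.uniform(15, 30, size=500)
p = rng.uniform(0.2, 0.8, size=500)

def choice_prob_safe(r, lam=1.0):
    # logit probability of choosing the safe option, given utility difference
    eu_safe = crra(safe, r)
    eu_risky = p * crra(b_hi, r) + (1 - p) * crra(b_lo, r)
    return 1.0 / (1.0 + np.exp(-lam * (eu_safe - eu_risky)))

# simulate choices of an agent with true r = 0.5
true_r = 0.5
chose_safe = rng.random(500) < choice_prob_safe(true_r)

def neg_log_lik(r):
    q = np.clip(choice_prob_safe(r), 1e-10, 1 - 1e-10)
    return -np.sum(chose_safe * np.log(q) + (~chose_safe) * np.log(1 - q))

fit = minimize_scalar(neg_log_lik, bounds=(0.05, 0.95), method="bounded")
print("estimated r:", round(float(fit.x), 2))
```

Because the choices are simulated from a known parameter, the exercise also shows the validation logic the abstract points to: an experiment with exogenously varied lotteries lets the researcher check whether the structural estimator recovers the preferences that generated the data.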
Overuse has become a major issue of healthcare quality, safety, and sustainability around the world. In this Element, the authors discuss concepts, terminology, and the history of concerns. They show how interventions to address overuse target multiple drivers. They highlight successes and promising approaches, but also challenges in generating and using evidence about overuse. They emphasise that different stakeholder perceptions of value must be recognised. System-level efforts to restrict access to services have created tensions between stakeholder groups and stimulated politicised debates about rationing. They argue for clear articulation of priorities, problem definition, mechanisms for interventions, and areas of uncertainty. Policy-makers should prioritise transparency, be alert to inequalities as they seek to reduce overuse, and consider how to balance controlling use with enabling clinicians to respond to individual circumstances. The complexity of the drivers and possible solutions to overuse require the use of multiple research methods, including social science studies. This title is also available as Open Access on Cambridge Core.
This Element offers an opinionated and selective introduction to philosophical issues concerning idealizations in physics, including the concept of and reasons for introducing idealization, abstraction, and approximation, possible taxonomy and justification, and application to issues of mathematical Platonism, scientific realism, and scientific understanding.
Historically, simulation was used as an education and training technique in healthcare, but it now has an emerging role in improving quality and safety. Simulation-based techniques can be applied to help understand healthcare settings and the practices and behaviours of those who work in them. Simulation-based interventions can help to improve care and outcomes – for example, by improving the readiness of teams to respond effectively to situations or by improving skill and speed. Simulation can also help test planned interventions and infrastructural changes, allowing possible vulnerabilities and risks to be identified and addressed. Challenges include cost, resources, training, and evaluation, and the lack of connection between the simulation and improvement fields, both in practice and in scholarship. The business case for simulation as an improvement technique remains to be established. This Element concludes by offering a way forward for simulation in practice and for future scholarly directions to improve the approach. This title is also available as Open Access on Cambridge Core.