
Dynamic Benefit-Cost Analysis for Uncertain Futures

Published online by Cambridge University Press:  02 September 2019

Susan E. Dudley*
Affiliation:
The George Washington University, GW Regulatory Studies Center, 805 21st Street NW, Suite 600, Washington, DC 20052, USA, email: sdudley@gwu.edu
Daniel R. Pérez
Affiliation:
The George Washington University, GW Regulatory Studies Center, 805 21st Street NW, Suite 600, Washington, DC 20052, USA, email: danielperez@gwu.edu
Brian F. Mannix
Affiliation:
The George Washington University, GW Regulatory Studies Center, 805 21st Street NW, Suite 600, Washington, DC 20052, USA, email: bmannix@gwu.edu
Christopher Carrigan
Affiliation:
The George Washington University Trachtenberg School of Public Policy and Public Administration, GW Regulatory Studies Center, 805 21st Street NW, Suite 600, Washington, DC 20052, USA, email: ccarrigan@gwu.edu

Abstract

“Uncertain futures” refers to a set of policy problems that possess some combination of the following characteristics: (i) they potentially cause irreversible changes; (ii) they are widespread, so that policy responses may make sense only on a global scale; (iii) network effects are difficult to understand and may amplify (or moderate) consequences; (iv) time horizons are long; and (v) the likelihood of catastrophic outcomes is unknown or even unknowable. These characteristics tend to make uncertain futures intractable to market solutions because property rights are not clearly defined and essential information is unavailable. These same factors also pose challenges for benefit-cost analysis (BCA) and other traditional decision analysis tools. The diverse policy decisions confronting decision-makers today demand “dynamic BCA,” analytic frameworks that incorporate uncertainties and trade-offs across policy areas, recognizing that: perceptions of risks can be uninformed, misinformed, or inaccurate; risk characterization can suffer from ambiguity; and experts’ tendency to focus on one risk at a time may blind policymakers to important trade-offs. Dynamic BCA – which recognizes trade-offs, anticipates the need to learn from experience, and encourages learning – is essential for lowering the likelihoods and mitigating the consequences of uncertain futures while encouraging economic growth, reducing fragility, and increasing resilience.

Type
Symposium on Analysis for Uncertain Futures
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© Society for Benefit-Cost Analysis, 2019

1 Introduction and context

Policymakers frequently face demands to act now to protect against a wide range of future risks, and to do so without impeding economic growth. Yet, in some cases, traditional analytical tools – decision analysis, benefit-cost analysis (BCA), risk analysis, etc. – may be inadequate to support the analyst charged with modeling the relevant uncertainties and trade-offs. Although some risks may be estimated using actuarial methods, others, particularly those with unknown probabilities and potentially severe and widespread consequences, may not.

BCA has enjoyed both academic and bipartisan support as a valuable approach for examining different policy options and informing policy choices. Since at least the 1970s, it has been the preferred instrument for comparing incremental (marginal) alternatives and addressing discrete policy questions in isolation. As traditionally applied, however, BCA may be ill-equipped to cope with certain challenges now facing policymakers, including those associated with climate change, nuclear war, cyber attacks against critical infrastructure, widespread natural disasters, global pandemics, and systemic financial crises.

Traditional methods of analysis that focus on marginal effects can break down when dealing with large irreversible changes, with policies that require large-scale coordination to succeed or that only make sense on a global scale, in situations involving networked effects, or when confronting long time horizons. Furthermore, the likelihood that a particular problem will occur can be unknown or even unknowable. For these reasons, we refer to these issues as “uncertain futures.”

Marginal analysis is most valuable when examining one issue at a time. Experts who focus on a particular source of risk (e.g., war, climate change, cyber terrorism, financial system collapse, etc.) often consider policy approaches in isolation – the phenomenon of narrow framing (Kahneman, 2003). However, committing vast resources to one problem may harm economic growth and make society less resilient and less able to cope with other (anticipated or unanticipated) events or challenges. Specialists in different policy areas, responding to the perceived crise du jour in their respective fields, may compete to bring attention to what each sees as the highest priority at that moment. The resulting competition can become a “common pool” problem if it depletes resources and impairs an overall capacity to respond to the future as it unfolds (Ostrom, 1990).

This does not mean that BCA and other analytical tools are unimportant for ensuring that policies support and enhance well-being. Rather, the diverse policy choices confronting decision-makers today call for broader frameworks that incorporate uncertainties and trade-offs across policy decisions. More flexible and dynamic decision-analysis approaches that anticipate the need to learn from experience, and that encourage learning, are essential. To facilitate the development of dynamic BCA methods for analyzing uncertain futures, the GW Regulatory Studies Center commissioned four papers that explore, from different perspectives, analytical approaches to inform policies that address uncertain futures. The collective research output is premised on the idea that policy analysis of uncertain futures can benefit from cross-fertilization of ideas and interdisciplinary analytical tools.

In the first of these papers, Louis Anthony Cox, Jr. applies insights from machine learning – especially deep multi-agent reinforcement learning – to reveal how incremental learning and improvement approaches (i.e., “muddling through”) can supplement and reinforce traditional decision analysis (Cox, 2019). In fact, Cox demonstrates that addressing problems characterized as uncertain futures through successive limited comparisons informed by reinforcement learning techniques ultimately reaches the same policy prescriptions as traditional optimization techniques, while being inherently better suited to managing these types of issues.

Second, Fred Roberts applies risk assessment to scenarios of terrorist attacks on critical infrastructure, including U.S. sporting venues and the international maritime transportation system. He finds that risk assessments of terrorist attacks traditionally treat physical and cyber attacks separately and, as a result, are inappropriate for considering the risks associated with combined attacks that include both a physical and cyber component. In response, he proposes a framework informed by expert judgment to determine whether an attacker would likely prefer executing a combined or traditional physical attack on a given target (Roberts, 2019).

In the third commissioned paper, James Scouras examines nuclear war as a global catastrophic risk and shows that multidisciplinary studies combining insights from “historical case studies, expert elicitation, probabilistic risk assessment, complex systems theory, and other disciplines” can address many of the shortcomings of single analytic approaches. He argues that experts can address current gaps in their assessments of the consequences of nuclear weapons by further investigating understudied phenomena (e.g., the effects of electromagnetic pulse, nuclear winter, the cascading effects of nuclear war on the interdependent infrastructures that sustain societies, etc.) (Scouras, 2019).

Finally, Viscusi et al. show that adopting precautionary measures in the face of risk ambiguity can increase, rather than mitigate, the risk of adverse outcomes by undervaluing the information that can be gained through trial and error. Instead, policymakers should exploit ambiguity and opportunities for learning about uncertain risks by, for example, making incremental investments in the presence of irreversible effects. They also suggest that these decision-makers should resist the temptation to adopt different discounting procedures for temporally remote effects since standard discounting procedures without any ad hoc adjustment can properly weight future benefits and costs (Viscusi et al., 2019).

In this overview paper, we complement these contributions not only by describing what we mean by uncertain futures, but also by illustrating the challenges associated with anticipating these futures and developing effective policies to address them. We further demonstrate that, in managing these types of problems, dynamic BCA approaches that learn from different disciplines and utilize experimentation offer many advantages.

2 Characteristics of uncertain futures

Problems that invite policy analysis are generally concerned with public goods, externalities, or some other market imperfection that makes collective action worth considering. The subset we refer to as uncertain futures are those that appear relatively intractable because they possess some combination of the following characteristics:

  • They potentially cause irreversible changes;

  • They are widespread, so that policy responses may make sense only on a global scale;

  • Network effects are difficult to understand and may amplify (or moderate) consequences;

  • Time horizons are long; and

  • The likelihood of catastrophic outcomes is unknown or even unknowable.

These characteristics pose substantial challenges for policy analysis and decision frameworks needed to support collective decisions. Encompassed in our definition of uncertain futures are what Taleb (2007) refers to as “black swans,” unanticipated, consequential, and rare events for which it is difficult both to characterize the hazard and to calculate the probability of its occurrence.

In examining these types of problems, some researchers have focused attention on their potential irreversibility and catastrophic consequences (Sunstein, 2006; Weitzman, 2011). For example, Sunstein (2006, 896) argues that these attributes make it worthwhile for regulators to consider investing resources in preserving flexibility of policy approaches as society gains knowledge, thus decreasing the uncertainty surrounding each individual risk. Analogously, Weitzman (2011, 291) notes that the catastrophic nature of these risks requires the use of analytical frameworks that go beyond “standard” benefit-cost analysis. To do so, he advocates greater use of contingency planning with a particular emphasis on developing policy portfolios of last-resort options, such as geoengineering as an approach to addressing climate issues.

Considering terrorism and natural disasters, Viscusi (2009) identifies four main differences between these and other risks for which economists routinely estimate money-risk trade-offs, including those associated with jobs, products, and motor vehicles. The first distinction is that terrorism and natural disasters often generate clusters of deaths in a single catastrophe. Second, the perceived probability of death varies across a country, unlike with routine risks where the probabilities are more uniform. For example, while terrorist attacks and the highest-impact effects of natural disasters are concentrated geographically, motor vehicle accidents are more diffusely distributed. A third difference is that risks associated with terrorism and natural disasters are public risks that are less often the result of market exchange and, to the extent they are unanticipated, are not captured in market risk premiums. Finally, terrorism and natural disasters often have additional effects, such as changing underlying perceptions of national security, which extend beyond the more evident impacts on life and property (Viscusi, 2009, 191-192).

2.1 Example: catastrophic climate change

One example of a challenging policy problem that meets the criteria identified above for an uncertain future is catastrophic climate change. While the occurrence of climate change is widely accepted, the nature and probability of extreme outcomes is much less understood. Forecasted effects occur decades if not centuries into the future and may be irreversible. The scale is global, and worst case “fat tail” scenarios depend on a network of uncertain linkages between human and natural systems. In its 2018 special report, Global Warming of 1.5°C, the Intergovernmental Panel on Climate Change (IPCC) (2018) estimates global temperature increases ranging from 1.46 to 3.66°C by the end of the 21st century. The IPCC warns, however, that temperature increases at or above 2°C increase the likelihood of triggering “tipping points” – critical thresholds beyond which a system can experience significant, often irreversible changes including: “habitat losses for organisms; increases in risks related to flooding from sea level rise; increases in extreme drought; and reductions in rainfall and water availability” (IPCC, 2018, 261).

2.2 Example: systemic financial crisis

As another example, the global financial crisis of 2007 and subsequent “Great Recession” caused monetary authorities to take emergency actions that, in scope and scale, were without precedent. Despite considerable analysis, legislation, and regulation, whether the financial system today is any less vulnerable to a major dislocation is unclear. Much of the ongoing concern centers on institutions deemed “too-big-to-fail”; yet we cannot be certain that ensuring the health of individual institutions will adequately mitigate systemic risk, given the possibility that seemingly small causes may produce a cascade of larger consequences.

The systemic risk embedded in financial markets may be due, in part, to the size and contours of the governing policy institutions, the financial, monetary, and fiscal authorities at the national and supranational level. The struggles to contain the financial crisis in Greece offer one example. But systemic risk also follows from the dense interconnectedness of the modern financial system. In 2015, the U.S. Commodity Futures Trading Commission (CFTC) accused an individual trader in London of causing the trillion-dollar “flash crash” of May 6, 2010 (Whipp & Scannell, 2016). Regardless of the merit of the accusation, it should be clear that prosecuting an individual is not an adequate policy response to what is a sign of systemic instability. High-speed algorithmic trading has come to dominate financial exchanges, raising concerns about their vulnerability to sudden disruption (Budish et al., 2015). Today’s markets operate so fast that a financial collapse may be catastrophic in its amplitude and global in its scope, proliferating before anyone recognizes what has happened. Authorities continue to struggle with an understanding of systemic risk, options for moderating it, and methods of analyzing it.

2.3 Example: cyberattack on the power grid

Experts have long considered the U.S. power grid a logical target for a major cyberattack. Increasingly sophisticated and interconnected power grids allow electric utilities to use technological advances in networking to operate with greater efficiency and reliability; however, this interconnectedness may also amplify the potential damage caused by a coordinated cyberattack, an electromagnetic pulse weapon, or a large solar storm. Worst-case scenarios include nationwide blackouts that persist anywhere from a matter of days in some places to several weeks in others. Power outages of this magnitude could cause billions of dollars in economic damage and present significant public safety concerns.

U.S. agencies, including the Department of Homeland Security (DHS) and the Department of Energy (DOE), recognize that the interconnectedness of the U.S. energy grid makes it vulnerable to “cascading failure.” Admiral Michael Rogers testified in 2014 that several countries likely have the capability of shutting down the U.S. power grid (House Select Intelligence Committee, 2014). However, significant uncertainty exists regarding how best to manage this risk. For example, utilities must balance the low-probability but possibly catastrophic risk of a cyberattack against the near certainty of routine weather events affecting their customers. Experts note that utilities operate on thin profit margins, and decisions to increase spending on cybersecurity to prevent low-probability harms require spending less on managing highly probable, weather-related risks (e.g., investing in burying power lines or raising them above tree lines) (Knake, 2017).

For regulated utilities, capital expenditures must be justified to a regulatory commission while the appropriate level of investment required to sufficiently safeguard the grid against cyber attacks is unknown. In its 2015 report, DOE estimated that an investment of $15.2 billion might be sufficient to insulate the U.S. energy grid against cyber attacks (Knake, 2017). Yet, its Quadrennial Energy Review released just one year later found that the level of investment necessary actually ranged from $350 billion to $500 billion (U.S. Department of Energy, 2017). These wide-ranging estimates exemplify the deep uncertainty surrounding the appropriate level of investment to safeguard the grid.

Further, as Roberts (2019) details in his article, a cyberattack might be used to facilitate a physical attack on a target, either to increase the attack’s probability of success, reduce its cost, or increase the magnitude of its effects. Future assessments considering the appropriate level of investment required to secure the grid will require analytical tools that consider the joint effects of “combined attacks.” Such assessments will also have to account for the synergistic effects of risk mitigation that addresses multiple threats. For example, methods employed to harden the electric grid could also lessen the consequences of cyberattacks, electromagnetic pulses, solar eruptions, and other hazards.

3 Policy challenges for addressing uncertain futures

The structure of uncertain futures problems can inhibit market mechanisms from solving them, but designing appropriate policies to address them is also challenging. Traditional analytical tools may not be adequate for uncertain futures because, in such cases, perceptions of risks can be uninformed, misinformed, or inaccurate; risk characterization can suffer from ambiguity; and experts’ tendency to focus on one risk at a time may blind policymakers to important trade-offs.

3.1 Traditional analytical tools may be inadequate

Traditional decision analysis examines a set of decision alternatives and evaluates them by considering the probabilities and utilities of different outcomes associated with each. With this information, choices can be optimized subject to feasibility constraints (e.g., see Raiffa, 1968; Abbas & Howard, 2015). However, when reliable information on probabilities is lacking, the analysis becomes much more complicated.
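The basic logic can be sketched numerically. The following is our own illustration, not drawn from the literature cited above; the alternatives, probabilities, and utilities are invented for exposition.

```python
# A minimal decision-analysis sketch: each alternative carries hypothetical
# outcome (probability, utility) pairs, and we pick the alternative with the
# highest expected utility. All numbers are arbitrary assumptions.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Hypothetical policy alternatives with assumed probabilities and utilities.
alternatives = {
    "do_nothing":    [(0.90, 100), (0.10, -500)],   # small chance of large loss
    "regulate_now":  [(1.00, 40)],                  # certain but modest payoff
    "pilot_program": [(0.50, 80), (0.50, 20)],      # learn before committing
}

best = max(alternatives, key=lambda a: expected_utility(alternatives[a]))
print(best, expected_utility(alternatives[best]))
```

When the probabilities in such a table are unknown or unknowable (the defining feature of uncertain futures), the expected-utility ranking cannot be computed at all, which is precisely where this traditional apparatus breaks down.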

To address these challenges, Cox (2019) reviews the insights of Charles Lindblom (see Lindblom, 1959), who raised concerns about the practicality of “rational-comprehensive decision-making,” such as decision analysis and statistical decision theory, for understanding and managing complex situations. Cox observes that:

Managing large-scale, geographically distributed, and long-term risks arising from diverse underlying causes – ranging from poverty to underinvestment in protecting against natural hazards or failures of sociotechnical, economic, and financial systems – poses formidable challenges for any theory of effective social decision-making. Participants may have different and rapidly evolving local information and goals, perceive different opportunities and urgencies for actions, and be differently aware of how their actions affect each other through side effects and externalities (Cox, 2019, 1-2).

Although scholars maintain that BCA “is an indispensable step in rational decision-making in this as in other areas of government regulation” (Posner, 2004, 139), they simultaneously recognize that the characteristics of these events make them “intractable to conventional analytic methods” (Posner, 2004, 8). It is often difficult to monetize the costs and benefits of both the effects of uncertain risks and of the policies that attempt to address them.

BCA uses prices and pseudo-prices, such as “shadow” and hedonic prices, to capture the value of marginal changes in the supply of goods and services. But scale matters. Prices at the margin cannot be extrapolated without limit: selling one pint of blood for $50 does not mean you would part with two gallons for $800. Further, these methods break down when addressing large irreversible changes such as deliberate species extinction (e.g., see Pugh, 2016; Mannix, 2018); with policies that only make sense on a global scale where uncertainty exists around what authority is making policy and who has standing to influence it; where network effects make systemic failures more likely, such as with financial markets; with long time horizons where determining the appropriate discount rate becomes more complicated; and when probabilities are unknown or unknowable.
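The blood-donation example can be made concrete with a toy calculation of our own; the convex escalation schedule below is an arbitrary assumption, chosen only to show how far a marginal price can understate total willingness-to-accept.

```python
# Toy illustration of why marginal prices cannot be extrapolated: assume the
# compensation a donor demands per pint rises steeply with quantity, so total
# willingness-to-accept (WTA) is convex. The escalation factor is invented.

def willingness_to_accept(pints, base_price=50.0, escalation=1.8):
    """Assumed convex WTA: each successive pint costs `escalation`x more."""
    return sum(base_price * escalation**k for k in range(pints))

linear_guess = 16 * 50.0                 # two gallons = 16 pints "at the margin"
actual = willingness_to_accept(16)       # vastly larger under convex WTA

print(linear_guess, round(actual))
```

Under these (hypothetical) preferences the linear extrapolation of the $50 marginal price misses the true reservation price by several orders of magnitude; the same arithmetic problem afflicts any attempt to scale marginal shadow prices up to planet-sized changes.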

While the underlying BCA principles may be sound, data required to complete an analysis are often inaccessible. We can observe prices that are relevant for valuing changes in mortality risk, but there is no market in which to observe a price for the whole planet, as the prices for small patches of real estate are not really informative on this point. Income effects that could be a minor complication in ordinary BCA may come to dominate the results when evaluating extreme cases. Transfer effects can also get very complicated. For example, to what extent should climate policy prompt large transfers among nations, and how will these concerns affect efforts to negotiate a global policy?

3.2 Inaccurate or uninformed risk perceptions complicate decisions

Kahneman and Tversky (1979) demonstrate that “highly unlikely events are [often] either ignored or overweighted” (p. 283). It is also well established that the public’s perceptions of which risks are most concerning with respect to their likelihoods and impacts differ substantially from risk experts’ estimates (e.g., see Kuran & Sunstein, 1999; Chilton et al., 2006; Viscusi, 2009). Additionally, both groups shift their opinions regarding which risks pose the greatest threats to society over relatively short timeframes. For example, Scouras’s article in this issue illustrates that experts hold widely divergent views about the likelihood of nuclear war. Table 1, which summarizes the results of the World Economic Forum’s Global Risks Report survey in 2008 and 2018, shows that even experts’ perceptions can change significantly over periods as short as 10 years (World Economic Forum, 2018).

Table 1 World Economic Forum: top 5 global risks in terms of impact.

Source: World Economic Forum, 2018.

Although some experts argue for taking extra precaution when dealing with potentially catastrophic or irreversible risks, the truly difficult issue resides in deciding on the appropriate level and form of policy intervention. For example, Sunstein (2006) points out that notions of irreversibility and catastrophe are “exceedingly ambiguous, and [that] it is by no means clear how regulators should understand them.”

As Cox (2019) explains, large-scale real-world decision problems challenge traditional rational-comprehensive decision analysis models because the “reward function” and choice set are unknown or costly to develop. Behavioral research suggests that people – the lay public, political figures, and experts – not only tend to overestimate small and uncertain risks but are also irrationally averse to risk ambiguity. Viscusi et al.’s research in this series reveals that the public over-reacts to very small water-related risks, such as those posed by the herbicide atrazine, prescription drug residues, and BPA. Ambiguity aversion may lead to precautionary choices when, as Viscusi et al. (2019, 9) highlight, “the optimal strategy often involves holding off from expensive or irreversible actions and instead learning about the risk based on experience, and considering adaptive behavior that involves switching to other policies if the outcomes with the uncertain choice are sufficiently unfavorable.”

3.3 Risks exist in analyzing from a single perspective (narrow framing)

Scholars studying risk regulation have long understood that problems result when regulators miss the proverbial “forest for the trees.” For example, 25 years ago, now-Supreme Court Justice Stephen Breyer (1993, 11) observed the propensity for regulators to have “tunnel vision,” referring to the inclination of individuals and agencies to pursue their narrow regulatory objectives while ignoring how their efforts interact within the larger set of societal benefits and costs. Breyer further documented that agencies can lack systematic methods for prioritizing risks and posited that regulators and policymakers tend to be overly attentive to public opinions in deciding how risks should be prioritized (Breyer, 1993, 20). Finally, he noted that regulations crafted with the intention of addressing risks are often responsible for creating additional risks for which the rule makers have not accounted (Breyer, 1993, 22). Accordingly, some research suggests that improvements in the ability to manage risks could be achieved using frameworks that reward learning, experimentation, and interdisciplinary collaboration (Pawson, 2013, 190).

Tunnel vision can be aggravated in the face of uncertain futures because the analysis may easily miss general equilibrium effects or the influence of rational expectations (Sprenger, 2015). Expensive efforts to improve the welfare of future generations may be frustrated, for example, if the public compensates by reducing other forms of long-term savings and intergenerational wealth transfer.

4 Benefits of flexibility and interdisciplinary learning

Any framework for evaluating policy actions to address potential public risks should recognize the existence of trade-offs and the importance of learning. For example, Arrow and Fisher (1974) assert that policymakers need to be attentive to the irreversibility of their decisions when they expend resources or decide prematurely to regulate certain activities. Scholars have documented numerous cases where certain expensive regulations and programs intended to save lives may instead have led to increased fatalities as a result of their unintended consequences (e.g., see Keeney, 1990; Lutter et al., 1999). Moreover, as Arrow and Fisher (1974, 319) demonstrate, the expected benefits of an irreversible decision are likely to be lower than initially projected once that estimate is adjusted to reflect the loss of options that result from making the choice.

As a result, risk management strategies that allow for flexibility in future decision-making are often preferable to undertaking irreversible decisions in the short-term. Consistent with this idea, Linquiti and Vonortas (2012) demonstrate how a “real options” approach can be used to deal with problems such as climate change. To do so, they model decision frameworks for building barriers to protect communities against storm surges and find that a strategy that extends the barrier height as needed in response to new information can offer greater net social benefits than an inflexible strategy employed at the outset of the planning period. This is due to both the flexibility embedded in incremental decision-making, which enables one to respond to new information, as well as the opportunity to defer investment, which lowers net present value costs. They conclude that a successful real options strategy is contingent upon: (1) the ability to learn about real-world conditions; (2) sufficient time to acquire information before the onset of damages; and (3) political, economic, and institutional conditions that permit timely action. In such cases, taking advantage of the learning curve associated with risks and remedies can allow one to reach better policy outcomes.
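The real options logic can be reduced to a stripped-down sketch. This is our own stylized construction, loosely inspired by the barrier example; it is not Linquiti and Vonortas’s model, and every number below is an arbitrary assumption.

```python
import random

# Compare committing to a tall flood barrier today versus building a short
# one now and extending it only if new information reveals high surge risk.
# Costs, probabilities, and the discount factor are invented for illustration.

random.seed(0)
DISCOUNT = 0.9          # one-period discount factor
COST_TALL = 100.0       # build the tall barrier immediately
COST_SHORT = 60.0       # build a short barrier immediately
COST_EXTEND = 70.0      # extend later, paid in the second period
P_HIGH_SURGE = 0.3      # chance that new data reveal high surge risk

def inflexible_cost():
    return COST_TALL

def flexible_cost():
    # Pay for the extension only in states of the world where it is needed.
    extend = random.random() < P_HIGH_SURGE
    return COST_SHORT + (DISCOUNT * COST_EXTEND if extend else 0.0)

n = 100_000
avg_flexible = sum(flexible_cost() for _ in range(n)) / n
print(inflexible_cost(), round(avg_flexible, 1))
```

Under these assumptions the flexible strategy’s expected present-value cost (60 + 0.3 × 0.9 × 70 ≈ 78.9) is well below the inflexible one (100), for exactly the two reasons the paper cites: deferral discounts the second-stage outlay, and it is incurred only when learning shows it to be necessary.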

Similarly, Cox (2019) makes a case for reinforcement learning, through which successive incremental actions, reinforced by feedback at each step, can effectively “solve dynamic optimization problems [even] when not enough is initially known to formulate a clear decision optimization problem.” Cox begins with Lindblom’s (1959) concern that traditional decision analysis is impractical for large-scale, multi-person organizational decision-making over time and explores his alternative method of “muddling through,” using successive limited comparisons rather than a comprehensive optimization approach. Applying insights from machine learning, Cox concludes that the apparently divergent approaches are not necessarily incompatible. Reinforcement learning and multiagent reinforcement learning (RL and MARL) techniques “muddle through” algorithmically in that they make gradual incremental adjustments that can be analyzed and guided by experience. Cox finds that these policy gradient algorithms “converge to the same policies as exact optimization methods traditionally used in operations research and statistical decision theory, such as value iteration and policy iteration algorithms in stochastic dynamic programming.” He draws several lessons for policymaking:

  (i) It is important to collect accurate, relevant feedback data and to use them to improve policies. “After each new action is taken, RL evaluates the reward received and compares it to the reward that was expected so that the difference can be used to correct erroneous expectations and update the current policy. This requires that the effects of actions be evaluated and compared to prior expectations or predictions, and also that policies then be adjusted in light of the data” (Cox, 2019, 23).

  (ii) Experimentation is essential for discovering how to induce desired changes in outcome probabilities. Rather than creating a model of how the world works, and then choosing policies that maximize expected returns according to the model, policymakers must recognize that causal relationships between actions and outcomes are initially highly uncertain. Cox shows how RL and MARL algorithms provide mechanisms for coping with model uncertainty, thus avoiding the need to rely on a single hypothetical causal model.

  3. (iii) “During collective learning, agents should advance slowly when doing better than expected, but retreat quickly when doing worse” (Cox, Reference Cox2019, 29) An application of the “win or lose fast” (WoLF) principle in MARL, while carefully controlling adjustment step sizes, avoids policy churn that can destabilize the learning and improvement process.

  4. (iv) Actors should be separated from reviewers. The program evaluation literature recognizes that third-party evaluators may be more objective reviewers of programs than their initiators (e.g., see Scriven, Reference Scriven1997; Rossi et al., Reference Rossi, Lipsey and Freeman2004). Cox argues that RL offers support for this perspective: “In deep learning RL algorithms, training one network to decide what to do next and a separate one to evaluate how well it is working has been found to prevent overly optimistic assessments of policy performance due to overfitting, i.e., using the same data to both select estimated value-maximizing actions and estimate the values from taking those actions” (Cox, Reference Cox2019, 30).

  5. (v) Incentives should be designed to reward learning and improvement. Cox suggests approaches for rewarding human agents based on insights from machine reinforcement learning algorithms, some of which allow agents share with each other “valuable memories, experiences, and expertise” to improve collective performance (Cox, Reference Cox2019, 33).
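
The first and third lessons can be illustrated with a minimal, self-contained sketch. This is not Cox’s algorithm: the two-policy environment, reward values, and step sizes below are hypothetical. A gradient-style learner updates its policy from reward prediction errors, advancing slowly when outcomes beat expectations and retreating quickly when they fall short.

```python
import math
import random

random.seed(0)

# Two candidate policies with unknown true mean rewards (hypothetical numbers).
TRUE_MEANS = [0.3, 0.7]

prefs = [0.0, 0.0]   # policy preferences (logits)
baseline = 0.0       # running estimate of expected reward
BETA = 0.05          # baseline learning rate
ALPHA_WIN = 0.05     # advance slowly when doing better than expected
ALPHA_LOSE = 0.20    # retreat quickly when doing worse (WoLF-style asymmetry)

def softmax(h):
    exps = [math.exp(x - max(h)) for x in h]
    total = sum(exps)
    return [e / total for e in exps]

for _ in range(2000):
    pi = softmax(prefs)
    # Experiment: sample an action from the current policy.
    a = 0 if random.random() < pi[0] else 1
    reward = TRUE_MEANS[a] + random.gauss(0.0, 0.1)
    # Reward prediction error: compare outcome to expectation (lesson 1).
    delta = reward - baseline
    baseline += BETA * delta
    # Asymmetric step sizes implement the advance-slowly/retreat-quickly rule (lesson 3).
    alpha = ALPHA_WIN if delta > 0 else ALPHA_LOSE
    for b in range(2):
        grad = (1.0 if b == a else 0.0) - pi[b]
        prefs[b] += alpha * delta * grad

pi = softmax(prefs)
# The learner "muddles through" to the better policy without ever
# building an explicit causal model of the environment.
```

After many small, feedback-guided adjustments, the learned policy concentrates on the higher-reward option, which is the sense in which incremental muddling-through can converge to the same answer as exact optimization.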

Given the stakes involved in regulating high impact, uncertain risks, approaches that attempt to preserve flexibility while focusing on learning through experimentation and evaluating ex post outcomes of different interventions can be especially valuable (Dudley, Reference Dudley2013). Quasi-experimental designs or natural experiments, where policies are designed in ways and at scales that generate observations from which causal inferences about different outcomes can be drawn, may be applicable to some challenges as well (Dominici et al., Reference Francesca, Greenstone and Sunstein2014).

Examining policy questions while considering different perspectives and employing the insights of various disciplines is also important. Writing as a program evaluation scholar, Pawson (Reference Pawson2013) argues that different fields operate in similar areas of program theory such as risk management, but the associated researchers typically do not build on prior work from other fields. As a result, scholars and practitioners routinely end up “reinventing the wheel.” Interdisciplinary collaboration can provide “the means of establishing a common language to draw out the similarities between different interventions…to link their evaluations” and speed up learning (Pawson, Reference Pawson2013, 190).

5 Addressing challenges in risk assessment of uncertain futures caused by intelligent adversaries

Uncertain futures created by actors intentionally attempting to harm others (through acts of terrorism, the use of nuclear weapons, etc.) generate additional analytical challenges for policymakers. The complexity introduced by the strategic, responsive, and adaptive behavior of such actors limits the effectiveness of commonly used analytical tools – such as probabilistic risk assessment – for analyzing these risks (e.g., see Oussalah & Newby, Reference Oussalah and Newby2004, 134; Cox, Reference Cox2008).

5.1 Probabilistic vs. strategic risk

Darby (Reference Darby2004) highlights the substantive differences that exist between a probabilistic evaluation of terrorist risk and a similar assessment of other risks such as safety risk.Footnote 3 Primary among these differences is that estimating terrorist risk does not involve “random events associated with ‘dumb’ components; terrorist risk involves intentional acts by malevolent, thinking terrorists.” As a result, the risk depends on many attributes exogenous to the system, including an attacker’s capabilities, ability to adapt, and perception of both a target’s vulnerability and the consequences of a successful attack. Golany et al. (Reference Golany, Kaplan, Marmur and Rothblum2009) categorize the distinction as the difference between modeling probabilistic uncertainty and strategic uncertainty.
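
A stylized allocation example shows why the two kinds of uncertainty call for different defenses. The targets, damage values, and defense effectiveness below are invented for illustration and are not taken from Golany et al.: against probabilistic risk the defender minimizes expected damage under fixed attack probabilities, while against a strategic attacker who observes the defense she minimizes the worst-case damage.

```python
# Two targets with potential damages; defense reduces damage linearly,
# by up to 80% under full protection. All numbers are hypothetical.
DAMAGES = [10.0, 4.0]
EFFECTIVENESS = 0.8

def post_defense(x):
    """Remaining damage at each target, given fraction x of the budget on target 0."""
    return [DAMAGES[0] * (1 - EFFECTIVENESS * x),
            DAMAGES[1] * (1 - EFFECTIVENESS * (1 - x))]

grid = [i / 100 for i in range(101)]

# Probabilistic risk ("nature plays with dice"): minimize expected damage
# under fixed 50/50 attack probabilities.
x_prob = min(grid, key=lambda x: 0.5 * sum(post_defense(x)))

# Strategic risk: the attacker observes the allocation and strikes the
# worst-protected target, so the defender minimizes the maximum damage.
x_strat = min(grid, key=lambda x: max(post_defense(x)))
```

The expected-damage optimum piles the entire budget on the high-value target, whereas the minimax optimum spreads protection so that neither target is left as an attractive weak point: the two models of uncertainty prescribe materially different allocations.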

Considering the conceptual framework widely employed by DHS in its risk analyses of terrorism, where risk is a function of threat (T), vulnerability (V), and consequence (C) (i.e., risk = f(T,V,C)), Darby (Reference Darby2004) notes that successfully estimating risk using this approach requires recognizing the dependence among these variables. While researchers have employed various techniques to improve upon this framework, others maintain that it is ultimately of limited use for informing decision-makers responsible for allocating scarce resources to defend against terrorist attacks. For example, the National Research Council (2010) reviewed DHS’s approach to risk analysis in 2010 and concluded:

[Risk = f(T,V,C)]…is a philosophically suitable framework for breaking risk into its component elements…[but] is not an adequate calculation tool for estimating risk in the terrorism domain, for which independence of threats, vulnerabilities, and consequences does not typically hold and feedbacks exist.Footnote 4
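
A two-scenario toy calculation makes the point about dependence concrete (all numbers are hypothetical): when T, V, and C are positively correlated, as they are when an adaptive attacker concentrates on soft, high-value targets, multiplying their marginal averages understates expected risk.

```python
# Two hypothetical attack scenarios in which threat (T), vulnerability (V),
# and consequence (C) move together rather than varying independently.
weights = [0.5, 0.5]     # scenario probabilities
T = [0.8, 0.2]           # threat (probability of attack)
V = [0.9, 0.1]           # vulnerability (success probability given attack)
C = [100.0, 10.0]        # consequence

# Correct expected risk: average T*V*C over the joint distribution.
joint_risk = sum(w * t * v * c for w, t, v, c in zip(weights, T, V, C))

def mean(xs):
    return sum(w * x for w, x in zip(weights, xs))

# Independence shortcut: multiply the marginal means, as the
# risk = f(T,V,C) framework invites when dependence is ignored.
naive_risk = mean(T) * mean(V) * mean(C)

# joint_risk = 36.1 while naive_risk = 13.75: ignoring the positive
# correlation understates expected risk by a factor of about 2.6.
```

The direction of the error is not incidental: strategic behavior tends to concentrate probability mass on exactly the scenarios where all three components are high at once.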

Examining other uncertain futures often requires overcoming many of the same challenges associated with analyzing terrorism. These include grappling with the aforementioned complexity of assessing strategic risk as well as other challenges including reconciling substantive variation in experts’ risk perceptions, the existence of risk ambiguity, and data limitations. Altogether, these issues point to the need to modify existing frameworks for assessing risks especially related to attacks against critical infrastructure (Depoy et al., Reference Depoy, Phelan, Sholander, Smith, Varnado and Wyss2005). Additionally, given the limited data available to study such attacks, experts identify “qualitative or semi-quantitative approaches … for making interim decisions while identifying data needs” as a promising approach (O’Brien & Cummins, Reference O’Brien and Cummins2011).

5.2 Risk assessment for combined cyber and physical attacks

Another limitation of traditional approaches to conducting risk assessments of terrorist attacks is that threats and consequences of physical and cyber attacks are typically considered in isolation (Depoy et al., Reference Depoy, Phelan, Sholander, Smith, Varnado and Wyss2005). In his paper in this series, Roberts (Reference Roberts2019) proposes a qualitative, relative risk assessment framework, noting that “a more modern way of thinking about security is to think of ‘combined’ attacks that include both a physical and a cyber component.” His study assesses the contextual factors under which terrorists might prefer a combined attack on U.S. sporting venues or the international maritime transportation system.

Roberts (Reference Roberts2019) asserts that terrorists are increasingly using cyber attacks as precursors to physical attacks, particularly when the cyber attack serves to: (i) improve the physical attack’s likelihood of success; (ii) decrease the cost of conducting it; or (iii) increase the consequences of that physical attack. His study employs a qualitative framework using these criteria and informed by expert judgment to establish whether an attacker would likely prefer executing a combined or a traditional physical attack on a particular target. Admittedly, several combinations of probability, cost, and consequence exist where it is unclear how to assess a potential attacker’s preferences. For instance, it is challenging to determine to what extent a terrorist would accept a lower consequence, such as fewer casualties, in exchange for a higher probability of success.
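
The ordinal logic behind such a comparison can be sketched as a simple dominance check. This is an illustrative reconstruction, not Roberts’ actual framework; the function name and its +1/0/−1 encoding are assumptions made for the sketch.

```python
def attacker_preference(delta_success, delta_cost, delta_consequence):
    """Ordinal comparison of a combined cyber-physical attack against a
    purely physical one, from the attacker's perspective.

    Each argument is +1, 0, or -1: whether adding the cyber component
    improves (+1), leaves unchanged (0), or worsens (-1) that criterion
    for the attacker. Note that delta_cost = +1 means the combined
    attack is *cheaper* to conduct.
    """
    signs = (delta_success, delta_cost, delta_consequence)
    if all(s >= 0 for s in signs) and any(s > 0 for s in signs):
        return "combined"        # dominates on all three criteria
    if all(s <= 0 for s in signs):
        return "physical"        # the cyber component adds nothing
    return "indeterminate"       # trade-off; requires expert judgment
```

The “indeterminate” branch is the interesting one: it captures exactly the mixed cases noted above, such as trading a lower consequence for a higher probability of success, where a purely ordinal comparison runs out and expert judgment must take over.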

Roberts (Reference Roberts2019) concludes not only that the traditional risk = f(T,V,C) probabilistic framework fails to account for the risk of combined attacks, but also that data on these types of attacks are so scarce that “quantitative analysis of such gaps is questionable at best.” Overall, Roberts’ approach offers two advantages: it gives decision-makers a framework that is technically feasible to implement, and it permits risk analysis to proceed in the face of strategic uncertainty without relying on quantitative metrics of uncertain validity, especially where little solid data exist.Footnote 5

5.3 Nuclear war as a global catastrophic risk

Scouras (Reference Scouras2019) assesses the risks posed by intelligent adversaries involving the use of nuclear weapons. His study notes that analysis of the likelihood of nuclear war is often based on poorly executed expert elicitations that reveal widely divergent opinions. Furthermore, our knowledge of the consequences of nuclear attacks remains underdeveloped with respect to many important effects, despite over a thousand nuclear tests and billions of dollars expended toward research on nuclear impacts. Like Roberts, Scouras asserts that traditional probabilistic risk assessment is insufficient for informing policy responses, and he offers multiple insights for improving the analysis of uncertain futures. Notably, he cautions against deriving overly simplistic policy prescriptions from computations of expected risk, which may be “inadequate, even dangerous [absent] a more nuanced understanding of the risk of nuclear war” (Scouras, Reference Scouras2019).

His article also highlights several contextual factors that have limited the development of comprehensive evaluations of the consequences of using nuclear weapons. For instance, Scouras notes that, particularly during the Cold War, data collection and analysis focused on “the nuclear damage mechanisms of air blast, cratering, ground shock, and similar phenomena” (Scouras, Reference Scouras2019, 11). Accordingly, this emphasis limited the growth of knowledge about other important effects of nuclear weapons, including damage to the functioning of economies and societies, the effects of electromagnetic pulses accompanying nuclear explosions, and the possibility of global-scale famine via nuclear winter.Footnote 6 According to Scouras, “there are no fundamental barriers to obtaining a better understanding of these important phenomena” (Scouras, Reference Scouras2019, 31). He argues that federal agencies such as the Defense Threat Reduction Agency should be explicitly directed to address these current gaps in our consequence assessments.

Finally, the continued proliferation of nuclear weapons poses additional risks for analysts to consider, including those posed by terrorists and other nonstate actors. Scouras (Reference Scouras2019) posits that it is feasible (and necessary) to improve assessments of these events and argues for research melding ideas from a variety of perspectives and approaches including case studies, experts, risk assessment, and complex systems theory, as “no single analytic approach has proven to be satisfactory.”

6 Policy errors

It is tempting to conclude that the set of problems we have characterized as uncertain futures may be too difficult for the lay public to address, thus requiring experts to design collective action solutions. However, Viscusi (Reference Viscusi2019) raises several caution flags about blindly following this approach.

First, he examines the psychological biases, such as loss aversion, that tend to distort public perceptions of low-probability threats or potentially severe outcomes, however unlikely. Even if experts are less susceptible to these biases, distorted public perceptions can be amplified through the policy development process, which is (as it should be) designed to be responsive to public concerns. Viscusi sees rigorous BCA as an antidote to help prevent resource misallocations that can result from misguided public pressure.

Second, Viscusi considers responses to risk ambiguity, meaning risks for which there is very little information that would allow empirical estimates of probabilities. In the face of such ambiguity, many tend to advocate using the “precautionary principle,” a “no regrets” option, or some other approach that incorporates ambiguity aversion. But such advice can lead to outcomes opposite of those intended, because these approaches foreclose opportunities for learning. Instead, it is important to maintain a balanced and objective assessment of costs and benefits, and to design policies that maintain the flexibility to make course corrections in the future as more information becomes available.

Further, Viscusi analyzes the use of discount rates to evaluate benefits and costs that may occur in the distant future. Seemingly small changes in the discount rate can cause very large changes in the apparent merits of different policy options. Unfortunately, this means that advocates of a particular policy outcome may be tempted to tweak the discount rate to obtain their desired result. Viscusi notes that discounting procedures are based on sound economic reasoning even for intergenerational effects, and as a result, policymakers and other interested parties should be wary of attempts to manipulate them.
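
The sensitivity of long-horizon evaluations to the discount rate is easy to see with a standard present-value calculation. The dollar amounts and rates below are illustrative and are not drawn from Viscusi’s article.

```python
def present_value(benefit, rate, years):
    """Discounted present value of a benefit realized `years` from now."""
    return benefit / (1.0 + rate) ** years

# A $1 billion benefit accruing 100 years from now:
pv_low  = present_value(1e9, 0.03, 100)   # roughly $52 million at 3%
pv_high = present_value(1e9, 0.07, 100)   # roughly $1.15 million at 7%
```

Moving the rate from 3% to 7% shrinks the present value of the same distant benefit by a factor of about 45, which is why the choice of rate can dominate the apparent merits of intergenerational policies and why it invites manipulation.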

7 Policy implications

The characteristics of uncertain futures tend to make them intractable to market solutions because property rights are not clearly defined and essential information is unknown or even unknowable. Of course, these same factors also pose challenges for BCA and other traditional policy decision approaches. Both risk ambiguity and narrow framing elicit calls for precautionary approaches to uncertain futures, but these responses can have unintended consequences as well.

Despite these difficulties, the articles in this series illustrate that analysis can help ensure that policies are designed to support and enhance public well-being. The diverse policy decisions confronting decision-makers today demand analytic frameworks that incorporate uncertainties and trade-offs across policy areas. “Hard cases make bad law” is an old lawyers’ adage. It applies to policy analysis as well. When confronted with analytically intractable uncertain futures, we may need to sharpen our tools to apply and extend long-standing principles of sound decision-making to better address hard, real problems.

The articles in this series are intended as a first step toward dynamic BCA, providing firmer foundations and better developed methods for managing the risks of uncertain futures. Our hope is that this research may result in policies that lower the likelihoods and mitigate the consequences of these uncertain futures while encouraging economic growth, reducing fragility, and increasing resilience.

Acknowledgments:

The articles published in this special issue are part of a project funded by the Searle Freedom Trust. We greatly benefited from discussions with our colleagues in the GW Regulatory Studies Center and participants in sessions at the Society for Benefit-Cost Analysis and the Society for Risk Analysis annual meetings.

Footnotes

1 For example, sustained power outages could be fatal for certain people – particularly vulnerable populations living in extreme cold or hot areas of the country.

2 This is particularly relevant for situations where decision-makers expect to have better data in the future regarding outcomes of different approaches to the risk in question or that reduce the uncertainty surrounding its impact.

3 See p. 2 in Darby (Reference Darby2004), equation (2), where i denotes a specific threat scenario, j a specific facility, and k a specific type of consequence.

4 Cox (Reference Cox2008) elaborates on several additional shortcomings of this model.

5 For example, Cox (Reference Cox2008) proposes the following: “These and other approaches such as game theory models typically do not even attempt to assess threat, vulnerability, and consequence numbers or scores as inputs. Instead they focus on modeling how intelligent attackers can best exploit opportunities to do damage and how defenders can optimize allocation of defensive resources to minimize the damage that attackers can do, assuming that the attackers will take full advantage of remaining weaknesses.”

6 Additionally, Scouras notes that “nonphysical societal effects (e.g., social, psychological, political, and economic effects) are even more difficult to assess and have never been adequately investigated.”

References

Abbas, Ali E. and Howard, Ronald A. 2015. Foundations of Decision Analysis. Harlow, United Kingdom: Pearson Education.
Arrow, Kenneth J. and Fisher, Anthony C. 1974. “Environmental Preservation, Uncertainty, and Irreversibility.” The Quarterly Journal of Economics, 88(2): 312–319.
Breyer, Stephen G. 1993. Breaking the Vicious Circle. The Oliver Wendell Holmes Lectures, Vol. 1992. Cambridge, MA: Harvard University Press.
Budish, Eric, Cramton, Peter, and Shim, John. 2015. “The High-Frequency Trading Arms Race: Frequent Batch Auctions as a Market Design Response.” Quarterly Journal of Economics, 130(4): 1547–1621.
Chilton, Susan, Jones-Lee, Michael, Kiraly, Francis, Metcalf, Hugh, and Pang, Wei. 2006. “Dread Risks.” Journal of Risk and Uncertainty, 33(3): 165–182.
Cox, Louis Anthony Jr. 2008. “Some Limitations of ‘Risk = Threat × Vulnerability × Consequence’ for Risk Analysis of Terrorist Attacks.” Risk Analysis, 28(6): 1749.
Cox, Louis Anthony Jr. 2019. “Muddling-Through and Deep Learning for Managing Large-Scale Uncertain Risks.” Journal of Benefit-Cost Analysis, 10(2). https://doi.org/10.1017/bca.2019.17.
Darby, John L. 2004. Estimating Terrorist Risk with Possibility Theory. Los Alamos, NM: Los Alamos National Laboratory.
Depoy, Jennifer Mae, Phelan, James, Sholander, Peter, Smith, Bryan, Varnado, G. Bruce, and Wyss, Gregory Dane. 2005. “Risk Assessment for Physical and Cyber Attacks on Critical Infrastructures.” United States: IEEE.
Dominici, Francesca, Greenstone, Michael, and Sunstein, Cass R. 2014. “Particulate Matter Matters.” Science, 344(6181): 257–259.
Dudley, Susan E. 2013. “A Retrospective Review of Retrospective Review.” Available at https://regulatorystudies.columbian.gwu.edu/retrospective-review-retrospective-review.
Golany, Boaz, Kaplan, Edward H., Marmur, Abraham, and Rothblum, Uriel G. 2009. “Nature Plays with Dice—Terrorists Do Not: Allocating Resources to Counter Strategic Versus Probabilistic Risks.” European Journal of Operational Research, 192(1): 198–208.
House (Select) Intelligence Committee Hearing. 2014. Washington, DC: Federal Information & News Dispatch, Inc.
Intergovernmental Panel on Climate Change. 2018. Global Warming of 1.5°C: An IPCC Special Report on the Impacts of Global Warming of 1.5°C Above Pre-Industrial Levels and Related Global Greenhouse Gas Emission Pathways, in the Context of Strengthening the Global Response to the Threat of Climate Change, Sustainable Development, and Efforts to Eradicate Poverty. Intergovernmental Panel on Climate Change.
Kahneman, Daniel. 2003. “Maps of Bounded Rationality: Psychology for Behavioral Economics.” The American Economic Review, 93(5): 1449–1475.
Kahneman, Daniel and Tversky, Amos. 1979. “Prospect Theory: An Analysis of Decision Under Risk.” In Handbook of the Fundamentals of Financial Decision Making, Part I, pp. 99–127.
Keeney, Ralph L. 1990. “Mortality Risks Induced by Economic Expenditures.” Risk Analysis, 10(1): 147–159.
Knake, Robert K. 2017. A Cyberattack on the U.S. Power Grid. Council on Foreign Relations.
Kuran, Timur and Sunstein, Cass R. 1999. “Availability Cascades and Risk Regulation.” Stanford Law Review, 51(4): 683–768.
Lindblom, Charles. 1959. “The Science of ‘Muddling Through’.” Public Administration Review, 19(2): 79–88.
Linquiti, Peter and Vonortas, Nicholas. 2012. “The Value of Flexibility in Adapting to Climate Change.” Climate Change Economics, 3(2): 801–833.
Lutter, Randall, Morrall, John F., and Viscusi, W. Kip. 1999. “The Cost-Per-Life-Saved Cutoff for Safety Enhancing Regulations.” Economic Inquiry, 37(4): 599–608.
Mannix, Brian. 2018. “Benefit-Cost Analysis and Emerging Technologies.” The Hastings Center Report, 48(S1): S20.
National Research Council. 2010. Review of the Department of Homeland Security’s Approach to Risk Analysis. Washington, DC: National Academies Press.
O’Brien, Niall Joseph and Cummins, Enda J. 2011. “A Risk Assessment Framework for Assessing Metallic Nanomaterials of Environmental Concern: Aquatic Exposure and Behavior.” Risk Analysis, 31(5): 706–726.
Ostrom, Elinor. 1990. Governing the Commons. 1st ed. Cambridge: Cambridge University Press.
Oussalah, Mourad and Newby, Martin. 2004. “Combination of Qualitative and Quantitative Sources of Knowledge for Risk Assessment in the Framework of Possibility Theory.” International Journal of General Systems, 33(2–3): 133–151.
Pawson, Ray. 2013. The Science of Evaluation: A Realist Manifesto. London: SAGE.
Posner, Richard A. 2004. Catastrophe: Risk and Response. Oxford: Oxford University Press.
Pugh, Jonathan. 2016. “Driven to Extinction? The Ethics of Eradicating Mosquitoes with Gene-Drive Technologies.” Journal of Medical Ethics, 42(9): 578–581.
Raiffa, Howard. 1968. Decision Analysis: Introductory Lectures on Choices Under Uncertainty. 1st ed. Reading, MA: Addison-Wesley.
Roberts, Fred S. 2019. “From Football to Oil Rigs: Risk Assessment for Combined Cyber and Physical Attacks.” Journal of Benefit-Cost Analysis, 10(2). https://doi.org/10.1017/bca.2019.15.
Rossi, Peter H., Lipsey, Mark W., and Freeman, Howard E. 2004. Evaluation, 7th ed. Thousand Oaks, CA: Sage Publications.
Scouras, James. 2019. “Nuclear War as a Global Catastrophic Risk.” Journal of Benefit-Cost Analysis, 10(2). https://doi.org/10.1017/bca.2019.16.
Scriven, Michael. 1997. “Truth and Objectivity in Evaluation.” In Evaluation for the 21st Century: A Handbook, 477. Thousand Oaks: SAGE Publications, Inc.
Sprenger, Charles. 2015. “An Endowment Effect for Risk: Experimental Tests of Stochastic Reference Points.” Journal of Political Economy, 123(6): 1456–1499.
Sunstein, Cass R. 2006. “Irreversible and Catastrophic.” Cornell Law Review, 91(4): 841.
Taleb, Nassim Nicholas. 2007. The Black Swan: The Impact of the Highly Improbable. New York: Random House.
U.S. Department of Energy. 2017. Quadrennial Energy Review. Transforming the Nation’s Electricity System: The Second Installment of the QER. Washington, DC: US DOE.
Viscusi, W. Kip. 2009. “Valuing Risks of Death from Terrorism and Natural Disasters.” Journal of Risk and Uncertainty, 38(3): 191–213.
Viscusi, W. Kip. 2019. “Responsible Precautions for Uncertain Environmental Risks.” Journal of Benefit-Cost Analysis, 10(2). https://doi.org/10.1017/bca.2019.14.
Weitzman, Martin L. 2011. “Fat-Tailed Uncertainty in the Economics of Catastrophic Climate Change.” Review of Environmental Economics and Policy, 5(2): 275–292.
Whipp, Lindsay and Scannell, Kara. 2016. “‘Flash-Crash’ Trader Navinder Sarao Pleads Guilty to Spoofing.” London: The Financial Times Limited.
World Economic Forum. 2018. The Global Risks Report 2018: 13th Edition. World Economic Forum.

Table 1 World Economic Forum: top 5 global risks in terms of impact.