Assessing the risk of nuclear terrorism is a challenging task due to the diversity of actors involved, the variety of pathways to success, the range of defensive measures employed, and the lack of a detailed historical record upon which to base analysis. Numerical models developed to date vary wildly in both approach and ultimate assessment: estimates of the likelihood of a nuclear terrorist attack differ by up to nine orders of magnitude. This article critiques existing efforts from the standpoint of probability theory, and proposes an alternative perspective on the utility of risk assessment in this area. Nuclear terrorism is argued to be a ‘virtual risk’ to which it is not possible to meaningfully ascribe a quantitative measure, making numerical estimates of the likelihood of nuclear terrorism misleading. Instead, we argue that focus should be placed on utilising models to identify areas of disagreement as targets for further research, with greater emphasis on understanding terrorist decision-making and adaptation in response to nuclear security measures.
^{1} Sharon Squassoni, ‘Outcomes from the 2014 Nuclear Security Summit’, Center for Strategic and International Studies Critical Questions (25 March 2014), available at: {http://csis.org/publication/outcomes-2014-nuclear-security-summit} accessed 1 July 2016.
^{2} Nuclear Threat Initiative, ‘NTI Nuclear Security Index’, available at: {http://ntiindex.org/} accessed 10 February 2016.
^{3} Theodore M. Porter, Trust in Numbers: The Pursuit of Objectivity in Science and Public Life (Princeton: Princeton University Press, 1995), ch. 4.
^{4} John Mueller, ‘The Atomic Terrorist: Assessing the Likelihood’, conference paper, Program on International Security (University of Chicago, 2008), p. 14; Graham T. Allison, Nuclear Terrorism: The Ultimate Preventable Catastrophe (London: Macmillan, 2004), p. 15.
^{5} Nate Silver, ‘Crunching the risk numbers’, Wall Street Journal (8 January 2010); Graham T. Allison, ‘Nuclear attack a worst-case reality?’, The Washington Times (23 April 2008); Michael Crowley, ‘Yes, Obama really is worried about a Manhattan nuke’, Time (26 March 2014).
^{6} Stanley Kaplan and B. John Garrick, ‘On the quantitative definition of risk’, Risk Analysis, 1 (1981), pp. 11–27.
^{7} Many potential hazards may fall outside the ambit of our collective knowledge, resulting in so-called Black Swan events, as discussed by Nassim Taleb in his book of the same name. See Nassim Nicholas Taleb, The Black Swan: The Impact of the Highly Improbable (London: Penguin, 2008).
^{8} Peter D. Zimmerman and Jeffrey G. Lewis, ‘The bomb in the backyard’, Foreign Policy (16 October 2009), available at: {http://foreignpolicy.com/2009/10/16/the-bomb-in-the-backyard/} accessed 1 July 2016.
^{9} Charles D. Ferguson and William C. Potter, The Four Faces of Nuclear Terrorism (London: Routledge, 2005), p. 5.
^{10} Mathematically, this formulation presents risk as the ‘expected consequence’ of the hazard.
^{11} Louis Anthony (Tony) Cox, Jr, ‘Some limitations of “Risk = Threat x Vulnerability x Consequence” for risk analysis of terrorist attacks’, Risk Analysis, 28 (2008), pp. 1749–1761.
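The two formulations referenced in notes 10 and 11 can be sketched as follows; the symbols p_i, c_i, T, V, and C are illustrative shorthand of ours, not notation drawn from the works cited:

```latex
% Risk as the 'expected consequence' of a hazard, summing over
% scenarios i with probability p_i and consequence c_i:
\mathrm{Risk} = \sum_{i} p_i \, c_i
% The multiplicative decomposition criticised by Cox, with
% T = threat (probability of attack), V = vulnerability
% (probability the attack succeeds), and C = consequence:
\mathrm{Risk} = T \times V \times C
```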
^{12} While consequence analysis typically focuses on a single impact measure, such as economic damage expressed in dollars lost, there is increasing acknowledgement that impacts are multidimensional and should therefore be addressed in such terms. See, for instance, Bruno S. Frey, Simon Luechinger, and Alois Stutzer, ‘Calculating tragedy: Assessing the costs of terrorism’, Journal of Economic Surveys, 21 (2007), pp. 1–24.
^{13} B. S. Everitt and A. Skrondal, The Cambridge Dictionary of Statistics (4th edn, Cambridge: Cambridge University Press, 2010), p. 174; Alan Hájek, ‘“Mises redux” – redux: Fifteen arguments against finite Frequentism’, Erkenntnis, 45 (1996), pp. 209–227. Note that our designation of Frequentist probability could, more properly, be referred to as finite Frequentist probability. Despite its numerous enticements, this interpretation of probability suffers from a wide range of mathematical and philosophical problems.
^{14} Ibid.
^{15} Alan F. Williams and Veronika I. Shabanova, ‘Responsibility of drivers, by age and gender, for motor-vehicle crash deaths’, Journal of Safety Research, 34 (2003), p. 527.
^{16} Porter, Trust in Numbers, ch. 4.
^{17} In the context of Bayesian probability, the same observation can be applied to the generation of the prior probability distribution, although there are guiding principles that can be applied in this case, such as the Jeffreys prior; see D. V. Lindley, ‘The use of prior probability distributions in statistical inference and decisions’, in Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability (Berkeley: University of California Press, 1961), pp. 453–468.
^{18} Tim Bedford and Roger Cooke, Probabilistic Risk Analysis: Foundations and Methods (Cambridge: Cambridge University Press, 2001), ch. 10.
^{19} Robert L. Winkler and Allan H. Murphy, ‘“Good” probability assessors’, Journal of Applied Meteorology, 7 (1968), p. 751.
^{20} George Apostolakis, ‘The concept of probability in safety assessment of technological systems’, Science, 250 (1990), p. 1359.
^{21} William Talbott, ‘Bayesian Epistemology’, The Stanford Encyclopedia of Philosophy (Winter 2016 edn).
^{22} Robert A. J. Matthews, ‘Fact Versus Fiction: The Use and Abuse of Subjectivity in Scientific Research’, European Science and Environment Forum Working Paper, No. 2 (1998).
^{23} Terje Aven, Ortwin Renn, and Eugene A. Rosa, ‘On the ontological status of the concept of risk’, Safety Science, 49 (2011), p. 1077.
^{24} Michael Goldstein, ‘Subjective Bayesian analysis: Principles and practice’, Bayesian Analysis, 1 (2006), p. 408.
^{25} Daniel Kahneman and Amos Tversky, ‘Subjective probability: a judgement of representativeness’, in C.-A. Staël von Holstein (ed.), The Concept of Probability in Psychological Experiments (Dordrecht: D. Reidel Publishing Company, 1974), p. 44, emphasis added.
^{26} Kaplan and Garrick, ‘On the quantitative definition of risk’, p. 18.
^{27} Ibid.
^{28} Winkler and Murphy, ‘“Good” probability assessors’, p. 752.
^{29} Ibid.
^{30} Bedford and Cooke, Probabilistic Risk Analysis, pp. 191–217.
^{31} An alternative name for the knowledge gap could be epistemic goodness, the extent to which relevant knowledge about the hazard or scenario under consideration is knowable.
^{32} Andrew H. Briggs, Ron Goeree, Gord Blackhouse, and Bernie J. O’Brien, ‘Probabilistic analysis of cost-effectiveness models: Choosing between treatment strategies for gastroesophageal reflux disease’, Medical Decision Making, 22 (2002), p. 291; Naresh A. Dewan, Christopher J. Shehan, Steven D. Reeb, Lisa S. Gobar, Walter J. Scott, and Kay Ryschon, ‘Likelihood of malignancy in a solitary pulmonary nodule: Comparison of Bayesian analysis and results of FDG-PET scan’, Chest, 112 (1997), pp. 416–422.
^{33} Aaron Clauset, Maxwell Young, and Kristian S. Gleditsch, ‘On the frequency of severe terrorist events’, Journal of Conflict Resolution, 51 (2007), pp. 58–87.
^{34} Ibid., p. 64.
^{35} Henry H. Willis, Tom LaTourrette, Terrence K. Kelly, Scot Hickey, and Samuel Neill, Terrorism Risk Modeling for Intelligence Analysis and Infrastructure Protection (RAND, 2007), p. 5.
^{36} Ibid.
^{37} Ibid.
^{38} John Adams, ‘Risk and morality: Three framing devices’, in Richard Victor Ericson and Aaron Doyle (eds), Risk and Morality (University of Toronto Press, 2003), pp. 87–104.
^{39} Ibid., p. 87.
^{40} Mary K. Kindhauser (ed.), Communicable Diseases 2002: Global Defence Against the Infectious Disease Threat (World Health Organization, 2002), available at: {http://apps.who.int/iris/bitstream/10665/42572/1/9241590297.pdf} accessed 8 July 2016.
^{41} In the United Kingdom, the response to mid-nineteenth century cholera epidemics followed precisely this pattern. Popularly attributed to the revolutionary statistical epidemiology of Dr John Snow, who painstakingly identified a contaminated communal water pump as the source of a central London outbreak (famously prompting the removal of its handle), the realisation that cholera spread via the faecal-oral route forced the authorities to act. Legislative changes followed, and regulations governing public health disseminated by the General Board of Health required local authorities to ‘provide dispensaries operating around the clock with sufficient medical aid to treat cholera patients’, among other stipulations. See John Snow, On the Mode of Communication of Cholera (John Churchill, 1855); Donald Cameron and Ian G. Jones, ‘John Snow, the Broad Street pump and modern epidemiology’, International Journal of Epidemiology, 12 (1983), pp. 393–396; S. L. Kotar and J. E. Gessler, Cholera: A Worldwide History (Jefferson, NC: McFarland and Company, 2014), pp. 151–160.
^{42} Walter Enders and Todd Sandler, ‘The effectiveness of antiterrorism policies: a vector-autoregression-intervention analysis’, The American Political Science Review, 87 (1993), pp. 829–844.
^{43} Preventive Security Measures, Convention on International Aviation, Annex 17, ‘Security: Safeguarding International Civil Aviation Against Acts of Unlawful Interference’ (International Civil Aviation Organisation, 2014).
^{44} Mark J. Burchell, ‘W(h)ither the Drake equation?’, International Journal of Astrobiology, 5 (2006), pp. 243–250.
^{45} SETI Institute, ‘The Drake Equation’, available at: {http://www.seti.org/node/434} accessed 2 February 2016.
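For reference, the Drake equation discussed in notes 44–46 is conventionally written as below, following the standard SETI Institute presentation of its terms:

```latex
% N: number of detectable civilisations in the Milky Way
% R*: average rate of star formation
% f_p: fraction of stars with planets
% n_e: planets per system that could support life
% f_l, f_i, f_c: fractions on which life, intelligence, and
%   detectable communication actually develop
% L: average lifetime of the communicative phase
N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L
```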
^{46} Burchell, ‘W(h)ither the Drake equation?’, p. 249.
^{47} Adams, ‘Risk and morality’, p. 92.
^{48} Ibid.
^{49} Ibid.
^{50} There are likely to have been many more unrecorded or unreported cases, with political or other reasons acting as a constraint upon national authorities from sharing such sensitive information. The International Atomic Energy Agency’s Incident and Trafficking Database (ITDB) is a case in point: event reporting is voluntary and therefore the completeness of the database is open to question. While statistical tools can be applied to the ITDB, this must be done in the knowledge that the results are inherently limited for this reason.
^{51} Matthew Bunn, ‘A mathematical model of the risk of nuclear terrorism’, The Annals of the American Academy of Political and Social Science, 607 (2006), pp. 103–120.
^{52} Ibid., p. 113.
^{53} Ferguson and Potter, The Four Faces of Nuclear Terrorism, pp. 33, 194–5.
^{54} Ibid., p. 4.
^{55} Brian Michael Jenkins, ‘The new age of terrorism’, in Brian Michael Jenkins (ed.), McGraw-Hill Homeland Security Handbook (RAND, 2006), ch. 9.
^{56} Benjamin Cole, The Changing Face of Terrorism (London: I. B. Tauris, 2010).
^{57} Ferguson and Potter, The Four Faces of Nuclear Terrorism, ch. 1.
^{58} Rolf Mowatt-Larssen, ‘The Armageddon test: Preventing nuclear terrorism’, Bulletin of the Atomic Scientists, 65 (2009), pp. 60–70.
^{59} Brecht Volders and Tom Sauer (eds), Nuclear Terrorism: Countering the Threat (London: Routledge, 2016); Michael Levi, On Nuclear Terrorism (Cambridge, MA: Harvard University Press, 2009).
^{60} After receiving criticism for ‘alarmist’ views on the issue, Peter Zimmerman famously included a section headed ‘John Mueller: Pollyanna?’ in his paper, ‘Do we really need to worry? Some reflections on the threat of nuclear terrorism’, Defence Against Terrorism Review, 2:2 (2009), pp. 1–14. In Achieving Nuclear Ambitions: Scientists, Politicians, and Proliferation (Cambridge: Cambridge University Press, 2012), Jacques Hymans notes that Allison ‘cites – without irony – an analysis conducted by science fiction writer Tom Clancy’ as part of an argument that terrorists, or even individuals, could produce fissile materials themselves for inclusion in an improvised nuclear device (IND).
^{61} Allison, Nuclear Terrorism, p. 15; Mueller, ‘The Atomic Terrorist’, p. 14.
^{62} William Keller and Mohammad Modarres, ‘A historical overview of probabilistic risk assessment development and its use in the nuclear power industry: a tribute to the late Professor Norman Carl Rasmussen’, Reliability Engineering and System Safety, 89 (2005), pp. 271–285.
^{63} Reactor Safety Study: An Assessment of Accident Risks in U.S. Commercial Nuclear Power Plants (US Nuclear Regulatory Commission, 1975).
^{64} Barry C. Ezell et al., ‘Probabilistic risk analysis and terrorism risk’, Risk Analysis, 30 (2010), pp. 575–589.
^{65} Elisabeth Paté-Cornell, ‘Fault trees vs. event trees in reliability analysis’, Risk Analysis, 4 (1984), pp. 177–186.
^{66} For a particularly accessible introduction to the theory, see ibid.
^{67} Bunn, ‘A mathematical model of the risk of nuclear terrorism’, pp. 104–6; Mueller, ‘The Atomic Terrorist’, p. 14. Levi stands in contrast to this approach, for example, by analysing the set of defensive measures as a whole, that is, as a layered defensive system.
^{68} Bunn, ‘A mathematical model of the risk of nuclear terrorism’, p. 103.
^{69} Ibid.
^{70} Michael Kenney, ‘From Pablo to Osama: Counter-terrorism lessons from the war on drugs’, Survival, 45 (2003), pp. 187–206.
^{71} Enders and Sandler, ‘The effectiveness of antiterrorism policies’.
^{72} For instance, via a joint US-Russia blue ribbon ceremony inaugurating the first US Second Line of Defense border monitoring system in place at Moscow’s Sheremetyevo airport, see Lara Cantuti and Lee Thomas, ‘Second Line of Defense Program’, paper presented at The Institute of Nuclear Materials Management (US Department of Energy, 1999), p. 3.
^{73} David P. Morton, Feng Pan, and Kevin J. Saeger, ‘Models for nuclear smuggling interdiction’, IIE Transactions, 39 (2007), pp. 3–14.
^{74} One notable exception is the study by Morton, Pan, and Saeger where resource allocation is dynamically modelled in the face of a (potentially) unknown number of adversaries.
^{75} US National Research Council Committee on Methodological Improvements to the Department of Homeland Security’s Biological Agent Risk Analysis, ‘Department of Homeland Security Bioterrorism Risk Assessment: A Call for Change’ (National Academies Press, 2008), p. 27.
^{76} Ibid.
^{77} Randy Borum, ‘Understanding terrorist psychology’, in Andrew Silke (ed.), The Psychology of Counter-Terrorism (London: Routledge, 2010).
^{78} Levi, On Nuclear Terrorism, p. 49.
^{79} Ian Reader, Religious Violence in Contemporary Japan: The Case of Aum Shinrikyo (Honolulu: University of Hawaii Press, 2000).
^{80} Colin Flint and Steven M. Radil, ‘Terrorism and counter-terrorism: Situating al-Qaeda and the global war on terror within geopolitical trends and structure’, Eurasian Geography and Economics, 20 (2009), pp. 150–171.
^{81} Richard G. Lugar, ‘The Lugar Survey on Proliferation Threats and Responses’, Office of United States Senator Richard G. Lugar (2005), p. 16.
^{82} Galit Shmueli, ‘To explain or to predict?’, Statistical Science, 25 (2010), pp. 289–310.
^{83} Mary S. Morgan and Margaret Morrison (eds), Models as Mediators: Perspectives on Natural and Social Science (Cambridge: Cambridge University Press, 1999).
^{84} Peter McBurney, ‘What are models for?’, in Massimo Cossentino, Michael Kaisers, Karl Tuyls, and Gerhard Weiss (eds), Multi-Agent Systems (Springer Lecture Notes in Computer Science, 2012), pp. 175–188.
^{85} Bunn, ‘A mathematical model of the risk of nuclear terrorism’, p. 103.
^{86} Burchell, ‘W(h)ither the Drake equation?’, p. 244.
^{87} Elisabeth Paté-Cornell and Seth Guikema, ‘Probabilistic modeling of terrorist threats: a systems analysis approach to setting priorities among countermeasures’, Military Operations Research, 7 (2002), p. 1.
^{88} Burchell, ‘W(h)ither the Drake equation?’, p. 244.
^{89} Bunn, ‘A mathematical model of the risk of nuclear terrorism’, p. 117.
^{90} For instance, Lugar, ‘The Lugar Survey on Proliferation Threats and Responses’.
^{91} McBurney, ‘What are models for?’, p. 184.
^{92} Bunn, ‘A mathematical model of the risk of nuclear terrorism’, p. 108.
^{93} Kaplan and Garrick, ‘On the quantitative definition of risk’.
^{94} Anthony O’Hagan et al., Uncertain Judgements: Eliciting Experts’ Probabilities (Hoboken, NJ: Wiley, 2006), p. 25.
^{95} Ronald A. Howard, Dynamic Probabilistic Systems, Volume II: Semi-Markov and Decision Processes (Mineola, NY: Dover, 2007), p. 965; for instance, William M. Miller, ‘A state-transition model of epidemic foot-and-mouth disease’, New Techniques in Veterinary Epidemiology and Economics, 1 (1976).
^{96} See, for example, Ransom Weaver et al., ‘Modeling and Simulating Terrorist Decision-Making: A “Performance Moderator Function” Approach to Generating Virtual Opponents’, Proceedings of the 10th Conference on Computer Generated Forces and Behavioural Representation (2001).
^{97} US National Research Council, ‘Risk assessment’.
^{98} Martha Lampland, ‘False numbers as formalizing practices’, Social Studies of Science, 40 (2010), pp. 377–404.
* Author’s email: robert.downes@kcl.ac.uk
** Author’s email: christopher.hobbs@kcl.ac.uk