  • Print publication year: 2014
  • Online publication date: June 2014

Towards common-sense reasoning via conditional simulation: legacies of Turing in Artificial Intelligence

Summary

Abstract. The problem of replicating the flexibility of human common-sense reasoning has captured the imagination of computer scientists since the early days of Alan Turing's foundational work on computation and the philosophy of artificial intelligence. In the intervening years, the idea of cognition as computation has emerged as a fundamental tenet of Artificial Intelligence (AI) and cognitive science. But what kind of computation is cognition?

We describe a computational formalism centered around a probabilistic Turing machine called QUERY, which captures the operation of probabilistic conditioning via conditional simulation. Through several examples and analyses, we demonstrate how the QUERY abstraction can be used to cast common-sense reasoning as probabilistic inference in a statistical model of our observations and the uncertain structure of the world that generated that experience. This formulation is a recent synthesis of several research programs in AI and cognitive science, but it also represents a surprising convergence of several of Turing's pioneering insights in AI, the foundations of computation, and statistics.
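The core idea — conditioning a generative model by repeatedly simulating it and keeping only the runs consistent with the observations — can be illustrated with a minimal rejection-sampling sketch. This is an illustrative toy, not the paper's formalism: the names `query`, `guesser`, and `checker` are our own, standing in for the generative sampler and the accepting predicate that QUERY takes as input.

```python
import random

def query(guesser, checker, max_tries=100_000):
    """Conditional simulation by rejection: rerun the generative
    sampler until the predicate accepts, then return that sample.
    Accepted samples are distributed according to the conditional."""
    for _ in range(max_tries):
        sample = guesser()
        if checker(sample):
            return sample
    raise RuntimeError("no accepted sample within max_tries")

# Toy model: two fair coin flips; condition on "at least one heads".
def guesser():
    return (random.random() < 0.5, random.random() < 0.5)

def checker(s):
    return s[0] or s[1]

# Given at least one heads, P(both heads) = (1/4) / (3/4) = 1/3.
draws = [query(guesser, checker) for _ in range(10_000)]
frac_both = sum(a and b for a, b in draws) / len(draws)
```

With 10,000 accepted draws, `frac_both` concentrates near 1/3, the conditional probability, rather than the unconditional 1/4 — the predicate has reshaped the sampler's output distribution without any explicit probability calculation, which is the sense in which QUERY casts reasoning as conditional simulation.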

§1. Introduction. In his landmark paper Computing Machinery and Intelligence [Tur50], Alan Turing predicted that by the end of the twentieth century, “general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.” Even if Turing has not yet been proven right, the idea of cognition as computation has emerged as a fundamental tenet of Artificial Intelligence (AI) and cognitive science. But what kind of computation—what kind of computer program—is cognition?

[AFR11] N. L., Ackerman, C. E., Freer, and D. M., Roy, Noncomputable conditional distributions, Proceedings of the 26th annual IEEE symposium on Logic in Computer Science (LICS 2011), 2011, pp. 107-116.
[And90] J. R., Anderson, The adaptive character of thought, Erlbaum, Hillsdale, NJ, 1990.
[AB13] J., Avigad and V., Brattka, Computability and analysis: The legacy of Alan Turing, in this volume.
[BJ03] F. R., Bach and M. I., Jordan, Learning graphical models with Mercer kernels, Advances in Neural Information Processing Systems 15 (NIPS 2002) (S., Becker, S., Thrun, and K., Obermayer, editors), The MIT Press, Cambridge, MA, 2003, pp. 1009-1016.
[Bak12] C. L., Baker, Bayesian theory of mind: Modeling human reasoning about beliefs, desires, goals, and social relations, Ph.D. thesis, Massachusetts Institute of Technology, 2012.
[BGT07] C. L., Baker, N. D., Goodman, and J. B., Tenenbaum, Theory-based social goal inference, Proceedings of the 30th annual conference of the Cognitive Science Society, 2007, pp. 1447-1452.
[BST09] C. L., Baker, R., Saxe, and J. B., Tenenbaum, Action understanding as inverse planning, Cognition, vol. 113 (2009), no. 3, pp. 329-349.
[Bar98] A. R., Barron, Information-theoretic characterization of Bayes performance and the choice of priors in parametric and nonparametric problems, Bayesian statistics 6: Proceedings of the sixth Valencia international meeting (J. M., Bernardo, J. O., Berger, A. P., Dawid, and A. F. M., Smith, editors), 1998, pp. 27-52.
[Bel57] R., Bellman, Dynamic programming, Princeton University Press, Princeton, NJ, 1957.
[Bla97] J., Blanck, Domain representability of metric spaces, Annals of Pure and Applied Logic, vol. 83 (1997), no. 3, pp. 225-247.
[Cam11] C. F., Camerer, Behavioral game theory: Experiments in strategic interaction, The Roundtable Series in Behavioral Economics, Princeton University Press, 2011.
[Car09] S., Carey, The origin of concepts, Oxford University Press, New York, 2009.
[CSH08] V., Chandrasekaran, N., Srebro, and P., Harsha, Complexity of inference in graphical models, Proceedings of the twenty-fourth conference on Uncertainty in Artificial Intelligence (UAI 2008) (Corvalis, Oregon), AUAI Press, 2008, pp. 70-78.
[Coo90] G. F., Cooper, The computational complexity of probabilistic inference using Bayesian belief networks, Artificial Intelligence, vol. 42 (1990), no. 2–3, pp. 393-405.
[Cop04] B. J., Copeland (editor), The essential Turing: Seminal writings in computing, logic, philosophy, artificial intelligence, and artificial life: Plus the secrets of enigma, Oxford University Press, Oxford, 2004.
[CP96] B. J., Copeland and D., Proudfoot, On Alan Turing's anticipation of connectionism, Synthese, vol. 108 (1996), no. 3, pp. 361-377.
[DKLR00] P., Dagum, R., Karp, M., Luby, and S., Ross, An optimal algorithm for Monte Carlo estimation, SIAM Journal on Computing, vol. 29 (2000), no. 5, pp. 1484-1496.
[DL93] P., Dagum and M., Luby, Approximating probabilistic inference in Bayesian belief networks is NP-hard, Artificial Intelligence, vol. 60 (1993), no. 1, pp. 141-153.
[dMSS56] K., de Leeuw, E. F., Moore, C. E., Shannon, and N., Shapiro, Computability by probabilistic machines, Automata Studies, Annals of Mathematical Studies, no. 34, Princeton University Press, Princeton, NJ, 1956, pp. 183-212.
[DeG05] M. H., DeGroot, Optimal statistical decisions, Wiley Classics Library, Wiley, 2005.
[DWRT10] F., Doshi-Velez, D., Wingate, N., Roy, and J., Tenenbaum, Nonparametric Bayesian policy priors for reinforcement learning, Advances in Neural Information Processing Systems 23 (NIPS 2010) (J., Lafferty, C. K. I., Williams, J., Shawe-Taylor, R. S., Zemel, and A., Culotta, editors), 2010, pp. 532-540.
[Eda96] A., Edalat, The Scott topology induces the weak topology, 11th annual IEEE symposium on Logic in Computer Science (LICS 1996), IEEE Computer Society Press, Los Alamitos, CA, 1996, pp. 372-381.
[EH98] A., Edalat and R., Heckmann, A computational model for metric spaces, Theoretical Computer Science, vol. 193 (1998), no. 1–2, pp. 53-73.
[FG12] M. C., Frank and N. D., Goodman, Predicting pragmatic reasoning in language games, Science, vol. 336 (2012), no. 6084, p. 998.
[Gac05] P., Gács, Uniform test of algorithmic randomness over a general space, Theoretical Computer Science, vol. 341 (2005), no. 1–3, pp. 91-137.
[GHR10] S., Galatolo, M., Hoyrup, and C., Rojas, Effective symbolic dynamics, random points, statistical behavior, complexity and entropy, Information and Computation, vol. 208 (2010), no. 1, pp. 23-41.
[Gei84] W. S., Geisler, Physical limits of acuity and hyperacuity, Journal of the Optical Society of America A, vol. 1 (1984), no. 7, pp. 775-782.
[GG12] T., Gerstenberg and N. D., Goodman, Ping pong in Church: Productive use of concepts in human probabilistic inference, Proceedings of the thirty-fourth annual conference of the Cognitive Science Society (Austin, TX) (N., Miyake, D., Peebles, and R. P., Cooper, editors), Cognitive Science Society, 2012.
[GGLT12] T., Gerstenberg, N. D., Goodman, D. A., Lagnado, and J. B., Tenenbaum, Noisy Newtons: Unifying process and dependency accounts of causal attribution, Proceedings of the thirty-fourth annual conference of the Cognitive Science Society (Austin, TX) (N., Miyake, D., Peebles, and R. P., Cooper, editors), Cognitive Science Society, 2012.
[GT07] L., Getoor and B., Taskar, Introduction to statistical relational learning, The MIT Press, 2007.
[GG02] D. G., Goldstein and G., Gigerenzer, Models of ecological rationality: The recognition heuristic, Psychological Review, vol. 109 (2002), no. 1, pp. 75-90.
[Goo61] I. J., Good, A causal calculus. I, The British Journal for the Philosophy of Science, vol. 11 (1961), pp. 305-318.
[Goo68] I. J., Good, Corroboration, explanation, evolving probability, simplicity and a sharpened razor, The British Journal for the Philosophy of Science, vol. 19 (1968), no. 2, pp. 123-143.
[Goo75] I. J., Good, Explicativity, corroboration, and the relative odds of hypotheses, Synthese, vol. 30 (1975), no. 1, pp. 39-73.
[Goo79] I. J., Good, A. M. Turing's statistical work in World War II, Biometrika, vol. 66 (1979), no. 2, pp. 393-396, Studies in the history of probability and statistics. XXXVII.
[Goo91] I. J., Good, Weight of evidence and the Bayesian likelihood ratio, The use of statistics in forensic science (C. G. G., Aitken and D. A., Stoney, editors), Ellis Horwood, Chichester, 1991.
[Goo00] I. J., Good, Turing's anticipation of empirical Bayes in connection with the cryptanalysis of the naval Enigma, Journal of Statistical Computation and Simulation, vol. 66 (2000), no. 2, pp. 101-111.
[GBT09] N. D., Goodman, C. L., Baker, and J. B., Tenenbaum, Cause and intent: Social reasoning in causal learning, Proceedings of the 31st annual conference of the Cognitive Science Society, 2009, pp. 2759-2764.
[GMRBT08] N. D., Goodman, V. K., Mansinghka, D. M., Roy, K., Bonawitz, and J. B., Tenenbaum, Church: A language for generative models, Proceedings of the twenty-fourth conference on Uncertainty in Artificial Intelligence (UAI 2008) (Corvalis, Oregon), AUAI Press, 2008, pp. 220-229.
[GS12] N. D., Goodman and A., Stuhlmüller, Knowledge and implicature: Modeling language understanding as social cognition, Proceedings of the thirty-fourth annual Conference of the Cognitive Science Society (Austin, TX) (N., Miyake, D., Peebles, and R. P., Cooper, editors), Cognitive Science Society, 2012.
[GT12] N. D., Goodman and J. B., Tenenbaum, The probabilistic language of thought, in preparation, 2012.
[GTFG08] N. D., Goodman, J. B., Tenenbaum, J., Feldman, and T. L., Griffiths, A rational analysis of rule-based concept learning, Cognitive Science, vol. 32 (2008), no. 1, pp. 108-154.
[GTO11] N. D., Goodman, J. B., Tenenbaum, and T. J., O'Donnell, Probabilistic models of cognition, Church wiki, (2011), http://projects.csail.mit.edu/church/wiki/Probabilistic_Models-of-Cognition.
[GUT11] N. D., Goodman, T. D., Ullman, and J. B., Tenenbaum, Learning a theory of causality, Psychological Review, vol. 118 (2011), no. 1, pp. 110-119.
[Gop12] A., Gopnik, Scientific thinking in young children: Theoretical advances, empirical research, and policy implications, Science, vol. 337 (2012), no. 6102, pp. 1623-1627.
[GKT08] T. L., Griffiths, C., Kemp, and J. B., Tenenbaum, Bayesian models of cognition, Cambridge handbook of computational cognitive modeling, Cambridge University Press, 2008.
[GT05] T. L., Griffiths and J. B., Tenenbaum, Structure and strength in causal induction, Cognitive Psychology, vol. 51 (2005), no. 4, pp. 334-384.
[GT06] T. L., Griffiths and J. B., Tenenbaum, Optimal predictions in everyday cognition, Psychological Science, vol. 17 (2006), no. 9, pp. 767-773.
[GT09] T. L., Griffiths and J. B., Tenenbaum, Theory-based causal induction, Psychological Review, vol. 116 (2009), no. 4, pp. 661-716.
[GSW07] T., Grubba, M., Schröder, and K., Weihrauch, Computable metrization, Mathematical Logic Quarterly, vol. 53 (2007), no. 4–5, pp. 381-395.
[Ham12] J., Hamrick, Physical reasoning in complex scenes is sensitive to mass, Master of Engineering thesis, Massachusetts Institute of Technology, Cambridge, MA, 2012.
[HBT11] J., Hamrick, P. W., Battaglia, and J. B., Tenenbaum, Internal physics models guide probabilistic judgments about object dynamics, Proceedings of the thirty-third annual Conference of the Cognitive Science Society (Austin, TX) (C., Carlson, C., Hölscher, and T., Shipley, editors), Cognitive Science Society, 2011, pp. 1545-1550.
[Hem02] A., Hemmerling, Effective metric spaces and representations of the reals, Theoretical Computer Science, vol. 284 (2002), no. 2, pp. 347-372.
[Hod97] A., Hodges, Turing: A natural philosopher, Phoenix, London, 1997.
[How60] R. A., Howard, Dynamic programming and Markov processes, The MIT Press, Cambridge, MA, 1960.
[KLC98] L. P., Kaelbling, M. L., Littman, and A. R., Cassandra, Planning and acting in partially observable stochastic domains, Artificial Intelligence, vol. 101 (1998), pp. 99-134.
[KLM96] L. P., Kaelbling, M. L., Littman, and A.W., Moore, Reinforcement learning: A survey, Journal of Artificial Intelligence Research, vol. 4 (1996), pp. 237-285.
[Kal02] O., Kallenberg, Foundations of modern probability, 2nd ed., Probability and its Applications, Springer, New York, 2002.
[KGT08] C., Kemp, N. D., Goodman, and J. B., Tenenbaum, Learning and using relational theories, Advances in Neural Information Processing Systems 20 (NIPS 2007), 2008.
[KSBT07] C., Kemp, P., Shafto, A., Berke, and J. B., Tenenbaum, Combining causal and similarity-based reasoning, Advances in Neural Information Processing Systems 19 (NIPS 2006) (B., Schölkopf, J., Platt, and T., Hoffman, editors), The MIT Press, Cambridge, MA, 2007, pp. 681-688.
[KT08] C., Kemp and J. B., Tenenbaum, The discovery of structural form, Proceedings of the National Academy of Sciences, vol. 105 (2008), no. 31, pp. 10687-10692.
[KY03] D., Kersten and A., Yuille, Bayesian models of object perception, Current Opinion in Neurobiology, vol. 13 (2003), no. 2, pp. 150-158.
[LBFL93] R. K., Lindsay, B. G., Buchanan, E. A., Feigenbaum, and J., Lederberg, DENDRAL: A case study of the first expert system for scientific hypothesis formation, Artificial Intelligence, vol. 61 (1993), no. 2, pp. 209-261.
[Luc59] R. D., Luce, Individual choice behavior, John Wiley, New York, 1959.
[Luc77] R. D., Luce, The choice axiom after twenty years, Journal of Mathematical Psychology, vol. 15 (1977), no. 3, pp. 215-233.
[Mac03] D. J. C., MacKay, Information theory, inference, and learning algorithms, Cambridge University Press, Cambridge, UK, 2003.
[MHC03] O., Madani, S., Hanks, and A., Condon, On the undecidability of probabilistic planning and related stochastic optimization problems, Artificial Intelligence, vol. 147 (2003), no. 1–2, pp. 5-34.
[Man09] V. K., Mansinghka, Natively probabilistic computation, Ph.D. thesis, Massachusetts Institute of Technology, 2009.
[Man11] V. K., Mansinghka, Beyond calculation: Probabilistic computing machines and universal stochastic inference, NIPS Philosophy and Machine Learning Workshop, (2011).
[MJT08] V. K., Mansinghka, E., Jonas, and J. B., Tenenbaum, Stochastic digital circuits for probabilistic inference, Technical Report MIT-CSAIL-TR-2008-069, Massachusetts Institute of Technology, 2008.
[MKTG06] V. K., Mansinghka, C., Kemp, J. B., Tenenbaum, and T. L., Griffiths, Structured priors for structure learning, Proceedings of the twenty-second conference on Uncertainty in Artificial Intelligence (UAI 2006) (Arlington, Virginia), AUAI Press, 2006, pp. 324-331.
[MR13] V. K., Mansinghka and D. M., Roy, Stochastic inference machines, in preparation.
[Mar82] D., Marr, Vision, Freeman, San Francisco, 1982.
[McC68] John, McCarthy, Programs with common sense, Semantic information processing, The MIT Press, 1968, pp. 403-418.
[MUSTT12] J., McCoy, T. D., Ullman, A., Stuhlmüller, T., Gerstenberg, and J. B., Tenenbaum, Why blame Bob? Probabilistic generative models, counterfactual reasoning, and blame attribution, Proceedings of the thirty-fourth annual conference of the Cognitive Science Society (Austin, TX) (N., Miyake, D., Peebles, and R. P., Cooper, editors), Cognitive Science Society, 2012.
[MP43] W. S., McCulloch and W., Pitts, A logical calculus of the ideas immanent in nervous activity, Bulletin of Mathematical Biology, vol. 5 (1943), no. 4, pp. 115-133.
[Mon82] George E., Monahan, A survey of partially observable Markov Decision Processes: Theory, models, and algorithms, Management Science, vol. 28 (1982), no. 1, pp. 1-16.
[Mug91] S., Muggleton, Inductive logic programming, New Generation Computing, vol. 8 (1991), no. 4, pp. 295-318.
[OC98] M., Oaksford and N., Chater (editors), Rational models of cognition, Oxford University Press, Oxford, 1998.
[OC07] M., Oaksford and N., Chater (editors), Bayesian rationality: The probabilistic approach to human reasoning, Oxford University Press, New York, 2007.
[PT87] C. H., Papadimitriou and J. N., Tsitsiklis, The complexity of Markov Decision Processes, Mathematics of Operations Research, vol. 12 (1987), no. 3, pp. 441-450.
[Pea88] J., Pearl, Probabilistic reasoning in intelligent systems: Networks of plausible inference, Morgan Kaufmann, San Francisco, 1988.
[Pea04] J., Pearl, Graphical models for probabilistic and causal reasoning, Computer science handbook (A. B., Tucker, editor), CRC Press, 2nd ed., 2004.
[Pfa79] J., Pfanzagl, Conditional distributions as derivatives, The Annals of Probability, vol. 7 (1979), no. 6, pp. 1046-1050.
[Rao88] M. M., Rao, Paradoxes in conditional probability, Journal of Multivariate Analysis, vol. 27 (1988), no. 2, pp. 434-446.
[Rao05] M. M., Rao, Conditional measures and applications, 2nd ed., Pure and Applied Mathematics, vol. 271, Chapman & Hall/CRC, Boca Raton, FL, 2005.
[RH11] S., Rathmanner and M., Hutter, A philosophical treatise of universal induction, Entropy, vol. 13 (2011), no. 6, pp. 1076-1136.
[Roy11] D. M., Roy, Computability, inference and modeling in probabilistic programming, Ph.D. thesis, Massachusetts Institute of Technology, 2011.
[Sch07] M., Schröder, Admissible representations for probability measures, Mathematical Logic Quarterly, vol. 53 (2007), no. 4–5, pp. 431-445.
[Sch12] L., Schulz, The origins of inquiry: Inductive inference and exploration in early childhood, Trends in Cognitive Sciences, vol. 16 (2012), no. 7, pp. 382-389.
[She87] R. N., Shepard, Toward a universal law of generalization for psychological science, Science, vol. 237 (1987), no. 4820, pp. 1317-1323.
[SMH+91] M. A., Shwe, B., Middleton, D. E., Heckerman, M., Henrion, E. J., Horvitz, H. P., Lehmann, and G. F., Cooper, Probabilistic diagnosis using a reformulation of the INTERNIST-1/QMR knowledge base, Methods of Information in Medicine, vol. 30 (1991), pp. 241-255.
[SG92] A. F. M., Smith and A. E., Gelfand, Bayesian statistics without tears: A sampling-resampling perspective, The American Statistician, vol. 46 (1992), no. 2, pp. 84-88.
[Sol64] R. J., Solomonoff, A formal theory of inductive inference: Parts I and II, Information and Control, vol. 7 (1964), no. 1, pp. 1–22 and 224-254.
[SG13] A., Stuhlmüller and N. D., Goodman, Reasoning about reasoning by nested conditioning: Modeling theory of mind with probabilistic programs, submitted.
[SG12] A., Stuhlmüller, A dynamic programming algorithm for inference in recursive probabilistic programs, Second Statistical Relational AI workshop at UAI 2012 (StaRAI-12), (2012).
[TG01] J. B., Tenenbaum and T. L., Griffiths, Generalization, similarity, and Bayesian inference, Behavioral and Brain Sciences, vol. 24 (2001), no. 4, pp. 629-640.
[TGK06] J. B., Tenenbaum, T. L., Griffiths, and C., Kemp, Theory-based Bayesian models of inductive learning and reasoning, Trends in Cognitive Sciences, vol. 10 (2006), no. 7, pp. 309-318.
[TKGG11] J. B., Tenenbaum, C., Kemp, T. L., Griffiths, and N. D., Goodman, How to grow a mind: Statistics, structure, and abstraction, Science, vol. 331 (2011), no. 6022, pp. 1279-1285.
[Teu02] C., Teuscher, Turing's connectionism: An investigation of neural network architectures, Springer-Verlag, London, 2002.
[Tju74] T., Tjur, Conditional probability distributions, Lecture Notes, no. 2, Institute of Mathematical Statistics, University of Copenhagen, Copenhagen, 1974.
[Tju75] T., Tjur, A constructive definition of conditional distributions, Preprint 13, Institute of Mathematical Statistics, University of Copenhagen, Copenhagen, 1975.
[Tju80] T., Tjur, Probability based on Radon measures, Wiley Series in Probability and Mathematical Statistics, John Wiley & Sons Ltd., Chichester, 1980.
[THS06] M., Toussaint, S., Harmeling, and A., Storkey, Probabilistic inference for solving (PO)MDPs, Technical Report EDI-INF-RR-0934, University of Edinburgh, School of Informatics, 2006.
[Tur36] A. M., Turing, On computable numbers, with an application to the Entscheidungsproblem, Proceedings of the London Mathematical Society. Second Series, vol. 42 (1936), no. 1, pp. 230-265.
[Tur39] A. M., Turing, Systems of logic based on ordinals, Proceedings of the London Mathematical Society. Second Series, vol. 45 (1939), no. 1, pp. 161-228.
[Tur48] A. M., Turing, Intelligent machinery, National Physical Laboratory Report, 1948.
[Tur50] A. M., Turing, Computing machinery and intelligence, Mind, vol. 59 (1950), pp. 433-460.
[Tur52] A. M., Turing, The chemical basis of morphogenesis, Philosophical Transactions of the Royal Society of London. Series B, Biological Sciences, vol. 237 (1952), no. 641, pp. 37-72.
[Tur96] A. M., Turing, Intelligent machinery, a heretical theory, Philosophia Mathematica. Philosophy of Mathematics, its Learning, and its Applications. Series III, vol. 4 (1996), no. 3, pp. 256-260, Originally a radio presentation, 1951.
[Tur12] A. M., Turing, The applications of probability to cryptography, c. 1941, UK National Archives, HW25/37, 2012.
[UBM+09] T. D., Ullman, C. L., Baker, O., Macindoe, O., Evans, N. D., Goodman, and J. B., Tenenbaum, Help or hinder: Bayesian models of social goal inference, Advances in Neural Information Processing Systems 22 (NIPS 2009), 2009, pp. 1874-1882.
[Wat89] C. J. C. H., Watkins, Learning from delayed rewards, Ph.D. thesis, King's College, University of Cambridge, 1989.
[WD92] C. J. C. H., Watkins and P., Dayan, Q-Learning, Machine Learning, vol. 8 (1992), pp. 279-292.
[Wei93] K., Weihrauch, Computability on computable metric spaces, Theoretical Computer Science, vol. 113 (1993), no. 2, pp. 191-210.
[Wei99] K., Weihrauch, Computability on the probability measures on the Borel sets of the unit interval, Theoretical Computer Science, vol. 219 (1999), no. 1–2, pp. 421-437.
[Wei00] K., Weihrauch, Computable analysis: An introduction, Texts in Theoretical Computer Science, An EATCS Series, Springer-Verlag, Berlin, 2000.
[WGRKT11] D., Wingate, N. D., Goodman, D. M., Roy, L. P., Kaelbling, and J. B., Tenenbaum, Bayesian policy search with policy priors, Proceedings of the twenty-second International Joint Conference on Artificial Intelligence (IJCAI) (Menlo Park, CA) (T., Walsh, editor), AAAI Press, 2011.
[WGSS11] D., Wingate, N. D., Goodman, A., Stuhlmüller, and J. M., Siskind, Nonstandard interpretations of probabilistic programs for efficient inference, Advances in Neural Information Processing Systems 24 (NIPS 2011), 2011.
[WSG11] D., Wingate, A., Stuhlmüller, and N. D., Goodman, Lightweight implementations of probabilistic programming languages via transformational compilation, Proceedings of the fourteenth international conference on Artificial Intelligence and Statistics (AISTATS), Journal of Machine Learning Research: Workshop and Conference Proceedings, vol. 15, 2011, pp. 770-778.
[Yam99] T., Yamakami, Polynomial time samplable distributions, Journal of Complexity, vol. 15 (1999), no. 4, pp. 557-574.
[Zab95] S. L., Zabell, Alan Turing and the central limit theorem, American Mathematical Monthly, vol. 102 (1995), no. 6, pp. 483-494.
[Zab12] S. L., Zabell, Commentary on Alan M. Turing: The applications of probability to cryptography, Cryptologia, vol. 36 (2012), no. 3, pp. 191-214.