Navigating across sometimes treacherous waters, we demonstrated in the previous chapter the construction and utilisation of a small selection of graph observables for measuring various properties of finite random graphs. Together with the generation of operator representations for arbitrary graph models, we should now possess a sufficiently equipped toolset with which to further explore and characterise, on rigorous algebraic grounds, the plethora of graph models in the applied graph-theoretical literature. However, our adventurous journey would not be complete without touching upon another crucial aspect of many real-world networks: their dynamic nature. In this final chapter, we will explore, with one hopefully light-hearted and playful example, the game of chess, how to formulate such dynamical aspects in our operator graph-theoretical language. As we will witness here, the construction of graphs that describe the possible moves of chess pieces at any position during a game, and the transformations that lead to changes of such positional chess graphs, pose a formidable challenge not only for computational algorithms.
The first part of this book led us on a journey from one of the most cherished fields of applied mathematics, classical graph theory, across the ghastly depths of an inherently dynamic formalisation of physical reality in terms of mappings and operators, to an inspired attempt at a fusion of both of these perspectives. With the backing of a conceptual and notational framework at hand, it is now time to put this attempt to the test, as we continue our adventurous journey with an excursion into the endless realm of applications. This second part of our journey begins in this chapter with an exploration of graph generators and their operator graph-theoretical formulation. In this undertaking, we will focus primarily on the generation of random graphs, as such models enjoy, in one way or another, widespread and prominent employment throughout almost all fields of science and technology. Only the last section will see the exemplary generation of an exact graph model, the finite square grid graph, in preparation for a closer inspection of an intriguing yet unsolved problem at the very heart of condensed matter physics in the next chapter.
This textbook establishes a theoretical framework for understanding deep learning models of practical relevance. With an approach that borrows from theoretical physics, Roberts and Yaida provide clear and pedagogical explanations of how realistic deep neural networks actually work. To make results from the theoretical forefront accessible, the authors eschew the subject's traditional emphasis on intimidating formality without sacrificing accuracy. Straightforward and approachable, this volume balances detailed first-principle derivations of novel results with insight and intuition for theorists and practitioners alike. This self-contained textbook is ideal for students and researchers interested in artificial intelligence with minimal prerequisites of linear algebra, calculus, and informal probability theory, and it can easily fill a semester-long course on deep learning theory. For the first time, the exciting practical advances in modern artificial intelligence capabilities can be matched with a set of effective principles, providing a timeless blueprint for theoretical research in deep learning.
Since the early eighteenth century, the theory of networks and graphs has matured into an indispensable tool for describing countless real-world phenomena. However, the study of large-scale features of a network often requires unrealistic limits, such as taking the network size to infinity or assuming a continuum. These asymptotic and analytic approaches can diverge significantly from real or simulated networks when applied at the finite scales of real-world applications. This book offers an approach to overcoming these limitations by introducing operator graph theory, an exact, non-asymptotic set of tools combining graph theory with operator calculus. The book is intended for mathematicians, physicists, and other scientists interested in discrete finite systems and their graph-theoretical description, and in delineating the abstract algebraic structures that characterise such systems. All the necessary background on graph theory and operator calculus is included for readers to understand the potential applications of operator graph theory.