Many trace the earliest articulation of what we may today call the science of complexity to Weaver’s [1] classic essay. In this work, Weaver distinguished between (i) the science of “simplicity,” concerning phenomena that could be understood when reduced to a few variables, such as classical mechanics in two dimensions, (ii) the science of “disorganized complexity,” concerning systems with large numbers of variables analyzed by a process of averaging, such as statistical thermodynamics, and (iii) an emerging field of “organized complexity,” also involving large numbers of variables, that was not amenable to either approach. This third, middle region, Weaver wrote, would form the next significant challenge for science, requiring both the power of machines (computers) and large interdisciplinary scientific teams for progress. Today, the field of complex systems, though lacking a universally accepted definition, studies entities – physical, biological, or social – united by the presence of large numbers of nonlinearly interacting agents that yield collective behavior not directly predictable from the laws governing the interactions of the individual agents [2]. The thesis of complexity therefore stands in direct opposition to the philosophy of reductionism and is the source of an important debate regarding the foundations of science itself [3]. Examples of collective behavior in complex systems include the “emergent” phenomena of macroscopic patterns [4] and phase transitions [5]. These coherent structures occur at scales far removed from those governing the interactions of the individual entities of the system and arise from bifurcation and symmetry breaking [6] involving macroscopic “collective” variables. On the other hand, the large size and nonlinearity of complex systems endow them with a measure of unpredictability – arising from deterministic chaos as well as inherent “fluctuations” – that naturally invokes a probabilistic description. Complex systems are thus said to have an “open” future that generates information and “surprise” as they evolve [7].
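To make the unpredictability of deterministic yet nonlinear dynamics concrete, the short sketch below (a generic illustration, not drawn from Weaver’s essay or from reference [7]) iterates the logistic map from two nearly identical starting points; the trajectories diverge within a few dozen steps, which is the hallmark of deterministic chaos.

```python
# Illustrative sketch only: deterministic chaos in the logistic map
# x_{t+1} = r * x_t * (1 - x_t).  Two trajectories that start almost
# identically diverge quickly, so long-horizon prediction fails even
# though the update rule is fully deterministic.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000)
b = logistic_trajectory(0.400001)   # initial condition perturbed by 1e-6

for t in (0, 10, 30, 50):
    print(f"t={t:2d}  |x_a - x_b| = {abs(a[t] - b[t]):.6f}")
```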
Epilepsy is the most common of the chronic and severe neurological diseases. It affects 65 million people worldwide and is characterized by an augmented susceptibility to seizures. Seizures are a “transient occurrence of signs and/or symptoms due to abnormal excessive or synchronous neuronal activity in the brain” [1]. Current therapeutic strategies aim to suppress or reduce the occurrence of seizures and are thus symptomatic rather than curative. There are no known therapies able to modify the evolution of acquired epilepsy, or to prevent its development. Furthermore, 25–40% of patients do not respond to pharmacological treatment, and this proportion remains unchanged with new-generation antiepileptic drugs compared with established ones. For drug-resistant patients with focal epilepsy (an epilepsy in which seizures start in one hemisphere), there exists an alternative to medication: surgical resection of the brain regions involved in the generation of seizures, the epileptogenic zone, under the constraint of limiting post-surgical neurological impairment. Rates of success of brain surgery for epilepsy treatment vary between 34% and 74%, depending on the type of epilepsy. Outcomes are highly variable, depend on the patient’s condition, and can change over time.
The previous chapters have dealt with the complex adaptive nature of the genome. Similar concepts of interacting elements, self-organization, and adaptation can be applied at other hierarchical scales. In this chapter we show how complex adaptive systems (CAS) concepts can be usefully applied at the level of the action potential firing patterns of single neurons, in terms of both seizure generation and its associated morbidities.
Epilepsy is a family of neurological disorders in which patients experience unprovoked spontaneous seizures. Unfortunately, there is currently no cure for epilepsy, and seizure management is the target of most therapies. The first-line treatment of epilepsy is usually antiepileptic drugs. However, depending on the subtype of epilepsy and the individual, drug treatments fail to control the seizures in around one-third of patients. One challenge in the treatment of epilepsy is its heterogeneity. In each patient, seizures are thought to be generated by different mechanisms, processes, and parameters, and treatment outcomes will also depend on these.
Since the early 2000s, the growing field of computational neuroscience has shown remarkable applicability in the study of epilepsy. A number of different and complementary approaches have been applied to brain signals obtained with scalp and invasive electroencephalography (EEG) to address a variety of fundamental and clinical problems. Historically, researchers have focused on overt changes in brain electrical signals, which can be detected using signal processing techniques. More recent advances have also shown that connectivity and network-level effects can provide critical information that complements the classical brain regional perspective. Thus, the modern toolkit for epilepsy electrophysiology now includes complex systems approaches such as network science (e.g., graph theory), nonlinear signal processing, information theory, and machine learning techniques. These approaches have already contributed to our understanding of epilepsy and to the development of new tools that may improve its diagnosis and treatment.
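As a schematic of the network-science part of this toolkit (a simplified sketch under assumed choices, not code from the chapter: synthetic signals stand in for EEG, and a simple correlation threshold defines edges), one common workflow builds a functional connectivity graph from multichannel recordings and summarizes it with graph-theoretic measures:

```python
# Sketch of a common pipeline (assumptions: synthetic data stands in for
# real EEG; a simple correlation threshold defines edges).
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_channels, n_samples = 16, 2000
eeg = rng.standard_normal((n_channels, n_samples))   # placeholder "EEG"
eeg[1] += 0.7 * eeg[0]                               # inject some coupling
eeg[2] += 0.5 * eeg[0]

corr = np.abs(np.corrcoef(eeg))                      # functional connectivity
np.fill_diagonal(corr, 0.0)
adj = (corr > 0.3).astype(int)                       # threshold -> adjacency

G = nx.from_numpy_array(adj)
print("edges:", G.number_of_edges())
print("mean clustering:", nx.average_clustering(G))
print("degree of channel 0:", G.degree[0])
```

In practice the connectivity estimator, the threshold, and the graph metrics would all be chosen to suit the data and the clinical question.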
The epilepsies are devastating neurological disorders for which progress in developing effective new therapies has slowed over recent decades, primarily due to the complexity of the brain at all scales. This reality has shifted the focus of experimental and clinical practice toward complex systems approaches to overcoming current barriers. Organized by scale, from genes to the whole brain, the chapters of this book survey the theoretical underpinnings and use of network and dynamical systems approaches to interpreting and modeling experimental and clinical data in epilepsy. The emphasis throughout is on the value of the non-trivial, and often counterintuitive, properties of complex systems, and how to leverage these properties to elaborate mechanisms of epilepsy and develop new therapies. In this essential book, readers will learn key concepts of complex systems theory applied across multiple scales and how each of these scales connects to epilepsy.
There is a close relationship between random graphs and percolation. In fact, percolation and random graphs have been viewed as “the same phenomenon expressed in different languages” (Albert and Barabási). Early ideas on percolation (although not under that name) in molecular chemistry can be found in the articles by Flory and Stockmayer.
Percolation can be defined more generally than as a process on a lattice. In this chapter, we motivate the main ideas and theory of percolation on more general graphs by application to polymer gelation and amorphous computing.
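A minimal sketch of bond percolation on an arbitrary graph may help fix ideas (illustrative only; the base graph, retention probabilities, and cluster-size summary are assumptions, not the chapter’s example): each edge is retained independently with probability p, and the size of the largest connected cluster is tracked as p varies.

```python
# Minimal bond-percolation sketch (illustrative only): keep each edge of a
# graph independently with probability p and measure the largest cluster.
import random
import networkx as nx

def bond_percolation(G, p, seed=None):
    rng = random.Random(seed)
    H = nx.Graph()
    H.add_nodes_from(G.nodes())
    H.add_edges_from(e for e in G.edges() if rng.random() < p)
    return H

base = nx.gnp_random_graph(1000, 0.01, seed=1)   # stand-in for a general graph
for p in (0.05, 0.1, 0.2, 0.5):
    H = bond_percolation(base, p, seed=2)
    giant = max(nx.connected_components(H), key=len)
    print(f"p={p:.2f}  largest cluster = {len(giant)} nodes")
```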
In this chapter, we discuss various issues that arise when networks increase in size. What does it mean for a network to increase in size, and how would we visualize that process? Can a sequence of networks, increasing in size, converge to a limit, and what would such a limit look like? We discuss the transformation of an adjacency matrix into a pixel picture and what it means for a sequence of pixel pictures to increase in size. If a limit exists, the resulting function is called a limit graphon, but it is not itself a network. Estimation of a graphon is also discussed; the methods described include approximation by a stochastic block model (SBM) and the network histogram.
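A rough sketch of the sampling-and-estimation idea may help (the graphon w(x, y) = x·y, the network size, and the block sizes are assumptions chosen for illustration, not the chapter’s example): draw latent positions, sample an adjacency matrix, sort the nodes by degree, and average over blocks to obtain a network-histogram-style estimate.

```python
# Illustrative sketch (assumed graphon w(x, y) = x*y): sample a network
# from a graphon, then form a crude "network histogram" estimate by
# sorting nodes by degree and block-averaging the adjacency matrix.
import numpy as np

rng = np.random.default_rng(0)
n, k = 300, 10                       # number of nodes and histogram bins

def w(x, y):                         # assumed graphon
    return x * y

u = rng.uniform(size=n)              # latent node positions
P = w(u[:, None], u[None, :])        # edge probabilities
A = (rng.uniform(size=(n, n)) < P).astype(int)
A = np.triu(A, 1)
A = A + A.T                          # symmetric, no self-loops

order = np.argsort(A.sum(axis=1))    # sort nodes by degree
A = A[np.ix_(order, order)]

blocks = np.array_split(np.arange(n), k)
hist = np.array([[A[np.ix_(bi, bj)].mean() for bj in blocks] for bi in blocks])
print(np.round(hist, 2))             # k x k block-averaged estimate
```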
As we have seen, networks such as the Internet and World Wide Web, social networks (e.g., Facebook and LinkedIn), biological networks (e.g., gene regulatory networks, protein–protein interaction (PPI) networks, networks of the brain), transportation networks, and ecological networks are becoming larger and larger in today’s interconnected world. Some of these networks are truly huge and difficult, if not impossible, to analyze completely and efficiently. In this chapter, we discuss some of the issues involved in comparing networks for similarity or difference, including the choice of similarity measures, exchangeable random structures of networks, and property testing in networks.
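As an illustration of what the choice of similarity measures can mean in practice (a sketch with assumed measures, not necessarily those discussed in the chapter), the snippet below compares two networks via their degree distributions and their adjacency spectra:

```python
# Sketch of two simple (assumed) ways to compare networks: distance between
# degree distributions and distance between adjacency spectra.
import numpy as np
import networkx as nx

def degree_hist(G, bins):
    degrees = [d for _, d in G.degree()]
    h, _ = np.histogram(degrees, bins=bins, density=True)
    return h

def compare(G1, G2, n_bins=20):
    max_deg = max(max(dict(G1.degree()).values()),
                  max(dict(G2.degree()).values()))
    bins = np.linspace(0, max_deg + 1, n_bins + 1)
    deg_dist = np.abs(degree_hist(G1, bins) - degree_hist(G2, bins)).sum()

    s1 = np.sort(np.linalg.eigvalsh(nx.to_numpy_array(G1)))
    s2 = np.sort(np.linalg.eigvalsh(nx.to_numpy_array(G2)))
    m = min(len(s1), len(s2))
    spectral = np.linalg.norm(s1[:m] - s2[:m])
    return deg_dist, spectral

G1 = nx.gnp_random_graph(200, 0.05, seed=1)
G2 = nx.barabasi_albert_graph(200, 5, seed=1)
print(compare(G1, G2))   # (degree-distribution distance, spectral distance)
```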
In this chapter, we introduce a number of parametric statistical models that have been used to model network data. The social network literature has named them the p1, p2, and p* models, the last of which has also been referred to as an ERGM (exponential random graph model).
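To make the exponential-family form concrete (a generic sketch; the choice of statistics and parameter values here is an assumption, not the chapter’s specification), an ERGM assigns each graph G a probability proportional to exp(Σ_k θ_k s_k(G)). With the single statistic “number of edges,” the model reduces to independent edges with probability p = e^θ / (1 + e^θ), i.e., an Erdős–Rényi graph:

```python
# Sketch of the ERGM idea (illustrative; the statistics chosen are assumed):
# P(G) is proportional to exp(theta_edges * edges(G) + theta_tri * triangles(G)).
# With only the edge statistic, each edge is independent with
# p = exp(theta) / (1 + exp(theta)), i.e., an Erdos-Renyi graph.
import math
import networkx as nx

def ergm_weight(G, theta_edges, theta_tri):
    edges = G.number_of_edges()
    triangles = sum(nx.triangles(G).values()) // 3
    return math.exp(theta_edges * edges + theta_tri * triangles)

theta = -2.0
p = math.exp(theta) / (1 + math.exp(theta))      # ~0.12
G = nx.gnp_random_graph(30, p, seed=0)           # a draw from the edge-only ERGM
print("edges:", G.number_of_edges(),
      "unnormalized weight:", ergm_weight(G, theta, 0.0))
```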
When a network is too large to study completely, we sample from that network just as we would sample from any large population. The structure of network data, however, is more complicated than that of standard statistical data. The main question is: how can one sample from a network, which consists of both nodes and edges? Should we sample the nodes, or should we sample the edges? The answers to these questions depend upon the complexity of the network. In this chapter, we examine various methods of sampling a network.
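The contrast between the two basic designs can be sketched directly (an illustrative toy example, with the graph and sample sizes chosen arbitrarily): induced-subgraph sampling keeps only the edges between sampled nodes, whereas edge sampling keeps the endpoints of sampled edges.

```python
# Sketch contrasting two basic sampling designs (illustrative only):
# induced-subgraph sampling of nodes vs. sampling of edges.
import random
import networkx as nx

G = nx.gnp_random_graph(1000, 0.01, seed=0)
rng = random.Random(1)

# Node sampling: pick nodes, keep only edges between sampled nodes.
nodes = rng.sample(list(G.nodes()), 100)
G_nodes = G.subgraph(nodes).copy()

# Edge sampling: pick edges, keep the nodes they touch.
edges = rng.sample(list(G.edges()), 100)
G_edges = nx.Graph(edges)

print("node sample:", G_nodes.number_of_nodes(), "nodes,",
      G_nodes.number_of_edges(), "edges")
print("edge sample:", G_edges.number_of_nodes(), "nodes,",
      G_edges.number_of_edges(), "edges")
```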
Random graphs were introduced by the Hungarian mathematicians Erdős and Rényi, who imposed a probabilistic framework on classical combinatorial graph theory. At the same time, Edgar N. Gilbert also studied the theoretical properties of random graphs.
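The two classical constructions are conventionally written G(n, m), a graph drawn uniformly at random among all graphs on n nodes with exactly m edges (the Erdős–Rényi formulation), and G(n, p), in which each possible edge appears independently with probability p (the Gilbert formulation). A quick sketch (parameter values are arbitrary, chosen only for illustration):

```python
# The two classical random-graph constructions (standard attribution;
# the parameter values here are arbitrary, for illustration only).
import networkx as nx

n = 100
G_nm = nx.gnm_random_graph(n, 250, seed=0)   # Erdos-Renyi G(n, m): exactly m edges
G_np = nx.gnp_random_graph(n, 0.05, seed=0)  # Gilbert G(n, p): each edge w.p. p

print("G(n, m) edges:", G_nm.number_of_edges())   # always 250
print("G(n, p) edges:", G_np.number_of_edges())   # random, about n*(n-1)/2 * p
```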