12 - Life's Information Hierarchy
from Part IV - Complexity and Causality

By Jessica Flack, University of Wisconsin–Madison
Edited by Sara Imari Walker, Arizona State University; Paul C. W. Davies, Arizona State University; George F. R. Ellis, University of Cape Town

Book: From Matter to Life
Published online: 02 March 2017
Print publication: 23 February 2017, pp. 283-302
SUMMARY
I propose that biological systems are information hierarchies organized into multiple functional space and time scales. This multi-scale structure results from the collective effects of components estimating, in evolutionary or ecological time, regularities in their environments by coarse-graining or compressing time-series data and using these perceived regularities to tune strategies. As coarse-grained (slow) variables become better predictors for components than microscopic behavior (which fluctuates), and as component estimates of these variables converge, new levels of organization consolidate. This process gives the appearance of downward causation – as components tune to the consolidating level, variance at the component level decreases. Because the formation of new levels results from an interaction between component capacity for regularity extraction, consensus formation, and how structured the environment is, the new levels, and the macroscopic, slow variables describing them, are characterized by intrinsic subjectivity. Hence the process producing these variables is perhaps best viewed as a locally optimized collective computation performed by system components in their search for configurations that reduce environmental uncertainty. If this view is correct, identifying important, functional macroscopic variables in biological systems will require an understanding of biological computation. I will discuss how we can move toward identifying laws in biology by studying the computation inductively. This includes strategy extraction from data, construction of stochastic circuits that map micro to macro, dimension-reduction techniques to move toward an algorithmic theory for the macroscopic output, methods for quantifying circuit collectivity, and macroscopic tuning and control.
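The coarse-graining idea in the summary can be illustrated with a minimal sketch (not the chapter's actual method; all names and parameters here are invented): block-averaging a noisy time series produces a slow variable that tracks the underlying regularity while fluctuating far less than the microscopic signal.

```python
# Illustrative sketch only: coarse-graining a noisy time series by
# block-averaging. The slow (coarse-grained) variable fluctuates less
# around the underlying trend than the raw microscopic signal does.
import random
import statistics

random.seed(0)

def coarse_grain(series, window):
    """Compress a time series into non-overlapping block averages."""
    return [statistics.mean(series[i:i + window])
            for i in range(0, len(series) - window + 1, window)]

# Microscopic behavior: a slow drift (the regularity) plus fast noise.
micro = [0.01 * t + random.gauss(0, 1.0) for t in range(1000)]
slow = coarse_grain(micro, window=50)

# Residuals around the known drift, at each scale.
micro_noise = [x - 0.01 * t for t, x in enumerate(micro)]
slow_noise = [x - 0.01 * (i * 50 + 24.5) for i, x in enumerate(slow)]

# The coarse-grained variable is the better predictor of the regularity:
# its residual spread is much smaller than the microscopic one.
print(statistics.pstdev(micro_noise), statistics.pstdev(slow_noise))
```

Averaging over a window of 50 shrinks the noise standard deviation by roughly a factor of sqrt(50), which is the sense in which slow variables "become better predictors" than fluctuating microscopic behavior.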
INTRODUCTION
A significant challenge facing biology is to determine whether living systems – composed of noisy, adaptive, heterogeneous components with only partly aligned interests – are governed by principles or laws operating on universal quantities that can be derived from microscopic processes, or instead reflect contingent events leading to irreducible complexity (Gell-Mann and Lloyd, 1996; Goldenfeld, 1999; Krakauer and Flack, 2010; Krakauer et al., 2011; Flack et al., 2013). We know the answer to this question for physical systems, and it is useful to recall that understanding in physics was achieved only after extensive debate. This debate began with the observation that certain average quantities – temperature, pressure, entropy, volume, and energy – exist at equilibrium in fundamental relationship to each other, as expressed in the ideal gas law. This observation led to thermodynamics, an equilibrium theory treating aggregate variables.
12 - Probabilistic design principles for robust multi-modal communication networks
from Part III - Artificial neural networks as models of perceptual processing in ecology and evolutionary biology

By David C. Krakauer, Santa Fe Institute; Jessica Flack, Emory University; Nihat Ay, Max Planck Institute for Mathematics in the Sciences
Edited by Colin R. Tosh, University of Leeds; Graeme D. Ruxton, University of Glasgow

Book: Modelling Perception with Artificial Neural Networks
Published online: 05 July 2011
Print publication: 24 June 2010, pp. 255-268
Summary
12.1 Stochastic multi-modal communication
Biological systems are inherently noisy and typically composed of distributed, partially autonomous components. These features require that we understand evolutionary traits in terms of probabilistic design principles rather than traditional deterministic, engineering frameworks. This characterisation is particularly relevant for signalling systems. Signals, whether between cells or individuals, provide essential integrative mechanisms for building complex, collective structures. These signalling mechanisms need to integrate, or average, information from distributed sources in order to generate reliable responses. Thus there are two primary pressures operating on signals: the need to process information from multiple sources, and the need to ensure that this information is not corrupted or effaced. In this chapter we provide an information-theoretic framework for thinking about the probabilistic logic of animal communication in relation to robust, multi-modal signals.
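The robustness pressure described above can be sketched in miniature (a hedged illustration, not the chapter's model; the flip probability and vote rule are invented for this example): a receiver that integrates several independently noisy channels by majority vote responds far more reliably than one relying on any single channel.

```python
# Hedged sketch: integrating redundant noisy channels yields reliable
# responses. A binary signal crosses k independent channels, each flipping
# the bit with probability p_flip; the receiver takes a majority vote.
import random

random.seed(1)

def transmit(bit, p_flip):
    """One noisy channel: flip the bit with probability p_flip."""
    return bit ^ (random.random() < p_flip)

def majority_vote(bit, k, p_flip):
    """Receiver integrates k independent copies of the signal."""
    votes = sum(transmit(bit, p_flip) for _ in range(k))
    return 1 if votes * 2 > k else 0

def error_rate(k, p_flip=0.2, trials=5000):
    """Fraction of trials in which the receiver misreads the signal."""
    return sum(majority_vote(1, k, p_flip) != 1 for _ in range(trials)) / trials

single = error_rate(1)  # one unreliable channel: error near p_flip
pooled = error_rate(5)  # five channels, majority vote: much lower error
print(single, pooled)
```

Analytically, five channels at p = 0.2 fail only when three or more flip, an error probability of about 0.058 versus 0.2 for a single channel, which is the integrative, averaging logic the chapter frames information-theoretically.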
There are many types of signals that have evolved to allow for animal communication. These signals can be classified according to five features: modality (the number of sensory systems involved in signal production), channels (the number of channels involved in each modality), components (the number of communicative units within modalities and channels), context (variation in signal meaning due to social or environmental factors) and combinatoriality (whether modalities, channels, components and/or contextual usage can be rearranged to create different meaning). In this chapter we focus on multi-channel and multi-modal signals, exploring how the capacity for multi-modality could have arisen and whether it is likely to have been dependent on selection for increased information flow or on selection for signalling system robustness.
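The five-feature classification above can be made concrete as a simple record type (a hypothetical sketch; the field names mirror the prose, and the example values are invented, not drawn from the chapter):

```python
# Hypothetical representation of the five signal features described above.
from dataclasses import dataclass

@dataclass
class SignalDescription:
    modalities: int           # sensory systems involved in production
    channels: int             # channels involved in each modality
    components: int           # communicative units within modalities/channels
    context_dependent: bool   # meaning varies with social/environmental factors
    combinatorial: bool       # parts can be rearranged to create new meaning

# Invented example: a display combining vocal and visual elements.
alarm = SignalDescription(modalities=2, channels=3, components=4,
                          context_dependent=True, combinatorial=False)
print(alarm.modalities > 1)  # multi-modal in the chapter's sense
```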