
Managing Complexity in Socio-Economic Systems*

Published online by Cambridge University Press:  01 May 2009

Dirk Helbing
Affiliation:
ETH Zürich, Prof. Dr. Dirk Helbing, Chair of Sociology, in particular of Modeling and Simulation, UNO D 11, Universitätstrasse 41, 8092 Zürich. E-mail: dhelbing@ethz.ch

Abstract

This contribution summarizes some typical features of complex systems, such as non-linear interactions, chaotic dynamics, the "butterfly effect", phase transitions, self-organized criticality, cascading effects, and power laws. These sometimes imply quite unexpected, counter-intuitive, or even paradoxical behaviors of socio-economic systems. A typical example is the faster-is-slower effect. Due to their tendency toward self-organization, complex systems are often hard to control. Instead of trying to control their behavior, it would often be better to pursue the approach of guided self-organization, i.e. to use the driving forces of the system rather than fight against them. This is illustrated by the example of hierarchical systems, which need to fulfill certain principles in order to be efficient and robust in an ever-changing environment. We also discuss the important role of fluctuations and heterogeneity for the adaptability, flexibility and robustness of complex systems. The presentation is enriched by a number of examples, ranging from decision behavior to production systems and disaster spreading.

Type
Focus: Complexity
Copyright
Copyright © Academia Europaea 2009

1. What is special about complex systems?

Many of us have been raised with the idea of cause and effect, i.e. some stimulus-response theory of the world. In particular, small causes would have small effects and large causes would have large effects. This is, in fact, true for "linear systems", where cause and effect are proportional to each other. Such behavior is often found close to the equilibrium state of a system. However, when complex systems are driven far from equilibrium, non-linearities dominate, and these can cause many kinds of "strange" and counter-intuitive behaviors. In the following, I will mention a few. We have all been surprised by these behaviors many times.

While linear systems have no more than one stationary state or one optimal solution, the situation for non-linear systems is different. They can have multiple stationary solutions or optima (see Figure 1), which has several important implications:

Figure 1 Illustration of linear and non-linear functions. While linear functions have one maximum in a limited area (left), non-linear functions may have many (local) maxima (right)

  • The resulting state is history-dependent. Different initial conditions will not automatically end up in the same state [1]. This is sometimes called "hysteresis".

  • It may be hard to find the best, i.e. the "global" optimum in the potentially very large set of local optima. Many non-linear optimization problems are "NP-hard", i.e. the computational time needed to determine the best state tends to explode with the size of the system [2]. In fact, many optimization problems are combinatorially complex.
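The history dependence and the multitude of local optima can be illustrated with a small, hypothetical hill-climbing sketch (the objective function f and all parameters below are invented for illustration): greedy local search started from different initial conditions ends up in different local maxima.

```python
import math

def f(x):
    # Hypothetical non-linear objective with several local maxima
    return math.sin(3 * x) + 0.5 * math.sin(7 * x) - 0.05 * x**2

def hill_climb(x0, step=0.01, iters=10_000):
    """Greedy local search: move to the better neighbor until nothing improves f."""
    x = x0
    for _ in range(iters):
        best = max((x - step, x + step), key=f)
        if f(best) <= f(x):
            break  # local maximum reached (within step resolution)
        x = best
    return x

# Different initial conditions end up in different local maxima (hysteresis)
starts = [-2.0, 0.0, 2.0]
optima = [hill_climb(x0) for x0 in starts]
print([round(x, 2) for x in optima])
```

A global method would have to compare all such local maxima, which is exactly what makes non-linear optimization so computationally expensive.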

1.1. Chaotic dynamics and butterfly effect

It may also happen that the stationary solutions are unstable, i.e. any small perturbation will drive the system away from the stationary state until it is attracted by another state (a so-called "attractor"). Such attractors may be other stationary solutions, but in many cases they are of oscillatory nature (e.g. "limit cycles"). Chaotically behaving systems [3] are characterized by "strange attractors", which are non-periodic (see Figure 2). Furthermore, the slightest change in the trajectory of a chaotic system ("the beat of a butterfly's wing") will eventually lead to a completely different dynamics. This is often called the "butterfly effect" and makes the behavior of chaotic systems unpredictable beyond a certain time horizon (see Figure 3).
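The butterfly effect can be demonstrated with the logistic map, a standard textbook example of chaos (not discussed in the article itself): two trajectories whose initial conditions differ by only 10^-10 become completely uncorrelated after a few dozen iterations.

```python
# Logistic map x -> r*x*(1-x) in the chaotic regime r = 4
r = 4.0
x, y = 0.3, 0.3 + 1e-10        # two almost identical initial conditions
max_sep = 0.0
for step in range(60):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if step >= 40:             # after the exponential amplification phase
        max_sep = max(max_sep, abs(x - y))
print(max_sep)  # macroscopic separation despite the 1e-10 initial difference
```

Since neighboring trajectories separate exponentially fast, any measurement error, however small, limits the prediction horizon.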

Figure 2 Illustration of trajectories that converge towards (a) a stable stationary point, (b) a limit cycle, and (c) a strange attractor

Figure 3 Illustration of the “butterfly effect”, i.e. the separation of neighboring trajectories in the course of time

1.2. Self-organization, competition, and cooperation

Systems with non-linear interactions do not necessarily behave chaotically. Often, they are characterized by "emergent", i.e. spontaneous, coordination or synchronization [4–6]. Even coordinated states, however, may sometimes be undesired. A typical example is stop-and-go waves in freeway traffic [7], which result from an instability of the traffic flow due to the delayed velocity adjustments of vehicles.

Self-organization is typical in driven many-component systems [7] such as traffic, crowds, organizations, companies, or production plants. Such systems have been successfully modeled as many-particle or multi-agent systems. Depending on the respective system, the components are vehicles, individuals, workers, or products (or their parts). In these systems, the energy input is absorbed by frictional effects. However, the frictional effect is not homogeneous, i.e. it is not the same everywhere. It rather depends on the local interactions among the different components of the system, which leads to spatio-temporal pattern formation.

The example of social insects such as ants, bees, or termites shows that simple interactions can lead to complex structures and impressive functions. This is often called "swarm intelligence" [8]. Swarm intelligence is based on local (i.e. decentralized) interactions and can be used for the self-organization and self-steering of complex systems. Some recent examples are traffic assistance systems [8] or self-organized traffic light control [9, 10]. However, if the interactions are not appropriate, the system may be characterized by unstable dynamics, breakdowns and jamming, or it may be trapped in a local optimum (a "frustrated state").

Many systems are characterized by a competition for scarce resources. The question whether and how a system optimum is reached is then often studied with methods from "game theory" [11–13]. Instead of reaching the state that maximizes the overall success, the system may converge to a user equilibrium, where the success ("payoff") of every system component is the same, but lower than it could be. This happens, for example, in traffic systems, with the consequence of excess travel times [14]. In conclusion, if everybody tries to reach the best outcome for him- or herself, this may lead to overall bad results and social dilemmas [15] (the "tragedy of the commons" [16]). Sometimes, however, the system optimum can only be reached by complicated coordination in space and/or time, e.g. by suitable turn-taking behavior (see Figure 4). We will return to this issue in Section 2.4, when we discuss the "faster-is-slower" effect.
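The gap between user equilibrium and system optimum can be made concrete with Pigou's classic two-route congestion example (a textbook illustration, not taken from the article): route A always takes one unit of time, while route B takes a time equal to the fraction p of drivers using it.

```python
def mean_travel_time(p):
    # Fraction (1 - p) uses route A (time 1); fraction p uses route B (time p)
    return (1 - p) * 1.0 + p * p

# User equilibrium: route B is never slower than A, so everyone takes it (p = 1)
ue = mean_travel_time(1.0)
# System optimum: minimize the mean travel time over p on a fine grid
opt_p = min((k / 1000 for k in range(1001)), key=mean_travel_time)
so = mean_travel_time(opt_p)
print(ue, opt_p, so)  # equilibrium travel time 1.0 vs. optimum 0.75 at p = 0.5
```

The selfish outcome is a third worse than the coordinated one, which is the kind of excess travel time referred to above.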

Figure 4 Emergence of turn-taking behavior: after some time, individuals may learn to improve their average success by choosing both possible options in an alternating and coordinated way (after [4])

1.3. Phase transitions and catastrophe theory

One typical feature of complex systems is their robustness with respect to perturbations, because the system tends to get back to its “natural state”, the attractor. However, as mentioned above, many complex systems can assume different states. For this reason, we may have transitions from one system state (“phase” or attractor) to another one.

These phase transitions occur at so-called "critical points" that are reached by changes of the system parameters (which are often slowly changing variables of the system). When system parameters come close to critical points, small fluctuations may become a dominating influence and determine the future fate of the system. Therefore, one speaks of "critical fluctuations" [1].

In other words, large fluctuations are a sign of a system entering an unstable regime, indicating its potential transition to another system state, which may be hard to anticipate. Another indicator of potential instability is "critical slowing down". However, once the critical point is passed, the system state may change quite rapidly. The relatively abrupt change from one system state to an often completely different one is studied by "catastrophe theory" [17]. One can distinguish a variety of different types of catastrophes, but we cannot go into these details here.
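The "critical slowing down" mentioned above can be sketched with the simplest possible relaxation dynamics, dx/dt = -a*x (a hypothetical toy model, not from the article): the time needed to relax back to the stationary state diverges as the control parameter a approaches its critical value a = 0.

```python
def relaxation_steps(a, x0=1.0, dt=0.01, tol=0.01):
    """Count Euler steps of dx/dt = -a*x until the perturbation has decayed."""
    x, steps = x0, 0
    while abs(x) > tol:
        x += -a * x * dt
        steps += 1
    return steps

steps_needed = [relaxation_steps(a) for a in (1.0, 0.1, 0.01)]
print(steps_needed)  # relaxation time grows roughly like 1/a
```

Monitoring such slowed-down recovery from small perturbations is one practical way to detect that a system is approaching a critical point.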

1.4. Self-organized criticality, power laws, and cascading effects

At the critical point itself, fluctuations are not only dominating; they may even become arbitrarily large. Therefore, one often speaks of "scale-free" behavior, which is typically characterized by power laws [18, 19]. Note that, for power laws, the variance and the expected value (the average) of a variable may be undefined!

One possible implication of power laws is cascading effects. The classical example is a sand pile, where more and more grains are added on top [20]. Eventually, when the critical "angle of repose" is reached, one observes avalanches of sand grains of all possible sizes, and the avalanche size distribution is given by a power law. The angle of repose, by the way, even determines the stability of the famous pyramids in Egypt.
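The claim that mean and variance may be undefined can be checked numerically (an illustrative sketch; the tail exponent alpha = 0.8 is an arbitrary choice): for a power-law distribution with alpha ≤ 1, the running average of the samples never settles down, because ever larger events keep arriving.

```python
import random

random.seed(1)
alpha = 0.8                     # tail exponent <= 1: the mean diverges

def power_law_sample():
    # Inverse-CDF sampling of P(S > s) = s**(-alpha) for s >= 1
    return random.random() ** (-1.0 / alpha)

running_means, total = [], 0.0
for n in range(1, 100_001):
    total += power_law_sample()
    if n % 20_000 == 0:
        running_means.append(total / n)
print(running_means)  # keeps drifting instead of converging
```

For a distribution with a finite mean, this list would approach a constant; here each new extreme event shifts the average upwards again.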

Cascading effects are the underlying reason for many disasters, where the failure of one element of a system causes the failure of another one (see Figure 5). Typical examples of this dynamics are blackouts of electrical power grids and the spreading of epidemics, rumors, bankruptcies or congestion patterns. This spreading often occurs along the links of the underlying causality or interaction networks [21].

Figure 5 Illustration of the interaction network in anthropogenic systems. When the system is seriously challenged, this is likely to cause cascading failures along the arrows of this network (after [21])

"Self-organized criticality" [20, 22] is a particularly interesting phenomenon, where a system is driven towards a critical point. This is not uncommon for economic systems or critical infrastructures: due to the need to minimize costs, safety margins will not be chosen higher than necessary. For example, they will be adjusted to the largest system perturbation that has occurred in the last so-and-so many years. As a consequence, there will be no failures for a long time. But then, controllers start to argue that one could save money by reducing the standards. Eventually, the safety margins will be low enough to be exceeded by some perturbation, which may finally trigger a disaster.

Waves of bankruptcies [23, 24] are not much different. The competition for customers forces companies to make better and better offers, until profits have reached a critical value and some companies die. This reduces the competitive pressure among the remaining companies and increases the profits again. As a consequence, new competitors will enter the market, which eventually drives the system back to the critical point.

2. Some common mistakes in the management of complex systems

The particular features of complex systems have important implications for organizations, companies, and societies, which are complex multi-component systems themselves. Their counter-intuitive behaviors result from often very complicated feedback loops in the system, which cause many management mistakes and undesired side effects. Such effects are particularly well-known from failing political attempts to improve the social or economic conditions.

2.1. The system does not do what you want it to do

One of the consequences of the non-linear interactions between the components of a complex system is that the internal interactions often dominate the external control attempts (or boundary conditions). This is particularly obvious for group dynamics [25, 26].

It is quite typical for complex systems that, many times, large efforts have no significant effect, while sometimes the slightest change (even a "wrong word") has a "revolutionary" impact. It all depends on whether the system is close to a critical state (which leads to the latter situation) or not (in which case many efforts to change the system will be in vain). In fact, complex systems often counteract the action. In chemical systems, this is known as Le Chatelier's principle.*

Regarding such predictions, classical time series analysis will normally provide bad forecasts. The difficulty of anticipating election results from opinion polls when the mood in the population is changing is well known. In many cases, the expectations of a large number of individuals, as expressed by the stock prices at real or virtual stock markets, are more indicative than the results of classical extrapolation. Therefore, auction-based mechanisms have been proposed as a new prediction tool. Recently, there are even techniques to forecast the future with small groups [27]. This, however, requires correcting for individual biases by fitting certain personality parameters, which reflect, for example, the degree of risk aversion.

2.2. Guided self-organization is better than control

The previous section questions the classical control approach, which is, for example, used to control machines. But it is also frequently applied to businesses and societies, when decision-makers attempt to regulate all details by legislation, administrative procedures, project definitions, etc. These procedures are very complicated and time-consuming, sensitive to gaps, prone to failures, and they often go along with unanticipated side effects and costs. A complex system cannot be controlled like a bus: steering it somewhere may drive it to some unexpected state.

Biological systems are designed very differently. They do not specify all procedures in detail; otherwise, cells would be much too small to contain all construction plans in their genetic code, and the brain would be too small to perform its incredible tasks. Rather than trying to control all details of the system behavior, biology makes use of the self-organization of complex systems instead of "fighting" it. It guides self-organization, while forceful control would destroy it [28].

Detailed control would require a large amount of energy, and would need further resources to put and keep the components of an artificial system together. In other words, overriding the self-organization in the system is costly and inefficient. Instead, one could use self-organization principles as part of the management plan. But this requires a better understanding of the natural behavior of complex systems such as companies and societies.

2.3. Self-organized networks and hierarchies

Hierarchies are a classical way to control systems. However, strict hierarchies are only optimal under certain conditions.

In particular, they require a high reliability of the nodes (the staff members) and the links (their channels of exchange).

Experimental results on the problem-solving performance of groups [29] show that small groups can find solutions to difficult problems faster than any of their constituting individuals, because groups profit from complementary knowledge and ideas. The actual performance, however, sensitively depends on the organization of information flows, i.e. on who can communicate with whom. If communication is unidirectional, for example, this can reduce performance. However, it may also be inefficient if everybody can talk to everyone else. This is because the number of potential (bidirectional) communication links grows like N(N − 1)/2, where N denotes the number of group members. The number of communicative or group-dynamical constellations even grows as (3^N − 2^(N+1) + 1)/2.
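The two growth formulas quoted above can be evaluated directly (a small sketch; the group sizes chosen are arbitrary):

```python
# Pairwise links N(N-1)/2 vs. group-dynamical constellations (3**N - 2**(N+1) + 1)/2
growth = {}
for n in (3, 5, 10, 20):
    links = n * (n - 1) // 2
    constellations = (3**n - 2**(n + 1) + 1) // 2
    growth[n] = (links, constellations)
    print(n, links, constellations)
```

Already for N = 20 there are 190 potential links but more than a billion constellations, which illustrates why large committees are slow.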

Consequently, the number of possible information flows explodes with the group size, which may easily overwhelm the communication and information-processing capacity of individuals. This explains the slow speed of group decision-making, i.e. the inefficiency of large committees. It is also responsible for the fact that, after some transient time, (communication) activities in large (discussion) groups often concentrate on a few members only, due to a self-organized information bundling and differentiation (role formation) process. A similar effect is even observed in insect societies such as bee hives: when a critical colony size is exceeded, a few members develop hyperactivity, while most colony members become lazy [30].

This illustrates the tendency of bundling and compressing information flows, which is most pronounced in strict hierarchies. But the performance of strictly hierarchical organizations (see Figure 6) is vulnerable for the following reasons:

Figure 6 Illustration of different kinds of hierarchical organization. As there are no alternative communication links, strict hierarchies are vulnerable to the failure of nodes or links (after [31])

  • Hierarchical organizations are not robust with respect to the failure of nodes (due to illness of staff members, holidays, or quitting the job) or links (due to difficult personal relationships).

  • They often do not connect interrelated activities in different departments well.

  • Important information may get lost due to the filtering of information implied by the bundling process.

  • Important information may arrive late, as it takes time to be communicated over various hierarchical levels.

Therefore, hierarchical networks with short-cuts are expected to be superior to strictly hierarchical networks [31–33]. They can profit from alternative information paths and "small-world" effects [34].

Note that the spontaneous formation of hierarchical structures is not untypical in social systems: Individuals form groups, which form companies, organizations, and parties, which make up a society or nation. A similar situation can be found in biology, where organelles form cells, cells form organs, and organs form bodies. Another example is well-known from physics, where elementary particles form nuclei, which combine to atoms with electrons. The atoms form chemical molecules, which organize themselves as solids. These make up cellular bodies, which form solar systems, which again establish galaxies.

Obviously, the non-linear interactions between the different elements of the system give rise to a formation of different levels, which are hierarchically ordered one below another. While changes on the lowest hierarchical level are fastest, changes on the highest level are slow.

On the lowest level, we find the strongest interactions among the elements. This is obviously the reason for the fast changes on the lowest hierarchical level. If the interactions are attractive, bonds will arise. These cause the elements to no longer behave completely individually, but to form units representing the elements of the next level. Since the attractive interactions are more or less "saturated" by the bonds, the interactions within these units are stronger than the interactions between them. The relatively weak residual interactions between the formed units induce their relatively slow dynamics [35].

In summary, a general interdependence between the interaction strength, the changing rate, and the formation of hierarchical levels can be found, and the existence of different hierarchical levels implies a “separation of time scales”.

The management of organizations, production processes, companies, and political changes seems to be quite different today: the highest hierarchy levels appear to exert a strong influence on the system on a relatively short time scale. This does not only require a large amount of resources (administrative overhead). It also makes it difficult for the lower, less central levels of organization to adjust themselves to a changing environment. This complicates large-scale coordination in the system and makes it more costly. Strong interference in the system may even destroy its self-organization instead of using its potentials. Therefore, the re-structuring of companies can easily fail, particularly if it is applied too often. A good example is given in Ref. [36].

Governments would be well advised to focus their activities on coordination functions, and on adaptations that are relevant for long time scales, i.e. applicable for 100 years or so. Otherwise, individuals will not be able to adjust to the boundary conditions set by the government. If the government tries to adjust to the population and the people try to adjust to the socio-economic conditions on the same time scale of months or years, the control attempts are expected to cause potentially chaotic dynamics and a failure of control.

In any case, detailed regulations hardly ever achieve more fairness. Rather, they reduce flexibility and make the required processes inefficient, slow, complicated, and expensive. As a consequence, many people will not be able to exercise their rights without external help, while a specialized minority will be able to profit from the regulations or exploit them.

2.4. Faster is often slower

Another common mistake is to push team members to their limits and to have machines run at maximum speed. In many cases, this will not maximize productivity and throughput, but rather frustration. Most systems require some spare capacity to run smoothly. This is well illustrated by queuing systems: if the arrival rate approaches the service rate, the average waiting time grows enormously. The same applies to the variation of the waiting time. Jamming and full buffers will be an unfavorable, but likely, side effect. And there will be little reserve in case of additional demand.
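The standard M/M/1 queueing formula makes this spare-capacity argument quantitative (a textbook result, not derived in the article): the mean waiting time W = rho/(mu - lambda) diverges as the utilization rho = lambda/mu approaches 1.

```python
mu = 1.0                        # service rate (jobs per unit time)
waits = []
for rho in (0.5, 0.9, 0.99):
    lam = rho * mu              # arrival rate
    w = rho / (mu - lam)        # mean M/M/1 queueing delay
    waits.append(w)
    print(f"utilization {rho:.2f}: mean wait = {w:.1f} service times")
```

Going from 90% to 99% utilization multiplies the average wait by about eleven, which is why running a system at its nominal capacity limit is rarely efficient.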

The situation becomes even more difficult through dynamic interaction effects when a system is driven to its limits. In traffic systems, for example, this leads to a "capacity drop". Such a capacity drop often occurs unexpectedly and is a sign of inefficiencies due to dynamical friction or obstruction effects. It results from increasing coordination problems when sufficient space or time is lacking. The consequence is often a "faster-is-slower effect" (see Figure 7). This effect has been observed in many traffic, production, and logistic systems. Consequently, it is often not good if everybody is doing his or her best. It is more important to adjust to the other activities and processes in order to reach a harmonious and well-coordinated overall dynamics. Otherwise, more and more conflicts, inefficiencies and mistakes will ruin the overall performance.

Figure 7 Top: Schematic representation of the successive processes of a wet bench, i.e. a particular supply chain in semiconductor production. Middle: The Gantt diagrams illustrate the treatment times of the first four of several more processes, where we have used the same colors for processes belonging to the same run, i.e. the same set of wafers. The left diagram shows the original schedule, while the right one shows an optimized schedule based on the "slower-is-faster effect". Bottom: The increase in the throughput of a wet bench by switching from the original production schedule to the optimized one was found to be 33%, in some cases even higher (after [37])

2.5. The role of fluctuations and heterogeneity

Let us finally discuss the role of fluctuations and heterogeneity. Fluctuations are often considered unfavorable, as they are thought to produce disorder. They can also trigger instabilities and breakdowns, as is known from traffic flows. But in some systems, fluctuations can also have positive effects.

While a large fluctuation strength, in fact, tends to destroy order, medium fluctuation levels may even cause noise-induced ordering (see Figure 8). An eventual increase in the degree of order in the system is particularly expected if the system tends to be trapped in local minima ("frustrated states"). Only by means of fluctuations is it possible to escape these traps and eventually find better solutions.
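The escape from "frustrated states" by fluctuations can be sketched with a Metropolis-type random search on a hypothetical double-well potential (all parameters below are invented for illustration): a purely greedy search started in the shallow well never reaches the deeper one, while a moderate noise level lets the search cross the barrier.

```python
import math
import random

def V(x):
    # Double-well potential: shallow minimum near x = -1, deep minimum near x = +1
    return (x**2 - 1)**2 - 0.3 * x

def search(noise, x0=-1.0, steps=20_000):
    """Return the fraction of time spent in the deep well (x > 0.5)."""
    random.seed(0)
    x, time_in_deep_well = x0, 0
    for _ in range(steps):
        cand = x + random.gauss(0.0, 0.1)
        dv = V(cand) - V(x)
        # Metropolis rule: accept improvements; accept worse states
        # with probability exp(-dv / noise)
        if dv < 0 or (noise > 0 and random.random() < math.exp(-dv / noise)):
            x = cand
        if x > 0.5:
            time_in_deep_well += 1
    return time_in_deep_well / steps

greedy_frac = search(noise=0.0)
noisy_frac = search(noise=0.2)
print(greedy_frac, noisy_frac)  # greedy search stays trapped; the noisy one escapes
```

Too much noise, on the other hand, would wash out both wells, which is the trade-off behind the "medium fluctuation levels" mentioned above.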

Figure 8 Illustration of frequency distributions of behaviors in space (after [39]). Left: Separation of oppositely moving pedestrians perpendicularly to their walking direction for a low fluctuation strength. Right: Noise-induced ordering for medium fluctuation levels leads to a clear separation into two spatial areas. This reduces frictional effects and increases the efficiency of motion

Fluctuations are also needed to develop different behavioral roles under initially identical conditions. This eventually leads to a differentiation and specialization (heterogeneity), which often helps to reach a better group performance [40] (see Figure 9).

Figure 9 Typical individual decision changes of 9 test persons in a route choice experiment with two alternative routes. Note that we find similar or opposite behaviors after some time. The test persons develop a few kinds of complementary strategies ("roles") in favor of a good group performance (after [40])

Furthermore, the speed of evolution also profits from variety and fluctuations ("mutations"). Uniformity, i.e. everybody behaving and thinking the same, will lead to a poor adaptation to changing environmental or market conditions. In contrast, a large variety of different approaches (i.e. a heterogeneous population) will imply a large innovation rate. The innovation rate is actually expected to be proportional to the variance of the individual solutions. Therefore, strong norms, "monocultures", and the application of identical strategies all over the world due to the trend towards globalization imply dangers.

This trend is reinforced by "herding effects" [7]. Whenever the future is hard to predict, people tend to orient themselves by the behavior of others. This may easily lead to wrong collective decisions, even among highly intelligent people. This danger can only be reduced by supporting and maintaining a plurality of opinions and solutions.

3. Summary and outlook

In this contribution, I have given a short overview of some properties and particularities of complex systems. Many of their behaviors may occur unexpectedly (due to “catastrophes” or phase transitions), and they are often counter-intuitive, e.g. due to feedback loops and side effects. Therefore, the response of complex systems to control attempts can be very different from the intended or predicted one.

Complex behavior in space and time is found in many multi-component systems with non-linear interactions. Typical examples are companies, organizations, administrations, or societies. This has serious implications regarding suitable control approaches. In fact, most control attempts are destined to fail. It would, however, be the wrong conclusion that one just has to apply more force to get control over the system. This would destroy the self-organization on which social systems are based.

It would be better to obtain an understanding of the natural tendencies and behaviors at work, and to make use of them. A management that supports and guides the natural self-organization in the system will perform much more efficiently than an artificially constructed system that requires continuous forcing. Companies and countries that manage to successfully apply the principle of self-organization will be the future winners of the on-going global competition.

In conclusion, we are currently facing a paradigm shift in the management of complex systems, and investments in complexity research will provide a competitive advantage.

Acknowledgment

This contribution was first published as the Introduction of the book Managing Complexity, edited by D. Helbing (Springer, Berlin, 2008), and is republished by courtesy of the author.

Dirk Helbing (born on January 19, 1965) has been Professor of Sociology, in particular of Modelling and Simulation, at ETH Zurich since June 2007. Before that, he worked as Managing Director of the Institute for Transport & Economics at Dresden University of Technology, where he was appointed full professor in 2000. Having studied Physics and Mathematics in Göttingen, his master's thesis dealt with the nonlinear modelling and multi-agent simulation of observed self-organization phenomena in pedestrian crowds. Two years later, he finished his PhD at Stuttgart University on modelling social interaction processes by means of game-theoretical approaches, stochastic methods and complex systems theory, which was awarded two research prizes. After having completed his habilitation on traffic dynamics and optimization in 1996, he received a Heisenberg scholarship. Both theses were printed by international publishers. Apart from this, Helbing has (co-)organized several international conferences and (co-)edited proceedings or special issues on material flows in networks and cooperative dynamics in socio-economic and traffic systems. He has given 250 talks and published more than 200 papers, including several contributions to journals such as Nature, Science or PNAS. Dirk Helbing is also an elected member of the German Academy of Sciences "Leopoldina".

Footnotes

* Specifically, Le Chatelier’s principle says “if a chemical system at equilibrium experiences a change in concentration, temperature, or total pressure, the equilibrium will shift in order to minimize that change.”

References

1. Haken, H. (1977) Synergetics (Berlin, Heidelberg, New York: Springer).
2. Ausiello, G., Crescenzi, P., Gambosi, G., Kann, V., Marchetti-Spaccamela, A. and Protasi, M. (1999) Complexity and Approximation: Combinatorial Optimization Problems and Their Approximability Properties (Berlin, Heidelberg, New York: Springer).
3. Strogatz, S. H. (2001) Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry and Engineering (New York: Perseus).
4. Kuramoto, Y. (1984) Chemical Oscillations, Waves, and Turbulence (Berlin, Heidelberg, New York: Springer).
5. Pikovsky, A., Rosenblum, M. and Kurths, J. (2003) Synchronization: A Universal Concept in Nonlinear Sciences (Cambridge: Cambridge University Press).
6. Manrubia, S. C., Mikhailov, A. S. and Zanette, D. H. (2004) Emergence of Dynamical Order: Synchronization Phenomena in Complex Systems (Singapore: World Scientific).
7. Helbing, D. (2001) Traffic and related self-driven many-particle systems. Reviews of Modern Physics, 73, 1067–1141.
8. Bonabeau, E., Dorigo, M. and Theraulaz, G. (1999) Swarm Intelligence: From Natural to Artificial Systems (Santa Fe Institute Studies in the Sciences of Complexity).
9. Kesting, A., Schönhof, M., Lämmer, S., Treiber, M. and Helbing, D. (2008) Decentralized approaches to adaptive traffic control. In: D. Helbing (ed.) Managing Complexity: Insights, Concepts, Applications (Berlin: Springer), pp. 179–200.
10. Helbing, D. and Lämmer, S. (2005) Verfahren zur Koordination konkurrierender Prozesse oder zur Steuerung des Transports von mobilen Einheiten innerhalb eines Netzwerkes [Method to Coordinate Competing Processes or to Control the Transport of Mobile Units within a Network]. Pending patent DE 10 2005 023742.8.
11. Axelrod, R. (1985) The Evolution of Cooperation (New York: Basic Books).
12. von Neumann, J., Morgenstern, O., Rubinstein, A. and Kuhn, H. W. (2004) Theory of Games and Economic Behavior (Princeton: Princeton University Press).
13. Schelling, T. C. (2006) The Strategy of Conflict (Cambridge, MA: Harvard University Press).
14. Helbing, D., Schönhof, M., Stark, H.-U. and Holyst, J. A. (2005) How individuals learn to take turns: Emergence of alternating cooperation in a congestion game and the prisoner’s dilemma. Advances in Complex Systems, 8, 87–116.
15. Glance, N. S. and Huberman, B. A. (1994) The dynamics of social dilemmas. Scientific American, 270(3), 76–81.
16. Hardin, G. (1968) The tragedy of the commons. Science, 162, 1243–1248.
17. Zeeman, E. C. (1977) Catastrophe Theory (London: Addison-Wesley).
18. Stanley, H. E. (1971) Introduction to Phase Transitions and Critical Phenomena (Oxford: Oxford University Press).
19. Schroeder, M. (1992) Fractals, Chaos, Power Laws: Minutes from an Infinite Paradise (New York: W. H. Freeman).
20. Bak, P., Tang, C. and Wiesenfeld, K. (1987) Self-organized criticality: An explanation of 1/f noise. Physical Review Letters, 59, 381–384.
21. Helbing, D., Ammoser, H. and Kühnert, C. (2005) Disasters as extreme events and the importance of network interactions for disaster response management. In: S. Albeverio, V. Jentsch and H. Kantz (eds) The Unimaginable and Unpredictable: Extreme Events in Nature and Society (Berlin, Heidelberg, New York: Springer), pp. 319–348.
22. Bak, P. (1996) How Nature Works: The Science of Self-Organized Criticality (New York: Copernicus).
23. Aleksiejuk, A. and Holyst, J. A. (2001) A simple model of bank bankruptcies. Physica A, 299(1–2), 198–204.
24. Aleksiejuk, A., Holyst, J. A. and Kossinets, G. (2002) Self-organized criticality in a model of collective bank bankruptcies. International Journal of Modern Physics C, 13(3), 333–341.
25. Tubbs, S. L. (2001) A Systems Approach to Small Group Interaction (Boston: McGraw-Hill).
26. Arrow, H., McGrath, J. E. and Berdahl, J. L. (2000) Small Groups as Complex Systems: Formation, Coordination, Development, and Adaptation (Sage).
27. Chen, K.-Y., Fine, L. R. and Huberman, B. A. (2003) Predicting the future. Information Systems Frontiers, 5, 47.
28. Mikhailov, A. S. (1992) Artificial life: An engineering perspective. In: R. Friedrich and A. Wunderlin (eds) Evolution of Dynamical Structures in Complex Systems (Berlin, Heidelberg, New York: Springer), pp. 301–312.
29. Ulschak, F. L. (1981) Small Group Problem Solving: An Aid to Organizational Effectiveness (Reading, MA: Addison-Wesley).
30. Gautrais, J., Theraulaz, G., Deneubourg, J.-L. and Anderson, C. (2002) Emergent polyethism as a consequence of increased colony size in insect societies. Journal of Theoretical Biology, 215, 363–371.
31. Helbing, D., Ammoser, H. and Kühnert, C. (2006) Information flows in hierarchical networks and the capability of organizations to successfully respond to failures, crises, and disasters. Physica A, 363, 141–150.
32. Adamic, L. A. and Adar, E. (2003) Friends and neighbors on the web. Social Networks, 25(3), 211–230.
33. Stauffer, D. and de Oliveira, P. M. C. (2006) Optimization of hierarchical structures of information flow. International Journal of Modern Physics C, 17, 1367.
34. Watts, D. J. and Strogatz, S. H. (1998) Collective dynamics of ‘small-world’ networks. Nature, 393, 440–442.
35. Helbing, D. (1995) Quantitative Sociodynamics: Stochastic Methods and Models of Social Interaction Processes (Dordrecht: Kluwer Academic).
36. Christen, M., Bongard, G., Pausits, A., Stoop, N. and Stoop, R. (2008) Managing autonomy and control in economic systems. In: D. Helbing (ed.) Managing Complexity: Insights, Concepts, Applications (Berlin: Springer), pp. 37–56.
37. Fasold, D. (2001) Optimierung logistischer Prozessketten am Beispiel einer Nassätzanlage in der Halbleiterproduktion [Optimization of Logistic Process Chains Using the Example of a Wet Bench in Semiconductor Production]. MA thesis, TU Dresden.
38. Helbing, D., Seidel, T., Lämmer, S. and Peters, K. (2006) Self-organization principles in supply networks and production systems. In: B. K. Chakrabarti, A. Chakraborti and A. Chatterjee (eds) Econophysics and Sociophysics: Trends and Perspectives (Weinheim: Wiley), pp. 535–558.
39. Helbing, D. and Platkowski, T. (2000) Self-organization in space and induced by fluctuations. International Journal of Chaos Theory and Applications, 5(4), 47–62.
40. Helbing, D. (2004) Dynamic decision behavior and optimal guidance through information services: Models and experiments. In: M. Schreckenberg and R. Selten (eds) Human Behaviour and Traffic Networks (Heidelberg, Berlin, New York: Springer), pp. 47–95.
41. Helbing, D., Treiber, M. and Saam, N. J. (2005) Analytical investigation of innovation dynamics considering stochasticity in the evaluation of fitness. Physical Review E, 71, 067101.
Figure 1 Illustration of linear and non-linear functions. While a linear function has a single maximum on a bounded domain (left), a non-linear function may have many (local) maxima (right)
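
The distinction drawn in this caption can be made concrete numerically. The sketch below counts interior local maxima on a grid; the particular functions are illustrative choices, not those used to draw the figure.

```python
import math

def count_local_maxima(f, a, b, n=1000):
    """Count interior grid points of [a, b] that lie above both neighbors."""
    xs = [a + (b - a) * i / n for i in range(n + 1)]
    ys = [f(x) for x in xs]
    return sum(1 for i in range(1, n) if ys[i] > ys[i - 1] and ys[i] > ys[i + 1])

# A linear function has no interior local maximum on a bounded interval ...
n_max_linear = count_local_maxima(lambda x: 2.0 * x, 0.0, 2.0 * math.pi)

# ... while a non-linear function may have several.
n_max_nonlinear = count_local_maxima(lambda x: math.sin(5.0 * x) + 0.5 * x,
                                     0.0, 2.0 * math.pi)
```

This multiplicity of local maxima is exactly what makes the optimization of non-linear systems so much harder than linear programming: a local search can get stuck far away from the global optimum.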

Figure 2 Illustration of trajectories that converge towards (a) a stable stationary point, (b) a limit cycle, and (c) a strange attractor
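
Case (a) can be reproduced with a few lines of numerics. The damped spiral below is a hypothetical example system chosen for illustration (it is not the system behind the figure): every trajectory winds into the stable stationary point at the origin.

```python
# Explicit Euler integration of a linearly damped spiral:
#   dx/dt = -0.1*x - y,   dy/dt = x - 0.1*y
# The eigenvalues -0.1 +/- i have negative real part, so (0, 0) is a
# stable stationary point and the trajectory spirals into it.
x, y = 1.0, 1.0          # initial condition away from the fixed point
dt = 0.01
for _ in range(100_000):  # integrate up to t = 1000
    x, y = x + dt * (-0.1 * x - y), y + dt * (x - 0.1 * y)

final_distance = (x * x + y * y) ** 0.5  # distance to the attractor
```

Replacing the linear damping by a suitable non-linear term would instead produce a limit cycle as in panel (b); a strange attractor as in panel (c) requires at least three continuous dimensions.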

Figure 3 Illustration of the “butterfly effect”, i.e. the separation of neighboring trajectories in the course of time
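
The separation of neighboring trajectories can be demonstrated with a one-line iterated map. The fully chaotic logistic map below is a standard textbook example (an illustrative choice, not the system behind the figure): two trajectories starting a distance of 10⁻¹⁰ apart become macroscopically different within a few dozen iterations.

```python
# Logistic map x -> 4*x*(1-x), chaotic on (0, 1): the distance between
# nearby trajectories grows roughly exponentially until it saturates at O(1).
x, y = 0.3, 0.3 + 1e-10   # two almost identical initial conditions
separations = []
for _ in range(100):
    x = 4.0 * x * (1.0 - x)
    y = 4.0 * y * (1.0 - y)
    separations.append(abs(x - y))

max_separation = max(separations)  # reaches order 1 despite the tiny offset
```

This is the practical content of the butterfly effect: since initial conditions are never known with infinite precision, the long-term behavior of a chaotic system cannot be forecast, no matter how accurate the model.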

Figure 4 Emergence of turn-taking behavior: After some time, individuals may learn to improve their average success by choosing both possible options in an alternating and coordinated way (after Ref. 14)
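
Why alternation pays can be seen in a toy congestion game. The payoffs below are hypothetical and chosen only for illustration (they are not those of the cited experiment): each of two routes pays 1 when used alone, but only 0.5 per person when both players crowd onto it.

```python
def payoff(my_route, other_route):
    """Hypothetical congestion payoff: a route used alone pays 1.0,
    a shared route pays only 0.5 to each of its users."""
    return 0.5 if my_route == other_route else 1.0

rounds = 10

# Uncoordinated outcome: both players stick to the same route every round.
avg_same = sum(payoff(0, 0) for _ in range(rounds)) / rounds

# Coordinated turn-taking: the players alternate between the two routes
# in anti-phase, so a route is never shared.
avg_alternating = sum(payoff(t % 2, (t + 1) % 2) for t in range(rounds)) / rounds
```

In this sketch, coordinated alternation doubles the average payoff of both players, which is the incentive behind the emergent turn-taking shown in the figure.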

Figure 5 Illustration of the interaction network in anthropogenic systems. When the system is seriously challenged, this is likely to cause cascading failures along the arrows of this network (after Ref. 21)

Figure 6 Illustration of different kinds of hierarchical organization. As there are no alternative communication links, strict hierarchies are vulnerable to the failure of nodes or links (after Ref. 31)

Figure 7 Top: Schematic representation of the successive processes of a wet bench, i.e. a particular supply chain in semiconductor production. Middle: The Gantt diagrams illustrate the treatment times of the first four processes (out of several more), where the same colors denote processes belonging to the same run, i.e. the same set of wafers. The left diagram shows the original schedule, while the right one shows an optimized schedule based on the “slower-is-faster effect”. Bottom: The increase in the throughput of a wet bench by switching from the original production schedule to the optimized one was found to be 33%, in some cases even higher (after Ref. 37)

Figure 8 Illustration of frequency distributions of behaviors in space (after Ref. 39). Left: Separation of oppositely moving pedestrians perpendicularly to their walking direction for a low fluctuation strength. Right: Noise-induced ordering for medium fluctuation levels leads to a clear separation into two spatial areas. This reduces frictional effects and increases the efficiency of motion

Figure 9 Typical individual decision changes of 9 test persons in a route choice experiment with two alternative routes. Note that, after some time, we find almost identical or exactly opposite behaviors: the test persons develop a few kinds of complementary strategies (“roles”) that favor a good group performance (after Ref. 40)