I
Organization is certainly a form of social cooperation, but it could nevertheless be considered closely akin to technology. We speak of organizational technology. We now leave much of what used to be done through social interaction to computers. They can do the job just as well if not better. Classical organizational sociology also appears to have had this parallel in mind when it described the organization as an apparatus of power or even as a machine that performs set tasks. The analogy rests on the assumption that organizations can fairly reliably repeat the same work processes, and that if something does not work, the defect can be looked for and found.
Max Weber’s theory of bureaucracy and “Taylorist” work organization had both operated with such notions, pointing to the need for a general theory of organization. A political ruler, it was claimed, relied on an administrative staff programmed through rules that spared him from having to make decisions in individual cases and, nevertheless, ensured that such decisions were consistently made in accordance with the rules. Taylorist work organization set itself the task of breaking down the working process to allow the best combination of steps in that process to be worked out. Both forms of organization could be framed by an end/means schema to ensure that the ends could be met reliably and at predictable cost. If this did not happen, the machine had to be repaired or parts had to be replaced by functionally equivalent elements.
As a result of its own empirical research, however, organizational sociology has gradually distanced itself from these notions without quite abandoning them at the management theory level. A first step in this direction was the theory of “socio-technical systems.”1 It had developed in parallel to interest in “informal organization” and “human relations,” and its aim had been to determine the links between technology, work organization, and social relations in the workplace in order to establish a scientific basis for efforts to “democratize” and “humanize” the working world. This was an advance over older studies of technology-driven work processes2 because it differentiated and linked up different areas of the system, introducing difference-oriented systematization. The connections between technology and organization did not permit the one to be introduced without taking account of the other. It was implicitly presupposed that changes in technology were neither frequent nor rapid.
Further developments pointed to a stronger interest in the problems of managing technological innovations and their implementation in a smoothly running organization of work. This may have been prompted by technological developments themselves (such as computer technology or jet propulsion in aviation) and by other market requirements (for instance, more strongly differentiated products, or products with small production runs or even made to order3), but also by experience with resistance to the introduction of innovations.4 The focus of research thus shifted from systematic contexts to perspectives for planning changes. This shows once again that technical innovations cannot be understood as the application of scientific knowledge available to all and sundry but that “implicit,” firm-specific if not product-specific knowledge plays a role, knowledge that emerges when needed.5 At the slogan level we find the “humanization of work” being replaced by “socially compatible technology design.” “Socially compatible/responsible” appears to amount to decentralization and user participation in the planning, introduction, and recurrent modification of the system.
A current manifestation of the old concept of socio-technical systems could be the insight that introducing new devices or technologies has structural consequences for a given system.6 This is still called “contingency theory” (even though such a trivial insight hardly deserves the title of theory).7 Whether planned in advance or not, the new couplings bind other work processes, and this means that old habits have to be jettisoned and new ones developed. In Giddens’ terminology, such a process intervenes in the mutual determination of structures and actions; but it does not necessarily decide which structures are to be forgotten and what new ones have to be formed; for this is likely to depend not least on the historical particularity of the systems in question.
The identity of the term “technology,” defined as the fixed coupling of causal elements, cannot hide the fact that widely ranging matters are involved in how it impacts organizations. Above all, the operational basis for realizing a technology has to be identified; in other words, the operations that have to be coupled: physical, chemical, biological, or social. Technical apparatus is accordingly either a factor in the environment of the communication system or a form with which communication itself limits acceptable connections. The organization accordingly has to either deal with the problem of adapting to the apparatus if failure is to be avoided, or handle restrictive expectations that are very easy to deviate from. Technicizing communication itself thus requires error control, supervision, and social sanctions.
Furthermore, a distinction can be made between whether innovation is prompted primarily by science or inventions in the technology itself, as with the internal combustion engine or in electronics, or whether it is rooted in or has to do with organizational advantages, as with container transport. This is intersected by the distinction between whether innovation renders old knowledge and old forms of organization obsolete or whether it gradually improves and refines the initial invention, in either product type or production process.8 Depending on how radical the break is, management will have to adapt, or opportunities will emerge for starting up new firms. Large organizations are more likely to have the necessary resources for introducing improvements while small firms set up to exploit technological innovations will have a better chance of realizing such innovations. However the line of intersection might run, it is at any rate a clear argument against the notion of organizational forms being determined by technology. A distinction has at least to be drawn in terms of the sort of break with old habits required. Container transport revolutionized shipbuilding, the organization of shipping and ports, including the associated transport systems. But the changes had to be made within the context of existing shipyards, shipping companies, and port organizations. In the automotive industry, the invention of the automobile gave ample opportunity to set up new companies; only later as the product was improved and manufacturing rationalized did the number of enterprises on the market decline, leaving only a few big companies and a trend toward further shrinkage to only a few enterprises offering mobility as a service. In the case of telephone networks, both possibilities were tried out side by side from the very outset in both the United States and Europe.
While discussion on the “socio-technical” approach has subsided, the impact of technology at all levels of the organization has increased, namely through the use of computers. This development has clearly changed communication channels and decision-making competencies. At least in the field of automation, the notion that social systems are determined by technology had to be abandoned; the use and organizational embedding of the use of the computer in office and production had to be understood as a matter of more or less risky, more or less successful decisions9 that offered greater or lesser justification for introducing changes on the basis of experience.10 Moreover, it is questionable whether the optimum use of new technology can be a goal in itself. In the first place, it is only a matter of a shift in the means and ends typical of the organization. The question is, therefore, what general notions are to guide the introduction of computerized information production in organizations: more controllable information processing (also to monitor the day-by-day performance of employees?) or uncertainty absorption?
The mechanical engineering of the nineteenth and twentieth centuries had understood the human body as energy = work. Its peculiar humanity had been to save energy and gain time, to save labor and accelerate the transport of things and bodies. In the second half of the nineteenth century, this had led to the mass production of goods and the development of correspondingly big organizations. It was a question of tangible machines and tangible products. The computer realizes a completely different concept. Here we have to do with an invisible machine that can transform itself into another machine from one moment to the next in the course of utilization. Its switching processes are invisible, its speed of operation requires tight coupling; but results are accessible only via commands that render selected aspects of machine states visible. Inquiries permit unambiguity to be transformed back into ambiguity (purposefulness) of the use context. But the relation of surface (monitor, printout) to depth will change the possibility of problem presentation and argument. The time problem no longer lies in speeding up work processes (although this is still being worked on) but in the sequence of inquiries. Saving body energy reaches a point where, with extremely reduced use of the senses and bodily movement, it becomes dysfunctional. What is gained is the almost unlimited technicization of work processes. Developments seem to be moving from more peripheral areas toward key questions of organized decision making, but technology itself offers no concept for this. How much loose coupling, how much dependence on chance can be saved? And how can the impacts of failures be prevented from cascading? Through redundant parallel arrangements, thus through technical precautions against problems entailed by technology alone?
This is not the place to engage in this discussion with detailed analysis. But one more problem needs to be mentioned. Computer technology leads to the spatial fixing of work, not only at the level of the machine production of material goods but also at all levels of management. This is not to say that planned conferences or other face-to-face interaction are excluded. Inquiry facilities can be installed in conference rooms or printouts can be brought along. But little attention is given to the question of whether chance contacts among people circulating in space are not reduced.11 Will this lead to a clearer dividing line between work and socializing? And what are the consequences if this deprives the organization of an important source of chance stimulus?
Other sorts of problem arise when technical couplings become more complex, that is to say, come to consist of many, diverse elements that make varying demands on time. Time then becomes scarce, especially time for reacting to surprises. For from the temporal point of view, the tight coupling of technologically determined operations means immediate coupling. The system then has to reckon with malfunctions without having enough time in reserve to detect and correct them.12 The problem occurs independently of the magnitude of the damage or loss that such malfunctions can cause; but it becomes particularly serious in the field of high-risk technologies, where disasters can and have already occurred.13
At issue, against the political backdrop of such technology being banned, was finding some sort of encasement that would ensure that nothing happens and that, if disaster nevertheless threatens, the threat is registered and defused in time. In extremely threatening and unclear situations where highly urgent decisions have to be made, otherwise functioning divisions between systems can tend to collapse. For comprehensible neurobiological reasons, emotions mount and mental control of perception is reduced, making the situation even more complex than it already is.14 How others will react can no longer be foreseen, and there is no time to find agreement on how to define the situation. Such desiderata as humanization or social compatibility are completely out of place. Quite the opposite is required: safety technology must not be adapted to human beings with their weaknesses. It has to work with complete reliability, even more reliably than the risky core technology that is to be prevented from getting out of control. The malfunctions involved will be unexpected and very rare; but precisely because this is so, the organization has to work reliably. It would therefore have to be more strongly technological in operation than the technology itself.
One problem clearly lies in the relation of controlling authorities to operational units.15 From a technical point of view, the computer offers the possibility of providing for both greater centralization and greater decentralization; indeed, it even gives grounds for calling the sense of this distinction into question. But this does not mean that such possibilities can be realized in existing, still hierarchical organizations. The more elaborate control requirements become, the more attractive labor-saving circumventions, informal understandings, and restrictions on the flow of information will appear. Control is probably never realized to the degree that is technically possible,16 and the possibilities for socially acceptable follow-up measures would probably also be lacking. It is at any rate a negotiable, if not a power-political, issue.
Another problem is how to deal on the spot with poorly defined problems. The causes of an emerging disaster cannot be discerned clearly enough and, above all, not fast enough. Moreover, the implications of small problems are not always apparent, since they are likely to give rise to disasters only under rare and complex (“chance”) secondary conditions. Although post-disaster inquiries may well pinpoint causes, this transforms a poorly defined problem into a clearly defined problem, whereas the typical and recurring difficulty had been precisely in the relation between lack of clarity and workload, thus in the economy of attention, or in the unrecognized misinterpretation of communication or other signals.17, 18
To be essentially distinguished from risky technologies are so-called large technical systems (LTSs) – transport systems, power supply, telecommunications, extensively networked information provision – which provide the technical infrastructure for a wide range of other systems. In such systems, the risk of partial failure can be cushioned by redundant capacities, and only blackouts of considerable magnitude make themselves felt. The overall impression is that malfunctions caused by technology itself are on the increase without the system being capable of switching to manual operation or coordinating orally. More than in the past, the organization now needs the competence to ride out situational difficulties; but where is this competence to come from if everything is geared to computers and the capacity to intervene at other points in the system is lacking?19 The core technology may well be relatively simple, but its connections and resulting dependencies make the overall context confusingly complex – all the more so because it cannot be controlled by a single (however vast) organizational system. Since dependencies are highly complex and access has to be safeguarded, a sort of political responsibility for such systems is difficult to avoid, even though organizational problems differ strongly from those that are typically handled by public administration. Hybrid structures have thus developed, which dissolve the strict separation of public and private and, regardless of their technological infrastructure, have in recent years attracted a great deal of attention also as social systems.20 Where the basic product of an organization is highly uniform (energy, mobility, information), it will have to obey technological imperatives far more strongly than otherwise. It consequently does not make much sense to construct an umbrella concept for service organizations that also covers social work, therapy, education, and the like.
In the light of such heterogeneous demands on technology in general, research approaches have become more complex in the course of this development. The literature on management is now too vast for comfort. Research on safety engineering has not produced satisfying findings but rather disquieting results; there is even talk of “normal” disasters that have to be reckoned with.21 But concern about the disasters caused by technology that the mass media report daily conceals a much more profound problem, namely the irreversible dependence of society on technology and thus on the technically possible production of energy. It is therefore high time to re-examine the foundations for theory. What do we wish to understand by “technology”?
II
Technology can be very formally defined as the tight coupling of causal elements, no matter what the material basis for this coupling.22 The concept includes human conduct insofar as it takes place automatically and is not interrupted by decisions. For example, the development of unproblematic readability (to be distinguished from textual comprehension) is part of the technology of the printing press and was developed as its correlate. The same can be said of the ability to drive a car (again: to be distinguished from decisions on direction or conduct in critical situations), and of other cases of machine operation. The range covered by technology cannot therefore be construed from the “materiality” of the coupled operations. In other words,23 technology can form functioning networks from quite heterogeneous elements as long as tight coupling succeeds. Technicization includes human perception and motoricity, and precisely this is the problem – not (to take the example of reading) because it is somehow non-human, but because it raises the problem of whether, when, and how one sees alternatives in the technicized process and consequently assumes responsibility for decisions.
Where technical coupling can be set up, a given cause (or a given complex of causes) A is always followed by an effect B: or a given item of information A (e.g., an advertisement or an application to a public authority) is always followed by a decision B.24 The reliability of such coupling can always be treated as a variable; we then have to do with various degrees of technicization of the process. However, here, too, we would start from the boundary concept of non-variable tight coupling.
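The boundary concept just described, a given cause A invariably followed by effect B, can be caricatured in a minimal sketch. This is a purely illustrative construction; the function name and the mapping are invented for the illustration, not drawn from the text:

```python
# Illustrative sketch of "tight coupling": a deterministic mapping in which
# a given cause is always followed by the same effect, with no decision
# intervening between the two.

def trivial_machine(cause: str) -> str:
    """Tightly coupled: cause A is always followed by effect B."""
    coupling = {
        "A": "B",                     # abstract cause/effect pair
        "application": "decision",    # the text's example: an application
                                      # to an authority triggers a decision
    }
    # Anything outside the fixed coupling is simply not a cause for this
    # machine; the lookup fails rather than improvising a response.
    return coupling[cause]

assert trivial_machine("A") == "B"
assert trivial_machine("application") == "decision"
```

Treating the reliability of the coupling as a variable (degrees of technicization) would mean letting this mapping hold only with some probability; the boundary concept assumes it holds without exception.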
We can speak of tight coupling in both mechanical and electronic machines, and in so-called “software technologies” – for instance, timetables, airline reservation systems, or merchandise price marking. A typical advantage of such technologies is their very specific and rapid adjustment to unpredictable customer demands. It always has to do with tying down resources that can then no longer be transferred, or only at great effort and cost. The advantages are thus not only in the length of the chain that can still be controlled by the initial impulse. They lie also in network-like relationization with optional access to connections chosen ad hoc, where one cannot simply assume, without further checks, that they are practicable in the system.
The concept of tight coupling clearly shows the advantages of a technical arrangement. Tight coupling enables considerable simplification in dealing with technology. Within the computer, in its “invisible machine,” it enables a speed of operation beyond the grasp and control of the mind. Above all, technology reduces the need for consensus. That it works can itself be taken for granted, and taken for granted in such a way that others can be assumed to take it for granted, as well.25 It makes artificial objects available, which also serve as a substitute for consensus. Technology divides consensus issues into problems of ends and problems of means, thus enabling relational rationalization that seeks to establish a favorable (possibly “optimal”) relation between ends and means. Advantages are also to be found in practical application: to start a technical process, one need only know what sets it off.26 To use technology, one does not even need to know the theories required for a scientific understanding and explanation of technical processes.27 This characteristic of technology offers organizations considerable advantages. Operation can be left to semi-skilled personnel, professional competence being reserved for questions of constructing technical processes or diagnosing malfunctions.
Another advantage of technical processes is that the resources they require can be calculated. This also means that the dependence of technology on environmental resources can be determined. Energy requirements depend on the extent to which the technology is utilized. This includes human work performance insofar as it is itself technicized or required as an annex to technical processes.28
Finally, a technical substratum makes it possible to detect irregularities. Malfunctions literally attract attention automatically, and, albeit with more effort in diagnosis and knowledge, one can discover how to eliminate them. This has to do with the fact that technology facilitates learning. Frequent malfunctions allow vulnerable elements to be identified and defined as problems, and better solutions to be sought. What is more, technology makes its own conditions for utilization transparent. What problem it was designed to solve is known or can be determined, and on this basis functional equivalents can be sought. This also works the other way round: knowing the process, one can in one way or another look for the problem it can solve, and that problem can then be rethought.
In sum, these various advantages of technology constitute the condition of the possibility for forming hierarchies. Work processes, such as teaching in school and preaching in church, which cannot be technicized, also resist hierarchical supervision, calculated resource allocation, and learning in the system of the organization.29 They oblige the organization to outsource the conditions for its success to face-to-face interaction, and to leave their fulfillment to events that supervision cannot follow. This does not exclude such interaction being influenced through resource allocation – when, for example, the educational system allots more time to certain subjects.30 But it is practically impossible to foresee the effect of differences in allocation on differences in operational results. The battle for resources is therefore always a struggle for the symbolic recognition of the importance of the recipient organizations. The hierarchy can react only to social noises, to complaints and protests, and it can perform its supervisory function at best in more or less ceremonial forms (inspection, visits, circulars).
This shows how strongly classical organization research has focused on the relation between a core technology and rational hierarchical control. If “informal organization” was also taken into account or the environment included through the question of resource dependence, this did not mean the basic conception was abandoned; it was merely modified. There is, indeed, a significant link between core technology and hierarchy formation. So the classical theory is by no means disproved. But it throws light only on a subphenomenon and does not convey an adequate idea of how an organization reproduces itself as system.
The use of causal technologies does not necessarily mean that the system as a whole or at the top is fully aware of what is going on within it or can at least ascertain what state it is in. The more complex technical couplings and variability in the temporal use of factors become, the less this can be said. The theorem of self-generated uncertainty then comes to bear. In virtually geometrical progression, uncertainty increases, especially under the impression of unforeseen surprises, malfunctions, or opportunities. The simple means of tight coupling and hierarchy are then, so to speak, exhausted, and the system has to react ad hoc, fast, competently, and professionally.
Continuing to reproduce under complex conditions requires loose coupling.31 This is true for the relation between motives (intentions) and performance and between decision premises and decisions. The need to take uncertainty absorption into account in linking decisions also points to (more or less) loose coupling. All tight and therefore vulnerable coupling of core technologies and all hierarchical concentration and control based on such coupling must therefore be embedded in a system that is based on other, more robust conditions for reproduction. Where technology does work, it works reliably, but reliability should not be confused with robustness. It is based on a high degree of indifference, but for this very reason is risky. Conditions shift on the other side of the form “technology.” Technology can register malfunctions if they occur within itself. It can react by repairs or by substitute performance. But tight coupling also means that diagnosis and learning requirements are limited. What is an advantage in the sense of a condition for specific reactions to occur is a disadvantage if the problem cannot be resolved by replacement or repair. Meticulous and detailed accounts usually provide no information on how to help an enterprise in difficulty.
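The contrast drawn here, tight coupling that is reliable but vulnerable versus the more robust loose coupling in which it must be embedded, can likewise be sketched schematically. The example is invented for illustration only; the step names and the buffering mechanism are assumptions, not the text's own model:

```python
# Illustrative sketch: a tightly coupled chain fails as a whole when one
# element fails; a loosely coupled arrangement absorbs the fault and keeps
# reproducing, at the price of leaving the faulty step unresolved for now.

def tightly_coupled(steps, fail_at=None):
    """Immediate coupling: any malfunction halts the entire chain."""
    done = []
    for i, step in enumerate(steps):
        if i == fail_at:
            raise RuntimeError(f"malfunction at step {step!r}")
        done.append(step)
    return done

def loosely_coupled(steps, fail_at=None):
    """Buffered coupling: a malfunctioning step is set aside for repair
    while the rest of the process continues."""
    done, deferred = [], []
    for i, step in enumerate(steps):
        (deferred if i == fail_at else done).append(step)
    return done, deferred

steps = ["plan", "produce", "deliver"]
# Tight coupling: reliable while it works, brittle under malfunction.
# Loose coupling: robust; the system continues around the fault.
done, deferred = loosely_coupled(steps, fail_at=1)
assert done == ["plan", "deliver"] and deferred == ["produce"]
```

The point of the sketch is only the asymmetry: reliability (the tight chain never produces a wrong result) is not robustness (the ability to go on operating when something fails).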
Organizations that technicize work processes will accordingly have to provide for tight and loose couplings side by side and in conjunction. This can rightly be seen as a paradoxical requirement.32 At the same time, this distinction, if its unity is dissolved in the sense of the asserted difference, also points the way to resolving the paradox through decisions about concrete problems. One can decide in favor of a technology and then, looking at the problems it entails, be better able to discover the network of loose couplings in which it has to be embedded. For there are always two sides to a paradox. They reveal themselves when one pays attention to the unity of a distinction; but they also give widely ranging indications about deparadoxization in practice, depending on the type of distinction at issue.
This strict concept of technology raises doubts about the extent to which the automation of processes in offices and factories can be understood as “new technology.” Computers themselves are, of course, technical in conception. They have to work reliably. They cannot be allowed to be temperamental. They generate copious new information and combinatorial evaluations of information about their given area of operations, including, if hierarchical supervision is maintained, new possibilities for supervising and monitoring work.33 In the not too distant future, cars can be expected to tell their drivers how they are misbehaving. Computers thus expand the memory performance of the system – also for non-automatic follow-up decisions. They generate new possibilities for seeing alternatives. They lead to new self-descriptions of the system with unpredictable (needing a decision) connectivity options. “Il sistema sembra autoricognoscersi” (“the system seems to recognize itself”), according to Butera34 with reference to Maturana and Varela’s theory of autopoietic organisms. The conclusion can only be that the automation of decisions at the operational level leads to more decisions having to be made at other levels. In the work process, less manual skill is required and more attention – which should not too hastily be interpreted as higher qualifications with a claim to better pay.
From the technology perspective, too, it is accordingly appropriate if not necessary to see organizations no longer as relations between more or less disciplined people but as the transformation of decisions into decisions under the conditions of autopoietic closure.
III
Technical systems can be described as allopoietic systems, which are exogenously controlled and which cease to operate when impulses cease. But they are obviously constructed to reduce the workload of autopoietic systems, if not to replace them. Since the invention of the computer, these possibilities of providing relief and substitution have multiplied, and in many organizations computers are now so integrated that their failure would cause serious disruption if not irreparable losses.
There are many facets to the consequent problems whose presentation would require specific studies. Provision has to be made for risks, redundancies have to be built in, backup has to be ensured, all at a cost that far exceeds the price of the computers themselves and their programs. Another problem has hitherto attracted little attention or been relegated to the background by well-established assumptions. The question is what autopoietic systems are actually replaced by the computer and superseded in the workplace.
As the example of artificial intelligence shows, studies of this sort usually set out from human consciousness, which is defined in terms of cognitive competence, ratio, intelligence, or similar traditional concepts.35 What tradition saw as distinguishing humans from animals is now and for the future to distinguish them from electronic machines. This at any rate is the hope of humanists – the vanishing point at which they hope to free the “human being” from the embrace of technology. At this point we could already consider whether the particularity of human consciousness, which the computer is likely to have difficulty attaining, does not lie in perception, and – in contrast to animals – in perception that is strongly language-dependent, directed and differentiated by language.
Be that as it may, a quite different question that arises is the relation between electronic data processing and communication. As the euphoria about artificial intelligence reaches its limits, the problem of substitution and improvement in relations between computer and communication appears to be shifting. For communication takes place under the conditions of mutual intransparency, which includes the intransparency of systems for themselves. One does not know a great deal about oneself or others, and so one talks, writes, prints, and transmits. For the computer, the operational and structural inaccessibility of what has been developed so far in the history of “human beings” is therefore likely to be due to the specific nature of social systems and not to the particularity of psychic systems. Humanists would then take refuge not in the consciousness or subjectness of the human being but in the autopoiesis of communication, or, to put it in terms more to their taste, in culture.
Computer systems can naturally be networked with one another and exchange the results of their work in the form of data. But precisely this is not the real achievement of communication. Communication produces a synthesis of information, utterance, and understanding on the basis of constructing a meaning that suppresses uncertainty, whose lack of grounding in physical, chemical, and organic realities can be compensated by the fact that every communication, if understood, can be answered in the affirmative or negative, and thus accepted or rejected in accordance with the means of persuasion that can be activated in the social system itself.36
This is also likely to be the case for organizations. However, the problems caused by making communication in organizations dependent on electronic data processing are all the more a matter of concern. And here we face the general problem that society is making itself dependent on technology even in the simplest of operations, and therefore has to ensure the permanent, efficient functioning of technical devices with their tight couplings.
IV
Modern society, more than any society before, is dependent on technology.37 Malfunctions can have consequences that can spread like an avalanche. If energy supplies and thus technology were to fail, this would cost most people their lives; it would accordingly not only affect societal communication but also have a massive impact on its environment: people would no longer talk, they would die. And the problem is by no means one of sensational one-off disasters alone, however many deaths they might cause, but also a problem of keeping things running at a normal level. Quite concretely and specifically this means the problem of energy supply.
The extent of technology dependence, which no one sought and which has come about through evolution, makes it understandable how much hope is, at least implicitly, placed on organization. This includes monitoring and controlling all core technologies, but also supplementing them by secondary technologies in the organization of human work through conditional programming. It also includes reducing the probability of disasters to rare exceptions, and finding alternatives in the event of failures (e.g., in the supply of oil) in good time or at least fast enough. But is this hope justified?
The question is certainly too simplistic. It cannot be answered by a yes or a no. Our analysis of organizations as autopoietic systems of a particular type raises certain doubts. It poses the question whether modern society has not maneuvered itself into a double dependency, on technology and on organization, without being in a position to control the one with the aid of the other. Who then is in control? What organization?
Questions of this ilk belong in a theory of society; we can do no more than touch on them here. A sociology of technology would have to address the question of how many tight couplings a society can afford. This is a problem of smooth running at high risk, which is not only, not even primarily, a problem of disasters triggered by technology but above all one of maintaining the supply of energy, which has in turn to be technically produced. Organizational sociology, by contrast, would have a useful theory of organized social systems to offer a theory of society. This is the key issue of how the classical link between core technology and hierarchical control can be built into a more comprehensive concept of organization. For this purpose, it is not enough to supplement formal aspects by informal ones or to address the organization/machine analogy in terms of the organization/organism analogy.38 An organization is neither a trivial machine nor is it an organism. Unlike a technical system, it cannot tightly couple the totality of its operations. It is an autopoietic system of a specific type – a self-referential system, intransparent to itself and therefore also an unreliable but robust system that establishes itself in society without serving it. The concluding question must therefore be how a societal system can evolve that affords itself organizations, that is indeed dependent on them in almost39 all functional systems, but that cannot control them – except through organizations.
Technology and organization thus have at least one thing in common: they depend on themselves in fatal loops and are therefore unable to control their own evolution. Technology generates an energy supply problem that can be solved only by technology. The organization generates a control problem that can be solved only by organization. The social system of society provides the framework conditions for this evolution, the explosive mixture of order and chance on which all evolution depends in society. But society can react to what happens only at the level of technical developments and at the level of organization.