If our computer systems break down, we might find an enormous dependency of which we were not truly aware. We may, by then, have become functionally illiterate – unable to deal with each other except with the aid of mechanisms.
Laurie (1979, p.141)
A tool is but the extension of a man's hand, and a machine is but a complex tool. And he that invents a machine augments the power of a man and the well-being of mankind.
Henry Ward Beecher (1813–1887), Proverbs from Plymouth Pulpit
Genres are associated with various cultural objects, objects that can attract us to the genres, make us suspicious of them, or even lead us to avoid their use. Sites for constructing, expressing, and modifying cultural objects include debate, writing, speeches, legal decisions, imagery, and design, as well as CSCW and other network-based system applications. The construction of cultural objects, like that of artifacts and other physical objects, should be considered in light of its reflexive dimensions: the objects serve to shape the cultures, individuals, and genres with which they are associated, and are in turn given shape by them.
Issues of dependence, autonomy, and intellectual augmentation are critical aspects of construction of the “first-person plural” – which involves the authority to attach the word “we” to a document, product, or decision. This authority is not automatically given by group members, and is often not recognized by parties outside the group. In some situations, establishment of this authority may depend on the ability of group participants and audiences to segregate the group from the “computer-mediated group.”
The work of a crowd is always inferior, whatever its nature, to that of an isolated individual.
Gustave LeBon (1895/1960, p. 200)
The release of productivity is the product of cooperatively organized intelligence.
Dewey and Tufts (1939, p. 446)
“Collaboration” and “cooperation” among individuals – the harnessing of people's skills and talents to conduct projects, make decisions, and create new ideas – are notions that are both commonplace and elusive. The contradiction between the two epigraphs underscores the fact that controversies concerning the value of collaboration are not new. We have all participated in meetings and team projects, in informal exchanges as well as structured games, but these activities remain only vaguely understood and nearly impossible to predict and control with any precision. Our modes of individual and group expression (our “virtual individuals” and “virtual groups”) are intimately linked with the technologies that support group interaction – technologies that have undergone dramatic change in the past decades.
Network-based computer applications designed to support joint efforts (“computer-supported cooperative work,” or CSCW, applications) have both staunch supporters and fierce critics. Promoters have characterized these systems as “coaches” and “educators” (Winograd and Flores, 1986); critics, in turn, have labeled the same systems as “oppressors” and “masters” with a “digitized whip” (Dvorak and Seymour, 1988). The terms “groupware” and “workgroup computing” can be found in many computing, management, and social science publications, along with words of high praise, condemnation, or ennui. Virtual reality (VR) applications have been incorporated into some CSCW initiatives, sometimes compounding confusion about the systems and further steepening the learning curve.
Picture a modern office setting, perhaps an insurance company headquarters. Some people are writing on sheets of paper. Others are looking into computer screens, entering numbers into a spreadsheet. Still others are conversing. Which of these individuals are working individually, and which are engaging in cooperative work? And which of the individuals engaging in cooperative activity are participating in healthy, well-working groups? Some of these issues might appear to be riddles or trick questions. Whether or not there are “riddles” (or linguistic puzzles) involved, the issue of how best to construe cooperative work activity is one of the most salient focal points of research and theory in CSCW applications.
In much the same manner as healthy and unhealthy forms of individual behavior have been constructed by social scientists, today's administrative theorists, network-based system developers, and CSCW researchers are attempting to construct notions of “functional” and “dysfunctional” collaborative behavior. Several of the theorists whose work is described in this chapter are attempting to segregate some kinds of work as “cooperative” and give them special forms of support. Others want to transform existing forms of work from their current, supposedly noncooperative form into cooperative work. Still others label all work as cooperative and want us to see work itself in a new light. Many have identified “right” ways of thinking about and engaging in cooperative activity, their conclusions bolstered with theoretical scaffolding, empirical research, and appeals to common sense.
We address the issue of what the proper relationship between theoretical computer science and practical computing should be. Starting from an analysis of what we perceive as the failure of formally based research to have as much impact on practical computing as is merited, we propose a diagnosis based on the way formally based research is conducted and the way it is envisaged that results from these areas will be translated into practice. We suggest that it is the responsibility of practitioners of theoretical computer science to work more closely with the practical areas in order to identify ways in which their ideas can be used to augment current practice rather than seeking to replace it. As a case in point we examine functional programming and its relationship to programming parallel machines. We introduce a development, structured parallel programming, that seeks to combine the theoretical advantages of functional programming with established practice in these areas. We show how the full power of functional programming, for example high-level abstraction and program transformation, can be made compatible with conventional imperative programming languages, providing practical solutions to many long-standing problems in parallel computing.
Introduction
We characterise formally based computer science as the attempt to apply fundamental ideas in the logic and mathematics of computation to practical computing problems. Developments here include, for example, functional and logic programming, program transformation, studies of concurrency and formal program specification and verification. The hope underlying much of the research in these areas is that the mathematical foundation of these approaches will make possible radical solutions to many deep-seated and long-standing problems in computing.
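The transformational style of reasoning mentioned above can be made concrete with a small sketch (written here in Python rather than a functional language; the function names are illustrative, not from any cited system). The map-fusion law, map f ∘ map g = map (f ∘ g), allows two traversals of a data structure to be rewritten as one, and it is the purity of the mapped functions that licenses the rewrite:

```python
def compose(f, g):
    """Function composition: compose(f, g)(x) == f(g(x))."""
    return lambda x: f(g(x))

def double(x):
    return 2 * x

def increment(x):
    return x + 1

xs = list(range(10))

# Two traversals of the list:
two_pass = list(map(increment, map(double, xs)))

# After the map-fusion transformation, a single traversal:
one_pass = list(map(compose(increment, double), xs))

assert two_pass == one_pass  # the transformation preserves meaning
```

Because the functions involved are pure, the same reasoning also justifies partitioning the list and mapping the chunks on separate processors, which is the intuition behind structured parallel programming.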
The requirements for design conflict cannot be reconciled. All designs for devices are in some degree failures, either because they flout one or another of the requirements or because they are compromises, and compromise implies a degree of failure.
David Pye, The Nature and Aesthetics of Design (1978)
Each electronic medium has come into its own only when we recognized its newness and stopped trying to use it as a container of the old.
Tony Schwartz, Media: The Second God (1981)
In building a design approach for computing applications that is responsive to their social and ethical dimensions, Pye's remarks are certainly instructive. He counsels us that all designs are in some senses compromises, and are therefore “failures.” Schwartz's comment prods us to look for new dimensions and possibilities in design, as well as to develop new ways of seeing media – not just new ways of applying new media to old problems. We tend to view new technologies with standards and expectations formed in previous eras: it is indeed difficult to do otherwise. The upshot of Pye's and Schwartz's counseling, simply put, is that computing applications and network-based system approaches in particular are likely candidates for revision, rethinking, and revision again.
I develop the notion of “genre-responsive design” in this chapter, along with some specific considerations for designers, managers, and users. Genres reflect complex political, social, and economic interactions among the individuals and groups with which they are associated. Couplings that genres have with various cultural objects serve to shape users’ expectations of those genres, as well as affect the scope of the genres’ utilization for constructing virtual individuals and groups.
The word “genre” (originally from French, meaning “type”) is largely associated with the realm of literature. In that context it is usually employed to refer to generic varieties of written material, such as novels, poems, and short stories. Viewing a set of computer applications as a genre emphasizes commonalities and family resemblances among set members (although genres can occasionally include loosely knit, heterogeneous compilations, held together for reasons that are largely accidental and historical). Questions about the range of expression that genres afford, and of individuals’ rationales in their choices of genres, occupy the attention of many literary critics (for example, Banta, 1978; Todorov, 1990), media specialists, and active as well as prospective consumers of the genres.
Discourse on genre plays an important role in genre construction. Genre-related notions can be powerful tools for understanding a variety of phenomena associated with human expression. The document you have in your hands right now conforms to a certain set of standards for presentation. Some are set by the American Psychological Association (APA), whose Publication Manual is the generally accepted style book for many written works. Writing standards may seem arbitrary, a trivial nuisance one must put up with in one's journey toward self-expression and group expression. However, these standards serve considerable functions in the development and maintenance of academic disciplines. For example, the adherence to uniform standards of citation that the APA requires bolsters the prestige of psychology as both a profession and a research area, supporting the notion that authors are indeed building on the work of others and adding to the growing stock of knowledge of the discipline as a whole (Bazerman, 1987b).
We all have had experiences we labeled as successful – perhaps even joyful – workplace or educational collaborations, where joint effort was free-flowing and results obtained were far greater than those any individual could have produced alone. The goal of facilitating such interaction with computer networks raises a number of difficult questions. How do we establish adequate platforms for self-development and self-expression, while providing vehicles for support of productive and efficient collaboration? How do we counterbalance powerful managerial and technological strategies with safeguards for the rights of individuals in their associations? This book takes a first step toward answering these questions.
In Chapter 1, I introduce and develop the notions of the “virtual individual” and “virtual group,” and explore how these entities play critical roles in human expression and interpersonal relationships. In this chapter, I also provide background and analysis on the research and application areas of network-based systems, with emphasis on groupware or “computer-supported cooperative work” (CSCW) approaches and linkages of groupware to the Internet and other large-scale networks. Although groupware and other network-based applications are of relatively recent vintage, they have roots that reach far back into the histories of computing, as well as the social and managerial sciences. I consider these applications in light of their many dimensions, in part by developing the notions of “genre” and “narrative” in relation to the growing varieties of computer artifacts and forms of computer-mediated expression.
Real-time computing has been the domain of the practical systems engineer for many decades. It is only comparatively recently that very much attention has been directed to its theoretical study. Using methods originally developed for use in operations research and optimization, scheduling theory has been used to analyse characteristic timing problems in the sharing of resources in real-time computing systems. Independently of this, specification and verification techniques used in sequential and concurrent programming have been extended to allow definition of the timing properties of programs. In terms of effectiveness, the two approaches are still limited and experimental, and neither on its own can yet be used to provide an exact timing analysis or to verify the timing properties of even modestly complex real-time programs of practical size. But if restricted classes of program are considered, they are rapidly approaching the point of practical usefulness. This suggests that the development of a discipline of real-time programming would allow the construction of programs with analysable and verifiable timing properties. Such a discipline will need to be built on a well-integrated framework in which different methods are used where appropriate to obtain timing properties to which a high level of assurance can be attached.
Introduction
A real-time computer system interacts with an environment which has time-varying properties and the system must exhibit predictable time-dependent behaviour. Most real-time systems have limited resources (e.g. memory, processors) whose allocation to competing demands must be scheduled in a way that will allow the system to satisfy its timing constraints. Thus one important aspect of the design and analysis of a real-time system is concerned with resource allocation.
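A classical instance of such resource-allocation analysis is the Liu and Layland (1973) utilisation bound for rate-monotonic scheduling of independent periodic tasks: if total utilisation does not exceed n(2^(1/n) − 1), every deadline is met. The sketch below applies this sufficient (not necessary) test; the task parameters are invented for illustration:

```python
def rm_utilisation_test(tasks):
    """Sufficient (not necessary) Liu-Layland schedulability test for
    rate-monotonic scheduling of independent periodic tasks.
    tasks: list of (computation_time, period) pairs."""
    n = len(tasks)
    utilisation = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilisation <= bound

# Three periodic tasks: total utilisation 0.65; the bound for n = 3 is
# about 0.780, so rate-monotonic priorities are guaranteed to meet all deadlines.
tasks = [(1, 4), (1, 5), (2, 10)]
print(rm_utilisation_test(tasks))  # True
```

When the test fails, the task set may still be schedulable; an exact answer requires a finer-grained response-time analysis, which is the kind of restricted-class method the passage above refers to.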
The editors of this volume set out to compile a set of personal views about the long-term direction of computer science research. In responding to this goal, we have chosen to identify what we perceive as a long-term challenge to the capabilities of computing technology in serving the broader needs of people and society, and to discuss how this ‘Grand Challenge’ might be met by future research. We also present our personal view of the required research methodology.
Introduction
Much of present-day computer technology is concerned with the processing, storage and communication of digital data. The view taken in this contribution is that a far more important use of computers and computing is to manage and manipulate human-related information. Currently, the provision of such structured information has been tackled at the level of single organisations (company, institution, government department, etc.) by the use of databases which are often limited to single functions within the organisation. Databases are closed, in the sense that the information itself can be viewed in a limited number of ways, and the ways in which it can evolve are carefully controlled. Interaction between databases containing related data is prohibited, except through the mediation of human experts. This is an unnecessarily restricted concept of information processing, and one which fails to recognise its real social and economic potential. We foresee a huge market for personal information services based on open access to a continually evolving global network of stored information. Although there are significant technical difficulties associated with creating and controlling such networks and services, we predict that the economic incentives will ensure that the necessary development occurs and that this information market will – within a period of decades – dwarf the market in computing machinery and software.
This article argues that problems of scale and complexity of data in large scientific and engineering databases will drive the development of a new generation of databases. Examples are the human genome project with huge volumes of data accessed by widely distributed users, geographic information systems using satellite data, and advanced CAD and engineering design databases. Databases will share not just facts, but also procedures, methods and constraints. This, together with the complexity of the data, will favour the object-oriented database model. Knowledge base technology is also moving in this direction, leading to a distributed architecture of knowledge servers interchanging information on objects and constraints. An important theme is the re-use not just of data and methods but of higher-level knowledge. For success, a multi-disciplinary effort will be needed along the lines of the Knowledge Sharing Effort in the USA, which is discussed.
Introduction
The research area of databases is a very interesting testing ground for computing science ideas. It is an area where theory meets reality in the form of large quantities of data and very heavy computational demands upon it. Until recently the major problems were in banking and commercial transactions. These sound easy in principle but they are made difficult by problems of scale, distributed access, and the ever-present need to move a working system with long-term data onto new hardware, new operating systems, and new modes of interaction. Despite this, principles for database system architecture were established which have stood the test of time – data independence, serialised transactions, two-phase commit, query optimisation and conceptual schema languages. Thanks to these advances the database industry is very large and very successful.
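One of those established principles, two-phase commit, can be sketched in a few lines (the Participant class and its method names are invented for illustration, not from any particular database system): a coordinator first collects votes from every participant, and commits only if all vote yes; any single refusal aborts the whole transaction.

```python
class Participant:
    """A toy resource manager; 'can_commit' stands in for real prepare logic
    such as writing a redo log and acquiring locks."""
    def __init__(self, can_commit):
        self.can_commit = can_commit
        self.state = "active"

    def prepare(self):   # phase 1: vote yes/no
        return self.can_commit

    def commit(self):    # phase 2: make the transaction durable
        self.state = "committed"

    def abort(self):     # phase 2: roll the transaction back
        self.state = "aborted"

def two_phase_commit(participants):
    # Phase 1 (voting): every participant must promise it can commit.
    if all(p.prepare() for p in participants):
        for p in participants:   # Phase 2 (completion): commit everywhere
            p.commit()
        return "committed"
    for p in participants:       # A single 'no' vote aborts everywhere
        p.abort()
    return "aborted"

print(two_phase_commit([Participant(True), Participant(True)]))   # committed
print(two_phase_commit([Participant(True), Participant(False)]))  # aborted
```

The sketch omits what makes the real protocol hard – coordinator failure between the phases, logging, and timeouts – which is precisely where the problems of scale and distribution mentioned above arise.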
Computer science and mathematics are closely related subjects, and over the last fifty years, each has fed off the other. Mathematicians have used computers to prove (or disprove) traditional results of mathematics, computer scientists have used more and more advanced mathematics in their work, and new areas of mathematics have been inspired by questions thrown up by computing.
Introduction
The academic subjects of mathematics and computer science, the oldest science and one of the newest, are closely related. This article considers the various ways in which they interact, and in which each influences the development of the other.
It is worth noting that we do not consider here the influence of computer technology (and the associated communications revolution) on the infrastructure and sociology of mathematics. Developments such as
CD-ROM publication (particularly of Mathematical Reviews),
electronic databases (again one thinks of Mathematical Reviews, but also of the Science Citation Index, which, even in its paper form, could not be compiled without computers),
electronic manuscripts and camera-ready copy,
ftp preprint systems and
electronic mail
have changed, and will continue to change, the way in which mathematicians consider, and add to, their literature, but this is not specific to mathematics, even though mathematicians have often been in the vanguard of such movements, presumably because of their general use of computers.
The Influence of Computers on Mathematics
Mathematicians have always numbered prodigious calculators among their kind, be they numerical calculators or symbolic ones (Delaunay's lunar theory (1860) contained a 120-page formula). Hence it is not surprising that the digital computer soon interested some pure mathematicians. With its help, they could perform far larger calculations than before, and investigate phenomena that were inaccessible to human computation.
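A small example of this kind of machine-assisted exploration (the choice of conjecture is merely illustrative): exhaustively checking Goldbach's conjecture, that every even number greater than 2 is the sum of two primes, over a range no hand calculator would relish.

```python
def is_prime(n):
    """Trial division; adequate for small n."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def goldbach_holds(n):
    """True if even n > 2 can be written as a sum of two primes."""
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

# Exhaustively verify the conjecture for every even number up to 10,000.
print(all(goldbach_holds(n) for n in range(4, 10001, 2)))  # True
```

A few seconds of computation thus covers more cases than a lifetime of hand calculation, though of course no finite search settles the conjecture itself.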
On Disparity, Difficulty, Complexity, Novelty – and Inherent Uncertainty
It has been said that the term software engineering is an aspiration, not a description. We would like to be able to claim that we engineer software, in the same sense that we engineer an aero-engine, but most of us would agree that this is not currently an accurate description of our activities. My suspicion is that it never will be.
From the point of view of this essay – i.e. dependability evaluation – a major difference between software and other engineering artefacts is that the former is pure design. Its unreliability is always the result of design faults, which in turn arise as a result of human intellectual failures. The unreliability of hardware systems, on the other hand, has tended until recently to be dominated by random physical failures of components – the consequences of the ‘perversity of nature’. Reliability theories have been developed over the years which have successfully allowed systems to be built to high reliability requirements, and the final system reliability to be evaluated accurately. Even for pure hardware systems, without software, however, the very success of these theories has more recently highlighted the importance of design faults in determining the overall reliability of the final product. The conventional hardware reliability theory does not address this problem at all.
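The conventional theory referred to above models random physical failures. For instance, for independent components with constant (exponential) failure rates connected in series, system reliability is R(t) = exp(−(λ₁ + … + λₙ)t). A minimal sketch, with invented failure rates:

```python
import math

def series_reliability(failure_rates, t):
    """Probability that a series system of independent components, each with
    a constant (exponential) failure rate, survives to time t:
    R(t) = exp(-(lambda_1 + ... + lambda_n) * t)."""
    return math.exp(-sum(failure_rates) * t)

# Two components at 1e-4 and 2e-4 failures per hour, over 1000 hours:
r = series_reliability([1e-4, 2e-4], 1000)
print(round(r, 4))  # 0.7408
```

The calculation is well defined precisely because the failure rates describe random physical processes; no analogous rate exists for a design fault, which is the gap the essay goes on to identify.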
In the case of software, there is no physical source of failures, and so none of the reliability theory developed for hardware is relevant. We need new theories that will allow us to achieve required dependability levels, and to evaluate the actual dependability that has been achieved, when the sources of the faults that ultimately result in failure are human intellectual failures.