In the preceding chapter we introduced the syntax crystal model, demonstrating how it worked and how it could be used to generate and parse natural language strings. We showed how the model could handle many of the linguistic structures that have been used to illustrate the power of transformational grammars. We argued that the local rules of the syntax crystal were more accessible to explanation at the cognitive level. And especially important, the syntactic rules of the model use a minimum of essential features and thus are easy to map onto the conceptual structure underlying language strings.
In this chapter we are going to show how the syntax crystal model makes it easier to explain the acquisition of syntax. With very few assumptions, the model demonstrates how the information used to construct the complex hierarchical structures of syntax can be gradually acquired using a simple mediated learning principle. In advocating a model to account for syntax acquisition by learning theory principles, we are moving counter to the nativism allied with transformational theories. Therefore, we are going to divide this chapter into two parts. The first part opens the territory for a learning account of syntax acquisition by examining and challenging the view held by Chomsky and others that there are innate, specifically linguistic mechanisms for syntax. The second part discusses the acquisition of syntax using the syntax crystal model.
In this chapter we will first discuss Chomsky's criteria for the adequacy of a theory of grammar and how they are related to (1) the use of computers in linguistic research, (2) the assumption of an ideal language user, (3) the role of linguistic universals in a theory, and (4) the connection between syntax and semantics. We will look at some of the ways to compare and evaluate theories of syntax. Then different accounts of syntax will be examined and compared. In the next chapter we will discuss our own model of syntactic operations and compare it with those discussed here.
Chomsky's criteria for an adequate theory
Noam Chomsky (1957) proposed three criteria that an adequate theory of grammar should satisfy. By grammar, he meant both syntax and morphophonemics, but we will be concerned only with syntax. Theories of syntax in their present stage of development are far from providing a complete account of all language; instead they propose programs or directions for research, setting out the kinds of rules that would be necessary for a complete account. No theory actually satisfies Chomsky's three criteria, but many justify their programs in terms of the likelihood of doing so. As we shall see, the interpretation of how the criteria are to be met differs from theory to theory.
The criteria are:
A theory of grammar should provide a set of rules that can generate all and only syntactically correct strings.
The rules should also generate correct structural descriptions for each grammatical string.
This book is an attempt to answer the following questions: What are the essential features of language that permit a sentence or string of words to convey a complex idea? And what are the essential features of language users that enable them to produce and understand meaningful strings of words and to learn how to do so? The heart of these problems is syntax, and our answers constitute a new theory of syntax and syntax acquisition.
The goal of a theory of syntax
A theory of syntax must explain how someone can express a complex idea by organizing units of language into an appropriate pattern that conveys the idea, and how another person is able to identify from the language pattern not only the concepts expressed by the individual units of language but also the relationships among the concepts that make up the idea. A theory of syntax should also explain what essential properties of language and of language users allow this method of encoding to be learned.
In trying to identify the essential features of a phenomenon, a good theory tries to represent the phenomenon as simply as possible without doing injustice to the complexity it is trying to explain. Actual syntax use (and its learning) may involve many redundant operations in order to increase speed and reliability. A theory of syntax will not be directly interested in all the properties and processes that may be involved in syntax use and acquisition, but in those that must be involved – the minimally essential features for syntax to work and to be learned.
The Syntax CRYstal Parser (SCRYP) is a computer program that uses syntax crystal rules to parse sentences of a natural language. The procedures developed for the parser constitute a set of heuristics for the bottom-up, as opposed to top-down, analysis of sentence structure. We believe such procedures reflect some processes that take place in human language comprehension. This appendix will outline some advantages the parser has over other procedures as well as present a description of the program itself. An important consideration is the applicability of SCRYP to descriptions of human language performance. For this reason, we emphasize parsimony of description and structural and procedural assumptions that apply to human language performance.
Local rules and global procedures
Parsers for descriptive and interpretive grammars
The problem of constructing a parser to derive syntactic descriptions of sentences on the basis of word order and inflections is largely the problem of selecting and implementing appropriate global procedures to coordinate the operations of local rules (see Chapters 1 and 4). Various types of local rules may be used in global procedures. The rules specify the elements of syntactic form; the procedures specify their application.
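To make the division of labour concrete, here is a minimal sketch in Python of how local rules and a global procedure can be kept separate in a toy bottom-up parser. It is not SCRYP and does not use the syntax crystal rule format; the category labels, the rule table, and the function parse_bottom_up are invented for illustration. The local rules are inert data stating which pairs of adjacent categories may combine and what category results; the single global procedure merely scans for an adjacent pair licensed by some rule and merges it, repeating until one constituent remains or no rule applies.

# A hypothetical illustration, not the SCRYP program or the syntax crystal notation.
# Local rules: each entry says "these two adjacent categories may combine into this one".
LOCAL_RULES = {
    ("Det", "N"): "NP",    # "the" + "dog"  -> noun phrase
    ("V", "NP"): "VP",     # "chased" + NP  -> verb phrase
    ("NP", "VP"): "S",     # NP + VP        -> sentence
}

# A toy lexicon assigning each word a category.
LEXICON = {"the": "Det", "dog": "N", "cat": "N", "chased": "V"}

def parse_bottom_up(words):
    """Global procedure: repeatedly merge the leftmost adjacent pair licensed by a local rule."""
    constituents = [LEXICON[w] for w in words]        # start from word-level categories
    while len(constituents) > 1:
        for i in range(len(constituents) - 1):
            pair = (constituents[i], constituents[i + 1])
            if pair in LOCAL_RULES:                    # a local rule licenses this merge
                constituents[i:i + 2] = [LOCAL_RULES[pair]]
                break
        else:
            return None                                # no local rule applies: the parse fails
    return constituents[0]

print(parse_bottom_up("the dog chased the cat".split()))   # -> 'S'

Nothing in the global loop knows anything about particular categories; all of the syntactic knowledge lives in the local rule table, which is the kind of separation between local rules and global procedures described above.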
In the preceding chapter, we looked at some of the properties of language strings. We examined the relations among morphemes that are available to carry information about the relations of concepts. We considered several lines of argument converging on the conclusion that language strings have an organization beyond their sequential structure. The organization is hierarchical, that is, organized at several different levels simultaneously, and the organization can be recursive, which allows parts of the organizing principle to be used over and over again.
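As a rough, invented illustration of what "hierarchical" and "recursive" mean here (the bracketing below is generic constituent notation, not the representation developed later in the book), a string such as "the dog on the porch barked" can be grouped at several levels at once, with the same noun-phrase grouping reappearing inside a larger noun phrase that it helps to build:

# An invented, generic constituent bracketing; not the book's own notation.
# The noun-phrase pattern occurs twice, once inside the other: that reuse is the recursion.
sentence = ("S",
            ("NP",
             ("NP", "the", "dog"),
             ("PP", "on", ("NP", "the", "porch"))),
            ("VP", "barked"))

def depth(node):
    """Nesting depth: a value greater than 1 shows the organization is layered, not flat."""
    if isinstance(node, str):
        return 0
    return 1 + max(depth(child) for child in node[1:])

print(depth(sentence))   # -> 4: four levels of structure over a six-word string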
The organization does not appear explicitly in language strings, but its existence can be inferred from the strings and how they are handled by language users. No contemporary account of the psychology of language or linguistics denies such organization (although a few may ignore it, e.g., Winokur, 1976).
The organization is the basis for the syntactic code that enables language strings to carry information about the relation of concepts. To understand a string, the language user must convert the sequence of morphemes (inflected lexemes plus function words) into something from which the idea may be extracted. When expressing an idea, the process is reversed. The cognitive operations involved can be described as the syntactic mechanism of the language user. A description of and a model for this mechanism are given in Chapter 5.
In this chapter we are going to study the contents of the second box in the flow chart of Figure 1.4: underlying conceptual structure.
Here we describe a simulation, in the form of a game, of how a child learns to use language as it learns about its world. The central idea is that language is acquired in the context of nonlinguistic information: pragmatic, lexical, and syntactic information processing work together. Playing the game takes about an hour or less. In our experience, it is a very good way to convey a sense of the problems and strategies involved in language learning and teaching. Necessarily, an hour's experience cannot capture every aspect of a process that takes years to complete. The simulation has shortcomings; for example, the adults who play it already know that linguistic signals are meaningful, whereas a baby must discover this fact. Nevertheless, we highly recommend that readers experience the game. A simplified version is presented here. Modifications are easy to add for research or entertainment purposes.
The game is played by one or more teams of two players each. Competition requires at least two teams, but the game may be played by a single pair. Each team represents a family, consisting of a baby player and a parent player. Initially, the parent player is informed of the lexicon and syntax of a simple artificial language, while the baby player remains naive. The object of the game is to have the baby player learn the language and achieve competence sooner than the babies of other families.
In the preceding chapter, we described the role of syntax in language processing and the criteria to be met by an adequate theory of syntax. Then we introduced phrase structure grammar and discussed its inadequacies from the point of view of transformational theory. The syntactic mechanisms of transformational theory were then described.
Transformational theory incorporates a large number of different global operations, from the transformation rules themselves to the metarules that in turn control their application. As we discussed in Chapter 1, global operations are less satisfactory in a model than local rules. Global operations are usually expressed in general terms; to actually use them, a further set of procedures must be specified.
In a cognitive model, a global operation is an incomplete explanation. It does not explain how the disparate parts controlled by the global operation are recognized and operated on; it only states that they are. A model of a cognitive process that invokes complex global operations presupposes additional cognitive processes for the model to work. Such a model, in effect, requires a homunculus, a little person inside, who performs the additional operations that are needed to explain the cognitive processes. The model still has to explain how the homunculus operates. Explanation by homunculus is open to the kind of criticism that Miller, Galanter, and Pribram (1960) directed at Tolman's (1948) explanation of navigation behavior that posited a “map control room in the brain” and left unexplained the details of how the maps were drawn and used.
The common opinion concerning scientific knowledge and theoretical understanding – of molecules, of stars, of nuclei and electromagnetic waves – is that it is of a kind very different from our knowledge of apples, and tables, and kitchen pots and sand. Whereas theoretical knowledge can be gained only by an act of creative genius, or by diligent study of the genius of another, knowledge of the latter kind can be gained by anyone, by casual observation. Theoretical understanding, it will be said, is artificial where the latter is natural, speculative where the latter is manifest, fluid where the latter is essentially stable, and parasitic where the latter is autonomous.
That these specious contrasts are wholesale nonsense has not prevented them finding expression and approval in the bulk of this century's philosophical literature. Theoretical “knowledge” is there represented as an essentially peripheral superstructure erected on the body of human knowledge proper. This approach did promise some advantages. One could hope to give an account of the semantics of theoretical concepts by explicating the special kinds of relations they must bear to non-theoretical concepts; and one could hope to give an account of the warrant or justification of theoretical beliefs by explicating the relations they must bear to our non-theoretical knowledge. That is, taking the non-theoretical as a temporary given, one could hope to provide a successful account of theoretical understanding short of the larger business of constructing an account of human understanding in general.
The guiding conviction of this chapter is as follows: perception consists in the conceptual exploitation of the natural information contained in our sensations or sensory states. This view will be articulated more fully as we proceed, but even in this rough formulation it suggests a question: how efficient are we at exploiting this information? The answer, I shall argue, is that we are not very efficient at it, or rather, not nearly as efficient as we might be. What needs to be made clear in the topic of perception is the truly vast amount of exploitable information, contained in our own sensations, that goes blissfully unexploited by our conceptually benighted selves.
As indicated earlier, this myopia appears remediable. Our current modes of conceptual exploitation are rooted, in substantial measure, not in the nature of our perceptual environment, nor in the innate features of our psychology, but rather in the structure and content of our common language, and in the process by which each child acquires the normal use of that language. By this process each of us grows into a conformity with the current conceptual template. In large measure we learn, from others, to perceive the world as everyone else perceives it. But if this is so, then we might have learned, and may yet learn, to conceive/perceive the world in ways other than those supplied by our present culture.
Normative epistemology: the problem in perspective
A child does not need an epistemological theory. He learns with a relentless efficiency his adult incarnation will envy, but he makes no conscious use of explicit principles to guide or shape the runaway evolution of his world picture. For the most part, of course, neither does an adult. Upon being pressed for a justification or explanation of some conviction or epistemic decision, an adult may respond with the likes of ‘Well, such-and-such implies that P’, or ‘P would explain so-and-so’, or ‘P is the only serious possibility I can think of’. But in the vast majority of cases such humble remarks exhaust the speaker's awareness of whatever principles we might assume to govern his intellectual evolution. And yet in adults as well as children that evolution displays a richness and complexity that explanations of the sort just cited barely begin to penetrate.
That complex evolution, therefore, wants accounting for, as it occurs both in infants and in adults. We wish to understand in detail the concert of factors that produce and shape it. So far, our concerns will be purely theoretical (descriptive, explanatory). But our concerns do not end here. In particular, we wish to understand what factors and principles guide intellectual development in a rational man, indeed, in an ideally rational man.
The growing recognition that the analytic/synthetic distinction is as unreflected in linguistic fact as it is recalcitrant in linguistic theory has made the discussion of meaning interesting again. The familiar picture of a sharply delineable conceptual framework distinct from and presupposed by the edifice of merely empirical belief, a framework whose girders are analytic truths and whose joints are concepts rigidly defined thereby – all this must be swept away, to be replaced by the holistic and dynamic picture of the evolving network of all of a man's beliefs, beliefs no longer differentiated by any exclusive semantic credentials or unique epistemological status. The older picture must be swept away not because there are no semantically important differences between one's beliefs – of course there are – but rather because its explication of what those differences consist in is confused, mistaken, and explanatorily sterile.
The plausibility of the analytic/synthetic distinction lies principally in the fact that for certain of the sentences we accept – for example, ‘All bachelors are unmarried males’ – we find it plausible to insist that they do not admit of a denial that is consistent with our current understanding of the terms they contain. And this has suggested to many that the meaning of the terms they contain is the source or ground of the truth of such sentences.
We have before us so far a broad epistemological thesis and the outlines of a semantic theory. We must now assess the consequences of these closely related positions as they bear on the philosophy of mind. In particular, we shall see what light they throw on the nature of our self-conception generally, and on the specific knowledge one has of oneself and others, qua persons. We shall find, I think, that the illumination is considerable.
There is a further reason for turning, at this point in the essay, to the philosophy of mind. The epistemological theses advanced to this point fall well short of a comprehensive theory of knowledge, since they do not include a solution to the methodological problem. Beyond some casual mutterings about coherence, explanatory power, and the like, no attempt has been made to explicate what rationality consists in, as it bears on theoretical evolution in general. The reason is simple. I am of the opinion that this fundamental problem will require for its solution, or even for its significant advancement, a revolution in our self-conception. I must therefore establish that there is room for a revolution in our self-conception, that our self-conception is as speculative as any other. Once this has been established, we can explore the shape of the revolution with confidence.
It has been argued at length that our common-sense conceptual framework for empirical reality is in all relevant respects a theoretical framework.
This volume is descended from a paper delivered to the Western Division meetings of the Canadian Philosophical Association in 1971. That paper sketched the argument of chapter 2 and the principal thesis of chapter 5. In the interim, several intermediate versions of that material have been presented on a variety of occasions, and I should like to thank the participants, audiences, and departments involved for their kindness and critical suggestions.
The present essay is addressed simultaneously to two distinct audiences. The first audience consists of those of my professional colleagues, other academics, students, and lay readers who are less than intimately familiar with the philosophical position commonly called scientific realism. For them I have here attempted to make available in fairly short compass a coherent and comprehensive account of that position as it bears on the philosophy of perception, on the theory of meaning, on the philosophy of mind, and on systematic epistemology. The view proposed is not merely eclectic, however. The synthesis effected is novel in various respects, and the supporting arguments are for the most part novel as well. It is my earnest hope, therefore, that the discussion will be found entertaining, and valuable as well, by those of my colleagues who already share a familiarity with the philosophy of science in general and with scientific realism in particular.