The Language Instinct: How the Mind Creates Language



  How Ann Salisbury can claim that Pam Dawber’s anger at not receiving her fair share of acclaim for Mork and Mindy’s success derives from a fragile ego escapes me.

  At the point just after the word not, the letter-writer had to keep four grammatical commitments in mind: (1) not requires -ing (her anger at not receiving acclaim); (2) at requires some kind of noun or gerund (her anger at not receiving acclaim); (3) the singular subject Pam Dawber’s anger requires the verb fourteen words downstream to agree with it in number (Dawber’s anger…derives from); (4) the singular subject beginning with How requires the verb twenty-seven words downstream to agree with it in number (How…escapes me). Similarly, a reader must keep these dependencies in mind while interpreting the sentence. Now, technically speaking, one could rig up a word-chain model to handle even these sentences, as long as there is some actual limit on the number of dependencies that the speaker need keep in mind (four, say). But the degree of redundancy in the device would be absurd; for each of the thousands of combinations of dependencies, an identical chain must be duplicated inside the device. In trying to fit such a superchain in a person’s memory, one quickly runs out of brain.
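  The arithmetic behind that blowup is easy to check. Here is a back-of-the-envelope sketch in Python; the figure of ten values per dependency is my own illustrative assumption, not one from the text:

```python
# Why word-chain devices run out of brain: a device that must remember
# d pending dependencies, each of which can take one of k values, needs
# a separate duplicated chain for every combination of values.
k = 10  # values per dependency; an illustrative assumption
for d in range(1, 7):
    print(f"{d} dependencies -> {k ** d:,} duplicated chains")
```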

  The difference between the artificial combinatorial system we see in word-chain devices and the natural one we see in the human brain is summed up in a line from the Joyce Kilmer poem: “Only God can make a tree.” A sentence is not a chain but a tree. In a human grammar, words are grouped into phrases, like twigs joined in a branch. The phrase is given a name—a mental symbol—and little phrases can be joined into bigger ones.

  Take the sentence The happy boy eats ice cream. It begins with three words that hang together as a unit, the noun phrase the happy boy. In English a noun phrase (NP) is composed of a noun (N), sometimes preceded by an article or “determiner” (abbreviated “det”) and any number of adjectives (A). All this can be captured in a rule that defines what English noun phrases look like in general. In the standard notation of linguistics, an arrow means “consists of,” parentheses mean “optional,” and an asterisk means “as many of them as you want,” but I provide the rule just to show that all of its information can be captured precisely in a few symbols; you can ignore the notation and just look at the translation into ordinary words below it:

  NP → (det) A* N

  “A noun phrase consists of an optional determiner, followed by any number of adjectives, followed by a noun.”

  The rule defines an upside-down tree branch:

  Here are two other rules, one defining the English sentence (S), the other defining the predicate or verb phrase (VP); both use the NP symbol as an ingredient:

  S → NP VP

  “A sentence consists of a noun phrase followed by a verb phrase.”

  VP → V NP

  “A verb phrase consists of a verb followed by a noun phrase.”

  We now need a mental dictionary that specifies which words belong to which part-of-speech categories (noun, verb, adjective, preposition, determiner):

  N → boy, girl, dog, cat, ice cream, candy, hot dogs

  “Nouns may be drawn from the following list: boy, girl,…”

  V → eats, likes, bites

  “Verbs may be drawn from the following list: eats, likes, bites.”

  A → happy, lucky, tall

  “Adjectives may be drawn from the following list: happy, lucky, tall.”

  det → a, the, one

  “Determiners may be drawn from the following list: a, the, one.”
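  For readers who like to see rules run, here is a minimal sketch of this four-rule grammar and dictionary as a random sentence generator. The Python rendering and the expansion probabilities are my own; only the rules and the word lists come from the text:

```python
import random

# The toy phrase structure grammar from the text.
LEXICON = {
    "N": ["boy", "girl", "dog", "cat", "ice cream", "candy", "hot dogs"],
    "V": ["eats", "likes", "bites"],
    "A": ["happy", "lucky", "tall"],
    "det": ["a", "the", "one"],
}

def generate(symbol="S"):
    """Expand a symbol into a list of words, choosing at random."""
    if symbol in LEXICON:                 # a part of speech: pick a word
        return [random.choice(LEXICON[symbol])]
    if symbol == "S":                     # S -> NP VP
        return generate("NP") + generate("VP")
    if symbol == "VP":                    # VP -> V NP
        return generate("V") + generate("NP")
    if symbol == "NP":                    # NP -> (det) A* N
        words = []
        if random.random() < 0.8:         # the determiner is optional
            words += generate("det")
        while random.random() < 0.4:      # any number of adjectives
            words += generate("A")
        return words + generate("N")
    raise ValueError(f"unknown symbol: {symbol}")

print(" ".join(generate()))  # e.g. "the happy boy eats one candy"
```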

  A set of rules like the ones I have listed—a “phrase structure grammar”—defines a sentence by linking the words to branches on an inverted tree:

  The invisible superstructure holding the words in place is a powerful invention that eliminates the problems of word-chain devices. The key insight is that a tree is modular, like telephone jacks or garden hose couplers. A symbol like “NP” is like a connector or fitting of a certain shape. It allows one component (a phrase) to snap into any of several positions inside other components (larger phrases). Once a kind of phrase is defined by a rule and given its connector symbol, it never has to be defined again; the phrase can be plugged in anywhere there is a corresponding socket. For example, in the little grammar I have listed, the symbol “NP” is used both as the subject of a sentence (S → NP VP) and as the object of a verb phrase (VP → V NP). In a more realistic grammar, it would also be used as the object of a preposition (near the boy), in a possessor phrase (the boy’s bat), as an indirect object (give the boy a cookie), and in several other positions. This plug-and-socket arrangement explains how people can use the same kind of phrase in many different positions in a sentence, including:

  [The happy happy boy] eats ice cream.

  I like [the happy happy boy].

  I gave [the happy happy boy] a cookie.

  [The happy happy boy]’s cat eats ice cream.

  There is no need to learn that the adjective precedes the noun (rather than vice versa) for the subject, and then have to learn the same thing for the object, and again for the indirect object, and yet again for the possessor.
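  In the generator sketched above, this economy shows up as code reuse: the NP clause is written once and called from both the S and the VP expansions. Extending it takes one line per new socket; in this continuation of the earlier sketch (it assumes that sketch’s LEXICON and generate), the preposition list is my own illustrative addition:

```python
# Continuing the sketch above: a new rule, PP -> P NP, plugs the
# existing NP connector into a new socket; noun phrases never have
# to be redefined.
LEXICON["P"] = ["near", "with"]  # illustrative prepositions, my addition

def generate_pp():
    """PP -> P NP"""
    return generate("P") + generate("NP")

print(" ".join(generate_pp()))   # e.g. "near the happy boy"
```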

  Note, too, that the promiscuous coupling of any phrase with any slot makes grammar autonomous from our common-sense expectations involving the meanings of the words. It thus explains why we can write and appreciate grammatical nonsense. Our little grammar defines all kinds of colorless green sentences, like The happy happy candy likes the tall ice cream, as well as sentences conveying such newsworthy events as The girl bites the dog.

  Most interestingly, the labeled branches of a phrase structure tree act as an overarching memory or plan for the whole sentence. This allows nested long-distance dependencies, like if…then and either…or, to be handled with ease. All you need is a rule defining a phrase that contains a copy of the very same kind of phrase, such as:

  S → either S or S

  “A sentence can consist of the word either, followed by a sentence, followed by the word or, followed by another sentence.”

  S → if S then S

  “A sentence can consist of the word if, followed by a sentence, followed by the word then, followed by another sentence.”

  These rules embed one instance of a symbol inside another instance of the same symbol (here, a sentence inside a sentence), a neat trick—logicians call it “recursion”—for generating an infinite number of structures. The pieces of the bigger sentence are held together, in order, as a set of branches growing out of a common node. That node holds together each either with its or, each if with its then, as in the following diagram (the triangles are abbreviations for lots of underbrush that would only entangle us if shown in full):
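  Recursion, too, can be rendered as a sketch. Assuming Python and a small stock of canned simple sentences (my own illustrative choices), two recursive clauses produce arbitrarily deep either…or and if…then nestings:

```python
import random

# A minimal sketch of the recursive rules S -> either S or S and
# S -> if S then S, built on a stock of simple sentences.
SIMPLE = ["the girl eats ice cream", "the boy eats hot dogs"]

def sentence(depth=2):
    """Expand S, embedding sentences inside sentences up to `depth`."""
    if depth == 0 or random.random() < 0.5:
        return random.choice(SIMPLE)          # S -> a simple sentence
    if random.random() < 0.5:                 # S -> either S or S
        return f"either {sentence(depth - 1)} or {sentence(depth - 1)}"
    return f"if {sentence(depth - 1)} then {sentence(depth - 1)}"

print(sentence())
# e.g. "if the girl eats ice cream then the boy eats hot dogs"
```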

  There is another reason to believe that a sentence is held together by a mental tree. So far I have been talking about stringing words into a grammatical order, ignoring what they mean. But grouping words into phrases is also necessary to connect grammatical sentences with their proper meanings, chunks of mentalese. We know that the sentence shown above is about a girl, not a boy, eating ice cream, and a boy, not a girl, eating hot dogs, and we know that the boy’s snack is contingent on the girl’s, not vice versa. That is because girl and ice cream are connected inside their own phrase, as are boy and hot dogs, as are the two sentences involving the girl. With a chaining device it’s just one damn word after another, but with a phrase structure grammar the connectedness of words in the tree reflects the relatedness of ideas in mentalese. Phrase structure, then, is one solution to the engineering problem of taking an interconnected web of thoughts in the mind and encoding them as a string of words that must be uttered, one at a time, by the mouth.

  One way to see how invisible phrase structure determines meaning is to recall one of the reasons mentioned in Chapter 3 that language and thought have to be different: a particular stretch of language can correspond to two distinct thoughts. I showed you examples like Child’s Stool Is Great for Use in Garden, where the single word stool has two meanings, corresponding to two entries in the mental dictionary. But sometimes a whole sentence has two meanings, even if each individual word has only one meaning. In the movie Animal Crackers, Groucho Marx says, “I once shot an elephant in my pajamas. How he got into my pajamas I’ll never know.” Here are some similar ambiguities that accidentally appeared in newspapers:

  Yoko Ono will talk about her husband John Lennon who was killed in an interview with Barbara Walters.

  Two cars were reported stolen by the Groveton police yesterday.

  The license fee for altered dogs with a certificate will be $3 and for pets owned by senior citizens who have not been altered the fee will be $1.50.

  Tonight’s program discusses stress, exercise, nutrition, and sex with Celtic forward Scott Wedman, Dr. Ruth Westheimer, and Dick Cavett.

  We will sell gasoline to anyone in a glass container.

  For sale: Mixing bowl set designed to please a cook with round bottom for efficient beating.

  The two meanings in each sentence come from the different ways in which the words can be joined up in a tree. For example, in discuss sex with Dick Cavett, the writer put the words together according to the tree below (“PP” means prepositional phrase): sex is what is to be discussed, and it is to be discussed with Dick Cavett.

  The alternative meaning comes from our analyzing the words according to the tree at the right: the words sex with Dick Cavett form a single branch of the tree, and sex with Dick Cavett is what is to be discussed.
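  The same point can be made concrete in code. In this hedged sketch, the two readings of discuss sex with Dick Cavett are written as nested tuples standing in for the tree diagrams; the words are identical, and only the grouping differs:

```python
# Reading 1: discuss [sex] [with Dick Cavett] -- sex is what is discussed,
# and it is discussed with Cavett.
parse1 = ("VP", ("V", "discuss"),
                ("NP", "sex"),
                ("PP", ("P", "with"), ("NP", "Dick Cavett")))

# Reading 2: discuss [sex with Dick Cavett] -- that topic is discussed.
parse2 = ("VP", ("V", "discuss"),
                ("NP", ("NP", "sex"),
                       ("PP", ("P", "with"), ("NP", "Dick Cavett"))))

def leaves(tree):
    """Read the words back off a tree, left to right."""
    if isinstance(tree, str):
        return [tree]
    return [word for branch in tree[1:] for word in leaves(branch)]

assert leaves(parse1) == leaves(parse2)  # same word chain, different trees
```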

  Phrase structure, clearly, is the kind of stuff language is made of. But what I have shown you is just a toy. In the rest of this chapter I will try to explain the modern Chomskyan theory of how language works. Chomsky’s writings are “classics” in Mark Twain’s sense: something that everybody wants to have read and nobody wants to read. When I come across one of the countless popular books on mind, language, and human nature that refer to “Chomsky’s deep structure of meaning common to all human languages” (wrong in two ways, as we shall see), I know that Chomsky’s books of the last twenty-five years are sitting on a high shelf in the author’s study, their spines uncracked, their folios uncut. Many people want to have a go at speculating about the mind but have the same impatience about mastering the details of how language works that Eliza Doolittle showed to Henry Higgins in Pygmalion when she complained, “I don’t want to talk grammar. I want to talk like a lady in a flower shop.”

  For nonspecialists the reaction is even more extreme. In Shakespeare’s The Second Part of King Henry VI, the rebel Dick the Butcher speaks the well-known line “The first thing we do, let’s kill all the lawyers.” Less well known is the second thing Dick suggests they do: behead Lord Say. Why? Here is the indictment presented by the mob’s leader, Jack Cade:

  Thou hast most traitorously corrupted the youth of the realm in erecting a grammar school…. It will be proved to thy face that thou hast men about thee that usually talk of a noun and a verb, and such abominable words as no Christian ear can endure to hear.

  And who can blame the grammarphobe, when a typical passage from one of Chomsky’s technical works reads as follows?

  To summarize, we have been led to the following conclusions, on the assumption that the trace of a zero-level category must be properly governed. 1. VP is θ-marked by I. 2. Only lexical categories are L-markers, so that VP is not L-marked by I. 3. θ-government is restricted to sisterhood without the qualification (35). 4. Only the terminus of an X⁰-chain can θ-mark or Case-mark. 5. Head-to-head movement forms an A-chain. 6. SPEC-head agreement and chains involve the same indexing. 7. Chain coindexing holds of the links of an extended chain. 8. There is no accidental coindexing of I. 9. I-V coindexing is a form of head-head agreement; if it is restricted to aspectual verbs, then base-generated structures of the form (174) count as adjunction structures. 10. Possibly, a verb does not properly govern its θ-marked complement.

  All this is unfortunate. People, especially those who hold forth on the nature of mind, should be just plain curious about the code that the human species uses to speak and understand. In return, the scholars who study language for a living should see that such curiosity can be satisfied. Chomsky’s theory need not be treated by either group as a set of cabalistic incantations that only the initiated can mutter. It is a set of discoveries about the design of language that can be appreciated intuitively if one first understands the problems to which the theory provides solutions. In fact, grasping grammatical theory provides an intellectual pleasure that is rare in the social sciences. When I entered high school in the late 1960s and electives were chosen for their “relevance,” Latin underwent a steep decline in popularity (thanks to students like me, I confess). Our Latin teacher Mrs. Rillie, whose merry birthday parties for Rome failed to slow the decline, tried to persuade us that Latin grammar honed the mind with its demands for precision, logic, and consistency. (Nowadays, such arguments are more likely to come from the computer programming teachers.) Mrs. Rillie had a point, but Latin declensional paradigms are not the best way to convey the inherent beauty of grammar. The insights behind Universal Grammar are much more interesting, not only because they are more general and elegant but because they are about living minds rather than dead tongues.

  Let’s start with nouns and verbs. Your grammar teacher may have had you memorize some formula that equated parts of speech with kinds of meanings, like

  A NOUN’s the name of any thing;

  As school or garden, hoop or swing.

  VERBS tell of something being done;

  To read, count, sing, laugh, jump, or run.

  But as in most matters about language, she did not get it quite right. It is true that most names for persons, places, and things are nouns, but it is not true that most nouns are names for persons, places, or things. There are nouns with all kinds of meanings:

  the destruction of the city [an action]

  the way to San Jose [a path]

  whiteness moves downward [a quality]

  three miles along the path [a measurement in space]

  It takes three hours to solve the problem. [a measurement in time]

  Tell me the answer. [“what the answer is,” a question]

  She is a fool. [a category or kind]

  a meeting [an event]

  the square root of minus two [an abstract concept]

  He finally kicked the bucket. [no meaning at all]

  Likewise, though words for things being done, such as count and jump, are usually verbs, verbs can be other things, like mental states (know, like), possession (own, have), and abstract relations among ideas (falsify, prove).

  Conversely, a single concept, like “being interested,” can be expressed by different parts of speech:

  her interest in fungi [noun]

  Fungi are starting to interest her more and more. [verb]

  She seems interested in fungi. Fungi seem interesting to her. [adjective]

  Interestingly, the fungi grew an inch in an hour. [adverb]

  A part of speech, then, is not a kind of meaning; it is a kind of token that obeys certain formal rules, like a chess piece or a poker chip. A noun, for example, is simply a word that does nouny things; it is the kind of word that comes after an article, can have an ’s stuck onto it, and so on. There is a connection between concepts and part-of-speech categories, but it is a subtle and abstract one. When we construe an aspect of the world as something that can be identified and counted or measured and that can play a role in events, language often allows us to express that aspect as a noun, whether or not it is a physical object. For example, when we say I have three reasons for leaving, we are counting reasons as if they were objects (though of course we do not literally think that a reason can sit on a table or be kicked across a room). Similarly, when we construe some aspect of the world as an event or state involving several participants that affect one another, language often allows us to express that aspect as a verb. For example, when we say The situation justified drastic measures, we are talking about justification as if it were something the situation did, though again we know that justification is not something we can watch happening at a particular time and place. Nouns are often used for names of things, and verbs for something being done, but because the human mind can construe reality in a variety of ways, nouns and verbs are not limited to those uses.

  Now what about the phrases that group words into branches? One of the most intriguing discoveries of modern linguistics is that there appears to be a common anatomy in all phrases in all the world’s languages.

  Take the English noun phrase. A noun phrase (NP) is named after one special word, a noun, that must be inside it. The noun phrase owes most of its properties to that one noun. For example, the NP the cat in the hat refers to a kind of cat, not a kind of hat; the meaning of the word cat is the core of the meaning of the whole phrase. Similarly, the phrase fox in socks refers to a fox, not socks, and the entire phrase is singular in number (that is, we say that the fox in socks is or was here, not are or were here), because the word fox is singular in number. This special noun is called the “head” of the phrase, and the information filed with that word in memory “percolates up” to the topmost node, where it is interpreted as characterizing the phrase as a whole. The same goes for verb phrases: flying to Rio before the police catch him is an example of flying, not an example of catching, so the verb flying is called its head. Here we have the first principle of building the meaning of a phrase out of the meaning of the words inside the phrase. What the entire phrase is “about” is what its head word is about.
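  Percolation, too, can be sketched in code. In this hedged rendering (the data layout and the HEAD_OF table are my own illustrative choices), a phrase’s number feature is simply read off its head word, which is why fox in socks is singular:

```python
# Words carry features; each phrase inherits features from its head child.
fox = ("N", {"word": "fox", "number": "singular"})
socks = ("N", {"word": "socks", "number": "plural"})
in_socks = ("PP", [("P", {"word": "in"}), ("NP", [socks])])
fox_in_socks = ("NP", [fox, in_socks])       # [fox [in [socks]]]

HEAD_OF = {"NP": "N", "PP": "P", "VP": "V"}  # which child heads each phrase

def number(phrase):
    """Percolate the head's number feature up to the whole phrase."""
    label, body = phrase
    if isinstance(body, dict):               # a word: read its features
        return body.get("number")
    for child in body:
        if child[0] == HEAD_OF[label]:       # find the head child...
            return number(child)             # ...and inherit from it
    return None

print(number(fox_in_socks))  # "singular": fox, not socks, is the head
```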