The Blank Slate: The Modern Denial of Human Nature


  These two ideas—the denial of human nature, and the autonomy of culture from individual minds—were also articulated by the founder of sociology, Emile Durkheim (1858–1917), who had foreshadowed Kroeber’s doctrine of the superorganic mind:

  Every time that a social phenomenon is directly explained by a psychological phenomenon, we may be sure that the explanation is false…. The group thinks, feels, and acts quite differently from the way in which members would were they isolated…. If we begin with the individual in seeking to explain phenomena, we shall be able to understand nothing of what takes place in the group…. Individual natures are merely the indeterminate material that the social factor molds and transforms. Their contribution consists exclusively in very general attitudes, in vague and consequently plastic predispositions.30

  And he laid down a law for the social sciences that would be cited often in the century to come: “The determining cause of a social fact should be sought among the social facts preceding it and not among the states of individual consciousness.”31

  Both psychology and the other social sciences, then, denied that the minds of individual people were important, but they set out in different directions from there. Psychology banished mental entities like beliefs and desires altogether and replaced them with stimuli and responses. The other social sciences located beliefs and desires in cultures and societies rather than in the heads of individual people. The different social sciences also agreed that the contents of cognition—ideas, thoughts, plans, and so on—were really phenomena of language, overt behavior that anyone could hear and write down. (Watson proposed that “thinking” really consisted of teensy movements of the mouth and throat.) But most of all they shared a dislike of instincts and evolution. Prominent social scientists repeatedly declared the slate to be blank:

  Instincts do not create customs; customs create instincts, for the putative instincts of human beings are always learned and never native.

  —Ellsworth Faris (1927)32

  Cultural phenomena… are in no respect hereditary but are characteristically and without exception acquired.

  —George Murdock (1932)33

  Man has no nature; what he has is history.

  —José Ortega y Gasset (1935)34

  With the exception of the instinctoid reactions in infants to sudden withdrawals of support and to sudden loud noises, the human being is entirely instinctless…. Man is man because he has no instincts, because everything he is and has become he has learned, acquired, from his culture, from the man-made part of the environment, from other human beings.

  —Ashley Montagu (1973)35

  True, the metaphor of choice was no longer a scraped tablet or white paper. Durkheim had spoken of “indeterminate material,” some kind of blob that was molded or pounded into shape by culture. Perhaps the best modern metaphor is Silly Putty, the rubbery stuff that children use both to copy printed matter (like a blank slate) and to mold into desired shapes (like indeterminate material). The malleability metaphor resurfaced in statements by two of Boas’s most famous students:

  Most people are shaped to the form of their culture because of the malleability of their original endowment…. The great mass of individuals take quite readily the form that is presented to them.

  —Ruth Benedict (1934)36

  We are forced to conclude that human nature is almost unbelievably malleable, responding accurately and contrastingly to contrasting cultural conditions.

  —Margaret Mead (1935)37

  Others likened the mind to some kind of sieve:

  Much of what is commonly called “human nature” is merely culture thrown against a screen of nerves, glands, sense organs, muscles, etc.

  —Leslie White (1949)38

  Or to the raw materials for a factory:

  Human nature is the rawest, most undifferentiated of raw material.

  —Margaret Mead (1928)39

  Our ideas, our values, our acts, even our emotions, are, like our nervous system itself, cultural products—products manufactured, indeed, out of tendencies, capacities, and dispositions with which we were born, but manufactured nonetheless.

  —Clifford Geertz (1973)40

  Or to an unprogrammed computer:

  Man is the animal most desperately dependent upon such extragenetic, outside-the-skin control mechanisms, such cultural programs, for ordering his behavior.

  —Clifford Geertz (1973)41

  Or to some other amorphous entity that can have many things done to it:

  Cultural psychology is the study of the way cultural traditions and social practices regulate, express, transform, and permute the human psyche, resulting less in psychic unity for humankind than in ethnic divergences in mind, self and emotion.

  —Richard Shweder (1990)42

  The superorganic or group mind also became an article of faith in social science. Robert Lowie (another Boas student) wrote, “The principles of psychology are as incapable of accounting for the phenomena of culture as is gravitation to account for architectural styles.”43 And in case you missed its full implications, the anthropologist Leslie White spelled it out:

  Instead of regarding the individual as a First Cause, as a prime mover, as the initiator and determinant of the culture process, we now see him as a component part, and a tiny and relatively insignificant part at that, of a vast, socio-cultural system that embraces innumerable individuals at any one time and extends back into their remote past as well…. For purposes of scientific interpretation, the culture process may be regarded as a thing sui generis; culture is explainable in terms of culture.44

  In other words, we should forget about the mind of an individual person like you, that tiny and insignificant part of a vast sociocultural system. The mind that counts is the one belonging to the group, which is capable of thinking, feeling, and acting on its own.

  The doctrine of the superorganism has had an impact on modern life that extends well beyond the writings of social scientists. It underlies the tendency to reify “society” as a moral agent that can be blamed for sins as if it were a person. It drives identity politics, in which civil rights and political perquisites are allocated to groups rather than to individuals. And as we shall see in later chapters, it defined some of the great divides between major political systems in the twentieth century.

  THE BLANK SLATE was not the only part of the official theory that social scientists felt compelled to prop up. They also strove to consecrate the Noble Savage. Mead painted a Gauguinesque portrait of native peoples as peaceable, egalitarian, materially satisfied, and sexually unconflicted. Her uplifting vision of who we used to be—and therefore who we can become again—was accepted by such otherwise skeptical writers as Bertrand Russell and H. L. Mencken. Ashley Montagu (also from the Boas circle), a prominent public intellectual from the 1950s until his recent death, tirelessly invoked the doctrine of the Noble Savage to justify the quest for brotherhood and peace and to refute anyone who might think such efforts were futile. In 1950, for example, he drafted a manifesto for the newly formed UNESCO that declared, “Biological studies lend support to the ethic of universal brotherhood, for man is born with drives toward co-operation, and unless these drives are satisfied, men and nations alike fall ill.”45 With the ashes of thirty-five million victims of World War II still warm or radioactive, a reasonable person might wonder how “biological studies” could show anything of the kind. The draft was rejected, but Montagu had better luck in the decades to come, when UNESCO and many scholarly societies adopted similar resolutions.46

  More generally, social scientists saw the malleability of humans and the autonomy of culture as doctrines that might bring about the age-old dream of perfecting mankind. We are not stuck with what we don’t like about our current predicament, they argued. Nothing prevents us from changing it except a lack of will and the benighted belief that we are permanently consigned to it by biology. Many social scientists have expressed the hope of a new and improved human nature:

  I felt (and said so early) that the environmental explanation was preferable, whenever justified by the data, because it was more optimistic, holding out the hope of improvement.

  —Otto Klineberg (1928)47

  Modern sociology and modern anthropology are one in saying that the substance of culture, or civilization, is social tradition and that this social tradition is indefinitely modifiable by further learning on the part of men for happier and better ways of living together…. Thus the scientific study of institutions awakens faith in the possibility of remaking both human nature and human social life.

  —Charles Ellwood (1922)48

  Barriers in many fields of knowledge are falling below the new optimism which is that anybody can learn anything…. We have turned away from the concept of human ability as something fixed in the physiological structure, to that of a flexible and versatile mechanism subject to great improvement.

  —Robert Faris (1961)49

  Though psychology is not as politicized as some of the other social sciences, it too is sometimes driven by a Utopian vision in which changes in child-rearing and education will ameliorate social pathologies and improve human welfare. And psychological theorists sometimes try to add moral heft to arguments for connectionism or other empiricist theories with warnings about the pessimistic implications of innatist theories. They argue, for example, that innatist theories open the door to inborn differences, which could foster racism, or that the theories imply that human traits are unchangeable, which could weaken support for social programs.50

  TWENTIETH-CENTURY SOCIAL SCIENCE embraced not just the Blank Slate and the Noble Savage but the third member of the trinity, the Ghost in the Machine. The declaration that we can change what we don’t like about ourselves became a watchword of social science. But that only raises the question “Who or what is the ‘we’?” If the “we” doing the remaking are just other hunks of matter in the biological world, then any malleability of behavior we discover would be cold comfort, because we, the molders, would be biologically constrained and therefore might not mold people, or allow ourselves to be molded, in the most socially salutary way. A ghost in the machine is the ultimate liberator of human will—including the will to change society—from mechanical causation. The anthropologist Loren Eiseley made this clear when he wrote:

  The mind of man, by indetermination, by the power of choice and cultural communication, is on the verge of escape from the blind control of that deterministic world with which the Darwinists had unconsciously shackled man. The inborn characteristics laid upon him by the biological extremists have crumbled away…. Wallace saw and saw correctly, that with the rise of man the evolution of parts was to a marked degree outmoded, that mind was now the arbiter of human destiny.51

  The “Wallace” that Eiseley is referring to is Alfred Russel Wallace (1823–1913), the co-discoverer with Darwin of natural selection. Wallace parted company from Darwin by claiming that the human mind could not be explained by evolution and must have been designed by a superior intelligence. He certainly did believe that the mind of man could escape “the blind control of a deterministic world.” Wallace became a spiritualist and spent the later years of his career searching for a way to communicate with the souls of the dead.

  The social scientists who believed in an absolute separation of culture from biology may not have literally believed in a spook haunting the brain. Some used the analogy of the difference between living and nonliving matter. Kroeber wrote: “The dawn of the social… is not a link in any chain, not a step in a path, but a leap to another plane…. [It is like] the first occurrence of life in the hitherto lifeless universe…. From this moment on there should be two worlds in place of one.”52 And Lowie insisted that it was “not mysticism, but sound scientific method” to say that culture was “sui generis” and could be explained only by culture, because everyone knows that in biology a living cell can come only from another living cell.53

  At the time that Kroeber and Lowie wrote, they had biology on their side. Many biologists still thought that living things were animated by a special essence, an élan vital, and could not be reduced to inanimate matter. A 1931 history of biology, referring to genetics as it was then understood, said, “Thus the last of the biological theories leaves us where we first started, in the presence of a power called life or psyche which is not only of its own kind but unique in each and all of its exhibitions.”54 In the next chapter we will see that the analogy between the autonomy of culture and the autonomy of life would prove to be more telling than these social scientists realized.

  Chapter 3

  The Last Wall to Fall

  IN 1755 SAMUEL JOHNSON wrote that his dictionary should not be expected to “change sublunary nature, and clear the world at once from folly, vanity, and affectation.” Few people today are familiar with the lovely word sublunary, literally “below the moon.” It alludes to the ancient belief in a strict division between the pristine, lawful, unchanging cosmos above and our grubby, chaotic, fickle Earth below. The division was already obsolete when Johnson used the word: Newton had shown that the same force that pulled an apple toward the ground kept the moon in its celestial orbit.

  Newton’s theory that a single set of laws governed the motions of all objects in the universe was the first event in one of the great developments in human understanding: the unification of knowledge, which the biologist E. O. Wilson has termed consilience.1 Newton’s breaching of the wall between the terrestrial and the celestial was followed by a collapse of the once equally firm (and now equally forgotten) wall between the creative past and the static present. That happened when Charles Lyell showed that the Earth was sculpted in the past by forces we see today (such as earthquakes and erosion) acting over immense spans of time.

  The living and nonliving, too, no longer occupy different realms. In 1628 William Harvey showed that the human body is a machine that runs by hydraulics and other mechanical principles. In 1828 Friedrich Wöhler showed that the stuff of life is not a magical, pulsating gel but ordinary compounds following the laws of chemistry. Charles Darwin showed how the astonishing diversity of life and its ubiquitous signs of design could arise from the physical process of natural selection among replicators. Gregor Mendel, and then James Watson and Francis Crick, showed how replication itself could be understood in physical terms.

  The unification of our understanding of life with our understanding of matter and energy was the greatest scientific achievement of the second half of the twentieth century. One of its many consequences was to pull the rug out from under social scientists like Kroeber and Lowie who had invoked the “sound scientific method” of placing the living and nonliving in parallel universes. We now know that cells did not always come from other cells and that the emergence of life did not create a second world where before there was just one. Cells evolved from simpler replicating molecules, a nonliving part of the physical world, and may be understood as collections of molecular machinery—fantastically complicated machinery, of course, but machinery nonetheless.

  This leaves one wall standing in the landscape of knowledge, the one that twentieth-century social scientists guarded so jealously. It divides matter from mind, the material from the spiritual, the physical from the mental, biology from culture, nature from society, and the sciences from the social sciences, humanities, and arts. The division was built into each of the doctrines of the official theory: the blank slate given by biology versus the contents inscribed by experience and culture, the nobility of the savage in the state of nature versus the corruption of social institutions, the machine following inescapable laws versus the ghost that is free to choose and to improve the human condition.

  But this wall, too, is falling. New ideas from four frontiers of knowledge—the sciences of mind, brain, genes, and evolution—are breaching the wall with a new understanding of human nature. In this chapter I will show how they are filling in the blank slate, declassing the noble savage, and exorcising the ghost in the machine. In the following chapter I will show that this new conception of human nature, connected to biology from below, can in turn be connected to the humanities and social sciences above. That new conception can give the phenomena of culture their due without segregating them into a parallel universe.

  THE FIRST BRIDGE between biology and culture is the science of mind, cognitive science.2 The concept of mind has been perplexing for as long as people have reflected on their thoughts and feelings. The very idea has spawned paradoxes, superstitions, and bizarre theories in every period and culture. One can almost sympathize with the behaviorists and social constructionists of the first half of the twentieth century, who looked on minds as enigmas or conceptual traps that were best avoided in favor of overt behavior or the traits of a culture.

  But beginning in the 1950s with the cognitive revolution, all that changed. It is now possible to make sense of mental processes and even to study them in the lab. And with a firmer grasp on the concept of mind, we can see that many tenets of the Blank Slate that once seemed appealing are now unnecessary or even incoherent. Here are five ideas from the cognitive revolution that have revamped how we think and talk about minds.

  The first idea: The mental world can be grounded in the physical world by the concepts of information, computation, and feedback. A great divide between mind and matter has always seemed natural because behavior appears to have a different kind of trigger than other physical events. Ordinary events have causes, it seems, but human behavior has reasons. I once participated in a BBC television debate on whether “science can explain human behavior.” Arguing against the resolution was a philosopher who asked how we might explain why someone was put in jail. Say it was for inciting racial hatred. The intention, the hatred, and even the prison, she said, cannot be described in the language of physics. There is simply no way to define “hatred” or “jail” in terms of the movements of particles. Explanations of behavior are like narratives, she argued, couched in the intentions of actors—a plane completely separate from natural science. Or take a simpler example. How might we explain why Rex just walked over to the phone? We would not say that phone-shaped stimuli caused Rex’s limbs to swing in certain arcs. Rather, we might say that he wanted to speak to his friend Cecile and knew that Cecile was home. No explanation has as much predictive power as that one. If Rex was no longer on speaking terms with Cecile, or if he remembered that Cecile was out bowling that night, his body would not have risen off the couch.