The Blank Slate: The Modern Denial of Human Nature



  This sea change included a revolution in the treatment of human nature by scientists and scholars. Academics were swept along by the changing attitudes to race and sex, but they also helped to direct the tide by holding forth on human nature in books and magazines and by lending their expertise to government agencies. The prevailing theories of mind were refashioned to make racism and sexism as untenable as possible. The doctrine of the Blank Slate became entrenched in intellectual life in a form that has been called the Standard Social Science Model or social constructionism.5 The model is now second nature to people and few are aware of the history behind it.6 Carl Degler, the foremost historian of this revolution, sums it up this way:

  What the available evidence does seem to show is that ideology or a philosophical belief that the world could be a freer and more just place played a large part in the shift from biology to culture. Science, or at least certain scientific principles or innovative scholarship also played a role in the transformation, but only a limited one. The main impetus came from the will to establish a social order in which innate and immutable forces of biology played no role in accounting for the behavior of social groups.7

  The takeover of intellectual life by the Blank Slate followed different paths in psychology and in the other social sciences, but they were propelled by the same historical events and progressive ideology. By the second and third decades of the twentieth century, stereotypes of women and ethnic groups were starting to look silly. Waves of immigrants from southern and eastern Europe, including many Jews, were filling the cities and climbing the social ladder. African Americans had taken advantage of the new “Negro colleges,” had migrated northward, and had begun the Harlem Renaissance. The graduates of flourishing women’s colleges helped launch the first wave of feminism. For the first time not all professors and students were white Anglo-Saxon Protestant males. To say that this sliver of humanity was constitutionally superior had not only become offensive but went against what people could see with their own eyes. The social sciences in particular were attracting women, Jews, Asians, and African Americans, some of whom became influential thinkers.

  Many of the pressing social problems of the first decades of the twentieth century concerned the less fortunate members of these groups. Should more immigrants be let in, and if so, from which countries? Once here, should they be encouraged to assimilate, and if so, how? Should women be given equal political rights and economic opportunities? Should blacks and whites be integrated? Other challenges were posed by children.8 Education had become compulsory and a responsibility of the state. As the cities teemed and family ties loosened, troubled and troublesome children became everyone’s problem, and new institutions were invented to deal with them, such as kindergartens, orphanages, reform schools, fresh-air camps, humane societies, and boys’ and girls’ clubs. Child development was suddenly on the front burner. These social challenges were not going to go away, and the most humane assumption was that all human beings had an equal potential to prosper if they were given the right upbringing and opportunities. Many social scientists saw it as their job to reinforce that assumption.

  MODERN PSYCHOLOGICAL THEORY, as every introductory textbook makes clear, has roots in John Locke and other Enlightenment thinkers. For Locke the Blank Slate was a weapon against the church and tyrannical monarchs, but these threats had subsided in the English-speaking world by the nineteenth century. Locke’s intellectual heir John Stuart Mill (1806–1873) was perhaps the first to apply his blank-slate psychology to political concerns we recognize today. He was an early supporter of women’s suffrage, compulsory education, and the improvement of the conditions of the lower classes. This interacted with his stands in psychology and philosophy, as he explained in his autobiography:

  I have long felt that the prevailing tendency to regard all the marked distinctions of human character as innate, and in the main indelible, and to ignore the irresistible proofs that by far the greater part of those differences, whether between individuals, races, or sexes, are such as not only might but naturally would be produced by differences in circumstances, is one of the chief hindrances to the rational treatment of great social questions, and one of the greatest stumbling blocks to human improvement…. [This tendency is] so agreeable to human indolence, as well as to conservative interests generally, that unless attacked at the very root, it is sure to be carried to even a greater length than is really justified by the more moderate forms of intuitional philosophy.9

  By “intuitional philosophy” Mill was referring to Continental intellectuals who maintained (among other things) that the categories of reason were innate. Mill wanted to attack their theory of psychology at the root to combat what he thought were its conservative social implications. He refined a theory of learning called associationism (previously formulated by Locke) that tried to explain human intelligence without granting it any innate organization. According to this theory, the blank slate is inscribed with sensations, which Locke called “ideas” and modern psychologists call “features.” Ideas that repeatedly appear in succession (such as the redness, roundness, and sweetness of an apple) become associated, so that any one of them can call to mind the others. And similar objects in the world activate overlapping sets of ideas in the mind. For example, after many dogs present themselves to the senses, the features that they share (fur, barking, four legs, and so on) hang together to stand for the category “dog.”
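The associative mechanism described above can be given a modern computational gloss. The following is a minimal illustrative sketch (mine, not Locke's or Mill's formalism): features that repeatedly co-occur in experience acquire associative strength, so that any one of them can later "call to mind" the others. The feature names and the threshold are arbitrary choices for the example.

```python
# A minimal sketch of associationism: features that co-occur in
# experience acquire associative strength, so activating one feature
# retrieves its strongly associated partners.

features = ["fur", "barking", "four_legs", "redness", "sweetness"]

# Co-occurrence counts play the role of associative strength.
weights = {(a, b): 0 for a in features for b in features if a != b}

def observe(present):
    """Hebbian-style update: strengthen links between co-active features."""
    for a in present:
        for b in present:
            if a != b:
                weights[(a, b)] += 1

# Many dogs present themselves to the senses...
for _ in range(10):
    observe({"fur", "barking", "four_legs"})
# ...and a single apple.
observe({"redness", "sweetness"})

def recall(cue, threshold=5):
    """Any one feature calls to mind the others associated with it."""
    return {b for (a, b), w in weights.items() if a == cue and w >= threshold}

print(recall("fur"))  # the other "dog" features hang together
```

After repeated exposures, cueing with "fur" retrieves "barking" and "four_legs" but not the apple features, whose single co-occurrence falls below threshold. Note that the category "dog" is nowhere represented explicitly; it exists only as a cluster of mutual associations, which is exactly the theory's point.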

  The associationism of Locke and Mill has been recognizable in psychology ever since. It became the core of most models of learning, especially in the approach called behaviorism, which dominated psychology from the 1920s to the 1960s. The founder of behaviorism, John B. Watson (1878–1958), wrote one of the century’s most famous pronouncements of the Blank Slate:

  Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I’ll guarantee to take any one at random and train him to become any type of specialist I might select—doctor, lawyer, artist, merchant-chief, and yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors.10

  In behaviorism, an infant’s talents and abilities didn’t matter because there was no such thing as a talent or an ability. Watson had banned them from psychology, together with other contents of the mind, such as ideas, beliefs, desires, and feelings. They were subjective and unmeasurable, he said, and unfit for science, which studies only objective and measurable things. To a behaviorist, the only legitimate topic for psychology is overt behavior and how it is controlled by the present and past environment. (There is an old joke in psychology: What does a behaviorist say after making love? “It was good for you; how was it for me?”)

  Locke’s “ideas” had been replaced by “stimuli” and “responses,” but his laws of association survived as laws of conditioning. A response can be associated with a new stimulus, as when Watson presented a baby with a white rat and then clanged a hammer against an iron bar, allegedly making the baby associate fear with fur. And a response could be associated with a reward, as when a cat in a box eventually learned that pulling a string opened a door and allowed it to escape. In these cases an experimenter set up a contingency between a stimulus and another stimulus or between a response and a reward. In a natural environment, said the behaviorists, these contingencies are part of the causal texture of the world, and they inexorably shape the behavior of organisms, including humans.

  Among the casualties of behaviorist minimalism was the rich psychology of William James (1842–1910). James had been inspired by Darwin’s argument that perception, cognition, and emotion, like physical organs, had evolved as biological adaptations. James invoked the notion of instinct to explain the preferences of humans, not just those of animals, and he posited numerous mechanisms in his theory of mental life, including short-term and long-term memory. But with the advent of behaviorism they all joined the index of forbidden concepts. The psychologist J. R. Kantor wrote in 1923: “Brief is the answer to the question as to what is the relationship between social psychology and instincts. Plainly, there is no relationship.”11 Even sexual desire was redefined as a conditioned response. The psychologist Zing Yang Kuo wrote in 1929:

  Behavior is not a manifestation of hereditary factors, nor can it be expressed in terms of heredity. [It is] a passive and forced movement mechanically and solely determined by the structural pattern of the organism and the nature of environmental forces…. All our sexual appetites are the result of social stimulation. The organism possesses no ready-made reaction to the other sex, any more than it possesses innate ideas.12

  Behaviorists believed that behavior could be understood independently of the rest of biology, without attention to the genetic makeup of the animal or the evolutionary history of the species. Psychology came to consist of the study of learning in laboratory animals. B. F. Skinner (1904–1990), the most famous psychologist in the middle decades of the twentieth century, wrote a book called The Behavior of Organisms in which the only organisms were rats and pigeons and the only behavior was lever pressing and key pecking. It took a trip to the circus to remind psychologists that species and their instincts mattered after all. In an article called “The Misbehavior of Organisms,” Skinner’s students Keller and Marian Breland reported that when they tried to use his techniques to train animals to insert poker chips into vending machines, the chickens pecked the chips, the raccoons washed them, and the pigs tried to root them with their snouts.13 And behaviorists were as hostile to the brain as they were to genetics. As late as 1974, Skinner wrote that studying the brain was just another misguided quest to find the causes of behavior inside the organism rather than out in the world.14

  Behaviorism not only took over psychology but infiltrated the public consciousness. Watson wrote an influential childrearing manual recommending that parents establish rigid feeding schedules for their children and give them a minimum of attention and love. If you comfort a crying child, he wrote, you will reward him for crying and thereby increase the frequency of crying behavior. (Benjamin Spock’s Baby and Child Care, first published in 1946 and famous for recommending indulgence toward children, was in part a reaction to Watson.) Skinner wrote several bestsellers arguing that harmful behavior is neither instinctive nor freely chosen but inadvertently conditioned. If we turned society into a big Skinner box and controlled behavior deliberately rather than haphazardly, we could eliminate aggression, overpopulation, crowding, pollution, and inequality, and thereby attain utopia.15 The noble savage became the noble pigeon.

  Strict behaviorism is pretty much dead in psychology, but many of its attitudes live on. Associationism is the learning theory assumed by many mathematical models and neural network simulations of learning.16 Many neuroscientists equate learning with the forming of associations, and look for an associative bond in the physiology of neurons and synapses, ignoring other kinds of computation that might implement learning in the brain.17 (For example, storing the value of a variable in the brain, as in “x = 3,” is a critical computational step in navigating and foraging, which are highly developed talents of animals in the wild. But this kind of learning cannot be reduced to the formation of associations, and so it has been ignored in neuroscience.) Psychologists and neuroscientists still treat organisms interchangeably, seldom asking whether a convenient laboratory animal (a rat, a cat, a monkey) is like or unlike humans in crucial ways.18 Until recently, psychology ignored the content of beliefs and emotions and the possibility that the mind had evolved to treat biologically important categories in different ways.19 Theories of memory and reasoning didn’t distinguish thoughts about people from thoughts about rocks or houses. Theories of emotion didn’t distinguish fear from anger, jealousy, or love.20 Theories of social relations didn’t distinguish among family, friends, enemies, and strangers.21 Indeed, the topics in psychology that most interest laypeople—love, hate, work, play, food, sex, status, dominance, jealousy, friendship, religion, art—are almost completely absent from psychology textbooks.
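The contrast drawn above, between forming associations and storing the value of a variable, can be made concrete with dead reckoning, the navigational computation the paragraph alludes to. This sketch is my illustration under simple assumptions (movement on a plane, headings in degrees); the point is that the animal must store and continuously update variables for its position, a computation that no stimulus-stimulus bond can perform.

```python
# A sketch of dead reckoning (path integration) in a foraging animal:
# position variables are stored and updated on every step, then used
# to compute a direct vector home.

import math

def dead_reckon(moves):
    """Integrate (heading_degrees, distance) steps into a stored position."""
    x, y = 0.0, 0.0  # stored variables, updated on every step
    for heading, dist in moves:
        x += dist * math.cos(math.radians(heading))
        y += dist * math.sin(math.radians(heading))
    return x, y

# After an outbound wander, the straight-line distance home is available
# even though no landmark along the route was ever associated with "home".
x, y = dead_reckon([(0, 3), (90, 4)])
home_distance = math.hypot(x, y)
print(round(home_distance, 1))  # → 5.0
```

Desert ants and other foragers behave as if they perform exactly this kind of computation, which is why reducing all learning to association leaves such abilities unexplained.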

  One of the major documents of late twentieth-century psychology was the two-volume Parallel Distributed Processing by David Rumelhart, James McClelland, and their collaborators, which presented a style of neural network modeling called connectionism.22 Rumelhart and McClelland argued that generic associationist networks, subjected to massive amounts of training, could explain all of cognition. They realized that this theory left them without a good answer to the question “Why are people smarter than rats?” Here is their answer:

  Given all of the above, the question does seem a bit puzzling…. People have much more cortex than rats do or even than other primates do; in particular they have very much more… brain structure not dedicated to input/output—and presumably, this extra cortex is strategically placed in the brain to subserve just those functions that differentiate people from rats or even apes….

  But there must be another aspect to the difference between rats and people as well. This is that the human environment includes other people and the cultural devices that they have developed to organize their thinking processes.23

  Humans, then, are just rats with bigger blank slates, plus something called “cultural devices.” And that brings us to the other half of the twentieth-century revolution in social science.

  He’s so unhip, when you say “Dylan,”

  He thinks you’re talkin’ about Dylan Thomas (whoever he was).

  The man ain’t got no culture.

  —Simon and Garfunkel

  The word culture used to refer to exalted genres of entertainment, such as poetry, opera, and ballet. The other familiar sense—“the totality of socially transmitted behavior patterns, arts, beliefs, institutions, and all other products of human work and thought”—is only a century old. This change in the English language is just one of the legacies of the father of modern anthropology, Franz Boas (1858–1942).

  The ideas of Boas, like the ideas of the major thinkers in psychology, were rooted in the empiricist philosophers of the Enlightenment, in this case George Berkeley (1685–1753). Berkeley formulated the theory of idealism, the notion that ideas, not bodies and other hunks of matter, are the ultimate constituents of reality. After twists and turns that are too convoluted to recount here, idealism became influential among nineteenth-century German thinkers. It was embraced by the young Boas, a German Jew from a secular, liberal family.

  Idealism allowed Boas to lay a new intellectual foundation for egalitarianism. The differences among human races and ethnic groups, he proposed, come not from their physical constitution but from their culture, a system of ideas and values spread by language and other forms of social behavior. Peoples differ because their cultures differ. Indeed, that is how we should refer to them: the Eskimo culture or the Jewish culture, not the Eskimo race or the Jewish race. The idea that minds are shaped by culture served as a bulwark against racism and was the theory one ought to prefer on moral grounds. Boas wrote, “I claim that, unless the contrary can be proved, we must assume that all complex activities are socially determined, not hereditary.”24

  Boas’s case was not just a moral injunction; it was rooted in real discoveries. Boas studied native peoples, immigrants, and children in orphanages to prove that all groups of humans had equal potential. Turning Jespersen on his head, Boas showed that the languages of primitive peoples were not simpler than those of Europeans; they were just different. Eskimos’ difficulty in discriminating the sounds of our language, for example, is matched by our difficulty in discriminating the sounds of theirs. True, many non-Western languages lack the means to express certain abstract concepts. They may have no words for numbers higher than three, for example, or no word for goodness in general as opposed to the goodness of a particular person. But those limitations simply reflect the daily needs of those people as they live their lives, not an infirmity in their mental abilities. As in the story of Socrates drawing abstract philosophical concepts out of a slave boy, Boas showed that he could elicit new word forms for abstract concepts like “goodness” and “pity” out of a Kwakiutl native from the Pacific Northwest. He also observed that when native peoples come into contact with civilization and acquire things that have to be counted, they quickly adopt a full-blown counting system.25

  For all his emphasis on culture, Boas was not a relativist who believed that all cultures are equivalent, nor was he an empiricist who believed in the Blank Slate. He considered European civilization superior to tribal cultures, insisting only that all peoples were capable of achieving it. He did not deny that there might be a universal human nature, or that there might be differences among people within an ethnic group. What mattered to him was the idea that all ethnic groups are endowed with the same basic mental abilities.26 Boas was right about this, and today it is accepted by virtually all scholars and scientists.

  But Boas had created a monster. His students came to dominate American social science, and each generation outdid the previous one in its sweeping pronouncements. Boas’s students insisted not just that differences among ethnic groups must be explained in terms of culture but that every aspect of human existence must be explained in terms of culture. For example, Boas had favored social explanations unless they were disproven, but his student Alfred Kroeber favored them regardless of the evidence. “Heredity,” he wrote, “cannot be allowed to have acted any part in history.”27 Instead, the chain of events shaping a people “involves the absolute conditioning of historical events by other historical events.”28

  Kroeber did not just deny that social behavior could be explained by innate properties of minds. He denied that it could be explained by any properties of minds. A culture, he wrote, is superorganic—it floats in its own universe, free of the flesh and blood of actual men and women: “Civilization is not mental action but a body or stream of products of mental exercise…. Mentality relates to the individual. The social or cultural, on the other hand, is in its essence non-individual. Civilization as such begins only where the individual ends.”29