Reposted from The Edge.
Towards Understanding Human Nature
Every few years a book is published that commands our attention and causes us to consider questions that challenge our basic assumptions about ourselves. This month marks the publication of such a book, The Blank Slate: The Modern Denial of Human Nature by MIT research psychologist Steven Pinker.
Pinker is a unifier, someone who ties a lot of big ideas together. He has studied visual cognition and language acquisition in the laboratory, and was one of the first to develop computational models of how children learn the words and grammar of their first language. He has merged Chomskyan ideas about an innate language faculty with the Darwinian theory of adaptation and natural selection. Pinker also wrote one of the most influential critiques of neural-network models of the mind.
His book The Language Instinct discussed all aspects of language in a unified, Darwinian framework, and in How the Mind Works he did the same for the rest of the mind, explaining “what the mind is, how it evolved, and how it allows us to see, think, feel, laugh, interact, enjoy the arts, and ponder the mysteries of life.”
In The Blank Slate, he notes “that there is a quasi-religious theory of human nature that is prevalent among pundits and intellectuals, which includes both empirical assumptions about how the mind works and a set of values that people hang on those assumptions. The theory has three parts”.
One is the doctrine of “the blank slate”: that we have no inherent talents or temperaments, because the mind is shaped completely by the environment—parenting, culture, and society.
The second is “the noble savage”: that evil motives are not inherent to people but come from corrupting social institutions.
The third is “the ghost in the machine”, that the most important part of us is somehow independent of our biology, so that our ability to have experiences and make choices can’t be explained by our physiological makeup and evolutionary history.
These three ideas are increasingly being challenged by the sciences of the mind, brain, genes, and evolution,” he says, “but they are held as much for their moral and political uplift as for any empirical rationale. People think that these doctrines are preferable on moral grounds and that the alternative is forbidden territory that we should avoid at all costs”.
An Interview with Steven Pinker
EDGE: How did you go from being an up-and-coming young Mandarin in the Cognitive Science department at MIT to a radical thinker attempting to overthrow the conventional wisdom regarding human nature?
STEVEN PINKER: I don’t consider myself to be that radical a thinker. My opinions about human nature are shared by many psychologists, linguists, and biologists, not to mention philosophers and scholars going back centuries. The connections I draw between human nature and political systems in my new book, for example, were prefigured in the debates during the Enlightenment and during the framing of the American Constitution. Madison, for example, asked “What is government itself but the greatest of all reflections on human nature?” People today sometimes get uncomfortable with empirical claims that seem to clash with their political assumptions, often because they haven’t given much thought to the connections. But a conception of human nature, and its connections to other fields such as politics and the arts, have been there from time immemorial.
EDGE: What questions are you asking yourself, and what do you hope to accomplish by going after the intellectual establishment in terms of their denial of human nature?
PINKER: The main question is: “Why are empirical questions about how the mind works so weighted down with political and moral and emotional baggage? Why do people believe that there are dangerous implications of the idea that the mind is a product of the brain, that the brain is organized in part by the genome, and that the genome was shaped by natural selection?” This idea has been met with demonstrations, denunciations, picketings, and comparisons to Nazism, both from the right and from the left. And these reactions affect both the day-to-day conduct of science and the public appreciation of the science. By exploring the political and moral colorings of discoveries about what makes us tick, we can have a more honest science and a less fearful intellectual milieu.
EDGE: Why do we need to assuage people’s fears? What’s the matter with telling the truth?
PINKER: It’s harder to find the truth if certain factual hypotheses are third rails—touch them and die. A clear example is research on parenting. Hundreds of studies have measured correlations between the practices of parents and the way their children turn out. For example, parents who talk a lot to their children have kids with better language skills, parents who spank have children who grow up to be violent, parents who are neither too authoritarian nor too lenient have children who are well-adjusted, and so on. Most of the parenting expert industry and a lot of government policy turn these correlations into advice to parents, and blame the parents when children don’t turn out as they would have liked. But correlation does not imply causation. Parents provide their children with genes as well as an environment, so the fact that talkative parents have kids with good language skills could simply mean that the same genes that make parents talkative make children articulate. Until those studies are replicated with adopted children, who don’t get their genes from the people who bring them up, we don’t know whether the correlations reflect the effects of parenting, the effects of shared genes, or some mixture. But in most cases even the possibility that the correlations reflect shared genes is taboo. In developmental psychology it’s considered impolite even to mention it, let alone test it.
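The confound Pinker describes can be made concrete with a toy simulation (my illustration, not from the interview): give parent talkativeness and child language skill a shared genetic cause and no direct parenting effect at all, and a positive parent–child correlation still appears.

```python
import random
import statistics

def pearson(xs, ys):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
n = 20000
talk, skill = [], []
for _ in range(n):
    g_mother = random.gauss(0, 1)   # mother's "verbal" genotype
    g_father = random.gauss(0, 1)
    # Child inherits half of each parent's genotype, plus variation.
    g_child = (g_mother + g_father) / 2 + random.gauss(0, 0.5)
    # Each trait = own genes + independent noise.
    # Note: there is NO direct effect of parenting in this model.
    talk.append(g_mother + random.gauss(0, 1))
    skill.append(g_child + random.gauss(0, 1))

r = pearson(talk, skill)
print(f"parent-child correlation with zero parenting effect: r = {r:.2f}")
```

A study that measured only these two variables would find a reliable correlation and might conclude that talkative parenting causes articulate children, which is exactly why the adoption designs Pinker mentions are needed to separate the pathways.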
PINKER: Most intellectuals today have a phobia of any explanation of the mind that invokes genetics. They’re afraid of four things.
First there is a fear of inequality. The great appeal of the doctrine that the mind is a blank slate is the simple mathematical fact that zero equals zero. If we all start out blank, then no one can have more stuff written on his slate than anyone else. Whereas if we come into the world endowed with a rich set of mental faculties, they could work differently, or better or worse, in some people than in others. The fear is that this would open the door to discrimination, oppression, or eugenics, or even slavery and genocide.
Of course this is all a non sequitur. As many political writers have pointed out, commitment to political equality is not an empirical claim that people are clones. It’s a moral claim that in certain spheres we judge people as individuals, and don’t take into account the statistical average of the groups that they belong to. It’s also a recognition that however much people might vary, they have certain things in common by virtue of their common human nature. No one likes to be humiliated or oppressed or enslaved or deprived. Political equality consists of recognizing, as the Declaration of Independence says, that people have certain inalienable rights, namely life, liberty, and the pursuit of happiness. Recognizing those rights is not the same thing as believing that people are indistinguishable in every respect.
The second fear is the fear of imperfectability. If people are innately saddled with certain sins and flaws, like selfishness, prejudice, short-sightedness, and self-deception, then political reform would seem to be a waste of time. Why try to make the world a better place if people are rotten to the core and will just screw it up no matter what you do? Again, this is a faulty argument. We know that there can be social improvement because we know that there has been social improvement, such as the end of slavery, torture, blood feuds, despotism, and the ownership of women in Western democracies. Social change can take place, even with a fixed human nature, because the mind is a complex system of many parts. Even if we do have some motives that tempt us to do awful things, we have other motives that can counteract them. We can figure out ways to pit one human desire against another, and thereby improve our condition, in the same way we manipulate physical and biological laws—rather than denying they exist—to improve our physical condition. We combat disease, we keep out the weather, we grow more crops, and we can jigger with our social arrangements as well.
A good example is the invention of democratic government. As Madison argued, by instituting checks and balances in a political system, one person’s ambition counteracts another’s. It’s not that we have bred or socialized a new human being who’s free of ambition. We’ve just developed a system in which these ambitions are kept under control.
Another reason that human nature doesn’t rule out social progress is that many features of human nature have free parameters. This has long been recognized in the case of language, where some languages use the mirror-image of the phrase order patterns found in English but otherwise work by the same logic. Our moral sense may have a free parameter as well. People in all cultures have an ability to respect and sympathize with other people. The question is, with which other people? The default setting of our moral sense may be to sympathize only with members of our clan or village. Over the course of history, a knob or a slider has been adjusted so that a larger and larger portion of humanity is admitted into the circle of people whose interests we consider as comparable to our own. From the village or clan the moral circle has been expanded to the tribe, the nation, and most recently to all of humanity, as in the Universal Declaration of Human Rights. It’s an idea that came from the philosopher Peter Singer in his book The Expanding Circle. It’s an example of how we can enjoy social improvement and moral progress even if we are fitted with certain faculties, as long as those faculties can respond to inputs. In the case of the moral sense the relevant inputs may be a cosmopolitan awareness of history and the narratives of other peoples, which allow us to project ourselves into the experiences of people who might otherwise be treated as obstacles or enemies.
The third fear is a fear of determinism: that we will no longer be able to hold people responsible for their behavior because they can always blame it on their brain or their genes or their evolutionary history—the evolutionary-urge or killer-gene defense. The fear is misplaced for two reasons. One is that the silliest excuses for bad behavior have in fact invoked the environment, rather than biology, anyway—such as the abuse excuse that got the Menendez brothers off the hook in their first trial, the “black rage” defense that was used to try to exonerate the Long Island Railroad gunman, the “pornography made me do it” defense that some rapists have tried. If there’s a threat to responsibility it doesn’t come from biological determinism but from any determinism, including childhood upbringing, mass media, social conditioning, and so on.
But none of these should be taken seriously in the first place. Even if there are parts of the brain that compel people to do things for various reasons, there are other parts of the brain that respond to the legal and social contingencies that we call “holding someone responsible for their behavior.” For example, if I rob a liquor store, I’ll get thrown in jail, or if I cheat on my spouse my friends and relatives and neighbors will think that I’m a boorish cad and will refuse to have anything to do with me. By holding people responsible for their actions we are implementing contingencies that can affect parts of the brain and can lead people to inhibit what they would otherwise do. There’s no reason that we should give up that lever on people’s behavior—namely, the inhibition systems of the brain—just because we’re coming to understand more about the temptation systems.
The final fear is the fear of nihilism. If it can be shown that all of our motives and values are products of the physiology of the brain, which in turn was shaped by the forces of evolution, then they would in some sense be shams, without objective reality. I wouldn’t really be loving my child; all I would be doing is selfishly propagating my genes. Flowers and butterflies and works of art are not truly beautiful; my brain just evolved to give me a pleasant sensation when a certain pattern of light hits my retina. The fear is that biology will debunk all that we hold sacred.
This fear is based on a confusion between two very different ways to explain behavior. What biologists call a “proximate” explanation refers to what is meaningful to me given the brain that I have. An “ultimate” explanation refers to the evolutionary processes that gave me a brain with the ability to have those thoughts and feelings. Yes, evolution (the ultimate explanation for our minds) is a short-sighted selfish process in which genes are selected for their ability to maximize the number of copies of themselves. But that doesn’t mean that we are selfish and short-sighted, at least not all the time. There’s nothing that prevents the selfish, amoral process of natural selection from evolving a big-brained social organism with a complex moral sense. There’s an old saying that people who appreciate legislation and sausages should not see them being made. That’s a bit like human values—knowing how they were made can be misleading if you don’t think carefully about the process. Selfish genes don’t necessarily build a selfish organism.
EDGE: So if intellectuals are afraid of human nature, what do they believe instead? What are some of the indications that we are in denial? What are some of the prevalent myths?
PINKER: I think that there is a quasi-religious theory of human nature that is prevalent among pundits and intellectuals, which includes both empirical assumptions about how the mind works and a set of values that people hang on those assumptions. The theory has three parts.
One is the doctrine of the “blank slate”: that we have no inherent talents or temperaments, because the mind is shaped completely by the environment—parenting, culture, and society.
The second is the “noble savage”: that evil motives are not inherent to people but come from corrupting social institutions.
The third is the “ghost in the machine”, that the most important part of us is somehow independent of our biology, so that our ability to have experiences and make choices can’t be explained by our physiological makeup and evolutionary history.
These three ideas are increasingly being challenged by the sciences of the mind, brain, genes, and evolution, but they are held as much for their moral and political uplift as for any empirical rationale. People think that these doctrines are preferable on moral grounds and that the alternative is a forbidden territory that we should avoid at all costs.
EDGE: How has the empirical work in the sciences undermined these beliefs?
PINKER: The “blank slate” has been undermined by a number of discoveries. One of them is a simple logical point that no matter how important learning and culture and socialization are, they don’t happen by magic. There has to be innate circuitry that does the learning, that creates the culture, that acquires the culture, and that responds to socialization. Once you try to specify what those learning mechanisms are, you’re forced to posit a great deal of innate structure to the mind.
It’s also been undermined by behavioral genetics, which has found that at least half of the variation in personality and intelligence in a society comes from differences in the genes. The most dramatic demonstration of this fact is that identical twins who were separated at birth have fantastic similarities in their talents and tastes.
The “blank slate” has also been undermined by evolutionary psychology and anthropology. For example, despite the undeniable variation among cultures, we now know that there is a vast set of universal traits that are common to all of the world’s 6,000 cultures. Also, evolutionary psychology has shown that many of our motives make no sense in terms of our day-to-day efforts to enhance our physical and psychological well-being, but they can be explained in terms of the mechanism of natural selection operating in the environment in which we evolved.
A relatively uncontroversial example is our tastes for sugar and fat, which were adaptive in an environment in which those nutrients were in short supply but don’t do anyone any good in a modern environment in which they are cheap and available anywhere. A more controversial example may be the universal thirst for revenge, which was one’s only defense in a world in which one couldn’t dial 911 to get the police to show up if one’s interests were threatened. A belligerent stance was one’s only deterrent against other people whose interests were in conflict with one’s own. Another one is our taste for attractive marriage partners. As wise people have pointed out for millennia, this makes no sense in terms of how happy or compatible the couple will be. The curve of your date’s nose, or the shape of her chin, doesn’t predict how well you’re going to get along with her for the rest of your life. But evolutionary psychology has shown that the physical features of beauty are cues to health and fertility. Our fatal weakness for attractive partners can be explained in terms of our evolutionary history, not our personal calculations of well-being.
The “blank slate” has also been undermined by brain science. The brain obviously has a great deal of what neuroscientists call plasticity—that’s what allows us to learn. But the newest research is showing that many properties of the brain are genetically organized, and don’t depend on information coming in from the senses.
The doctrine of the “noble savage” has been undermined by a revolution in our understanding of non-state societies. Many intellectuals believe that violence and war among hunter-gatherers are rare or ritualistic, and that the battle is called to a halt as soon as the first man falls. But studies that actually count the dead bodies have shown that the homicide rates among prehistoric peoples are orders of magnitude higher than the ones in modern societies—even taking into account the statistics from two world wars! We also have evidence that nasty traits such as psychopathy, violent tendencies, a lack of conscientiousness, and an antagonistic personality, are to a large extent heritable. And there are mechanisms in the brain, probably shared across primates, that underlie violence. All these suggest that what we don’t like about ourselves can’t just be blamed on the institutions of a particular society.
And the ghost in the machine has been undermined by cognitive science and neuroscience. The foundation of cognitive science is the computational theory of mind—the idea that intelligence can be explained as a kind of information-processing, and that motivation and emotion can be explained as feedback systems. Feats and phenomena that were formerly thought to rely on mental stuff alone, such as beliefs, desires, intelligence, and goal-directed behavior, can be explained in physical terms. And neuroscience has most decisively exorcised the ghost in the machine by showing that our thoughts, feelings, urges, and consciousness depend completely on the physiological activity of the brain.
EDGE: What’s the influence of evolutionary psychology in all of this?
PINKER: Evolutionary psychology is one of four sciences that are bringing human nature back into the picture. (The others are cognitive science, cognitive neuroscience and behavioral genetics.) There’s a sense in which all psychology is evolutionary. When it comes to understanding a complex psychological faculty such as thirst or shape perception or memory, psychologists have always appealed to their evolutionary functions, and it’s never been particularly controversial. It’s no coincidence that the effects of thirst are to keep the amount of water and the electrolyte balance in the body within certain limits required for survival—without such a mechanism, organisms would plump up and split like a hot dog on a grill or shrivel up like a prune. Likewise, it can’t be a coincidence that the brain compares the images from the two eyeballs and uses that information to compute depth. Without such an ability we’d be more likely to bump into trees and fall off cliffs. The only explanation, other than creationism, is that those systems evolved because they allowed our ancestors to survive and reproduce better than the alternatives.
Evolutionary psychology is taking that mindset and applying it to more emotionally charged aspects of behavior, such as sexuality, violence, beauty, and family feelings. One reason that evolution is more controversial in these areas than it is in the study of thirst is that the implications of evolution are less intuitive in the case of emotions and social relations. You don’t need to know much evolutionary biology to say that it’s useful to have stereo vision or thirst. But when it comes to how organisms deal with one another, common sense is no substitute for good evolutionary theory. We have no good intuitions about whether it’s adaptive, in the narrow biologist’s sense, to be monogamous or polygamous, to treat all your children equally or to play favorites, to be attracted to one kind of facial geometry or another. There you have to learn what the best evolutionary biology predicts. So evolutionary thinking in those fields is more surprising than in the rest of psychology.
EDGE: How are your ideas informed by the debate between Frank Sulloway and Judith Rich Harris which has been featured on Edge?
PINKER: Both of them, to their great credit, have addressed what may be the most important puzzle in the history of psychology. It’s one that most psychologists themselves don’t appreciate, and that most intellectuals don’t understand even when it’s explained in Newsweek or the daily papers. Here’s the puzzle. We know that genes matter in the formation of personalities. Probably about half of the variation in personality can be attributed to differences in genes. People then conclude, well the other half must come from the way your parents brought you up: half heredity, half environment, a nice compromise. Right? Wrong. The other 50% of the variation turns out not to be explained by which family you’ve been brought up in. Concretely, here’s what the behavioral geneticists have found. Everyone knows about the identical twins separated at birth that have all of these remarkable similarities: they score similarly on personality tests, they have similar tastes in music, similar political opinions, and so on. But the other discovery, which is just as important, though less well appreciated, is that the twins separated at birth are no more different than the twins who are brought up together in the same house with the same parents, the same number of TV sets in the house, same number of books, same number of guns, and so on. Growing up together doesn’t make you more similar in intelligence or in personality over the long run. A corroborating finding is that adopted siblings, who grow up in the same house but don’t share genes, are not correlated at all. They are no more similar than two people plucked off the street at random. So no, it’s not all in the genes, but what isn’t in the genes isn’t in the family environment either. It can’t be explained in terms of the overall personalities or the child-rearing practices of parents.
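The arithmetic behind this puzzle can be sketched in a few lines. The figures below are illustrative placeholders in the spirit of the ones Pinker quotes, not numbers from any particular study: identical twins reared apart (who share only genes) estimate heritability directly, and what being reared together adds on top of that estimates the shared family environment.

```python
# Illustrative twin/adoption correlations for some personality trait.
# These are made-up round numbers consistent with the "about half
# heredity" claim in the text, not data from a specific study.
r_mz_together = 0.50   # identical twins reared in the same home
r_mz_apart    = 0.48   # identical twins reared apart: genes only
r_adoptive    = 0.02   # adoptive siblings: shared home only

# Twins reared apart share genes but not a home, so their
# correlation is a direct estimate of heritability.
heritability = r_mz_apart

# Whatever growing up together adds beyond genes estimates the
# shared family environment; adoptive siblings estimate it directly.
shared_env = r_mz_together - r_mz_apart   # should roughly match r_adoptive

# The remainder is the puzzle: variance explained by neither genes
# nor the shared family environment.
nonshared = 1.0 - heritability - shared_env

print(f"heritability        ~ {heritability:.2f}")
print(f"shared family env.  ~ {shared_env:.2f}")
print(f"unexplained (nonshared) ~ {nonshared:.2f}")
```

With numbers like these, roughly half the variance is genetic, the shared family environment accounts for almost none of it, and about half is left over, which is the missing variance that Harris and Sulloway are trying to explain.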
Both Harris and Sulloway, and a handful of other psychologists like David Rowe, Robert Plomin, and Sandra Scarr, have called attention to this puzzle: what are the non-genetic determinants of personality and intelligence, given that they almost certainly are not the family environment. Many people, still groping for a way to put parents back into the picture, assume that differences among siblings must come from differences in the way parents treat their different children. Forget it. The best studies have shown that when parents treat their kids differently, it’s because the kids are different to begin with, just as anyone reacts differently to different people depending on their personalities. Any parent of more than one child knows that children are little people, born with personalities.
Where these two differ is that Sulloway argues that the unexplained variation comes from the way that children differentiate themselves from their siblings in the family. They take these strategies for competing for parental attention and resources outside the family and react to nonrelatives using the same strategies that worked for them inside the family. Harris argues that the missing variance comes from how children survive within peer groups—how they find a niche in their own society and develop strategies to prosper in it.
I think that Sulloway has captured something about the dynamics among siblings within the family. But I’m not convinced that these strategies shape their personalities outside the family. What works with your little brother is not necessarily going to work with strangers and friends and colleagues. And indeed most of the data that support Sulloway come from studies in which siblings rate their siblings or parents rate their children, or in which siblings rate themselves with respect to their siblings. The theory is not well-supported by studies that look at the personality of people outside the home. Indeed, it’s a major tenet of evolutionary psychology that one’s relationships with kin are very different from one’s relationships with non-relatives.
As for Judy Harris, I am completely persuaded by her argument that socialization takes place in the peer group rather than in the family. This is not a banal claim—most child psychologists won’t go near it. But it survives one empirical test after another. To take a few examples: kids almost always end up with the accent of their peers, never their parents. Children of culturally inept immigrants do just fine if they can learn the ropes from native-born peers. Children who are thrown together without an adult language to model will invent a language of their own. And many studies have shown that radical variations in parenting practice, such as whether you grow up in an Ozzie and Harriet family or a hippie commune, whether you have two parents of the same sex or one of each, whether you spent your hours in the family home or a day care center, whether you are an only child or come from a large family, or whether you were conceived the normal way or in a laboratory dish, leave no lasting marks on your personality, as long as you are part of a standard peer group.
What Harris’s theory has not explained to my satisfaction, at least not yet, is the missing variation in personality per se. Personality and socialization aren’t the same thing. Socialization is how you become a functioning person in a society—speak the language, win friends, hold a job, wear the accepted kinds of clothing and so on. Personality is whether you’re nice or nasty, bold or shy, conscientious or lackadaisical. Here’s the problem. Let’s go back to our touchstone: identical twins brought up together, who share both their genes and their environment, but nonetheless are not identical in personality. They almost certainly will have grown up in the same peer groups, or at least the same kinds of peer groups, and their personalities and physical characteristics will tend to place them in the same niches within those peer groups. So peer groups by themselves can’t explain the unexplained variation in personality.
To be fair, Harris points out that which niche you fill in a peer group—the peacemaker, the loose cannon, the jester, the facilitator, and so on—might partly be determined by chance: which niche is left open when you find a circle of buddies to hang out with. I think there’s something to that, but it’s a special case of what might be an enormous role for chance in the shaping of who we are. In addition to which niche was open in your peer group, there are other unpredictable events that happened to each of us as we grew up. Which twin got the top bunk bed, which got the bottom bunk bed? Did you get chased by a dog, or dropped on your head, or infected by a virus, or smiled on by a teacher?
And there are even more chance events in the wiring of the brain in utero and the first couple of years of life. We know that there isn’t nearly enough information in the genome to specify the brain down to the last synapse, and that the brain isn’t completely shaped by incoming sensory information either. Based on studies of the development of simple organisms like fruit flies and roundworms, we know that much in development is a matter of chance. You can have genetically homogeneous strains of roundworm brought up in the same monotonous laboratory conditions, and find that one lives three times as long as the other. Or two fruit flies from inbred strains, which are in effect clones, can be physically different—they can have different numbers of bristles under each wing, for example. If simple organisms like worms and flies can turn out differently for capricious and unpredictable reasons, then surely chance plays an even bigger role in the way our brains develop.
EDGE: Who influenced you to go down this path?
PINKER: When I was an undergraduate, I read Chomsky, who was one of the first to break the taboo against explanations that appeal to human nature. Decades ago he argued that our capacity for language is an innate ability of the human mind, and he connected his theories to Enlightenment philosophers and political thinkers who acknowledged the importance of human nature. In graduate school my mentors were Steve Kosslyn, who trained me to be an experimental psychologist, and Roger Brown, who invented the modern science of language acquisition and got me interested in the topic. Roger was also a gifted writer, with a great wit and panache. He certainly inspired me to pay attention to clarity, style, and breadth in writing. Joan Bresnan, a brilliant linguist, was my postdoctoral adviser, and she sharpened my formal and computational and mathematical competence. Aside from these mentors, I was influenced by cognitive scientists like Warren McCulloch, Herb Simon, Allen Newell, Marvin Minsky, George Miller, Gordon Bower, and John Anderson, who developed the computational theory of mind. Later, I was influenced by the evolutionary psychologists John Tooby, Leda Cosmides, and Don Symons, who got me to read the work of George Williams, Richard Dawkins, Robert Trivers, and John Maynard Smith. I have been interested in behavioral genetics ever since I read about the work of Tom Bouchard and David Lykken in Science in the late 1980s, but it was Judy Harris who really forced me to think through the implications of that work, and of work in social and personality development more generally.
EDGE: Who will be your critics?
PINKER: Certainly the postmodernists in the humanities. Also, many of the child psychologists who are still stuck on parents as the shapers of children’s personality and intelligence. Third, the neural network modelers who have tried to revive the laws of association as an explanation for all aspects of language and cognition. Fourth, some of the more extreme enthusiasts of neural plasticity, who believe that the brain is infinitely malleable, and that this holds great promise for education and child-rearing and successful aging. Fifth, people with sympathies for the romantic revolutionary politics of the 60s and 70s, which is where the initial opposition to sociobiology came from. They have always been enraged by the claim that limitations on human nature might constrain our social arrangements.
EDGE: What about implications for other fields?
PINKER: The “blank slate” has had an enormous influence in far-flung fields. One example is architecture and urban planning. The 20th century saw the rise of a movement that has been called “authoritarian high modernism,” which was contemporaneous with the ascendance of the blank slate. City planners believed that people’s tastes for green space, for ornament, for people-watching, for cozy places for intimate social gatherings, were just social constructions. They were archaic historical artifacts that were getting in the way of the orderly design of cities, and should be ignored by planners designing optimal cities according to so-called scientific principles.
Le Corbusier was the clearest example. He and other planners had a minimalist conception of human nature. A human being needs so many cubic feet of air per day, a temperature within a certain range, so many gallons of water, and so many square feet in which to sleep and work. Houses became “machines for living,” and cities were designed around the most efficient way to satisfy this short list of needs, namely freeways, huge rectangular concrete housing projects, and open plazas. In extreme cases this led to the wastelands of planned cities like Brasilia; in milder cases it gave us the so-called urban renewal projects in American cities and the dreary highrises in the Soviet Union and English council flats. Ornamentation, human scale, green space, gardens, and comfortable social meeting places were written out of the cities because the planners had a theory of human nature that omitted human esthetic and social needs.
Another example is the arts. In the 20th century, modernism and post-modernism took over, and their practitioners disdained beauty as bourgeois, saccharine, and lightweight. Art was deliberately made incomprehensible or ugly or shocking—again, on the assumption that people’s tastes for attractive faces, landscapes, colors, and so on were reversible social constructions. This also led to an exaggeration of the dynamic of social status that has always been part of the arts. The elite arts used to be aligned with the economic and political aristocracy. They involved displays of sumptuosity and the flaunting of rare and precious skills that only the idle rich could cultivate. But now that any schmo can afford a Mozart CD or can go to a free museum, artists had to figure out new ways to differentiate themselves from the rabble. And so art became baffling and uninterpretable without acquaintance with arcane theory.
By their own admission, the humanities programs in universities, and institutions that promote new works of elite art, are in crisis. People are staying away in droves. I don’t think it takes an Einstein to figure out why. By denying people’s sense of visual beauty in painting and sculpture, melody in music, meter and rhyme in poetry, plot and narrative and character in fiction, the elite arts wrote off the vast majority of their audience—the people who approach art in part for pleasure and edification rather than social one-upmanship. Today there are movements in the arts to reintroduce beauty and narrative and melody and other basic human pleasures. And they are considered radical extremists!
EDGE: Why do people still treat art and literary critics as the wisest and most relevant intellectuals? In terms of literature, why is it that in the leading cultural magazines, you can still find a lot more of Virginia Woolf, Lytton Strachey, and Bloomsbury, than discussions about the issues you and other scientists are raising?
PINKER: One reason for the canonization of artists is a quirk of our moral sense. Many studies show that people hallucinate moral virtue in other people who are high in status—people who are good-looking, or powerful, or well-connected, or artistically or athletically talented. Status and virtue are cross-wired in the human brain. We see it in language, where words like “noble” and “ugly” have two meanings. “Noble” can mean high in status or morally virtuous; “ugly” can mean physically unattractive or morally despicable. The deifications of Princess Diana and John F. Kennedy Jr. are obvious examples. I think this confusion leads intellectuals and artists themselves to believe that the elite arts and humanities are a kind of higher, exalted form of human endeavor. Anyone else having some claim to insights into the human condition is seen as a philistine, and possibly as immoral if they are seen as debunking the pretensions of those in the arts and the humanities.
To be fair, there are other strands of the arts and humanities, sometimes brushed aside in the 20th century, that resonate quite well with the arguments that I’ve been making. Many artists and scholars have pointed out that ultimately art depends on human nature. The aesthetic and emotional reactions that we have to works of art depend on how our brain is put together. Art works because it appeals to certain faculties of the mind. Music depends on details of the auditory system, painting and sculpture on the visual system. Poetry and literature depend on language. And the insights we hope to take away from great works of art depend on their ability to explore the eternal conflicts in the human condition, like those between men and women, self and society, parent and child, sibling and sibling, and friend and friend. Some theoreticians of literature have suggested that we appreciate tragedy and great works of fiction because they explore the permutations and combinations of human conflict, and these are just the themes that scientific fields like evolutionary psychology, behavioral genetics, and social psychology try to illuminate.
EDGE: So what do you see as the appropriate role for art?
PINKER: Good heavens, that’s not for me to weigh in on! The most I can do is suggest ways in which the sciences of mind might pipe in with insights that could complement those of scholars in the humanities. Linguistics can help poetics and rhetoric; perception science can be useful for the analysis of music and the visual arts; cognitive science has a role to play in the analysis of literature and cinema; evolutionary psychology can shed light on esthetics. And more generally, the sciences of mind can reinforce the idea that there really is an enduring human nature that great art can appeal to.
EDGE: Who are some of the people exploring the convergence of art and science?
PINKER: Among novelists, Ian McEwan, David Lodge, A. S. Byatt, John Updike, Iris Murdoch, Tom Wolfe, and George Orwell are a few that I am familiar with who have invoked notions of human nature, sometimes traditional ones, sometimes ones from scientific psychology, in their work or their explanations. Among scholars and critics, the list is growing; here are some who pop into mind. George Steiner on biological conflict and drama. Ernst Gombrich on perception and art. Joseph Carroll, Frederick Turner, Mark Turner, Brian Boyd, Patrick Hogan, on literature. Elaine Scarry on mental imagery and fiction. Denis Dutton has been a catalyst for this convergence through his journal Philosophy and Literature and at his web site.
EDGE: Does this portend a more general trend?
PINKER: We may be seeing a coming together of the humanities and the science of human nature. They have long been separated by modernism and post-modernism. But now graduate students are grumbling in emails and in conference hallways that they are locked out of the job market unless they perpetuate postmodernist gobbledygook, and that they’re eager for new ideas from the sciences that could invigorate the humanities within universities, which are, by anyone’s account, in trouble. Also, connoisseurs and appreciators of art are getting sick of the umpteenth exhibit on the female body featuring mangled body parts, or ironic allusions to commercial culture that are supposed to shake people out of their bourgeois complacency but that are really no more insightful than an ad parody in Mad Magazine or on Saturday Night Live.
EDGE: What about connections to other fields, like history? Science doesn’t take place in a vacuum. Didn’t historical events of the 20th century have something to do with the popularity of the “blank slate”?
PINKER: Intellectual life was enormously affected by an understandable revulsion to Nazism, with its pseudoscientific theories of race, and its equally nonsensical glorification of conflict as part of the evolutionary wisdom of nature. It was natural to reject anything that smacked of a genetic approach to human affairs. But historians of ideas have begun to fill in another side of the picture. During the twentieth century, equally horrific genocides were carried out in the name of Marxism, such as the mass purges and manmade famines of Lenin, Stalin, and Mao, and the madness in Kampuchea. The remarkable fact is that the two great ideologically driven genocides of the 20th century came from theories of human nature that were diametrically opposed. The Marxists had no use for the concept of race, didn’t believe in genes, and denied Darwin’s theory of natural selection as the mechanism of evolutionary adaptation. This shows that it’s not a biological approach to human nature that is uniquely sinister. There must be common threads to Nazism and totalitarian Marxism that cut across a belief in the importance of evolution or genetics. One common thread was a desire to reshape humanity. In the Marxists’ case it was through social engineering; in the Nazis’ case it was eugenics. Neither of them was satisfied with human beings as we find them, with all their flaws and weaknesses. Rather than building a social order around enduring human traits, they had the conceit that they could re-engineer human traits using scientific—in reality pseudoscientific—principles.
In Martin Amis’s new book about Stalinism, he argues that intellectuals have not yet come to grips with the lessons of Marxist totalitarianism in the way that they did with Nazi totalitarianism many decades ago. A number of historians and political philosophers have made the same point. This blind spot has distorted the intellectual landscape, including the implications and non-implications of genetics and evolution for understanding ourselves.
EDGE: Final thoughts?
PINKER: Chekhov once said, “Man will become better when you show him what he is like.” I can’t do better than that.
STEVEN PINKER, research psychologist, is Peter de Florez Professor in the Department of Brain and Cognitive Sciences at MIT; director of the McDonnell-Pew Center for Cognitive Neuroscience at MIT; and author of Language Learnability and Language Development; Learnability and Cognition; The Language Instinct; How the Mind Works; Words and Rules: The Ingredients of Language; and The Blank Slate: The Modern Denial of Human Nature.
His research on visual cognition and on the psychology of language has received the Troland Award from the National Academy of Sciences and two prizes from the American Psychological Association. He has also received awards for his graduate and undergraduate teaching at MIT, two prizes for general achievement, an honorary doctorate, and five awards for his popular science books.
Pinker is a fellow of several scholarly societies, including the American Academy of Arts and Sciences and the American Association for the Advancement of Science. He is an associate editor of Cognition and serves on many professional panels, including the Usage Panel of the American Heritage Dictionary and the Scientific Advisory Panel of an 8-hour NOVA television series on evolution. He also writes frequently in the popular press, including The New York Times, Time, Slate, and The New Yorker.
Steven Pinker’s The Blank Slate: The Modern Denial of Human Nature at Amazon.com.