Publication Date: 2005-03-28
Rereading B. F. Skinner's Beyond Freedom and Dignity caused this professor of psychology to rethink the comfortable conceptual dichotomy "behaviorism bad; evolutionism good."
I am not now, nor have I ever been, a devotee of behaviorism, radical or otherwise. Moreover, when I teach or write about animal behavior, I often counterpose B.F. Skinner's work in particular as the intellectual antipode of my own perspective, which emphasizes the importance of built-in, prewired, evolutionarily generated mechanisms. For Skinner and his disciples, living things (including human beings) are tabulae rasae, blank slates upon which the contingencies of reinforcement write as they will, thereby constituting the crucial -- indeed, the only -- determinant of behavior: the experience of each individual.
By contrast, I think it much more likely that living things are palimpsests, tablets that are far from blank, because natural selection has written upon them, then crossed out and rewritten, doing this again and again, innovating, erasing, revising, and correcting, passing down our "nature" as a heavily edited evolutionary bequeathal, a much overwritten tablet of DNA.
All this could hardly be more different from behaviorism, whose more hard-nosed version denies the very existence of "human nature." Consider, for instance, the famed pronouncement by John B. Watson, Skinner's conceptual mentor: "Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I'll guarantee to take any one at random and train him to become any type of specialist I might select -- doctor, lawyer, artist, merchant-chief and yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors."
Into my own comfortable conceptual dichotomy ("behaviorism bad; evolutionism good"), there came an apple of discord when I happened to reread Skinner's Beyond Freedom and Dignity, published more than three decades ago.
Please don't misunderstand: I haven't become a convert to behaviorism. But I have emerged with a deeper respect for B.F. Skinner and his work, and a recognition that in his legacy, not just evolutionary biologists but all scientists have a potent intellectual ally. His research didn't encompass neurobiology, sociobiology, or, indeed, biology at all. But there is no doubt that he "did" science, and moreover, that he provided the rest of us with some conceptual tools and arguments that will help us along our way.
Sociobiology and Skinner? Oxymoronic indeed, although "science and Skinner" is a bit more coherent, since the driving force of Skinner's work was a passion to make the study of behavior "scientific" at last. Even if we choose to discount Skinner's claim that reinforcement is the key to behavior -- and discount it I do -- there is deep wisdom in his pioneering insistence upon science as the fundamental paradigm for explaining human actions. Moreover, in a time of rising religious fundamentalism, abetted by powerful political allies, as well as a revival of pseudosciences of all sorts -- not to mention the postmodernist denial of the legitimacy of science itself -- we need all the clear-eyed thinking we can get, especially when applied to our own species.
For me, in short, the issue is not whether Skinner was correct in the particular paradigm he espoused, but rather, his prescience in pushing students of behavior to embrace the broader paradigm of science, with its emphasis on objective, mechanistic explanations.
The problem is not simply one of seeing ourselves as others see us, but as we really are. Thus for a long time the best view in the city of Warsaw has been from the top of the Ministry of Culture. Why? Because this is practically the only place in that otherwise appealing city from which it is impossible to see the Ministry of Culture (a thoroughly regrettable example of Stalinist architecture at its worst). By the same token, we all see the world from the ministry of our own perceptions, having only this very limited viewpoint from which to see ourselves.
It was Skinner who identified, more clearly than anyone before -- or after -- the key stumbling block for those of us trying to see ourselves accurately; namely, a reluctance to countenance that human actions are caused, because the more causation, the less credit. "We recognize a person's dignity or worth," writes Skinner, "when we give him credit for what he has done. The amount we give is inversely proportional to the conspicuousness of the causes of his behavior. If we do not know why a person acts as he does, we attribute his behavior to him. We try to gain additional credit for ourselves by concealing the reasons why we behave in given ways or by claiming to have acted for less powerful reasons." Ironically, there is something flattering and legitimizing in actions or thoughts that spring unbidden from our "self" -- whatever that may be -- and that aren't otherwise explicable. By the same token, the more our actions are caused, the less are we credited for them.
Skinner, again: "Any evidence that a person's behavior may be attributed to external circumstances seems to threaten his dignity or worth. We are not inclined to give a person credit for achievements which are in fact due to forces over which he has no control. We tolerate a certain amount of such evidence, as we accept without alarm some evidence that a man is not free. No one is greatly disturbed when important details of works of art and literature, political careers, and scientific discoveries are attributed to 'influences' in the lives of artists, writers, statesmen, and scientists respectively. But as an analysis of behavior adds further evidence, the achievements for which a person himself is to be given credit seem to approach zero, and both the evidence and the science which produces it are then challenged." And not only achievements: The quotidian events of normal living also qualify.
Most of my students are alternately amused and troubled, for example, when I speculate that "love" is, on one level, an evolutionary mechanism that insures an inclination to invest in individuals suitable to help maximize one's fitness, and on another, a consequence of appropriate amounts of oxytocin (in women) or vasopressin (in men), released in conjunction with sexual satisfaction. "That's just not acceptable," one young lady moaned. "I want my boyfriend to love me on his own, and not because of his genes or chemicals, but because of him and me!"
It is one thing, however, to insist on being loved for one's self, and not, for example, because of a hefty trust account; quite another to demand that love emerge spontaneously, somehow bubbling up and taking form without any cause whatsoever.
Skinner points out, further, that a scientific conception of behavior "does not dehumanize man, it dehomunculizes him," abolishing the unsupportable conceit that people are responsible for their actions. Why unsupportable? After all, each of us knows, subjectively, that we are free to act as we choose, and yet, as David Hume pointed out more than two centuries ago, this "knowledge" must be false: Either our behavior is a consequence of prior events (modern readers can substitute "contingencies of reinforcement," "genetic predispositions toward fitness maximization," "electrochemical events taking place across neuronal membranes," and so forth), in which case we are not responsible for such actions, or it is truly spontaneous and thus random, in which case we are, if anything, even less responsible.
Thus are we transported to the ancient and seemingly insoluble conundrum of free will, which most of us "solve" by adopting two altogether inconsistent viewpoints. On the one hand, anyone espousing science -- or even something as basic as cause and effect -- cannot help acknowledging that free will must be an illusion insofar as everything is caused. But on the other, nearly all of us act in our daily lives as though we possess free will in abundance, and, moreover, that others do, too. Do we contradict ourselves? Very well, a modern-day Whitman might conclude, we contradict ourselves. We are large; we contain multitudes.
Skinner points with bemusement to essayist Joseph Wood Krutch's lament that humanity's self-conception has greatly deteriorated, from Hamlet's "How like a god!" to Pavlov's "How like a dog." One needn't be a behaviorist -- or a Pavlovian -- to conclude, however, that this transition constitutes progress. Whether god or homunculus or Wizard of Oz hiding behind the curtains with his hand on the levers of power, there is little to be gained from such metaphysical explanatory fictions; by contrast, although Homo sapiens are more complex than dogs, their dog-nature -- unlike their purportedly divine essence -- is at least amenable to scientific analysis and rational understanding. In short, from god to dog is a step up.
But isn't it demeaning? And even dangerous? After all, Tom DeLay, the House majority leader, has already blamed episodes of violence such as the shootings in Littleton, Colo., on the teaching of evolution, which presumably "dehumanizes" human beings. On the other hand, Darwin, in the final paragraph of The Origin of Species, suggested instead that "there is grandeur in this view of life," one that recognizes the connectedness of our species to the rest of evolution.
Such connectedness presumably implies not only historical continuity but also continuity of mechanism, at the level of organs, cells, organelles, molecules, and so forth. As to dangerous, here is Skinner, once again: "The problem is to free men, not from control, but from certain kinds of control. ... " That is, we may choose ignorance over self-knowledge, but this will not in itself make us into autonomous creatures. We are influenced, and to some degree even controlled, by what surrounds us (as well as what emanates from our DNA). It is no coincidence that John B. Watson became a major figure in the nascent advertising industry.
What about diminution of our free will? Skinner takes that one on, too, when he notes that "no theory changes what it is a theory about." If we had free will before Skinner, or Darwin, or recent pioneers in neurobiology, then nothing in their work can take it away. And likewise, insofar as human behavior is already "controlled," then science will not free us. Well, actually, that's not quite true: The more we understand about the nature of whatever control already exists (at the level of "reinforcers," neurobiology, genetic predispositions, etc.), the freer we are to design the kinds of control we would like. Skinner is quite clear that the goal is not to free human behavior from control -- because in his opinion, that can never be -- but to introduce some choice as to the kinds of control. And this, paradoxically, promises to put "us," whoever that is, back into the driver's seat, or at least nearby.
Throughout his work, Skinner studiously avoided any intimation as to consciousness, subjectivity, or their underlying neural mechanisms, not because he denied their existence, but because he maintained that they could not be scientifically investigated. Like Freud, he felt that the elucidation of the connection between neurological mechanisms and thought and consciousness was far in the future. Neither Skinner nor Freud, for that matter, would have been likely to disagree with these opening lines of Francis Crick's important book, The Astonishing Hypothesis: "You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules. As Lewis Carroll's Alice might have phrased it, 'You're nothing but a pack of neurons.'"
For nearly all scientists, as well as nonscientists with an empirical turn of mind, there is nothing whatsoever astonishing about this hypothesis. Indeed, it would be astonishing if not true, which only italicizes the wisdom of Skinner's insistence that we would do well to stop deluding ourselves, and start accepting that behavior -- like all other natural processes -- is caused. People may deserve dignity, or inherently possess it, or struggle to achieve it, but such dignity is not impeded by the fact that they are embodied, evolved creatures functioning in a physical world.
And yet, in a different sense, Crick's book, like Skinner's, was well titled. Although it is a mundane fact, generally taken for granted among all scientists, that Descartes was wrong and there is no dualism separating mind from body, the fact of embodiment (and thus the dependence of mind on body) is itself astonishing. Yet it is part of B.F. Skinner's legacy that we know this to be so.
In a 1991 science-fiction story by Terry Bisson, we listen in on a conversation between the robotic commander of an interplanetary expedition and his equally electronic leader, reporting with astonishment that the human inhabitants of Earth are "made out of meat":
"Meat?" "There's no doubt about it. ... " "That's impossible. ... How can meat make a machine? You're asking me to believe in sentient meat." "I'm not asking you. I'm telling you. These creatures are the only sentient race in the sector, and they're made out of meat." ... "Spare me. Okay, maybe they're only part meat. ... " "Nope, we thought of that, since they do have meat heads. ... But ... they're meat all the way through." "No brain?" "Oh, there is a brain all right. It's just that the brain is made out of meat!" "So ... what does the thinking?" "You're not understanding, are you? The brain does the thinking. The meat." "Thinking meat! You're asking me to believe in thinking meat?" "Yes, thinking meat! Conscious meat! Dreaming meat! The meat is the whole deal! Are you getting the picture?"
Thanks in large part to B.F. Skinner, we are.
David P. Barash is a professor of psychology at the University of Washington.
This essay appeared in The Chronicle of Higher Education, April 1, 2005.