Working knowledge

Both agreed that social conditioning is necessary to bring any human goodness out. Both made a major virtue of education. Mencius thought it would bring out the good, Xunzi that it would control the bad.

What is not reasonable is the idea that people are “naturally” anything very specific. People “left to themselves” will not be good or bad; they will be dead. Social life is a physical necessity, at least for babies and children. It automatically shapes people, through culture and instruction.

Thus, hopeful philosophies based on the “natural” human drive to learn, or to be virtuous, or to be religious always fail. The drives may be real enough, but, inevitably, culture either brings out those drives or stifles them. The hopeful Quakers, the humanistic psychologists of the 1950s, the Kropotkinian anarchists, and others assuming natural human good have been bitterly disappointed throughout history. Even the saints of old often defended narrow interests by deadly force. (Standards of sainthood got pretty lax in the Middle Ages.)

However, the sour philosophers, from Xunzi to Hobbes and Freud, also fail to predict anything interesting. They can always point to the odd sociopath or psychopath as “proof” of human baseness, but even Xunzi knew that the vast majority of people have their worst tendencies trained out of them (cf. Baumeister 1997, 2005). The common claim that “people are only out for what they can get” is as silly as the fond dream of universal peace and love just around the corner.

Closest to the truth was the Chinese philosopher Mencius, who taught that people are naturally prosocial, but that bringing this goodness out requires nurturance, support, and empowerment. Evil is unnatural, but common and easily picked up, even through the failure of good training, let alone through bad training. Similarly, in the ancient western world, evil was understood as a kind of falsehood. "Satan" is from Hebrew satan, "adversary" or "accuser"; the Greek diabolos, which gives us "diabolical" and "devil" in English, means "slanderer," a teller of untruths. Biology gives us a world in which we can and must calculate moment by moment whether to help or harm, and gives us a high level of defensiveness. Culture tells us how to do the calculations. Then each one of us must do the calculations for the task immediately at hand. Evil often comes from mistakes—often the perhaps-prudent mistake of taking an innocent remark amiss. Without major efforts to teach people how to respond courageously yet forbearingly, they will respond with anger, hatred, or despair. This is the cloud that hides the inner light.


Explanations

One might note how progressive restriction of level of explanation can operate in analyzing foodways (see Anderson 2005a):

At the most basic biological level, we need calories, protein, fats, vitamins, and minerals.

We then need to avoid poisons and stay healthy.

We then need to figure out how to get all that for minimum effort or expense—to do "optimal foraging," in the jargon (see the brief sketch after this list).

This means, in an agricultural society, looking at crop ecology and other agricultural issues.

In a civilization, one has to worry about money and prices.

Then, that done, food always gets involved in social bonding: sharing, reciprocity, generosity. It marks religious and ethnic affiliation. It diffuses among neighbors.

It marks class, region, occupation, gender, age, and so on.

On a still smaller and more restricted level, it marks occasion: birthday, Christmas, business deal.

It allows individuals to show off and jockey for status.

It reveals social knowledge via ordinary etiquette.

Then, at all levels, it is affected by contingent histories and just plain accidents, including personal taste.
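
To make the "optimal foraging" level concrete, here is a minimal sketch in Python of the core logic: rank resources by return per unit of effort and concentrate on the best returns first. The foods and numbers are invented for illustration; real diet-breadth models also weigh encounter rates, handling times, risk, and seasonality.

# Illustrative only: the foods and figures below are made up.
foods = {
    # name: (kcal per unit obtained, hours of effort per unit)
    "berries": (150, 0.5),
    "fish":    (600, 1.5),
    "acorns":  (400, 2.0),
    "deer":    (40000, 30.0),
}

# Rank by energy returned per hour of effort, highest first;
# an "optimal forager" concentrates on the top of this list.
ranked = sorted(foods, key=lambda f: foods[f][0] / foods[f][1], reverse=True)
for name in ranked:
    kcal, hours = foods[name]
    print(f"{name}: {kcal / hours:.0f} kcal per hour of effort")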

Social scientists have explained social systems in dozens of ways, ranging from the sublime to the ridiculous. We will find it useful to classify these, very roughly and crudely, into four types.

Mode 1 consists of rational need-satisfaction theories. Most of them are broadly materialist. These include straightforward biological functionalism: society seen as a way of getting food, shelter, and reproduction. They also include more complex materialist theories like Adam Smith's cultural evolutionary dynamics, Marxism and other political economies, "rational choice theory," and modern environmental and ecological theories.

Mode 2 consists of explanations resorting largely to human instincts or innate tendencies. People clearly have inborn behavior. A smile is a smile everywhere, even if the Mona Lisa had her own brand.

Mode 3 consists of explanations that are broadly idealist—not in the sense of having high ideals, but in the sense of living according to ideas rather than material needs or evil wants. Most religious leaders thought, and think, this way. In western social science it was the view of Immanuel Kant, and since he essentially created most of modern social science, he had a truly profound influence on us all. His straight-line intellectual descendants included Dilthey, Boas, Parsons, Lévi-Strauss, and most of the other makers of modern sociology and anthropology.

Social functionalism, from Marx to Durkheim and the later functionalists, is a Kantian offshoot with considerable cross-fertilization from Mode 1. Social functionalists see that a society needs communication systems, a law code, a calendar, a leadership and power system, allocated roles, status and prestige, morals, festivals, and so on. These emergents cannot be predicted directly from physical needs; they have a social and interactive history.

Mode 4 is a broadly empirical tradition. Pure empiricists hold that one can simply observe and count behaviors, and get along by inferring minimal thought-processes behind the actions. Pure empiricists form a grand chain, from John Locke to B. F. Skinner. Locke was the least extreme, and in fact is really more an ancestor to Kant—an early scholar of cognitive processes. Since Kant, empiricists have been less and less able to resist taking account of thought processes. The pure-empiricist trend in social science ended with Skinner's attempts to equate pigeon behavior in the lab with language learning (see Skinner 1957, 1959). This was so patently hopeless, and so memorably demolished in a famous review by Noam Chomsky (1959), that the pure empiricist program could not survive. However, modern experimental psychology, especially in its heavily biological forms like neuropsychology, is derived from this lineage, though it now takes explicit account of ideas and mental phenomena (Damasio 1994; LeDoux 1996).

All four of the above have merit. Theories, as Michel Foucault (2007) reminds us, are a tool kit, not a religion. Every worker needs a whole set of tools. Unity comes in the result—fixing the house or the world—rather than in the means. You can’t fix even a simple toy with only one tool, and social theorists might reflect on that.

Theorists find their favorite level to explain. Biologists like the whole-species level. They prefer to explain the things that all people do everywhere. Human ecologists and political scientists are more restrictive, but still prefer the big picture: variation and historical dynamics on a world scale. Interpretivists and cultural anthropologists like to look at cultures. Psychologists (except those who are basically biologists) like to look at individuals. To get the whole picture, one has to integrate all these.

IV. How We Don’t Know: Cognition Confounded


Explanations exist; they have existed for all time; there is always a well-known solution to every human problem—neat, plausible and wrong.

H. L. Mencken (1920:158)


Heuristics, Biases, and Other Cognitive Problems

Cultures encode a great deal of wrong information, and the most obvious way to explain it is through natural human mistakes. Culture can operate as a vast distorting medium. Much cultural error is motivated, often cynically, but most of it is probably natural—the result of heuristics and biases in human thought. Many errors of thought occur simply because a wrong explanation seems more plausible than a right one. Knowledge stemming from revered elders is passed on, believed, and remembered because its value and accuracy are chronically overestimated. Knowledge coming from enemies or strangers is slighted.

The most obvious case is belief in magic and supernatural powers. Since ancient times, philosophers and psychologists have observed that humans tend to assume agency until proven otherwise; if it rains, somebody must be making it rain. This is so natural to humans that it is hard to stop, and then each succeeding generation piles a new finding about rainmaking or astrology or black magic on top of the old ones.

Sometimes a completely arbitrary cultural belief becomes institutionalized and persists indefinitely. Science is, of course, far from immune. About half the science I learned as a child is disproved now. Racism and its variants make up the largest and most obviously false class of formerly-scientific beliefs. But there are others, less obviously wrong and not much less pernicious.

Cultural training can condition whole categories of thought in fascinating ways. For example, American children are trained to see simple causal relationships, because parents love to explain things this way. This leads the children to some wonderful bits of folk functionalism: "Lions are to go in the zoo," "clouds are for raining," "a hole is to dig" (Bloom and Weisberg 2007:996, with additions; see also Hood 2009:98). Children in other cultures do this less often. Bloom and Weisberg call this "promiscuous teleology," a phrase that could apply to all too many functionalist explanations in social science!

Humans are prone to systematic mistakes both in "hot cognition"—emotional thought—and in "cold cognition," cooler and more deliberate processing. Humans are not good at calculating probabilities or other numerical matters. They are not good at intuitively understanding sampling. They are easily tricked by almost any statistical manipulation, as politicians all know. (The classic book How to Lie with Statistics [Huff 1991] has gone through dozens of editions.)
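
A worked example shows how badly intuition handles probability. The disease-screening numbers below are the standard textbook illustration of base-rate neglect, not figures from any source cited here; most people guess that a positive test means near-certain disease, while Bayes' theorem gives under two percent.

# Base-rate neglect, worked numerically. All figures are illustrative.
p_disease = 0.001        # 1 person in 1,000 has the disease
p_pos_if_disease = 0.99  # the test catches 99% of true cases
p_pos_if_healthy = 0.05  # but gives 5% false positives

# Overall chance of a positive test, across both groups
p_pos = (p_pos_if_disease * p_disease
         + p_pos_if_healthy * (1 - p_disease))

# Bayes' theorem: P(disease | positive test)
p_disease_if_pos = p_pos_if_disease * p_disease / p_pos
print(f"P(disease | positive test) = {p_disease_if_pos:.1%}")  # about 1.9%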

Modern psychology has identified so many limits to human understanding that we can no longer use the model of rational individual utility-maximizing that has dominated so much of social science since Hobbes. Though Bacon and Locke had made insightful and detailed comments on human foibles in rationality, the overwhelming majority of western philosophers and scientists believed people were basically rational, up through the 1970s. About that time the tide began to turn, with Human Inference by Richard Nisbett and Lee Ross (1980) a major landmark. Since that time, the pendulum has swung far in the other direction, and now there are books appearing almost daily on how irrational people are and how much their perceptions are distorted. (Some landmarks over the years include Anderson 1996; Ariely 2009; Chabris and Simons 2010; Elster 1983; Kahneman, Slovic and Tversky 1982; Thaler 1992.) Dan Ariely has become a national radio and TV personality for his particularly outstanding research on irrational decision-making.

We have learned how prescient was David Hume’s observation that “reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them” (Hume 1969 [1739-1740]).

Analysis of the limits to thought goes back to the ancient Greeks, who saw how easily people fall into emotional distortion of reality. Most of our current knowledge in this area was foreshadowed by Plato and Aristotle. Indeed, the rationalist-empiricist view developed by Hobbes and Locke was a self-conscious rebellion against earlier irrationalist views of thought processes.

The current conventional wisdom is that we should at least know the limits of our understanding by bringing all these unconscious biases to consciousness. Dan Ariely has advocated this position in his bestselling and truly excellent account Predictably Irrational (2009). The problem is that Ariely takes well over 300 pages to give an account of some of the major biases, and leaves countless more untreated, while the other volumes cited above list dozens more problems. It has taken me 30 years of heavy reading to get even some purchase on this burgeoning area of psychology. Thousands of learned articles and several dozen books have treated it. There is little chance of anyone, let alone everyone, knowing all the limits to their rationality. We must indeed stick to the time-honored rules of careful thought: take as little as possible on faith, check everything, and so on. But this, pursued relentlessly, would leave us without the benefits of cultural tradition, which is still a better guide than our inevitably limited personal experience. There is no escape: we have to take some chances. But we are now well on our way to seeing why culture happens. One main reason is to correct these biases—or to use them creatively to sell useful ideas.

Kant introduced a more balanced critique of rationalism. Most basic in some ways was Kant’s realization that humans seek patterns and structures in everything. The origins of this, which Shermer (2009) calls “patternicity,” are not hard to find: it is the recurrent symmetrical pattern that distinguishes the snake from the grass, or the leopard from the spots of light and shade in the forest (Gombrich 1960, 1979; Shermer 2009). Our ancestors’ lives depended on instantly detecting patterns like this, and assuming them as a default. Hence our love of patterns in art, and probably our love of landscape pictures in general (Gombrich 1960, 1979). It is notoriously difficult, if not impossible, for humans to produce truly random lists of numbers or anything else (Hood 2009). We cannot help imposing patterns.
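
The point about randomness is easy to check with a short simulation (my own illustration, not drawn from the sources cited). Genuinely random coin flips contain surprisingly long streaks; people inventing "random" sequences carefully avoid such runs, because streaks feel non-random.

import random

def longest_run(flips):
    """Length of the longest run of identical outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

random.seed(42)  # fixed seed so the example is reproducible
flips = [random.choice("HT") for _ in range(100)]
print("Longest run in 100 fair coin flips:", longest_run(flips))
# Usually 6 or more; human-invented "random" sequences rarely
# include runs longer than 3 or 4.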

Some of our subconscious cognitive processes get fairly amusing. Andrew Elliot and Daniela Niesta (2008) found that men rate pictures of women as more attractive when the pictures are framed in red than when they are not. Red attracts and excites; it is the color of romance. They point out that we have the full cultural baggage here—valentines, shocking pink underwear, "red light districts," and all. One might add the Chinese use of red as the color of marriage and of religion, and similar associations with red in cultures around the world.

It is not clear whether this is partially inborn or is purely a cultural matter. There is an obvious biological prime: flushing when excited. A woman blushes when romantic or excited, so reddening excites the male in turn. The associations of blood, fertility, and health are also obvious. Elliot and Niesta cite youth and strength, the menstrual cycle, and so on. They also duly note the red rumps of estrous females among the primates, but this seems doubtfully related.

We have to use "heuristics and biases" (Ariely 2009; Dawes 2001; Kahneman, Slovic and Tversky 1982; Nisbett and Ross 1980) to process knowledge. We then have to give it structure—systematize it—so we can retrieve it. Thus are born cognitive and structural theories of knowledge.

Even such a seemingly trivial fact as the long-standing existence or presence of something is enough to make us see it more favorably. Traditions, familiar objects, the existing social system, and anything else "time-tested" are seen more favorably than comparable but newer and less familiar items (Edelman et al. 2009). Anyone who owns a dog and cat will know that humans are not the only animals with this bias. It is probably universal among higher animals, and there are obvious reasons for that universality. This is sad news for reformers and radicals, but they should take a leaf from the book of many clever innovators, and claim that their reforms are restoring the glories of old, or the proper faith, or the old-fashioned moral standard. Similarly, anyone selling a new invention is well advised (and, indeed, is often advised—by advertising agencies) to claim that it allows one to do better what one has been doing all along. Above all, people believe what they want to believe (Anderson 1996; Dawes 2001; Nisbett and Ross 1980; Pronin 2008; Seligman 1990).

Self-serving and over-optimistic attributions dominate life and lead us into gambling, marrying appalling spouses, and other mistakes. Among other things, “we believe that our weaknesses are so common that they are really just part and parcel of normal human fallibility, while our strengths are rare and special” (Fine 2006:7). We take credit for our successes, blame others for failure, and attribute the worst to them (Tavris and Aronson 2007). Bad people, or people with a guilty conscience, are prone to see the world in negative terms, partly as self-justification: “everybody else does it too.” People exaggerate the difficulty of tests, problems and hardships they have survived or overcome, especially if they feel they succeeded (Azar 2007).

Huron (2006) emphasizes that memory is not about the past but about the future (see also Schacter and Addis 2007). It tells us what to expect. Most of us have had the experience (famously reported by Piaget) of finding out that one or another important “memory” was false—a story overheard, not a fact remembered. All of us have had the experience of exaggerating and simplifying memories. This is done partly to defend self-image, but partly to make prediction easier.

People can be terrifyingly loyal to hierarchic superiors. Stanley Milgram's famous experiments involving simulated electric shocks to victims (actually confederates of the experimenters) are well known. Participants gave what they believed to be real shocks, as instructed by the experimenters, even when the supposed victims were showing major distress. Only a few refused, and they were the born rebels—the independent-minded, nonconformist ones. Recent studies show nothing has changed since Milgram's time (Burger 2009). The implications are clear enough, and fit perfectly with all we know about genocide and torture (Baumeister 1997; Staub 1989).

People overweigh advice from sympathetic quarters (Nisbett and Ross 1980). Republicans believe Fox News, Democrats believe The Nation, and no amount of exposure of their mistakes makes them doubt. (I deliberately picked those two sources because of their notoriously frequent misinformation; apologies to readers whose oxen are gored!)

Humans systematically miscalculate or ignore probabilities (Nisbett and Ross 1980). We cannot, in the normal course of things, carry out the more complex operations required by rational choice theory. As Richard Thaler says, "…an economist who spends a year finding a new solution to a nagging problem, such as the optimal way to search for a job when unemployed, is content to assume that the unemployed have already solved this problem and search accordingly. The assumption that everyone else can intuitively solve problems that an economist has to struggle to solve analytically reflects admirable modesty, but it does seem a bit puzzling" (Thaler 1992:2). When presented with a simple, clear, present benefit (like catching a fish or cutting taxes), we do not think much of the hard-to-calculate future (when fishless seas or unpoliced streets confront us).
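
One standard way to model this overweighting of the present is the quasi-hyperbolic ("beta-delta") discounting of behavioral economics. The sketch below is illustrative, with parameter values chosen for clarity rather than taken from Thaler: anything not immediate is cut by an extra factor beta, so a smaller reward now can beat a larger reward later.

# Quasi-hyperbolic (beta-delta) discounting; parameters are illustrative.
BETA = 0.7    # extra penalty on anything that is not immediate
DELTA = 0.95  # ordinary per-period discount factor

def present_value(reward, periods_away):
    """Value today of a reward arriving `periods_away` periods from now."""
    if periods_away == 0:
        return float(reward)
    return BETA * (DELTA ** periods_away) * reward

# A fish today versus a bigger catch next season:
print(present_value(100, 0))  # 100.0  -> the fish in hand wins
print(present_value(150, 1))  # 99.75  -> the larger future catch loses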

Love, good luck, and even one's birthday may temporarily distort thought processes. We are aware that we should never shop at the grocery store when hungry, never go to a bar when trying to quit drinking, and never flirt with an enemy spy of the opposite sex unless we are in a thoroughly sexless mood. We may act differently after hearing a Puritan minister from the way we act after hearing an ad for a trip to Las Vegas. Neither "rational choice" nor "stable preferences" is obviously present in such cases.

One example of a wrong but useful heuristic is the Fundamental Attribution Error: our tendency to assume that, when people do something, it is because of their stable personalities rather than because of the situations they are in (Kahneman, Slovic and Tversky 1982; Nisbett and Ross 1980).

People are also prone to think that the unreal is real if they talk about it a lot. Money is really an abstraction—a counter for exchange values—but we keep thinking that the coins and papers in our pockets really are money (rather than just tokens that we trustingly accept as if they were real wealth). We are all too prone to believe that the skyrocketing value of stocks and houses during bubbles is real and is a measure of true intrinsic worth—only to have our illusions shattered and our pockets cleaned out when the bubbles burst (Tyran 2007—note he was writing before 2008!). For this reason, financial advisers and stockbrokers actually do worse than naïve persons who just pick stocks by buying into any companies they have heard of (Gigerenzer 2007). "Irrational exuberance," in Alan Greenspan's phrase, takes over, and common sense goes out the window, leading to repeated boom-bust cycles (Stix 2009). Similarly, bringing money into a social situation immediately ruins it for sociability by turning it into a commercial transaction (Ariely 2009; Sahlins 1976).

Humans are less hopeful about gaining than worried about losing—a bias of obvious utility if one is choosing between picking more berries and getting out of the way of the bear that is also picking. This is actually wired in the brain (De Martino et al. 2006); the fear center in the amygdala kicks in when loss is threatened, and the frontal lobes try to integrate this animal response with rational knowledge.
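
This asymmetry is captured by the value function of Kahneman and Tversky's prospect theory. The sketch below uses their widely cited 1992 parameter estimates (alpha = 0.88, lambda = 2.25); the implementation itself is mine, for illustration only, not a method from De Martino et al.

# Prospect-theory value function, with Tversky & Kahneman's 1992 estimates.
ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # losses loom more than twice as large as equal gains

def subjective_value(x):
    """Felt value of a gain (x > 0) or loss (x < 0) of x dollars."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# A 50/50 bet to win or lose $100 feels like a bad deal:
expected_feeling = 0.5 * subjective_value(100) + 0.5 * subjective_value(-100)
print(round(expected_feeling, 1))  # about -36, so most people decline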

Ethnic and gender stereotypes take on a horrible life of their own. Asian women students keyed to think “Asian” do better than average on math tests, but if they are keyed to think “woman” they do worse than average (Carpenter 2008). Black students react similarly to keying. Women do better on math tests if keyed to thinking of themselves as “students” rather than “women.” One can use this to a limited extent to improve school performance, but, alas, it more often seems to cut the other way; people’s expectations of themselves and others are devastatingly compromised by conscious or unconscious biases. This phenomenon is known as stereotype threat, and is the subject of an excellent recent book, Whistling Vivaldi, by Black psychologist Claude Steele (2010), who has done a great deal of the research, especially on Black vs. white stereotypes and stereotype threat.

Steele and others also point out ways to get around it. Steele’s title comes from a Black reporter who whistled Vivaldi as he walked down the street, thus keying white passers-by into thinking “scholar” rather than “mugger.” A dramatic study by Steele’s sometime coworkers Geoffrey Cohen and associates (Cohen et al. 2006; Steele 2010:152ff) showed that getting African-American students who had been falling behind other groups to write a simple essay on African-American values at the beginning of a school semester improved their performance throughout the whole semester; it closed 40% of the gap between them and the white students.

Steele also deals with the question of identity, including how one tells that one is an African-American in the first place. He tells the story of a Black writer, "black, both his parents were black, and…all of his ancestors were black as far back as the eighteenth century" (Steele 2010:64), who passed as white all his professional life. This certainly reveals the insanity of American "race" classification as well as anything can. Obviously, the vast majority of his ancestors were in fact white, or he could not have passed. But "one drop of blood" makes you black in the United States, unless you don't bother to remind people of it. So Steele doesn't even think twice about calling all those ancestors "black." Having a few such cases in my own family (indeed, one in my own ancestry), I can relate, and feel the irony quite close to home. Fortunate are those who get to choose what "race" they are.

Of course identity is even messier when one is an "American," an Arab, an engineer, a dog-lover, a Republican, a cancer survivor…. Every one of these categories has its own stereotypes, for good and ill. And as Steele points out, even the most ridiculous distinctions can lead to prejudice. Biology is clearly against us: chimps fight rival groups, and rhesus monkeys like their own group members and dislike outgroup monkeys (Mahajan et al. 2011), so there is little chance that those stereotypes will usually be for good.


