Sudden successful fulfillment of an urgent need is one main source of human pleasure. We all know this about sex and about cold beer on a hot day, and practically every culture seems to have a proverb equivalent to “hunger is the best sauce.”
Arousal—whether by stimulants or by dangerous sports—can be a pleasure in its own right. The pleasures of sex seem usually to involve more effort in heightening desire than in satisfying it.
Feeling in control is a good feeling; pushing one’s sense of control to or beyond the limit (as on a roller coaster or in extreme sports) is exciting, and not just because of the physiology of “adrenaline rushes.” Learning and understanding satisfy a need and are truly enjoyable. We humans like to whip up curiosity and then satisfy it; consider the pleasure of mystery tales. Almost everyone seems to have a hobby: some one thing they want to learn about just because they enjoy learning.
Normally, however, such curiosity is structured by immediate need. People, like all other mammals, are usually interested in things only to the degree that they have a material or social reason to be interested. Throughout history, the vast majority of people, when faced with the need to know about anything beyond their social group, have simply accepted conventional wisdom or ancient book-learning. Always there is some interest, explaining the slow but steady progress of knowledge in all societies, but only in the west since 1500 has the drive to accumulate new knowledge become a major industry. The origins of this remain obscure, but correlations with the expansion of trade, business, and religious enquiry are obvious.
Among academics, learning is often a goal in itself—a pure pleasure, not just a way of knowing enough to cope. Academics forget that this is unusual, and make sour remarks about students who have a normal, instrumental attitude toward knowledge.
A professor who has built her life on analyzing the proteins in the fur of the two-toed sloth can never understand how students can fail to be absolutely fascinated, and can be hurt and angry when students persist in being bored with sloth proteins. What is astonishing is how many students do become interested in them if the teacher is inspiring. Truly, social charisma can do anything.
Some of us have even made a hobby of understanding everything! If only life were long enough…. Yet, worldwide, even among academics, the most interesting thing is always one’s social group, and gossip remains the major topic of conversation (Dunbar 2004).
Throughout history, hedonists have lived for their key pleasure, puritans have lived to stop them. The hedonist lives to eat. The puritan eats to live, and lives to blame the hedonist for immorality. Some people have sex only to produce children, others only for pleasure, others only as part of a love relationship. Such “revealed preferences”—the things people actually do, or spend their money on—keep surprising us.
Happiness in an activity can come from many sources, only one of which is the intrinsic pleasure of the activity. More often, the happiness or pleasure comes from social approbation. Something intrinsically unenjoyable seems pleasurable because “everybody does it,” or because we get respected for doing it. In fact, the whole point of many activities is that they are so unpleasant, difficult, and demanding that others are impressed by our ability to do them at all. Just as believing the preposterous is a great way of proving one is truly religious (Atran 2002, 2010), so torturing oneself to follow the latest media fad is a great way of proving one is part of the group. (The technical term for this is “costly signaling,” and it is almost universal among animals.)
Extreme sports are an example. Some people climb mountains just because they enjoy the activity and the view. Most of us who have this persuasion climb rather small mountains. Others want to triumph over nature, or over themselves. The most serious climbers, though, usually seem to have social approbation on their minds, however much they may also enjoy the peaks. They want the respect that comes from doing a “hairy” climb, especially if they can be the first to solo up the south face in winter, or something of that nature.
Once a need is satisfied, further satisfaction is not usually pleasant. Our bodies tell us when we have had enough to eat, enough to drink, enough sex. They err less than you might think; eating 100 calories more than you burn up, every day, will make you gain a pound a month. Very few people do that.
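The arithmetic behind the pound-a-month figure is worth making explicit. It rests on the common rule of thumb (an approximation, not an exact physiological constant) that a pound of body fat stores roughly 3,500 kilocalories:

```latex
\[
\frac{100\ \text{kcal/day} \times 30\ \text{days/month}}{3500\ \text{kcal/lb}}
\approx 0.86\ \text{lb/month}
\]
```

That is, a daily surplus of only 100 kilocalories compounds to very nearly a pound a month, or about ten pounds a year.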
The major exception here is the control need. It has no obvious satiation point. This is fine when one asserts control by knowing. It is less fine when one feels the need to control everyone and everything in the vicinity.
Money does indeed fail to buy happiness; it can buy “life satisfaction”—relative content with one’s life—but real positive feelings depend on “fulfillment of psychological needs: learning, autonomy, using one’s skills, respect, and the ability to count on others” (Diener et al. 2010). In other words, on social and control need satisfaction. Even the “life satisfaction” seems to be more about keeping up with the Joneses than about the pleasures of wealth, for rising incomes do not cause notable rises in it, unless one moves from genuine want to genuine comfort.
On the whole, most of us are content to hold even, but people find real meaning in more demanding activities. The old German formula for a good life, made famous by Sigmund Freud, is “work and love.” Most people who seem deeply satisfied with life do indeed get their satisfaction from these two things (see Frankl 1959, 1978). They also get their real social place from those two, and social place is far more basic and deeply important than happiness or satisfaction. Thus, even correcting someone at a task is often taken as a deadly rejection, and produces anger that often seems highly disproportionate to the scale of the correction. This is one of the reasons administrators often burn out.
A long-standing problem for western metapsychology has been the separation of mind and body. This separation is sometimes unfairly blamed on Descartes, but Plato was already fatally infected with it, and so has Christianity been, ever since its earliest centuries. Almost everyone condemns it today, but few have really considered an alternative, beyond pointing to neuropsychology—a relatively young field.
A female friend of mine, a cancer survivor, once said of her cancer: “I felt my body had let me down, and I was sort of mad at it.” The idea that one could think of one’s body as separate from one’s self, and the idea that said body could be an “it” without a gender, were both strange to me. I have since asked many people about this, and found that most women seem to feel that way; most men feel that they are in their bodies and that their bodies are emphatically male. But there are striking exceptions.
The best comments I have heard on the subject of self-embodiment come from friends and students in the world of dance; female dancers I know are thoroughly in their (gendered) bodies. The most thoughtful, extensive, and sophisticated commentary I have read on such matters is Raymond Gibbs’ Embodiment and Cognitive Science (2006). Gibbs constructs a whole psychology on the basis of knowing not only that the mind is in the physical brain, but also that the brain is constantly receiving enormous amounts of feedback from the body. Not only do we experience cold and heat, taste and smell, sound and light, but we also have massive and continual proprioceptive feedback. We know where we are in space, what muscles are moving, which are tense, which are relaxed, which are readying for action, and so on. We are aware of physical states caused by hormones, though most people are not aware of the hormonal basis of these states. The wonderful, warm, tender feeling of holding a child is caused in part by oxytocin release. Hormone specialists may think of this when they hold their kids, just as Buddhist meditators may be acutely conscious of heartbeat, digestion, and breathing, but most of us just go with the feeling.
Obviously, disembodied thought simply isn’t possible for humans. Even the sages of old often commented on the way indigestion colors cognition. More recently, we have learned that smiling makes you happier—in fact, holding a pencil between your teeth makes you happier, because it forces your lips into a smile-like shape! (See Lyubomirsky 2007.) Frowning, of course, has the opposite effect. It is extremely doubtful whether we could think at all without tides of hormones, neurotransmitters, and other chemical data coming from the 97% of our bodies that is not brain tissue.
Marcel Mauss (1979; the French original dates from the late 1930s) continued this line of thought by showing how thoroughly culture is embodied. How we walk, swim, dance, and even breathe is culturally conditioned. His modest talk started a whole field of anthropology, one that has produced some of the most exciting new material in recent years. Unfortunately, this work is outside my purview here (Bruno Latour, 2005, provides a wonderful follow-up and updating).
“Consciousness” is one of those infuriatingly ambiguous words that deserve better. It needs definition.
Never was I more convinced by a book than I was by Daniel Dennett’s Consciousness Explained (1991). I started it convinced (contra Dennett) that humans had a distinctive “consciousness” that machines could not approximate. I finished it completely convinced that Dennett was right: not only could one conceivably build a conscious robot (though no one has done it), but the whole concept of “consciousness” was in sorry shape.
Jerome Kagan has recently unpacked the word “consciousness,” listing four ways the word is used—roughly, sensory awareness, cognition (rational thought), active planning and execution, and the ability to use symbols and abstractions (Kagan 2006:123).
I would prefer to restrict the word to its proper medical use: being aware, as opposed to comatose. Consider the very frequent claim that only humans have true “consciousness” (Dennett 1991). Obviously this is ridiculous, given the normal use of the term. Even a philosopher can tell if her dog is awake, asleep, or in a coma.
Several vague meanings of “consciousness” have slipped in when people maintain that animals lack it. First, users of the word may still be deluded by the long-discredited Cartesian view that animals lack minds—Descartes was thinking of souls (in French, he wrote âmes). Of course, Descartes did not invent this idea; it comes from Plato and from Christian theology. Second, and more serious, “consciousness” has been used by some to mean the higher-order representations that humans constantly make. This would be reasonable if “consciousness” did not already have its ordinary meaning of “awake and aware,” but it does, so the term causes nothing but confusion when used for a different concept.
Another meaning of “consciousness” is “deliberately attentive.” We use this one when we say that so-and-so “is so busy with her work that she is totally unconscious of all around her.” Of course we don’t mean that she is literally unconscious; we know that if we called her name, or played some music she hates, she would immediately respond (with great annoyance in the second case!).
Then there is “being self-conscious.” Even this has two completely different meanings: either being conscious of self, or conscious of making a fool of said self.
Some claim that nonhuman animals are “self-aware” or “self-conscious” if they rub at colored spots placed on their foreheads when they look in a mirror; animals that fail this mirror test, the claim goes, are not “self-aware.” However, most mammals recognize “self” not by sight but by smell. Confront a dog with a mirror for the first time, and you will see the dog start at the strange “dog,” then sniff at it, and immediately lose all interest. Dogs recognize their self-scents on any objects they have touched, and they recognize the scents of their packmates. Conversely, take a crow or an elephant, which use sight more, and they do rub at colored spots on themselves, when they see them in mirrors.
So “consciousness” can mean “non-comatose state,” or “awake and aware,” or “deliberately attentive,” or “able to make simple plans,” or “able to make higher-order, complex plans,” or “aware of paint spots on one’s forehead.” Trimming this back leaves us with two well-established and very different meanings: (1) non-comatose; (2) directly, immediately aware on several levels. The second is too well established to drop, but is horribly vague. If I am driving while carrying on a conversation, most of my driving is being done at a subconscious, or even unconscious, level. My brain is working hard at it, but I am not “consciously” thinking about it. As traffic gradually thickens, I have to concentrate more and more on the driving, until eventually I have to stop talking. There was no one point at which I shifted from “subconscious” to “conscious” driving. I merely had to use more and more of my brain on the task. The same is true of monitoring conversations at a party, working in the yard, or doing anything of the sort.
Philosophers and behavior biologists must clean up their act on usage of this word. In the meantime, saying that only humans are or could be “conscious,” or similar philosophic tags based on vapid definitions, is ridiculous nonsense.
From the various social experiences they have, people construct selves. “Self,” in English at least, is another highly ambiguous concept. What does a person mean when she says “I’m not myself today” or “this is not the real me you’re seeing”? Who is she, then? And when someone says “I have to tell myself to argue less,” who is the “I” who is telling the self?
And how do we “make up our minds”? Who is it that makes up a mind? Does one make it up as one makes up a story, or as one makes up a bed?
One meaning of “self” is everything wrapped in my skin. Obviously, this is not the self implied in those familiar taglines, which seem very similar to the aforementioned comment of the woman whose body was an “it.” The “self” of the taglines appears to be the core of things that are the “real me,” as opposed to transient concerns like being hungry, or being nervous about a manuscript submission. The Buddhist idea of “self” is the exact opposite: the “self” is the accidental and trivial, while one’s deep core is something that transcends selfhood and can thus be reincarnated in another, different “self.”
Arthur Rimbaud said: “I is another” (je est un autre). This was not bad grammar: he saw that the self was different from some true inner being.
Psychologists have dealt variously with these matters. Every psychologist seems to have a different scheme. Antonio Damasio, for example, distinguished a proto-self (physical being), a core self (basic feelings), and an autobiographical self (conscious higher-level processes; Damasio 2000). All these neat schemes fail to satisfy. A normal human feels herself to be a single individual, not a bunch of selves doing different things. Yet, the same individual knows she is a “different person” in different contexts—working at her day job in a child care center or unwinding in a night club at 2 a.m., for instance. She is the same person, but she has many schemas for acting in particular places, and they can make her seem very different indeed. This paradoxical fact has led to thousands of pages of philosophical and psychological speculation.
Yet another meaning of “self” is the socially defined persona that develops through interaction, as opposed to the deep biological personhood that presumably exists underneath that. Evidently, “self” is merely a convenience-term for a lot of traits and concepts. There may be no unity in there—no phenomenologically unified individual. Philosophers have argued at vast length over this (notable is Derek Parfit’s brilliant Reasons and Persons, 1986). However, as practiced meditators know, concentration and dropping of internal barriers can bring up one’s total personhood. Because of this, I suspect that, even without meditation, a totality is always there, and always brought to every situation. The fact that some part of the totality is not immediately conscious of some other part of it does not make the totality much less real. If so, then “self” changes all the time.
However defined, the self is socially constructed, as George Herbert Mead pointed out long ago (Mead 1964). The folk theory of a deep biological personhood noted above appears to be deeply flawed. Even the core inside, the basic person that remains constant from childhood to old age, was originally constructed in the bosom of the family or comparable orientation group. The wider “self” that is the sum total of one’s fully developed personhood is far more obviously social. It is the result of a long series of interactions between its bearer and his or her “significant others.” (Mead coined that term, but he did not use it to mean “sex partners”; he used it to mean everybody we interact with, so long as they are important enough to us to affect us significantly.)
The cultural psychologists and cross-cultural psychologists have not failed to point out that the concept of “self” is extremely different in different societies, depending somewhat on experience and somewhat on cultural drift. Individuals within societies may differ widely as well. This makes the concept of self even more problematic. These psychologists talk of “self-schemas”: our ideas of what a self is and how selves differ. Perhaps such cultural ideas are real in a sense that the self is not.
Individuals are socially constructed. There is a simple test in social science that many of us have used in class. It consists of nothing but the question “What am I?” followed by 20 lines. The students are instructed to list 20 things that they are—their names not included.
They almost always start with kin terms: I am a son, father, wife, sister.
Next, if they are American, they list their occupation. If they are not American-born and raised, they usually say where they are from: French, Genevan, Roman, Shanghainese. Then the order reverses; Americans list where they are from in the third group of lines, others list their occupations.
If there is space left, they generally go on to hobbies: skier, wine-taster, videogamer, guitarist.
That usually fills 20 lines, but any normal person could go on for hundreds of lines: dog-owner, former circus-fan, maker of gourmet potato salad….
The point of this is to show you that you are social to the point of being socially constructed. By a social definition of self, your essential being is your sonship or daughtership, your job, your social place.
Emmanuel Levinas (e.g. 1969) built up a whole theology and philosophy from this perception. For him, we are made from interactions, and thus the others in our lives are literally infinitely important to us. Without them there would be no me. I would never have survived a day, let alone learned to talk or walk. We owe everything to each other, and Levinas’ God is in the interaction spaces.
The full slipperiness of the word “experience” emerges from reading Martin Jay’s Songs of Experience (2005). Jay focuses on uses of the word in a narrow universe: philosophical writings from Montaigne to Foucault. Most of his book focuses on the first two-thirds of the 20th century. Yet, this occupies him for over 400 pages, and this is just an overview. “Experience” has been used to justify fascism, Marxism, and everything in between. Many of the grave sages were seeking “pure” experience, whatever that is. But there is no such thing. Two people may see the same dog, but they will not think about it the same way. In fact, when I am thinking in Spanish or Maya, I do not see the same dog that I see in English. I am so used to the languages that perro makes me think of a grubby street-cur, and peek’ makes me think of a small, scruffy, lovable but dirty pet. Dog makes me think of my own big, cute mutts.
The connotations change with the language. The dogs themselves accommodate: mean when ill-treated, humble and gentle in Maya village conditions, or active and enthusiastic when spoiled with pet-store goodies.
A devotee of “pure” experience might focus on dogness across cultures, and recognize—correctly—that we really all see the same animals. Constant interaction with dogs shapes our experience more than culture and language do. Conversely, a radical culturalist would note that dog, perro, and peek’ are totally different and unrelated words, and conclude that culture is arbitrary and incommensurable. A cultural anthropologist like me will conclude that dogs are dogs but people see them rather differently, according to what they—the dogs and the people—have learned.
Time matters as much as culture. I see and experience dogs differently now from the ways I did 60 years ago. Anglo-American culture sees dogs differently over different generations. “Mind” and “thought,” as seen by different types of philosophers or psychologists, are even more different than cross-cultural “dogs.” Some deny the reality of experience because it is undefinable and cannot be “pure,” or they cultivate weird experiences simply to have them, or they discuss experience devoid of emotion (Jay 2005). Others say there is only language (or “text” or “discourse”)—forgetting, among other things, that we have developed music, dance, painting, and many other arts specifically to communicate the things we can’t say in words. As the dancer said when asked to explain her dance, “If I could explain it, I wouldn’t have to dance it.” (This folktale has been hung on every significant female dancer in modern history; no one seems to know who really started it.) Still other sages—notably of the Frankfurt school, according to Jay—tell us that experience today is a pale shadow of what it once was. I somehow doubt that my experience of sex, or good wine, or a back rub is all that much less real than that of a medieval peasant or a Pleistocene gatherer.
Philosophers everywhere hate the fact that we cannot have perfect knowledge, pure experience, and absolute truth. Anthropologists, however, frequently revel in the fact that our knowledge is ultimately incomplete and is shaped by our cultural and personal “experience.” One would blame these “pure experience” worries on Plato if one did not know that Chinese Neo-Confucians, Hindu Vedantins, and others around the world have the selfsame issue. I think philosophers must be drinkers of vodka instead of red wine. They want the pure distilled experience, shorn of all taste, smell, color, texture, and indeed of anything except the gift of rapid passing-out. A rich, complex, multilayered, multi-textured, subtle drink is not for them. Especially since wine takes longer to produce oblivion.
Mystics have always held that the human mind can know ultimate truth. As the Quakers put it, people have an Inner Light: People can contact some Higher Power or inner awareness of higher powers, to get a unitary vision of the Good. The philosophers’ “truth, beauty and goodness” come together in one thing. This may be revelation, or Dao, or hozhoo (the Navaho concept that unites truth, beauty, morality, and health in one word). Whatever may be real beyond the brain, there is most certainly something within the brain that makes us aware of, and sensitive to, Plato’s philosophic trinity of “truth, beauty, and goodness.”
This raises the question of whether people are innately “good” or “bad.” Certainly they are innately sociable, desiring harmony, pleasantness, and other social goods. Certainly they are also defensive and sometimes irresponsible or selfish. The Chinese have had a longstanding dialogue about this, since Mencius took the positive view (Mencius 1970) and Xunzi (1999) the negative one in the 4th century BC. The evidence suggested to the Chinese, and to us today, that Mencius was right: humans are basically sociable, with the unpleasant aspects of the human condition being defensive reactions to social threats and slights and to social insecurity generally. This still leaves Xunzi a large opening for his arguments, and he is by no means dismissed today. The Bible gave us a world that moved from original sin to Cain slaying Abel. Theorists from Xunzi to Hobbes (1950) in 17th-century England saw humans as being in a state of “warre” (as Hobbes put it) because of the natural human tendency toward violence and noncooperation. Nietzsche, Freud, and Foucault have kept this theory alive.