Monday, January 7, 2013

Reinforcement

This line really stuck with me recently: If you view the world as a benevolent place, your rapid-fire, reflexive response in a situation is more likely to spread that benevolence further.

It's from an opinion piece in the Los Angeles Times by Robert M. Sapolsky, titled "Human - for Better or Worse." Read the whole text here.
***********


"It's obviously hard to answer the question of what primordial humans were like, since we can't go back in time and study them. But another way of getting at this issue is to study people who must act in a primordial manner, having to make instant gut decisions. Do we tend to become more or less noble than usual when we must act on rapid intuition?

Light is shed on this in a recent study by David Rand and colleagues at Harvard, published in the prestigious journal Science, and the research is tragically relevant. The authors recruited volunteers to play one of those economic games in which individuals in a group are each given some hypothetical money; each person must decide whether to be cooperative and benefit the entire group, or to act selfishly and receive greater individual gain. A key part of the experiment was that the scientists altered how much time subjects had to decide whether to cooperate. And that made a difference. When people had to make a rapid decision based on their gut, levels of cooperation rose; give them time to reflect on the wisdom of their actions, and the opposite occurred.

Testing a new set of volunteers, the authors also manipulated how much respect subjects had for intuitive decision-making. Just before participating in the economics game, people had to either write a paragraph about a time that it had paid off to make a decision based on intuition rather than reflection, or a paragraph about a time when reflection turned out to be the best way to go. Bias people toward valuing quick, intuitive decision-making, and they acted more for the common good in the subsequent game. In contrast, bias people in the reflective direction, and "looking out for No. 1" comes more to the forefront — something the authors termed "calculated greed."

Naturally, not everyone behaved identically in response to these experimental manipulations. Where might differences come from? The authors asked participants a simple question: On a scale of 1 to 10, how much can you trust people whom you interact with in your daily life? And the more trusting subjects were, the more quick, intuitive thinking pushed them in the direction of cooperation. If you view the world as a benevolent place, your rapid-fire, reflexive response in a situation is more likely to spread that benevolence further.

Neuroscience has generated a trendy new subfield called "neuroeconomics," which examines how the brain makes economic decisions. The field's punch line is that we are not remotely the gleaming, logical machines of rationality that most economists proclaim; instead, we make decisions amid the swirl of our best and worst emotions. Neuroeconomics, in turn, has spawned the sub-subfield of "the neuroscience of moral judgment." Scientists such as Jonathan Haidt of New York University have shown that we frequently feel rather than think our way to moral judgments; in general, the more affective parts of our brains generate quick, intuitive, moral decisions ("I can't tell you why, but that is wrong, wrong, wrong"), while the more cognitive parts play catch-up milliseconds to years later to come up with logical rationales for our gut intuitions. Thus, it is obviously important to understand what leads intuitive decisions in the direction of acting for the common good."