Tag Archives: psychology

Boost Your Brainpower: Chew Gum

So you wish to boost your brain function? Well, forget the folate, B vitamins, omega-3 fatty acids, ginkgo biloba, and the countless array of other supplements. Researchers have found that chewing gum boosts cognitive performance. However, while gum chewers perform significantly better on a battery of psychological tests, the boost is fleeting, lasting on average only for the first 20 minutes of testing.

[div class=attrib]From Wired:[end-div]

Why do people chew gum? If an anthropologist from Mars ever visited a typical supermarket, they’d be confounded by those shelves near the checkout aisle that display dozens of flavored gum options. Chewing without eating seems like such a ridiculous habit, the oral equivalent of running on a treadmill. And yet, people have been chewing gum for thousands of years, ever since the ancient Greeks began popping wads of mastic tree resin in their mouth to sweeten the breath. Socrates probably chewed gum.

It turns out there’s an excellent rationale for this long-standing cultural habit: Gum is an effective booster of mental performance, conferring all sorts of benefits without any side effects. The latest investigation of gum chewing comes from a team of psychologists at St. Lawrence University. The experiment went like this: 159 students were given a battery of demanding cognitive tasks, such as repeating random numbers backward and solving difficult logic puzzles. Half of the subjects chewed gum (sugar-free and sugar-added) while the other half were given nothing. Here’s where things get peculiar: Those randomly assigned to the gum-chewing condition significantly outperformed those in the control condition on five out of six tests. (The one exception was verbal fluency, in which subjects were asked to name as many words as possible from a given category, such as “animals.”) The sugar content of the gum had no effect on test performance.

While previous studies achieved similar results — chewing gum is often a better test aid than caffeine — this latest research investigated the time course of the gum advantage. It turns out to be rather short lived, as gum chewers only showed an increase in performance during the first 20 minutes of testing. After that, they performed identically to non-chewers.

What’s responsible for this mental boost? Nobody really knows. It doesn’t appear to depend on glucose, since sugar-free gum generated the same benefits. Instead, the researchers propose that gum enhances performance due to “mastication-induced arousal.” The act of chewing, in other words, wakes us up, ensuring that we are fully focused on the task at hand.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Chewing gum tree, Mexico D.F. Courtesy of mexicolore.[end-div]

Book Review: Thinking, Fast and Slow. Daniel Kahneman

Daniel Kahneman brings together for the first time his decades of groundbreaking research and profound thinking in social psychology and cognitive science in his new book, Thinking, Fast and Slow. He presents his current understanding of judgment and decision making and offers insight into how we make choices in our daily lives. Importantly, Kahneman describes how we can identify and overcome the cognitive biases that frequently lead us astray. This is an important work by one of our leading thinkers.

[div class=attrib]From Skeptic:[end-div]

The ideas of the Princeton University psychologist Daniel Kahneman, recipient of the Nobel Prize in Economic Sciences for his seminal work that challenged the rational model of judgment and decision making, have had a profound and widely recognized impact on psychology, economics, business, law and philosophy. Until now, however, he has never brought together his many years of research and thinking in one book. In the highly anticipated Thinking, Fast and Slow, Kahneman introduces the “machinery of the mind.” Two systems drive the way we think and make choices: System One is fast, intuitive, and emotional; System Two is slower, more deliberative, and more logical. Examining how both systems function within the mind, Kahneman exposes the extraordinary capabilities and also the faults and biases of fast thinking, and the pervasive influence of intuitive impressions on our thoughts and our choices. Kahneman shows where we can trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and personal lives, and how we can guard against the mental glitches that often get us into trouble. Kahneman will change the way you think about thinking.

[div class=attrib]Image: Thinking, Fast and Slow, Daniel Kahneman. Courtesy of Publishers Weekly.[end-div]

A Medical Metaphor for Climate Risk

While scientific evidence of climate change continues to mount, and an increasing number of studies point causal fingers at ourselves, there is perhaps another way to visualize the risk of inaction or over-reaction. Since most people can leave ideology aside when it comes to their own health, a medical metaphor, courtesy of Andrew Revkin over at Dot Earth, may help broaden acceptance of the message.

[div class=attrib]From the New York Times:[end-div]

Paul C. Stern, the director of the National Research Council committee on the human dimensions of global change, has been involved in a decades-long string of studies of behavior, climate change and energy choices.

This is an arena that is often attacked by foes of cuts in greenhouse gases, who see signs of mind control and propaganda. Stern says that has nothing to do with his approach, as he made clear in “Contributions of Psychology to Limiting Climate Change,” a paper that was part of a special issue of the journal American Psychologist on climate change and behavior:

Psychological contributions to limiting climate change will come not from trying to change people’s attitudes, but by helping to make low-carbon technologies more attractive and user-friendly, economic incentives more transparent and easier to use, and information more actionable and relevant to the people who need it.

The special issue of the journal builds on a 2009 report on climate and behavior from the American Psychological Association that was covered here. Stern has now offered a reaction to the discussion last week of Princeton researcher Robert Socolow’s call for a fresh approach to climate policy that acknowledges “the news about climate change is unwelcome, that today’s climate science is incomplete, and that every ‘solution’ carries risk.” Stern’s response, centered on a medical metaphor (not the first), is worth posting as a “Your Dot” contribution. You can find my reaction to his idea below. Here’s Stern’s piece:

I agree with Robert Socolow that scientists could do better at encouraging a high quality of discussion about climate change.

But providing better technical descriptions will not help most people because they do not follow that level of detail.  Psychological research shows that people often use simple, familiar mental models as analogies for complex phenomena.  It will help people think through climate choices to have a mental model that is familiar and evocative and that also neatly encapsulates Socolow’s points that the news is unwelcome, that science is incomplete, and that some solutions are dangerous. There is such a model.

Too many people think of climate science as an exact science like astronomy that can make highly confident predictions, such as about lunar eclipses.  That model misrepresents the science, does poorly at making Socolow’s points, and has provided an opening for commentators and bloggers seeking to use any scientific disagreement to discredit the whole body of knowledge.

A mental model from medical science might work better.  In the analogy, the planet is a patient suspected of having a serious, progressive disease (anthropogenic climate change).  The symptoms are not obvious, just as they are not with diabetes or hypertension, but the disease may nevertheless be serious.  Humans, as guardians of the planet, must decide what to do.  Scientists are in the role of physician.  The guardians have been asking the physicians about the diagnosis (is this disease present?), the nature of the disease, its prognosis if untreated, and the treatment options, including possible side effects.  The medical analogy helps clarify the kinds of errors that are possible and can help people better appreciate how science can help and think through policy choices.

Diagnosis. A physician must be careful to avoid two errors:  misdiagnosing the patient with a dread disease that is not present, and misdiagnosing a seriously ill patient as healthy.  To avoid these types of error, physicians often run diagnostic tests or observe the patient over a period of time before recommending a course of treatment.  Scientists have been doing this with Earth’s climate at least since 1959, when strong signs of illness were reported from observations in Hawaii.

Scientists now have high confidence that the patient has the disease.  We know the causes:  fossil fuel consumption, certain land cover changes, and a few other physical processes. We know that the disease produces a complex syndrome of symptoms involving change in many planetary systems (temperature, precipitation, sea level and acidity balance, ecological regimes, etc.).  The patient is showing more and more of the syndrome, and although we cannot be sure that each particular symptom is due to climate change rather than some other cause, the combined evidence justifies strong confidence that the syndrome is present.

Prognosis. Fundamental scientific principles tell us that the disease is progressive and very hard to reverse.  Observations tell us that the processes that cause it have been increasing, as have the symptoms.  Without treatment, they will get worse.  However, because this is an extremely rare disease (in fact, the first known case), there is uncertainty about how fast it will progress.  The prognosis could be catastrophic, but we cannot assign a firm probability to the worst outcomes, and we are not even sure what the most likely outcome is.  We want to avoid either seriously underestimating or overestimating the seriousness of the prognosis.

Treatment. We want treatments that improve the patient’s chances at low cost and with limited adverse side effects and we want to avoid “cures” that might be worse than the disease.  We want to consider the chances of improvement for each treatment, and its side effects, in addition to the untreated prognosis.  We want to avoid the dangers both of under-treatment and of side effects.  We know that some treatments (the ones limiting climate change) get at the causes and could alleviate all the symptoms if taken soon enough.  But reducing the use of fossil fuels quickly could be painful.  Other treatments, called adaptations, offer only symptomatic relief.  These make sense because even with strong medicine for limiting climate change, the disease will get worse before it gets better.

Choices. There are no risk-free choices.  We know that the longer treatment is postponed, the more painful it will be, and the worse the prognosis.  We can also use an iterative treatment approach (as Socolow proposed), starting some treatments and monitoring their effects and side effects before raising the dose.  People will disagree about the right course of treatment, but thinking about the choices in this way might give the disagreements the appropriate focus.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image courtesy of Stephen Wilkes for The New York Times.[end-div]

Misconceptions of Violence

We live in violent times. Or do we?

Despite the seemingly constant flow of human-engineered destruction visited on our fellow humans, on other species, and on our precious environment, some thoughtful analysis — beyond the headlines of cable news — shows that all may not be lost to our violent nature. An insightful interview with psychologist Steven Pinker, author of “How the Mind Works”, shows us that contemporary humans are not as bad as we may have thought. His latest book, “The Better Angels of Our Nature: Why Violence Has Declined,” analyzes the basis and history of human violence. Perhaps surprisingly, Pinker suggests that we live in remarkably peaceful times, comparatively speaking. Characteristically, he backs up his claims with clear historical evidence.

[div class=attrib]From Gareth Cook for Mind Matters:[end-div]

COOK: What would you say is the biggest misconception people have about violence?
PINKER: That we are living in a violent age. The statistics suggest that this may be the most peaceable time in our species’s existence.

COOK: Can you give a sense for how violent life was 500 or 1000 years ago?
PINKER: Statistics aside, accounts of daily life in medieval and early modern Europe reveal a society soaked in blood and gore. Medieval knights—whom today we would call warlords—fought their numerous private wars with a single strategy: kill as many of the opposing knight’s peasants as possible. Religious instruction included prurient descriptions of how the saints of both sexes were tortured and mutilated in ingenious ways. Corpses broken on the wheel, hanging from gibbets, or rotting in iron cages where the sinner had been left to die of exposure and starvation were a common part of the landscape. For entertainment, one could nail a cat to a post and try to head-butt it to death, or watch a political prisoner get drawn and quartered, which is to say partly strangled, disemboweled, and castrated before being decapitated. So many people had their noses cut off in private disputes that medical textbooks had procedures that were alleged to grow them back.

COOK: How has neuroscience contributed to our understanding of violence and its origins?
PINKER: Neuroscientists have long known that aggression in animals is not a unitary phenomenon driven by a single hormone or center. When they stimulate one part of the brain of a cat, it will lunge for the experimenter in a hissing, fangs-out rage; when they stimulate another, it will silently stalk a hallucinatory mouse. Still another circuit primes a male cat for a hostile confrontation with another male. Similar systems for rage, predatory seeking, and male-male aggression may be found in Homo sapiens, together with uniquely human, cognitively-driven  systems of aggression such as political and religious ideologies and moralistic punishment. Today, even the uniquely human systems can be investigated using functional neuroimaging. So neuroscience has given us the crucial starting point in understanding violence, namely that it is not a single thing. And it has helped us to discover biologically realistic taxonomies of the major motives for violence.

COOK: Is the general trend toward less violence going to continue in the future?
PINKER: It depends. In the arena of custom and institutional practices, it’s a good bet. I suspect that violence against women, the criminalization of homosexuality, the use of capital punishment, the callous treatment of animals on farms, corporal punishment of children, and other violent social practices will continue to decline, based on the fact that worldwide moralistic shaming movements in the past (such as those against slavery, whaling, piracy, and punitive torture) have been effective over long stretches of time. I also don’t expect war between developed countries to make a comeback any time soon. But civil wars, terrorist acts, government repression, and genocides in backward parts of the world are simply too capricious to allow predictions. With six billion people in the world, there’s no predicting what some cunning fanatic or narcissistic despot might do.

[div class=attrib]Read more of the interview here.[end-div]

[div class=attrib]Image courtesy of Scientific American.[end-div]

All Power Corrupts

[div class=attrib]From the Economist:[end-div]

DURING the second world war a new term of abuse entered the English language. To call someone “a little Hitler” meant he was a menial functionary who employed what power he had in order to annoy and frustrate others for his own gratification. From nightclub bouncers to the squaddies at Abu Ghraib prison who tormented their prisoners for fun, little Hitlers plague the world. The phenomenon has not, though, hitherto been subject to scientific investigation.

Nathanael Fast of the University of Southern California has changed that. He observed that lots of psychological experiments have been done on the effects of status and lots on the effects of power. But few, if any, have been done on both combined. He and his colleagues Nir Halevy of Stanford University and Adam Galinsky of Northwestern University, in Chicago, set out to correct this. In particular they wanted to see if it is circumstances that create little Hitlers or, rather, whether people of that type simply gravitate into jobs which allow them to behave badly. Their results have just been published in the Journal of Experimental Social Psychology.

Dr Fast’s experiment randomly assigned each of 213 participants to one of four situations that manipulated their status and power. All participants were informed that they were taking part in a study on virtual organisations and would be interacting with, but not meeting, a fellow student who worked in the same fictional consulting firm. Participants were then assigned either the role of “idea producer”, a job that entailed generating and working with important ideas, or of “worker”, a job that involved menial tasks like checking for typos. A post-experiment questionnaire demonstrated that participants did, as might be expected, look upon the role of idea producer with respect and admiration. Equally unsurprisingly, they looked down on the role of worker.

Participants who had both status and power did not greatly demean their partners. They chose an average of 0.67 demeaning activities for those partners to perform. Low-power/low-status and low-power/high-status participants behaved similarly. They chose, on average, 0.67 and 0.85 demeaning activities. However, participants who were low in status but high in power—the classic “little Hitler” combination—chose an average of 1.12 deeply demeaning tasks for their partners to engage in. That was a highly statistically significant distinction.

Of course, not everybody in the high-power/low-status quadrant of the experiment behaved badly. Underlying personality may still have a role. But as with previous experiments in which random members of the public have been asked to play prison guard or interrogator, Dr Fast’s result suggests that many quite ordinary people will succumb to bad behaviour if the circumstances are right.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image courtesy of the Economist / Getty Images.[end-div]

Complex Decision To Make? Go With the Gut

Over the last couple of years a number of researchers have upended conventional wisdom by finding that complex decisions (those involving many variables) are better “made” through our emotional system. This flies in the face of the commonly held belief that complexity is best handled by our rational side.

[div class=attrib]Jonah Lehrer over at the Frontal Cortex brings us up to date on current thinking.[end-div]

We live in a world filled with difficult decisions. In fact, we’ve managed to turn even trivial choices – say, picking a toothpaste – into a tortured mental task, as the typical supermarket has more than 200 different dental cleaning options. Should I choose a toothpaste based on fluoride content? Do I need a whitener in my toothpaste? Is Crest different than Colgate? The end result is that the banal selection becomes cognitively demanding, as I have to assess dozens of alternatives and take an array of variables into account. And it’s not just toothpaste: The same thing has happened to nearly every consumption decision, from bottled water to blue jeans to stocks. There are no simple choices left – capitalism makes everything complicated.

How should we make all these hard choices? How does one navigate a world of seemingly infinite alternatives? For thousands of years, the answer has seemed obvious: when faced with a difficult dilemma, we should carefully assess our options and spend a few moments consciously deliberating the information. Then, we should choose the toothpaste that best fits our preferences. This is how we maximize utility and get the most bang for the buck. We are rational agents – we should make decisions in a rational manner.

But what if rationality backfires? What if we make better decisions when we trust our gut instincts? While there is an extensive literature on the potential wisdom of human emotion, it’s only in the last few years that researchers have demonstrated that the emotional system (aka Type 1 thinking) might excel at complex decisions, or those involving lots of variables. If true, this would suggest that the unconscious is better suited for difficult cognitive tasks than the conscious brain, that the very thought process we’ve long disregarded as irrational and impulsive might actually be “smarter” than reasoned deliberation. This is largely because the unconscious is able to handle a surfeit of information, digesting the facts without getting overwhelmed. (Human reason, in contrast, has a very strict bottleneck and can only process about four bits of data at any given moment.) When confused in the toothpaste aisle, bewildered by all the different options, we should go with the product that feels the best.

The most widely cited demonstration of this theory is a 2006 Science paper led by Ap Dijksterhuis. (I wrote about the research in How We Decide.) The experiment went like this: Dijksterhuis got together a group of Dutch car shoppers and gave them descriptions of four different used cars. Each of the cars was rated in four different categories, for a total of sixteen pieces of information. Car number 1, for example, was described as getting good mileage, but had a shoddy transmission and poor sound system. Car number 2 handled poorly, but had lots of legroom. Dijksterhuis designed the experiment so that one car was objectively ideal, with “predominantly positive aspects”. After showing people these car ratings, Dijksterhuis then gave them a few minutes to consciously contemplate their decision. In this “easy” situation, more than fifty percent of the subjects ended up choosing the best car.

[div class=attrib]Read more of the article and Ap Dijksterhuis’ classic experiment here.[end-div]

[div class=attrib]Image courtesy of CustomerSpeak.[end-div]

The Teen Brain: Work In Progress or Adaptive Network?

[div class=attrib]From Wired:[end-div]

Ever since the late 1990s, when researchers discovered that the human brain takes until our mid-20s to fully develop — far longer than previously thought — the teen brain has been getting a bad rap. Teens, the emerging dominant narrative insisted, were “works in progress” whose “immature brains” left them in a state “akin to mental retardation” — all titles from prominent papers or articles about this long developmental arc.

In a National Geographic feature to be published next week, however, I highlight a different take: A growing view among researchers that this prolonged developmental arc is less a matter of delayed development than prolonged flexibility. This account of the adolescent brain — call it the “adaptive adolescent” meme rather than the “immature brain” meme — “casts the teen less as a rough work than as an exquisitely sensitive, highly adaptive creature wired almost perfectly for the job of moving from the safety of home into the complicated world outside.” The teen brain, in short, is not dysfunctional; it’s adaptive.

Carl Zimmer over at Discover gives us some further interesting insights into recent studies of teen behavior.

[div class=attrib]From Discover:[end-div]

Teenagers are a puzzle, and not just to their parents. When kids pass from childhood to adolescence their mortality rate doubles, despite the fact that teenagers are stronger and faster than children as well as more resistant to disease. Parents and scientists alike abound with explanations. It is tempting to put it down to plain stupidity: Teenagers have not yet learned how to make good choices. But that is simply not true. Psychologists have found that teenagers are about as adept as adults at recognizing the risks of dangerous behavior. Something else is at work.

Scientists are finally figuring out what that “something” is. Our brains have networks of neurons that weigh the costs and benefits of potential actions. Together these networks calculate how valuable things are and how far we’ll go to get them, making judgments in hundredths of a second, far from our conscious awareness. Recent research reveals that teen brains go awry because they weigh those consequences in peculiar ways.

… Neuroscientist B. J. Casey and her colleagues at the Sackler Institute of the Weill Cornell Medical College believe the unique way adolescents place value on things can be explained by a biological oddity. Within our reward circuitry we have two separate systems, one for calculating the value of rewards and another for assessing the risks involved in getting them. And they don’t always work together very well.

… The trouble with teens, Casey suspects, is that they fall into a neurological gap. The rush of hormones at puberty helps drive the reward-system network toward maturity, but those hormones do nothing to speed up the cognitive control network. Instead, cognitive control slowly matures through childhood, adolescence, and into early adulthood. Until it catches up, teenagers are stuck with strong responses to rewards without much of a compensating response to the associated risks.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Kitra Cahana, National Geographic.[end-div]

Free Will: An Illusion?

Neuroscientists continue to find interesting experimental evidence that we do not have free will. Many philosophers continue to dispute this notion, citing inconclusive results and brain scientists’ still-incomplete understanding of decision-making. An article by Kerri Smith over at Nature lays open this contentious and fascinating debate.

[div class=attrib]From Nature:[end-div]

The experiment helped to change John-Dylan Haynes’s outlook on life. In 2007, Haynes, a neuroscientist at the Bernstein Center for Computational Neuroscience in Berlin, put people into a brain scanner in which a display screen flashed a succession of random letters [1]. He told them to press a button with either their right or left index fingers whenever they felt the urge, and to remember the letter that was showing on the screen when they made the decision. The experiment used functional magnetic resonance imaging (fMRI) to reveal brain activity in real time as the volunteers chose to use their right or left hands. The results were quite a surprise.

“The first thought we had was ‘we have to check if this is real’,” says Haynes. “We came up with more sanity checks than I’ve ever seen in any other study before.”

The conscious decision to push the button was made about a second before the actual act, but the team discovered that a pattern of brain activity seemed to predict that decision by as many as seven seconds. Long before the subjects were even aware of making a choice, it seems, their brains had already decided.

As humans, we like to think that our decisions are under our conscious control — that we have free will. Philosophers have debated that concept for centuries, and now Haynes and other experimental neuroscientists are raising a new challenge. They argue that consciousness of a decision may be a mere biochemical afterthought, with no influence whatsoever on a person’s actions. According to this logic, they say, free will is an illusion. “We feel we choose, but we don’t,” says Patrick Haggard, a neuroscientist at University College London.

You may have thought you decided whether to have tea or coffee this morning, for example, but the decision may have been made long before you were aware of it. For Haynes, this is unsettling. “I’ll be very honest, I find it very difficult to deal with this,” he says. “How can I call a will ‘mine’ if I don’t even know when it occurred and what it has decided to do?”

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Nature.[end-div]

Reading Between the Lines

In his book, “The Secret Life of Pronouns”, professor of psychology James Pennebaker describes how our use of words like “I”, “we”, “he”, “she” and “who” reveals a wealth of detail about ourselves, including, quite surprisingly, our health and social status.

[div class=attrib]Excerpts from James Pennebaker’s interview with Scientific American:[end-div]

In the 1980s, my students and I discovered that if people were asked to write about emotional upheavals, their physical health improved. Apparently, putting emotional experiences into language changed the ways people thought about their upheavals. In an attempt to better understand the power of writing, we developed a computerized text analysis program to determine how language use might predict later health improvements.

Much to my surprise, I soon discovered that the ways people used pronouns in their essays predicted whose health would improve the most. Specifically, those people who benefited the most from writing changed in their pronoun use from one essay to another. Pronouns were reflecting people’s abilities to change perspective.

As I pondered these findings, I started looking at how people used pronouns in other texts — blogs, emails, speeches, class writing assignments, and natural conversation. Remarkably, how people used pronouns was correlated with almost everything I studied. For example, use of  first-person singular pronouns (I, me, my) was consistently related to gender, age, social class, honesty, status, personality, and much more.
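As a rough illustration of the kind of counting such a text analysis program does (a minimal sketch, not Pennebaker’s LIWC software; the word list and per-100-words normalization are my own simplifications), a first-person-singular pronoun rate can be computed like this:

```python
import re

# Illustrative category list for first-person singular pronouns ("I-words").
I_WORDS = {"i", "me", "my", "mine", "myself"}

def i_word_rate(text: str) -> float:
    """Return first-person-singular pronouns per 100 words of `text`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in I_WORDS)
    return 100.0 * hits / len(words)

sample = "I think my results surprised me, but we checked them together."
print(f"I-words per 100 words: {i_word_rate(sample):.1f}")
```

Real tools track dozens of word categories this way and correlate the resulting rates with outcomes such as health improvement, status, or honesty.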

… In my own work, we have analyzed the collected works of poets, playwrights, and novelists going back to the 1500s to see how their writing changed as they got older. We’ve compared the pronoun use of suicidal versus non-suicidal poets. Basically, poets who eventually commit suicide use I-words more than non-suicidal poets.

The analysis of language style can also serve as a psychological window into authors and their relationships. We have analyzed the poetry of Elizabeth Barrett and Robert Browning and compared it with the history of their marriage. Same thing with Ted Hughes and Sylvia Plath. Using a method we call Language Style Matching, we can isolate changes in the couples’ relationships.
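Language Style Matching is generally described as comparing how similarly two speakers use function words, category by category. The sketch below follows one published formulation of the LSM index (per-category similarity averaged across categories); the category names and rates are made-up numbers for illustration, not data from the analyses described above:

```python
def lsm_score(rates_a: dict, rates_b: dict) -> float:
    """Language Style Matching index for two speakers.

    Each input maps a function-word category to its usage rate (percent of
    total words). Per-category similarity is
    1 - |a - b| / (a + b + 0.0001), and LSM is the mean across categories.
    """
    cats = rates_a.keys() & rates_b.keys()
    sims = [
        1 - abs(rates_a[c] - rates_b[c]) / (rates_a[c] + rates_b[c] + 0.0001)
        for c in cats
    ]
    return sum(sims) / len(sims)

# Hypothetical function-word rates for two correspondents.
writer_a = {"personal_pronouns": 12.1, "articles": 6.5, "prepositions": 13.0}
writer_b = {"personal_pronouns": 10.4, "articles": 7.2, "prepositions": 12.1}
print(f"LSM: {lsm_score(writer_a, writer_b):.2f}")  # closer to 1.0 = more matched
```

Tracking such a score across a couple’s letters over time is what lets researchers see a relationship’s synchrony rise and fall.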

… One of the most interesting results was part of a study my students and I conducted dealing with status in email correspondence. Basically, we discovered that in any interaction, the person with the higher status uses I-words less (yes, less) than people who are low in status. The effects were quite robust and, naturally, I wanted to test this on myself. I always assumed that I was a warm, egalitarian kind of guy who treated people pretty much the same.

I was the same as everyone else. When undergraduates wrote me, their emails were littered with I, me, and my. My response, although quite friendly, was remarkably detached — hardly an I-word graced the page.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Images courtesy of University of Texas at Austin.[end-div]

There’s Weird and Then There’s WEIRD

[div class=attrib]From Neuroanthropology:[end-div]

The most recent edition of Behavioral and Brain Sciences carries a remarkable review article by Joseph Henrich, Steven J. Heine and Ara Norenzayan, ‘The weirdest people in the world?’ The article outlines two central propositions; first, that most behavioural science theory is built upon research that examines intensely a narrow sample of human variation (disproportionately US university undergraduates who are, as the authors write, Western, Educated, Industrialized, Rich, and Democratic, or ‘WEIRD’).

More controversially, the authors go on to argue that, where there is robust cross-cultural research, WEIRD subjects tend to be outliers on a range of measurable traits that do vary, including visual perception, sense of fairness, cooperation, spatial reasoning, and a host of other basic psychological traits. They don’t ignore universals – discussing them in several places – but they do highlight human variation and its implications for psychological theory.

As is the custom at BBS, the target article is accompanied by a large number of responses from scholars around the world, and then a synthetic reflection from the original target article authors to the many responses (in this case, 28). The total of the discussion weighs in at a hefty 75 pages, so it will take most readers (like me) a couple of days to digest the whole thing.

It’s my second time encountering the article, as I read a pre-print version and contemplated proposing a response, but, sadly, there was just too much I wanted to say, and not enough time in the calendar (conference organizing and the like dominating my life) for me to be able to pull it together. I regret not writing a rejoinder, but I can do so here with no limit on my space and the added advantage of seeing how other scholars responded to the article.

My one word review of the collection of target article and responses: AMEN!

Or maybe that should be, AAAAAAAMEEEEEN! {Sung by angelic voices.}

There’s a short version of the argument in Nature as well, but the longer version is well worth the read.

[div class=attrib]More from theSource here.[end-div]

Communicating Meaning in Cyberspace

Clarifying intent, emotion, wishes and meaning is a rather tricky and cumbersome process that we all navigate each day. Online in the digital world this is even more challenging, if not sometimes impossible. The pre-digital method of exchanging information in a social context would have been face-to-face. Such a method provides the full gamut of verbal and non-verbal dialogue between two or more parties. Importantly, it also provides a channel for the exchange of unconscious cues between people, which researchers are increasingly finding to be of critical importance during communication.

So, now replace the face-to-face interaction with email, texting, instant messaging, video chat, and other forms of digital communication and you have a new playground for researchers in cognitive and social sciences. The intriguing question for researchers, and all of us for that matter, is: how do we ensure our meaning, motivations and intent are expressed clearly through digital communications?

There are some partial answers over at Anthropology in Practice, which looks at how users of digital media express emotion, resolve ambiguity and communicate cross-culturally.

[div class=attrib]Anthropology in Practice:[end-div]

The ability to interpret social data is rooted in our theory of mind—our capacity to attribute mental states (beliefs, intents, desires, knowledge, etc.) to the self and to others. This cognitive development reflects some understanding of how other individuals relate to the world, allowing for the prediction of behaviors [1]. As social beings we require consistent and frequent confirmation of our social placement. This confirmation is vital to the preservation of our networks—we need to be able to gauge the state of our relationships with others.

Research has shown that children whose capacity to mentalize is diminished find other ways to successfully interpret nonverbal social and visual cues [2-6], suggesting that the capacity to mentalize is necessary to social life. Digitally-mediated communication, such as text messaging and instant messaging, does not readily permit social biofeedback. However, cyber communicators still find ways of conveying beliefs, desires, intent, deceit, and knowledge online, which may reflect an effort to preserve the capacity to mentalize in digital media.

The Challenges of Digitally-Mediated Communication

In its most basic form DMC is text-based, although the growth of video conferencing technology indicates DMC is still evolving. One of the biggest criticisms of DMC has been the lack of nonverbal cues, which are an important indicator of a speaker’s meaning, particularly when the message is ambiguous.

Email communicators are all too familiar with this issue. After all, in speech the same statement can have multiple meanings depending on tone, expression, emphasis, inflection, and gesture. Speech conveys not only what is said, but how it is said—and consequently, reveals a bit of the speaker’s mind to interested parties. In a plain-text environment like email only the typist knows whether a statement should be read with sarcasm.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

And You Thought Being Direct and Precise Was Good

A new psychological study upends our understanding of the benefits of direct and precise information as a motivational tool. Results from the study by Himanshu Mishra and Baba Shiv describe the cognitive benefits of vague and imprecise feedback over precise information. At first glance this seems counter-intuitive. After all, fuzzy math, blurred reasoning and unclear directives would seem to be the banes of current societal norms that value data in as precise a form as possible. We measure, calibrate, verify, re-measure and report information to the nth degree.

[div class=attrib]Stanford Business:[end-div]

Want to lose weight in 2011? You’ve got a better chance of pulling it off if you tell yourself, “I’d like to slim down and maybe lose somewhere between 5 and 15 pounds this year” instead of, “I’d like to lose 12 pounds by July 4.”

In a paper to be published in an upcoming issue of the journal Psychological Science, business school Professor Baba Shiv concludes that people are more likely to stay motivated and achieve a goal if it’s sketched out in vague terms than if it’s set in stone as a rigid or precise plan.

“For one to be successful, one needs to be motivated,” says Shiv, the Stanford Graduate School of Business Sanwa Bank, Limited, Professor of Marketing. He is coauthor of the paper “In Praise of Vagueness: Malleability of Vague Information as a Performance Booster” with Himanshu Mishra and Arul Mishra, both of the University of Utah. Presenting information in a vague way — for instance using numerical ranges or qualitative descriptions — “allows you to sample from the information that’s in your favor,” says Shiv, whose research includes studying people’s responses to incentives. “You’re sampling and can pick the part you want,” the part that seems achievable or encourages you to keep your expectations upbeat to stay on track, says Shiv.

By comparison, information presented in a more-precise form doesn’t let you view it in a rosy light and so can be discouraging. For instance, Shiv says, a coach could try to motivate a sprinter by reviewing all her past times, recorded down to the thousandths of a second. That would remind her of her good times but also the poor ones, potentially de-motivating her. Or, the coach could give the athlete less-precise but still-accurate qualitative information. “Good coaches get people not to focus on the times but on a dimension that is malleable,” says Shiv. “They’ll say, ‘You’re mentally tough.’ You can’t measure that.” The runner can then zero in on her mental strength to help her concentrate on her best past performances, boosting her motivation and ultimately improving her times. “She’s cherry-picking her memories, and that’s okay, because that’s allowing her to get motivated,” says Shiv.

Of course, Shiv isn’t saying there’s no place for precise information. A pilot needs exact data to monitor a plane’s location, direction, and fuel levels, for instance. But information meant to motivate is different, and people seeking motivation need the chance to focus on just the positive. When it comes to motivation, Shiv said, “negative information outweighs positive. If I give you five pieces of negative information and five pieces of positive information, the brain weighs the negative far more than the positive … It’s a survival mechanism. The brain weighs the negative to keep us secure.”

[div class=attrib]More from theSource here.[end-div]


Art Makes Your Body Tingle

The next time you wander through an art gallery and feel lightheaded after seeing a Monroe silkscreen by Warhol, or feel reflective and soothed by a scene from Monet’s garden, you’ll be in good company. New research shows that the body reacts to art, not just our grey matter.

The study by Wolfgang Tschacher and colleagues, and published by the American Psychological Association, found that:

. . . physiological responses during perception of an artwork were significantly related to aesthetic-emotional experiencing. The dimensions “Aesthetic Quality,” “Surprise/Humor,” “Dominance,” and “Curatorial Quality” were associated with cardiac measures (heart rate variability, heart rate level) and skin conductance variability.

In other words, art makes your pulse race, your skin perspire and your body tingle.

[div class=attrib]From Miller-McCune:[end-div]

Art exhibits are not generally thought of as opportunities to get our pulses racing and skin tingling. But newly published research suggests aesthetic appreciation is, in fact, a full-body experience.

Three hundred and seventy-three visitors to a Swiss museum agreed to wear special gloves measuring four physiological responses as they strolled through an art exhibit. Researchers found an association between the gallery-goers’ reported responses to the artworks and three of the four measurements of bodily stimulation.

“Our findings suggest that an idiosyncratically human property — finding aesthetic pleasure in viewing artistic artifacts — is linked to biological markers,” researchers led by psychologist Wolfgang Tschacher of the University of Bern, Switzerland, write in the journal Psychology of Aesthetics, Creativity and the Arts.

Their study, the first of its kind conducted in an actual art gallery, provides evidence for what Tschacher and his colleagues call “the embodiment of aesthetics.”

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

Bad Reasoning About Reasoning

[div class=attrib]By Massimo Pigliucci at Rationally Speaking:[end-div]

A recent paper on the evolutionary psychology of reasoning has made mainstream news, with extensive coverage by the New York Times, among others. Too bad the “research” is badly flawed, and the lesson drawn by Patricia Cohen’s commentary in the Times is precisely the wrong one.

Readers of this blog and listeners to our podcast know very well that I tend to be pretty skeptical of evolutionary psychology in general. The reason isn’t because there is anything inherently wrong about thinking that (some) human behavioral traits evolved in response to natural selection. That’s just an uncontroversial consequence of standard evolutionary theory. The devil, rather, is in the details: it is next to impossible to test specific evopsych hypotheses because the crucial data are often missing. The fossil record hardly helps (if we are talking about behavior), there are precious few closely related species for comparison (and they are not at all that closely related), and the current ecological-social environment is very different from the “ERE,” the Evolutionarily Relevant Environment (which means that measuring selection on a given trait in today’s humans is pretty much irrelevant).

That said, I was curious about Hugo Mercier and Dan Sperber’s paper, “Why do humans reason? Arguments for an argumentative theory,” published in Behavioral and Brain Sciences (volume 34, pp. 57-111, 2011), which is accompanied by an extensive peer commentary. My curiosity was piqued in particular because of the Times’ headline from the June 14 article: “Reason Seen More as Weapon Than Path to Truth.” Oh crap, I thought.

Mercier and Sperber’s basic argument is that reason did not evolve to allow us to seek truth, but rather to win arguments with our fellow human beings. We are natural lawyers, not natural philosophers. This, according to them, explains why people are so bad at reasoning, for instance why we tend to fall for basic mistakes such as the well known confirmation bias — a tendency to seek evidence in favor of one’s position and discount contrary evidence that is well on display in politics and pseudoscience. (One could immediately raise the obvious “so what?” objection to all of this: language possibly evolved to coordinate hunting and gossip about your neighbor. That doesn’t mean we can’t take writing and speaking courses and dramatically improve on our given endowment, natural selection be damned.)

The first substantive thing to notice about the paper is that there isn’t a single new datum to back up the central hypothesis. It is one (long) argument in which the authors review well known cognitive science literature and simply apply evopsych speculation to it. If that’s the way to get into the New York Times, I better increase my speculation quotient.

[div class=attrib]More from theSource here.[end-div]

Hello Internet; Goodbye Memory

Imagine a world without books; you’d have to commit useful experiences, narratives and data to handwritten form and memory. Imagine a world without the internet and real-time search; you’d have to rely on a trusted expert or a printed dictionary to find answers to your questions. Imagine a world without the written word; you’d have to revert to memory and oral tradition to pass on meaningful life lessons and stories.

Technology is a wonderfully double-edged mechanism. It brings convenience. It helps in most aspects of our lives. Yet, it also brings fundamental cognitive change that brain scientists have only recently begun to fathom. Recent studies, including the one cited below from Columbia University explore this in detail.

[div class=attrib]From Technology Review:[end-div]

A study says that we rely on external tools, including the Internet, to augment our memory.

The flood of information available online with just a few clicks and finger-taps may be subtly changing the way we retain information, according to a new study. But this doesn’t mean we’re becoming less mentally agile or thoughtful, say the researchers involved. Instead, the change can be seen as a natural extension of the way we already rely upon social memory aids—like a friend who knows a particular subject inside out.

Researchers and writers have debated over how our growing reliance on Internet-connected computers may be changing our mental faculties. The constant assault of tweets and YouTube videos, the argument goes, might be making us more distracted and less thoughtful—in short, dumber. However, there is little empirical evidence of the Internet’s effects, particularly on memory.

Betsy Sparrow, assistant professor of psychology at Columbia University and lead author of the new study, put college students through a series of four experiments to explore this question.

One experiment involved participants reading and then typing out a series of statements, like “Rubber bands last longer when refrigerated,” on a computer. Half of the participants were told that their statements would be saved, and the other half were told they would be erased. Additionally, half of the people in each group were explicitly told to remember the statements they typed, while the other half were not. Participants who believed the statements would be erased were better at recalling them, regardless of whether they were told to remember them.

[div class=attrib]More from theSource here.[end-div]

The Good, the Bad and the Ugly – 40 years on

One of the most fascinating and (in)famous experiments in social psychology began in the bowels of Stanford University 40 years ago next month. The experiment was intended to evaluate how people react to being powerless. By its conclusion, however, it had become a broader look at role assignment and reactions to authority.

The Stanford Prison Experiment incarcerated male college student volunteers in a mock prison for six fateful days. Some of the students were selected to be prison guards; the remainder would be prisoners. The researchers, led by psychology professor Philip Zimbardo, encouraged the guards to think of themselves as actual guards in a real prison. What happened during those six days in “prison” is the stuff of social science legend. The results continue to shock psychologists to this day; many were not prepared for the outcome, which saw guards take their roles to the extreme, becoming authoritarian and mentally abusive, while prisoners became downtrodden and eventually rebellious. A whistle-blower eventually brought the experiment to an abrupt end (it was to have continued for two weeks).

Forty years on, researchers went back to interview professor Zimbardo and some of the participating guards and prisoners to probe how they feel about it now. Recollections from one of the guards appear below.

[div class=attrib]From Stanford Magazine:[end-div]

I was just looking for some summer work. I had a choice of doing this or working at a pizza parlor. I thought this would be an interesting and different way of finding summer employment.

The only person I knew going in was John Mark. He was another guard and wasn’t even on my shift. That was critical. If there were prisoners in there who knew me before they encountered me, then I never would have been able to pull off anything I did. The act that I put on—they would have seen through it immediately.

What came over me was not an accident. It was planned. I set out with a definite plan in mind, to try to force the action, force something to happen, so that the researchers would have something to work with. After all, what could they possibly learn from guys sitting around like it was a country club? So I consciously created this persona. I was in all kinds of drama productions in high school and college. It was something I was very familiar with: to take on another personality before you step out on the stage. I was kind of running my own experiment in there, by saying, “How far can I push these things and how much abuse will these people take before they say, ‘knock it off?'” But the other guards didn’t stop me. They seemed to join in. They were taking my lead. Not a single guard said, “I don’t think we should do this.”

The fact that I ramped up the intimidation and the mental abuse without any real sense as to whether I was hurting anybody— I definitely regret that. But in the long run, no one suffered any lasting damage. When the Abu Ghraib scandal broke, my first reaction was, this is so familiar to me. I knew exactly what was going on. I could picture myself in the middle of that and watching it spin out of control. When you have little or no supervision as to what you’re doing, and no one steps in and says, “Hey, you can’t do this”—things just keep escalating. You think, how can we top what we did yesterday? How do we do something even more outrageous? I felt a deep sense of familiarity with that whole situation.

Sometimes when people know about the experiment and then meet me, it’s like, My God, this guy’s a psycho! But everyone who knows me would just laugh at that.

[div class=attrib]More from theSource here.[end-div]

Book Review: The Psychopath Test. Jon Ronson

Hilarious and disturbing. I suspect Jon Ronson would strike a couple of checkmarks in the Hare PCL-R Checklist against my name for finding his latest work both hilarious and disturbing. Would this, perhaps, make me a psychopath?

Jon Ronson is the author of The Psychopath Test. The Hare PCL-R, named for its inventor, Canadian psychologist Bob Hare, is the gold standard for measuring the personality traits of psychopathy (officially known as Antisocial Personality Disorder).
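For context on how the checklist works: the PCL-R rates 20 traits, each scored 0, 1 or 2 by a trained clinician, for a maximum of 40, with 30 the conventional North American research cutoff. The snippet below only illustrates that arithmetic, with a few sample item labels and invented ratings; it is not a diagnostic instrument:

```python
# Illustration of PCL-R-style scoring arithmetic only; real scoring requires
# a trained clinician and the licensed instrument.
ratings = {
    "glibness / superficial charm": 1,
    "grandiose sense of self-worth": 2,
    "callousness / lack of empathy": 0,
    # ... the full checklist has 20 items, each rated 0, 1 or 2
}

total = sum(ratings.values())  # maximum possible score is 40 (20 items x 2)
CUTOFF = 30                    # common North American research cutoff
print(f"total = {total}; meets cutoff: {total >= CUTOFF}")
```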

Ronson’s book is a fascinating journey through the “madness industry,” covering psychiatrists, clinical psychologists, criminal scientists, criminal profilers, and of course their clients: patients, criminals and the “insane” at large. Fascinated by the psychopathic traits that the industry applied to the criminally insane, Ronson goes on to explore these behavioral and personality traits in the general population. And, perhaps unsurprisingly, he finds that a not-insignificant proportion of business leaders and others in positions of authority could be classified as “psychopaths” based on the standard PCL-R checklist.

Ronson’s stories are poignant. He tells us the tale of Tony, who feigned madness to avoid what he believed would have been a harsher prison sentence for a violent crime. Instead, Tony found himself in Broadmoor, a notorious maximum-security institution for the criminally insane. Twelve years on, Tony, still incarcerated, finds it impossible to convince anyone of his sanity, despite behaving quite normally. His doctors now admit that he was sane at the time of admission, but agree that he must have been nuts to feign insanity in the first place, and furthermore that only someone who is insane could behave so “sanely” while surrounded by the insane!

Tony’s story and the other characters that Ronson illuminates in this work are thoroughly memorable, especially Al Dunlap, the empathy-poor former CEO of Sunbeam — perhaps one of the high-functioning psychopaths who live in our midst. Peppered throughout Ronson’s interviews with madmen and madwomen are his perpetual anxiety and self-reflection; now versed in tools such as the PCL-R checklist, he wields considerable diagnostic power and insight. As a result, Ronson begins seeing “psychopaths” everywhere.

My only criticism of the book is that Jon Ronson should have made it 200 pages longer and focused much more on the “psychopathic” personalities that roam amongst us, not just those who live behind bars, and on the madness industry itself, now seemingly led by the major pharmaceutical companies.

Undiscovered

[div class=attrib]From Eurozine:[end-div]

Neurological and Darwinistic strands in the philosophy of consciousness see human beings as no more than our evolved brains. Avoiding naturalistic explanations of human beings’ fundamental difference from other animals requires openness to more expansive approaches, argues Raymond Tallis.

For several decades I have been arguing against what I call biologism. This is the idea, currently dominant within secular humanist circles, that humans are essentially animals (or at least much more beastly than has been hitherto thought) and that we need therefore to look to the biological sciences, and only there, to advance our understanding of human nature. As a result of my criticism of this position I have been accused of being a Cartesian dualist, who thinks that the mind is some kind of a ghost in the machinery of the brain. Worse, it has been suggested that I am opposed to Darwinism, to neuroscience or to science itself. Worst of all, some have suggested that I have a hidden religious agenda. For the record, I regard neuroscience (which was my own area of research) as one of the greatest monuments of the human intellect; I think Cartesian dualism is a lost cause; and I believe that Darwin’s theory is supported by overwhelming evidence. Nor do I have a hidden religious agenda: I am an atheist humanist. And this is in fact the reason why I have watched the rise of biologism with such dismay: it is a consequence of the widespread assumption that the only alternative to a supernatural understanding of human beings is a strictly naturalistic one that sees us as just another kind of beast and, ultimately, as being less conscious agents than pieces of matter stitched into the material world.

This is to do humanity a gross disservice, as I think we are so much more than gifted chimps. Unpacking the most “ordinary” moment of human life reveals knowledge, skills, emotions, intuitions, a sense of past and future and of an infinitely elaborated world, that are not to be found elsewhere in the living world.

Biologism has two strands: “Neuromania” and “Darwinitis”. Neuromania arises out of the belief that human consciousness is identical with neural activity in certain parts of the brain. It follows from this that the best way to investigate what we humans truly are, to understand the origins of our beliefs, our predispositions, our morality and even our aesthetic pleasures, will be to peer into the brains of human subjects using the latest scanning technology. This way we shall know what is really going on when we are having experiences, thinking thoughts, feeling emotions, remembering memories, making decisions, being wise or silly, breaking the law, falling in love and so on.

The other strand is Darwinitis, rooted in the belief that evolutionary theory not only explains the origin of the species H. sapiens – which it does, of course – but also explains humans as they are today; that people are at bottom the organisms forged by the processes of natural selection and nothing more.

[div class=attrib]More from theSource here.[end-div]

I Didn’t Sin—It Was My Brain

[div class=attrib]From Discover:[end-div]

Why does being bad feel so good? Pride, envy, greed, wrath, lust, gluttony, and sloth: It might sound like just one more episode of The Real Housewives of New Jersey, but this enduring formulation of the worst of human failures has inspired great art for thousands of years. In the 14th century Dante depicted ghoulish evildoers suffering for eternity in his masterpiece, The Divine Comedy. Medieval muralists put the fear of God into churchgoers with lurid scenarios of demons and devils. More recently George Balanchine choreographed their dance.

Today these transgressions are inspiring great science, too. New research is explaining where these behaviors come from and helping us understand why we continue to engage in them—and often celebrate them—even as we declare them to be evil. Techniques such as functional magnetic resonance imaging (fMRI), which highlights metabolically active areas of the brain, now allow neuroscientists to probe the biology behind bad intentions.

The most enjoyable sins engage the brain’s reward circuitry, including evolutionarily ancient regions such as the nucleus accumbens and hypothalamus; located deep in the brain, they provide us such fundamental feelings as pain, pleasure, reward, and punishment. More disagreeable forms of sin such as wrath and envy enlist the dorsal anterior cingulate cortex (dACC). This area, buried in the front of the brain, is often called the brain’s “conflict detector,” coming online when you are confronted with contradictory information, or even simply when you feel pain. The more social sins (pride, envy, lust, wrath) recruit the medial prefrontal cortex (mPFC), brain terrain just behind the forehead, which helps shape the awareness of self.

No understanding of temptation is complete without considering restraint, and neuroscience has begun to illuminate this process as well. As we struggle to resist, inhibitory cognitive control networks involving the front of the brain activate to squelch the impulse by tempering its appeal. Meanwhile, research suggests that regions such as the caudate—partly responsible for body movement and coordination—suppress the physical impulse. It seems to be the same whether you feel a spark of lechery, a surge of jealousy, or the sudden desire to pop somebody in the mouth: The two sides battle it out, the devilish reward system versus the angelic brain regions that hold us in check.

It might be too strong to claim that evolution has wired us for sin, but excessive indulgence in lust or greed could certainly put you ahead of your competitors. “Many of these sins you could think of as virtues taken to the extreme,” says Adam Safron, a research consultant at Northwestern University whose neuroimaging studies focus on sexual behavior. “From the perspective of natural selection, you want the organism to eat, to procreate, so you make them rewarding. But there’s a potential for that process to go beyond the bounds.”

[div class=attrib]More from theSource here.[end-div]