Tag Archives: psychology

You’re Not In Control


Press a button, then something happens. Eat too much chocolate, then you feel great (and then put on weight). Step into the middle of a busy road, then you get hit by an oncoming car. Walk in the rain, then you get wet. Watch your favorite comedy show, then you laugh.

Every moment of our lives is filled with actions and consequences, causes and effects. Usually we have a good sense of what is likely to happen when we take a specific action. This sense of predictability smooths our lives and makes us feel in control.

But sometimes all is not what it seems. Take the buttons on some of the most actively used objects in our daily lives. Press the “close door” button on the elevator [or “lift” for my British readers], then the door closes, right? Press the “pedestrian crossing” button at the crosswalk [or “zebra crossing”], then the safe-to-cross signal blinks to life, right? Adjust the office thermostat, then you feel more comfortable, right?

Well, if you think that by pressing a button you are commanding the elevator door to close, or the crosswalk signal to flash, or the thermostat to change the office temperature, you’re probably wrong. You may feel in control, but actually you’re not. In many cases the button may serve no functional purpose; the systems just work automatically. But the button still offers a psychological purpose — a placebo-like effect. We are so conditioned to the notion that pressing a button yields an action, that we still feel in control even when the button does nothing beyond making an audible click.

From the NYT:

Pressing the door-close button on an elevator might make you feel better, but it will do nothing to hasten your trip.

Karen W. Penafiel, executive director of National Elevator Industry Inc., a trade group, said the close-door feature faded into obsolescence a few years after the enactment of the Americans With Disabilities Act in 1990.

The legislation required that elevator doors remain open long enough for anyone who uses crutches, a cane or wheelchair to get on board, Ms. Penafiel said in an interview on Tuesday. “The riding public would not be able to make those doors close any faster,” she said.

The buttons can be operated by firefighters and maintenance workers who have the proper keys or codes.

No figures were available for the number of elevators still in operation with functioning door-close buttons. Given that the estimated useful life of an elevator is 25 years, it is likely that most elevators in service today have been modernized or refurbished, rendering the door-close buttons a thing of the past for riders, Ms. Penafiel said.

Read the entire story here.

Image: Elevator control panel, cropped to show only dual “door open” and “door close” buttons. Courtesy: Nils R. Barth. Wikipedia. Creative Commons CC0 1.0 Universal Public Domain Dedication.

Thoughts As Shapes

Jonathan Jackson has a very rare form of a rare neurological condition. He has synesthesia, which is a cross-connection of two (or more) unrelated senses, where a perception in one sense causes an automatic experience in another sense. Some synesthetes, for instance, see various sounds or musical notes as distinct colors (chromesthesia), while others perceive different words as distinct tastes (lexical-gustatory synesthesia).

Jackson, on the other hand, experiences his thoughts as shapes in a visual mindmap. This is so fascinating I’ve excerpted a short piece of his story below.

Also, if you are further intrigued, I recommend three great reads on the subject: Wednesday Is Indigo Blue: Discovering the Brain of Synesthesia by Richard Cytowic and David M. Eagleman; Musicophilia: Tales of Music and the Brain by Oliver Sacks; and The Man Who Tasted Shapes by Richard Cytowic.

From the Atlantic:

One spring evening in the mid 2000s, Jonathan Jackson and Andy Linscott sat on some seaside rocks near their college campus, smoking the kind of cigarettes reserved for heartbreak. Linscott was, by his own admission, “emotionally spewing” over a girl, and Jackson was consoling him.

Jackson had always been a particularly good listener. But in the middle of their talk, he did something Linscott found deeply odd.

“He got up and jumped over to this much higher rock,” Linscott says. “He was like, ‘Andy, I’m listening, I just want to get a different angle. I want to see what you’re saying and the shape of your words from a different perspective.’ I was baffled.”

For Jackson, moving physically to think differently about an idea seemed totally natural. “People say, ‘Okay, we need to think about this from a new angle’ all the time!” he says. “But for me that’s literal.”

Jackson has synesthesia, a neurological phenomenon that has long been defined as the co-activation of two or more conventionally unrelated senses. Some synesthetes see music (known as auditory-visual synesthesia) or read letters and numbers in specific hues (grapheme-color synesthesia). But recent research has complicated that definition, exploring where in the sensory process those overlaps start and opening up the term to include types of synesthesia in which senses interact in a much more complex manner.

Read the entire story here.

Image: Wednesday Is Indigo Blue, book cover. Courtesy: Richard E. Cytowic and David M. Eagleman, MIT Press.

Pessimism About Positive Thinking

Many of us have grown up in a world that teaches and values the power of positive thinking. The mantra of positive thinkers goes something like this: think positively about yourself, your situation, your goals and you will be much more motivated and energized to fulfill your dreams.

By some accounts the self-improvement industry in the US alone weighs in with annual revenues of around $10 billion. So, positive thinking must work, right? Psychologists suggest that it’s really not that simple; a singular focus on positivity may help us in the short term, but over the longer term it frustrates our motivations and hinders progress towards our goals.

In short, it pays to be in touch with the negatives as well, to embrace and understand obstacles, to learn from and challenge our setbacks. It is to our advantage to be pragmatic dreamers, grounded in both the beauty and ugliness that surround us.

From aeon:

In her book The Secret Daily Teachings (2008), the self-help author Rhonda Byrne suggested that: ‘Whatever big thing you are asking for, consider having the celebration now as though you have received it.’

Yet research in psychology reveals a more complicated picture. Indulging in undirected positive flights of fancy isn’t always in our interest. Positive thinking can make us feel better in the short term, but over the long term it saps our motivation, preventing us from achieving our wishes and goals, and leaving us feeling frustrated, stymied and stuck. If we really want to move ahead in our lives, engage with the world and feel energised, we need to go beyond positive thinking and connect as well with the obstacles that stand in our way. By bringing our dreams into contact with reality, we can unleash our greatest energies and make the most progress in our lives.

Now, you might wonder if positive thinking is really as harmful as I’m suggesting. In fact, it is. In a number of studies over two decades, my colleagues and I have discovered a powerful link between positive thinking and poor performance. In one study, we asked college students who had a crush on someone from afar to tell us how likely they would be to strike up a relationship with that person. Then we asked them to complete some open-ended scenarios related to dating. ‘You are at a party,’ one scenario read. ‘While you are talking to [your crush], you see a girl/boy, whom you believe [your crush] might like, come into the room. As she/he approaches the two of you, you imagine…’

Some of the students completed the scenarios by spinning a tale of romantic success. ‘The two of us leave the party, everyone watches, especially the other girl.’ Others offered negative fantasies about love thwarted: ‘My crush and the other girl begin to converse about things which I know nothing. They seem to be much more comfortable with each other than he and I….’

We checked back with the students after five months to see if they had initiated a relationship with their crush. The more students had engaged in positive fantasies about the future, the less likely they were actually to have started up a romantic relationship.

My colleagues and I performed such studies with participants in a number of demographic groups, in different countries, and with a range of personal wishes, including health goals, academic and professional goals, and relationship goals. Consistently, we found a correlation between positive fantasies and poor performance. The more that people ‘think positive’ and imagine themselves achieving their goals, the less they actually achieve.

Positive thinking impedes performance because it relaxes us and drains the energy we need to take action. After having participants in one study positively fantasise about the future for as little as a few minutes, we observed declines in systolic blood pressure, a standard measure of a person’s energy level. These declines were significant: whereas smoking a cigarette will typically raise a person’s blood pressure by five or 10 points, engaging in positive fantasies lowers it by about half as much.

Read the entire article here.

The Psychopath Test and the Nominee


Wouldn’t it be interesting to know if the potential next President of the United States were a psychopath?

I would certainly like to have the answer, which would seem to be just as important as knowing if the nominee supports a minimum wage increase, universal healthcare, equity for women, and justice for minorities.

So, interestingly enough, Keith Olbermann over at Vanity Fair ran Donald Trump through the Hare Psychopathy Checklist. It was developed by Robert D. Hare, a criminal psychologist, in the early 1980s. Still in use today, the 20-item checklist is a simple tool (among others) for quickly assessing whether a subject has mental health issues ranging from brain injury to psychopathy.

Here’s how the checklist works. Score each of the 20 items 0, 1 or 2, where 0 means the trait does not apply, 1 that it applies somewhat, and 2 that it definitely applies. The maximum score is 40; a total of 30 or above is the conventional threshold for psychopathy.

I urge you to read the full article, but in the meantime I’ll excerpt Donald Trump’s scores on each dimension below:

  • Glibness/superficial charm — 2
  • Grandiose sense of self-worth — 2
  • Need for stimulation/proneness to boredom — 2
  • Pathological Lying — 2
  • Cunning/Manipulative — 2
  • Lack of remorse or guilt — 2
  • Shallow Affect — 2
  • Callous/lack of empathy — 2
  • Parasitic lifestyle — 2
  • Poor behavioral controls — 2
  • Promiscuous sexual behavior — 2
  • Early behavior problems — 2
  • Lack of realistic, long-term goals — 1
  • Impulsivity — 2
  • Irresponsibility — 1
  • Failure to accept responsibility for one’s own actions — 2
  • Many short-term marital relationships — 0
  • Juvenile delinquency — 2
  • Revocation of conditional release — 0
  • Criminal versatility — 0

Total score: 32. There you have it. So, when you vote in November 2016, please think of the children of the world and the nuclear codes.
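For anyone who wants to check the arithmetic, the tally is trivial to script. Here is a minimal Python sketch that sums the item scores quoted above and applies the conventional 30-point cutoff; it simply reproduces Olbermann’s numbers and is in no way a clinical instrument.

```python
# Tally the Hare checklist scores quoted above (0 = does not apply,
# 1 = applies somewhat, 2 = definitely applies; maximum 40, cutoff 30).
scores = {
    "Glibness/superficial charm": 2,
    "Grandiose sense of self-worth": 2,
    "Need for stimulation/proneness to boredom": 2,
    "Pathological lying": 2,
    "Cunning/manipulative": 2,
    "Lack of remorse or guilt": 2,
    "Shallow affect": 2,
    "Callous/lack of empathy": 2,
    "Parasitic lifestyle": 2,
    "Poor behavioral controls": 2,
    "Promiscuous sexual behavior": 2,
    "Early behavior problems": 2,
    "Lack of realistic, long-term goals": 1,
    "Impulsivity": 2,
    "Irresponsibility": 1,
    "Failure to accept responsibility for own actions": 2,
    "Many short-term marital relationships": 0,
    "Juvenile delinquency": 2,
    "Revocation of conditional release": 0,
    "Criminal versatility": 0,
}

total = sum(scores.values())
print(f"Total: {total} / 40")                       # -> Total: 32 / 40
print("Above cutoff" if total >= 30 else "Below cutoff")
```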

Image: Republican U.S. presidential nominee Donald Trump speaks as he accepts the nomination during the final session of the Republican National Convention in Cleveland, Ohio, U.S. July 21, 2016. Courtesy: PBS / REUTERS/Brian Snyder – RTSJ4LA.

Towards an Understanding of Consciousness


The modern scientific method has helped us make great strides in our understanding of much that surrounds us. From knowledge of the infinitesimally small building blocks of atoms to the vast structures of the universe, theory and experiment have enlightened us considerably over the last several hundred years.

Yet a detailed understanding of consciousness still eludes us. Despite the intricate philosophical essays of John Locke in 1690 that laid the foundations for our modern-day views of consciousness, a fundamental grasp of its mechanisms remains as elusive as our knowledge of the universe’s dark matter.

So, it’s encouraging to come across a refreshing view of consciousness, described in the context of evolutionary biology. Michael Graziano, associate professor of psychology and neuroscience at Princeton University, makes a thoughtful case for Attention Schema Theory (AST), which centers on the simple notion that there is adaptive value for the brain to build awareness. According to AST, the brain is constantly constructing and refreshing a model — in Graziano’s words an “attention schema” — that describes what its covert attention is doing from one moment to the next. The brain constructs this schema as an analog to its awareness of attention in others — a sound adaptive perception.

Yet, while this view may hold promise from a purely adaptive and evolutionary standpoint, it does have some way to go before it is able to explain how the brain’s abstraction of a holistic awareness is constructed from the physical substrate — the neurons and connections between them.

Read more of Michael Graziano’s essay, A New Theory Explains How Consciousness Evolved. Graziano is the author of Consciousness and the Social Brain, which serves as his introduction to AST. And, for a compelling rebuttal, check out R. Scott Bakker’s article, Graziano, the Attention Schema Theory, and the Neuroscientific Explananda Problem.

Unfortunately, until our experimentalists make some definitive progress in this area, our understanding will remain just as abstract as the theories themselves, however compelling. But, ideas such as these inch us towards a deeper understanding.

Image: Representation of consciousness from the seventeenth century. Robert Fludd, Utriusque cosmi maioris scilicet et minoris […] historia, tomus II (1619), tractatus I, sectio I, liber X, De triplici animae in corpore visione. Courtesy: Wikipedia. Public Domain.

Are You Monotasking or Just Paying Attention?

We have indeed reached the era of peak multi-tasking. It’s time to select a different corporate meme.

Study after recent study shows that multi-tasking is an illusion — we can’t perform two or more cognitive tasks in parallel. Rather, we timeshare: switching our attention from one task to another sequentially. These studies also show that dividing our attention in this way tends to have a deleterious effect on all of the tasks. I say cognitive tasks because it’s rather obvious that we can all perform some tasks at the same time: walk and chew gum (or thumb a smartphone); drive and sing; shower and think; read and eat. But all of these combinations require that one of the tasks be mostly autonomic. That is, we perform one task without conscious effort.

Yet more social scientists have determined that multi-tasking is a fraud — perhaps perpetuated by corporate industrial engineers convinced that they can wring more hours of work from you.

What are we to do now, having learned that our super-efficient world of juggling numerous tasks at the “same time” is nothing but a mirage?

Well, observers of the fragile human condition have not rested. This time social scientists have discovered an amazing human talent. And they’ve coined a mesmerizing new term, known as monotasking. In some circles it’s called uni-tasking or single-tasking.

When I was growing up this was called “paying attention”.

But, this being the era of self-help-life-experience-consulting gone mad and sub-minute attention spans (fueled by multi-tasking) we can now all eagerly await the rise of an entirely new industry dedicated to this wonderful monotasking breakthrough. Expect a whole host of monotasking books, buzzworthy news articles, daytime TV shows with monotasking tips and personal coaching experts at TED events armed with “look what monotasking can do for you” powerpoint decks.

Personally, I will quietly retreat, and return to old-school staying focused, and remind my kids to do the same.

From NYT:

Stop what you’re doing.

Well, keep reading. Just stop everything else that you’re doing.

Mute your music. Turn off your television. Put down your sandwich and ignore that text message. While you’re at it, put your phone away entirely. (Unless you’re reading this on your phone. In which case, don’t. But the other rules still apply.)

Just read.

You are now monotasking.

Maybe this doesn’t feel like a big deal. Doing one thing at a time isn’t a new idea.

Indeed, multitasking, that bulwark of anemic résumés everywhere, has come under fire in recent years. A 2014 study in the Journal of Experimental Psychology found that interruptions as brief as two to three seconds — which is to say, less than the amount of time it would take you to toggle from this article to your email and back again — were enough to double the number of errors participants made in an assigned task.

Earlier research out of Stanford revealed that self-identified “high media multitaskers” are actually more easily distracted than those who limit their time toggling.

So, in layman’s terms, by doing more you’re getting less done.

But monotasking, also referred to as single-tasking or unitasking, isn’t just about getting things done.

Not the same as mindfulness, which focuses on emotional awareness, monotasking is a 21st-century term for what your high school English teacher probably just called “paying attention.”

“It’s a digital literacy skill,” said Manoush Zomorodi, the host and managing editor of WNYC Studios’ “Note to Self” podcast, which recently offered a weeklong interactive series called Infomagical, addressing the effects of information overload. “Our gadgets and all the things we look at on them are designed to not let us single-task. We weren’t talking about this before because we simply weren’t as distracted.”


Ms. Zomorodi prefers the term “single-tasking”: “ ‘Monotasking’ seemed boring to me. It sounds like ‘monotonous.’ ”

Kelly McGonigal, a psychologist, lecturer at Stanford and the author of “The Willpower Instinct,” believes that monotasking is “something that needs to be practiced.” She said: “It’s an important ability and a form of self-awareness as opposed to a cognitive limitation.”

Read the entire article here.

Image courtesy of Google Search.

Achieving Failure

Our society values success.

Our work environments value triumphing over the competition. We look to our investments to beat the market. We support our favorite teams, but adore them when they trounce their rivals. Our schools and colleges (mostly) help educate our children, but do so in a way that rewards success — good grades, good test scores and good behavior (as in, same as everyone else). We continually reward our kids for success on a task, at school, with a team.

Yet, all of us know, in our hearts and the back of our minds, that the most important lessons and trials stem from failure — not success. From failure we learn to persevere, we learn to change and adapt, we learn to overcome. From failure we learn to avoid, or tackle obstacles head on; we learn to reassess and reevaluate. We evolve from our failures.

So this raises the question: why are so many of our processes and systems geared solely to rewarding and reinforcing success?

From NPR:

Is failure a positive opportunity to learn and grow, or is it a negative experience that hinders success? How parents answer that question has a big influence on how much children think they can improve their intelligence through hard work, a study says.

“Parents are a really critical force in child development when you think about how motivation and mindsets develop,” says Kyla Haimovitz, a professor of psychology at Stanford University. She coauthored the study, published in Psychological Science with colleague Carol Dweck, who pioneered research on mindsets. “Parents have this powerful effect really early on and throughout childhood to send messages about what is failure, how to respond to it.”

Although there’s been a lot of research on how these forces play out, relatively little looks at what parents can do to motivate their kids in school, Haimovitz says. This study begins filling that gap.

“There is a fair amount of evidence showing that when children view their abilities as more malleable and something they can change over time, then they deal with obstacles in a more constructive way,” says Gail Heyman, a professor of psychology at the University of California at San Diego who was not involved in this study.

But communicating that message to children is not simple.

“Parents need to represent this to their kids in the ways they react about their kids’ failures and setbacks,” Haimovitz says. “We need to really think about what’s visible to the other person, what message I’m sending in terms of my words and my deeds.”

In other words, if a child comes home with a D on a math test, how a parent responds will influence how the child perceives their own ability to learn math. Even a well-intentioned, comforting response of “It’s OK, you’re still a great writer” may send the message that it’s time to give up on math rather than learn from the problems they got wrong, Haimovitz explains.

Read the entire story here.

Your Brain on LSD


For the first time, researchers have peered inside the brain to study the real-time effect of the psychedelic drug LSD (lysergic acid diethylamide). Yes, neuroscientists scanned the brains of subjects who volunteered to take a trip inside an MRI scanner, all in the name of science.

While the researchers did not seem to document the detailed subjective experiences of their volunteers, the findings suggest that they were experiencing intense dreamlike visions, effectively “seeing with their eyes shut”. Under the influence of LSD many areas of the brain that are usually compartmentalized showed far greater interconnection and intense activity.

LSD was first synthesized in 1938. Its profound psychological properties were studied from the mid-1940s to the early sixties. The substance was later banned — worldwide — after its adoption as a recreational drug.

This new study was conducted by researchers from Imperial College London and The Beckley Foundation, which researches psychoactive substances.

From the Guardian:

The profound impact of LSD on the brain has been laid bare by the first modern scans of people high on the drug.

The images, taken from volunteers who agreed to take a trip in the name of science, have given researchers an unprecedented insight into the neural basis for effects produced by one of the most powerful drugs ever created.

A dose of the psychedelic substance – injected rather than dropped – unleashed a wave of changes that altered activity and connectivity across the brain. This has led scientists to new theories of visual hallucinations and the sense of oneness with the universe some users report.

The brain scans revealed that trippers experienced images through information drawn from many parts of their brains, and not just the visual cortex at the back of the head that normally processes visual information. Under the drug, regions once segregated spoke to one another.

Further images showed that other brain regions that usually form a network became more separated in a change that accompanied users’ feelings of oneness with the world, a loss of personal identity called “ego dissolution”.

David Nutt, the government’s former drugs advisor, professor of neuropsychopharmacology at Imperial College London, and senior researcher on the study, said neuroscientists had waited 50 years for this moment. “This is to neuroscience what the Higgs boson was to particle physics,” he said. “We didn’t know how these profound effects were produced. It was too difficult to do. Scientists were either scared or couldn’t be bothered to overcome the enormous hurdles to get this done.”

Read the entire story here.

Image: Different sections of the brain, either on placebo, or under the influence of LSD (lots of orange). Courtesy: Imperial College/Beckley Foundation.

Bad Behavior Goes Viral

Social psychologists often point out how human behavior is contagious. Laugh and others will join in. Yawn and all those around you will yawn as well. In a bad mood at home? Well, soon, chances are that the rest of your family will join you on a downer as well.

And the contagion doesn’t end there, especially with negative behaviors; study after study shows the viral spread of suicide, product tampering, rioting, looting, speeding and even aircraft hijacking. So, too, with mass shootings. Since the United States is a leading venue for mass shootings, there is now even a term for one that follows soon after another — an echo shooting.
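The Washington Post story below describes Sherry Towers’ finding that the roughly 13 days after a high-profile shooting carry an elevated chance of another. The article doesn’t spell out her model, so the little simulation below is only a hedged sketch of the general idea of temporal contagion: a baseline daily probability that gets a temporary boost inside a 13-day window after each incident. The baseline and boost values are invented for illustration, not fitted parameters.

```python
import random

def simulate(days, baseline, boost, window, seed=1):
    """Simulate daily incidents with optional temporal contagion.

    Each day an incident occurs with probability `baseline`, plus `boost`
    if another incident happened within the previous `window` days.
    Returns the incident count and the fraction that fell inside a window.
    """
    rng = random.Random(seed)
    last, total, echoes = None, 0, 0
    for day in range(days):
        excited = last is not None and (day - last) <= window
        if rng.random() < baseline + (boost if excited else 0.0):
            total += 1
            echoes += excited
            last = day
    return total, (echoes / total if total else 0.0)

DAYS = 365 * 50  # long run, purely for a stable estimate
for label, boost in (("no contagion", 0.0), ("with contagion", 0.03)):
    n, frac = simulate(DAYS, baseline=0.005, boost=boost, window=13)
    print(f"{label:>14}: {n} incidents, {frac:.0%} within 13 days of a previous one")
```

The point of the comparison is simply that, when each incident temporarily raises the odds of another, incidents bunch in time far more than chance alone would produce; Towers’ actual analysis, described below, is of course far more careful than this toy.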

From the Washington Post:

A man had just gone on a shooting rampage in Kalamazoo, Mich., allegedly killing six people while driving for Uber. Sherry Towers, an Arizona State University physicist who studies how viruses spread, worried while watching the news coverage.

Last year, Towers published a study using mathematical models to examine whether mass shootings, like viruses, are contagious. She identified a 13-day period after high-profile mass shootings when the chance of another spikes. Her findings are confirmed more frequently than she would like.

Five days after Kalamazoo, a man in Kansas shot 17 people, killing three by firing from his car. To Towers, that next shooting seemed almost inevitable.

“I absolutely dread watching this happen,” she said.

As the nation endures an ongoing stream of mass shootings, criminologists, police and even the FBI are turning to virus epidemiology and behavioral psychology to understand what sets off mass shooters and figure out whether, as with the flu, the spread can be interrupted.

“These things are clustering in time, and one is causing the next one to be more likely,” said Gary Slutkin, a physician and epidemiologist at the University of Illinois at Chicago who runs Cure Violence, a group that treats crime as a disease. “That’s definitional of a contagious disease. Flu is a risk factor for more flu. Mass shootings are a risk factor for mass shootings.”

The idea is not without skeptics. James Alan Fox, a Northeastern University professor who studies mass shootings, said: “Some bunching just happens. Yes, there is some mimicking going on, but the vast majority of mass killers don’t need someone else to give them the idea.”

Confirming, disputing or further exploring the idea scientifically is hampered by the federal funding ban on gun violence research. Towers and her colleagues did their study on their own time. And there’s not even a common database or definition of mass shootings.

The Congressional Research Service uses the term “public mass shootings” to describe the killing of four or more people in “relatively public places” by a perpetrator selecting victims “somewhat indiscriminately.”

In the 1980s, the violence occurred in post offices. In the 1990s, schools. Now it is mutating into new forms, such as the terrorist attack in San Bernardino, Calif., that initially appeared to be a workplace shooting by a disgruntled employee.

Researchers say the contagion is potentially more complicated than any virus. There is the short-term effect of a high-profile mass shooting, which can lead quickly to another incident. Towers found that such echo shootings account for up to 30 percent of all rampages.

But there appear to be longer incubation periods, too. Killers often find inspiration in past mass shootings, praising what their predecessors accomplished, innovating on their methods and seeking to surpass them in casualties and notoriety.

Read the entire article here.

The Increasing Mortality of White Males

This is the type of story that you might not normally, and certainly should not, associate with the world’s richest country. In a reversal of a long-established trend, death rates are increasing for less educated white males. The good news is that death rates continue to fall for other demographic and racial groups, especially Hispanics and African Americans. So, what is happening to white males?

From the NYT:

It’s disturbing and puzzling news: Death rates are rising for white, less-educated Americans. The economists Anne Case and Angus Deaton reported in December that rates have been climbing since 1999 for non-Hispanic whites age 45 to 54, with the largest increase occurring among the least educated. An analysis of death certificates by The New York Times found similar trends and showed that the rise may extend to white women.

Both studies attributed the higher death rates to increases in poisonings and chronic liver disease, which mainly reflect drug overdoses and alcohol abuse, and to suicides. In contrast, death rates fell overall for blacks and Hispanics.

Why are whites overdosing or drinking themselves to death at higher rates than African-Americans and Hispanics in similar circumstances? Some observers have suggested that higher rates of chronic opioid prescriptions could be involved, along with whites’ greater pessimism about their finances.

Yet I’d like to propose a different answer: what social scientists call reference group theory. The term “reference group” was pioneered by the social psychologist Herbert H. Hyman in 1942, and the theory was developed by the Columbia sociologist Robert K. Merton in the 1950s. It tells us that to comprehend how people think and behave, it’s important to understand the standards to which they compare themselves.

How is your life going? For most of us, the answer to that question means comparing our lives to the lives our parents were able to lead. As children and adolescents, we closely observed our parents. They were our first reference group.

And here is one solution to the death-rate conundrum: It’s likely that many non-college-educated whites are comparing themselves to a generation that had more opportunities than they have, whereas many blacks and Hispanics are comparing themselves to a generation that had fewer opportunities.

Read the entire article here.

Words Before Death

A team of psychologists recently compiled and assessed the last words of prison inmates who were facing execution in Texas.

I was surprised to learn of a publicly accessible “last statement” database, available via the Texas Department of Criminal Justice.

Whether or not you subscribe to the idea that the death penalty is just [I do not], you will surely find these final utterances moving — time for some reflection.

From the Independent:

Psychologists have analysed the last words of inmates who were condemned to death in Texas.

In a new paper, published in Frontiers in Psychology, researchers Dr. Sarah Hirschmüller and Dr. Boris Egloff used a database of last statements of inmates on death row and found the majority of the statements to be positive.

The researchers theorise that the inmates, the average age of whom in the current dataset is just over 39, expressed positive sentiments, because their minds were working in overdrive to avert them from fearing their current situation.

This is called ‘Terror-Management Theory’ (TMT). The concept is that people search for meaning when confronted with terror in a bid to maintain self-esteem and that “individuals employ a wide range of cognitive and behavioural efforts to regulate the anxiety that mortality salience evokes.”

Read more here.

Image: Execution room in the San Quentin State Prison in California. Public Domain.

The Curious Psychology of Returns

In a recent post I wrote about the world of reverse logistics, which underlies the multi-billion dollar business of product returns. But while the process of consumer returns runs like a well-oiled, global machine, the psychology of returns is confusingly counter-intuitive.

For instance, a lenient return policy leads to more returned products — no surprise there. But it also causes increased consumer spending, and the increased spending outweighs the cost to the business of processing the increased returns. Also, and rather more curiously, a more lenient return time limit correlates with a reduction in returns, not an increase.

From the Washington Post:

January is prime time for returns in the retail industry, the month where shoppers show up in droves to trade in an ill-fitting sweater from grandma or to unload the second and third “Frozen” dolls that showed up under the Christmas tree.

This post-Christmas ritual has always been costly for retailers, comprising a large share of the $284 billion in goods that were returned in 2014.  But now it is arguably becoming more urgent for the industry to think carefully about return policies, as analysts say the rise of online shopping is bringing with it a surge in returns. The return rate for the industry overall is about 8 percent, but analysts say that it is likely significantly higher than that online, since shoppers are purchasing goods without seeing them in person or trying them on.

Against that backdrop, researchers at University of Texas-Dallas sought to get a better handle on how return policies affect shopper behavior and, in turn, whether lenient policies such as offering a lengthy period for returns actually helps or hurts a retailer’s business.

Overall, a lenient return policy did indeed correlate with more returns. But, crucially, it was even more strongly correlated with an increase in purchases. In other words, retailers are generally getting a clear sales benefit from giving customers the assurance of a return.

One surprising finding: More leniency on time limits is associated with a reduction — not an increase — in returns.

This may seem counterintuitive, but researchers say it could have varying explanations. Ryan Freling, who conducted the research alongside Narayan Janakiraman and Holly Syrdal, said that this is perhaps a result of what’s known as “endowment effect.”

“That would say that the longer a customer has a product in their hands, the more attached they feel to it,” Freling said.

Plus, the long time frame creates less urgency around the decision over whether or not to take it back.

Read the entire article here.

Neck Tingling and ASMR

Ever had that curious tingling sensation at the back and base of your neck? Of course you have. Perhaps you’ve felt it during a particular piece of music, or from watching a key scene in a movie, or when taking in a panorama from the top of a mountain, or from smelling a childhood aroma again. In fact, most people report having felt this sensation, albeit rather infrequently.

But, despite its commonality, very little research exists to help us understand how and why it happens. Psychologists tend to agree that the highly personal and often private nature of the neck-tingling experience makes it difficult to study and hence generalize. This means, of course, that the internet is rife with hypotheses and pseudo-science. Just try searching for ASMR videos and be (not) surprised by the 2 million+ results.

From the Guardian:

Autonomous Sensory Meridian Response, or ASMR, is a curious phenomenon. Those who experience it often characterise it as a tingling sensation in the back of the head or neck, or another part of the body, in response to some sort of sensory stimulus. That stimulus could be anything, but over the past few years, a subculture has developed around YouTube videos, and their growing popularity was the focus of a video posted on the Guardian this last week. It’s well worth a watch, but I couldn’t help but feel it would have been a bit more interesting if there had been some scientific background in it. The trouble is, there isn’t actually much research on ASMR out there.

To date, only one research paper has been published on the phenomenon. In March last year, Emma Barratt, a graduate student at Swansea University, and Dr Nick Davis, then a lecturer at the same institution, published the results of a survey of some 500 ASMR enthusiasts. “ASMR is interesting to me as a psychologist because it’s a bit ‘weird’,” says Davis, now at Manchester Metropolitan University. “The sensations people describe are quite hard to describe, and that’s odd because people are usually quite good at describing bodily sensation. So we wanted to know if everybody’s ASMR experience is the same, and if people tend to be triggered by the same sorts of things.”

Read the entire story here.

Image courtesy of Google Search.

iScoliosis


Industrial and occupational illnesses have followed humans since the advent of industry. Obvious ones include lung diseases from mining and a variety of skin diseases from exposure to agricultural and factory chemicals.

The late 20th century saw us succumb to carpal tunnel and other repetitive stress injuries from laboring over our desks and computers. Now, in the 21st, we are becoming hosts to the smartphone pathogen.

In addition to the spectrum of social and cultural disorders wrought by our constantly chattering mobile devices, we are at increased psychological and physical risk. But let’s leave aside the two obvious ones: the risk of injury from texting while driving, and from texting while walking. More commonly, we are at increased risk of back and other chronic physical problems resulting from poor posture. This in turn leads to mood disorders, memory problems and depression. Some have termed this condition “text-neck”, “iHunch”, or “iPosture”; I’ll go with “iScoliosis™”.

From NYT:

THERE are plenty of reasons to put our cellphones down now and then, not least the fact that incessantly checking them takes us out of the present moment and disrupts family dinners around the globe. But here’s one you might not have considered: Smartphones are ruining our posture. And bad posture doesn’t just mean a stiff neck. It can hurt us in insidious psychological ways.

If you’re in a public place, look around: How many people are hunching over a phone? Technology is transforming how we hold ourselves, contorting our bodies into what the New Zealand physiotherapist Steve August calls the iHunch. I’ve also heard people call it text neck, and in my work I sometimes refer to it as iPosture.

The average head weighs about 10 to 12 pounds. When we bend our necks forward 60 degrees, as we do to use our phones, the effective stress on our neck increases to 60 pounds — the weight of about five gallons of paint. When Mr. August started treating patients more than 30 years ago, he says he saw plenty of “dowagers’ humps, where the upper back had frozen into a forward curve, in grandmothers and great-grandmothers.” Now he says he’s seeing the same stoop in teenagers.

When we’re sad, we slouch. We also slouch when we feel scared or powerless. Studies have shown that people with clinical depression adopt a posture that eerily resembles the iHunch. One, published in 2010 in the official journal of the Brazilian Psychiatric Association, found that depressed patients were more likely to stand with their necks bent forward, shoulders collapsed and arms drawn in toward the body.

Posture doesn’t just reflect our emotional states; it can also cause them. In a study published in Health Psychology earlier this year, Shwetha Nair and her colleagues assigned non-depressed participants to sit in an upright or slouched posture and then had them answer a mock job-interview question, a well-established experimental stress inducer, followed by a series of questionnaires. Compared with upright sitters, the slouchers reported significantly lower self-esteem and mood, and much greater fear. Posture affected even the contents of their interview answers: Linguistic analyses revealed that slouchers were much more negative in what they had to say. The researchers concluded, “Sitting upright may be a simple behavioral strategy to help build resilience to stress.”

Slouching can also affect our memory: In a study published last year in Clinical Psychology and Psychotherapy of people with clinical depression, participants were randomly assigned to sit in either a slouched or an upright position and then presented with a list of positive and negative words. When they were later asked to recall those words, the slouchers showed a negative recall bias (remembering the bad stuff more than the good stuff), while those who sat upright showed no such bias. And in a 2009 study of Japanese schoolchildren, those who were trained to sit with upright posture were more productive than their classmates in writing assignments.

Read the entire article here, preferably not via your smartphone.
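If you’re wondering how a 10-to-12-pound head becomes roughly 60 pounds of load, a crude lever model gets you into the right neighborhood. The sketch below is only a back-of-the-envelope illustration with assumed geometry (head weight, lever lengths); it is not the biomechanical model behind the figures quoted above.

```python
import math

HEAD_WEIGHT_LB = 12.0  # assumed head weight, upper end of the quoted range
HEAD_LEVER_IN = 6.0    # assumed distance from cervical pivot to head's center of mass
MUSCLE_ARM_IN = 1.2    # assumed moment arm of the neck extensor muscles

def neck_load(tilt_deg):
    """Rough equivalent load on the cervical spine for a forward tilt.

    The extensor muscles must balance gravity's torque about the pivot:
        F_muscle * MUSCLE_ARM = W * HEAD_LEVER * sin(tilt)
    The spine then carries that muscle force plus the component of the
    head's weight acting along it.
    """
    tilt = math.radians(tilt_deg)
    muscle = HEAD_WEIGHT_LB * HEAD_LEVER_IN * math.sin(tilt) / MUSCLE_ARM_IN
    return muscle + HEAD_WEIGHT_LB * math.cos(tilt)

for angle in (0, 15, 30, 45, 60):
    print(f"{angle:2d} degrees: ~{neck_load(angle):.0f} lb")
```

With these assumed proportions the 60-degree case comes out near the 60 pounds quoted in the article; the real cervical-spine calculation is considerably more involved, but the sine term is why the load climbs so steeply with tilt.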

Image courtesy of Google Search.

 

Hate Crimes and the Google Correlation

It had never occurred to me, but it makes perfect sense: there’s a direct correlation between anti-Muslim hate crimes and anti-Muslim hate searches on Google. For that matter, there is probably a correlation between other types of hate speech and hate crimes — against women, gays, lesbians, bosses, blacks, whites, bad drivers, religion X. But it is certainly the case that Muslims and the Islamic religion are currently taking the brunt, both online and in the real world.

Clearly, we have a long way to go in learning that entire populations are not to blame for the criminal acts of a few. However, back to the correlations.

Mining of Google search data shows indisputable relationships. As the researchers point out, “When Islamophobic searches are at their highest levels, such as during the controversy over the ‘ground zero mosque’ in 2010 or around the anniversary of 9/11, hate crimes tend to be at their highest levels, too.” Interestingly enough, there are currently just over 50 daily searches for “I hate my boss” in the US. In November there were 120 searches per day for “I hate Muslims”.
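A couple of the numbers above are worth unpacking. The daily figures are just the monthly search volumes from the Times piece (quoted below) divided by 30, and the study’s headline result is a correlation between two weekly time series. The sketch below shows both calculations in Python; the weekly series fed to the correlation are invented purely to demonstrate the mechanics, not the authors’ data.

```python
# Monthly search volumes quoted in the NYT piece, converted to rough daily rates.
monthly = {"I hate my boss": 1600, "I hate Muslims": 3600, "kill Muslims": 2400}
for phrase, count in monthly.items():
    print(f"{phrase!r}: ~{count / 30:.0f} searches per day")

# The study correlates weekly anti-Muslim search volume with weekly hate-crime
# counts. With two aligned series, that is just a Pearson correlation.
# These numbers are invented for illustration only.
searches = [120, 95, 300, 150, 110, 420, 180]   # hypothetical weekly search index
crimes   = [  3,  2,   7,   4,   3,   9,   5]   # hypothetical weekly incident count

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"correlation: {pearson(searches, crimes):.2f}")
```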

So, here’s an idea. Let’s get Google to replace the “I’m Feeling Lucky” button on the search page (who uses that anyway?) with “I’m Feeling Hateful”. This would make the search more productive for those needing to vent their hatred.

More from NYT:

HOURS after the massacre in San Bernardino, Calif., on Dec. 2, and minutes after the media first reported that at least one of the shooters had a Muslim-sounding name, a disturbing number of Californians had decided what they wanted to do with Muslims: kill them.

The top Google search in California with the word “Muslims” in it was “kill Muslims.” And the rest of America searched for the phrase “kill Muslims” with about the same frequency that they searched for “martini recipe,” “migraine symptoms” and “Cowboys roster.”

People often have vicious thoughts. Sometimes they share them on Google. Do these thoughts matter?

Yes. Using weekly data from 2004 to 2013, we found a direct correlation between anti-Muslim searches and anti-Muslim hate crimes.

We measured Islamophobic sentiment by using common Google searches that imply hateful attitudes toward Muslims. A search for “are all Muslims terrorists?” for example leaves little to the imagination about what the searcher really thinks. Searches for “I hate Muslims” are even clearer.

When Islamophobic searches are at their highest levels, such as during the controversy over the “ground zero mosque” in 2010 or around the anniversary of 9/11, hate crimes tend to be at their highest levels, too.

In 2014, according to the F.B.I., anti-Muslim hate crimes represented 16.3 percent of the total of 1,092 reported offenses. Anti-Semitism still led the way as a motive for hate crimes, at 58.2 percent.

Hate crimes may seem chaotic and unpredictable, a consequence of random neurons that happen to fire in the brains of a few angry young men. But we can explain some of the rise and fall of anti-Muslim hate crimes just based on what people are Googling about Muslims.

The frightening thing is this: If our model is right, Islamophobia and thus anti-Muslim hate crimes are currently higher than at any time since the immediate aftermath of the Sept. 11 attacks. Although it will take awhile for the F.B.I. to collect and analyze the data before we know whether anti-Muslim hate crimes are in fact rising spectacularly now, Islamophobic searches in the United States were 10 times higher the week after the Paris attacks than the week before. They have been elevated since then and rose again after the San Bernardino attack.

According to our model, when all the data is analyzed by the F.B.I., there will have been more than 200 anti-Muslim attacks in 2015, making it the worst year since 2001.

How can these Google searches track Islamophobia so well? Who searches for “I hate Muslims” anyway?

We often think of Google as a source from which we seek information directly, on topics like the weather, who won last night’s game or how to make apple pie. But sometimes we type our uncensored thoughts into Google, without much hope that Google will be able to help us. The search window can serve as a kind of confessional.

There are thousands of searches every year, for example, for “I hate my boss,” “people are annoying” and “I am drunk.” Google searches expressing moods, rather than looking for information, represent a tiny sample of everyone who is actually thinking those thoughts.

There are about 1,600 searches for “I hate my boss” every month in the United States. In a survey of American workers, half of the respondents said that they had left a job because they hated their boss; there are about 150 million workers in America.

In November, there were about 3,600 searches in the United States for “I hate Muslims” and about 2,400 for “kill Muslims.” We suspect these Islamophobic searches represent a similarly tiny fraction of those who had the same thoughts but didn’t drop them into Google.

“If someone is willing to say ‘I hate them’ or ‘they disgust me,’ we know that those emotions are as good a predictor of behavior as actual intent,” said Susan Fiske, a social psychologist at Princeton, pointing to 50 years of psychology research on anti-black bias. “If people are making expressive searches about Muslims, it’s likely to be tied to anti-Muslim hate crime.”

Google searches seem to suffer from selection bias: Instead of asking a random sample of Americans how they feel, you just get information from those who are motivated to search. But this restriction may actually help search data predict hate crimes.

Read more here.

Image courtesy of Google Search.

 

Fight or Flight (or Record?)


Psychologists, social scientists and researchers of the human brain have long maintained that we have three typical responses to an existential, usually physical, threat. First, we may stand our ground to tackle and fight the threat. Second, we may turn and run from danger. Third, we may simply freeze with indecision and inaction. These responses have been studied, documented and confirmed over the decades. Further, they tend to mirror those of other animals when faced with a life-threatening situation.

But now that humans have entered the smartphone age, it appears that there is a fourth response — to film or record the threat. This may seem hard to believe and foolhardy, but quite disturbingly it is a growing trend, especially among younger people.

From the Telegraph:

If you witnessed a violent attack on an innocent victim, would you:

a) help
b) run
c) freeze

Until now, that was the hypothetical question we all asked ourselves when reading about horrific events such as terror attacks.

What survival instinct would come most naturally? Fight or flight?

No longer. Over the last couple of years it’s become very obvious that there’s a fourth option:

d) record it all on your smartphone.

This reaction of filming traumatic events has become more prolific in recent weeks. Last month’s terror attacks in Paris saw mobile phone footage of people being shot, photos of bodies lying in the street, and perhaps most memorably, a pregnant woman clinging onto a window ledge.

Saturday [December 5, 2015] night saw another example when a terror suspect started attacking passengers on the Tube at Leytonstone Station. Most of the horrific incident was captured on video, as people stood filming him.

One brave man, 33-year-old engineer David Pethers, tried to fight the attacker. He ended up with a cut to the neck as he tried to protect passing children. But while he was intervening, others just held up their phones.

“There were so many opportunities where someone could have grabbed him,” he told the Daily Mail. “One guy came up to me afterwards and said ‘well done, I want to shake your hand, you are the only one who did anything, I got the whole thing on film.’

“I was so angry, I nearly turned on him but I walked away. I thought, ‘Are you crazy? You are standing there filming and did nothing.’ I was really angry afterwards.”

It’s hard to disagree. Most of us know heroism is rare and admirable. We can easily understand people trying to escape and save themselves, or even freezing in the face of terror.

But deliberately doing nothing and choosing to film the whole thing? That’s a lot harder to sympathise with.

Psychotherapist Richard Reid agrees – “the sensible option would be to think about your own safety and get out, or think about helping people” – but he says it’s important we understand this new reaction.

“Because events like terror attacks are so outside our experience, people don’t fully connect with it,” he explains.

“It’s like they’re watching a film. It doesn’t occur to them they could be in danger or they could be helping. The reality only sinks in after the event. It’s a natural phenomenon. It’s not necessarily the most useful response, but we have to accept it.”

Read the entire story here.

Image courtesy of Google Search.

Rudeness Goes Viral

We know intuitively, anecdotally and through scientific study that aggressive behavior can be transmitted to others through imitation. The famous Bobo doll experiment devised by researchers at Stanford University in the early 1960s, and numerous precursors, showed that subjects given an opportunity to observe aggressive models later reproduced a good deal of physical and verbal aggression substantially identical with that of the model. In these studies the model was usually someone with a higher social status or with greater authority (e.g., an adult) than the observer (e.g., a child).

Recent updates to these studies now show that low-intensity behaviors such as rudeness can be just as contagious as more intense behaviors like violence. Fascinatingly, the contagion seems to work equally well even if the model and observer are peers.

So, keep this in mind: watching rude behaviors leads us to be rude to others.

From Scientific American:

Flu season is nearly upon us, and in an effort to limit contagion and spare ourselves misery, many of us will get vaccinated. The work of Jonas Salk and Thomas Francis has helped restrict the spread of the nasty bug for generations, and the influenza vaccine is credited with saving tens of thousands of lives. But before the vaccine could be developed, scientists first had to identify the cause of influenza — and, importantly, recognize that it was contagious.

New research by Trevor Foulk, Andrew Woolum, and Amir Erez at the University of Florida takes that same first step in identifying a different kind of contagious menace: rudeness. In a series of studies, Foulk and colleagues demonstrate that being the target of rude behavior, or even simply witnessing rude behavior, induces rudeness. People exposed to rude behavior tend to have concepts associated with rudeness activated in their minds, and consequently may interpret ambiguous but benign behaviors as rude. More significantly, they themselves are more likely to behave rudely toward others, and to evoke hostility, negative affect, and even revenge from others.

The finding that negative behavior can beget negative behavior is not exactly new, as researchers demonstrated decades ago that individuals learn vicariously and will repeat destructive actions.  In the now infamous Bobo doll experiment, for example, children who watched an adult strike a Bobo doll with a mallet or yell at it were themselves abusive toward the doll.  Similarly, supervisors who believe they are mistreated by managers tend to pass on this mistreatment to their employees.

Previous work on the negative contagion effect, however, has focused primarily on high-intensity behaviors like hitting or abusive supervision that are (thankfully) relatively infrequent in everyday life.  In addition, in most previous studies the destructive behavior was modeled by someone with a higher status than the observer. These extreme negative behaviors may thus get repeated because (a) they are quite salient and (b) the observer is consciously and intentionally trying to emulate the behavior of someone with an elevated social status.

To examine whether this sensitivity impacts social behavior, Foulk’s team conducted another study in which participants were asked to play the part of an employee at a local bookstore.  Participants first observed a video showing either a polite or a rude interaction among coworkers.  They were then asked to respond to an email from a customer.  The email was either neutral (e.g., “I am writing to check on an order I placed a few weeks ago.”), highly aggressive (e.g., “I guess you or one of your incompetent staff must have lost my order.”), or moderately rude (I’m really surprised by this as EVERYBODY said you guys give really good customer service???).

Foulk and colleagues again found that prior exposure to rude behavior creates a specific sensitivity to rudeness. Notably, the type of video participants observed did not affect their responses to the neutral or aggressive emails; instead, the nature of those emails drove the response.  That is, all participants were more likely to send a hostile response to the aggressive email than to neutral email, regardless of whether they had previously observed a polite or rude employee interaction.  However, the type of video participants observed early in the study did affect their interpretation of and response to the rude email.  Those who had seen the polite video adopted a benign interpretation of the moderately rude email and delivered a neutral response, while those who had seen the rude video adopted a malevolent interpretation and delivered a hostile response.  Thus, observing rude behaviors, even those committed by coworkers or peers, resulted in greater sensitivity and heightened response to rudeness.

Read the entire article here.

Creativity and Mental Illness


The creative genius — oft misunderstood, outcast, tortured, misanthropic, fueled by demon spirits. Yet this same description would seem equally apt for many of those who are unfortunate enough to suffer from mental illness. So, could creativity and mental illness be high-level symptoms of a broader underlying spectrum “disorder”? After all, a not insignificant number of people and businesses tend to regard creativity as a behavioral problem — best left outside the front door to the office. Time to check out the results of the latest psychological study.

From the Guardian:

The ancient Greeks were first to make the point. Shakespeare raised the prospect too. But Lord Byron was, perhaps, the most direct of them all: “We of the craft are all crazy,” he told the Countess of Blessington, casting a wary eye over his fellow poets.

The notion of the tortured artist is a stubborn meme. Creativity, it states, is fuelled by the demons that artists wrestle in their darkest hours. The idea is fanciful to many scientists. But a new study claims the link may be well-founded after all, and written into the twisted molecules of our DNA.

In a large study published on Monday, scientists in Iceland report that genetic factors that raise the risk of bipolar disorder and schizophrenia are found more often in people in creative professions. Painters, musicians, writers and dancers were, on average, 25% more likely to carry the gene variants than professions the scientists judged to be less creative, among which were farmers, manual labourers and salespeople.

Kari Stefansson, founder and CEO of deCODE, a genetics company based in Reykjavik, said the findings, described in the journal Nature Neuroscience, point to a common biology for some mental disorders and creativity. “To be creative, you have to think differently,” he told the Guardian. “And when we are different, we have a tendency to be labelled strange, crazy and even insane.”

The scientists drew on genetic and medical information from 86,000 Icelanders to find genetic variants that doubled the average risk of schizophrenia, and raised the risk of bipolar disorder by more than a third. When they looked at how common these variants were in members of national arts societies, they found a 17% increase compared with non-members.

The researchers went on to check their findings in large medical databases held in the Netherlands and Sweden. Among these 35,000 people, those deemed to be creative (by profession or through answers to a questionnaire) were nearly 25% more likely to carry the mental disorder variants.

Stefansson believes that scores of genes increase the risk of schizophrenia and bipolar disorder. These may alter the ways in which many people think, but in most people do nothing very harmful. But for 1% of the population, genetic factors, life experiences and other influences can culminate in problems, and a diagnosis of mental illness.

“Often, when people are creating something new, they end up straddling between sanity and insanity,” said Stefansson. “I think these results support the old concept of the mad genius. Creativity is a quality that has given us Mozart, Bach, Van Gogh. It’s a quality that is very important for our society. But it comes at a risk to the individual, and 1% of the population pays the price for it.”

Stefansson concedes that his study found only a weak link between the genetic variants for mental illness and creativity. And it is this that other scientists pick up on. The genetic factors that raise the risk of mental problems explained only about 0.25% of the variation in people’s artistic ability, the study found. David Cutler, a geneticist at Emory University in Atlanta, puts that number in perspective: “If the distance between me, the least artistic person you are going to meet, and an actual artist is one mile, these variants appear to collectively explain 13 feet of the distance,” he said.
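
[A quick check of Cutler’s arithmetic, for the curious: a mile is 5,280 feet, and 0.25% of that is 0.0025 × 5,280 ft ≈ 13.2 ft, which is where his 13 feet comes from.]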

Most of the artist’s creative flair, then, is down to different genetic factors, or to other influences altogether, such as life experiences, that set them on their creative journey.

For Stefansson, even a small overlap between the biology of mental illness and creativity is fascinating. “It means that a lot of the good things we get in life, through creativity, come at a price. It tells me that when it comes to our biology, we have to understand that everything is in some way good and in some way bad,” he said.

Read the entire article here.

Image: Vincent van Gogh, self-portrait, 1889. Courtesy of Courtauld Institute Galleries, London. Wikipaintings.org. Public Domain.

Monsters of Our Own Making

For parents: a few brief tips on how to deal with young adult children — that most pampered of generations. Tip number 1: turn off junior’s access to the family Netflix account.

From WSJ:

Congratulations. Two months ago, your kid graduated from college, bravely finishing his degree rather than dropping out to make millions on his idea for a dating app for people who throw up during Cross Fit training. If he’s like a great many of his peers, he’s moved back home, where he’s figuring out how to become an adult in the same room that still has his orthodontic headgear strapped to an Iron Man helmet.

Now we’re deep into summer, and the logistical challenges of your grad really being home are sinking in. You’re constantly juggling cars, cleaning more dishes and dealing with your daughter’s boyfriend, who not only slept over but also drank your last can of Pure Protein Frosty Chocolate shake.

But the real challenge here is a problem of your own making. You see, these children are members of the Most-Loved Generation: They’ve grown up with their lives stage-managed by us, their college-acceptance-obsessed parents. Remember when Eva, at age 7, was obsessed with gymnastics…for exactly 10 months, which is why the TV in your guest room sits on top of a $2,500 pommel horse?

Now that they’re out of college, you realize what wasn’t included in that $240,000 education: classes in life skills and decision-making.

With your kid at home, you find that he’s incapable of making a single choice on his own. Like when you’re working and he interrupts to ask how many blades is the best number for a multi-blade razor. Or when you’ve just crawled into bed and hear the familiar refrain of, “Mom, what can we eat?” All those years being your kid’s concierge and coach have created a monster.

So the time has come for you to cut the cord. And by that I mean: Take your kid off your Netflix account. He will be confused and upset at first, not understanding why this is happening to him, but it’s a great opportunity for him to sign up for something all by himself.

Which brings us to money. It’s finally time to channel your Angela Merkel and get tough with your young Alexis Tsipras. Put him on a consistent allowance and make him pay the extra fees incurred when he uses the ATM at the weird little deli rather than the one at his bank, a half-block away.

Next, nudge your kid to read books about self-motivation. Begin with baby steps: Don’t just hand her “Lean In” and “I Am Malala.” Your daughter’s great, but she’s no Malala. And the only thing she’s leaning in to is a bag of kettle corn while binge-watching “Orange Is the New Black.”

Instead, over dinner, casually drop a few pearls of wisdom from “Coach Wooden’s Pyramid of Success,” such as, “Make each day your masterpiece.” Let your kid decide whether getting a high score on her “Panda Pop Bubble Shooter” iPhone game qualifies. Then hope that John Wooden has piqued her curiosity and leave his book out with a packet of Sour Patch Xploderz on top. With luck, she’ll take the bait (candy and book).

Now it’s time to work on your kid’s inability to make a decision, which, let’s be honest, you’ve instilled over the years by jumping to answer all of her texts, even that time you were at the opera. “But,” you object, “it could have been an emergency!” It wasn’t. She couldn’t remember whether she liked Dijon mustard or mayo on her turkey wrap.

Set up some outings that nurture independence. Send your kid to the grocery store with orders to buy a week of dinner supplies. She’ll ask a hundred questions about what to get, but just respond with, “Whatever looks good to you” or, “Have fun with it.” She will look at you with panic, but don’t lose your resolve. Send her out and turn your phone off to avoid a barrage of texts, such as, “They’re out of bacterial wipes to clean off the shopping cart handle. What should I do?”

Rest assured, in a couple of hours, she’ll return with “dinner”—frozen waffles and a bag of Skinny Pop popcorn. Tough it out and serve it for dinner: The name of the game is positive reinforcement.

Once she’s back you’ll inevitably get hit with more questions, like, “It’s not lost, but how expensive is that remote key for the car?” Take a deep breath and just say, “Um, I’m not sure. Why don’t you Google it?”

Read the entire story here.

Multitasking: A Powerful and Diabolical Illusion

Our increasingly ubiquitous technology makes possible all manner of things that would have been unimaginable just decades ago. We carry smartphones that pack more computational power than the mainframes of a generation ago. Yet for all this power at our fingertips we seem to forget that we are still very much human animals with limitations. One such “shortcoming” [your friendly editor believes it’s a boon] is our inability to multitask like our phones. I’ve written about this before, and am compelled to do so again after reading this thoughtful essay by Daniel J. Levitin, extracted from his book The Organized Mind: Thinking Straight in the Age of Information Overload. I even had to use his phrasing for the title of this post.

From the Guardian:

Our brains are busier than ever before. We’re assaulted with facts, pseudo facts, jibber-jabber, and rumour, all posing as information. Trying to figure out what you need to know and what you can ignore is exhausting. At the same time, we are all doing more. Thirty years ago, travel agents made our airline and rail reservations, salespeople helped us find what we were looking for in shops, and professional typists or secretaries helped busy people with their correspondence. Now we do most of those things ourselves. We are doing the jobs of 10 different people while still trying to keep up with our lives, our children and parents, our friends, our careers, our hobbies, and our favourite TV shows.

Our smartphones have become Swiss army knife–like appliances that include a dictionary, calculator, web browser, email, Game Boy, appointment calendar, voice recorder, guitar tuner, weather forecaster, GPS, texter, tweeter, Facebook updater, and flashlight. They’re more powerful and do more things than the most advanced computer at IBM corporate headquarters 30 years ago. And we use them all the time, part of a 21st-century mania for cramming everything we do into every single spare moment of downtime. We text while we’re walking across the street, catch up on email while standing in a queue – and while having lunch with friends, we surreptitiously check to see what our other friends are doing. At the kitchen counter, cosy and secure in our domicile, we write our shopping lists on smartphones while we are listening to that wonderfully informative podcast on urban beekeeping.

But there’s a fly in the ointment. Although we think we’re doing several things at once, multitasking, this is a powerful and diabolical illusion. Earl Miller, a neuroscientist at MIT and one of the world experts on divided attention, says that our brains are “not wired to multitask well… When people think they’re multitasking, they’re actually just switching from one task to another very rapidly. And every time they do, there’s a cognitive cost in doing so.” So we’re not actually keeping a lot of balls in the air like an expert juggler; we’re more like a bad amateur plate spinner, frantically switching from one task to another, ignoring the one that is not right in front of us but worried it will come crashing down any minute. Even though we think we’re getting a lot done, ironically, multitasking makes us demonstrably less efficient.

Multitasking has been found to increase the production of the stress hormone cortisol as well as the fight-or-flight hormone adrenaline, which can overstimulate your brain and cause mental fog or scrambled thinking. Multitasking creates a dopamine-addiction feedback loop, effectively rewarding the brain for losing focus and for constantly searching for external stimulation. To make matters worse, the prefrontal cortex has a novelty bias, meaning that its attention can be easily hijacked by something new – the proverbial shiny objects we use to entice infants, puppies, and kittens. The irony here for those of us who are trying to focus amid competing activities is clear: the very brain region we need to rely on for staying on task is easily distracted. We answer the phone, look up something on the internet, check our email, send an SMS, and each of these things tweaks the novelty-seeking, reward-seeking centres of the brain, causing a burst of endogenous opioids (no wonder it feels so good!), all to the detriment of our staying on task. It is the ultimate empty-caloried brain candy. Instead of reaping the big rewards that come from sustained, focused effort, we instead reap empty rewards from completing a thousand little sugar-coated tasks.

In the old days, if the phone rang and we were busy, we either didn’t answer or we turned the ringer off. When all phones were wired to a wall, there was no expectation of being able to reach us at all times – one might have gone out for a walk or been between places – and so if someone couldn’t reach you (or you didn’t feel like being reached), it was considered normal. Now more people have mobile phones than have toilets. This has created an implicit expectation that you should be able to reach someone when it is convenient for you, regardless of whether it is convenient for them. This expectation is so ingrained that people in meetings routinely answer their mobile phones to say, “I’m sorry, I can’t talk now, I’m in a meeting.” Just a decade or two ago, those same people would have let a landline on their desk go unanswered during a meeting, so different were the expectations for reachability.

Just having the opportunity to multitask is detrimental to cognitive performance. Glenn Wilson, former visiting professor of psychology at Gresham College, London, calls it info-mania. His research found that being in a situation where you are trying to concentrate on a task, and an email is sitting unread in your inbox, can reduce your effective IQ by 10 points. And although people ascribe many benefits to marijuana, including enhanced creativity and reduced pain and stress, it is well documented that its chief ingredient, cannabinol, activates dedicated cannabinol receptors in the brain and interferes profoundly with memory and with our ability to concentrate on several things at once. Wilson showed that the cognitive losses from multitasking are even greater than the cognitive losses from pot-smoking.

Russ Poldrack, a neuroscientist at Stanford, found that learning information while multitasking causes the new information to go to the wrong part of the brain. If students study and watch TV at the same time, for example, the information from their schoolwork goes into the striatum, a region specialised for storing new procedures and skills, not facts and ideas. Without the distraction of TV, the information goes into the hippocampus, where it is organised and categorised in a variety of ways, making it easier to retrieve. MIT’s Earl Miller adds, “People can’t do [multitasking] very well, and when they say they can, they’re deluding themselves.” And it turns out the brain is very good at this deluding business.

Then there are the metabolic costs that I wrote about earlier. Asking the brain to shift attention from one activity to another causes the prefrontal cortex and striatum to burn up oxygenated glucose, the same fuel they need to stay on task. And the kind of rapid, continual shifting we do with multitasking causes the brain to burn through fuel so quickly that we feel exhausted and disoriented after even a short time. We’ve literally depleted the nutrients in our brain. This leads to compromises in both cognitive and physical performance. Among other things, repeated task switching leads to anxiety, which raises levels of the stress hormone cortisol in the brain, which in turn can lead to aggressive and impulsive behaviour. By contrast, staying on task is controlled by the anterior cingulate and the striatum, and once we engage the central executive mode, staying in that state uses less energy than multitasking and actually reduces the brain’s need for glucose.

To make matters worse, lots of multitasking requires decision-making: Do I answer this text message or ignore it? How do I respond to this? How do I file this email? Do I continue what I’m working on now or take a break? It turns out that decision-making is also very hard on your neural resources and that little decisions appear to take up as much energy as big ones. One of the first things we lose is impulse control. This rapidly spirals into a depleted state in which, after making lots of insignificant decisions, we can end up making truly bad decisions about something important. Why would anyone want to add to their daily weight of information processing by trying to multitask?

Read the entire article here.

Your Goldfish is Better Than You

Common_goldfish

Well, perhaps not at philosophical musings or mathematics. But, your little orange aquatic friend now has an attention span that is longer than yours. And, it’s all thanks to mobile devices and multi-tasking on multiple media platforms. [Psst, by the way, multi-tasking at the level of media consumption is a fallacy.] On average, the adult attention span is now down to a laughably paltry 8 seconds, whereas the lowly goldfish comes in at 9 seconds. Where of course that leaves your in-betweeners and teenagers is anyone’s guess.

From the Independent:

Humans have become so obsessed with portable devices and overwhelmed by content that we now have attention spans shorter than that of the previously jokingly juxtaposed goldfish.

Microsoft surveyed 2,000 people and used electroencephalograms (EEGs) to monitor the brain activity of another 112 in the study, which sought to determine the impact that pocket-sized devices and the increased availability of digital media and information have had on our daily lives.

Among the good news in the 54-page report is that our ability to multi-task has drastically improved in the information age, but unfortunately attention spans have fallen.

In 2000 the average attention span was 12 seconds, but this has now fallen to just eight. The goldfish is believed to be able to maintain a solid nine.

“Canadians [who were tested] with more digital lifestyles (those who consume more media, are multi-screeners, social media enthusiasts, or earlier adopters of technology) struggle to focus in environments where prolonged attention is needed,” the study reads.

“While digital lifestyles decrease sustained attention overall, it’s only true in the long-term. Early adopters and heavy social media users front load their attention and have more intermittent bursts of high attention. They’re better at identifying what they want/don’t want to engage with and need less to process and commit things to memory.”

Anecdotally, many of us can relate to the increasing inability to focus on tasks, being distracted by checking your phone or scrolling down a news feed.

Another recent study by the National Centre for Biotechnology Information and the National Library of Medicine in the US found that 79 per cent of respondents used portable devices while watching TV (known as dual-screening) and 52 per cent check their phone every 30 minutes.

Read the entire story here.

Image: Common Goldfish. Public Domain.


Women Are From Venus, Men Can’t Remember

Yet another body of research underscores how different women are from men. This time, we are told, the sexes generally encode and recall memories differently. So, the next time you take issue with a spouse (of a different gender) about a — typically trivial — past event, keep in mind that your own actions, mood and gender will affect your recall. If you’re female, your memories may be much more vivid than your male counterpart’s, but not necessarily more correct. If you (male) won last night’s argument, your spouse (female) will — unfortunately for you — remember it more accurately than you, which of course will lead to another argument.

From WSJ:

Carrie Aulenbacher remembers the conversation clearly: Her husband told her he wanted to buy an arcade machine he found on eBay. He said he’d been saving up for it as a birthday present to himself. The spouses sat at the kitchen table and discussed where it would go in the den.

Two weeks later, Ms. Aulenbacher came home from work and found two arcade machines in the garage—and her husband beaming with pride.

“What are these?” she demanded.

“I told you I was picking them up today,” he replied.

She asked him why he’d bought two. He said he’d told her he was getting “a package deal.” She reminded him they’d measured the den for just one. He stood his ground.

“I believe I told her there was a chance I was going to get two,” says Joe Aulenbacher, who is 37 and lives in Erie, Pa.

“It still gets me going to think about it a year later,” says Ms. Aulenbacher, 36. “My home is now overrun with two machines I never agreed upon.” The couple compromised by putting one game in the den and the other in Mr. Aulenbacher’s weight room.

It is striking how many arguments in a relationship start with two different versions of an event: “Your tone of voice was rude.” “No it wasn’t.” “You didn’t say you’d be working late.” “Yes I did.” “I told you we were having dinner with my mother tonight.” “No, honey. You didn’t.”

How can two people have different memories of the same event? It starts with the way each person perceives the event in the first place—and how they encoded that memory. “You may recall something differently at least in part because you understood it differently at the time,” says Dr. Michael Ross, professor emeritus in the psychology department at the University of Waterloo in Ontario, Canada, who has studied memory for many years.

Researchers know that spouses sometimes can’t even agree on concrete events that happened in the past 24 hours—such as whether they had an argument or whether one received a gift from the other. A study in the early 1980s, published in the journal “Behavioral Assessment,” found that couples couldn’t perfectly agree on whether they had sex the previous night.

Women tend to remember more about relationship issues than men do. When husbands and wives are asked to recall concrete relationship events, such as their first date, an argument or a recent vacation, women’s memories are more vivid and detailed.

But not necessarily more accurate. When given a standard memory test where they are shown names or pictures and then asked to recall them, women do just about the same as men.

Researchers have found that women report having more emotions during relationship events than men do. They may remember events better because they pay more attention to the relationship and reminisce more about it.

People also remember their own actions better. So they can recall what they did, just not what their spouse did. Researchers call this an egocentric bias, and study it by asking people to recall their contributions to events, as well as their spouse’s. Who cleans the kitchen more? Who started the argument? Whether the event is positive or negative, people tend to believe that they had more responsibility.

Your mood—both when an event happens and when you recall it later—plays a big part in memory, experts say. If you are in a positive mood or feeling positive about the other person, you will more likely recall a positive experience or give a positive interpretation to a negative experience. Similarly, negative moods tend to reap negative memories.

Negative moods may also cause stronger memories. A person who lost an argument remembers it more clearly than the person who won it, says Dr. Ross. Men tend to win more arguments, he says, which may help to explain why women remember the spat more. But men who lost an argument remember it as well as women who lost.

Read the entire article here.

We Are All Always Right, All of the Time

You already know this: you believe that your opinion is correct all the time, about everything. And, interestingly enough, your friends and neighbors believe that they are always right too. Oh, and the colleague at the office with whom you argue all the time — she’s right all the time too.

How can this be, when in an increasingly science-driven, objective universe facts trump opinion? Well, not so fast. It seems that we humans have an internal mechanism that colors our views based on a need for acceptance within a broader group. That is, we generally tend to spin our rational views in favor of group consensus, versus supporting the views of a subject matter expert, which might polarize the group. This is both good and bad. Good because it reinforces the broader benefits of being within a group; bad because we are more likely to reject opinion, evidence and fact from experts outside of our group — think climate change.

From the Washington Post:

It’s both the coolest — and also in some ways the most depressing — psychology study ever.

Indeed, it’s so cool (and so depressing) that the name of its chief finding — the Dunning-Kruger effect — has at least halfway filtered into public consciousness. In the classic 1999 paper, Cornell researchers David Dunning and Justin Kruger found that the less competent people were in three domains — humor, logic, and grammar — the less likely they were to be able to recognize that. Or as the researchers put it:

We propose that those with limited knowledge in a domain suffer from a dual burden: Not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it.

Dunning and Kruger didn’t directly apply this insight to our debates about science. But I would argue that the effect named after them certainly helps to explain phenomena like vaccine denial, in which medical authorities have voiced a very strong opinion, but some parents just keep on thinking that, somehow, they’re in a position to challenge or ignore this view.

So why do I bring this classic study up now?

The reason is that an important successor to the Dunning-Kruger paper has just come out — and it, too, is pretty depressing (at least for those of us who believe that domain expertise is a thing to be respected and, indeed, treasured). This time around, psychologists have not uncovered an endless spiral of incompetence and the inability to perceive it. Rather, they’ve shown that people have an “equality bias” when it comes to competence or expertise, such that even when it’s very clear that one person in a group is more skilled, expert, or competent (and the other less), they are nonetheless inclined to seek out a middle ground in determining how correct different viewpoints are.

Yes, that’s right — we’re all right, nobody’s wrong, and nobody gets hurt feelings.

The new study, just published in the Proceedings of the National Academy of Sciences, is by Ali Mahmoodi of the University of Tehran and a long list of colleagues from universities in the UK, Germany, China, Denmark, and the United States. And no wonder: The research was transnational, and the same experiment — with the same basic results — was carried out across cultures in China, Denmark, and Iran.

Read the entire story here.

The Killer Joke and the Killer Idea

Some jokes can make you laugh until you cry. Some jokes can kill. And, research shows that thoughts alone can have equally devastating consequences.

From BBC:

Beware the scaremongers. Like a witch doctor’s spell, their words might be spreading modern plagues.

We have long known that expectations of a malady can be as dangerous as a virus. In the same way that voodoo shamans could harm their victims through the power of suggestion, priming someone to think they are ill can often produce the actual symptoms of a disease. Vomiting, dizziness, headaches, and even death, could be triggered through belief alone. It’s called the “nocebo effect”.

But it is now becoming clear just how easily those dangerous beliefs can spread through gossip and hearsay – with potent effect. It may be the reason why certain houses seem cursed with illness, and why people living near wind turbines report puzzling outbreaks of dizziness, insomnia and vomiting. If you have ever felt “fluey” after a vaccination, believed your cell phone was giving you a headache, or suffered an inexplicable food allergy, you may have also fallen victim to a nocebo jinx. “The nocebo effect shows the brain’s power,” says Dimos Mitsikostas, from Athens Naval Hospital in Greece. “And we cannot fully explain it.”

A killer joke

Doctors have long known that beliefs can be deadly – as demonstrated by a rather nasty student prank that went horribly wrong. The 18th Century Viennese medic, Erich Menninger von Lerchenthal, describes how students at his medical school picked on a much-disliked assistant. Planning to teach him a lesson, they sprung upon him before announcing that he was about to be decapitated. Blindfolding him, they bowed his head onto the chopping block, before dropping a wet cloth on his neck. Convinced it was the kiss of a steel blade, the poor man “died on the spot”.

While anecdotes like this abound, modern researchers had mostly focused on the mind’s ability to heal, not harm – the “placebo effect”, from the Latin for “I will please”. Every clinical trial now randomly assigns patients to either a real drug, or a placebo in the form of an inert pill. The patient doesn’t know which they are taking, and even those taking the inert drug tend to show some improvement – thanks to their faith in the treatment.

Yet alongside the benefits, people taking placebos often report puzzling side effects – nausea, headaches, or pain – that are unlikely to come from an inert tablet. The problem is that people in a clinical trial are given exactly the same health warnings whether they are taking the real drug or the placebo – and somehow, the expectation of the symptoms can produce physical manifestations in some placebo takers. “It’s a consistent phenomenon, but medicine has never really dealt with it,” says Ted Kaptchuk at Harvard Medical School.

Over the last 10 years, doctors have shown that this nocebo effect – Latin for “I will harm” – is very common. Reviewing the literature, Mitsikostas has so far documented strong nocebo effects in many treatments for headache, multiple sclerosis, and depression. In trials for Parkinson’s disease, as many as 65% report adverse events as a result of their placebo. “And around one out of 10 treated will drop out of a trial because of nocebo, which is pretty high,” he says.

Although many of the side-effects are somewhat subjective – like nausea or pain – nocebo responses do occasionally show up as rashes and skin complaints, and they are sometimes detectable on physiological tests too. “It’s unbelievable – they are taking sugar pills and when you measure liver enzymes, they are elevated,” says Mitsikostas.

And for those who think these side effects are somehow “deliberately” willed or imagined, measures of nerve activity following nocebo treatment have shown that the spinal cord begins responding to heightened pain before conscious deliberation would even be possible.

Consider the near fatal case of “Mr A”, reported by doctor Roy Reeves in 2007. Mr A was suffering from depression when he consumed a whole bottle of pills. Regretting his decision, Mr A rushed to ER, and promptly collapsed at reception. It looked serious; his blood pressure had plummeted, and he was hyperventilating; he was immediately given intravenous fluids. Yet blood tests could find no trace of the drug in his system. Four hours later, another doctor arrived to inform Reeves that the man had been in the placebo arm of a drugs trial; he had “overdosed” on sugar tablets. Upon hearing the news, the relieved Mr A soon recovered.

We can never know whether the nocebo effect would have actually killed Mr A, though Fabrizio Benedetti at the University of Turin Medical School thinks it is certainly possible. He has scanned subjects’ brains as they undergo nocebo suggestions, which seems to set off a chain of activation in the hypothalamus, and the pituitary and adrenal glands – areas that deal with extreme threats to our body. If your fear and belief were strong enough, the resulting cocktail of hormones could be deadly, he says.

Read the entire story here.

True “False Memory”

Apparently it is surprisingly easy to convince people to remember a crime, or other action, that they never committed. Makes one wonder how many of the roughly 2 million people in US prisons are incarcerated because of false memories, whether held by the inmates themselves or by witnesses.

From ars technica:

The idea that memories are not as reliable as we think they are is disconcerting, but it’s pretty well-established. Various studies have shown that participants can be persuaded to create false childhood memories—of being lost in a shopping mall or hospitalized, or even highly implausible scenarios like having tea with Prince Charles.

The creation of false memories has obvious implications for the legal system, as it gives us reasons to distrust both eyewitness accounts and confessions. It’s therefore important to know exactly what kinds of false memories can be created, what influences the creation of a false memory, and whether false recollections can be distinguished from real ones.

A recent paper in Psychological Science found that 71 percent of participants exposed to certain interview techniques developed false memories of having committed a crime as a teenager. In reality, none of these people had experienced contact with the police during the age bracket in question.

After establishing a pool of potential participants, the researchers sent out questionnaires to the caregivers of these individuals. They eliminated any participants who had been involved in some way with an assault or theft, or had other police contact between the ages of 11 and 14. They also asked the caregivers to describe in detail a highly emotional event that the participant had experienced at this age. The caregivers were asked not to discuss the content of the questionnaire with the participants.

The 60 eligible participants were divided into two groups: one that would be given false memories of committing an assault, theft, or assault with a weapon, and another that would be provided with false memories of another emotional event—an injury, an attack by a dog, or the loss of a large sum of money. In the first of three interviews with each participant, the interviewer presented the true memory that had been provided by the caregiver. Once the interviewer’s credibility and knowledge of the participant’s background had been established, the false memory was presented.

For both kinds of memory, the interviewer gave the participant “cues”, such as their age at the time, people who had been involved, and the time of year. Participants were then asked to recall the details of what had happened. No participants recalled the false event the first time it was mentioned—which would have rung alarm bells—but were reassured that people could often uncover memories like these through effort.

A number of tactics were used to induce the false memory. Social pressure was applied to encourage recall of details, the interviewer attempted to build a rapport with the participants, and the participants were told that their caregivers had corroborated the facts. They were also encouraged to use visualization techniques to “uncover” the memory.

In each of the three interviews, participants were asked to provide as many details as they could for both events. After the final interview, they were informed that the second memory was false, and asked whether they had really believed the events had occurred. They were also asked to rate how surprised they were to find out that it was false. Only participants who answered that they had genuinely believed the false memory, and who could give more than ten details of the event, were classified as having a true false memory. Of the participants in the group with criminal false stories, 71 percent developed a “true” false memory. The group with non-criminal false stories was not significantly different, with 77 percent of participants classified as having a false memory. The details participants provided for their false memories did not differ significantly in either quality or quantity from their true memories.

This study is only a beginning, and there is still a great deal of work to be done. There are a number of factors that couldn’t be controlled for but which may have influenced the results. For instance, the researchers suggest that, since only one interviewer was involved, her individual characteristics may have influenced the results, raising the question of whether only certain kinds of interviewers can achieve these effects. It isn’t clear whether participants were fully honest about having believed in the false memory, since they could have just been trying to cooperate; the results could also have been affected by the fact that there were no negative consequences to telling the false story.

Read the entire article here.

Building a Memory Palace

Feats of memory have long been a staple of human endeavor — memorizing and recalling pi to hundreds of decimal places, for instance. Nowadays, however, memorization is a competitive sport replete with grand prizes, worthy of a place in an X Games tournament.

From the NYT:

The last match of the tournament had all the elements of a classic showdown, pitting style versus stealth, quickness versus deliberation, and the world’s foremost card virtuoso against its premier numbers wizard.

If not quite Ali-Frazier or Williams-Sharapova, the duel was all the audience of about 100 could ask for. They had come to the first Extreme Memory Tournament, or XMT, to see a fast-paced, digitally enhanced memory contest, and that’s what they got.

The contest, an unusual collaboration between industry and academic scientists, featured one-minute matches between 16 world-class “memory athletes” from all over the world as they met in a World Cup-like elimination format. The grand prize was $20,000; the potential scientific payoff was large, too.

One of the tournament’s sponsors, the company Dart NeuroScience, is working to develop drugs for improved cognition. The other, Washington University in St. Louis, sent a research team with a battery of cognitive tests to determine what, if anything, sets memory athletes apart. Previous research was sparse and inconclusive.

Yet as the two finalists, both Germans, prepared to face off — Simon Reinhard, 35, a lawyer who holds the world record in card memorization (a deck in 21.19 seconds), and Johannes Mallow, 32, a teacher with the record for memorizing digits (501 in five minutes) — the Washington group had one preliminary finding that wasn’t obvious.

“We found that one of the biggest differences between memory athletes and the rest of us,” said Henry L. Roediger III, the psychologist who led the research team, “is in a cognitive ability that’s not a direct measure of memory at all but of attention.”

The Memory Palace

The technique the competitors use is no mystery.

People have been performing feats of memory for ages, scrolling out pi to hundreds of digits, or phenomenally long verses, or word pairs. Most store the studied material in a so-called memory palace, associating the numbers, words or cards with specific images they have already memorized; then they mentally place the associated pairs in a familiar location, like the rooms of a childhood home or the stops on a subway line.

The Greek poet Simonides of Ceos is credited with first describing the method, in the fifth century B.C., and it has been vividly described in popular books, most recently “Moonwalking With Einstein,” by Joshua Foer.

Each competitor has his or her own variation. “When I see the eight of diamonds and the queen of spades, I picture a toilet, and my friend Guy Plowman,” said Ben Pridmore, 37, an accountant in Derby, England, and a former champion. “Then I put those pictures on High Street in Cambridge, which is a street I know very well.”

As these images accumulate during memorization, they tell an increasingly bizarre but memorable story. “I often use movie scenes as locations,” said James Paterson, 32, a high school psychology teacher in Ascot, near London, who competes in world events. “In the movie ‘Gladiator,’ which I use, there’s a scene where Russell Crowe is in a field, passing soldiers, inspecting weapons.”

Mr. Paterson uses superheroes to represent combinations of letters or numbers: “I might have Batman — one of my images — playing Russell Crowe, and something else playing the horse, and so on.”
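
[An aside from your friendly editor: if the bookkeeping behind a memory palace sounds abstract, here is a minimal, purely illustrative sketch of it in Python. The loci, images and cards are all invented for the example; real competitors use far richer, heavily drilled systems, as the quotes above suggest.]

```python
# A toy "memory palace": peg each item's pre-memorized image to the next
# stop on a fixed, familiar route, then recall by walking the route in order.
# All loci, images and items here are invented purely for illustration.

LOCI = ["front door", "hallway mirror", "staircase", "kitchen table", "back garden"]

IMAGE_FOR = {
    "queen of spades": "a chess queen in full armor",
    "7 of clubs": "a juggling clown",
    "ace of diamonds": "a glittering crown",
    "4 of hearts": "a barking dog",
}

def build_palace(items, loci=LOCI):
    """Pair each item (via its image) with successive loci along the route."""
    if len(items) > len(loci):
        raise ValueError("route too short: add more loci")
    return [(locus, item, IMAGE_FOR[item]) for locus, item in zip(loci, items)]

def recall(palace):
    """Walk the route in order and read the items back off the images."""
    return [item for _locus, item, _image in palace]

if __name__ == "__main__":
    to_memorize = ["queen of spades", "7 of clubs", "ace of diamonds"]
    palace = build_palace(to_memorize)
    for locus, item, image in palace:
        print(f"At the {locus}: picture {image} -> {item}")
    assert recall(palace) == to_memorize
```

[The point is only that recall reduces to an ordered walk through places you already know; the hard part, as the athletes make clear, is building and drilling the image vocabulary in the first place.]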

The material that competitors attempt to memorize falls into several standard categories. Shuffled decks of cards. Random words. Names matched with faces. And numbers, either binary (ones and zeros) or integers. They are given a set amount of time to study — up to one minute in this tournament, an hour or more in others — before trying to reproduce as many cards, words or digits in the order presented.

Now and then, a challenger boasts online of having discovered an entirely new method, and shows up at competitions to demonstrate it.

“Those people are easy to find, because they come in last, or close to it,” said another world-class competitor, Boris Konrad, 29, a German postdoctoral student in neuroscience. “Everyone here uses this same type of technique.”

Anyone can learn to construct a memory palace, researchers say, and with practice remember far more detail of a particular subject than before. The technique is accessible enough that preteens pick it up quickly, and Mr. Paterson has integrated it into his teaching.

“I’ve got one boy, for instance, he has no interest in academics really, but he knows the Premier League, every team, every player,” he said. “I’m working with him, and he’s using that knowledge as scaffolding to help remember what he’s learning in class.”

Experts in Forgetting

The competitors gathered here for the XMT are not just anyone, however. This is the all-world team, an elite club of laser-smart types who take a nerdy interest in stockpiling facts and pushing themselves hard.

In his doctoral study of 30 world-class performers (most from Germany, which has by far the highest concentration because there are more competitions), Mr. Konrad has found as much. The average I.Q.: 130. Average study time: 1,000 to 2,000 hours and counting. The top competitors all use some variation of the memory-palace system and test, retest and tweak it.

“I started with my own system, but now I use his,” said Annalena Fischer, 20, pointing to her boyfriend, Christian Schäfer, 22, whom she met at a 2010 memory competition in Germany. “Except I don’t use the distance runners he uses; I don’t know anything about the distance runners.” Both are advanced science students and participants in Mr. Konrad’s study.

One of the Washington University findings is predictable, if still preliminary: Memory athletes score very highly on tests of working memory, the mental sketchpad that serves as a shopping list of information we can hold in mind despite distractions.

One way to measure working memory is to have subjects solve a list of equations (5 + 4 = x; 8 + 9 = y; 7 + 2 = z; and so on) while keeping the middle numbers in mind (4, 9 and 2 in the above example). Elite memory athletes can usually store seven items, the top score on the test the researchers used; the average for college students is around two.
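
[Editor’s aside: for the curious, here is a toy sketch of that kind of operation-span trial in Python. It is not the actual instrument the Washington University team used; the trial length and scoring rule are invented for illustration.]

```python
# A rough sketch of an operation-span-style working-memory trial, loosely
# modeled on the task described above. NOT the researchers' actual test;
# the trial length and scoring rule are invented for illustration.
import random

def make_trial(n_items=3, lo=1, hi=9):
    """Generate n_items sums of the form a + b; the b's must be held in mind."""
    equations = [(random.randint(lo, hi), random.randint(lo, hi)) for _ in range(n_items)]
    to_remember = [b for _a, b in equations]
    return equations, to_remember

def score_recall(to_remember, recalled):
    """Count middle numbers recalled in the correct serial position."""
    return sum(1 for target, answer in zip(to_remember, recalled) if target == answer)

if __name__ == "__main__":
    equations, targets = make_trial()
    for a, b in equations:
        print(f"{a} + {b} = ?   (solve it, and hold the {b} in mind)")
    pretend_recall = targets[:2] + [0]   # imagine the subject missed the last item
    print("span score:", score_recall(targets, pretend_recall), "out of", len(targets))
```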

“And college students tend to be good at this task,” said Dr. Roediger, a co-author of the new book “Make It Stick: The Science of Successful Learning.” “What I’d like to do is extend the scoring up to, say, 21, just to see how far the memory athletes can go.”

Yet this finding raises another question: Why don’t the competitors’ memory palaces ever fill up? Players usually have many favored locations to store studied facts, but they practice and compete repeatedly. They use and reuse the same blueprints hundreds of times, and the new images seem to overwrite the old ones — virtually without error.

“Once you’ve remembered the words or cards or whatever it is, and reported them, they’re just gone,” Mr. Paterson said.

Many competitors say the same: Once any given competition is over, the numbers or words or facts are gone. But this is one area in which they have less than precise insight.

In its testing, which began last year, the Washington University team has given memory athletes surprise tests on “old” material — lists of words they’d been tested on the day before. On Day 2, they recalled an average of about three-quarters of the words they memorized on Day 1 (college students remembered fewer than 5 percent). That is, despite what competitors say, the material is not gone; far from it.

Yet to install a fresh image-laden “story” in any given memory palace, a memory athlete must clear away the old one in its entirety. The same process occurs when we change a password: The old one must be suppressed, so it doesn’t interfere with the new one.

One term for that skill is “attentional control,” and psychologists have been measuring it for years with standardized tests. In the best known, the Stroop test, people see words flash by on a computer screen and name the color in which a word is presented. Answering is nearly instantaneous when the color and the word match — “red” displayed in red — but slower when there’s a mismatch, like “red” displayed in blue.
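
[Editor’s aside: a Stroop-style trial is simple enough to mock up. The sketch below builds matched and mismatched trials and computes the usual interference score, the difference in mean reaction time between the two conditions. It is an illustration only, not the standardized instrument psychologists administer, and the reaction times in the demo are made up.]

```python
# A minimal illustration of Stroop-style trials and the interference score
# (mean mismatch RT minus mean match RT). A toy sketch, not the standardized
# test psychologists administer; the demo reaction times are fake.
import random
from statistics import mean

COLORS = ["red", "blue", "green", "yellow"]

def make_trials(n=20):
    """Return (word, ink_color, condition) trials, alternating congruent/incongruent."""
    trials = []
    for i in range(n):
        word = random.choice(COLORS)
        if i % 2 == 0:
            ink = word                                             # congruent: "red" shown in red
        else:
            ink = random.choice([c for c in COLORS if c != word])  # incongruent: colors clash
        condition = "congruent" if ink == word else "incongruent"
        trials.append((word, ink, condition))
    return trials

def interference(trials, reaction_times_ms):
    """Stroop interference: mean incongruent RT minus mean congruent RT."""
    by_condition = {"congruent": [], "incongruent": []}
    for (_word, _ink, condition), rt in zip(trials, reaction_times_ms):
        by_condition[condition].append(rt)
    return mean(by_condition["incongruent"]) - mean(by_condition["congruent"])

if __name__ == "__main__":
    trials = make_trials(6)
    fake_rts = [520, 610, 500, 650, 530, 640]   # made-up reaction times, in ms
    for (word, ink, condition), rt in zip(trials, fake_rts):
        print(f'"{word.upper()}" shown in {ink} ink  [{condition}]  RT={rt} ms')
    print("interference:", round(interference(trials, fake_rts)), "ms")
```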

Read the entire article here.

Intimate Anonymity

A new mobile app lets you share all your intimate details with a stranger for 20 days. The fascinating part of this social experiment is that the stranger remains anonymous throughout. The app, known as 20 Day Stranger, is brought to us by the venerable MIT Media Lab. It may never catch on, but you can be sure that psychologists are gleefully awaiting some data.

From Slate:

Social media is all about connecting with people you know, people you sort of know, or people you want to know. But what about all those people you didn’t know you wanted to know? They’re out there, too, and the new iPhone app 20 Day Stranger wants to put you in touch with them. Created by the MIT Media Lab’s Playful Systems research group, the app connects strangers and allows them to update each other about any and every detail of their lives for 20 days. But the people are totally anonymous and can interact directly only at the end of their 20 days together, when they can exchange one message each.

20 Day Stranger uses information from the iPhone’s sensors to alert your stranger-friend when you wake up (and start moving the phone), when you’re in a car or bus (from GPS tracking), and where you are. But it isn’t totally privacy-invading: The app also takes steps to keep both people anonymous. When it shows your stranger-friend that you’re walking around somewhere, it accompanies the notification with images from a half-mile radius of where you actually are on Google Maps. Your stranger-friend might be able to figure out what area you’re in, or they might not.
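
[Editor’s aside: the article doesn’t say how the app implements the half-mile blur, but the general idea (report a point jittered at random within a fixed radius rather than your true position) is easy to sketch. The Python below is a hypothetical illustration, not MIT’s code, and the sample coordinates are made up.]

```python
# A hypothetical sketch of location fuzzing: report a point chosen uniformly
# at random within a fixed radius of the true position, instead of the position
# itself. Illustrative only; not how 20 Day Stranger actually works.
import math
import random

EARTH_RADIUS_M = 6_371_000.0
HALF_MILE_M = 804.67

def fuzz_location(lat_deg, lon_deg, radius_m=HALF_MILE_M):
    """Return a (lat, lon) sampled uniformly from a disc of radius_m around the input."""
    distance = radius_m * math.sqrt(random.random())   # sqrt gives a uniform spread over the disc
    bearing = random.uniform(0, 2 * math.pi)
    # Small-distance approximation: convert the metre offsets to degrees.
    dlat = (distance * math.cos(bearing)) / EARTH_RADIUS_M
    dlon = (distance * math.sin(bearing)) / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg)))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)

if __name__ == "__main__":
    true_lat, true_lon = 42.3601, -71.0874   # made-up coordinates near the MIT campus
    print(fuzz_location(true_lat, true_lon))
```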

Kevin Slavin, the director of Playful Systems, explained to Fast Company that the app’s goal is to introduce people online in a positive and empathetic way, rather than one that’s filled with suspicion or doubt. Though 20 Day Stranger is currently being beta tested, Playful Systems’ goal is to generally release it in the App Store. But the group is worried about getting people to adopt it all over instead of building up user bases in certain geographic areas. “There’s no one type of person that will make it useful,” Slavin said. “It’s the heterogeneous quality of everyone in aggregate. Which is a bad [promotional] strategy if you’re making commercial software.”

At this point it’s not that rare to interact frequently with someone you’ve never met in person on social media. What’s unusual is not to know their name or anything about who they are. But an honest window into another person’s life without the pressure of identity could expand your worldview and maybe even stimulate introspection. It sounds like a step up from Secret, that’s for sure.

Read the entire article here.

Paper is the Next Big Thing

Da-Vinci-Hammer-Codex

Luddites and technophobes rejoice: paper-bound books may be with us for quite some time. And, there may be some genuinely scientific reasons why physical books will endure. Recent research shows that people learn more effectively when reading from paper than from its digital offspring.

From Wired:

Paper books were supposed to be dead by now. For years, information theorists, marketers, and early adopters have told us their demise was imminent. Ikea even redesigned a bookshelf to hold something other than books. Yet in a world of screen ubiquity, many people still prefer to do their serious reading on paper.

Count me among them. When I need to read deeply—when I want to lose myself in a story or an intellectual journey, when focus and comprehension are paramount—I still turn to paper. Something just feels fundamentally richer about reading on it. And researchers are starting to think there’s something to this feeling.

To those who see dead tree editions as successors to scrolls and clay tablets in history’s remainder bin, this might seem like literary Luddism. But I e-read often: when I need to copy text for research or don’t want to carry a small library with me. There’s something especially delicious about late-night sci-fi by the light of a Kindle Paperwhite.

What I’ve read on screen seems slippery, though. When I later recall it, the text is slightly translucent in my mind’s eye. It’s as if my brain better absorbs what’s presented on paper. Pixels just don’t seem to stick. And often I’ve found myself wondering, why might that be?

The usual explanation is that internet devices foster distraction, or that my late-thirty-something brain isn’t that of a true digital native, accustomed to screens since infancy. But I have the same feeling when I am reading a screen that’s not connected to the internet and Twitter or online Boggle can’t get in the way. And research finds that kids these days consistently prefer their textbooks in print rather than pixels. Whatever the answer, it’s not just about habit.

Another explanation, expressed in a recent Washington Post article on the decline of deep reading, blames a sweeping change in our lifestyles: We’re all so multitasked and attention-fragmented that our brains are losing the ability to focus on long, linear texts. I certainly feel this way, but if I don’t read deeply as often or easily as I used to, it does still happen. It just doesn’t happen on screen, and not even on devices designed specifically for that experience.

Maybe it’s time to start thinking of paper and screens another way: not as an old technology and its inevitable replacement, but as different and complementary interfaces, each stimulating particular modes of thinking. Maybe paper is a technology uniquely suited for imbibing novels and essays and complex narratives, just as screens are for browsing and scanning.

“Reading is human-technology interaction,” says literacy professor Anne Mangen of Norway’s University of Stavanger. “Perhaps the tactility and physical permanence of paper yields a different cognitive and emotional experience.” This is especially true, she says, for “reading that can’t be done in snippets, scanning here and there, but requires sustained attention.”

Mangen is among a small group of researchers who study how people read on different media. It’s a field that goes back several decades, but yields no easy conclusions. People tended to read slowly and somewhat inaccurately on early screens. The technology, particularly e-paper, has improved dramatically, to the point where speed and accuracy aren’t now problems, but deeper issues of memory and comprehension are not yet well-characterized.

Complicating the scientific story further, there are many types of reading. Most experiments involve short passages read by students in an academic setting, and for this sort of reading, some studies have found no obvious differences between screens and paper. Those don’t necessarily capture the dynamics of deep reading, though, and nobody’s yet run the sort of experiment, involving thousands of readers in real-world conditions who are tracked for years on a battery of cognitive and psychological measures, that might fully illuminate the matter.

In the meantime, other research does suggest possible differences. A 2004 study found that students more fully remembered what they’d read on paper. Those results were echoed by an experiment that looked specifically at e-books, and another by psychologist Erik Wästlund at Sweden’s Karlstad University, who found that students learned better when reading from paper.

Wästlund followed up that study with one designed to investigate screen reading dynamics in more detail. He presented students with a variety of on-screen document formats. The most influential factor, he found, was whether they could see pages in their entirety. When they had to scroll, their performance suffered.

According to Wästlund, scrolling had two impacts, the most basic being distraction. Even the slight effort required to drag a mouse or swipe a finger requires a small but significant investment of attention, one that’s higher than flipping a page. Text flowing up and down a page also disrupts a reader’s visual attention, forcing eyes to search for a new starting point and re-focus.

Read the entire electronic article here.

Image: Leicester or Hammer Codex, by Leonardo da Vinci (1452-1519). Courtesy of Wikipedia / Public domain.


Need Some Exercise? Laugh

Duck_Soup

Your sense of humor and wit will keep your brain active and nimble. It will endear you to friends (often), family (usually) and bosses (sometimes). In addition, there is growing evidence that being an amateur (or professional) comedian or just a connoisseur of good jokes will help you physically as well.

From WSJ:

“I just shot an elephant in my pajamas,” goes the old Groucho Marx joke. “How he got in my pajamas I don’t know.”

You’ve probably heard that one before, or something similar. For example, while viewing polling data for the 2008 presidential election on Comedy Central, Stephen Colbert deadpanned, “If I’m reading this graph correctly…I’d be very surprised.”

Zingers like these aren’t just good lines. They reveal something unusual about how the mind operates—and they show us how humor works. Simply put, the brain likes to jump the gun. We are always guessing where things are going, and we often get it wrong. But this isn’t necessarily bad. It’s why we laugh.

Humor is a form of exercise—a way of keeping the brain engaged. Mr. Colbert’s line is a fine example of this kind of mental calisthenics. If he had simply observed that polling data are hard to interpret, you would have heard crickets chirping. Instead, he misdirected his listeners, leading them to expect ponderous analysis and then bolting in the other direction to declare his own ignorance. He got a laugh as his audience’s minds caught up with him and enjoyed the experience of being fooled.

We benefit from taxing our brains with the mental exercise of humor, much as we benefit from the physical exercise of a long run or a tough tennis match. Comedy extends our mental stamina and improves our mental flexibility. A 1976 study by Avner Ziv of Tel Aviv University found that those who listened to a comedy album before taking a creativity test performed 20% better than those who weren’t exposed to the routine beforehand. In 1987, researchers at the University of Maryland found that watching comedy more than doubles our ability to solve brain teasers, like the so-called Duncker candle problem, which challenges people to attach a candle to a wall using only a book of matches and a box of thumbtacks. Research published in 1998 by psychologist Heather Belanger of the College of William and Mary even suggests that humor improves our ability to mentally rotate imaginary objects in our heads—a key test of spatial thinking ability.

The benefits of humor don’t stop with increased intelligence and creativity. Consider the “cold pressor test,” in which scientists ask subjects to submerge their hands in water cooled to just above the freezing mark.

This isn’t dangerous, but it does allow researchers to measure pain tolerance—which varies, it turns out, depending on what we’ve been doing before dunking our hands. How long could you hold your hand in 35-degree water after watching 10 minutes of Bill Cosby telling jokes? The answer depends on your own pain tolerance, but I can promise that it is longer than it would be if you had instead watched a nature documentary.

Like exercise, humor helps to prepare the mind for stressful events. A study done in 2000 by Arnold Cann, a psychologist at the University of North Carolina, had subjects watch 16 minutes of stand-up comedy before viewing “Faces of Death”—the notorious 1978 shock film depicting scene after scene of gruesome deaths. Those who watched the comedy routine before the grisly film reported significantly less psychological distress than those who watched a travel show instead. The degree to which humor can inoculate us from stress is quite amazing (though perhaps not as amazing as the fact that Dr. Cann got his experiment approved by his university’s ethical review board).

This doesn’t mean that every sort of humor is helpful. Taking a dark, sardonic attitude toward life can be unhealthy, especially when it relies on constant self-punishment. (Rodney Dangerfield: “My wife and I were happy for 20 years. Then we met.”) According to Nicholas Kuiper of the University of Western Ontario, people who resort to this kind of humor experience higher rates of depression than their peers, along with higher anxiety and lower self-esteem. Enjoying a good laugh is healthy, so long as you yourself aren’t always the target.

Having an active sense of humor helps us to get more from life, both cognitively and emotionally. It allows us to exercise our brains regularly, looking for unexpected and pleasing connections even in the face of difficulties or hardship. The physicist Richard Feynman called this “the kick of the discovery,” claiming that the greatest joy of his life wasn’t winning the Nobel Prize—it was the pleasure of discovering new things.

Read the entire story here.

Image: Duck Soup, promotional movie poster (1933). Courtesy of Wikipedia.

 

Is Your City Killing You?

The stresses of modern-day living are taking a toll on your mind and body, and all the more so if you happen to live in a concrete jungle. The effects are even more pronounced for those of us living in large urban centers. That's the finding of some fascinating new brain research out of Germany. The researchers' simple prescription for a lower-stress life: move to the countryside.

From The Guardian:

You are lying down with your head in a noisy and tight-fitting fMRI brain scanner, which is unnerving in itself. You agreed to take part in this experiment, and at first the psychologists in charge seemed nice.

They set you some rather confusing maths problems to solve against the clock, and you are doing your best, but they aren’t happy. “Can you please concentrate a little better?” they keep saying into your headphones. Or, “You are among the worst performing individuals to have been studied in this laboratory.” Helpful things like that. It is a relief when time runs out.

Few people would enjoy this experience, and indeed the volunteers who underwent it were monitored to make sure they had a stressful time. Their minor suffering, however, provided data for what became a major study, and a global news story. The researchers, led by Dr Andreas Meyer-Lindenberg of the Central Institute of Mental Health in Mannheim, Germany, were trying to find out more about how the brains of different people handle stress. They discovered that city dwellers’ brains, compared with people who live in the countryside, seem not to handle it so well.

To be specific, while Meyer-Lindenberg and his accomplices were stressing out their subjects, they were looking at two brain regions: the amygdalas and the perigenual anterior cingulate cortex (pACC). The amygdalas are known to be involved in assessing threats and generating fear, while the pACC in turn helps to regulate the amygdalas. In stressed citydwellers, the amygdalas appeared more active on the scanner; in people who lived in small towns, less so; in people who lived in the countryside, least of all.

And something even more intriguing was happening in the pACC. Here the important relationship was not with where the subjects lived at the time, but where they grew up. Again, those with rural childhoods showed the least active pACCs, those with urban ones the most. In the urban group, moreover, there seemed not to be the same smooth connection between the behaviour of the two brain regions that was observed in the others. An erratic link between the pACC and the amygdalas is often seen in those with schizophrenia too. And schizophrenic people are much more likely to live in cities.

When the results were published in Nature, in 2011, media all over the world hailed the study as proof that cities send us mad. Of course it proved no such thing – but it did suggest it. Even allowing for all the usual caveats about the limitations of fMRI imaging, the small size of the study group and the huge holes that still remained in our understanding, the results offered a tempting glimpse at the kind of urban warping of our minds that some people, at least, have linked to city life since the days of Sodom and Gomorrah.

The year before the Meyer-Lindenberg study was published, the existence of that link had been established still more firmly by a group of Dutch researchers led by Dr Jaap Peen. In their meta-analysis (essentially a pooling together of many other pieces of research) they found that living in a city roughly doubles the risk of schizophrenia – around the same level of danger that is added by smoking a lot of cannabis as a teenager.

At the same time urban living was found to raise the risk of anxiety disorders and mood disorders by 21% and 39% respectively. Interestingly, however, a person’s risk of addiction disorders seemed not to be affected by where they live. At one time it was considered that those at risk of mental illness were just more likely to move to cities, but other research has now more or less ruled that out.

So why is it that the larger the settlement you live in, the more likely you are to become mentally ill? Another German researcher and clinician, Dr Mazda Adli, is a keen advocate of one theory, which implicates that most paradoxical urban mixture: loneliness in crowds. “Obviously our brains are not perfectly shaped for living in urban environments,” Adli says. “In my view, if social density and social isolation come at the same time and hit high-risk individuals … then city-stress related mental illness can be the consequence.”

Read the entire story here.