Tag Archives: behavior

Achieving Failure

Our society values success.

Our work environments value triumphing over the competition. We look to our investments to beat the market. We support our favorite teams, but adore them when they trounce their rivals. Our schools and colleges (mostly) help educate our children, but do so in a way that rewards success — good grades, good test scores and good behavior (as in, same as everyone else). We continually reward our kids for success on a task, at school, with a team.

Yet, all of us know, in our hearts and in the back of our minds, that the most important lessons and trials stem from failure — not success. From failure we learn to persevere, we learn to change and adapt, we learn to overcome. From failure we learn to avoid obstacles or to tackle them head on; we learn to reassess and reevaluate. We evolve from our failures.

So this raises the question: why are so many of our processes and systems geared solely to rewarding and reinforcing success?

From NPR:

Is failure a positive opportunity to learn and grow, or is it a negative experience that hinders success? How parents answer that question has a big influence on how much children think they can improve their intelligence through hard work, a study says.

“Parents are a really critical force in child development when you think about how motivation and mindsets develop,” says Kyla Haimovitz, a professor of psychology at Stanford University. She coauthored the study, published in Psychological Science with colleague Carol Dweck, who pioneered research on mindsets. “Parents have this powerful effect really early on and throughout childhood to send messages about what is failure, how to respond to it.”

Although there’s been a lot of research on how these forces play out, relatively little looks at what parents can do to motivate their kids in school, Haimovitz says. This study begins filling that gap.

“There is a fair amount of evidence showing that when children view their abilities as more malleable and something they can change over time, then they deal with obstacles in a more constructive way,” says Gail Heyman, a professor of psychology at the University of California at San Diego who was not involved in this study.

But communicating that message to children is not simple.

“Parents need to represent this to their kids in the ways they react about their kids’ failures and setbacks,” Haimovitz says. “We need to really think about what’s visible to the other person, what message I’m sending in terms of my words and my deeds.”

In other words, if a child comes home with a D on a math test, how a parent responds will influence how the child perceives their own ability to learn math. Even a well-intentioned, comforting response of “It’s OK, you’re still a great writer” may send the message that it’s time to give up on math rather than learn from the problems they got wrong, Haimovitz explains.

Read the entire story here.

Bad Behavior Goes Viral

Social psychologists often point out how human behavior is contagious. Laugh and others will join in. Yawn and all those around you will yawn as well. In a bad mood at home? Well, soon, chances are that the rest of your family will join you on a downer as well.

And the contagion doesn’t end there, especially with negative behaviors; study after study shows the viral spread of suicide, product tampering, rioting, looting, speeding and even aircraft hijacking. So too with mass shootings. Since the United States is a leading venue for mass shootings, there is now even a term for one that happens soon after another — an echo shooting.

From the Washington Post:

A man had just gone on a shooting rampage in Kalamazoo, Mich., allegedly killing six people while driving for Uber. Sherry Towers, an Arizona State University physicist who studies how viruses spread, worried while watching the news coverage.

Last year, Towers published a study using mathematical models to examine whether mass shootings, like viruses, are contagious. She identified a 13-day period after high-profile mass shootings when the chance of another spikes. Her findings are confirmed more frequently than she would like.

Five days after Kalamazoo, a man in Kansas shot 17 people, killing three by firing from his car. To Towers, that next shooting seemed almost inevitable.

“I absolutely dread watching this happen,” she said.

As the nation endures an ongoing stream of mass shootings, criminologists, police and even the FBI are turning to virus epidemiology and behavioral psychology to understand what sets off mass shooters and figure out whether, as with the flu, the spread can be interrupted.

“These things are clustering in time, and one is causing the next one to be more likely,” said Gary Slutkin, a physician and epidemiologist at the University of Illinois at Chicago who runs Cure Violence, a group that treats crime as a disease. “That’s definitional of a contagious disease. Flu is a risk factor for more flu. Mass shootings are a risk factor for mass shootings.”

The idea is not without skeptics. James Alan Fox, a Northeastern University professor who studies mass shootings, said: “Some bunching just happens. Yes, there is some mimicking going on, but the vast majority of mass killers don’t need someone else to give them the idea.”

Confirming, disputing or further exploring the idea scientifically is hampered by the federal funding ban on gun violence research. Towers and her colleagues did their study on their own time. And there’s not even a common database or definition of mass shootings.

The Congressional Research Service uses the term “public mass shootings” to describe the killing of four or more people in “relatively public places” by a perpetrator selecting victims “somewhat indiscriminately.”

In the 1980s, the violence occurred in post offices. In the 1990s, schools. Now it is mutating into new forms, such as the terrorist attack in San Bernardino, Calif., that initially appeared to be a workplace shooting by a disgruntled employee.

Researchers say the contagion is potentially more complicated than any virus. There is the short-term effect of a high-profile mass shooting, which can lead quickly to another incident. Towers found that such echo shootings account for up to 30 percent of all rampages.

But there appear to be longer incubation periods, too. Killers often find inspiration in past mass shootings, praising what their predecessors accomplished, innovating on their methods and seeking to surpass them in casualties and notoriety.

Read the entire article here.

The Curious Psychology of Returns

In a recent post I wrote about the world of reverse logistics, which underlies the multi-billion dollar business of product returns. But while the process of consumer returns runs like a well-oiled, global machine, the psychology of returns is confusingly counter-intuitive.

For instance, a lenient return policy leads to more returned products — no surprise there. But it also causes increased consumer spending, and the increased spending outweighs the cost to the business of processing the increased returns. Also, and rather more curiously, a more lenient return time limit correlates with a reduction in returns, not an increase.

From the Washington Post:

January is prime time for returns in the retail industry, the month where shoppers show up in droves to trade in an ill-fitting sweater from grandma or to unload the second and third “Frozen” dolls that showed up under the Christmas tree.

This post-Christmas ritual has always been costly for retailers, comprising a large share of the $284 billion in goods that were returned in 2014.  But now it is arguably becoming more urgent for the industry to think carefully about return policies, as analysts say the rise of online shopping is bringing with it a surge in returns. The return rate for the industry overall is about 8 percent, but analysts say that it is likely significantly higher than that online, since shoppers are purchasing goods without seeing them in person or trying them on.

Against that backdrop, researchers at University of Texas-Dallas sought to get a better handle on how return policies affect shopper behavior and, in turn, whether lenient policies such as offering a lengthy period for returns actually help or hurt a retailer’s business.

Overall, a lenient return policy did indeed correlate with more returns. But, crucially, it was even more strongly correlated with an increase in purchases. In other words, retailers are generally getting a clear sales benefit from giving customers the assurance of a return.

One surprising finding: More leniency on time limits is associated with a reduction — not an increase — in returns.

This may seem counterintuitive, but researchers say it could have varying explanations. Ryan Freling, who conducted the research alongside Narayan Janakiraman and Holly Syrdal, said that this is perhaps a result of what’s known as “endowment effect.”

“That would say that the longer a customer has a product in their hands, the more attached they feel to it,” Freling said.

Plus, the long time frame creates less urgency around the decision over whether or not to take it back.

Read the entire article here.

Rudeness Goes Viral

We know intuitively, anecdotally and through scientific study that aggressive behavior can be transmitted to others through imitation. The famous Bobo doll experiment devised by researchers at Stanford University in the early 1960s, and numerous precursors, showed that subjects given an opportunity to observe aggressive models later reproduced a good deal of physical and verbal aggression substantially identical with that of the model. In these studies the model was usually someone with a higher social status or with greater authority (e.g., an adult) than the observer (e.g., a child).

Recent updates to these studies now show that low-intensity behaviors such as rudeness can be just as contagious as more intense behaviors like violence. Fascinatingly, the contagion seems to work equally well even if the model and observer are peers.

So, keep this in mind: watching rude behaviors leads us to be rude to others.

From Scientific American:

Flu season is nearly upon us, and in an effort to limit contagion and spare ourselves misery, many of us will get vaccinated. The work of Jonas Salk and Thomas Francis has helped restrict the spread of the nasty bug for generations, and the influenza vaccine is credited with saving tens of thousands of lives. But before the vaccine could be developed, scientists first had to identify the cause of influenza — and, importantly, recognize that it was contagious.

New research by Trevor Foulk, Andrew Woolum, and Amir Erez at the University of Florida takes that same first step in identifying a different kind of contagious menace: rudeness. In a series of studies, Foulk and colleagues demonstrate that being the target of rude behavior, or even simply witnessing rude behavior, induces rudeness. People exposed to rude behavior tend to have concepts associated with rudeness activated in their minds, and consequently may interpret ambiguous but benign behaviors as rude. More significantly, they themselves are more likely to behave rudely toward others, and to evoke hostility, negative affect, and even revenge from others.

The finding that negative behavior can beget negative behavior is not exactly new, as researchers demonstrated decades ago that individuals learn vicariously and will repeat destructive actions.  In the now infamous Bobo doll experiment, for example, children who watched an adult strike a Bobo doll with a mallet or yell at it were themselves abusive toward the doll.  Similarly, supervisors who believe they are mistreated by managers tend to pass on this mistreatment to their employees.

Previous work on the negative contagion effect, however, has focused primarily on high-intensity behaviors like hitting or abusive supervision that are (thankfully) relatively infrequent in everyday life.  In addition, in most previous studies the destructive behavior was modeled by someone with a higher status than the observer. These extreme negative behaviors may thus get repeated because (a) they are quite salient and (b) the observer is consciously and intentionally trying to emulate the behavior of someone with an elevated social status.

To examine whether this sensitivity impacts social behavior, Foulk’s team conducted another study in which participants were asked to play the part of an employee at a local bookstore.  Participants first observed a video showing either a polite or a rude interaction among coworkers.  They were then asked to respond to an email from a customer.  The email was either neutral (e.g., “I am writing to check on an order I placed a few weeks ago.”), highly aggressive (e.g., “I guess you or one of your incompetent staff must have lost my order.”), or moderately rude (e.g., “I’m really surprised by this as EVERYBODY said you guys give really good customer service???”).

Foulk and colleagues again found that prior exposure to rude behavior creates a specific sensitivity to rudeness. Notably, the type of video participants observed did not affect their responses to the neutral or aggressive emails; instead, the nature of those emails drove the response.  That is, all participants were more likely to send a hostile response to the aggressive email than to neutral email, regardless of whether they had previously observed a polite or rude employee interaction.  However, the type of video participants observed early in the study did affect their interpretation of and response to the rude email.  Those who had seen the polite video adopted a benign interpretation of the moderately rude email and delivered a neutral response, while those who had seen the rude video adopted a malevolent interpretation and delivered a hostile response.  Thus, observing rude behaviors, even those committed by coworkers or peers, resulted in greater sensitivity and heightened response to rudeness.

Read the entire article here.

Cat in the (Hat) Box


Cat owner? Ever pondered why your aloof, inscrutable feline friend loves boxes? Here are some answers courtesy of people who study these kinds of things.

From Wired:

Take heart, feline enthusiasts. Your cat’s continued indifference toward her new Deluxe Scratch DJ Deck may be disappointing, but there is an object that’s pretty much guaranteed to pique her interest. That object, as the Internet has so thoroughly documented, is a box. Any box, really. Big boxes, small boxes, irregularly shaped boxes—it doesn’t matter. Place one on the ground, a chair, or a bookshelf and watch as Admiral Snuggles quickly commandeers it.

So what are we to make of the strange gravitational pull that empty Amazon packaging exerts on Felis sylvestris catus? Like many other really weird things cats do, science hasn’t fully cracked this particular feline mystery. There’s the obvious predation advantage a box affords: Cats are ambush predators, and boxes provide great hiding places to stalk prey from (and retreat to). But there’s clearly more going on here.

Thankfully, behavioral biologists and veterinarians have come up with a few other interesting explanations. In fact, when you look at all the evidence together, it could be that your cat may not just like boxes, he may need them.

The box-and-whisker plot

Understanding the feline mind is notoriously difficult. Cats, after all, tend not to be the easiest test subjects. Still, there’s a sizable amount of behavioral research on cats who are, well, used for other kinds of research (i.e., lab cats). These studies—many of which focused on environmental enrichment—have been taking place for more than 50 years and they make one thing abundantly clear: Your fuzzy companion derives comfort and security from enclosed spaces.

This is likely true for a number of reasons, but for cats in these often stressful situations, a box or some other type of separate enclosure (within the enclosures they’re already in) can have a profound impact on both their behavior and physiology.

Ethologist Claudia Vinke of Utrecht University in the Netherlands is one of the latest researchers to study stress levels in shelter cats. Working with domestic cats in a Dutch animal shelter, Vinke provided hiding boxes for a group of newly arrived cats while depriving another group of them entirely. She found a significant difference in stress levels between cats that had the boxes and those that didn’t. In effect, the cats with boxes got used to their new surroundings faster, were far less stressed early on, and were more interested in interacting with humans.

Read the entire story here.

Image courtesy of Google Search.

Luck


Some think they have it constantly at their side, like a well-trained puppy. Others crave and seek it. And yet others believe they have been shunned by it. Some put their love lives down to it, and many believe it has had a hand in guiding their careers, friendships, and finances. Of course, many know that it — luck — plays a crucial part in their fortunes at the poker table, roulette wheel or at the races. So what really is luck? Does it stem from within or does it envelop us like a benevolent (mostly) aether? And more importantly, how can more of us find some and tune it to our purposes?

Carlin Flora over at Aeon presents an insightful analysis, with some rather simple answers. Oh, and you may wish to give away that rabbit’s foot.

From Aeon:

In 1992, Archie Karas, then a waiter, headed out to Las Vegas. By 1995, he had turned $50 into $40 million, in what has become known as the biggest winning streak in gambling history. Most of us would call it an instance of great luck, or we might say of Archie himself: ‘What a lucky guy!’ The cold-hearted statistician would laugh at our superstitious notions, and instead describe a series of chance processes that happened to work out for Karas. In the larger landscape where randomness reigns, anything can happen at any given casino. Calling its beneficiaries lucky is simply sticking a label on it after the fact.

To investigate luck is to take on one of the grandest of all questions: how can we explain what happens to us, and whether we will be winners, losers or somewhere in the middle at love, work, sports, gambling and life overall? As it turns out, new findings suggest that luck is not a phenomenon that appears exclusively in hindsight, like a hail storm on your wedding day. Nor is it an expression of our desire to see patterns where none exist, like a conviction that your yellow sweater is lucky. The concept of luck is not a myth.

Instead, the studies show, luck can be powered by past good or bad luck, personality and, in a meta-twist, even our own ideas and beliefs about luck itself. Lucky streaks are real, but they are the product of more than just blind fate. Our ideas about luck influence the way we behave in risky situations. We really can make our own luck, though we don’t like to think of ourselves as lucky – a descriptor that undermines other qualities, like talent and skill. Luck can be a force, but it’s one we interact with, shape and cultivate. Luck helps determine our fate here on Earth, even if you think its ultimate cause divine.

Luck is perspective and point of view: if a secular man happened to survive because he took a meeting outside his office at the World Trade Center on the morning of 11 September 2001, he might simply acknowledge random chance in life without assigning a deeper meaning. A Hindu might conclude he had good karma. A Christian might say God was watching out for him so that he could fulfil a special destiny in His service. The mystic could insist he was born under lucky stars, as others are born with green eyes.

Traditionally, the Chinese think luck is an inner trait, like intelligence or upbeat mood, notes Maia Young, a management expert at the University of California, Los Angeles. ‘My mom always used to tell me, “You have a lucky nose”, because its particular shape was a lucky one, according to Chinese lore.’ Growing up in the American Midwest, it dawned on Young that the fleeting luck that Americans often talked about – a luck that seemed to visit the same person at certain times (‘I got lucky on that test!’) but not others (‘I got caught in traffic before my interview!’) – was not equivalent to the unchanging, stable luck her mother saw in her daughter, her nose being an advertisement of its existence within.

‘It’s something that I have that’s a possession of mine, that can be more relied upon than just dumb luck,’ says Young. The distinction stuck with her. You might think someone with a lucky nose wouldn’t roll up their sleeves to work hard – why bother? – but here’s another cultural difference in perceptions of luck. ‘In Chinese culture,’ she says, ‘hard work can go hand-in-hand with being lucky. The belief system accommodates both.’

On the other hand, because Westerners see effort and good fortune as taking up opposite corners of the ring, they are ambivalent about luck. They might pray for it and sincerely wish others they care about ‘Good luck!’ but sometimes they just don’t want to think of themselves as lucky. They’d rather be deserving. The fact that they live in a society that is neither random nor wholly meritocratic makes for an even messier slamdance between ‘hard work’ and ‘luck’. Case in point: when a friend gets into a top law or medical school, we might say: ‘Congratulations! You’ve persevered. You deserve it.’ Were she not to get in, we would say: ‘Acceptance is arbitrary. Everyone’s qualified these days – it’s the luck of the draw.’

Read the entire article here.

Image: Four-leaf clover. Some consider it a sign of good luck. Courtesy of Phyzome.

FOMO Reshaping You and Your Network

Fear of missing out (FOMO) and other negative feelings are greatly disproportionate to the good ones generated by online social networks. The phenomenon is widespread and well-documented. Compound this with the counterintuitive observation that your online friends will, on average, have more friends and be more successful than you, and you have a recipe for a growing, deep-seated inferiority complex. Add to this other behavioral characteristics that are peculiar to, or exaggerated in, online social networks and you have a more fundamental recipe — one that threatens the very fabric of the network itself. Just consider how online trolling, status lurking, persona-curation, passive monitoring, stalking and deferred (dis-)liking are re-fashioning our behaviors and the networks themselves.

From ars technica:

I found out my new college e-mail address in 2005 from a letter in the mail. Right after opening the envelope, I went straight to the computer. I was part of a LiveJournal group made of incoming students, and we had all been eagerly awaiting our college e-mail addresses, which had a use above and beyond corresponding with professors or student housing: back then, they were required tokens for entry to the fabled thefacebook.com.

That was nine years ago, and Facebook has now been in existence for 10. But even in those early days, Facebook’s cultural impact can’t be overstated. A search for “Facebook” on Google Scholar alone now produces 1.2 million results from 2006 on; “Physics” only returns 456,000.

But in terms of presence, Facebook is flopping around a bit now. The ever-important “teens” despise it, and it’s not the runaway success, happy addiction, or awe-inspiring source of information it once was. We’ve curated our identities so hard and had enough experiences with unforeseen online conflict that Facebook can now feel more isolating than absorbing. But what we are dissatisfied with is what Facebook has been, not what it is becoming.

Even if the grand sociological experiment that was Facebook is now running a little dry, the company knows this—which is why it’s transforming Facebook into a completely different entity. And the cause of all this built-up disarray that’s pushing change? It’s us. To prove it, let’s consider the social constructs and weirdnesses Facebook gave rise to, how they ultimately undermined the site, and how these ideas are shaping Facebook into the company it is now and will become.

Cue that Randy Newman song

Facebook arrived late to the concept of online friending, long after researchers started wondering about the structure of these social networks. What Facebook did for friending, especially reciprocal friending, was write it so large that it became a common concern. How many friends you had, who did and did not friend you back, and who should friend each other first all became things that normal people worried about.

Once Facebook opened beyond colleges, it became such a one-to-one representation of an actual social network that scientists started to study it. They applied social theories like those of weak ties or identity creation to see how they played out sans, or in supplement to, face-to-face interactions.

In a 2007 study, when Facebook was still largely campus-bound, a group of researchers said that Facebook “appears to play an important role in the process by which students form and maintain social capital.” They were using it to keep in touch with old friends and “to maintain or intensify relationships characterized by some form of offline connection.”

This sounds mundane now, since Facebook is so integrated into much of our lives. Seeing former roommates or childhood friends posting updates to Facebook feels as commonplace as literally seeing them nearly every day back when we were still roommates at 20 or friends at eight.

But the ability to keep tabs on someone without having to be proactive about it—no writing an e-mail, making a phone call, etc.—became the unique selling factor of Facebook. Per the 2007 study above, Facebook became a rich opportunity for “convert[ing] latent ties into weak ties,” connections that are valuable because they are with people who are sufficiently distant socially to bring in new information and opportunities.

Some romantic pixels have been spilled about the way no one is ever lost to anyone anymore; most people, including ex-lovers, estranged family members, or missed connections are only a Wi-Fi signal away.

“Modern technology has made our worlds smaller, but perhaps it also has diminished life’s mysteries, and with them, some sense of romance,” writes David Vecsey in The New York Times. Vecsey cites a time when he tracked down a former lover “across two countries and an ocean,” something he would not have done in the absence of passive social media monitoring. “It was only in her total absence, in a total vacuum away from her, that I was able to appreciate the depth of love I felt.”

The art of the Facebook-stalk

While plenty of studies have been conducted on the productive uses of Facebook—forming or maintaining weak ties, supplementing close relationships, or fostering new, casual ones—there are plenty that also touch on the site as a means for passive monitoring. Whether it was someone we’d never met, a new acquaintance, or an unrequited infatuation, Facebook eventually had enough breadth that you could call up virtually anyone’s profile, if only to see how fat they’ve gotten.

One study referred to this process as “social investigation.” We developed particular behaviors to avoid creating suspicion: do not “like” anything by the object of a stalking session, or if we do like it, don’t “like” too quickly; be careful not to type a name we want to search into the status field by accident; set an object of monitoring as a “close friend,” even if they aren’t, so their updates show up without fail; friend their friends; surreptitiously visit profile pages multiple times a day in case we missed anything.

This passive monitoring is one of the more utilitarian uses of Facebook. It’s also one of the most addictive. The (fictionalized) movie The Social Network closes with Facebook’s founder, Mark Zuckerberg, gazing at the Facebook profile of a high-school crush. Facebook did away with the necessity of keeping tabs on anyone. You simply had all of the tabs, all of the time, with the most recent information whenever you wanted to look at them.

The book Digital Discourse cites a classic example of the Facebook stalk in an IM conversation between two teenagers:

“I just saw what Tanya Eisner wrote on your Facebook wall. Go to her house,” one says.
“Woah, didn’t even see that til right now,” replies the other.
“Haha it looks like I stalk you… which I do,” says the first.
“I stalk u too its ok,” comforts the second.

But even innocent, casual information recon in the form of a Facebook stalk can rub us the wrong way. Any instance of a Facebook interaction that ends with an unexpected third party’s involvement can taint the rest of users’ Facebook behavior, making us feel watched.

Digital Discourse states that “when people feel themselves to be the objects of stalking, creeping, or lurking by third parties, they express annoyance or even moral outrage.” It cites an example of another teenager who gets a wall post from a person she barely knows, and it explains something she wrote about in a status update. “Don’t stalk my status,” she writes in mocking command to another friend, as if talking to the interloper.

You are who you choose to be

“The advent of the Internet has changed the traditional conditions of identity production,” reads a study from 2008 on how people presented themselves on Facebook. People had been curating their presences online for a long time before Facebook, but the fact that Facebook required real names and, for a long time after its inception, association with an educational institution made researchers wonder if it would make people hew a little closer to reality.

But beyond the bounds of being tied to a real name, users still projected an idealized self to others; a type of “possible self,” or many possible selves, depending on their sharing settings. Rather than try to describe themselves to others, users projected a sort of aspirational identity.

People were more likely to associate themselves with cultural touchstones, like movies, books, or music, than really identify themselves. You might not say you like rock music, but you might write Led Zeppelin as one of your favorite bands, and everyone else can infer your taste in music as well as general taste and coolness from there.

These identity proxies also became vectors for seeking approval. “The appeal is as much to the likeability of my crowd, the desirability of my boyfriend, or the magic of my music as it is to the personal qualities of the Facebook users themselves,” said the study. The authors also noted that, for instance, users tended to post photos of themselves mostly in groups in social situations. Even the profile photos, which would ostensibly have a single subject, were socially styled.

As the study concluded, “identity is not an individual characteristic; it is not an expression of something innate in a person, it is rather a social product, the outcome of a given social environment and hence performed differently in varying contexts.” Because Facebook was so susceptible to this “performance,” so easily controlled and curated, it quickly became less about real people and more about highlight reels.

We came to Facebook to see other real people, but everyone, even casual users, saw it could be gamed for personal benefit. Inflicting our groomed identities on each other soon became its own problem.

Fear of missing out

A long-time problem of social networks has been that the bad feelings they can generate are greatly disproportional to good ones.

In strict terms of self-motivation, posting something and getting a good reception feels good. But most of Facebook use is watching other people post about their own accomplishments and good times. For a social network of 300 friends with an even distribution of auspicious life events, you are seeing 300 times as many good things happen to others as happen to you (of course, everyone has the same amount of good luck, but in bulk for the consumer, it doesn’t feel that way). If you were happy before looking at Facebook, or even after posting your own good news, you’re not now.

The feelings of inadequacy did start to drive people back to Facebook. Even in the middle of our own vacations, celebration dinners, or weddings, we might check Facebook during or after to compare notes and see if we really had the best time possible.

That feeling became known as FOMO, “fear of missing out.” As Jenna Wortham wrote in The New York Times, “When we scroll through pictures and status updates, the worry that tugs at the corners of our minds is set off by the fear of regret… we become afraid that we’ve made the wrong decision about how to spend our time.”

Even if you had your own great stuff to tell Facebook about, someone out there is always doing better. And Facebook won’t let you forget. The brewing feeling of inferiority means users don’t post about stuff that might be too lame. They might start to self-censor, and then the bar for what is worth the “risk” of posting rises higher and higher. As people stop posting, there is less to see, less reason to come back and interact, like, or comment on other people’s material. Ultimately, people, in turn, have less reason to post.

Read the entire article here.

Influencing and Bullying

We sway our co-workers. We coach teams. We cajole our spouses and we parent our kids. But what distinguishes this behavior from more overt and negative forms of influence, such as bullying? It’s a question very much worth exploring, since we are all bullies at some point — far more often than we tend to think. And, not surprisingly, this goes hand-in-hand with deceit.

From the NYT:

WHAT is the chance that you could get someone to lie for you? What about vandalizing public property at your suggestion?

Most of us assume that others would go along with such schemes only if, on some level, they felt comfortable doing so. If not, they’d simply say “no,” right?

Yet research suggests that saying “no” can be more difficult than we believe — and that we have more power over others’ decisions than we think.

Social psychologists have spent decades demonstrating how difficult it can be to say “no” to other people’s propositions, even when they are morally questionable — consider Stanley Milgram’s infamous experiments, in which participants were persuaded to administer what they believed to be dangerous electric shocks to a fellow participant.

Countless studies have subsequently shown that we find it similarly difficult to resist social pressure from peers, friends and colleagues. Our decisions regarding everything from whether to turn the lights off when we leave a room to whether to call in sick to take a day off from work are affected by the actions and opinions of our neighbors and colleagues.

But what about those times when we are the ones trying to get someone to act unethically? Do we realize how much power we wield with a simple request, suggestion or dare? New research by my students and me suggests that we don’t.

We examined this question in a series of studies in which we had participants ask strangers to perform unethical acts. Before making their requests, participants predicted how many people they thought would comply. In one study, 25 college students asked 108 unfamiliar students to vandalize a library book. Targets who complied wrote the word “pickle” in pen on one of the pages.

As in the Milgram studies, many of the targets protested. They asked the instigators to take full responsibility for any repercussions. Yet, despite their hesitation, a large portion still complied.

Most important for our research question, more targets complied than participants had anticipated. Our participants predicted that an average of 28.5 percent would go along. In fact, fully half of those who were approached agreed. Moreover, 87 percent of participants underestimated the number they would be able to persuade to vandalize the book.

In another study, we asked 155 participants to think about a series of ethical dilemmas — for example, calling in sick to work to attend a baseball game. One group was told to think about these misdeeds from the perspective of a person deciding whether to commit them, and to imagine receiving advice from a colleague suggesting they do it or not. Another group took the opposite side, and thought about them from the perspective of someone advising another person about whether or not to do each deed.

Those in the first group were strongly influenced by the advice they received. When they were urged to engage in the misdeed, they said they would be more comfortable doing so than when they were advised not to. Their average reported comfort level fell around the midpoint of a 7-point scale after receiving unethical advice, but fell closer to the low end after receiving ethical advice.

However, participants in the “advisory” role thought that their opinions would hold little sway over the other person’s decision, assuming that participants in the first group would feel equally comfortable regardless of whether they had received unethical or ethical advice.

Taken together, our research, which was recently published in the journal Personality and Social Psychology Bulletin, suggests that we often fail to recognize the power of social pressure when we are the ones doing the pressuring.

Notably, this tendency may be especially pronounced in cultures like the United States’, where independence is so highly valued. American culture idolizes individuals who stand up to peer pressure. But that doesn’t mean that most do; in fact, such idolatry may hide, and thus facilitate, compliance under social pressure, especially when we are the ones putting on the pressure.

Consider the roles in the Milgram experiments: Most people have probably fantasized about being one of the subjects and standing up to the pressure. But in daily life, we play the role of the metaphorical experimenter in those studies as often as we play the participant. We bully. We pressure others to blow off work to come out for a drink or stiff a waitress who is having a bad night. These suggestions are not always wrong or unethical, but they may impact others’ behaviors more than we realize.

Read the entire story here.

Rushing to be Late

You’re either a cat person or a dog person. You’re either an early bird or a night owl. And, similarly, you’re either usually early or habitually late.

From the Washington Post:

I’m a late person.

I don’t think of myself as late, though. Every single time that it happens (and it invariably happens) I think of it as an exceptional fluke that will not occur again. Me, chronically late? No! Unforeseen things just happen on my way to getting places. If I were honest I would admit that these miscalculations never result in my being early, but I am not honest. If we were completely honest, who could even get out of bed in the morning?

Here is a translation guide, if you know someone like me:

I am coming downstairs: I will respond to an email, eight minutes will pass, then I will come downstairs.
I am a block away: I am two blocks away.
I am five minutes away: I am ten minutes away.
I am seventeen minutes away: I am giving you an oddly specific number to disguise the fact that I am probably something like half an hour away.
Twenty minutes away!: I am lost somewhere miles away, but optimistic.
I’m en route!: I am still in my apartment.
See you at [Time we originally agreed upon]: I’m about to go take a shower, then get dressed, and then I will leave at the time we agreed to meet.

And if you say “I’m running five minutes late” this, to me, translates to “Hey, you now have time to watch a 90 minute film before you get dressed!”

I haven’t always been a late person. I didn’t think of myself as a late person until last week, when it finally happened.

“Dinner is at 7:00,” a friend told me. I showed up at 7:15, after a slight miscalculation or two while getting dressed that I had totally not foreseen, and then we waited for fifteen more minutes. Dinner was at 7:30. I had been assigned my own time zone. I was That Late Person.

The curse of the habitually late person is to be surrounded by early people. Early people do not think of themselves as Early People. They think of themselves as Right. “You have to be early in order to be on time,” they point out. Being on time is important to them. The forty minutes between when they arrive ten minutes early in order to “scout the place out” and “get in line” and when you show up mumbling excuses is the time it takes them to perfect the reproachful but resigned expression they are wearing when you get there. It is an expression that would not look out of place on a medieval saint. It is luminous with a kind of righteous indignation, eyes lifted skyward to someone who appreciates the value of time, a sad, small smile curving the lips to show that they forgive you, because they always forgive you, because you know not what you do.

“Well,” you say, “there was traffic.” This is never a lie. There is always traffic somewhere. But it is seldom actually why you are late. You might as well say, “I hear in Los Angeles today there was a bear running around and the police had to subdue it” for the relevance this story has to your arrival time. You hit every green light. The traffic parted for you, effortlessly, as though you were Moses. You were still half an hour late.

Still, it is best to say something. The next best thing to not being late, you have always felt, is to have an amusing excuse for why. “I am sorry I’m late,” you say. “I ran into Constance Moondragon, that crazy lady from the bus!” This is, technically, true — you saw her on the sidewalk, but did not actually speak to her — and it buys you time.

Sometimes this compounds. When you realize you are late, the thought sometimes occurs to you that “Well, since I’m going to be late, I should bring a gift to atone.” Then you are two hours late because all the liquor stores were closed, instead of forty-five minutes late, as planned.

Being late is a kind of optimism. Every time I leave to go somewhere I always think, on some level, “Maybe this is the day that leaving exactly when the event starts will get me there on time.” I am not sure how this will work, but hope springs eternal.

Besides, isn’t there a kind of graciousness to being late, as some writers of etiquette books will tell you? If you show up precisely on time, you run the risk of catching your hosts in the inevitable last-minute scramble to make the place look decent, pour the wine, and hide their collections of werewolf erotica under the settee. To arrive 15 minutes after the scheduled time shows not disrespect for your hosts’ time, but a respect for their effort to make hosting seem like an effortless flow of magic.

The hosts never quite see things that way, of course.

By this point, you have probably lost all sympathy for me. The first comment on this piece will, I assume, be someone saying, “You sound like you are deeply self-centered and don’t care at all about the feelings of others, and I feel sorry for you.” And the thing is, all the evidence points to your being right, except for my feeble assertion that in my heart of hearts, I really do value your time, I never consciously intend to be late in a cruel way, and I am not the terrible person I appear. And that doesn’t go very far.

And all this being said, the life of a late person is great. I don’t do it on purpose, but it has much to recommend it. “People who show up late for things are always so much more cheerful than the people who have to wait for them,” E. V. Lucas said. This is true. One time I showed up early for something by mistake, and it was awful! I had to wait around for half an hour! Being late, you get all the fun of being there, with none of the pain of having to wait for other people to get there. You show up, and the party has already started. You get to do That Fun Thing That You Were Doing Right Before You Left and then join in That Fun Thing Everyone Is Doing When You Arrive. It’s the best of all possible worlds. You never have to stand alone in the rain anywhere waiting for anyone to assemble. Your host is never in the shower when you show up. You miss a couple of trailers, but you never have to see those long-form infomercials or answer movie theater trivia. You never have to be the first one at a party, making awkward small talk to the host and volunteering to help saute the onions. Do you really look like someone who would be good at sauteing onions? Of course not. What are you doing here? Why didn’t you wait half an hour like everyone else? You could be watching a video of a cat and a horse being friends!

Read the entire article here.

Six Rules to Super-Charge Your Creativity

Creative minds are, by their very nature, all different. Yet on closer examination it seems that some key elements and common routines underlie the work of many great, innovative thinkers. First and foremost, of course: be an early bird.

From the Guardian:

One morning this summer, I got up at first light – I’d left the blinds open the night before – then drank a strong cup of coffee, sat near-naked by an open window for an hour, worked all morning, then had a martini with lunch. I took a long afternoon walk, and for the rest of the week experimented with never working for more than three hours at a stretch.

This was all in an effort to adopt the rituals of some great artists and thinkers: the rising-at-dawn bit came from Ernest Hemingway, who was up at around 5.30am, even if he’d been drinking the night before; the strong coffee was borrowed from Beethoven, who personally counted out the 60 beans his morning cup required. Benjamin Franklin swore by “air baths”, which was his term for sitting around naked in the morning, whatever the weather. And the midday cocktail was a favourite of VS Pritchett (among many others). I couldn’t try every trick I discovered in a new book, Daily Rituals: How Great Minds Make Time, Find Inspiration And Get To Work; oddly, my girlfriend was unwilling to play the role of Freud’s wife, who put toothpaste on his toothbrush each day to save him time. Still, I learned a lot. For example: did you know that lunchtime martinis aren’t conducive to productivity?

As a writer working from home, of course, I have an unusual degree of control over my schedule – not everyone could run such an experiment. But for anyone who thinks of their work as creative, or who pursues creative projects in their spare time, reading about the habits of the successful can be addictive. Partly, that’s because it’s comforting to learn that even Franz Kafka struggled with the demands of his day job, or that Franklin was chronically disorganised. But it’s also because of a covert thought that sounds delusionally arrogant if expressed out loud: just maybe, if I took very hot baths like Flaubert, or amphetamines like Auden, I might inch closer to their genius.

Several weeks later, I’m no longer taking “air baths”, while the lunchtime martini didn’t last more than a day (I mean, come on). But I’m still rising early and, when time allows, taking long walks. Two big insights have emerged. One is how ill-suited the nine-to-five routine is to most desk-based jobs involving mental focus; it turns out I get far more done when I start earlier, end a little later, and don’t even pretend to do brain work for several hours in the middle. The other is the importance of momentum. When I get straight down to something really important early in the morning, before checking email, before interruptions from others, it beneficially alters the feel of the whole day: once interruptions do arise, they’re never quite so problematic. Another technique I couldn’t manage without comes from the writer and consultant Tony Schwartz: use a timer to work in 90-minute “sprints”, interspersed with significant breaks. (Thanks to this, I’m far better than I used to be at separating work from faffing around, rather than spending half the day flailing around in a mixture of the two.)

The one true lesson of the book, says its author, Mason Currey, is that “there’s no one way to get things done”. For every Joyce Carol Oates, industriously plugging away from 8am to 1pm and again from 4pm to 7pm, or Anthony Trollope, timing himself writing 250 words per quarter-hour, there’s a Sylvia Plath, unable to stick to a schedule. (Or a Friedrich Schiller, who could only write in the presence of the smell of rotting apples.) Still, some patterns do emerge. Here, then, are six lessons from history’s most creative minds.

1. Be a morning person

It’s not that there aren’t successful night owls: Marcel Proust, for one, rose sometime between 3pm and 6pm, immediately smoked opium powders to relieve his asthma, then rang for his coffee and croissant. But very early risers form a clear majority, including everyone from Mozart to Georgia O’Keeffe to Frank Lloyd Wright. (The 18th-century theologian Jonathan Edwards, Currey tells us, went so far as to argue that Jesus had endorsed early rising “by his rising from the grave very early”.) For some, waking at 5am or 6am is a necessity, the only way to combine their writing or painting with the demands of a job, raising children, or both. For others, it’s a way to avoid interruption: at that hour, as Hemingway wrote, “There is no one to disturb you and it is cool or cold and you come to your work and warm as you write.” There’s another, surprising argument in favour of rising early, which might persuade sceptics: that early-morning drowsiness might actually be helpful. At one point in his career, the novelist Nicholson Baker took to getting up at 4.30am, and he liked what it did to his brain: “The mind is newly cleansed, but it’s also befuddled… I found that I wrote differently then.”

Psychologists categorise people by what they call, rather charmingly, “morningness” and “eveningness”, but it’s not clear that either is objectively superior. There is evidence that morning people are happier and more conscientious, but also that night owls might be more intelligent. If you’re determined to join the ranks of the early risers, the crucial trick is to start getting up at the same time daily, but to go to bed only when you’re truly tired. You might sacrifice a day or two to exhaustion, but you’ll adjust to your new schedule more rapidly.

2. Don’t give up the day job

“Time is short, my strength is limited, the office is a horror, the apartment is noisy,” Franz Kafka complained to his fiancee, “and if a pleasant, straightforward life is not possible, then one must try to wriggle through by subtle manoeuvres.” He crammed in his writing between 10.30pm and the small hours of the morning. But in truth, a “pleasant, straightforward life” might not have been preferable, artistically speaking: Kafka, who worked in an insurance office, was one of many artists who have thrived on fitting creative activities around the edges of a busy life. William Faulkner wrote As I Lay Dying in the afternoons, before commencing his night shift at a power plant; TS Eliot’s day job at Lloyds bank gave him crucial financial security; William Carlos Williams, a paediatrician, scribbled poetry on the backs of his prescription pads. Limited time focuses the mind, and the self-discipline required to show up for a job seeps back into the processes of art. “I find that having a job is one of the best things in the world that could happen to me,” wrote Wallace Stevens, an insurance executive and poet. “It introduces discipline and regularity into one’s life.” Indeed, one obvious explanation for the alcoholism that pervades the lives of full-time authors is that it’s impossible to focus on writing for more than a few hours a day, and, well, you’ve got to make those other hours pass somehow.

3. Take lots of walks

There’s no shortage of evidence to suggest that walking – especially walking in natural settings, or just lingering amid greenery, even if you don’t actually walk much – is associated with increased productivity and proficiency at creative tasks. But Currey was surprised, in researching his book, by the sheer ubiquity of walking, especially in the daily routines of composers, including Beethoven, Mahler, Erik Satie and Tchaikovsky, “who believed he had to take a walk of exactly two hours a day and that if he returned even a few minutes early, great misfortunes would befall him”. It’s long been observed that doing almost anything other than sitting at a desk can be the best route to novel insights. These days, there’s surely an additional factor at play: when you’re on a walk, you’re physically removed from many of the sources of distraction – televisions, computer screens – that might otherwise interfere with deep thought.

Read the entire article here.

Image: Frank Lloyd Wright, architect, c. March 1, 1926. Courtesy of U.S. Library of Congress.

Liking the Likes of Likers

Researchers trawling through data from Facebook and other social networking sites find good examples of what they call human herding behavior. A notable case shows that if you “like” an article online, your friends are more likely to “like” that article too. Is it a case of group similarity leading to similar behavior among peers? Well, apparently not — the same research also found that if you dislike an article, your friends are not as likely to dislike it as well. So what is going on?

From the New York Times:

If you “like” this article on a site like Facebook, somebody who reads it is more likely to approve of it, even if the reporting and writing are not all that great.

But surprisingly, an unfair negative reaction will not spur others to dislike the article. Instead, a thumbs-down view will soon be counteracted by thumbs up from other readers.

Those are the implications of new research looking at the behavior of thousands of people reading online comments, scientists reported Friday in the journal Science. A positive nudge, they said, can set off a bandwagon of approval.

“Hype can work,” said one of the researchers, Sinan K. Aral, a professor of information technology and marketing at the Massachusetts Institute of Technology, “and feed on itself as well.”

If people tend to herd together on popular opinions, that could call into question the reliability of “wisdom of the crowd” ratings on Web sites like Yelp or Amazon and perhaps provide marketers with hints on how to bring positive attention to their products.

“This is certainly a provocative study,” said Matthew O. Jackson, a professor of economics at Stanford who was not involved with the research. “It raises a lot of questions we need to answer.”

Besides Dr. Aral (who is also a scholar in residence at The New York Times research and development laboratory, working on unrelated projects), the researchers are from Hebrew University in Jerusalem and New York University.

They were interested in answering a question that long predates the iPhone and Justin Bieber: Is something popular because it is actually good, or is it popular just because it is popular?

To help answer that question, the researchers devised an experiment in which they could manipulate a small corner of the Internet: reader comments.

They collaborated with an unnamed Web site (the company did not want its involvement disclosed) on which users submit links to news articles. Readers can then comment on the articles, and they can also give up or down votes on individual comments. Each comment receives a rating calculated by subtracting negative votes from positive ones.

The experiment performed a subtle, random change on the ratings of comments submitted on the site over five months: right after each comment was made, it was given an arbitrary up or down vote, or — for a control group — left alone. Reflecting a tendency among the site’s users to provide positive feedback, about twice as many of these arbitrary initial votes were positive: 4,049 to 1,942.

The first person reading the comment was 32 percent more likely to give it an up vote if it had been already given a fake positive score. There was no change in the likelihood of subsequent negative votes. Over time, the comments with the artificial initial up vote ended with scores 25 percent higher than those in the control group.

“That is a significant change,” Dr. Aral said. “We saw how these very small signals of social influence snowballed into behaviors like herding.”

Meanwhile, comments that received an initial negative vote ended up with scores indistinguishable from those in the control group.

The Web site allows users to say whether they like or dislike other users, and the researchers found that a commenter’s friends were likely to correct the negative score while enemies did not find it worth their time to knock down a fake up vote.

The distortion of ratings through herding is not a novel concern. Reddit, a social news site that said it was not the one that participated in the study, similarly allows readers to vote comments up or down, but it also allows its moderators to hide those ratings for a certain amount of time. “Now a comment will more likely be voted on based on its merit and appeal to each user, rather than having its public perception influence its votes,” it explained when it unveiled the feature in April.

Read the entire article here.
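For the curious, here is a rough, back-of-the-envelope simulation of the design described above. To be clear, this is not the researchers’ code or model: the baseline voting probabilities, the reuse of the 32 percent figure as an ongoing multiplier and the assumption that friends rush to correct an unfair down vote are all illustrative guesses, intended only to show how a single seeded up vote can snowball while a seeded down vote tends to wash out. Run it and the seeded-positive comments end up with noticeably higher average scores than the controls, while the seeded-negative ones land close to the controls.

    # Hypothetical sketch of the comment-seeding experiment (Python).
    # None of the parameters below come from the study; they are guesses
    # chosen to illustrate asymmetric herding, not to reproduce its numbers.
    import random

    random.seed(1)

    def simulate_comment(seed_vote, n_viewers=100):
        """Final score of one comment, starting from a seeded vote of +1, -1 or 0."""
        score = seed_vote
        for _ in range(n_viewers):
            p_up, p_down = 0.10, 0.05          # assumed baseline vote probabilities
            if score > 0:
                p_up *= 1.32                   # assumed herding boost for positive scores
            elif score < 0:
                p_up *= 2.0                    # assumed corrective boost: friends knock it back up
            r = random.random()
            if r < p_up:
                score += 1
            elif r < p_up + p_down:
                score -= 1
        return score

    def mean_final_score(seed_vote, trials=5000):
        return sum(simulate_comment(seed_vote) for _ in range(trials)) / trials

    for label, seed in [("control", 0), ("seeded +1", 1), ("seeded -1", -1)]:
        print(label, round(mean_final_score(seed), 2))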

Image: Facebook “like” icon. Courtesy of Wikimedia / Facebook.

Criminology and Brain Science

Pathological criminals and the non-criminals who seek to understand them have no doubt co-existed since humans first learned to steal from and murder one another.

So while we may be no closer to fully understanding the underlying causes of antisocial, destructive and violent behavior, many researchers continue the quest. In one camp are those who maintain that such behavior is learned, or follows from poor choices, traumatic life events, or exposure to an acute psychological or physiological stressor. In the other camp are those who argue that genes and their expression, especially those governing brain function, are a principal cause.

Some recent neurological studies of criminals and psychopaths show fascinating, though not unequivocal, results.

From the Wall Street Journal:

The scientific study of crime got its start on a cold, gray November morning in 1871, on the east coast of Italy. Cesare Lombroso, a psychiatrist and prison doctor at an asylum for the criminally insane, was performing a routine autopsy on an infamous Calabrian brigand named Giuseppe Villella. Lombroso found an unusual indentation at the base of Villella’s skull. From this singular observation, he would go on to become the founding father of modern criminology.

Lombroso’s controversial theory had two key points: that crime originated in large measure from deformities of the brain and that criminals were an evolutionary throwback to more primitive species. Criminals, he believed, could be identified on the basis of physical characteristics, such as a large jaw and a sloping forehead. Based on his measurements of such traits, Lombroso created an evolutionary hierarchy, with Northern Italians and Jews at the top and Southern Italians (like Villella), along with Bolivians and Peruvians, at the bottom.

These beliefs, based partly on pseudoscientific phrenological theories about the shape and size of the human head, flourished throughout Europe in the late 19th and early 20th centuries. Lombroso was Jewish and a celebrated intellectual in his day, but the theory he spawned turned out to be socially and scientifically disastrous, not least by encouraging early-20th-century ideas about which human beings were and were not fit to reproduce—or to live at all.

The racial side of Lombroso’s theory fell into justifiable disrepute after the horrors of World War II, but his emphasis on physiology and brain traits has proved to be prescient. Modern-day scientists have now developed a far more compelling argument for the genetic and neurological components of criminal behavior. They have uncovered, quite literally, the anatomy of violence, at a time when many of us are preoccupied by the persistence of violent outrages in our midst.

The field of neurocriminology—using neuroscience to understand and prevent crime—is revolutionizing our understanding of what drives “bad” behavior. More than 100 studies of twins and adopted children have confirmed that about half of the variance in aggressive and antisocial behavior can be attributed to genetics. Other research has begun to pinpoint which specific genes promote such behavior.

Brain-imaging techniques are identifying physical deformations and functional abnormalities that predispose some individuals to violence. In one recent study, brain scans correctly predicted which inmates in a New Mexico prison were most likely to commit another crime after release. Nor is the story exclusively genetic: A poor environment can change the early brain and make for antisocial behavior later in life.

Most people are still deeply uncomfortable with the implications of neurocriminology. Conservatives worry that acknowledging biological risk factors for violence will result in a society that takes a soft approach to crime, holding no one accountable for his or her actions. Liberals abhor the potential use of biology to stigmatize ostensibly innocent individuals. Both sides fear any seeming effort to erode the idea of human agency and free will.

It is growing harder and harder, however, to avoid the mounting evidence. With each passing year, neurocriminology is winning new adherents, researchers and practitioners who understand its potential to transform our approach to both crime prevention and criminal justice.

The genetic basis of criminal behavior is now well established. Numerous studies have found that identical twins, who have all of their genes in common, are much more similar to each other in terms of crime and aggression than are fraternal twins, who share only 50% of their genes.

In a landmark 1984 study, my colleague Sarnoff Mednick found that children in Denmark who had been adopted from parents with a criminal record were more likely to become criminals in adulthood than were other adopted kids. The more offenses the biological parents had, the more likely it was that their offspring would be convicted of a crime. For biological parents who had no offenses, 13% of their sons had been convicted; for biological parents with three or more offenses, 25% of their sons had been convicted.

As for environmental factors that affect the young brain, lead is neurotoxic and particularly damages the prefrontal region, which regulates behavior. Measured lead levels in our bodies tend to peak at 21 months—an age when toddlers are apt to put their fingers into their mouths. Children generally pick up lead in soil that has been contaminated by air pollution and dumping.

Rising lead levels in the U.S. from 1950 through the 1970s neatly track increases in violence 20 years later, from the ’70s through the ’90s. (Violence peaks when individuals are in their late teens and early 20s.) As lead in the environment fell in the ’70s and ’80s—thanks in large part to the regulation of gasoline—violence fell correspondingly. No other single factor can account for both the inexplicable rise in violence in the U.S. until 1993 and the precipitous drop since then.

Lead isn’t the only culprit. Other factors linked to higher aggression and violence in adulthood include smoking and drinking by the mother before birth, complications during birth and poor nutrition early in life.

Genetics and environment may work together to encourage violent behavior. One pioneering study in 2002 by Avshalom Caspi and Terrie Moffitt of Duke University genotyped over 1,000 individuals in a community in New Zealand and assessed their levels of antisocial behavior in adulthood. They found that a genotype conferring low levels of the enzyme monoamine oxidase A (MAOA), when combined with early child abuse, predisposed the individual to later antisocial behavior. Low MAOA has been linked to reduced volume in the amygdala—the emotional center of the brain—while physical child abuse can damage the frontal part of the brain, resulting in a double hit.

Brain-imaging studies have also documented impairments in offenders. Murderers, for instance, tend to have poorer functioning in the prefrontal cortex—the “guardian angel” that keeps the brakes on impulsive, disinhibited behavior and volatile emotions.

Read the entire article following the jump.

Image: The Psychopath Test by Jon Ronson, book cover. Courtesy of Goodreads.

Blind Loyalty and the Importance of Critical Thinking

Two landmark studies in the 1960s and ’70s put behavioral psychology squarely in the public consciousness. The obedience experiments by Stanley Milgram and the Stanford Prison Experiment demonstrated how readily ordinary individuals could be made to obey figures in authority and to subject others to humiliation, suffering and pain.

A re-examination of these experiments, along with several recent similar studies, has prompted a number of psychologists to reinterpret the original conclusions. They suggest that humans may not be inherently evil after all. We remain dangerously flawed, however: our willingness to follow those in authority, especially those with whom we identify, makes us susceptible to believing in the virtue of actions that by any standard would be monstrous. It turns out that an open mind able to think critically may be the best antidote.

From the Pacific Standard:

They are among the most famous of all psychological studies, and together they paint a dark portrait of human nature. Widely disseminated in the media, they spread the belief that people are prone to blindly follow authority figures—and will quickly become cruel and abusive when placed in positions of power.

It’s hard to overstate the impact of Stanley Milgram’s obedience experiments of 1961, or the Stanford Prison Experiment of 1971. Yet in recent years, the conclusions derived from those studies have been, if not debunked, radically reinterpreted.

A new perspective—one that views human nature in a more nuanced light—is offered by psychologists Alex Haslam of the University of Queensland, Australia, and Stephen Reicher of the University of St. Andrews in Scotland.

In an essay published in the open-access journal PLoS Biology, they argue that people will indeed comply with the questionable demands of authority figures—but only if they strongly identify with that person and buy into the rightness of those beliefs.

In other words, we’re not unthinking automatons. Nor are we monsters waiting for permission for our dark sides to be unleashed. However, we are more susceptible to psychological manipulation than we may realize.

In Milgram’s study, members of the general public were placed in the role of “teacher” and told that a “learner” was in a nearby room. Each time the “learner” failed to correctly recall a word as part of a memory experiment, the “teacher” was told to administer an electrical shock.

As the “learner” kept making mistakes, the “teacher” was ordered to give him stronger and stronger jolts of electricity. If a participant hesitated, the experimenter—an authority figure wearing a white coat—instructed him to continue.

Somewhat amazingly, most people did so: 65 percent of participants continued to give stronger and stronger shocks until the experiment ended with the “learner” apparently unconscious. (The torture was entirely fictional; no actual shocks were administered.)

To a world still reeling from the question of why so many Germans obeyed orders and carried out Nazi atrocities, here was a clear answer: We are predisposed to obey authority figures.

The Stanford Prison Experiment, conducted a decade later, was equally unnerving. Students were randomly assigned to assume the role of either prisoner or guard in a “prison” set up in the university’s psychology department. As Haslam and Reicher note, “such was the abuse meted out to the prisoners by the guards that the study had to be terminated after just six days.”

Lead author Philip Zimbardo, who assumed the role of “prison superintendent” with a level of zeal he later found frightening, concluded that brutality was “a natural consequence of being in the uniform of a guard and asserting the power inherent in that role.”

So is all this proof of the “banality of evil,” to use political theorist Hannah Arendt’s memorable phrase? Not really, argue Haslam and Reicher. They point to their own work on the BBC Prison Study, which mimicked the seminal Stanford study.

They found that participants “did not conform automatically to their assigned role” as prisoner or guard. Rather, there was a period of resistance, which ultimately gave way to a “draconian” new hierarchy. Before becoming brutal, the participants needed time to assume their new identities, and internalize their role in the system.

Once they did so, “the hallmark of the tyrannical regime was not conformity, but creative leadership and engaged followership within a group of true believers,” they write. “This analysis mirrors recent conclusions about the Nazi tyranny.”

Read the entire article after the jump.

Social Outcast = Creative Wunderkind

A recent study published in the Journal of Experimental Psychology correlates social ostracism and rejection with creativity. Businesses seeking creative individuals take note: perhaps your next great hire is a social misfit.

From Fast Company:

Are you a recovering high school geek who still can’t get the girl? Are you always the last person picked for your company’s softball team? When you watched Office Space, did you feel a special kinship to the stapler-obsessed Milton Waddams? If you answered yes to any of these questions, do not despair. Researchers at Johns Hopkins and Cornell have recently found that the socially rejected might also be society’s most creatively powerful people.

The study, which is forthcoming in the Journal of Experimental Psychology, is called “Outside Advantage: Can Social Rejection Fuel Creative Thought?” It found that people who already have a strong “self-concept”–i.e. are independently minded–become creatively fecund in the face of rejection. “We were inspired by the stories of highly creative individuals like Steve Jobs and Lady Gaga,” says the study’s lead author, Hopkins professor Sharon Kim. “And we wanted to find a silver lining in all the popular press about bullying. There are benefits to being different.”

The study consisted of 200 Cornell students and set out to identify the relationship between the strength of an individual’s self-concept and their level of creativity. First, Kim tested the strength of each student’s self-concept by assessing his or her “need for uniqueness.” In other words, how important it is for each individual to feel separate from the crowd. Next, students were told that they’d either been included in or rejected from a hypothetical group project. Finally, they were given a simple, but creatively demanding, task: Draw an alien from a planet unlike earth.

If you’re curious about your own general creativity level (at least by the standards of Kim’s study), go ahead and sketch an alien right now…Okay, got your alien? Now give yourself a point for every non-human characteristic you’ve included in the drawing. If your alien has two eyes between the nose and forehead, you don’t get any points. If your alien has two eyes below the mouth, or three eyes that breathe fire, you get a point. If your alien doesn’t even have eyes or a mouth, give yourself a bunch of points. In short, the more dissimilar your alien is to a human, the higher your creativity score.

Kim found that people with a strong self-concept who were rejected produced more creative aliens than people from any other group, including people with a strong self-concept who were accepted. “If you’re in a mindset where you don’t care what others think,” she explained, “you’re open to ideas that you may not be open to if you’re concerned about what other people are thinking.”

This may seem like an obvious conclusion, but Kim pointed out that most companies don’t encourage the kind of freedom and independence that readers of Fast Company probably expect. “The benefits of being different is not a message everyone is getting,” she said.

But Kim also discovered something unexpected. People with a weak self-concept could be influenced toward a stronger one and, thus, toward a more creative mindset. In one part of the study, students were asked to read a short story in which all the pronouns were either singular (I/me) or plural (we/us) and then to circle all the pronouns. They were then “accepted” or “rejected” and asked to draw their aliens.

Kim found that all of the students who read stories with singular pronouns and were rejected produced more creative aliens. Even the students who originally had a weaker self-concept. Once these group-oriented individuals focused on individual-centric prose, they became more individualized themselves. And that made them more creative.

This finding doesn’t prove that you can teach someone to have a strong self-concept but it suggests that you can create a professional environment that facilitates independent and creative thought.

Read the entire article after the jump.

Busyness As Chronic Illness

Apparently, being busy keeps existential dread at bay. So, if your 16 or more waking hours each day are crammed with memos, driving, meetings, widgets, calls, charts, quotas, angry customers, school lunches, deciding, reports, bank statements, kids, budgets, bills, baking, making, fixing, cleaning and mad bosses, then your life must be meaningful, right?

Think again.

Author Tim Kreider muses below on this chronic state of affairs, and hits a nerve when he suggests that “I can’t help but wonder whether all this histrionic exhaustion isn’t a way of covering up the fact that most of what we do doesn’t matter.”

From the New York Times:

If you live in America in the 21st century you’ve probably had to listen to a lot of people tell you how busy they are. It’s become the default response when you ask anyone how they’re doing: “Busy!” “So busy.” “Crazy busy.” It is, pretty obviously, a boast disguised as a complaint. And the stock response is a kind of congratulation: “That’s a good problem to have,” or “Better than the opposite.”

Notice it isn’t generally people pulling back-to-back shifts in the I.C.U. or commuting by bus to three minimum-wage jobs  who tell you how busy they are; what those people are is not busy but tired. Exhausted. Dead on their feet. It’s almost always people whose lamented busyness is purely self-imposed: work and obligations they’ve taken on voluntarily, classes and activities they’ve “encouraged” their kids to participate in. They’re busy because of their own ambition or drive or anxiety, because they’re addicted to busyness and dread what they might have to face in its absence.

Almost everyone I know is busy. They feel anxious and guilty when they aren’t either working or doing something to promote their work. They schedule in time with friends the way students with 4.0 G.P.A.’s  make sure to sign up for community service because it looks good on their college applications. I recently wrote a friend to ask if he wanted to do something this week, and he answered that he didn’t have a lot of time but if something was going on to let him know and maybe he could ditch work for a few hours. I wanted to clarify that my question had not been a preliminary heads-up to some future invitation; this was the invitation. But his busyness was like some vast churning noise through which he was shouting out at me, and I gave up trying to shout back over it.

Even children are busy now, scheduled down to the half-hour with classes and extracurricular activities. They come home at the end of the day as tired as grown-ups. I was a member of the latchkey generation and had three hours of totally unstructured, largely unsupervised time every afternoon, time I used to do everything from surfing the World Book Encyclopedia to making animated films to getting together with friends in the woods to chuck dirt clods directly into one another’s eyes, all of which provided me with important skills and insights that remain valuable to this day. Those free hours became the model for how I wanted to live the rest of my life.

The present hysteria is not a necessary or inevitable condition of life; it’s something we’ve chosen, if only by our acquiescence to it. Not long ago I  Skyped with a friend who was driven out of the city by high rent and now has an artist’s residency in a small town in the south of France. She described herself as happy and relaxed for the first time in years. She still gets her work done, but it doesn’t consume her entire day and brain. She says it feels like college — she has a big circle of friends who all go out to the cafe together every night. She has a boyfriend again. (She once ruefully summarized dating in New York: “Everyone’s too busy and everyone thinks they can do better.”) What she had mistakenly assumed was her personality — driven, cranky, anxious and sad — turned out to be a deformative effect of her environment. It’s not as if any of us wants to live like this, any more than any one person wants to be part of a traffic jam or stadium trampling or the hierarchy of cruelty in high school — it’s something we collectively force one another to do.

Busyness serves as a kind of existential reassurance, a hedge against emptiness; obviously your life cannot possibly be silly or trivial or meaningless if you are so busy, completely booked, in demand every hour of the day. I once knew a woman who interned at a magazine where she wasn’t allowed to take lunch hours out, lest she be urgently needed for some reason. This was an entertainment magazine whose raison d’être was obviated when “menu” buttons appeared on remotes, so it’s hard to see this pretense of indispensability as anything other than a form of institutional self-delusion. More and more people in this country no longer make or do anything tangible; if your job wasn’t performed by a cat or a boa constrictor in a Richard Scarry book I’m not sure I believe it’s necessary. I can’t help but wonder whether all this histrionic exhaustion isn’t a way of covering up the fact that most of what we do doesn’t matter.

Read the entire article after the jump.

Image courtesy of Entrepreneur.com.

Men are From LinkedIn, Women are From Pinterest

No surprise: women and men use online social networks differently. A new study of online behavior by researchers in Vienna, Austria, shows that the sexes organize their networks in distinct ways and for different reasons.

From Technology Review:

One of the interesting insights that social networks offer is the difference between male and female behaviour.

In the past, behavioural differences have been hard to measure. Experiments could only be done on limited numbers of individuals and even then, the process of measurement often distorted people’s behaviour.

That’s all changed with the advent of massive online participation in gaming, professional and friendship  networks. For the first time, it has become possible to quantify exactly how the genders differ in their approach to things like risk and communication.

Gender-specific studies are surprisingly rare. Nevertheless, a growing body of evidence is emerging that social networks reflect many of the social and evolutionary differences that we’ve long suspected.

Earlier this year, for example, we looked at a remarkable study of a mobile phone network that demonstrated the different reproductive strategies that men and women employ throughout their lives, as revealed by how often they call friends, family and potential mates.

Today, Michael Szell and Stefan Thurner at the Medical University of Vienna in Austria say they’ve found significant differences in the way men and women manage their social networks in an online game called Pardus, which has over 300,000 players.

In this game, players explore various solar systems in a virtual universe. On the way, they can mark other players as friends or enemies, exchange messages, and gain wealth by trading or doing battle, but they can also be killed.

The interesting thing about online games is that almost every action of every player is recorded, mostly without the players being consciously aware of this. That means measurement bias is minimal.

The networks of friends and enemies that are set up also differ in an important way from those on social networking sites such as Facebook. That’s because players can neither see nor influence other players’ networks. This prevents the kind of clustering and herding behaviour that sometimes dominates  other social networks.

Szell and Thurner say the data reveals clear and significant differences between men and women in Pardus.

For example, men and women  interact with the opposite sex differently.  “Males reciprocate friendship requests from females faster than vice versa and hesitate to reciprocate hostile actions of females,” say Szell and Thurner.

Women are also significantly more risk averse than men as measured by the amount of fighting they engage in and their likelihood of dying.

They are also more likely to be friends with one another than men are.

These results are more or less as expected. More surprising is the finding that women tend to be wealthier than men, probably because they engage more in economic than in destructive behaviour.

Read the entire article after the jump.
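Findings like these ultimately reduce to fairly simple computations over the game’s event log. Below is a sketch of how one such measurement, the average delay before a friendship marking is reciprocated, broken down by the gender of the responding player, might be computed. The log format, field names and sample rows are invented for illustration; this is not the Pardus dataset or the authors’ analysis code.

    # Hypothetical sketch: reciprocation lag by responder gender (Python).
    # The event format and sample data are made up for illustration only.
    from collections import defaultdict
    from statistics import mean

    # (timestamp_in_hours, actor, target, actor_gender) friendship-marking events
    events = [
        (0.0, "A", "B", "f"),
        (2.0, "B", "A", "m"),   # B (male) reciprocates A (female) after 2 hours
        (1.0, "C", "D", "m"),
        (9.0, "D", "C", "f"),   # D (female) reciprocates C (male) after 8 hours
    ]

    def reciprocation_lags(events):
        """Average delay before a friendship marking is returned, keyed by responder gender."""
        first_mark = {}              # (actor, target) -> time of the original marking
        lags = defaultdict(list)     # responder gender -> list of delays
        for t, actor, target, gender in sorted(events):
            if (target, actor) in first_mark:   # this event answers an earlier marking
                lags[gender].append(t - first_mark[(target, actor)])
            else:
                first_mark[(actor, target)] = t
        return {g: mean(v) for g, v in lags.items()}

    print(reciprocation_lags(events))   # e.g. {'m': 2.0, 'f': 8.0}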

Image courtesy of InformationWeek.