Tag Archives: social science

Forget The Millennials — It’s Time For Generation K

Blame fickle social scientists. After the baby-boomers, the most researched generation has been that of the millennials — so-called due to their coming of age at the turn of the century. We know what millennials like to eat and drink, how they dress, their politics; we know about their proclivity for sharing, their need for meaning and fun at work; we know they need attention and constant feedback. In fact, we have learned so much — and perhaps so little — from the thousands of often-conflicting research studies of millennials that some researchers have decided to move on to new blood. Yes, it’s time to tap another rich vein of research material — Generation K. But I’ll stop after relating what the “K” means in Generation K, and let you form your own conclusions.

[tube]n-7K_OjsDCQ[/tube]

Generation K is named for Katniss, as in the Hunger Games’ heroine Katniss Everdeen. That’s right: if you were born between 1995 and 2002 then, according to economist Noreena Hertz, you are Gen-Katniss.

From the Guardian:

The brutal, bleak series that has captured the hearts of a generation will come to a brutal, bleak end in November when The Hunger Games: Mockingjay – Part 2 arrives in cinemas. It is the conclusion of the Hunger Games saga, which has immersed the young in a cleverly realised world of trauma, violence, mayhem and death.

For fans of Suzanne Collins’s trilogy about a young girl, Katniss Everdeen, forced to fight for survival in a country ruled by fear and fuelled by televised gladiatorial combat, this is the moment they have been waiting for.

Since the first book in the trilogy was published in 2008, Collins’s tale has sold more than 65 million copies in the US alone. The films, the first of which was released in 2012, have raked in more than $2bn worldwide at the box office and made a global star of their leading lady, Jennifer Lawrence, who plays the increasingly traumatised Katniss with a perfect mix of fury and resignation. For the huge appeal of The Hunger Games goes deeper than the fact that it’s an exciting tale well told. The generation who came to Katniss as young teens and have grown up ploughing through the books and queuing for the movies respond to her story in a particularly personal way.

As to why that might be, the economist and academic Noreena Hertz, who coined the term Generation K (after Katniss) for those born between 1995 and 2002, says that this is a generation riddled with anxiety, distrustful of traditional institutions from government to marriage, and, “like their heroine Katniss Everdeen, [imbued with] a strong sense of what is right and fair”.

“I think The Hunger Games resonates with them so much because they are Katniss navigating a dark and difficult world,” says Hertz, who interviewed 2,000 teenagers from the UK and the US about their hopes, fears and beliefs, concluding that today’s teens are shaped by three factors: technology, recession and coming of age in a time of great unease.

“This is a generation who grew up through 9/11, the Madrid bombings, the London bombings and Islamic State terrors. They see danger piped down their smartphones and beheadings on their Facebook page,” she says. “My data showed very clearly how anxious they are about everything from getting into debt or not getting a job, to wider issues such as climate change and war – 79% of those who took part in my survey worried about getting a job, 72% worried about debt, and you have to remember these are teenagers.

“In previous generations teenagers did not think in this way. Unlike the first-era millennials [who Hertz classes as those aged between 20 and 30] who grew up believing that the world was their oyster and ‘Yes we can’, this new generation knows the world is an unequal and harsh place.”

Writer and activist Laurie Penny, herself a first-era millennial at the age of 29, agrees. “I think what today’s young people have grasped that my generation didn’t get until our early 20s, is that adults don’t know everything,” she says. “They might be trying their best but they don’t always have your best interests at heart. The current generation really understands that – they’re more politically engaged and they have more sense of community because they’re able to find each other easily thanks to their use of technology.”

One of the primary appeals of the Hunger Games trilogy is its refusal to sugarcoat the scenarios Katniss finds herself in. In contrast to JK Rowling’s Harry Potter series, there are no reliable adult figures to dispense helpful advice and no one in authority she can truly trust (notably even the most likeable adult figures in the books tend to be flawed at best and fraudulent at worst). Even her friends may not always have her back, hard as they try – Dumbledore’s Army would probably find themselves taken out before they’d uttered a single counter-curse in the battlegrounds of Panem. At the end of the day, Katniss can only rely on one person, herself.

“Ultimately, the message of the Hunger Games is that everything’s not going to be OK,” says Penny. “One of the reasons Jennifer Lawrence is so good is because she lets you see that while Katniss is heroic, she’s also frightened all of the time. She spends the whole story being forced into situations she doesn’t want to be in. Kids respond because they can imagine what it’s like to be terrified but know that you have to carry on.”

It’s incontestable that we live in difficult times and that younger generations in particular may be more acutely aware that things aren’t improving any time soon, but is it a reach to say that fans of the Hunger Games are responding as much to the world around them as to the books?

Read the entire story here.

Video: The Hunger Games: Mockingjay Part 2 Official Trailer – “We March Together”. Courtesy of the Hunger Games franchise.

Online Social Networks Make Us More and Less Social

Two professors walk into a bar… One claims that online social networks enrich our relationships and social lives; the other claims that technology diminishes and distracts us from real-world relationships. Professor Keith N. Hampton of Rutgers University’s School of Communication and Information argues the former, positive position, while Professor Larry Rosen of California State University argues the latter. Who’s right?

Well, they’re both probably right.

But several consequences of our new social technologies seem more certain: our focus is increasingly fragmented and short; our memory and knowledge retention are increasingly outsourced; our impatience and need for instant gratification continue to grow; and our newly acquired anxieties continue to expand — fear of missing out, fear of being unfriended, fear of being trolled, fear of being shamed, fear of not getting comments or replies, fear of not going viral, fear of a partner’s lack of status reciprocity, fear of a partner’s status change, fear of being Photoshopped or photobombed, fear of having personal images distributed, fear of quiet…

From the WSJ:

With the spread of mobile technology, it’s become much easier for more people to maintain constant contact with their social networks online. And a lot of people are taking advantage of that opportunity.

One indication: A recent Pew Research survey of adults in the U.S. found that 71% use Facebook at least occasionally, and 45% of Facebook users check the site several times a day.

That sounds like people are becoming more sociable. But some people think the opposite is happening. The problem, they say, is that we spend so much time maintaining superficial connections online that we aren’t dedicating enough time or effort to cultivating deeper real-life relationships. Too much chatter, too little real conversation.

Others counter that online social networks supplement face-to-face sociability, they don’t replace it. These people argue that we can expand our social horizons online, deepening our connections to the world around us, and at the same time take advantage of technology to make our closest relationships even closer.

Larry Rosen, a professor of psychology at California State University, Dominguez Hills, says technology is distracting us from our real-world relationships. Keith N. Hampton, who holds the Professorship in Communication and Public Policy at Rutgers University’s School of Communication and Information, argues that technology is enriching those relationships and the rest of our social lives.

Read the entire story here.

 

Your Friends Are Friendlier… And…

Your friends have more friends than you. But wait, there’s more not-so-good news. Not only are your friends friendlier and more befriended than you, they are also likely to be wealthier and happier. How can this be, you may ask? It’s all down to averaging and the mathematics of networks and their interconnections. This so-called Friendship Paradox manifests itself in the dynamics of all social networks — it applies online as well as in the real world.

From Technology Review:

Back in 1991, the sociologist Scott Feld made a surprising discovery while studying the properties of social networks. Feld calculated the average number of friends that a person in the network has and compared this to the average number of friends that these friends had.

Against all expectations it turned out that the second number is always bigger than the first. Or in other words, your friends have more friends than you do.

Researchers have since observed the so-called friendship paradox in a wide variety of situations. On Facebook, your friends will have more friends than you have. On Twitter, your followers will have more followers than you do. And in real life, your sexual partners will have had more partners than you’ve had. At least, on average.

Network scientists have long known that this paradoxical effect is the result of the topology of networks—how they are connected together. That’s why similar networks share the same paradoxical properties.

But are your friends also happier than you are, or richer, or just better? That’s not so clear because happiness and wealth are not directly represented in the topology of a friendship network. So an interesting question is how far the paradox will go.

Today, we get an answer thanks to the work of Young-Ho Eom at the University of Toulouse in France and Hang-Hyun Jo at Aalto University in Finland. These guys have evaluated the properties of different characteristics on networks and worked out the mathematical conditions that determine whether the paradox applies to them or not. Their short answer is yes: your friends probably are richer than you are.

The paradox arises because numbers of friends people have are distributed in a way that follows a power law rather than an ordinary linear relationship. So most people have a few friends while a small number of people have lots of friends.

It’s this second small group that causes the paradox. People with lots of friends are more likely to number among your friends in the first place. And when they do, they significantly raise the average number of friends that your friends have. That’s the reason that, on average, your friends have more friends than you do.
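
The mechanism is easy to check for yourself. Here is a minimal Python sketch (the tiny star-shaped example network is our own invention, purely for illustration) that compares the average number of friends with the average number of friends that people’s friends have:

# Friendship paradox on a toy network: one well-connected "hub"
# plus a handful of people with only one or two friends each.
friends = {
    "hub": ["a", "b", "c", "d", "e"],
    "a": ["hub"],
    "b": ["hub"],
    "c": ["hub"],
    "d": ["hub"],
    "e": ["hub", "f"],
    "f": ["e"],
}

# Average number of friends across everyone in the network.
avg_friends = sum(len(f) for f in friends.values()) / len(friends)

# For each person, average the friend counts of their friends,
# then average that figure across everyone.
avg_friends_of_friends = sum(
    sum(len(friends[name]) for name in own_friends) / len(own_friends)
    for own_friends in friends.values()
) / len(friends)

print(round(avg_friends, 2))             # 1.71
print(round(avg_friends_of_friends, 2))  # 3.74 -- your friends have more friends

The hub is the culprit: it appears in almost everyone’s friend list, so it drags the friends-of-friends average well above the plain average.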

But what of other characteristics, such as wealth and happiness, which are not represented by the network topology?

To study other types of network, Eom and Jo looked at two academic networks in which scientists are linked if they have co-authored a scientific paper together. Each scientist is a node in the network and the links arise between scientists who have been co-authors.

Sure enough, the paradox raises its head in this network too. If you are a scientist, your co-authors will have more co-authors than you, as reflected in the network topology. But curiously, they will also have more publications and more citations than you too.

Eom and Jo call this the “generalized friendship paradox” and go on to derive the mathematical conditions in which it occurs. They say that when a paradox arises as a result of the way nodes are connected together, any other properties of these nodes demonstrate the same paradoxical nature, as long as they are correlated in a certain way.

As it turns out, the numbers of publications and citations meet this criterion. And so too do wealth and happiness. So the answer is yes: your friends probably are richer and happier than you are.
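
As an aside on what “correlated in a certain way” means in practice, here is one standard back-of-envelope argument (our sketch, not the derivation from Eom and Jo’s paper). Write $k$ for a person’s number of friends and $x$ for any attribute such as wealth or happiness. Averaging $x$ over friends weights each person by the number of friendships they appear in, so

\[
\langle x \rangle_{\text{friends}} \;=\; \frac{\langle k x \rangle}{\langle k \rangle} \;=\; \langle x \rangle + \frac{\operatorname{cov}(k,x)}{\langle k \rangle},
\]

which exceeds the plain population average $\langle x \rangle$ whenever $\operatorname{cov}(k,x) > 0$. In other words, as long as the attribute tends to rise with the number of friends a person has, the friends’ average is pulled up in exactly the same way as the friend counts themselves.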

That has significant implications for the way people perceive themselves given that their friends will always seem happier, wealthier and more popular than they are. And the problem is likely to be worse in networks where this is easier to see. “This might be the reason why active online social networking service users are not happy,” say Eom and Jo, referring to other research that has found higher levels of unhappiness among social network users.

So if you’re an active Facebook user feeling inadequate and unhappy because your friends seem to be doing better than you are, remember that almost everybody else on the network is in a similar position.

Read the entire article here.

Image: Cast of the NBC TV show Friends. Courtesy of Vanity Fair, NBC and respective rights holders.

Liking the Likes of Likers

Researchers trawling through data from Facebook and other social networking sites find good examples of what they call human herding behavior. A notable case shows that if you “like” an article online, your friends are more likely to “like” that article too. Is it a case of group similarity leading to similar behavior among peers? Well, apparently not — the same research also found that if you dislike the same article, your friends are not as likely to dislike it as well. So what is going on?

From the New York Times:

If you “like” this article on a site like Facebook, somebody who reads it is more likely to approve of it, even if the reporting and writing are not all that great.

But surprisingly, an unfair negative reaction will not spur others to dislike the article. Instead, a thumbs-down view will soon be counteracted by thumbs up from other readers.

Those are the implications of new research looking at the behavior of thousands of people reading online comments, scientists reported Friday in the journal Science. A positive nudge, they said, can set off a bandwagon of approval.

“Hype can work,” said one of the researchers, Sinan K. Aral, a professor of information technology and marketing at the Massachusetts Institute of Technology, “and feed on itself as well.”

If people tend to herd together on popular opinions, that could call into question the reliability of “wisdom of the crowd” ratings on Web sites like Yelp or Amazon and perhaps provide marketers with hints on how to bring positive attention to their products.

“This is certainly a provocative study,” said Matthew O. Jackson, a professor of economics at Stanford who was not involved with the research. “It raises a lot of questions we need to answer.”

Besides Dr. Aral (who is also a scholar in residence at The New York Times research and development laboratory, working on unrelated projects), the researchers are from Hebrew University in Jerusalem and New York University.

They were interested in answering a question that long predates the iPhone and Justin Bieber: Is something popular because it is actually good, or is it popular just because it is popular?

To help answer that question, the researchers devised an experiment in which they could manipulate a small corner of the Internet: reader comments.

They collaborated with an unnamed Web site (the company did not want its involvement disclosed) on which users submit links to news articles. Readers can then comment on the articles, and they can also give up or down votes on individual comments. Each comment receives a rating calculated by subtracting negative votes from positive ones.

The experiment performed a subtle, random change on the ratings of comments submitted on the site over five months: right after each comment was made, it was given an arbitrary up or down vote, or — for a control group — left alone. Reflecting a tendency among the site’s users to provide positive feedback, about twice as many of these arbitrary initial votes were positive: 4,049 to 1,942.

The first person reading the comment was 32 percent more likely to give it an up vote if it had been already given a fake positive score. There was no change in the likelihood of subsequent negative votes. Over time, the comments with the artificial initial up vote ended with scores 25 percent higher than those in the control group.

“That is a significant change,” Dr. Aral said. “We saw how these very small signals of social influence snowballed into behaviors like herding.”
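
To make the snowballing mechanism concrete, here is a toy simulation (entirely our own illustration, with made-up parameters, not the researchers’ model or data) in which each new reader is slightly more likely to vote a comment up when its current score is higher:

import random

def final_score(initial_vote, readers=100, rng=None):
    # One comment seen by a stream of readers. Readers lean positive
    # overall, and a higher visible score nudges the next vote upward.
    rng = rng or random.Random()
    score = initial_vote
    for _ in range(readers):
        p_up = min(0.95, max(0.05, 0.60 + 0.03 * score))
        score += 1 if rng.random() < p_up else -1
    return score

trials = 2000
control = sum(final_score(0, rng=random.Random(s)) for s in range(trials)) / trials
nudged = sum(final_score(1, rng=random.Random(s)) for s in range(trials)) / trials
print(round(control, 1), round(nudged, 1))  # the single fake upvote inflates the final score

Note that this only illustrates the upward snowball; it says nothing about the correction of fake negative votes described next.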

Meanwhile, comments that received an initial negative vote ended up with scores indistinguishable from those in the control group.

The Web site allows users to say whether they like or dislike other users, and the researchers found that a commenter’s friends were likely to correct the negative score while enemies did not find it worth their time to knock down a fake up vote.

The distortion of ratings through herding is not a novel concern. Reddit, a social news site that said it was not the one that participated in the study, similarly allows readers to vote comments up or down, but it also allows its moderators to hide those ratings for a certain amount of time. “Now a comment will more likely be voted on based on its merit and appeal to each user, rather than having its public perception influence its votes,” it explained when it unveiled the feature in April.

Read the entire article here.

Image: Facebook “like” icon. Courtesy of Wikimedia / Facebook.

Us and Them: Group Affinity Begins Early

Research shows how children as young as four empathize with some people but not others. It’s all about the group: which peer group you belong to versus the rest. Thus, the uphill struggle to instill tolerance in the next generation needs to begin very early in life.

From the WSJ:

Here’s a question. There are two groups, Zazes and Flurps. A Zaz hits somebody. Who do you think it was, another Zaz or a Flurp?

It’s depressing, but you have to admit that it’s more likely that the Zaz hit the Flurp. That’s an understandable reaction for an experienced, world-weary reader of The Wall Street Journal. But here’s something even more depressing—4-year-olds give the same answer.

In my last column, I talked about some disturbing new research showing that preschoolers are already unconsciously biased against other racial groups. Where does this bias come from?

Marjorie Rhodes at New York University argues that children are “intuitive sociologists” trying to make sense of the social world. We already know that very young children make up theories about everyday physics, psychology and biology. Dr. Rhodes thinks that they have theories about social groups, too.

In 2012 she asked young children about the Zazes and Flurps. Even 4-year-olds predicted that people would be more likely to harm someone from another group than from their own group. So children aren’t just biased against other racial groups: They also assume that everybody else will be biased against other groups. And this extends beyond race, gender and religion to the arbitrary realm of Zazes and Flurps.

In fact, a new study in Psychological Science by Dr. Rhodes and Lisa Chalik suggests that this intuitive social theory may even influence how children develop moral distinctions.

Back in the 1980s, Judith Smetana and colleagues discovered that very young kids could discriminate between genuinely moral principles and mere social conventions. First, the researchers asked about everyday rules—a rule that you can’t be mean to other children, for instance, or that you have to hang up your clothes. The children said that, of course, breaking the rules was wrong. But then the researchers asked another question: What would you think if teachers and parents changed the rules to say that being mean and dropping clothes were OK?

Children as young as 2 said that, in that case, it would be OK to drop your clothes, but not to be mean. No matter what the authorities decreed, hurting others, even just hurting their feelings, was always wrong. It’s a strikingly robust result—true for children from Brazil to Korea. Poignantly, even abused children thought that hurting other people was intrinsically wrong.

This might leave you feeling more cheerful about human nature. But in the new study, Dr. Rhodes asked similar moral questions about the Zazes and Flurps. The 4-year-olds said it would always be wrong for Zazes to hurt the feelings of others in their group. But if teachers decided that Zazes could hurt Flurps’ feelings, then it would be OK to do so. Intrinsic moral obligations only extended to members of their own group.

The 4-year-olds demonstrate the deep roots of an ethical tension that has divided philosophers for centuries. We feel that our moral principles should be universal, but we simultaneously feel that there is something special about our obligations to our own group, whether it’s a family, clan or country.

Read the entire article after the jump.

Image: Us and Them, Pink Floyd. Courtesy of Pink Floyd / flickr.

From 7 Up to 56 Up

[tube]ngSGIjwwc4U[/tube]

The classic documentary and social experiment continues with the release this week of “56 Up”. Michael Apted began this remarkable process with a documentary called “7 Up” in 1964. It followed the lives of 14 British children, aged 7, from different socio-economic backgrounds. Although the 7 Up documentary was initially planned to be a one-off, subsequent installments followed in seven-year cycles. Each time Apted would bring us up to date with the lives of his growing subjects. Now, they are all turning 56 years old. Fifty-six years on, the personal stories are poignant and powerful, yet class divisions remain.

[div class=attrib]From the Telegraph:[end-div]

Life rushes by so fast, it flickers today and is gone tomorrow. In “56 Up” — the latest installment in Michael Apted’s remarkable documentary project that has followed a group of Britons since 1964, starting when they were 7 — entire lifetimes race by with a few edits. One minute, a boy is merrily bobbing along. The next, he is 56 years old, with a wife or an ex, a few children or none, a career, a job or just dim prospects. Rolls of fat girdle his middle and thicken his jowls. He has regrets, but their sting has usually softened, along with everything else.

In a lot of documentaries you might not care that much about this boy and what became of him. But if you have watched any of the previous episodes in Mr. Apted’s series, you will care, and deeply, partly because you watched that boy grow up, suffer and triumph in a project that began as a news gimmick and social experiment and turned into a plangent human drama. Conceived as a one-off for a current-affairs program on Granada Television, the first film, “Seven Up!,” was a 40-minute look at the lives of 14 children from different backgrounds. Britain was changing, or so went the conventional wisdom, with postwar affluence having led the working class to adopt middle-class attitudes and lifestyles.

In 1963, though, the sociologists John H. Goldthorpe and David Lockwood disputed this widely held “embourgeoisement thesis,” arguing that the erosion of social class had not been as great as believed. In its deeply personal fashion, the “Up” series went on to make much the same point by checking in with many of the same boys and girls, men and women, every seven years. Despite some dropouts, the group has remained surprisingly intact. For better and sometimes worse, and even with their complaints about the series, participants like Tony Walker, who wanted to be a jockey and found his place as a cabby, have become cyclical celebrities. For longtime viewers they have become something more, including mirrors.

It’s this mirroring that helps make the series so poignant. As in the earlier movies, Mr. Apted again folds in older material from the ages of 7, 14 and so on, to set the scene and jog memories. The abrupt juxtapositions of epochs can be jarring, unnerving or touching — sometimes all three — as bright-faced children bloom and sometimes fade within seconds. An analogous project in print or even still photographs wouldn’t be as powerful, because what gives the “Up” series its punch is not so much its longevity or the human spectacle it offers, but that these are moving images of touchingly vibrant lives at certain moments in time and space. The more you watch, the more the movies transform from mirrors into memory machines, ones that inevitably summon reflections of your own life.

Save for “Seven Up!,” filmed in gorgeous black and white, the documentaries are aesthetically unremarkable. Shot in digital, “56 Up” pretty much plays like the earlier movies, with its mix of interviews and location shooting. Every so often you hear someone off screen, presumably Mr. Apted, make a comment, though mostly he lets his choice of what to show — the subjects at work or play, with family or friends — and his editing do his editorializing. In the past he has brought participants together, but he doesn’t here, which feels like a missed opportunity. Have the three childhood friends from the East End of London, Jackie Bassett, Lynn Johnson and Sue Sullivan, two of whom have recently endured heart-rendingly bad times, remained in contact? Mr. Apted doesn’t say.

With few exceptions and despite potential path-changing milestones like marriages and careers, everyone seems to have remained fairly locked in his or her original social class. At 7, Andrew Brackfield and John Brisby already knew which universities they would or should attend. “We think,” John said in “Seven Up!,” “I’m going to Cambridge and Trinity Hall,” though he landed at Oxford. Like Mr. Brackfield, who did attend Cambridge, Mr. Brisby became a lawyer and still sounds to the manner born, with an accent that evokes old-fashioned news readers and Bond villains. The two hold instructively different views about whether the series corroborates the first film’s thesis about the rigidity of the British class structure, never mind that their lives are strong evidence that little has changed.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Video: 7 Up – Part 1. Courtesy of World in Action, Granada TV.[end-div]

Pluralistic Ignorance

Why study the science of climate change when you can study the complexities of climate change deniers themselves? That was the question that led several groups of independent researchers to study why some groups of people cling to mistaken beliefs and hold inaccurate views of the public consensus.

[div class=attrib]From ars technica:[end-div]

By just about every measure, the vast majority of scientists in general—and climate scientists in particular—have been convinced by the evidence that human activities are altering the climate. However, in several countries, a significant portion of the public has concluded that this consensus doesn’t exist. That has prompted a variety of studies aimed at understanding the large disconnect between scientists and the public, with results pointing the finger at everything from the economy to the weather. Other studies have noted societal influences on acceptance, including ideology and cultural identity.

Those studies have generally focused on the US population, but the public acceptance of climate change is fairly similar in Australia. There, a new study has looked at how societal tendencies can play a role in maintaining mistaken beliefs. The authors of the study have found evidence that two well-known behaviors—the “false consensus” and “pluralistic ignorance”—are helping to shape public opinion in Australia.

False consensus is the tendency of people to think that everyone else shares their opinions. This can arise from the fact that we tend to socialize with people who share our opinions, but the authors note that the effect is even stronger “when we hold opinions or beliefs that are unpopular, unpalatable, or that we are uncertain about.” In other words, our social habits tend to reinforce the belief that we’re part of a majority, and we have a tendency to cling to the sense that we’re not alone in our beliefs.

Pluralistic ignorance is similar, but it’s not focused on our own beliefs. Instead, sometimes the majority of people come to believe that most people think a certain way, even though the majority opinion actually resides elsewhere.

As it turns out, the authors found evidence of both these effects. They performed two identical surveys of over 5,000 Australians, done a year apart; about 1,350 people took the survey both times, which let the researchers track how opinions evolve. Participants were asked to describe their own opinion on climate change, with categories including “don’t know,” “not happening,” “a natural occurrence,” and “human-induced.” After voicing their own opinion, people were asked to estimate what percentage of the population would fall into each of these categories.

In aggregate, over 90 percent of those surveyed accepted that climate change was occurring (a rate much higher than we see in the US), with just over half accepting that humans were driving the change. Only about five percent felt it wasn’t happening, and even fewer said they didn’t know. The numbers changed only slightly between the two polls.

The false consensus effect became obvious when the researchers looked at what these people thought that everyone else believed. Here, the false consensus effect was obvious: every single group believed that their opinion represented the plurality view of the population. This was most dramatic among those who don’t think that the climate is changing; even though they represent far less than 10 percent of the population, they believed that over 40 percent of Australians shared their views. Those who profess ignorance also believed they had lots of company, estimating that their view was shared by a quarter of the populace.

Among those who took the survey twice, the effect became even more pronounced. In the year between the surveys, the respondents went from estimating that 30 percent of the population agreed with them to thinking that 45 percent did. And, in general, this group was the least likely to change its opinion between the two surveys.

But there was also evidence of pluralistic ignorance. Every single group grossly overestimated the number of people who were unsure about climate change or convinced it wasn’t occurring. Even those who were convinced that humans were changing the climate put 20 percent of Australians into each of these two groups.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Flood victims. Courtesy of NRDC.[end-div]

The Myth of Social Mobility

There is a commonly held myth in the United States that anyone can make it; that is, even if you’re at the bottom of the income distribution curve you have the opportunity to climb up to a wealthier future. Independent research over the last couple of decades debunks this myth and paints a rather different and more disturbing reality. For instance, it shows how Americans are now less socially mobile — in the upward sense — than citizens of Canada and most countries in Europe.

[div class=attrib]From the Economist:[end-div]

THE HAMPTONS, a string of small towns on the south shore of Long Island, have long been a playground for America’s affluent. Nowadays the merely rich are being crimped by the ultra-wealthy. In August it can cost $400,000 to rent a fancy house there. The din of helicopters and private jets is omnipresent. The “Quiet Skies Coalition”, formed by a group of angry residents, protests against the noise, particularly of one billionaire’s military-size Chinook. “You can’t even play tennis,” moans an old-timer who stays near the East Hampton airport. “It’s like the third world war with GIV and GV jets.”

Thirty years ago, Loudoun County, just outside Washington, DC, in Northern Virginia, was a rural backwater with a rich history. During the war of 1812 federal documents were kept safe there from the English. Today it is the wealthiest county in America. Rolling pastures have given way to technology firms, swathes of companies that thrive on government contracts and pristine neighbourhoods with large houses. The average household income, at over $130,000, is twice the national level. The county also marks the western tip of the biggest cluster of affluence in the country. Between Loudoun County and north-west Washington, DC, there are over 800,000 people in exclusive postcodes that are home to the best-educated and wealthiest 5% of the population, dubbed “superzips” by Charles Murray, a libertarian social scientist.

[div class=attrib]Read the entire article following the jump.[end-div]

Power and Baldness

Since behavioral scientists and psychologists first began roaming the globe, we have come to know how and (sometimes) why visual appearance is so important in human interactions. Of course, anecdotally, humans have known this for thousands of years — that image is everything. After all, it was not Mary Kay or L’Oreal who brought us make-up but the ancient Egyptians. Yet, it is still fascinating to see how markedly the perception of an individual can change with a basic alteration, and only at the surface. Witness the profound difference in characteristics that we project onto a male with male pattern baldness (wimp) when he shaves his head (tough guy). And, of course, corporations can now assign a monetary value to the shaven look. As for comb-overs, well, that is another topic entirely.

[div class=attrib]From the Wall Street Journal:[end-div]

Up for a promotion? If you’re a man, you might want to get out the clippers.

Men with shaved heads are perceived to be more masculine, dominant and, in some cases, to have greater leadership potential than those with longer locks or with thinning hair, according to a recent study out of the University of Pennsylvania’s Wharton School.

That may explain why the power-buzz look has caught on among business leaders in recent years. Venture capitalist and Netscape founder Marc Andreessen, 41 years old, DreamWorks Animation Chief Executive Jeffrey Katzenberg, 61, and Amazon.com Inc. CEO Jeffrey Bezos, 48, all sport some variant of the close-cropped look.

Some executives say the style makes them appear younger—or at least, makes their age less evident—and gives them more confidence than a comb-over or monk-like pate.

“I’m not saying that shaving your head makes you successful, but it starts the conversation that you’ve done something active,” says tech entrepreneur and writer Seth Godin, 52, who has embraced the bare look for two decades. “These are people who decide to own what they have, as opposed to trying to pretend to be something else.”

Wharton management lecturer Albert Mannes conducted three experiments to test peoples’ perceptions of men with shaved heads. In one of the experiments, he showed 344 subjects photos of the same men in two versions: one showing the man with hair and the other showing him with his hair digitally removed, so his head appears shaved.

In all three tests, the subjects reported finding the men with shaved heads as more dominant than their hirsute counterparts. In one test, men with shorn heads were even perceived as an inch taller and about 13% stronger than those with fuller manes. The paper, “Shorn Scalps and Perceptions of Male Dominance,” was published online, and will be included in a coming issue of the journal Social Psychological and Personality Science.

The study found that men with thinning hair were viewed as the least attractive and powerful of the bunch, a finding that tracks with other studies showing that people perceive men with typical male-pattern baldness—which affects roughly 35 million Americans—as older and less attractive. For those men, the solution could be as cheap and simple as a shave.

According to Wharton’s Dr. Mannes—who says he was inspired to conduct the research after noticing that people treated him more deferentially when he shaved off his own thinning hair—head shavers may seem powerful because the look is associated with hypermasculine images, such as the military, professional athletes and Hollywood action heroes like Bruce Willis. (Male-pattern baldness, by contrast, conjures images of “Seinfeld” character George Costanza.)

New York image consultant Julie Rath advises her clients to get closely cropped when they start thinning up top. “There’s something really strong, powerful and confident about laying it all bare,” she says, describing the thinning or combed-over look as “kind of shlumpy.”

The look is catching on. A 2010 study from razor maker Gillette, a unit of Procter & Gamble Co., found that 13% of respondents said they shaved their heads, citing reasons as varied as fashion, sports and already thinning hair, according to a company spokesman. HeadBlade Inc., which sells head-shaving accessories, says revenues have grown 30% a year in the past decade.

Shaving his head gave 60-year-old Stephen Carley, CEO of restaurant chain Red Robin Gourmet Burgers Inc., a confidence boost when he was working among 20-somethings at tech start-ups in the 1990s. With his thinning hair shorn, “I didn’t feel like the grandfather in the office anymore.” He adds that the look gave him “the impression that it was much harder to figure out how old I was.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Comb-over patent, 1977. Courtesy of Wikipedia.[end-div]

Bicyclist Tribes

If you ride a bike (as in, bicycle) you will find that you probably belong to a specific tribe of bicyclists — and you’re being observed by bicyclist watchers! Read on to find out if you’re a Roadie or a Beach Cruiser or if you belong to one of the other tribes. Of course, some are quite simply in an exclusive “maillot jaune” tribe of their own.

[div class=attrib]From Wall Street Journal:[end-div]

Bird watching is a fine hobby for those with the time and inclination to traipse into nature, but the thrill of spotting different species of bicyclists can be just as rewarding. Why travel to Argentina to find a black-breasted plovercrest when one can spy a similarly plumed “Commuter” at the neighborhood Starbucks? No need to squint into binoculars or get up at the crack of dawn, either—bicyclists are out and about at all hours.

Bicyclist-watching has become much more interesting in recent years as the number of two-wheeled riders has grown. High gas prices, better bicycles, concern about the environment, looking cool—they’re all contributing factors. And with proliferation has come specialization. People don’t just “ride” bikes anymore: They commute or race or cruise, with each activity spawning corresponding gear and attitudes. Those in the field categorize cyclists into groups known as “bike tribes.” Instead of ducks, hawks and water fowl, bicyclologists might speak of Roadies, Cyclocrossers and Beach Cruisers.

To identify a bike tribe, note distinguishing marks, patterns and habits. Start with the dominant color and materials of a cyclist’s clothing. For example, garish jerseys and Lycra shorts indicate a Roadie, while padded gloves, mud-spattered jackets and black cleats are the territory of Cyclocrossers. Migration patterns are revealing. Observe the speed of travel and the treatment of other cyclists. Does the cyclist insist on riding amid cars even when wide bicycle paths are available? Probably a Roadie. Is the cyclist out in the pouring rain? Sounds like a Commuter. The presence of juveniles is telling, too; only a few tribes travel with offspring.

The Roadie

No bike tribe is more common in the United States than the Roadie. Their mien is sportiness and “performance” their goal. Roadies love passing other bike riders; they get annoyed when they have to dodge pedestrians walking with dogs or small children; they often ride in the middle of the road. They tend to travel in packs and spend time in small bicycle shops.

The Commuter

Commuters view a bicycle first and foremost as a means of transportation. They don’t ride without a destination. It’s easy to confuse Commuters with other tribes because others will sometimes use their bicycles to get to work. Even more challenging, Commuters come in all shapes and sizes and ride all different types of bicycles. But there are some distinguishing behaviors. Commuters almost always travel alone. They tend to wear drabber clothing than other tribes. Some adopt a smug, I’m-saving-the-world attitude, which is apparent in the way they glare at motorists. Commuters are most visible during rush hour.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Bradley Wiggins, Winner 2012 Tour de France.[end-div]

Social Media and Vanishing History

Social media is great for notifying the members of one’s circle about events in the here and now. Of course, most events turn out to be rather trivial, of the “what I ate for dinner” kind. However, social media also has a role in spreading word of more momentous social and political events; the Arab Spring comes to mind.

But, while Twitter and its peers may be a boon for those who live in the present moment and need to transmit their current status, it seems that our social networks are letting go of the past. Will history become lost and irrelevant to the Twitter generation?

A terrifying thought.

[div class=attrib]From Technology Review:[end-div]

On 25 January 2011, a popular uprising began in Egypt that led to the overthrow of the country’s brutal president and to the first truly free elections. One of the defining features of this uprising and of others in the Arab Spring was the way people used social media to organise protests and to spread news.

Several websites have since begun the task of curating this content, which is an important record of events and how they unfolded. That led Hany SalahEldeen and Michael Nelson at Old Dominion University in Norfolk, Virginia, to take a deeper look at the material to see how much of the shared content was still live.

What they found has serious implications. SalahEldeen and Nelson say a significant proportion of the websites that this social media points to has disappeared. And the same pattern occurs for other culturally significant events, such as the H1N1 virus outbreak, Michael Jackson’s death and the Syrian uprising.

In other words, our history, as recorded by social media, is slowly leaking away.

Their method is straightforward. SalahEldeen and Nelson looked for tweets on six culturally significant events that occurred between June 2009 and March 2012. They then filtered the URLs these tweets pointed to and checked to see whether the content was still available on the web, either in its original form or in an archived form.

They found that the older the social media, the more likely its content was to be missing. In fact, they found an almost linear relationship between time and the percentage lost.

The numbers are startling. They say that 11 per cent of the social media content had disappeared within a year and 27 per cent within 2 years. Beyond that, SalahEldeen and Nelson say the world loses 0.02 per cent of its culturally significant social media material every day.

That’s a sobering thought.
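
As a rough back-of-envelope check (our arithmetic, not the researchers’), here is what a steady loss of 0.02 per cent of the remaining material per day implies over longer horizons, compounded daily:

# Project the quoted steady-state rate of 0.02 per cent per day forward.
daily_loss = 0.0002

for years in (1, 2, 5, 10):
    remaining = (1 - daily_loss) ** (365 * years)
    print(f"{years} years: {(1 - remaining) * 100:.1f}% lost")
# 1 years: 7.0% lost
# 2 years: 13.6% lost
# 5 years: 30.6% lost
# 10 years: 51.8% lost

The measured one- and two-year losses quoted above (11 and 27 per cent) ran well ahead of this steady rate, so the projection is best read as a floor rather than a forecast.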

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Movie poster for “The Man Without a Past” (Finnish: Mies vailla menneisyyttä), a 2002 Finnish comedy-drama film directed by Aki Kaurismäki. Courtesy of Wikipedia.[end-div]

Watch Out Corporate America: Gen-Y is Coming

Social scientists have had Generation-Y, also known as “millennials”, under their microscopes for a while. Born between 1982 and 1999, Gen-Y is now coming of age and becoming a force in the workplace, displacing aging “boomers” as they retire to the hills. So, researchers are now looking at how Gen-Y is faring inside corporate America. Remember, Gen-Y is the “it’s all about me” generation; members are characterized as typically lazy and spoiled, with a grandiose sense of entitlement, inflated self-esteem and deep emotional fragility. Their predecessors, the baby boomers, on the other hand, are often seen as over-bearing, work-obsessed, competitive and narrow-minded. A clash of cultures is taking shape in office cubes across the country as these groups, with such differing personalities and philosophies, tussle within the workplace. However, it may not be all bad, as columnist Emily Matchar argues below — corporate America needs the kind of shake-up that Gen-Y promises.

[div class=attrib]From the Washington Post:[end-div]

Have you heard the one about the kid who got his mom to call his boss and ask for a raise? Or about the college student who quit her summer internship because it forbade Facebook in the office?

Yep, we’re talking about Generation Y — loosely defined as those born between 1982 and 1999 — also known as millennials. Perhaps you know them by their other media-generated nicknames: teacup kids, for their supposed emotional fragility; boomerang kids, who always wind up back home; trophy kids — everyone’s a winner!; the Peter Pan generation, who’ll never grow up.

Now this pampered, over-praised, relentlessly self-confident generation (at age 30, I consider myself a sort of older sister to them) is flooding the workplace. They’ll make up 75 percent of the American workforce by 2025 — and they’re trying to change everything.

These are the kids, after all, who text their dads from meetings. They think “business casual” includes skinny jeans. And they expect the company president to listen to their “brilliant idea.”

When will they adapt?

They won’t. Ever. Instead, through their sense of entitlement and inflated self-esteem, they’ll make the modern workplace adapt to them. And we should thank them for it. Because the modern workplace frankly stinks, and the changes wrought by Gen Y will be good for everybody.

Few developed countries demand as much from their workers as the United States. Americans spend more time at the office than citizens of most other developed nations. Annually, we work 408 hours more than the Dutch, 374 hours more than the Germans and 311 hours more than the French. We even work 59 hours more than the stereotypically nose-to-the-grindstone Japanese. Though women make up half of the American workforce, the United States is the only country in the developed world without guaranteed paid maternity leave.

All this hard work is done for less and less reward. Wages have been stagnant for years, benefits shorn, opportunities for advancement blocked. While the richest Americans get richer, middle-class workers are left to do more with less. Because jobs are scarce and we’re used to a hierarchical workforce, we accept things the way they are. Worse, we’ve taken our overwork as a badge of pride. Who hasn’t flushed with a touch of self-importance when turning down social plans because we’re “too busy with work”?

Into this sorry situation strolls the self-esteem generation, printer-fresh diplomas in hand. And they’re not interested in business as usual.

The current corporate culture simply doesn’t make sense to much of middle-class Gen Y. Since the cradle, these privileged kids have been offered autonomy, control and choices (“Green pants or blue pants today, sweetie?”). They’ve been encouraged to show their creativity and to take their extracurricular interests seriously. Raised by parents who wanted to be friends with their kids, they’re used to seeing their elders as peers rather than authority figures. When they want something, they’re not afraid to say so.

[div class=attrib]Read the entire article after the jump.[end-div]

Extreme Equals Happy, Moderate Equals Unhappy

[div class=attrib]From the New York Times:[end-div]

WHO is happier about life — liberals or conservatives? The answer might seem straightforward. After all, there is an entire academic literature in the social sciences dedicated to showing conservatives as naturally authoritarian, dogmatic, intolerant of ambiguity, fearful of threat and loss, low in self-esteem and uncomfortable with complex modes of thinking. And it was the candidate Barack Obama in 2008 who infamously labeled blue-collar voters “bitter,” as they “cling to guns or religion.” Obviously, liberals must be happier, right?

Wrong. Scholars on both the left and right have studied this question extensively, and have reached a consensus that it is conservatives who possess the happiness edge. Many data sets show this. For example, the Pew Research Center in 2006 reported that conservative Republicans were 68 percent more likely than liberal Democrats to say they were “very happy” about their lives. This pattern has persisted for decades. The question isn’t whether this is true, but why.

Many conservatives favor an explanation focusing on lifestyle differences, such as marriage and faith. They note that most conservatives are married; most liberals are not. (The percentages are 53 percent to 33 percent, according to my calculations using data from the 2004 General Social Survey, and almost none of the gap is due to the fact that liberals tend to be younger than conservatives.) Marriage and happiness go together. If two people are demographically the same but one is married and the other is not, the married person will be 18 percentage points more likely to say he or she is very happy than the unmarried person.

An explanation for the happiness gap more congenial to liberals is that conservatives are simply inattentive to the misery of others. If they recognized the injustice in the world, they wouldn’t be so cheerful. In the words of Jaime Napier and John Jost, New York University psychologists, in the journal Psychological Science, “Liberals may be less happy than conservatives because they are less ideologically prepared to rationalize (or explain away) the degree of inequality in society.” The academic parlance for this is “system justification.”

The data show that conservatives do indeed see the free enterprise system in a sunnier light than liberals do, believing in each American’s ability to get ahead on the basis of achievement. Liberals are more likely to see people as victims of circumstance and oppression, and doubt whether individuals can climb without governmental help. My own analysis using 2005 survey data from Syracuse University shows that about 90 percent of conservatives agree that “While people may begin with different opportunities, hard work and perseverance can usually overcome those disadvantages.” Liberals — even upper-income liberals — are a third less likely to say this.

So conservatives are ignorant, and ignorance is bliss, right? Not so fast, according to a study from the University of Florida psychologists Barry Schlenker and John Chambers and the University of Toronto psychologist Bonnie Le in the Journal of Research in Personality. These scholars note that liberals define fairness and an improved society in terms of greater economic equality. Liberals then condemn the happiness of conservatives, because conservatives are relatively untroubled by a problem that, it turns out, their political counterparts defined.

There is one other noteworthy political happiness gap that has gotten less scholarly attention than conservatives versus liberals: moderates versus extremists.

Political moderates must be happier than extremists, it always seemed to me. After all, extremists actually advertise their misery with strident bumper stickers that say things like, “If you’re not outraged, you’re not paying attention!”

But it turns out that’s wrong. People at the extremes are happier than political moderates. Correcting for income, education, age, race, family situation and religion, the happiest Americans are those who say they are either “extremely conservative” (48 percent very happy) or “extremely liberal” (35 percent). Everyone else is less happy, with the nadir at dead-center “moderate” (26 percent).

What explains this odd pattern? One possibility is that extremists have the whole world figured out, and sorted into good guys and bad guys. They have the security of knowing what’s wrong, and whom to fight. They are the happy warriors.

Whatever the explanation, the implications are striking. The Occupy Wall Street protesters may have looked like a miserable mess. In truth, they were probably happier than the moderates making fun of them from the offices above. And none, it seems, are happier than the Tea Partiers, many of whom cling to guns and faith with great tenacity. Which some moderately liberal readers of this newspaper might find quite depressing.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Psychology Today.[end-div]

The Wantologist

This may sound like another job from the future, but “wantologists” wander among us in 2012.

[div class=attrib]From the New York Times:[end-div]

IN the sprawling outskirts of San Jose, Calif., I find myself at the apartment door of Katherine Ziegler, a psychologist and wantologist. Could it be, I wonder, that there is such a thing as a wantologist, someone we can hire to figure out what we want? Have I arrived at some final telling moment in my research on outsourcing intimate parts of our lives, or at the absurdist edge of the market frontier?

A willowy woman of 55, Ms. Ziegler beckons me in. A framed Ph.D. degree in psychology from the University of Illinois hangs on the wall, along with an intricate handmade quilt and a collage of images clipped from magazines — the back of a child’s head, a gnarled tree, a wandering cat — an odd assemblage that invites one to search for a connecting thread.

After a 20-year career as a psychologist, Ms. Ziegler expanded her practice to include executive coaching, life coaching and wantology. Originally intended to help business managers make purchasing decisions, wantology is the brainchild of Kevin Kreitman, an industrial engineer who set up a two-day class to train life coaches to apply this method to individuals in private life. Ms. Ziegler took the course and was promptly certified in the new field.

Ms. Ziegler explains that the first step in thinking about a “want” is to ask your client, “ ‘Are you floating or navigating toward your goal?’ A lot of people float. Then you ask, ‘What do you want to feel like once you have what you want?’ ”

She described her experience with a recent client, a woman who lived in a medium-size house with a small garden but yearned for a bigger house with a bigger garden. She dreaded telling her husband, who had long toiled at renovations on their present home, and she feared telling her son, who she felt would criticize her for being too materialistic.

Ms. Ziegler took me through the conversation she had with this woman: “What do you want?”

“A bigger house.”

“How would you feel if you lived in a bigger house?”

“Peaceful.”

“What other things make you feel peaceful?”

“Walks by the ocean.” (The ocean was an hour’s drive away.)

“Do you ever take walks nearer where you live that remind you of the ocean?”

“Certain ones, yes.”

“What do you like about those walks?”

“I hear the sound of water and feel surrounded by green.”

This gentle line of questions nudged the client toward a more nuanced understanding of her own desire. In the end, the woman dedicated a small room in her home to feeling peaceful. She filled it with lush ferns. The greenery encircled a bubbling slate-and-rock tabletop fountain. Sitting in her redesigned room in her medium-size house, the woman found the peace for which she’d yearned.

I was touched by the story. Maybe Ms. Ziegler’s client just needed a good friend who could listen sympathetically and help her work out her feelings. Ms. Ziegler provided a service — albeit one with a wacky name — for a fee. Still, the mere existence of a paid wantologist indicates just how far the market has penetrated our intimate lives. Can it be that we are no longer confident to identify even our most ordinary desires without a professional to guide us?

Is the wantologist the tail end of a larger story? Over the last century, the world of services has changed greatly.

A hundred — or even 40 — years ago, human eggs and sperm were not for sale, nor were wombs for rent. Online dating companies, nameologists, life coaches, party animators and paid graveside visitors did not exist.

Nor had a language developed that so seamlessly melded village and market — as in “Rent-a-Mom,” “Rent-a-Dad,” “Rent-a-Grandma,” “Rent-a-Friend” — insinuating itself, half joking, half serious, into our culture. The explosion in the number of available personal services says a great deal about changing ideas of what we can reasonably expect from whom. In the late 1940s, there were 2,500 clinical psychologists licensed in the United States. By 2010, there were 77,000 — and an additional 50,000 marriage and family therapists.

[div class=attrib]Read the entire article after the jump.[end-div]

Loneliness in the Age of Connectedness

Online social networks are a boon to researchers. As never before, social scientists are probing our connections, our innermost thoughts now made public, our networks of friends, and our loneliness. Some academics blame the likes of Facebook for making our increasingly shallow “friendships” a disposable and tradable commodity, and for ironically facilitating isolation from more intimate and deeper connections. Others see Facebook merely as a mirror — we have, quite simply, made ourselves lonely, and our social networks instantly and starkly expose our isolation for all to see and “like”.

An insightful article by novelist Stephen Marche over at The Atlantic examines our self-imposed loneliness.

[div class=attrib]From the Atlantic:[end-div]

Yvette Vickers, a former Playboy playmate and B-movie star, best known for her role in Attack of the 50 Foot Woman, would have been 83 last August, but nobody knows exactly how old she was when she died. According to the Los Angeles coroner’s report, she lay dead for the better part of a year before a neighbor and fellow actress, a woman named Susan Savage, noticed cobwebs and yellowing letters in her mailbox, reached through a broken window to unlock the door, and pushed her way through the piles of junk mail and mounds of clothing that barricaded the house. Upstairs, she found Vickers’s body, mummified, near a heater that was still running. Her computer was on too, its glow permeating the empty space.

The Los Angeles Times posted a story headlined “Mummified Body of Former Playboy Playmate Yvette Vickers Found in Her Benedict Canyon Home,” which quickly went viral. Within two weeks, by Technorati’s count, Vickers’s lonesome death was already the subject of 16,057 Facebook posts and 881 tweets. She had long been a horror-movie icon, a symbol of Hollywood’s capacity to exploit our most basic fears in the silliest ways; now she was an icon of a new and different kind of horror: our growing fear of loneliness. Certainly she received much more attention in death than she did in the final years of her life. With no children, no religious group, and no immediate social circle of any kind, she had begun, as an elderly woman, to look elsewhere for companionship. Savage later told Los Angeles magazine that she had searched Vickers’s phone bills for clues about the life that led to such an end. In the months before her grotesque death, Vickers had made calls not to friends or family but to distant fans who had found her through fan conventions and Internet sites.

Vickers’s web of connections had grown broader but shallower, as has happened for many of us. We are living in an isolation that would have been unimaginable to our ancestors, and yet we have never been more accessible. Over the past three decades, technology has delivered to us a world in which we need not be out of contact for a fraction of a moment. In 2010, at a cost of $300 million, 800 miles of fiber-optic cable was laid between the Chicago Mercantile Exchange and the New York Stock Exchange to shave three milliseconds off trading times. Yet within this world of instant and absolute communication, unbounded by limits of time or space, we suffer from unprecedented alienation. We have never been more detached from one another, or lonelier. In a world consumed by ever more novel modes of socializing, we have less and less actual society. We live in an accelerating contradiction: the more connected we become, the lonelier we are. We were promised a global village; instead we inhabit the drab cul-de-sacs and endless freeways of a vast suburb of information.

At the forefront of all this unexpectedly lonely interactivity is Facebook, with 845 million users and $3.7 billion in revenue last year. The company hopes to raise $5 billion in an initial public offering later this spring, which will make it by far the largest Internet IPO in history. Some recent estimates put the company’s potential value at $100 billion, which would make it larger than the global coffee industry—one addiction preparing to surpass the other. Facebook’s scale and reach are hard to comprehend: last summer, Facebook became, by some counts, the first Web site to receive 1 trillion page views in a month. In the last three months of 2011, users generated an average of 2.7 billion “likes” and comments every day. On whatever scale you care to judge Facebook—as a company, as a culture, as a country—it is vast beyond imagination.

Despite its immense popularity, or more likely because of it, Facebook has, from the beginning, been under something of a cloud of suspicion. The depiction of Mark Zuckerberg, in The Social Network, as a bastard with symptoms of Asperger’s syndrome, was nonsense. But it felt true. It felt true to Facebook, if not to Zuckerberg. The film’s most indelible scene, the one that may well have earned it an Oscar, was the final, silent shot of an anomic Zuckerberg sending out a friend request to his ex-girlfriend, then waiting and clicking and waiting and clicking—a moment of superconnected loneliness preserved in amber. We have all been in that scene: transfixed by the glare of a screen, hungering for response.

When you sign up for Google+ and set up your Friends circle, the program specifies that you should include only “your real friends, the ones you feel comfortable sharing private details with.” That one little phrase, Your real friends—so quaint, so charmingly mothering—perfectly encapsulates the anxieties that social media have produced: the fears that Facebook is interfering with our real friendships, distancing us from each other, making us lonelier; and that social networking might be spreading the very isolation it seemed designed to conquer.

Facebook arrived in the middle of a dramatic increase in the quantity and intensity of human loneliness, a rise that initially made the site’s promise of greater connection seem deeply attractive. Americans are more solitary than ever before. In 1950, less than 10 percent of American households contained only one person. By 2010, nearly 27 percent of households had just one person. Solitary living does not guarantee a life of unhappiness, of course. In his recent book about the trend toward living alone, Eric Klinenberg, a sociologist at NYU, writes: “Reams of published research show that it’s the quality, not the quantity of social interaction, that best predicts loneliness.” True. But before we begin the fantasies of happily eccentric singledom, of divorcées dropping by their knitting circles after work for glasses of Drew Barrymore pinot grigio, or recent college graduates with perfectly articulated, Steampunk-themed, 300-square-foot apartments organizing croquet matches with their book clubs, we should recognize that it is not just isolation that is rising sharply. It’s loneliness, too. And loneliness makes us miserable.

We know intuitively that loneliness and being alone are not the same thing. Solitude can be lovely. Crowded parties can be agony. We also know, thanks to a growing body of research on the topic, that loneliness is not a matter of external conditions; it is a psychological state. A 2005 analysis of data from a longitudinal study of Dutch twins showed that the tendency toward loneliness has roughly the same genetic component as other psychological problems such as neuroticism or anxiety.
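
A quick aside on what a “genetic component” means in a twin design like the Dutch study mentioned above: the classical approach compares how strongly identical (MZ) twins correlate on a trait with how strongly fraternal (DZ) twins do, and Falconer’s formula converts those two correlations into rough variance shares. The correlations below are placeholders, not the study’s actual figures.

# Falconer's classical twin decomposition; the correlations are placeholders,
# not estimates taken from the 2005 Dutch twin analysis cited above.
def falconer(r_mz: float, r_dz: float) -> dict:
    """Split trait variance into additive genetic (h2), shared environment (c2),
    and non-shared environment plus error (e2) from MZ/DZ twin correlations."""
    h2 = 2 * (r_mz - r_dz)   # heritability
    c2 = r_mz - h2           # shared environment
    e2 = 1 - r_mz            # everything else
    return {"h2": h2, "c2": c2, "e2": e2}

# Placeholder correlations for a loneliness score in MZ vs. DZ twin pairs.
print(falconer(r_mz=0.5, r_dz=0.25))   # -> {'h2': 0.5, 'c2': 0.0, 'e2': 0.5}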

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Photograph courtesy of Phillip Toledano / The Atlantic.[end-div]

Our Children: Independently Dependent

Why can’t our kids tie their own shoes?

Are we raising our children to be self-obsessed, attention-seeking, helpless and dependent groupthinkers? And why might the phenomenon of “family time” in the U.S. be a key culprit?

These are some of the questions raised by anthropologist Elinor Ochs and her colleagues. Over the last decade they have studied family life across the globe, from the Amazon region to Samoa to middle America.

[div class=attrib]From the Wall Street Journal:[end-div]

Why do American children depend on their parents to do things for them that they are capable of doing for themselves? How do U.S. working parents’ views of “family time” affect their stress levels? These are just two of the questions that researchers at UCLA’s Center on Everyday Lives of Families, or CELF, are trying to answer in their work.

By studying families at home—or, as the scientists say, “in vivo”—rather than in a lab, they hope to better grasp how families with two working parents balance child care, household duties and career, and how this balance affects their health and well-being.

The center, which also includes sociologists, psychologists and archeologists, wants to understand “what the middle class thought, felt and what they did,” says Dr. Ochs. The researchers plan to publish two books this year on their work, and say they hope the findings may help families become closer and healthier.

Ten years ago, the UCLA team recorded video for a week of nearly every moment at home in the lives of 32 Southern California families. They have been picking apart the footage ever since, scrutinizing behavior, comments and even their refrigerators’ contents for clues.

The families, recruited primarily through ads, owned their own homes and had two or three children, at least one of whom was between 7 and 12 years old. About a third of the families had at least one nonwhite member, and two were headed by same-sex couples. Each family was filmed by two cameras and watched all day by at least three observers.

Among the findings: The families had a very child-centered focus, which may help explain the “dependency dilemma” seen among American middle-class families, says Dr. Ochs. Parents intend to develop their children’s independence, yet raise them to be relatively dependent, even when the kids have the skills to act on their own, she says.

In addition, these parents tended to have a very specific, idealized way of thinking about family time, says Tami Kremer-Sadlik, a former CELF research director who is now the director of programs for the division of social sciences at UCLA. These ideals appeared to generate guilt when work intruded on family life, and left parents feeling pressured to create perfect time together. The researchers noted that the presence of the observers may have altered some of the families’ behavior.

How kids develop moral responsibility is an area of focus for the researchers. Dr. Ochs, who began her career in far-off regions of the world studying the concept of “baby talk,” noticed that American children seemed relatively helpless compared with those in other cultures she and colleagues had observed.

In those cultures, young children were expected to contribute substantially to the community, says Dr. Ochs. Children in Samoa serve food to their elders, waiting patiently in front of them before they eat, as shown in one video snippet. Another video clip shows a girl around 5 years of age in Peru’s Amazon region climbing a tall tree to harvest papaya, and helping haul logs thicker than her leg to stoke a fire.

By contrast, the U.S. videos showed Los Angeles parents focusing more on the children, using simplified talk with them, doing most of the housework and intervening quickly when the kids had trouble completing a task.

In 22 of 30 families, children frequently ignored or resisted appeals to help, according to a study published in the journal Ethos in 2009. In the remaining eight families, the children weren’t asked to do much. In some cases, the children routinely asked the parents to do tasks, like getting them silverware. “How am I supposed to cut my food?” Dr. Ochs recalls one girl asking her parents.

Asking children to do a task led to much negotiation, and when parents asked, it often sounded like they were asking a favor, not making a demand, researchers said. Parents interviewed about their behavior said it was often too much trouble to ask.

For instance, one exchange caught on video shows an 8-year-old named Ben sprawled out on a couch near the front door, lifting his white, high-top sneaker to his father, the shoe laced. “Dad, untie my shoe,” he pleads. His father says Ben needs to say “please.”

“Please untie my shoe,” says the child in an identical tone as before. After his father hands the shoe back to him, Ben says, “Please put my shoe on and tie it,” and his father obliges.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Kyle T. Webster / Wall Street Journal.[end-div]

Time for An Over-The-Counter Morality Pill?

Stories of people who risk life and limb to help a stranger, and of those who turn a blind eye, are as current as they are ancient. Almost daily the 24-hour news cycle carries a heartwarming story of someone doing good for another; seemingly just as often comes a story of indifference. Social and psychological researchers have studied this behavior in humans and animals for decades. However, only recently has progress been made in identifying some of the underlying factors. Peter Singer, a professor of bioethics at Princeton University, and researcher Agata Sagan recap some of the current understanding.

All of this leads to a conundrum: would it be ethical to market a “morality” pill that would make us do more good more often?

[div class=attrib]From the New York Times:[end-div]

Last October, in Foshan, China, a 2-year-old girl was run over by a van. The driver did not stop. Over the next seven minutes, more than a dozen people walked or bicycled past the injured child. A second truck ran over her. Eventually, a woman pulled her to the side, and her mother arrived. The child died in a hospital. The entire scene was captured on video and caused an uproar when it was shown by a television station and posted online. A similar event occurred in London in 2004, as have others, far from the lens of a video camera.

Yet people can, and often do, behave in very different ways.

A news search for the words “hero saves” will routinely turn up stories of bystanders braving oncoming trains, swift currents and raging fires to save strangers from harm. Acts of extreme kindness, responsibility and compassion are, like their opposites, nearly universal.

Why are some people prepared to risk their lives to help a stranger when others won’t even stop to dial an emergency number?

Scientists have been exploring questions like this for decades. In the 1960s and early ’70s, famous experiments by Stanley Milgram and Philip Zimbardo suggested that most of us would, under specific circumstances, voluntarily do great harm to innocent people. During the same period, John Darley and C. Daniel Batson showed that even some seminary students on their way to give a lecture about the parable of the Good Samaritan would, if told that they were running late, walk past a stranger lying moaning beside the path. More recent research has told us a lot about what happens in the brain when people make moral decisions. But are we getting any closer to understanding what drives our moral behavior?

Here’s what much of the discussion of all these experiments missed: Some people did the right thing. A recent experiment (about which we have some ethical reservations) at the University of Chicago seems to shed new light on why.

Researchers there took two rats who shared a cage and trapped one of them in a tube that could be opened only from the outside. The free rat usually tried to open the door, eventually succeeding. Even when the free rats could eat up all of a quantity of chocolate before freeing the trapped rat, they mostly preferred to free their cage-mate. The experimenters interpret their findings as demonstrating empathy in rats. But if that is the case, they have also demonstrated that individual rats vary, for only 23 of 30 rats freed their trapped companions.

The causes of the difference in their behavior must lie in the rats themselves. It seems plausible that humans, like rats, are spread along a continuum of readiness to help others. There has been considerable research on abnormal people, like psychopaths, but we need to know more about relatively stable differences (perhaps rooted in our genes) in the great majority of people as well.

Undoubtedly, situational factors can make a huge difference, and perhaps moral beliefs do as well, but if humans are just different in their predispositions to act morally, we also need to know more about these differences. Only then will we gain a proper understanding of our moral behavior, including why it varies so much from person to person and whether there is anything we can do about it.

[div class=attrib]Read more here.[end-div]

Forget the Groupthink: Rise of the Introvert

Author Susan Cain discusses her intriguing book, “Quiet: The Power of Introverts”, in an interview with Gareth Cook over at Mind Matters / Scientific American.

She shows us how social and business interactions and group-driven processes, often led and coordinated by extroverts, may not be the best way for introverts to shine creatively.

[div class=attrib]From Mind Matters:[end-div]

Cook: This may be a stupid question, but how do you define an introvert? How can somebody tell whether they are truly introverted or extroverted?

Cain: Not a stupid question at all! Introverts prefer quiet, minimally stimulating environments, while extroverts need higher levels of stimulation to feel their best. Stimulation comes in all forms – social stimulation, but also lights, noise, and so on. Introverts even salivate more than extroverts do if you place a drop of lemon juice on their tongues! So an introvert is more likely to enjoy a quiet glass of wine with a close friend than a loud, raucous party full of strangers.

It’s also important to understand that introversion is different from shyness. Shyness is the fear of negative judgment, while introversion is simply the preference for less stimulation. Shyness is inherently uncomfortable; introversion is not. The traits do overlap, though psychologists debate to what degree.

Cook: You argue that our culture has an extroversion bias. Can you explain what you mean?

Cain: In our society, the ideal self is bold, gregarious, and comfortable in the spotlight. We like to think that we value individuality, but mostly we admire the type of individual who’s comfortable “putting himself out there.” Our schools, workplaces, and religious institutions are designed for extroverts. Introverts are to extroverts what American women were to men in the 1950s — second-class citizens with gigantic amounts of untapped talent.

In my book, I travel the country – from a Tony Robbins seminar to Harvard Business School to Rick Warren’s powerful Saddleback Church – shining a light on the bias against introversion. One of the most poignant moments was when an evangelical pastor I met at Saddleback confided his shame that “God is not pleased” with him because he likes spending time alone.

Cook: How does this cultural inclination affect introverts?

Cain: Many introverts feel there’s something wrong with them, and try to pass as extroverts. But whenever you try to pass as something you’re not, you lose a part of yourself along the way. You especially lose a sense of how to spend your time. Introverts are constantly going to parties and such when they’d really prefer to be home reading, studying, inventing, meditating, designing, thinking, cooking…or any number of other quiet and worthwhile activities.

According to the latest research, one third to one half of us are introverts – that’s one out of every two or three people you know. But you’d never guess that, right? That’s because introverts learn from an early age to act like pretend-extroverts.

[div class=attrib]Read the entire article here.[end-div]

Handedness Shapes Perception and Morality

A group of new research studies shows that our left- or right-handedness shapes our perception of “goodness” and “badness”.

[div class=attrib]From Scientific American:[end-div]

A series of studies led by psychologist Daniel Casasanto suggests that one thing that may shape our choice is the side of the menu an item appears on. Specifically, Casasanto and his team have shown that for left-handers, the left side of any space connotes positive qualities such as goodness, niceness, and smartness. For right-handers, the right side of any space connotes these same virtues. He calls this idea that “people with different bodies think differently, in predictable ways” the body-specificity hypothesis.

In one of Casasanto’s experiments, adult participants were shown pictures of two aliens side by side and instructed to circle the alien that best exemplified an abstract characteristic. For example, participants may have been asked to circle the “more attractive” or “less honest” alien. Of the participants who showed a directional preference (most participants did), the majority of right-handers attributed positive characteristics more often to the aliens on the right whereas the majority of left-handers attributed positive characteristics more often to aliens on the left.

Handedness was found to predict choice in experiments mirroring real-life situations as well. When participants read near-identical product descriptions on either side of a page and were asked to indicate the products they wanted to buy, most righties chose the item described on the right side while most lefties chose the product on the left. Similarly, when subjects read side-by-side resumes from two job applicants presented in a random order, they were more likely to choose the candidate described on their dominant side.

Follow-up studies on children yielded similar results. In one experiment, children were shown a drawing of a bookshelf with a box to the left and a box to the right. They were then asked to think of a toy they liked and a toy they disliked and choose the boxes in which they would place the toys. Children tended to choose to place their preferred toy in the box to their dominant side and the toy they did not like to their non-dominant side.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image: Drawing Hands by M. C. Escher, 1948, Lithograph. Courtesy of Wikipedia.[end-div]

An Evolutionary Benefit to Self-deception

[div class=attrib]From Scientific American:[end-div]

We lie to ourselves all the time. We tell ourselves that we are better than average — that we are more moral, more capable, less likely to become sick or suffer an accident. It’s an odd phenomenon, and an especially puzzling one to those who think about our evolutionary origins. Self-deception is so pervasive that it must confer some advantage. But how could we be well served by a brain that deceives us? This is one of the topics tackled by Robert Trivers in his new book, “The Folly of Fools,” a colorful survey of deception that includes plane crashes, neuroscience and the transvestites of the animal world. He answered questions from Mind Matters editor Gareth Cook.

Cook: Do you have any favorite examples of deception in the natural world?

Trivers: Tough call. They are so numerous, intricate and bizarre. But you can hardly beat female mimics for general interest. These are males that mimic females in order to achieve closeness to a territory-holding male, who then attracts a real female ready to lay eggs. The territory-holding male imagines that he is in bed (so to speak) with two females, when really he is in bed with one female and another male, who, in turn, steals part of the paternity of the eggs being laid by the female. The internal dynamics of such transvestite threesomes are only just being analyzed. But for pure reproductive artistry one cannot beat the tiny blister beetles that assemble in arrays of hundreds to thousands, linking together to produce the larger illusion of a female solitary bee, which attracts a male bee who flies into the mirage in order to copulate and thereby carries the beetles to their next host.

Cook: At what age do we see the first signs of deception in humans?

Trivers: In the last trimester of pregnancy, that is, while the offspring is still inside its mother. The baby takes over control of the mother’s blood sugar level (raising it), pulse rate (raising it) and blood distribution (withdrawing it from extremities and positioning it above the developing baby). It does so by putting into the maternal blood stream the same chemicals—or close mimics—as those that the mother normally produces to control these variables. You could argue that this benefits mom. She says, my child knows better what it needs than I do, so let me give the child control. But it is not in the mother’s best interests to allow the offspring to get everything it wants; the mother must apportion her biological investment among other offspring, past, present and future. The proof is in the inefficiency of the new arrangement, the hallmark of conflict. The offspring produces these chemicals at 1,000 times the level that the mother does. This suggests a co-evolutionary struggle in which the mother’s body becomes deafer as the offspring becomes louder.

After birth, the first clear signs of deception come at about age 6 months, which is when the child fakes need when there appears to be no good reason. The child will scream and bawl, roll on the floor in apparent agony and yet stop within seconds after the audience leaves the room, only to resume within seconds when the audience is back. Later, the child will hide objects from the view of others and deny that it cares about a punishment when it clearly does. So-called ‘white lies’, of the sort “The meal you served was delicious”, appear after age 5.

[div class=attrib]Read the entire article here.[end-div]

On the Need for Charisma

[div class=attrib]From Project Syndicate:[end-div]

A leadership transition is scheduled in two major autocracies in 2012. Neither is likely to be a surprise. Xi Jinping is set to replace Hu Jintao as President in China, and, in Russia, Vladimir Putin has announced that he will reclaim the presidency from Dmitri Medvedev. Among the world’s democracies, political outcomes this year are less predictable. Nicolas Sarkozy faces a difficult presidential re-election campaign in France, as does Barack Obama in the United States.

In the 2008 US presidential election, the press told us that Obama won because he had “charisma” – the special power to inspire fascination and loyalty. If so, how can his re-election be uncertain just four years later? Can a leader lose his or her charisma? Does charisma originate in the individual, in that person’s followers, or in the situation? Academic research points to all three.

Charisma proves surprisingly hard to identify in advance. A recent survey concluded that “relatively little” is known about who charismatic leaders are. Dick Morris, an American political consultant, reports that in his experience, “charisma is the most elusive of political traits, because it doesn’t exist in reality; only in our perception once a candidate has made it by hard work and good issues.” Similarly, the business press has described many a CEO as “charismatic” when things are going well, only to withdraw the label when profits fall.

Political scientists have tried to create charisma scales that would predict votes or presidential ratings, but they have not proven fruitful. Among US presidents, John F. Kennedy is often described as charismatic, but obviously not for everyone, given that he failed to capture a majority of the popular vote, and his ratings varied during his presidency.

Kennedy’s successor, Lyndon Johnson, lamented that he lacked charisma. That was true of his relations with the public, but Johnson could be magnetic – even overwhelming – in personal contacts. One careful study of presidential rhetoric found that even such famous orators as Franklin Roosevelt and Ronald Reagan could not count on charisma to enact their programs.

Charisma is more easily identified after the fact. In that sense, the concept is circular. It is like the old Chinese concept of the “mandate of heaven”: emperors were said to rule because they had it, and when they were overthrown, it was because they had lost it.

But no one could predict when that would happen. Similarly, success is often used to prove – after the fact – that a modern political leader has charisma. It is much harder to use charisma to predict who will be a successful leader.

[div class=attrib]Read the entire article here.[end-div]

Social Influence Through Social Media: Not!

Online social networks are an unprecedentedly rich source of material for psychologists, social scientists and observers of human behavior. Now a recent study shows that influence through these networks may not be as powerful or widespread as first thought. The study, “Social Selection and Peer Influence in an Online Social Network,” by Kevin Lewis, Marco Gonzalez and Jason Kaufman is available here.

[div class=attrib]From the Wall Street Journal:[end-div]

Social media gives ordinary people unprecedented power to broadcast their taste in music, movies and books, but for the most part those tastes don’t rub off on other people, a new study of college students finds. Instead, social media appears to strengthen our bonds with people whose tastes already resemble ours.

Researchers followed the Facebook pages and networks of some 1,000 students, at one college, for four years (looking only at public information). The strongest determinant of Facebook friendship was “mere propinquity” — living in the same building, studying the same subject—but people also self-segregated by gender, race, socioeconomic background and place of origin.

When it came to culture, researchers used an algorithm to identify taste “clusters” within the categories of music, movies, and books. They learned that fans of “lite/classic rock” and “classical/jazz” were significantly more likely than chance would predict to form and maintain friendships, as were devotees of films featuring “dark satire” or “raunchy comedy / gore.” But this was the case for no other music or film genre — and for no books.

What’s more, “jazz/classical” was the only taste to spread from people who possessed it to those who lacked it. The researchers suggest that this is because liking jazz and classical music serves as a class marker, one that college-age people want to acquire. (I’d prefer to believe that they adopt those tastes on aesthetic grounds, but who knows?) “Indie/alt” music, in fact, was the opposite of contagious: People whose friends liked that style of music tended to drop that preference themselves, over time.
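
The excerpt mentions “an algorithm to identify taste ‘clusters’” without naming it, and the snippet below is not the authors’ method; it is only a rough sketch, on invented data, of one generic way to group students by declared tastes and then check whether Facebook friends share a cluster more often than chance would predict.

# Generic illustration only (invented data, k-means chosen arbitrarily): cluster
# students by declared tastes, then compare friends against a shuffled baseline.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_students, n_tastes = 200, 40        # e.g. 40 music/movie/book genres
tastes = (rng.random((n_students, n_tastes)) < 0.2).astype(float)  # binary "likes" matrix

clusters = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(tastes)

# Invented friendship edges: pairs of distinct student indices.
friends = rng.integers(0, n_students, size=(500, 2))
friends = friends[friends[:, 0] != friends[:, 1]]

def share_same_cluster(edges, labels):
    return np.mean(labels[edges[:, 0]] == labels[edges[:, 1]])

observed = share_same_cluster(friends, clusters)

# Permutation baseline: shuffling labels breaks any taste/friendship association.
baseline = np.mean([share_same_cluster(friends, rng.permutation(clusters))
                    for _ in range(1000)])
print(f"friends sharing a taste cluster: {observed:.3f} vs. chance: {baseline:.3f}")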

[div class=attrib]Read the entire article here.[end-div]

Will nostalgia destroy pop culture?

[div class=attrib]Thomas Rogers for Slate:[end-div]

Over the last decade, American culture has been overtaken by a curious, overwhelming sense of nostalgia. Everywhere you look, there seems to be some new form of revivalism going on. The charts are dominated by old-school-sounding acts like Adele and Mumford & Sons. The summer concert schedule is dominated by reunion tours. TV shows like VH1’s “I Love the 90s” allow us to endlessly rehash the catchphrases of the recent past. And, thanks to YouTube and iTunes, new forms of music and pop culture are facing increasing competition from the ever-more-accessible catalog of older acts.

In his terrific new book, “Retromania,” music writer Simon Reynolds looks at how this nostalgia obsession is playing itself out everywhere from fashion to performance art to electronic music — and comes away with a worrying prognosis. If we continue looking backward, he argues, we’ll never have transformative decades, like the 1960s, or bold movements like rock ‘n’ roll, again. If all we watch and listen to are things that we’ve seen and heard before, and revive trends that have already existed, culture becomes an inescapable feedback loop.

Salon spoke to Reynolds over the phone from Los Angeles about the importance of the 1960s, the strangeness of Mumford & Sons — and why our future could be defined by boredom.

In the book you argue that our culture has increasingly been obsessed with looking backward, and that’s a bad thing. What makes you say that?

Every day, some new snippet of news comes along that is somehow connected to reconsuming the past. Just the other day I read that the famous Reading Festival in Britain is going to be screening a 1992 Nirvana concert during their festival. These events are like cultural antimatter. They won’t be remembered 20 years from now, and the more of them there are, the more alarming it is. I can understand why people want to go to them — they’re attractive and comforting. But this nostalgia seems to have crept into everything. The other day my daughter, who is 5 years old, was at camp, and they had an ’80s day. How can my daughter even understand what that means? She said the counselors were dressed really weird.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Slate.[end-div]

Communicating Meaning in Cyberspace

Clarifying intent, emotion, wishes and meaning is a rather tricky and cumbersome process that we all navigate each day. Online in the digital world this is even more challenging, if not sometimes impossible. The pre-digital method of exchanging information in a social context would have been face-to-face. Such a method provides the full gamut of verbal and non-verbal dialogue between two or more parties. Importantly, it also provides a channel for the exchange of unconscious cues between people, which researchers are increasingly finding to be of critical importance during communication.

So, now replace the face-to-face interaction with email, texting, instant messaging, video chat, and other forms of digital communication and you have a new playground for researchers in the cognitive and social sciences. The intriguing question for researchers, and for all of us for that matter, is: how do we ensure our meaning, motivations and intent are expressed clearly through digital communications?

There are some partial answers over at Anthropology in Practice, which looks at how users of digital media express emotion, resolve ambiguity and communicate cross-culturally.

[div class=attrib]Anthropology in Practice:[end-div]

The ability to interpret social data is rooted in our theory of mind—our capacity to attribute mental states (beliefs, intents, desires, knowledge, etc.) to the self and to others. This cognitive development reflects some understanding of how other individuals relate to the world, allowing for the prediction of behaviors.1 As social beings we require consistent and frequent confirmation of our social placement. This confirmation is vital to the preservation of our networks—we need to be able to gauge the state of our relationships with others.

Research has shown that children whose capacity to mentalize is diminished find other ways to successfully interpret nonverbal social and visual cues 2-6, suggesting that the capacity to mentalize is necessary to social life. Digitally-mediated communication, such as text messaging and instant messaging, does not readily permit social biofeedback. However cyber communicators still find ways of conveying beliefs, desires, intent, deceit, and knowledge online, which may reflect an effort to preserve the capacity to mentalize in digital media.

The Challenges of Digitally-Mediated Communication

In its most basic form, digitally-mediated communication (DMC) is text-based, although the growth of video conferencing technology indicates DMC is still evolving. One of the biggest criticisms of DMC has been the lack of nonverbal cues, which are an important indicator of the speaker’s meaning, particularly when the message is ambiguous.

Email communicators are all too familiar with this issue. After all, in speech the same statement can have multiple meanings depending on tone, expression, emphasis, inflection, and gesture. Speech conveys not only what is said, but how it is said—and consequently, reveals a bit of the speaker’s mind to interested parties. In a plain-text environment like email only the typist knows whether a statement should be read with sarcasm.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

The Pervasive Threat of Conformity: Peer Pressure Is Here to Stay

[div class=attrib]From BigThink:[end-div]

Today, I’d like to revisit one of the most well-known experiments in social psychology: Solomon Asch’s lines study. Let’s look once more at his striking findings on the power of group conformity and consider what they mean now, more than 50 years later, in a world that is much changed from Asch’s 1950s America.

How long are these lines? I don’t know until you tell me.

In the 1950s, Solomon Asch conducted a series of studies to examine the effects of peer pressure, in as clear-cut a setting as possible: visual perception. The idea was to see if, when presented with lines of differing lengths and asked questions about the lines (Which was the longest? Which corresponded to a reference line of a certain length?), participants would answer with the choice that was obviously correct – or would fall sway to the pressure of a group that gave an incorrect response. Here is a sample stimulus from one of the studies:

Which line matches the reference line? It seems obvious, no? Now, imagine that you were in a group with six other people – and they all said that it was, in fact, Line B. You would have no idea that you were the only actual participant and that the group was carefully arranged with confederates, who were instructed to give that answer and were seated in such a way that they would answer before you. You’d think that they, like you, were participants in the study – and that they all gave what appeared to you to be a patently wrong answer. Would you call their bluff and say, no, the answer is clearly Line A? Are you all blind? Or, would you start to question your own judgment? Maybe it really is Line B. Maybe I’m just not seeing things correctly. How could everyone else be wrong and I be the only person who is right?

We don’t like to be the lone voice of dissent

While we’d all like to imagine that we fall into the second camp, statistically speaking, we are three times more likely to be in the first: over 75% of Asch’s subjects (and far more in the actual condition given above) gave the wrong answer, going along with the group opinion.

[div class=attrib]More from theSource here.[end-div]

The Good, the Bad and the Ugly – 40 years on

One of the most fascinating and (in)famous experiments in social psychology began in the bowels of Stanford University 40 years ago next month. The experiment was intended to evaluate how people react to being powerless; by its conclusion, however, it had become a broader look at role assignment and reactions to authority.

The Stanford Prison Experiment incarcerated male college student volunteers in a mock prison for 6 fateful days. Some of the students were selected to be prison guards; the remainder would be prisoners. The researchers, led by psychology professor Philip Zimbardo, encouraged the guards to think of themselves as actual guards in a real prison. What happened during these 6 days in “prison” is the stuff of social science legend. The results continue to shock psychologists to this day; many were not prepared for the outcome, which saw the guards take their roles to the extreme, becoming authoritarian and mentally abusive, and the prisoners become downtrodden and eventually rebellious. A whistle-blower eventually brought the experiment to an abrupt end (it was to have continued for 2 weeks).

Forty years on, researchers went back to interview Professor Zimbardo and some of the participating guards and prisoners to probe their feelings now. Recollections from one of the guards are below.

[div class=attrib]From Stanford Magazine:[end-div]

I was just looking for some summer work. I had a choice of doing this or working at a pizza parlor. I thought this would be an interesting and different way of finding summer employment.

The only person I knew going in was John Mark. He was another guard and wasn’t even on my shift. That was critical. If there were prisoners in there who knew me before they encountered me, then I never would have been able to pull off anything I did. The act that I put on—they would have seen through it immediately.

What came over me was not an accident. It was planned. I set out with a definite plan in mind, to try to force the action, force something to happen, so that the researchers would have something to work with. After all, what could they possibly learn from guys sitting around like it was a country club? So I consciously created this persona. I was in all kinds of drama productions in high school and college. It was something I was very familiar with: to take on another personality before you step out on the stage. I was kind of running my own experiment in there, by saying, “How far can I push these things and how much abuse will these people take before they say, ‘knock it off?'” But the other guards didn’t stop me. They seemed to join in. They were taking my lead. Not a single guard said, “I don’t think we should do this.”

The fact that I ramped up the intimidation and the mental abuse without any real sense as to whether I was hurting anybody— I definitely regret that. But in the long run, no one suffered any lasting damage. When the Abu Ghraib scandal broke, my first reaction was, this is so familiar to me. I knew exactly what was going on. I could picture myself in the middle of that and watching it spin out of control. When you have little or no supervision as to what you’re doing, and no one steps in and says, “Hey, you can’t do this”—things just keep escalating. You think, how can we top what we did yesterday? How do we do something even more outrageous? I felt a deep sense of familiarity with that whole situation.

Sometimes when people know about the experiment and then meet me, it’s like, My God, this guy’s a psycho! But everyone who knows me would just laugh at that.

[div class=attrib]More from theSource here.[end-div]

Culturally Specific Mental Disorders: A Bad Case of the Brain Fags

[div class=attrib]Image: Is this man buff enough? Courtesy of Slate.[end-div]

If you happen to have just read The Psychopath Test by Jon Ronson, this article in Slate is appropriately timely, and presents new fodder for continuing research (and a sequel). It would therefore come as no surprise to find Mr. Ronson trekking through Newfoundland in search of “Old Hag Syndrome”, a type of sleep paralysis; visiting art museums in Italy for “Stendhal Syndrome”, a delusional disorder experienced by Italians after studying artistic masterpieces; or checking on Nigerian college students afflicted by “Brain Fag Syndrome”. Then there are “Wild Man Syndrome” from New Guinea (a syndrome combining hyperactivity, clumsiness and forgetfulness); “Koro Syndrome” (a delusion of disappearing protruding body parts), first described in China over 2,000 years ago; “Jiko-shisen-kyofu” from Japan (a fear of offending others by glancing at them); and, here in the West, “Muscle Dysmorphia Syndrome” (a delusion common in weight-lifters that one’s body is insufficiently ripped).

All of these and more can be found in the current edition of the Diagnostic and Statistical Manual of Mental Disorders, the DSM-IV.

[div class=attrib]From Slate:[end-div]

In 1951, Hong Kong psychiatrist Pow-Meng Yap authored an influential paper in the Journal of Mental Sciences on the subject of “peculiar psychiatric disorders”—those that did not fit neatly into the dominant disease-model classification scheme of the time and yet appeared to be prominent, even commonplace, in certain parts of the world. Curiously these same conditions—which include “amok” in Southeast Asia and bouffée délirante in French-speaking countries—were almost unheard of outside particular cultural contexts. The American Psychiatric Association has conceded that certain mysterious mental afflictions are so common, in some places, that they do in fact warrant inclusion as “culture-bound syndromes” in the official Diagnostic and Statistical Manual of Mental Disorders.

The working version of this manual, the DSM-IV, specifies 25 such syndromes. Take “Old Hag Syndrome,” a type of sleep paralysis in Newfoundland in which one is visited by what appears to be a rather unpleasant old hag sitting on one’s chest at night. (If I were a bitter, divorced straight man, I’d probably say something diabolical about my ex-wife here.) Then there’s gururumba, or “Wild Man Syndrome,” in which New Guinean males become hyperactive, clumsy, kleptomaniacal, and conveniently amnesic, “Brain Fag Syndrome” (more on that in a moment), and “Stendhal Syndrome,” a delusional disorder experienced mostly by Italians after gazing upon artistic masterpieces. The DSM-IV defines culture-bound syndromes as “recurrent, locality-specific patterns of aberrant behavior and troubling experience that may or may not be linked to a particular diagnostic category.”

And therein lies the nosological pickle: The symptoms of culture-bound syndromes often overlap with more general, known psychiatric conditions that are universal in nature, such as schizophrenia, body dysmorphia, and social anxiety. What varies across cultures, and is presumably moulded by them, is the unique constellation of symptoms, or “idioms of distress.”

Some scholars believe that many additional distinct culture-bound syndromes exist. One that’s not in the manual but could be, argue psychiatrists Gen Kanayama and Harrison Pope in a short paper published earlier this year in the Harvard Review of Psychiatry, is “muscle dysmorphia.” The condition is limited to Western males, who suffer the delusion that they are insufficiently ripped. “As a result,” write the authors, “they may lift weights compulsively in the gym, often gain large amounts of muscle mass, yet still perceive themselves as too small.” Within body-building circles, in fact, muscle dysmorphia has long been recognized as a sort of reverse anorexia nervosa. But it’s almost entirely unheard of among Asian men. Unlike hypermasculine Western heroes such as Hercules, Thor, and the chiseled Arnold of yesteryear, the Japanese and Chinese have tended to prefer their heroes fully clothed, mentally acute, and lithe, argue Kanayama and Pope. In fact, they say anabolic steroid use is virtually nonexistent in Asian countries, even though the drugs are considerably easier to obtain, being available without a prescription at most neighborhood drugstores.

[div class=attrib]More from theSource here.[end-div]

Disconnected?

[div class=attrib]From Slate:[end-div]

Have you heard that divorce is contagious? A lot of people have. Last summer a study claiming to show that break-ups can propagate from friend to friend to friend like a marriage-eating bacillus spread across the news agar from CNN to CBS to ABC with predictable speed. “Think of this ‘idea’ of getting divorced, this ‘option’ of getting divorced like a virus, because it spreads more or less the same way,” explained University of California-San Diego professor James Fowler to the folks at Good Morning America.

It’s a surprising, quirky, and seemingly plausible finding, which explains why so many news outlets caught the bug. But one weird thing about the media outbreak was that the study on which it was based had never been published in a scientific journal. The paper had been posted to the Social Science Research Network web site, a sort of academic way station for working papers whose tagline is “Tomorrow’s Research Today.” But tomorrow had not yet come for the contagious divorce study: It had never actually passed peer review, and still hasn’t. “It is under review,” Fowler explained last week in an email. He co-authored the paper with his long-time collaborator, Harvard’s Nicholas Christakis, and lead author Rose McDermott.

A few months before the contagious divorce story broke, Slate ran an article I’d written based on a related, but also unpublished, scientific paper. The mathematician Russell Lyons had posted a dense treatise on his website suggesting that the methods employed by Christakis and Fowler in their social network studies were riddled with statistical errors at many levels. The authors were claiming—in the New England Journal of Medicine, in a popular book, in TED talks, in snappy PR videos—that everything from obesity to loneliness to poor sleep could spread from person to person to person like a case of the galloping crud. But according to Lyons and several other experts, their arguments were shaky at best. “It’s not clear that the social contagionists have enough evidence to be telling people that they owe it to their social network to lose weight,” I wrote last April. As for the theory that obesity and divorce and happiness contagions radiate from human beings through three degrees of friendship, I concluded “perhaps it’s best to flock away for now.”

The case against Christakis and Fowler has grown since then. The Lyons paper passed peer review and was published in the May issue of the journal Statistics, Politics, and Policy. Two other recent papers raise serious doubts about their conclusions. And now something of a consensus is forming within the statistics and social-networking communities that Christakis and Fowler’s headline-grabbing contagion papers are fatally flawed. Andrew Gelman, a professor of statistics at Columbia, wrote a delicately worded blog post in June noting that he’d “have to go with Lyons” and say that the claims of contagious obesity, divorce and the like “have not been convincingly demonstrated.” Another highly respected social-networking expert, Tom Snijders of Oxford, called the mathematical model used by Christakis and Fowler “not coherent.” And just a few days ago, Cosma Shalizi, a statistician at Carnegie Mellon, declared, “I agree with pretty much everything Snijders says.”

[div class=attrib]More from theSource here.[end-div]