Category Archives: Social Sciences

Reading Makes You A Better Person

Scientists have finally learned what book lovers have known for some time — reading fiction makes you a better person.

From Reader’s Digest:

Anyone who reads understands the bittersweet feeling of finishing a good book. It’s as if a beloved friend has suddenly packed her things and parted, the back cover swinging closed like a taxicab door. Farewell, friend. See you on the shelf.

If you’ve ever felt weird for considering fictional characters your friends or fictional places your home, science says you no longer have to. A new body of research is emerging to explain how books have such a powerful emotional pull on us, and the answer du jour is surprising—when we step into a fictional world, we treat the experiences as if they were real. Adding to the endless list of reading benefits is this: Reading fiction literally makes you more empathetic in real life.

Not all fiction is created equal, though—and reading a single chapter of Harry Potter isn’t an instant emotion-enhancer. Here are a few key caveats from the nerdy scientists trying to figure out why reading rules.

Rule #1: The story has to “take you somewhere.”

How many times have you heard someone declare that a good book “transports” you? That immersive power that allows readers to happily inhabit other people, places, and points of view for hours at a time is precisely what a team of researchers in the Netherlands credit for the results of a 2013 study in which students asked to read an Arthur Conan Doyle mystery showed a marked increase in empathy one week later, while students tasked with reading a sampling of news articles showed a decline.

Read the entire article here.

What Up With That: Nationalism

The recent political earthquake in the US is just one example of a nationalistic wave that swept across Western democracies in 2015-2016. The election result seemed to surprise many political talking heads, since the nation was, and still is, on a continuing path towards greater liberalism (mostly due to demographics).

So, what exactly is up with that? Can American liberals enter a coma for the next 4 years, sure to awaken refreshed and ready for a new left-of-center regime? Or, is the current nationalistic mood — albeit courtesy of a large minority — likely to prevail for a while longer? Well, there’s no clear answer, and political scientists and researchers are baffled.

Care to learn more about theories of nationalism and its historical underpinnings? Visit my reading list over at Goodreads. But make sure you start with Imagined Communities: Reflections on the Origin and Spread of Nationalism by Benedict Anderson. It has been the definitive work on the analysis of nationalism since it was first published in 1983.

I tend to agree with Anderson’s thesis, that a nation is mostly a collective figment of people’s imagination facilitated by modern communications networks. So, I have to believe that eventually our networks will help us overcome the false strictures of our many national walls and borders.

From Scientific American:

Waves of nationalist sentiment are reshaping the politics of Western democracies in unexpected ways — carrying Donald Trump to a surprise victory last month in the US presidential election, and pushing the United Kingdom to vote in June to exit the European Union. And nationalist parties are rising in popularity across Europe.

Many economists see this political shift as a consequence of globalization and technological innovation over the past quarter of a century, which have eliminated many jobs in the West. And political scientists are tracing the influence of cultural tensions arising from immigration and from ethnic, racial and sexual diversity. But researchers are struggling to understand why these disparate forces have combined to drive an unpredictable brand of populist politics.

“We have to start worrying about the stability of our democracies,” says Yascha Mounk, a political scientist at Harvard University in Cambridge, Massachusetts. He notes that the long-running World Values Survey shows that people are increasingly disaffected with their governments — and more willing to support authoritarian leaders.

Some academics have explored potential parallels between the roots of the current global political shift and the rise of populism during the Great Depression, including in Nazi Germany. But Helmut Anheier, president of the Hertie School of Governance in Berlin, cautions that the economic struggles of middle-class citizens across the West today are very different, particularly in mainland Europe.

The Nazis took advantage of the extreme economic hardship that followed the First World War and a global depression, but today’s populist movements are growing powerful in wealthy European countries with strong social programmes. “What brings about a right-wing movement when there are no good reasons for it?” Anheier asks.

In the United States, some have suggested that racism motivated a significant number of Trump voters. But that is too simplistic an explanation, says Theda Skocpol, a sociologist at Harvard University.  “Trump dominated the news for more than a year, and did so with provocative statements that were meant to exacerbate every tension in the US,” she says.

Read the entire story here.

p.s. What Up With That is my homage to the recurring Saturday Night Live (SNL) sketch of the same name.

Surplus Humans and the Death of Work


It’s a simple equation: too many humans, not enough work. Low-paying, physical jobs continue to disappear, replaced by mechanization. Even cognitive work, the kind that requires thinking rather than lifting, is increasingly likely to be automated and robotized. This has complex and dire consequences, not just global economic ramifications but moral ones as well. What are we to make of ourselves, and of a culture that has intimately linked work with meaning, when the work is outsourced or eliminated entirely?

A striking example comes from the richest country in the world — the United States. Recently, and anomalously, life expectancy has declined among white people in economically depressed areas of the nation. Many economists suggest that the quest for ever-increasing productivity — usually delivered through automation — is chipping away at the very essence of what it means to be human: finding value and purpose through work.

James Livingston, professor of history at Rutgers University, summarizes the existential dilemma, excerpted below, in his latest book No More Work: Why Full Employment Is a Bad Idea.

From aeon:

Work means everything to us Americans. For centuries – since, say, 1650 – we’ve believed that it builds character (punctuality, initiative, honesty, self-discipline, and so forth). We’ve also believed that the market in labour, where we go to find work, has been relatively efficient in allocating opportunities and incomes. And we’ve believed that, even if it sucks, a job gives meaning, purpose and structure to our everyday lives – at any rate, we’re pretty sure that it gets us out of bed, pays the bills, makes us feel responsible, and keeps us away from daytime TV.

These beliefs are no longer plausible. In fact, they’ve become ridiculous, because there’s not enough work to go around, and what there is of it won’t pay the bills – unless of course you’ve landed a job as a drug dealer or a Wall Street banker, becoming a gangster either way.

These days, everybody from Left to Right – from the economist Dean Baker to the social scientist Arthur C Brooks, from Bernie Sanders to Donald Trump – addresses this breakdown of the labour market by advocating ‘full employment’, as if having a job is self-evidently a good thing, no matter how dangerous, demanding or demeaning it is. But ‘full employment’ is not the way to restore our faith in hard work, or in playing by the rules, or in whatever else sounds good. The official unemployment rate in the United States is already below 6 per cent, which is pretty close to what economists used to call ‘full employment’, but income inequality hasn’t changed a bit. Shitty jobs for everyone won’t solve any social problems we now face.

Don’t take my word for it, look at the numbers. Already a fourth of the adults actually employed in the US are paid wages lower than would lift them above the official poverty line – and so a fifth of American children live in poverty. Almost half of employed adults in this country are eligible for food stamps (most of those who are eligible don’t apply). The market in labour has broken down, along with most others.

Those jobs that disappeared in the Great Recession just aren’t coming back, regardless of what the unemployment rate tells you – the net gain in jobs since 2000 still stands at zero – and if they do return from the dead, they’ll be zombies, those contingent, part-time or minimum-wage jobs where the bosses shuffle your shift from week to week: welcome to Wal-Mart, where food stamps are a benefit.

Read the entire essay here.

Image: Detroit Industry North Wall, Diego Rivera. Courtesy: Detroit Institute of Arts. Wikipedia.

You’re Not In Control


Press a button, then something happens. Eat too much chocolate, then you feel great (and then put on weight). Step into the middle of a busy road, then you get hit by an oncoming car. Walk in the rain, then you get wet. Watch your favorite comedy show, then you laugh.

Every moment of our lives is filled with actions and consequences, causes and effects. Usually we have a good sense of what is likely to happen when we take a specific action. This sense of predictability smooths our lives and makes us feel in control.

But sometimes all is not what it seems. Take the buttons on some of the most actively used objects in our daily lives. Press the “close door” button on the elevator [or “lift” for my British readers], then the door closes, right? Press the “pedestrian crossing” button at the crosswalk [or “zebra crossing”], then the safe-to-cross signal blinks to life, right? Adjust the office thermostat, then you feel more comfortable, right?

Well, if you think that by pressing a button you are commanding the elevator door to close, or the crosswalk signal to flash, or the thermostat to change the office temperature, you’re probably wrong. You may feel in control, but actually you’re not. In many cases the button serves no functional purpose; the system just works automatically. But the button still serves a psychological purpose — a placebo-like effect. We are so conditioned to the notion that pressing a button yields an action that we still feel in control even when the button does nothing beyond making an audible click.

From the NYT:

Pressing the door-close button on an elevator might make you feel better, but it will do nothing to hasten your trip.

Karen W. Penafiel, executive director of National Elevator Industry Inc., a trade group, said the close-door feature faded into obsolescence a few years after the enactment of the Americans With Disabilities Act in 1990.

The legislation required that elevator doors remain open long enough for anyone who uses crutches, a cane or wheelchair to get on board, Ms. Penafiel said in an interview on Tuesday. “The riding public would not be able to make those doors close any faster,” she said.

The buttons can be operated by firefighters and maintenance workers who have the proper keys or codes.

No figures were available for the number of elevators still in operation with functioning door-close buttons. Given that the estimated useful life of an elevator is 25 years, it is likely that most elevators in service today have been modernized or refurbished, rendering the door-close buttons a thing of the past for riders, Ms. Penafiel said.

Read the entire story here.

Image: Elevator control panel, cropped to show only dual “door open” and “door close” buttons. Courtesy: Nils R. Barth. Wikipedia. Creative Commons CC0 1.0 Universal Public Domain Dedication.

Morality and a Second Language

Frequent readers will know that I’m intrigued by social science research into the human condition. Well, this collection of studies is fascinating. To summarize the general finding: you judge morally questionable behavior less harshly if you happen to be thinking in an acquired, second language. Put another way, you are more moral when you think in your mother tongue.

Perhaps counter-intuitively, a moral judgement made in a foreign language requires more deliberate cognitive processing than one made in the language of childhood, and that extra effort blunts the automatic emotional response that normally fuels harsh condemnation. Consequently, dubious or reprehensible behavior is likely to be judged less wrong in a foreign language than when it is evaluated in one’s native tongue.

I suppose there is a very valuable lesson here: if you plan to do some shoplifting or rob a bank then you should evaluate the pros and cons of your criminal enterprise in the second language that you learned in school.

From Scientific American:

What defines who we are? Our habits? Our aesthetic tastes? Our memories? If pressed, I would answer that if there is any part of me that sits at my core, that is an essential part of who I am, then surely it must be my moral center, my deep-seated sense of right and wrong.

And yet, like many other people who speak more than one language, I often have the sense that I’m a slightly different person in each of my languages—more assertive in English, more relaxed in French, more sentimental in Czech. Is it possible that, along with these differences, my moral compass also points in somewhat different directions depending on the language I’m using at the time?

Psychologists who study moral judgments have become very interested in this question. Several recent studies have focused on how people think about ethics in a non-native language—as might take place, for example, among a group of delegates at the United Nations using a lingua franca to hash out a resolution. The findings suggest that when people are confronted with moral dilemmas, they do indeed respond differently when considering them in a foreign language than when using their native tongue.

In a 2014 paper led by Albert Costa, volunteers were presented with a moral dilemma known as the “trolley problem”: imagine that a runaway trolley is careening toward a group of five people standing on the tracks, unable to move. You are next to a switch that can shift the trolley to a different set of tracks, thereby sparing the five people, but resulting in the death of one who is standing on the side tracks. Do you pull the switch?

Most people agree that they would. But what if the only way to stop the trolley is by pushing a large stranger off a footbridge into its path? People tend to be very reluctant to say they would do this, even though in both scenarios, one person is sacrificed to save five. But Costa and his colleagues found that posing the dilemma in a language that volunteers had learned as a foreign tongue dramatically increased their stated willingness to shove the sacrificial person off the footbridge, from fewer than 20% of respondents working in their native language to about 50% of those using the foreign one. (Both native Spanish- and English-speakers were included, with English and Spanish as their respective foreign languages; the results were the same for both groups, showing that the effect was about using a foreign language, and not about which particular language—English or Spanish—was used.)

Using a very different experimental setup, Janet Geipel and her colleagues also found that using a foreign language shifted their participants’ moral verdicts. In their study, volunteers read descriptions of acts that appeared to harm no one, but that many people find morally reprehensible—for example, stories in which siblings enjoyed entirely consensual and safe sex, or someone cooked and ate his dog after it had been killed by a car. Those who read the stories in a foreign language (either English or Italian) judged these actions to be less wrong than those who read them in their native tongue.

Read the entire article here.

The Conspiracy of Disbelief

Faux news and hoaxes are a staple of our culture. I suspect that disinformation, fabrications and lies have been around since our ancestors first learned to walk on their hind legs. Researchers know that lying serves a critical personal and social function; white lies help hide discomfort and often strengthen bonds with partners and peers. Broader and deeper lies are often used to build and maintain power and to project strength over others. Indeed, some nations rise and fall based on the quality of their falsehoods and propaganda.

The rise of the internet and social media over the last couple of decades has amplified the problem to such an extent that it becomes ever more challenging to separate fact from fiction. Indeed, entire highly profitable industries are built on feeding misinformation and disseminating hoaxes. But while many of us laugh at and dismiss National Enquirer front-page headlines proclaiming “aliens abducted my neighbor”, other forms of fiction are much more sinister. One example is the Sandy Hook mass shooting, where a significant number of paranoid conspiracy theorists continue to maintain to this day — almost 4 years on — that the massacre of 20 elementary school children and 6 adults was a carefully fabricated hoax.

From NY Magazine:

On December 14, 2012, Lenny Pozner dropped off his three children, Sophia, Arielle, and Noah, at Sandy Hook Elementary School in Newtown, Connecticut. Noah had recently turned 6, and on the drive over they listened to his favorite song, “Gangnam Style,” for what turned out to be the last time. Half an hour later, while Sophia and Arielle hid nearby, Adam Lanza walked into Noah’s first-grade class with an AR-15 rifle. Noah was the youngest of the 20 children and seven adults killed in one of the deadliest shootings in American history. When the medical examiner found Noah lying face up in a Batman sweatshirt, his jaw had been blown off. Lenny and his wife, Veronique, raced to the school as soon as they heard the news, but had to wait for hours alongside other parents to learn their son’s fate.

It didn’t take much longer for Pozner to find out that many people didn’t believe his son had died or even that he had lived at all. Days after the rampage, a man walked around Newtown filming a video in which he declared that the massacre had been staged by “some sort of New World Order global elitists” intent on taking away our guns and our liberty. A week later, James Tracy, a professor at Florida Atlantic University, wrote a blog post expressing doubts about the massacre. By January, a 30-minute YouTube video, titled “The Sandy Hook Shooting — Fully Exposed,” which asked questions like “Wouldn’t frantic kids be a difficult target to hit?,” had been viewed more than 10 million times.

As the families grieved, conspiracy theorists began to press their case in ways that Newtown couldn’t avoid. State officials received anonymous phone calls at their homes, late at night, demanding answers: Why were there no trauma helicopters? What happened to the initial reports of a second shooter? A Virginia man stole playground signs memorializing two of the victims, then called their parents to say that the burglary shouldn’t affect them, since their children had never existed. At one point, Lenny Pozner was checking into a hotel out of town when the clerk looked up from the address on his driver’s license and said, “Oh, Sandy Hook — the government did that.” Pozner had tried his best to ignore the conspiracies, but eventually they disrupted his grieving process so much that he could no longer turn a blind eye. “Conspiracy theorists erase the human aspect of history,” Pozner said this summer. “My child — who lived, who was a real person — is basically going to be erased.”

Read the entire disturbing story here.

Intolerance and Divine Revelation


Another day, another heinous, murderous act in the name of religion — this time a French priest killed in his own church by a pair shouting “Allahu akbar!” (To be fair, countless similar acts occur on a daily basis in non-Western nations, but go unreported or under-reported in the mainstream media.)

Understandably, local and national religious leaders decry these acts as an evil perversion of the Islamic faith. Now, I’d be the first to admit that attributing such horrendous crimes solely to the faiths of the perpetrators is a rather simplistic rationalization. Other factors, such as political disenfranchisement, (perceived) oppression, historical persecution and economic pressures, surely play a triggering and/or catalytic role.

Yet, as Gary Gutting, professor of philosophy at the University of Notre Dame, reminds us in another of his insightful essays, religious intolerance is a fundamental component. The three main Abrahamic religions — Judaism, Christianity and Islam — are revelatory faiths. Their teachings are each held to be incontrovertible truth revealed to us by an omniscient God (or a divine messenger). Strict adherence to these beliefs has throughout history led many believers — of all faiths — to act out their intolerance in sometimes very violent ways. Over time, numerous socio-economic pressures have generally softened this intolerance — but not equally across the three faiths.

From NYT:

Both Islam and Christianity claim to be revealed religions, holding that their teachings are truths that God himself has conveyed to us and wants everyone to accept. They were, from the start, missionary religions. A religion charged with bringing God’s truth to the world faces the question of how to deal with people who refuse to accept it. To what extent should it tolerate religious error? At certain points in their histories, both Christianity and Islam have been intolerant of other religions, often of each other, even to the point of violence.

This was not inevitable, but neither was it an accident. The potential for intolerance lies in the logic of religions like Christianity and Islam that say their teachings derive from a divine revelation. For them, the truth that God has revealed is the most important truth there is; therefore, denying or doubting this truth is extremely dangerous, both for nonbelievers, who lack this essential truth, and for believers, who may well be misled by the denials and doubts of nonbelievers. Given these assumptions, it’s easy to conclude that even extreme steps are warranted to eliminate nonbelief.

You may object that moral considerations should limit our opposition to nonbelief. Don’t people have a human right to follow their conscience and worship as they think they should? Here we reach a crux for those who adhere to a revealed religion. They can either accept ordinary human standards of morality as a limit on how they interpret divine teachings, or they can insist on total fidelity to what they see as God’s revelation, even when it contradicts ordinary human standards. Those who follow the second view insist that divine truth utterly exceeds human understanding, which is in no position to judge it. God reveals things to us precisely because they are truths we would never arrive at by our natural lights. When the omniscient God has spoken, we can only obey.

For those holding this view, no secular considerations, not even appeals to conventional morality or to practical common sense, can overturn a religious conviction that false beliefs are intolerable. Christianity itself has a long history of such intolerance, including persecution of Jews, crusades against Muslims, and the Thirty Years’ War, in which religious and nationalist rivalries combined to devastate Central Europe. This devastation initiated a move toward tolerance among nations that came to see the folly of trying to impose their religions on foreigners. But intolerance of internal dissidents — Catholics, Jews, rival Protestant sects — continued even into the 19th century. (It’s worth noting that in this period the Muslim Ottoman Empire was in many ways more tolerant than most Christian countries.) But Christians eventually embraced tolerance through a long and complex historical process.

Critiques of Christian revelation by Enlightenment thinkers like Voltaire, Rousseau and Hume raised serious questions that made non-Christian religions — and eventually even rejections of religion — intellectually respectable. Social and economic changes — including capitalist economies, technological innovations, and democratic political movements — undermined the social structures that had sustained traditional religion.

The eventual result was a widespread attitude of religious toleration in Europe and the United States. This attitude represented ethical progress, but it implied that religious truth was not so important that its denial was intolerable. Religious beliefs and practices came to be regarded as only expressions of personal convictions, not to be endorsed or enforced by state authority. This in effect subordinated the value of religious faith to the value of peace in a secular society. Today, almost all Christians are reconciled to this revision, and many would even claim that it better reflects the true meaning of their religion.

The same is not true of Muslims. A minority of Muslim nations have a high level of religious toleration; for example Albania, Kosovo, Senegal and Sierra Leone. But a majority — including Saudi Arabia, Iran, Pakistan, Iraq and Malaysia — maintain strong restrictions on non-Muslim (and in some cases certain “heretical” Muslim) beliefs and practices. Although many Muslims think God’s will requires tolerance of false religious views, many do not.

Read the entire story here.

Image: D.W. Griffith’s Intolerance (1916) movie poster. Courtesy: Sailko / Dekkappai at Wikipedia. Public Domain.

Fish Roasts Human: Don’t Read It, Share It


Interestingly enough, though perhaps not surprisingly, people on social media share news stories rather than read them. At first glance this seems rather perplexing: after all, why would you tweet or re-tweet or like or share a news item before actually reading and understanding it?

Arnaud Legout, co-author of a recent study out of Columbia University and the French national research institute Inria, tells us that “People form an opinion based on a summary, or summary of summaries, without making the effort to go deeper.” More confusingly, he adds, “Our results show that sharing content and actually reading it are poorly correlated.”

Please take 8 seconds or more to mull over this last statement again:

Our results show that sharing content and actually reading it are poorly correlated.

Without doubt, our new technological platforms and social media have upended traditional journalism. But, in light of this unnerving finding, I have to wonder whether it spells the eventual and complete collapse of deep, analytical, investigative journalism and the replacement of thoughtful reflection with “NationalEnquirerThink”.

Perhaps I’m reading too much into the findings, but it does seem that it is more important for social media users to bond with and seek affirmation from their followers than it is to be personally informed.

With the average human attention span now down to 8 seconds, our literary and contemplative future seems to belong safely in the fins of our cousin, the goldfish (attention span: 9 seconds).

Learn more about Arnaud Legout’s disturbing study here.

Image: Common Goldfish. Courtesy: Wikipedia. Public Domain.

Are You Monotasking or Just Paying Attention?

We have indeed reached the era of peak multi-tasking. It’s time to select a different corporate meme.

Study after recent study shows that multi-tasking is an illusion — we can’t perform two or more cognitive tasks in parallel. Rather, we timeshare: switching our attention from one task to another in sequence. These studies also show that dividing our attention in this way tends to have a deleterious effect on all of the tasks. I say cognitive tasks because it’s rather obvious that we can all perform some tasks at the same time: walk and chew gum (or thumb a smartphone); drive and sing; shower and think; read and eat. But all of these combinations require that one of the tasks be mostly autonomic. That is, we perform it without conscious effort.

Yet more social scientists have determined that multi-tasking is a fraud — perhaps perpetuated by corporate industrial engineers convinced that they can wring more hours of work from you.

What are we to do now, having learned that our super-efficient world of juggling numerous tasks at the “same time” is nothing but a mirage?

Well, observers of the fragile human condition have not rested. This time social scientists have discovered an amazing human talent. And they’ve coined a mesmerizing new term, known as monotasking. In some circles it’s called uni-tasking or single-tasking.

When I was growing up this was called “paying attention”.

But, this being the era of self-help-life-experience-consulting gone mad and sub-minute attention spans (fueled by multi-tasking), we can now all eagerly await the rise of an entirely new industry dedicated to this wonderful monotasking breakthrough. Expect a whole host of monotasking books, buzzworthy news articles, daytime TV shows with monotasking tips and personal coaching experts at TED events armed with “look what monotasking can do for you” PowerPoint decks.

Personally, I will quietly retreat, and return to old-school staying focused, and remind my kids to do the same.

From NYT:

Stop what you’re doing.

Well, keep reading. Just stop everything else that you’re doing.

Mute your music. Turn off your television. Put down your sandwich and ignore that text message. While you’re at it, put your phone away entirely. (Unless you’re reading this on your phone. In which case, don’t. But the other rules still apply.)

Just read.

You are now monotasking.

Maybe this doesn’t feel like a big deal. Doing one thing at a time isn’t a new idea.

Indeed, multitasking, that bulwark of anemic résumés everywhere, has come under fire in recent years. A 2014 study in the Journal of Experimental Psychology found that interruptions as brief as two to three seconds — which is to say, less than the amount of time it would take you to toggle from this article to your email and back again — were enough to double the number of errors participants made in an assigned task.

Earlier research out of Stanford revealed that self-identified “high media multitaskers” are actually more easily distracted than those who limit their time toggling.

So, in layman’s terms, by doing more you’re getting less done.

But monotasking, also referred to as single-tasking or unitasking, isn’t just about getting things done.

Not the same as mindfulness, which focuses on emotional awareness, monotasking is a 21st-century term for what your high school English teacher probably just called “paying attention.”

“It’s a digital literacy skill,” said Manoush Zomorodi, the host and managing editor of WNYC Studios’ “Note to Self” podcast, which recently offered a weeklong interactive series called Infomagical, addressing the effects of information overload. “Our gadgets and all the things we look at on them are designed to not let us single-task. We weren’t talking about this before because we simply weren’t as distracted.”


Ms. Zomorodi prefers the term “single-tasking”: “ ‘Monotasking’ seemed boring to me. It sounds like ‘monotonous.’ ”

Kelly McGonigal, a psychologist, lecturer at Stanford and the author of “The Willpower Instinct,” believes that monotasking is “something that needs to be practiced.” She said: “It’s an important ability and a form of self-awareness as opposed to a cognitive limitation.”

Read the entire article here.

Image courtesy of Google Search.

Achieving Failure

Our society values success.

Our work environments value triumphing over the competition. We look to our investments to beat the market. We support our favorite teams, but adore them when they trounce their rivals. Our schools and colleges (mostly) help educate our children, but do so in a way that rewards success — good grades, good test scores and good behavior (as in, same as everyone else). We continually reward our kids for success on a task, at school, with a team.

Yet, all of us know, in our hearts and the back of our minds, that the most important lessons and trials stem from failure — not success. From failure we learn to persevere, we learn to change and adapt, we learn to overcome. From failure we learn to avoid, or tackle obstacles head on; we learn to reassess and reevaluate. We evolve from our failures.

So this raises the question: why are so many of our processes and systems geared solely to rewarding and reinforcing success?

From NPR:

Is failure a positive opportunity to learn and grow, or is it a negative experience that hinders success? How parents answer that question has a big influence on how much children think they can improve their intelligence through hard work, a study says.

“Parents are a really critical force in child development when you think about how motivation and mindsets develop,” says Kyla Haimovitz, a professor of psychology at Stanford University. She coauthored the study, published in Psychological Science with colleague Carol Dweck, who pioneered research on mindsets. “Parents have this powerful effect really early on and throughout childhood to send messages about what is failure, how to respond to it.”

Although there’s been a lot of research on how these forces play out, relatively little looks at what parents can do to motivate their kids in school, Haimovitz says. This study begins filling that gap.

“There is a fair amount of evidence showing that when children view their abilities as more malleable and something they can change over time, then they deal with obstacles in a more constructive way,” says Gail Heyman, a professor of psychology at the University of California at San Diego who was not involved in this study.

But communicating that message to children is not simple.

“Parents need to represent this to their kids in the ways they react about their kids’ failures and setbacks,” Haimovitz says. “We need to really think about what’s visible to the other person, what message I’m sending in terms of my words and my deeds.”

In other words, if a child comes home with a D on a math test, how a parent responds will influence how the child perceives their own ability to learn math. Even a well-intentioned, comforting response of “It’s OK, you’re still a great writer” may send the message that it’s time to give up on math rather than learn from the problems they got wrong, Haimovitz explains.

Read the entire story here.

Practice May Make You Perfect, But Not Creative

Practice will help you improve in a field with well-defined and well-developed tasks, processes and rules. This includes areas like sports and musicianship. Though, keep in mind that it may indeed take some accident of genetics to be really good at one of these disciplines in the first place.

But, don’t expect practice to make you better in all areas of life, particularly in creative endeavors. Creativity stems from original thought, not replicable behavior. Scott Kaufman, director of the Imagination Institute at the University of Pennsylvania, reminds us of this in a recent book review. The authors of Peak: Secrets from the New Science of Expertise, psychologist Anders Ericsson and journalist Robert Pool, review a swath of research on human learning and skill acquisition and conclude that deliberate, well-structured practice can help anyone master new skills. I think we can all agree with this conclusion.

But, like Kaufman, I believe that many creative “skills” lie in an area of human endeavor that is firmly beyond the reach of practice. Most certainly, practice will help an artist hone and improve her brushstrokes; but practice alone will not bring forth her masterpiece. So, here is a brief summary of 12 key elements that Kaufman distilled from over 50 years of research studies into creativity:

Excerpts from Creativity Is Much More Than 10,000 Hours of Deliberate Practice by Scott Kaufman:

  1. Creativity is often blind. If only creativity was all about deliberate practice… in reality, it’s impossible for creators to know completely whether their new idea or product will be well received.
  2. Creative people often have messy processes. While expertise is characterized by consistency and reliability, creativity is characterized by many false starts and lots and lots of trial-and-error.
  3. Creators rarely receive helpful feedback. When creators put something novel out into the world, the reactions are typically either acclaim or rejection
  4. The “10-Year Rule” is not a rule. The idea that it takes 10 years to become a world-class expert in any domain is not a rule. [This is the so-called Ericsson rule from his original paper on deliberate practice amongst musicians.]
  5. Talent is relevant to creative accomplishment. If we define talent as simply the rate at which a person acquires expertise, then talent undeniably matters for creativity.
  6. Personality is relevant. Not only does the speed of expertise acquisition matter, but so do a whole host of other traits. People differ from one another in a multitude of ways… At the very least, research has shown that creative people do tend to have a greater inclination toward nonconformity, unconventionality, independence, openness to experience, ego strength, risk taking, and even mild forms of psychopathology.
  7. Genes are relevant. [M]odern behavioral genetics has discovered that virtually every single psychological trait — including the inclination and willingness to practice — is influenced by innate genetic endowment.
  8. Environmental experiences also matter. [R]esearchers have found that many other environmental experiences substantially affect creativity– including socioeconomic origins, and the sociocultural, political, and economic context in which one is raised.
  9. Creative people have broad interests. While the deliberate practice approach tends to focus on highly specialized training… creative experts tend to have broader interests and greater versatility compared to their less creative expert colleagues.
  10. Too much expertise can be detrimental to creative greatness. The deliberate practice approach assumes that performance is a linear function of practice. Some knowledge is good, but too much knowledge can impair flexibility.
  11. Outsiders often have a creative advantage. If creativity were all about deliberate practice, then outsiders who lack the requisite expertise shouldn’t be very creative. But many highly innovative individuals were outsiders to the field in which they contributed. Many marginalized people throughout history — including immigrants — came up with highly creative ideas not in spite of their experiences as an outsider, but because of their experiences as an outsider.
  12. Sometimes the creator needs to create a new path for others to deliberately practice. Creative people are not just good at solving problems, however. They are also good at finding problems.

In my view the most salient of Kaufman’s dozen ingredients for creativity are #11 and #12 — and I can personally attest to their importance: fresh ideas are more likely to come from outsiders, and creativity in one domain often stems from experiences in another, unrelated realm.

Read Kaufman’s enlightening article in full here.

Dishonesty and Intelligence

Another day, another study. This time it’s one that links honesty and intelligence. Apparently, the more intelligent you are — as measured by a quick intelligence test — the less likely you are to lie. Fascinatingly, the study also shows that those in the most intelligent subgroup who do lie tell smaller whoppers; people in the less intelligent subgroup tell bigger lies, for a bigger payoff.

From Washington Post:

Last summer, a couple of researchers ran a funny experiment about honesty. They went to an Israeli shopping mall and recruited people, one-by-one, into a private booth. Alone inside the booth, each subject rolled a six-sided die. Then they stepped out and reported the number that came up.

There was an incentive to lie. The higher the number, the more money people received. If they rolled a one, they got a bonus of about $2.50. If they rolled a two, they got a bonus of $5, and so on. If they rolled a six, the bonus was about $15. (Everyone also received $5 just for participating.)

Before I reveal the results, think about what you would do in that situation. Someone comes up to you at the mall and offers you free money to roll a die. If you wanted to make a few extra bucks, you could lie about what you rolled. Nobody would know, and nobody would be harmed.

Imagine you went into that booth and rolled a 1. What would you do? Would you be dishonest? Would you say you rolled a six, just to get the largest payout?

The researchers, Bradley Ruffle of Wilfrid Laurier University and Yossef Tobol, of the Jerusalem College of Technology, wanted to know what kinds of people would lie in this situation. So they asked everyone about their backgrounds, whether they considered themselves honest, whether they thought honesty was important. They asked whether people were employed, how much money they earned, and whether they were religious. They also gave people a quick intelligence test.

Out of all those attributes, brainpower stood out. Smarter people were less likely to lie about the number they rolled.

It didn’t matter whether they claimed they were honest or not; it didn’t matter whether they were religious, whether they were male or female, or whether they lived in a city. Money didn’t seem to be a factor either. Even after controlling for incomes, the researchers found that the most honest people were the ones who scored highest on the intelligence test.

Read the entire article here.
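
Because each roll happens in private, no individual lie can ever be observed directly. What studies like this rely on instead is that a fair die has a known distribution, so a group’s reports can be compared against the 3.5 average (and one-in-six rate of sixes) that honesty would produce. Below is a minimal simulation sketch of that group-level logic. It is not the researchers’ code; the group size and the simple “report a six instead” lying rule are assumptions invented purely for illustration.

    import random

    # Illustrative sketch only -- not the researchers' code. The group size and
    # the lying rule below are made-up assumptions for demonstration.
    ROLLS_PER_GROUP = 10_000

    def reported_rolls(lie_probability: float) -> list[int]:
        """Simulate private die rolls where some subjects report a six instead of the truth."""
        reports = []
        for _ in range(ROLLS_PER_GROUP):
            actual = random.randint(1, 6)
            if random.random() < lie_probability and actual < 6:
                reports.append(6)       # dishonest report: claim the maximum payout
            else:
                reports.append(actual)  # honest report
        return reports

    random.seed(0)
    for label, lie_prob in [("honest group", 0.0), ("20% liars", 0.2)]:
        reports = reported_rolls(lie_prob)
        mean_report = sum(reports) / len(reports)
        share_of_sixes = reports.count(6) / len(reports)
        # A mean well above the fair-die expectation of 3.5, or far more than
        # one six in every six reports, signals lying at the group level.
        print(f"{label}: mean report = {mean_report:.2f}, share of sixes = {share_of_sixes:.2%}")

An inflated mean and an excess of sixes are exactly the kind of group-level fingerprint that lets researchers compare subgroups (here, by intelligence score) without ever knowing which individual lied.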

World Happiness Ranking


Yet again, nations in the northern latitudes outrank all others on this year’s global happiness scale. Not surprisingly, Denmark topped the happiness list in 2015, having held the top spot since 2012, except in 2014 when it was pipped by Switzerland. The top 5 for 2015 are: Denmark, Iceland, Norway, Finland, and Canada.

The report finds that the happiest nations tend to be those with lower income disparity and strong national health and social safety programs. Ironically, richer nations, including the United States, tend to rank lower due to rising inequalities in income, wealth and health.

That said, the United States moved to No. 13, up two places from No. 15 the previous year. This is rather perplexing considering all the anger that we’re hearing about during the relentless 2016 presidential election campaign.

At the bottom of the list of 157 nations is Burundi, recently torn by a violent political upheaval. The bottom five nations for 2015 are: Benin, Afghanistan, Togo, Syria and Burundi; all have recently suffered from war or disease or both.

The happiness score for each nation is based on national survey data, and the report uses six key measures to explain the differences between countries: GDP per capita; social support; healthy life expectancy; freedom to make life choices; generosity; and perceptions of corruption.

The World Happiness Report was prepared by the Sustainable Development Solutions Network, an international group of social scientists and public health experts under the auspices of the United Nations.

Read more on the report here.

Image: Top 30 nations ranked for happiness, screenshot. Courtesy: World Happiness Report, The Distribution of World Happiness, by John F. Helliwell, Canadian Institute for Advanced Research and Vancouver School of Economics, University of British Columbia; Haifang Huang, Department of Economics, University of Alberta; Shun Wang, KDI School of Public Policy and Management, South Korea.

Bad Behavior Goes Viral

Social psychologists often point out how human behavior is contagious. Laugh and others will join in. Yawn and all those around you will yawn as well. In a bad mood at home? Well, soon, chances are that the rest of your family will join you on a downer too.

And the contagion doesn’t end there, especially with negative behaviors; study after study shows the viral spread of suicide, product tampering, rioting, looting, speeding and even aircraft hijacking. Mass shootings spread the same way. Since the United States is a leading venue for mass shootings, there is now even a term for a shooting that follows soon after another — an echo shooting.
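
To make the contagion framing concrete, here is a minimal, illustrative sketch of a self-exciting process, the general family of models used to test for this kind of temporal clustering. It is not the study’s actual model; the baseline probability and the fixed excitation boost are invented assumptions, and only the 13-day window is borrowed from the finding described below.

    import random

    # Toy self-exciting ("contagion") process. All probabilities are invented for
    # demonstration; only the 13-day window reflects the study's reported finding.
    BASELINE_DAILY_PROB = 0.01  # hypothetical background chance of an incident per day
    EXCITATION_BOOST = 0.05     # hypothetical extra probability inside the contagion window
    CONTAGION_WINDOW = 13       # days of elevated risk after an incident
    DAYS = 3650                 # simulate roughly ten years

    def simulate(contagious: bool) -> list[int]:
        """Return the days on which incidents occur in one simulated run."""
        incidents = []
        for day in range(DAYS):
            prob = BASELINE_DAILY_PROB
            if contagious and any(day - d <= CONTAGION_WINDOW for d in incidents):
                prob += EXCITATION_BOOST
            if random.random() < prob:
                incidents.append(day)
        return incidents

    def echo_fraction(incidents: list[int]) -> float:
        """Fraction of incidents falling within the window of an earlier incident."""
        echoes = sum(
            1 for i, day in enumerate(incidents)
            if any(0 < day - prev <= CONTAGION_WINDOW for prev in incidents[:i])
        )
        return echoes / len(incidents) if incidents else 0.0

    random.seed(42)
    print("echo fraction, no contagion:  ", round(echo_fraction(simulate(False)), 3))
    print("echo fraction, with contagion:", round(echo_fraction(simulate(True)), 3))

Comparing the two runs shows the signature researchers look for in real data: when excitation is present, a noticeably larger share of incidents falls within the window of a previous one, which is the echo effect described above.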

From the Washington Post:

A man had just gone on a shooting rampage in Kalamazoo, Mich., allegedly killing six people while driving for Uber. Sherry Towers, an Arizona State University physicist who studies how viruses spread, worried while watching the news coverage.

Last year, Towers published a study using mathematical models to examine whether mass shootings, like viruses, are contagious. She identified a 13-day period after high-profile mass shootings when the chance of another spikes. Her findings are confirmed more frequently than she would like.

Five days after Kalamazoo, a man in Kansas shot 17 people, killing three by firing from his car. To Towers, that next shooting seemed almost inevitable.

“I absolutely dread watching this happen,” she said.

As the nation endures an ongoing stream of mass shootings, criminologists, police and even the FBI are turning to virus epidemiology and behavioral psychology to understand what sets off mass shooters and figure out whether, as with the flu, the spread can be interrupted.

“These things are clustering in time, and one is causing the next one to be more likely,” said Gary Slutkin, a physician and epidemiologist at the University of Illinois at Chicago who runs Cure Violence, a group that treats crime as a disease. “That’s definitional of a contagious disease. Flu is a risk factor for more flu. Mass shootings are a risk factor for mass shootings.”

The idea is not without skeptics. James Alan Fox, a Northeastern University professor who studies mass shootings, said: “Some bunching just happens. Yes, there is some mimicking going on, but the vast majority of mass killers don’t need someone else to give them the idea.”

Confirming, disputing or further exploring the idea scientifically is hampered by the federal funding ban on gun violence research. Towers and her colleagues did their study on their own time. And there’s not even a common database or definition of mass shootings.

The Congressional Research Service uses the term “public mass shootings” to describe the killing of four or more people in “relatively public places” by a perpetrator selecting victims “somewhat indiscriminately.”

In the 1980s, the violence occurred in post offices. In the 1990s, schools. Now it is mutating into new forms, such as the terrorist attack in San Bernardino, Calif., that initially appeared to be a workplace shooting by a disgruntled employee.

Researchers say the contagion is potentially more complicated than any virus. There is the short-term effect of a high-profile mass shooting, which can lead quickly to another incident. Towers found that such echo shootings account for up to 30 percent of all rampages.

But there appear to be longer incubation periods, too. Killers often find inspiration in past mass shootings, praising what their predecessors accomplished, innovating on their methods and seeking to surpass them in casualties and notoriety.

Read the entire article here.

The Global Peril of Narcissism


I suspect that prior to our gluttonous, always-on social media age, narcissists were very much a local phenomenon — probably much as European diseases remained mostly confined to the Old World prior to the advent of frequent shipping and air travel. Nowadays narcissistic traits such as self-absorption, image inflation and lack of empathy spread and amplify across the globe as impressionable tribes like, follow and emulate their narcissistic role models. As the virus of self-obsession spreads, it puts our increasingly global village at some peril — replacing empathy with indifference and altruism with self-promotion, and leading to the inevitable rise of charismatic demagogues.

Author and psychotherapist Pat MacDonald aptly describes the rise of narcissism in her recent paper Narcissism in the Modern World. Quite paradoxically, MacDonald finds that:

“Much of our distress comes from a sense of disconnection. We have a narcissistic society where self-promotion and individuality seem to be essential, yet in our hearts that’s not what we want. We want to be part of a community, we want to be supported when we’re struggling, we want a sense of belonging. Being extraordinary is not a necessary component to being loved.”

From the Guardian:

“They unconsciously deny an unstated and intolerably poor self-image through inflation. They turn themselves into glittering figures of immense grandeur surrounded by psychologically impenetrable walls. The goal of this self-deception is to be impervious to greatly feared external criticism and to their own rolling sea of doubts.” This is how Elan Golomb describes narcissistic personality disorder in her seminal book Trapped in the Mirror. She goes on to describe the central symptom of the disorder – the narcissist’s failure to achieve intimacy with anyone – as the result of them seeing other people like items in a vending machine, using them to service their own needs, never being able to acknowledge that others might have needs of their own, still less guess what they might be. “Full-bodied narcissistic personality disorder remains a fairly unusual diagnosis,” Pat MacDonald, author of the paper Narcissism in the Modern World, tells me. “Traditionally, it is very difficult to reverse narcissistic personality disorder. It would take a long time and a lot of work.”

What we talk about when we describe an explosion of modern narcissism is not the disorder but the rise in narcissistic traits. Examples are everywhere. Donald Trump epitomises the lack of empathy, the self-regard and, critically, the radical overestimation of his own talents and likability. Katie Hopkins personifies the perverse pride the narcissist takes in not caring for others. (“No,” she wrote in the Sun about the refugee crisis. “I don’t care. Show me pictures of coffins, show me bodies floating in water, play violins and show me skinny people looking sad. I still don’t care.”) Those are the loudest examples, blaring like sirens; there is a general hubbub of narcissism beneath, which is conveniently – for observation purposes, at least – broadcast on social media. Terrible tragedies, such as the attacks on Paris, are appropriated by people thousands of miles away and used as a backdrop to showcase their sensitivity. The death of David Bowie is mediated through its “relevance” to voluble strangers.

It has become routine for celebrities to broadcast banal information and fill Instagram with the “moments” that constitute their day, the tacit principle being that, once you are important enough, nothing is mundane. This delusion then spills out to the non-celebrity; recording mundane events becomes proof of your importance. The dramatic rise in cosmetic surgery is part of the same effect; the celebrity fixates on his or her appearance to meet the demands of fame. Then the vanity, being the only truly replicable trait, becomes the thing to emulate. Ordinary people start having treatments that only intense scrutiny would warrant – 2015 saw a 13% rise in procedures in the UK, with the rise in cosmetic dentistry particularly marked, because people don’t like their teeth in selfies. The solution – stop taking selfies – is apparently so 2014.

Read the entire story here.

Image courtesy of Google Search.

MondayMap: Internet Racism


Darkest blue indicates areas that are much less racist than the national average; lighter blue, somewhat less racist. The darkest red indicates the most racist zones.

No surprise: the areas with the most racist search activity are in the South and the rural Northeastern United States. Head west of Texas and you’ll find fewer and fewer such pockets. Further, and perhaps not surprisingly, the greater an area’s n-word usage, the higher its rate of black mortality.

Sadly, this map is not of 18th- or 19th-century America; it comes from a recent study, published in April 2015 in Public Library of Science (PLOS) ONE.

Now keep in mind that the map gauges racism by tracking pejorative search terms such as the n-word; it doesn’t count actual people, and it is a geographic generalization. Nonetheless it’s a stark reminder that we seem to be two nations divided by the mighty Mississippi River, and that we still have a very long way to go before we are all “westerners”.
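
For readers curious how an area-level association like this is quantified, the core step is simply a correlation between each area’s search-based racism measure and its black mortality rate. The sketch below uses invented toy numbers, not figures from the PLOS ONE paper, and leaves out whatever statistical adjustments the actual analysis performs.

    from statistics import correlation  # requires Python 3.10+

    # Invented, area-level toy data -- not figures from the PLOS ONE study.
    # Each pair: (search-based racism measure, black mortality rate per 100,000).
    areas = [
        (0.2, 780.0),
        (0.5, 810.0),
        (0.9, 870.0),
        (1.4, 905.0),
        (1.8, 940.0),
    ]

    racism_measure = [racism for racism, _ in areas]
    mortality_rate = [rate for _, rate in areas]

    # A positive Pearson r mirrors the reported association: areas with more racist
    # search activity also show higher black mortality. This is an ecological
    # association across areas, not a claim about individuals or about causation.
    print("Pearson r:", round(correlation(racism_measure, mortality_rate), 3))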

From Washington Post:

Where do America’s most racist people live? “The rural Northeast and South,” suggests a new study just published in PLOS ONE.

The paper introduces a novel but makes-tons-of-sense-when-you-think-about-it method for measuring the incidence of racist attitudes: Google search data. The methodology comes from data scientist Seth Stephens-Davidowitz. He’s used it before to measure the effect of racist attitudes on Barack Obama’s electoral prospects.

“Google data, evidence suggests, are unlikely to suffer from major social censoring,” Stephens-Davidowitz wrote in a previous paper. “Google searchers are online and likely alone, both of which make it easier to express socially taboo thoughts. Individuals, indeed, note that they are unusually forthcoming with Google.” He also notes that the Google measure correlates strongly with other standard measures social science researchers have used to study racist attitudes.

This is important, because racism is a notoriously tricky thing to measure. Traditional survey methods don’t really work — if you flat-out ask someone if they’re racist, they will simply tell you no. That’s partly because most racism in society today operates at the subconscious level, or gets vented anonymously online.

For the PLOS ONE paper, researchers looked at searches containing the N-word. People search frequently for it, roughly as often as searches for  “migraine(s),” “economist,” “sweater,” “Daily Show,” and “Lakers.” (The authors attempted to control for variants of the N-word not necessarily intended as pejoratives, excluding the “a” version of the word that analysis revealed was often used “in different contexts compared to searches of the term ending in ‘-er’.”)

Read the entire article here.

Image: Association between an Internet-Based Measure of Area Racism and Black Mortality. Courtesy of Washington Post / PLOS (Public Library of Science) ONE.

The Increasing Mortality of White Males

This is the type of story that you might not normally, and certainly should not, associate with the world’s richest country. In a reversal of a long-established trend, death rates are increasing for less-educated white males. The good news is that death rates continue to fall for other demographic and racial groups, especially Hispanics and African Americans. So, what is happening to white males?

From the NYT:

It’s disturbing and puzzling news: Death rates are rising for white, less-educated Americans. The economists Anne Case and Angus Deaton reported in December that rates have been climbing since 1999 for non-Hispanic whites age 45 to 54, with the largest increase occurring among the least educated. An analysis of death certificates by The New York Times found similar trends and showed that the rise may extend to white women.

Both studies attributed the higher death rates to increases in poisonings and chronic liver disease, which mainly reflect drug overdoses and alcohol abuse, and to suicides. In contrast, death rates fell overall for blacks and Hispanics.

Why are whites overdosing or drinking themselves to death at higher rates than African-Americans and Hispanics in similar circumstances? Some observers have suggested that higher rates of chronic opioid prescriptions could be involved, along with whites’ greater pessimism about their finances.

Yet I’d like to propose a different answer: what social scientists call reference group theory. The term “reference group” was pioneered by the social psychologist Herbert H. Hyman in 1942, and the theory was developed by the Columbia sociologist Robert K. Merton in the 1950s. It tells us that to comprehend how people think and behave, it’s important to understand the standards to which they compare themselves.

How is your life going? For most of us, the answer to that question means comparing our lives to the lives our parents were able to lead. As children and adolescents, we closely observed our parents. They were our first reference group.

And here is one solution to the death-rate conundrum: It’s likely that many non-college-educated whites are comparing themselves to a generation that had more opportunities than they have, whereas many blacks and Hispanics are comparing themselves to a generation that had fewer opportunities.

Read the entire article here.