Category Archives: Idea Soup

GE and EE: The Dark Side of Facebook

That’s G.E. and E.E., not “Glee”. In social psychology circles GE means grandiose exhibitionism, while EE stands for entitlement/exploitativeness. Researchers find that having a large number of “ifriends” on social networks, such as Facebook, correlates with high levels of GE and EE. The greater the number of friends you have online, the greater the odds that you are a chronic attention seeker with shallow relationships or a “socially disruptive narcissist”.

[div class=attrib]From the Guardian:[end-div]

People who score highly on the Narcissistic Personality Inventory questionnaire had more friends on Facebook, tagged themselves more often and updated their newsfeeds more regularly.

The research comes amid increasing evidence that young people are becoming increasingly narcissistic, and obsessed with self-image and shallow friendships.

The latest study, published in the journal Personality and Individual Differences, also found that narcissists responded more aggressively to derogatory comments made about them on the social networking site’s public walls and changed their profile pictures more often.

A number of previous studies have linked narcissism with Facebook use, but this is some of the first evidence of a direct relationship between Facebook friends and the most “toxic” elements of narcissistic personality disorder.

Researchers at Western Illinois University studied the Facebook habits of 294 students, aged between 18 and 65, and measured two “socially disruptive” elements of narcissism – grandiose exhibitionism (GE) and entitlement/exploitativeness (EE).

GE includes “self-absorption, vanity, superiority, and exhibitionistic tendencies” and people who score high on this aspect of narcissism need to be constantly at the centre of attention. They often say shocking things and inappropriately self-disclose because they cannot stand to be ignored or waste a chance of self-promotion.

The EE aspect includes “a sense of deserving respect and a willingness to manipulate and take advantage of others”.

The research revealed that the higher someone scored on aspects of GE, the greater the number of friends they had on Facebook, with some amassing more than 800.

Those scoring highly on EE and GE were also more likely to accept friend requests from strangers and seek social support, but less likely to provide it, according to the research.

Carol Craig, a social scientist and chief executive of the Centre for Confidence and Well-being, said young people in Britain were becoming increasingly narcissistic and Facebook provided a platform for the disorder.

“The way that children are being educated is focussing more and more on the importance of self esteem – on how you are seen in the eyes of others. This method of teaching has been imported from the US and is ‘all about me’.

“Facebook provides a platform for people to self-promote by changing profile pictures and showing how many hundreds of friends you have. I know of some who have more than 1,000.”

Dr Viv Vignoles, senior lecturer in social psychology at Sussex University, said there was “clear evidence” from studies in America that college students were becoming increasingly narcissistic.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: “Looking at You, and You and You”. Jennifer Daniel, an illustrator, created a fan page on Facebook and asked friends to submit their images for this mosaic; 238 of them did so. Courtesy of the New York Times.[end-div]

Culturomics

[div class=attrib]From the Wall Street Journal:[end-div]

Can physicists produce insights about language that have eluded linguists and English professors? That possibility was put to the test this week when a team of physicists published a paper drawing on Google’s massive collection of scanned books. They claim to have identified universal laws governing the birth, life course and death of words.

The paper marks an advance in a new field dubbed “Culturomics”: the application of data-crunching to subjects typically considered part of the humanities. Last year a group of social scientists and evolutionary theorists, plus the Google Books team, showed off the kinds of things that could be done with Google’s data, which include the contents of five-million-plus books, dating back to 1800.

Published in Science, that paper gave the best-yet estimate of the true number of words in English—a million, far more than any dictionary has recorded (the 2002 Webster’s Third New International Dictionary has 348,000). More than half of the language, the authors wrote, is “dark matter” that has evaded standard dictionaries.

The paper also tracked word usage through time (each year, for instance, 1% of the world’s English-speaking population switches from “sneaked” to “snuck”). It also showed that we seem to be putting history behind us more quickly, judging by the speed with which terms fall out of use. References to the year “1880” dropped by half in the 32 years after that date, while the half-life of “1973” was a mere decade.
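
To make the half-life arithmetic concrete, here is a rough sketch of how such a figure might be estimated from yearly usage counts. The function and the numbers below are illustrative assumptions only; the actual study worked from Google Books n-gram frequencies, not from this toy data.

# Sketch: estimate how long it takes a term's usage to fall to half its starting level.
def half_life_years(counts_by_year, start_year):
    """Return the number of years until usage drops to half of its level in start_year."""
    start = counts_by_year[start_year]
    for year in sorted(y for y in counts_by_year if y > start_year):
        if counts_by_year[year] <= start / 2:
            return year - start_year
    return None  # usage never fell below half within the available data

# Hypothetical counts of mentions of the year "1880" in later books.
mentions_1880 = {1880: 1000, 1890: 820, 1900: 640, 1912: 500, 1930: 300}
print(half_life_years(mentions_1880, 1880))  # prints 32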

In the new paper, Alexander Petersen, Joel Tenenbaum and their co-authors looked at the ebb and flow of word usage across various fields. “All these different words are battling it out against synonyms, variant spellings and related words,” says Mr. Tenenbaum. “It’s an inherently competitive, evolutionary environment.”

When the scientists analyzed the data, they found striking patterns not just in English but also in Spanish and Hebrew. There has been, the authors say, a “dramatic shift in the birth rate and death rates of words”: Deaths have increased and births have slowed.

English continues to grow—the 2011 Culturomics paper suggested a rate of 8,500 new words a year. The new paper, however, says that the growth rate is slowing. Partly because the language is already so rich, the “marginal utility” of new words is declining: Existing things are already well described. This led them to a related finding: The words that manage to be born now become more popular than new words used to get, possibly because they describe something genuinely new (think “iPod,” “Internet,” “Twitter”).

Higher death rates for words, the authors say, are largely a matter of homogenization. The explorer William Clark (of Lewis & Clark) spelled “Sioux” 27 different ways in his journals (“Sieoux,” “Seaux,” “Souixx,” etc.), and several of those variants would have made it into 19th-century books. Today spell-checking programs and vigilant copy editors choke off such chaotic variety much more quickly, in effect speeding up the natural selection of words. (The database does not include the world of text- and Twitter-speak, so some of the verbal chaos may just have shifted online.)

[div class=attrib]Read the entire article here.[end-div]

Creativity: Insight, Shower, Wine, Perspiration? Yes

Some believe creativity stems from a sudden insightful realization, a bolt from the blue that awakens the imagination. Others believe creativity comes from years of discipline and hard work. Well, both groups are correct, but the answer is a little more complex.

[div class=attrib]From the Wall Street Journal:[end-div]

Creativity can seem like magic. We look at people like Steve Jobs and Bob Dylan, and we conclude that they must possess supernatural powers denied to mere mortals like us, gifts that allow them to imagine what has never existed before. They’re “creative types.” We’re not.

But creativity is not magic, and there’s no such thing as a creative type. Creativity is not a trait that we inherit in our genes or a blessing bestowed by the angels. It’s a skill. Anyone can learn to be creative and to get better at it. New research is shedding light on what allows people to develop world-changing products and to solve the toughest problems. A surprisingly concrete set of lessons has emerged about what creativity is and how to spark it in ourselves and our work.

The science of creativity is relatively new. Until the Enlightenment, acts of imagination were always equated with higher powers. Being creative meant channeling the muses, giving voice to the gods. (“Inspiration” literally means “breathed upon.”) Even in modern times, scientists have paid little attention to the sources of creativity.

But over the past decade, that has begun to change. Imagination was once thought to be a single thing, separate from other kinds of cognition. The latest research suggests that this assumption is false. It turns out that we use “creativity” as a catchall term for a variety of cognitive tools, each of which applies to particular sorts of problems and is coaxed to action in a particular way.

Does the challenge that we’re facing require a moment of insight, a sudden leap in consciousness? Or can it be solved gradually, one piece at a time? The answer often determines whether we should drink a beer to relax or hop ourselves up on Red Bull, whether we take a long shower or stay late at the office.

The new research also suggests how best to approach the thorniest problems. We tend to assume that experts are the creative geniuses in their own fields. But big breakthroughs often depend on the naive daring of outsiders. For prompting creativity, few things are as important as time devoted to cross-pollination with fields outside our areas of expertise.

Let’s start with the hardest problems, those challenges that at first blush seem impossible. Such problems are typically solved (if they are solved at all) in a moment of insight.

Consider the case of Arthur Fry, an engineer at 3M in the paper products division. In the winter of 1974, Mr. Fry attended a presentation by Spencer Silver, an engineer working on adhesives. Mr. Silver had developed an extremely weak glue, a paste so feeble it could barely hold two pieces of paper together. Like everyone else in the room, Mr. Fry patiently listened to the presentation and then failed to come up with any practical applications for the compound. What good, after all, is a glue that doesn’t stick?

On a frigid Sunday morning, however, the paste would re-enter Mr. Fry’s thoughts, albeit in a rather unlikely context. He sang in the church choir and liked to put little pieces of paper in the hymnal to mark the songs he was supposed to sing. Unfortunately, the little pieces of paper often fell out, forcing Mr. Fry to spend the service frantically thumbing through the book, looking for the right page. It seemed like an unfixable problem, one of those ordinary hassles that we’re forced to live with.

But then, during a particularly tedious sermon, Mr. Fry had an epiphany. He suddenly realized how he might make use of that weak glue: It could be applied to paper to create a reusable bookmark! Because the adhesive was barely sticky, it would adhere to the page but wouldn’t tear it when removed. That revelation in the church would eventually result in one of the most widely used office products in the world: the Post-it Note.

Mr. Fry’s invention was a classic moment of insight. Though such events seem to spring from nowhere, as if the cortex is surprising us with a breakthrough, scientists have begun studying how they occur. They do this by giving people “insight” puzzles, like the one that follows, and watching what happens in the brain:

A man has married 20 women in a small town. All of the women are still alive, and none of them is divorced. The man has broken no laws. Who is the man?

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Google search.[end-div]

How to Be a Great Boss… From Hell, For Dummies

Some very basic lessons on how to be a truly bad boss. Lesson number one: keep your employees from making any contribution to or progress on meaningful work.

[div class=attrib]From the Washington Post:[end-div]

Recall your worst day at work, when events of the day left you frustrated, unmotivated by the job, and brimming with disdain for your boss and your organization. That day is probably unforgettable. But do you know exactly how your boss was able to make it so horrible for you? Our research provides insight into the precise levers you can use to re-create that sort of memorable experience for your own underlings.

Over the past 15 years, we have studied what makes people happy and engaged at work. In discovering the answer, we also learned a lot about misery at work. Our research method was pretty straightforward. We collected confidential electronic diaries from 238 professionals in seven companies, each day for several months. All told, those diaries described nearly 12,000 days – how people felt, and the events that stood out in their minds. Systematically analyzing those diaries, we compared the events occurring on the best days with those on the worst.

What we discovered is that the key factor you can use to make employees miserable on the job is to simply keep them from making progress in meaningful work.

People want to make a valuable contribution, and feel great when they make progress toward doing so. Knowing this progress principle is the first step to knowing how to destroy an employee’s work life. Many leaders, from team managers to CEOs, are already surprisingly expert at smothering employee engagement. In fact, on one-third of those 12,000 days, the person writing the diary was either unhappy at work, demotivated by the work, or both.

That’s pretty efficient work-life demolition, but it leaves room for improvement.

Step 1: Never allow pride of accomplishment. When we analyzed the events occurring on people’s very worst days at the office, one thing stood out: setbacks. Setbacks are any instances where employees feel stalled in their most important work or unable to make any meaningful contribution. So, at every turn, stymie employees’ desire to make a difference. One of the most effective examples we saw was a head of product development, who routinely moved people on and off projects like chess pieces in a game for which only he had the rules.

The next step follows organically from the first.

Step 2: Miss no opportunity to block progress on employees’ projects. Every day, you’ll see dozens of ways to inhibit substantial forward movement on your subordinates’ most important efforts. Goal-setting is a great place to start. Give conflicting goals, change them as frequently as possible, and allow people no autonomy in meeting them. If you get this formula just right, the destructive effects on motivation and performance can be truly dramatic.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Google search.[end-div]

Turing Test 2.0 – Intelligent Behavior Free of Bigotry

One wonders what the world would look like today had Alan Turing been criminally prosecuted and jailed by the British government for his homosexuality before the Second World War, rather than in 1952. Would the British have been able to break German Naval ciphers encoded by their Enigma machine? Would the German Navy have prevailed, and would the Nazis have gone on to conquer the British Isles?

Actually, Turing was not imprisoned in 1952 — rather, he “accepted” chemical castration at the hands of the British government to avoid jail. He died two years later of self-inflicted cyanide poisoning, just short of his 42nd birthday.

Now, a hundred years on from his birth, historians are reflecting on his short life and lasting legacy. Turing is widely regarded as having founded the discipline of artificial intelligence, and he made significant contributions to computing. Yet most of his achievements went unrecognized for decades or were given short shrift, perhaps due to his confidential work for the government or, more likely, because of his persona non grata status.

In 2009 the British government offered Turing an apology. And, of course, we now have the Turing Test, a test of a machine’s ability to exhibit intelligent behavior. So, one hundred years after Turing’s birth, to honor his life we should launch a new and improved Turing Test. Let’s call it the Turing Test 2.0.

This test would measure a human’s ability to exhibit intelligent behavior free of bigotry.

[div class=attrib]From Nature:[end-div]

Alan Turing is always in the news — for his place in science, but also for his 1952 conviction for having gay sex (illegal in Britain until 1967) and his suicide two years later. Former Prime Minister Gordon Brown issued an apology to Turing in 2009, and a campaign for a ‘pardon’ was rebuffed earlier this month.

Must you be a great figure to merit a ‘pardon’ for being gay? If so, how great? Is it enough to break the Enigma ciphers used by Nazi Germany in the Second World War? Or do you need to invent the computer as well, with artificial intelligence as a bonus? Is that great enough?

Turing’s reputation has gone from zero to hero, but defining what he achieved is not simple. Is it correct to credit Turing with the computer? To historians who focus on the engineering of early machines, Turing is an also-ran. Today’s scientists know the maxim ‘publish or perish’, and Turing just did not publish enough about computers. He quickly became perishable goods. His major published papers on computability (in 1936) and artificial intelligence (in 1950) are some of the most cited in the scientific literature, but they leave a yawning gap. His extensive computer plans of 1946, 1947 and 1948 were left as unpublished reports. He never put into scientific journals the simple claim that he had worked out how to turn his 1936 “universal machine” into the practical electronic computer of 1945. Turing missed those first opportunities to explain the theory and strategy of programming, and instead got trapped in the technicalities of primitive storage mechanisms.

He could have caught up after 1949, had he used his time at the University of Manchester, UK, to write a definitive account of the theory and practice of computing. Instead, he founded a new field in mathematical biology and left other people to record the landscape of computers. They painted him out of it. The first book on computers to be published in Britain, Faster than Thought (Pitman, 1953), offered this derisive definition of Turing’s theoretical contribution:

“Türing machine. In 1936 Dr. Turing wrote a paper on the design and limitations of computing machines. For this reason they are sometimes known by his name. The umlaut is an unearned and undesirable addition, due, presumably, to an impression that anything so incomprehensible must be Teutonic.”

That a book on computers should describe the theory of computing as incomprehensible neatly illustrates the climate Turing had to endure. He did make a brief contribution to the book, buried in chapter 26, in which he summarized computability and the universal machine. However, his low-key account never conveyed that these central concepts were his own, or that he had planned the computer revolution.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Alan Mathison Turing at the time of his election to a Fellowship of the Royal Society. Photograph was taken at the Elliott & Fry studio on 29 March 1951.[end-div]

The New Middle Age

We have all heard it — 50 is the “new 30”, 60 is the “new 40”. Adolescence now seems to stretch on into the mid- to late-20s. And what on Earth is “middle age” anyway? As these previously well-defined life stages become more fluid, perhaps it’s time for yet another recalibration.

[div class=attrib]From the Independent:[end-div]

One thing that can be said of “Middle Age” is that it’s moving further from the middle. The annual British Social Attitudes Survey suggests just a third of people in their 40s regard themselves as middle-aged, while almost a third of those in their 70s are still clinging to the label, arthritic fingers notwithstanding. In A Shed of One’s Own, his very funny new memoir of male midlife crisis and its avoidance, Marcus Berkmann reaches for a number of definitions for his time of life: “Middle age is comedy, and also tragedy,” he says. “Other people’s middle age is self-evidently ridiculous, while our own represents the collapse of all our hopes and dreams.”

He cites Denis Norden, who said: “Middle age is when, wherever you go on holiday, you pack a sweater.” And the fictional Frasier Crane, who maintains that the middle-aged “go ‘oof’ when [they] sit down on a sofa”. Shakespeare’s famous Seven Ages of Man speech, delivered by the melancholy Jacques in As You Like It, delineated the phases of human development by occupation: the schoolboy, the adolescent lover, the soldier, and the – presumably, middle-aged – legal professional. We have long defined ourselves compulsively by our stages in life; we yearn for maturity, then mourn the passing of youth. But to what extent are these stages socio-cultural (holidays/sweaters) and to what extent are they biological (sofas/”oof”)?

Patricia Cohen, New York Times reporter and author of another new study of ageing, In Our Prime: The Invention of Middle Age, might not be overly sympathetic to Berkmann’s plight. The mid-life crisis, she suggests, is a marketing trick designed to sell cosmetics, cars and expensive foreign holidays; people in their 20s and 30s are far more vulnerable to such a crisis than their parents. Cohen finds little evidence for so-called “empty nest syndrome”, or for the widespread stereotype of the rich man with the young “trophy wife”.

She even claims that middle age itself is a “cultural fiction”, and that Americans only became neurotic about entering their 40s at the turn of the 20th century, when they started lying to census-takers about their age. Before then, “age was not an essential ingredient of one’s identity”. Rather, people were classified according to “marker events”: marriage, parenthood and so on. In 1800 the average American woman had seven children; by 1900 she had three. They were out of her hair by her early 40s and, thanks to modern medicine, she could look forward to a further 20 years or more of active life.

As Berkmann laments, “one of the most tangible symptoms of middle age is the sensation that you’re being cast adrift from mainstream culture.” Then again, the baby boomers, and the more mature members of “Generation X”, are the most powerful of economic blocs. The over-50s spend far more on consumer goods than their younger counterparts, making them particularly valuable to advertisers – and perpetuating the idea of the middle-aged as a discernible demographic.

David Bainbridge, a vet and evolutionary zoologist, also weighs in on the topic in his latest book, Middle Age: A Natural History. Middle age is an exclusively human phenomenon, Bainbridge explains, and doesn’t exist elsewhere in the animal kingdom, where infirmity often follows hot on the heels of parenthood. It is, he argues, “largely the product of millions of years of human evolution… not a 20th-century cultural invention.” He urges readers to embrace middle age as “flux, not crisis” – which is probably what he said to his wife, when he bought himself a blue vintage Lotus soon after turning 40.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Practical Financial.[end-div]

Doctors Die Too, But Differently

[div class=attrib]From the Wall Street Journal:[end-div]

Years ago, Charlie, a highly respected orthopedist and a mentor of mine, found a lump in his stomach. It was diagnosed as pancreatic cancer by one of the best surgeons in the country, who had developed a procedure that could triple a patient’s five-year-survival odds—from 5% to 15%—albeit with a poor quality of life.

Charlie, 68 years old, was uninterested. He went home the next day, closed his practice and never set foot in a hospital again. He focused on spending time with his family. Several months later, he died at home. He got no chemotherapy, radiation or surgical treatment. Medicare didn’t spend much on him.

It’s not something that we like to talk about, but doctors die, too. What’s unusual about them is not how much treatment they get compared with most Americans, but how little. They know exactly what is going to happen, they know the choices, and they generally have access to any sort of medical care that they could want. But they tend to go serenely and gently.

Doctors don’t want to die any more than anyone else does. But they usually have talked about the limits of modern medicine with their families. They want to make sure that, when the time comes, no heroic measures are taken. During their last moments, they know, for instance, that they don’t want someone breaking their ribs by performing cardiopulmonary resuscitation (which is what happens when CPR is done right).

In a 2003 article, Joseph J. Gallo and others looked at what physicians want when it comes to end-of-life decisions. In a survey of 765 doctors, they found that 64% had created an advance directive—specifying what steps should and should not be taken to save their lives should they become incapacitated. That compares to only about 20% for the general public. (As one might expect, older doctors are more likely than younger doctors to have made “arrangements,” as shown in a study by Paula Lester and others.)

Why such a large gap between the decisions of doctors and patients? The case of CPR is instructive. A study by Susan Diem and others of how CPR is portrayed on TV found that it was successful in 75% of the cases and that 67% of the TV patients went home. In reality, a 2010 study of more than 95,000 cases of CPR found that only 8% of patients survived for more than one month. Of these, only about 3% could lead a mostly normal life.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: The Triumph of Death, Pieter Bruegel the Elder, 1562. Museo del Prado in Madrid.[end-div]

Culture, Language and Genes

In the early 19th century, Noah Webster set about redefining written English. His aim was to standardize the spoken word in the fledgling nation and to distinguish American from British usage. In his own words, “as an independent nation, our honor requires us to have a system of our own, in language as well as government.”

He used his dictionary, which still bears his name today, as a tool to cleanse English of its stubborn reliance on aristocratic pedantry and over-reliance on Latin and Greek. He “simplified” the spelling of numerous words that he believed were constructed with rules that were all too complicated. Thus, “colour” became “color” and “honour” switched to “honor”; “centre” became “center”, “behaviour” to “behavior”, “traveller” to “traveler”.

Webster offers a perfect example of why humanity seems so adept at fragmenting into diverse cultural groups that thrive through mutual incomprehension. In “Wired for Culture”, evolutionary biologist Mark Pagel offers a compelling explanation based on that small, yet very selfish biological building block, the gene.

[div class=attrib]From the Wall Street Journal:[end-div]

The island of Gaua, part of Vanuatu in the Pacific, is just 13 miles across, yet it has five distinct native languages. Papua New Guinea, an area only slightly bigger than Texas, has 800 languages, some spoken by just a few thousand people.

Evolutionary biologists have long gotten used to the idea that bodies are just genes’ ways of making more genes, survival machines that carry genes to the next generation. Think of a salmon struggling upstream just to expend its body (now expendable) in spawning. Dr. Pagel’s idea is that cultures are an extension of this: that the way we use culture is to promote the long-term interests of our genes.

It need not be this way. When human beings’ lives became dominated by culture, they could have adopted habits that did not lead to having more descendants. But on the whole we did not; we set about using culture to favor survival of those like us at the expense of other groups, using religion, warfare, cooperation and social allegiance. As Dr. Pagel comments: “Our genes’ gamble at handing over control to…ideas paid off handsomely” in the conquest of the world.

What this means, he argues, is that if our “cultures have promoted our genetic interests throughout our history,” then our “particular culture is not for us, but for our genes.”

We’re expendable. The allegiance we feel to one tribe—religious, sporting, political, linguistic, even racial—is a peculiar mixture of altruism toward the group and hostility to other groups. Throughout history, united groups have stood, while divided ones fell.

Language is the most striking exemplar of Dr. Pagel’s thesis. He calls language “one of the most powerful, dangerous and subversive traits that natural selection has ever devised.” He draws attention to the curious parallels between genetics and linguistics. Both are digital systems, in which words or base pairs are recombined to make an infinite possibility of messages. (Elsewhere I once noted the numerical similarity between Shakespeare’s vocabulary of about 20,000 distinct words and his genome of about 21,000 genes).

Dr. Pagel points out that language is a “technology for rewiring other people’s minds…without either of you having to perform surgery.” But natural selection was unlikely to favor such a technology if it helped just the speaker, or just the listener, at the expense of the other. Rather, he says that, just as the language of the genes promotes its own survival via a larger cooperative entity called the body, so language itself endures via the survival of the individual and the tribe.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of PA / Daily Mail.[end-div]

A Philosopher On Avoiding Death

Below we excerpt a brilliant essay by Alex Byrne summarizing his argument that our personal survival is grossly overvalued. But this should not give future teleportation engineers pause. Alex Byrne is a professor of philosophy at MIT.

[div class=attrib]From the Boston Review:[end-div]

Star Trek–style teleportation may one day become a reality. You step into the transporter, which instantly scans your body and brain, vaporizing them in the process. The information is transmitted to Mars, where it is used by the receiving station to reconstitute your body and brain exactly as they were on Earth. You then step out of the receiving station, slightly dizzy, but pleased to arrive on Mars in a few minutes, as opposed to the year it takes by old-fashioned spacecraft.

But wait. Do you really step out of the receiving station on Mars? Someone just like you steps out, someone who apparently remembers stepping into the transporter on Earth a few minutes before. But perhaps this person is merely your replica—a kind of clone or copy. That would not make this person you: in Las Vegas there is a replica of the Eiffel Tower, but the Eiffel Tower is in Paris, not in Las Vegas. If the Eiffel Tower were vaporized and a replica instantly erected in Las Vegas, the Eiffel Tower would not have been transported to Las Vegas. It would have ceased to exist. And if teleportation were like that, stepping into the transporter would essentially be a covert way of committing suicide. Troubled by these thoughts, you now realize that “you” have been commuting back and forth to Mars for years . . .

So which is it? You are preoccupied with a question about your survival: Do you survive teleportation to Mars? A lot hangs on the question, and it is not obvious how to answer it. Teleportation is just science fiction, of course; does the urgent fictional question have a counterpart in reality? Indeed it does: Do you, or could you, survive death?

Teeming hordes of humanity adhere to religious doctrines that promise survival after death: perhaps bodily resurrection at the Day of Judgment, reincarnation, or immaterial immortality. For these people, death is not the end.

Some of a more secular persuasion do not disagree. The body of the baseball great Ted Williams lies in a container cooled by liquid nitrogen to -321 degrees Fahrenheit, awaiting the Great Thawing, when he will rise to sign sports memorabilia again. (Williams’s prospects are somewhat compromised because his head has apparently been preserved separately.) For the futurist Ray Kurzweil, hope lies in the possibility that he will be uploaded to new and shiny hardware—as pictures are transferred to Facebook’s servers—leaving his outmoded biological container behind.

Isn’t all this a pipe dream? Why isn’t “uploading” merely a way of producing a perfect Kurzweil-impersonator, rather than the real thing? Cryogenic storage might help if I am still alive when frozen, but what good is it after I am dead? And is the religious line any more plausible? “Earth to earth, ashes to ashes, dust to dust” hardly sounds like the dawn of a new day. Where is—as the Book of Common Prayer has it—the “sure and certain hope of the Resurrection to eternal life”? If a forest fire consumes a house and the luckless family hamster, that’s the end of them, presumably. Why are we any different?

Philosophers have had a good deal of interest to say about these issues, under the unexciting rubric of “personal identity.” Let us begin our tour of some highlights with a more general topic: the survival, or “persistence,” of objects over time.

Physical objects (including plants and animals) typically come into existence at some time, and cease to exist at a later time, or so we normally think. For example, a cottage might come into existence when enough beams and bricks are assembled, and cease to exist a century later, when it is demolished to make room for a McMansion. A mighty oak tree began life as a tiny green shoot, or perhaps an acorn, and will end its existence when it is sawn into planks.

The cottage and the oak survive a variety of vicissitudes throughout their careers. The house survived Hurricane Irene, say. That is, the house existed before Irene and also existed after Irene. We can put this in terms of “identity”: the house existed before Irene and something existed after Irene that was identical to the house.

[div class=attrib]Read the entire essay here.[end-div]

A Very, Like, Interestaaaaaaang Linguistic Study?

Uptalk? Vocal fry? Linguistic curiosities enter the mainstream courtesy of trendsetting young women aged 18-25 and Australians.

[div class=attrib]From the Daily Telegraph:[end-div]

From Valley Girls to the Kardashians, young women have long been mocked for the way they talk.

Whether it be uptalk (pronouncing statements as if they were questions? Like this?), creating slang words like “bitchin’ ” and “ridic,” or the incessant use of “like” as a conversation filler, vocal trends associated with young women are often seen as markers of immaturity or even stupidity.

Right?

But linguists — many of whom once promoted theories consistent with that attitude — now say such thinking is outmoded. Girls and women in their teens and 20s deserve credit for pioneering vocal trends and popular slang, they say, adding that young women use these embellishments in much more sophisticated ways than people tend to realize.

“A lot of these really flamboyant things you hear are cute, and girls are supposed to be cute,” said Penny Eckert, a professor of linguistics at Stanford University. “But they’re not just using them because they’re girls. They’re using them to achieve some kind of interactional and stylistic end.”

The latest linguistic curiosity to emerge from the petri dish of girl culture gained a burst of public recognition in December, when researchers from Long Island University published a paper about it in The Journal of Voice. Working with what they acknowledged was a very small sample — recorded speech from 34 women ages 18 to 25 — the professors said they had found evidence of a new trend among female college students: a guttural fluttering of the vocal cords they called “vocal fry.”

A classic example of vocal fry, best described as a raspy or croaking sound injected (usually) at the end of a sentence, can be heard when Mae West says, “Why don’t you come up sometime and see me,” or, more recently on television, when Maya Rudolph mimics Maya Angelou on “Saturday Night Live.”

Not surprisingly, gadflies in cyberspace were quick to pounce on the study — or, more specifically, on the girls and women who are frying their words. “Are they trying to sound like Kesha or Britney Spears?” teased The Huffington Post, naming two pop stars who employ vocal fry while singing, although the study made no mention of them. “Very interesteeeaaaaaaaaang,” said Gawker.com, mocking the lazy, drawn-out affect.

Do not scoff, says Nassima Abdelli-Beruh, a speech scientist at Long Island University and an author of the study. “They use this as a tool to convey something,” she said. “You quickly realize that for them, it is as a cue.”

Other linguists not involved in the research also cautioned against forming negative judgments.

“If women do something like uptalk or vocal fry, it’s immediately interpreted as insecure, emotional or even stupid,” said Carmen Fought, a professor of linguistics at Pitzer College in Claremont, Calif. “The truth is this: Young women take linguistic features and use them as power tools for building relationships.”

The idea that young women serve as incubators of vocal trends for the culture at large has longstanding roots in linguistics. As Paris is to fashion, the thinking goes, so are young women to linguistic innovation.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Paul Hoppe, Daily Telegraph.[end-div]

Daddy’s Girl, Yes; Mother’s Boy, No

Western social norms tolerate a strong bond between father and daughter; it’s OK to be a daddy’s girl. Yet for a mother’s boy, and for mothers of mothers’ boys, it’s a different story. In fact, a strong bond between mother and son is frequently looked upon with derision. Just check out the mother’s boy “definition” in Wikipedia; there’s no formal entry for Daddy’s Girl.

Why is this, and is it right?

Excerpts below are from the forthcoming book “The Mama’s Boy Myth” by Kate Stone Lombardi.

[div class=attrib]From the Wall Street Journal:[end-div]

My daughter Jeanie and I use Google chat throughout the day to discuss work, what we had for lunch, how we’re avoiding the gym, and emotional issues big and small. We may also catch up by phone in the evening. I can open up to Jeanie about certain things that I wouldn’t share with another soul, and I believe she would say the same about me. We are very close, which you probably won’t find particularly surprising or alarming.

Now switch genders. Suppose I told you that I am very close to my son, Paul. That I love hanging out with him and that we have dozens of inside jokes and shared traditions. Even though we speak frequently, I get a little thrill each time I hear his signature ringtone on my cellphone. Next, I confess that Paul is so sensitive and intuitive that he “gets me” in a very special way.

Are you starting to speculate that something is a little off? Are you getting uncomfortable about the kind of guy my son is growing up to be?

For generations mothers have gotten one message: that keeping their sons close is wrong, possibly even dangerous. A mother who fosters a deep emotional bond with her son, we’ve been told, is setting him up to be weak and effeminate—an archetypal mama’s boy. He’ll never be independent or able to form healthy adult relationships. As the therapist and child-rearing guru Michael Gurian wrote in his 1994 book about mothers and sons, “a mother’s job…is very much to hold back the coming of manhood.” A well-adjusted, loving mother is one who gradually but surely pushes her son away, both emotionally and physically, in order to allow him to become a healthy man.

This was standard operating procedure for our mothers, our grandmothers and even our great-grandmothers. Amazingly, we’re still encouraged to buy this parenting advice today.

Somehow, when so many of our other beliefs about the roles of men and women have been revolutionized, our view of the mother-son relationship has remained frozen in time. We’ve dramatically changed the way we raise our daughters, encouraging them to be assertive, play competitive sports and aim high in their educational and professional ambitions. We don’t fret about “masculinizing” our girls.

As for daughters and their fathers, while a “mama’s boy” may be a reviled creature, people tend to look tolerantly on a “daddy’s girl.” A loving and supportive father is considered essential to a girl’s self-esteem. Fathers are encouraged to be involved in their daughters’ lives, whether it’s coaching their soccer teams or escorting their teenage girls to father-daughter dances. A father who flouts gender stereotypes and teaches his daughter a traditionally masculine task—say, rebuilding a car engine—is considered to be pretty cool. But a mother who does something comparable—like teaching her son to knit or even encouraging him to talk more openly about his feelings—is looked at with contempt. What is she trying to do to that boy?

Many mothers are confused and anxious when it comes to raising boys. Should they defer to their husband when he insists that she stop kissing their first-grade son at school drop-off? If she cuddles her 10-year-old boy when he is hurt, will she turn him into a wimp? If she keeps him too close, will she make him gay? If her teenage boy is crying in his room, should she go in and comfort him, or will this embarrass and shame him? Anthony E. Wolf, a child psychologist and best-selling author, warns us that “strong emotional contact with his mother is especially upsetting to any teenage boy.”

None of these fears, however, is based on any actual science. In fact, research shows that boys suffer when they separate prematurely from their mothers and benefit from closeness in myriad ways throughout their lives.

A study published in Child Development involving almost 6,000 children, age 12 and younger, found that boys who were insecurely attached to their mothers acted more aggressive and hostile later in childhood—kicking and hitting others, yelling, disobeying adults and being generally destructive.

A study of more than 400 middle school boys revealed that sons who were close to their mothers were less likely to define masculinity as being physically tough, stoic and self-reliant. They not only remained more emotionally open, forming stronger friendships, but they also were less depressed and anxious than their more macho classmates. And they were getting better grades.

There is evidence that a strong mother-son bond prevents delinquency in adolescence. And though it has been long established that teenagers who have good communication with their parents are more likely to resist negative peer pressure, new research shows that it is a boy’s mother who is the most influential when it comes to risky behavior, not only with alcohol and drugs but also in preventing both early and unprotected sex.

Finally, there are no reputable scientific studies suggesting that a boy’s sexual orientation can be altered by his mother, no matter how much she loves him.

[div class=attrib]Read the entire article here.[end-div]

Social Skin

[div class=attrib]From Anthropology in Practice:[end-div]

Are you inked?

I’m not, though I’ve thought about it seriously and have a pretty good idea of what I would get and where I would put it—if I could work up the nerve to get in the chair. I’ll tell you one thing: It most certainly is not a QR code like Fred Bosch, who designed his tattoo to link to something new every time it’s scanned. While the idea is intriguing and presents an interesting re-imagining of tattoos in the digital age, it seems to run counter to the nature of tattoos.

Tattoo As Talisman and Symbol

The word “tattoo” derives from the Tahitian word “tatau” (wound) and the Polynesian root “ta” (drawing), which neatly summarizes the history of the practice (1). Humans have been inscribing their bodies (and the bodies of others) for thousands of years for self decoration, to display affiliation, and for punitive reasons. The oldest example of a tattooed individual is 5,200-year-old Ötzi the Iceman, who was found in 1991 in the area of the Italian-Austrian border. He had several tattoos on his back, right knee, and around his ankles, which researchers believe may have served medicinal purposes—possibly a form of acupuncture before acupuncture existed (2). Tattoos have also been found on Egyptian mummies dating to 2000 B.C. And sculpted artifacts and figurines marked by body art and piercings provide clues that tattooing was widely practiced from 500 B.C. to 500 A.D. (3).

Tattoos have been used to signify occupation, patriotism, loyalty, and religious affiliation. For example, there is a rich maritime tradition of tattoos, including initials (both seamen’s own and those of significant others), anchors, mermaids, fish, ships, and religious symbols (4). It seems that most seafarers in the 18th and 19th centuries entered the ranks of the tattooed with initials—possibly for identification purposes—before adding different imagery (5), reflecting what was popular at the time: seafarers born after the American Declaration of Independence displayed more patriotic symbols (e.g., flags, eagles, stars, the words “Independence” and “Liberty,” and the year 1776) than those born prior. And there are also some interesting superstitions tied to them suggesting that tattooing has been an important means of exerting control over one’s situation (6):

H-O-L-D-F-A-S-T, one letter on the back of each finger, next to the hand knuckle, will save a sailor whose life depends on holding a rope.

A crucifix on the back will save the seaman from flogging because no boatswain’s mate would whip a cross, and if he did, the cross would alleviate the pain.

A seaman who could stand to have a full rigged ship tattooed on his chest would automatically make a good topman.

Crucifixes tattooed on each arm and leg would save a man who had fallen in the water and found himself among 775,000 hungry white sharks, who would not even bother smelling him.

That last point might be a bit of a fisherman’s tale (what if it’s 774,000 white sharks?), but it serves nicely to show how deeply enmeshed tattooing has been with certain occupations.

Early Christians got tattoos of religious symbols. Tattoos were purchased by pilgrims and Crusaders as proof that they had made it to Jerusalem, serving as a symbol of witness and identification. The Church largely did not approve even though there was biblical authorization for the practice: While there is evidence that “God’s word and work were passed on through generations through tattoos inscribed on the bodies of Saints, like the stigmata on St. Francis of Assisi,” the idea that the unmarked body is representative of God’s image and should not be altered was persistent (7).

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Tattoo Galleries.[end-div]

The United States of Henry

It is a distinct possibility that, had it not been for the graphic tales of violence and sex from the New World around 500 years ago, the continent now known as America would have taken another name.

Amerigo Vespucci’s westward expeditions between 1497 and 1502 made landfall in what is now Guyana, Venezuela and Brazil. Accounts of these expeditions, with stories of lustful natives having cannibalistic tendencies and life spans of 150 years, caught the public imagination. Present-day scholars dispute the authenticity of many of Vespucci’s firsthand accounts and letters; in fact the dates and number of Vespucci’s expeditions remain unsettled to this day.

However, 500 years ago Vespucci was held in relatively high esteem, particularly by a German geographer named Martin Waldseemüller. It was Waldseemüller who in 1507, enamored of Vespucci’s colorful observations, published a new survey of world geography and named the newly discovered southern continent “America”, after Vespucci. In the survey Waldseemüller wrote: “I do not see what right any one would have to object to calling this part, after Americus who discovered it and who is a man of intelligence, Amerige, that is, the Land of Americus, or America: since both Europa and Asia got their names from women”.

For those interested in the etymology of the name “America”, read on. Amerigo Vespucci is the modern Italianate form of the medieval Latinized name Emericus (or Americus) Vespucius. In assigning the name “America”, Waldseemüller took the feminine form of Americus. The German equivalent of Emericus is Heinrich, which in English is, of course, Henry.

[div class=attrib]From the Independent:[end-div]

[Amerigo Vespucci] was not a natural sailor. Writing to Lorenzo de’Medici, he moaned about “the risks of shipwreck, the innumerable physical deprivations, the permanent anguish that afflicted our spirits… we were prey to such terrible fear that we gave up every hope of surviving.” But when everything was as bad as it could get, “In the midst of this terrible tempest… it pleased the Almighty to show us the continent, new earth and an unknown world.”

These were the words that, once set in type, galvanised Europe. Vespucci knew the geographical works of Ptolemy and had spent years steeped in maps and geographical speculation. For him the coast of modern Venezuela and Brazil where his expedition landed had nothing in common with the zones described by explorers of the Orient. Instead this was something far more fascinating – an unimagined world.

“Surely,” he wrote, “if the terrestrial paradise be in any part of this earth, I esteem that it is not far from these parts.” In his description, this New World is made up of extremes. On the one hand, the people he encounters are living in a dream-like state of bliss: with no metals except gold, no clothes, no signs of age, few diseases, no government, no religion, no trade. In a land rich in animals and plants, colours and fragrances, free from the stain of civilisation, “they live 150 years and rarely fall ill”.

But turn the coin and he was in a world of devils. “They eat one another, the victor [eats] the vanquished,” he wrote. “I know a man… who was reputed to have eaten more than 300 human bodies…” The women are intensely desirable: “none… among them who had a flabby breast,” but they are also monsters and witches: “… Being very lustful, [they] cause the private parts of their husbands to swell up to such a huge size that they appear deformed and disgusting… in consequence of this many lose their organs which break through lack of attention, and they remain eunuchs… When [the women] had the opportunity of copulating with Christians, urged by excessive lust, they defiled… themselves.”

Vespucci’s sensational description inspired an early etching of the Florentine’s first encounter with an American: the explorer and the naked, voluptuous and very pale woman lock eyes; the woman is in the act of clambering off a hammock and moving in his direction. Meanwhile, on a nearby hillock, a woman is roasting the lower half of a human body over a fire.

The wild and fantastic nature of Vespucci’s descriptions raises the question of how reliable any of his observations are – but then vast doubt surrounds almost everything about his adventures. We don’t know how many voyages he undertook; his authorship of some of the accounts is questionable; and it is not even universally accepted that he identified South America for what it was, a new continent.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of biography.com.[end-div]

Your Guide to Online Morality

By most estimates Facebook has around 800 million registered users. This means that its policies governing what is or is not appropriate user content warrant detailed scrutiny. A look at Facebook’s recently publicized guidelines for sexual and violent content shows a somewhat peculiar view of morality. It’s a view that some characterize as typically American prudishness, coupled with a blind eye toward violence.

[div class=attrib]From the Guardian:[end-div]

Facebook bans images of breastfeeding if nipples are exposed – but allows “graphic images” of animals if shown “in the context of food processing or hunting as it occurs in nature”. Equally, pictures of bodily fluids – except semen – are allowed as long as no human is included in the picture; but “deep flesh wounds” and “crushed heads, limbs” are OK (“as long as no insides are showing”), as are images of people using marijuana but not those of “drunk or unconscious” people.

The strange world of Facebook’s image and post approval system has been laid bare by a document leaked from the outsourcing company oDesk to the Gawker website, which indicates that the sometimes arbitrary nature of picture and post approval actually has a meticulous – if faintly gore-friendly and nipple-unfriendly – approach.

For the giant social network, which has 800 million users worldwide and recently set out plans for a stock market flotation which could value it at up to $100bn (£63bn), it is a glimpse of its inner workings – and odd prejudices about sex – that emphasise its American origins.

Facebook has previously faced an outcry from breastfeeding mothers over its treatment of images showing them with their babies. The issue has rumbled on, and now seems to have been embedded in its “Abuse Standards Violations”, which states that banned items include “breastfeeding photos showing other nudity, or nipple clearly exposed”. It also bans “naked private parts” including “female nipple bulges and naked butt cracks” – though “male nipples are OK”.

The guidelines, which have been set out in full, depict a world where sex is banned but gore is acceptable. Obvious sexual activity, even if “naked parts” are hidden, people “using the bathroom”, and “sexual fetishes in any form” are all also banned. The company also bans slurs or racial comments “of any kind” and “support for organisations and people primarily known for violence”. Also banned is anyone who shows “approval, delight, involvement etc in animal or human torture”.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Guardian / Photograph: Dominic Lipinski/PA.[end-div]

Religion for Atheists and the Agape Restaurant

Alain de Botton is a writer of book-length essays on love, travel, architecture and literature. In his latest book, Religion for Atheists, de Botton argues that while the supernatural claims of all religions are entirely false, religions still have important things to teach the secular world. An excerpt from the book below.

[div class=attrib]From the Wall Street Journal:[end-div]

One of the losses that modern society feels most keenly is the loss of a sense of community. We tend to imagine that there once existed a degree of neighborliness that has been replaced by ruthless anonymity, by the pursuit of contact with one another primarily for individualistic ends: for financial gain, social advancement or romantic love.

In attempting to understand what has eroded our sense of community, historians have assigned an important role to the privatization of religious belief that occurred in Europe and the U.S. in the 19th century. They have suggested that we began to disregard our neighbors at around the same time that we ceased to honor our gods as a community.

This raises two questions: How did religion once enhance the spirit of community? More practically, can secular society ever recover that spirit without returning to the theological principles that were entwined with it? I, for one, believe that it is possible to reclaim our sense of community—and that we can do so, moreover, without having to build upon a religious foundation.

Insofar as modern society ever promises us access to a community, it is one centered on the worship of professional success. We sense that we are brushing up against its gates when the first question we are asked at a party is “What do you do?,” our answer to which will determine whether we are warmly welcomed or conclusively abandoned.

In these competitive, pseudo-communal gatherings, only a few sides of us count as currency with which to buy the goodwill of strangers. What matters above all is what is on our business cards. Those who have opted to spend their lives looking after children, writing poetry or nurturing orchards will be left in no doubt that they have run contrary to the dominant mores of the powerful, who will marginalize them accordingly.

Given this level of discrimination, it is no surprise that many of us choose to throw ourselves with a vengeance into our careers. Focusing on work to the exclusion of almost everything else is a plausible strategy in a world that accepts workplace achievements as the main tokens for securing not just the financial means to survive physically but also the attention that we require to thrive psychologically.

Religions seem to know a great deal about our loneliness. Even if we believe very little of what they tell us about the afterlife or the supernatural origins of their doctrines, we can nevertheless admire their understanding of what separates us from strangers and their attempts to melt away one or two of the prejudices that normally prevent us from building connections with others.

Consider Catholicism, which starts to create a sense of community with a setting. It marks off a piece of the earth, puts walls up around it and declares that within their confines there will reign values utterly unlike the ones that hold sway in the world beyond. A church gives us rare permission to lean over and say hello to a stranger without any danger of being thought predatory or insane.

The composition of the congregation also feels significant. Those in attendance tend not to be uniformly of the same age, race, profession or educational or income level; they are a random sampling of souls united only by their shared commitment to certain values. We are urged to overcome our provincialism and our tendency to be judgmental—and to make a sign of peace to whomever chance has placed on either side of us. The Church asks us to leave behind all references to earthly status. Here no one asks what anyone else “does.” It no longer matters who is the bond dealer and who the cleaner.

The Church does more, however, than merely declare that worldly success doesn’t matter. In a variety of ways, it enables us to imagine that we could be happy without it. Appreciating the reasons why we try to acquire status in the first place, it establishes conditions under which we can willingly surrender our attachment to it.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Alain de Botton. Courtesy of BBC.[end-div]

 

Beautiful Explanations

For each of the past 15 years, Edge has posed a weighty question to a group of scientists, researchers, philosophers, mathematicians and thinkers. For 2012, Edge put the question “What Is Your Favorite Deep, Elegant, or Beautiful Explanation?” to 192 of our best and brightest. Back came 192 different and no less wonderful answers. We can post but a snippet here, so please visit the Edge, and then make a note to buy the book (it’s not yet available).

[div class=attrib]Read the entire article here.[end-div]

The Mysterious Coherence Between Fundamental Physics and Mathematics
Peter Woit, Mathematical Physicist, Columbia University; Author, Not Even Wrong

Any first course in physics teaches students that the basic quantities one uses to describe a physical system include energy, momentum, angular momentum and charge. What isn’t explained in such a course is the deep, elegant and beautiful reason why these are important quantities to consider, and why they satisfy conservation laws. It turns out that there’s a general principle at work: for any symmetry of a physical system, you can define an associated observable quantity that comes with a conservation law:

1. The symmetry of time translation gives energy
2. The symmetries of spatial translation give momentum
3. Rotational symmetry gives angular momentum
4. Phase transformation symmetry gives charge
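
The general principle Woit describes is, in essence, Noether’s theorem. As a minimal sketch in the language of classical mechanics (my own gloss, not part of the Edge essay): if a system’s Lagrangian $L(q_i, \dot{q}_i, t)$ is unchanged by an infinitesimal transformation of the coordinates, $q_i \to q_i + \epsilon\, \delta q_i$, then the quantity

$$Q = \sum_i \frac{\partial L}{\partial \dot{q}_i}\, \delta q_i$$

is conserved, $dQ/dt = 0$. Taking $\delta q_i$ to be a spatial translation makes $Q$ the total momentum, a rotation makes it the angular momentum, and invariance under shifts in time yields the conserved energy (via the Hamiltonian). The four correspondences above are this single statement read off four times.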

 

Einstein Explains Why Gravity Is Universal
Sean Carroll, Theoretical Physicist, Caltech; Author, From Eternity to Here: The Quest for the Ultimate Theory of Time

The ancient Greeks believed that heavier objects fall faster than lighter ones. They had good reason to do so; a heavy stone falls quickly, while a light piece of paper flutters gently to the ground. But a thought experiment by Galileo pointed out a flaw. Imagine taking the piece of paper and tying it to the stone. Together, the new system is heavier than either of its components, and should fall faster. But in reality, the piece of paper slows down the descent of the stone.

Galileo argued that the rate at which objects fall would actually be a universal quantity, independent of their mass or their composition, if it weren’t for the interference of air resistance. Apollo 15 astronaut Dave Scott once illustrated this point by dropping a feather and a hammer while standing in vacuum on the surface of the Moon; as Galileo predicted, they fell at the same rate.

Subsequently, many scientists wondered why this should be the case. In contrast to gravity, particles in an electric field can respond very differently; positive charges are pushed one way, negative charges the other, and neutral particles not at all. But gravity is universal; everything responds to it in the same way.

Thinking about this problem led Albert Einstein to what he called “the happiest thought of my life.” Imagine an astronaut in a spaceship with no windows, and no other way to peer at the outside world. If the ship were far away from any stars or planets, everything inside would be in free fall; there would be no gravitational field to push anything around. But put the ship in orbit around a massive object, where gravity is considerable. Everything inside will still be in free fall: because all objects are affected by gravity in the same way, no one object is pushed toward or away from any other one. Sticking just to what is observed inside the spaceship, there’s no way we could detect the existence of gravity.
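
A back-of-the-envelope calculation (mine, not Carroll’s) shows why the mass drops out. In Newtonian terms, the gravitational force on a body of mass $m$ at distance $r$ from a mass $M$ is $F = GMm/r^2$, while Newton’s second law says $F = ma$. Equating the two and cancelling $m$ gives

$$a = \frac{GM}{r^2},$$

an acceleration that is the same for a hammer, a feather, or anything else, provided the inertial mass (the $m$ in $F = ma$) really does equal the gravitational mass (the $m$ in $GMm/r^2$). Einstein’s “happiest thought” was to treat that equality not as a coincidence but as the defining feature of gravity: the equivalence principle at the heart of general relativity.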

 

True or False: Beauty Is Truth
Judith Rich Harris, Independent Investigator and Theoretician; Author, The Nurture Assumption; No Two Alike: Human Nature and Human Individuality

“Beauty is truth, truth beauty,” said John Keats. But what did he know? Keats was a poet, not a scientist.

In the world that scientists inhabit, truth is not always beautiful or elegant, though it may be deep. In fact, it’s my impression that the deeper an explanation goes, the less likely it is to be beautiful or elegant.

Some years ago, the psychologist B. F. Skinner proposed an elegant explanation of “the behavior of organisms,” based on the idea that rewarding a response—he called it reinforcement—increases the probability that the same response will occur again in the future. The theory failed, not because it was false (reinforcement generally does increase the probability of a response) but because it was too simple. It ignored innate components of behavior. It couldn’t even handle all learned behavior. Much behavior is acquired or shaped through experience, but not necessarily by means of reinforcement. Organisms learn different things in different ways.

 

The Power Of One, Two, Three
Charles Seife, Professor of Journalism, New York University; formerly journalist, Science Magazine; Author, Proofiness: The Dark Arts of Mathematical Deception

Sometimes even the simple act of counting can tell you something profound.

One day, back in the late 1990s, when I was a correspondent for New Scientist magazine, I got an e-mail from a flack waxing rhapsodic about an extraordinary piece of software. It was a revolutionary data-compression program so efficient that it would squash every digital file by 95% or more without losing a single bit of data. Wouldn’t my magazine jump at the chance to tell the world about the computer program that would make their hard drives hold 20 times more information than ever before?

No, my magazine wouldn’t.

No such compression algorithm could possibly exist; it was the algorithmic equivalent of a perpetual motion machine. The software was a fraud.

The reason: the pigeonhole principle.
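
The counting argument behind the pigeonhole principle is easy to make concrete. Below is a minimal Python sketch, my own illustration rather than anything from Seife’s piece: for any length n there are 2^n distinct n-bit files but only 2^n - 1 bit strings that are strictly shorter, so a “compressor” that shortens every file must send at least two different inputs to the same output and therefore cannot be reversed without losing data.

from itertools import product

def bit_strings_up_to(max_len):
    # Every bit string of length 0 through max_len, as tuples of 0s and 1s.
    strings = []
    for length in range(max_len + 1):
        strings.extend(product((0, 1), repeat=length))
    return strings

n = 3
inputs = [s for s in bit_strings_up_to(n) if len(s) == n]  # files of exactly n bits
shorter = bit_strings_up_to(n - 1)                         # every strictly shorter string

print(len(inputs), "distinct 3-bit files")        # 8
print(len(shorter), "strictly shorter strings")   # 7, including the empty string
# Eight inputs cannot map one-to-one onto seven outputs: by the pigeonhole
# principle at least two files must collide, so the scheme cannot be
# decompressed losslessly. The same count works for any n: 2**n inputs
# versus 2**n - 1 strictly shorter strings.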

 

Watson and Crick Explain How DNA Carries Genetic Information
Gary Klein, Cognitive Psychologist; Author, Sources of Power; Streetlights and Shadows: Searching for Keys to Adaptive Decision Making

In 1953, when James Watson pushed around some two-dimensional cut-outs and was startled to find that an adenine-thymine pair had an isomorphic shape to the guanine-cytosine pair, he solved eight mysteries simultaneously. In that instant he knew the structure of DNA: a helix. He knew how many strands: two. It was a double helix. He knew what carried the information: the nucleic acids in the gene, not the protein. He knew what maintained the attraction: hydrogen bonds. He knew the arrangement: the sugar-phosphate backbone was on the outside and the nucleic acids were on the inside. He knew how the strands match: through the base pairs. He knew the orientation: the two complementary chains ran in opposite directions. And he knew how genes replicated: through a zipper-like process.

The discovery that Watson and Crick made is truly impressive, but I am also interested in what we can learn from the process by which they arrived at their discovery. On the surface, the Watson-Crick story fits in with five popular claims about innovation, as presented below. However, the actual story of their collaboration is more nuanced than these popular claims suggest.

It is important to have clear research goals. Watson and Crick had a clear goal, to describe the structure of DNA, and they succeeded.

But only the first two of their eight discoveries had to do with this goal. The others, arguably the most significant, were unexpected byproducts.

Tear-Jerker Dissected

Set aside the fact that having heard Adele’s song “Someone Like You” so often may make you want to cry just to escape it; science has now found an answer to why the tear-jerker makes you sob.

[div class=attrib]From the Wall Street Journal:[end-div]

On Sunday night [February 12, 2012], the British singer-songwriter Adele is expected to sweep the Grammys. Three of her six nominations are for her rollicking hit “Rolling in the Deep.” But it’s her ballad “Someone Like You” that has risen to near-iconic status recently, due in large part to its uncanny power to elicit tears and chills from listeners. The song is so famously sob-inducing that “Saturday Night Live” recently ran a skit in which a group of co-workers play the tune so they can all have a good cry together.

What explains the magic of Adele’s song? Though personal experience and culture play into individual reactions, researchers have found that certain features of music are consistently associated with producing strong emotions in listeners. Combined with heartfelt lyrics and a powerhouse voice, these structures can send reward signals to our brains that rival any other pleasure.

Twenty years ago, the British psychologist John Sloboda conducted a simple experiment. He asked music lovers to identify passages of songs that reliably set off a physical reaction, such as tears or goose bumps. Participants identified 20 tear-triggering passages, and when Dr. Sloboda analyzed their properties, a trend emerged: 18 contained a musical device called an “appoggiatura.”

An appoggiatura is a type of ornamental note that clashes with the melody just enough to create a dissonant sound. “This generates tension in the listener,” said Martin Guhn, a psychologist at the University of British Columbia who co-wrote a 2007 study on the subject. “When the notes return to the anticipated melody, the tension resolves, and it feels good.”

Chills often descend on listeners at these moments of resolution. When several appoggiaturas occur next to each other in a melody, it generates a cycle of tension and release. This provokes an even stronger reaction, and that is when the tears start to flow.

[div class=attrib]Read the entire sob story here.[end-div]

[div class=attrib]Image of Adele. Courtesy of The Wall Street Journal (illustration); Associated Press (photo); Universal Music Publishing (score).[end-div]

Yawning and Empathy

[div class=attrib]From Scientific American:[end-div]

You can tell a lot about a person from their body. And I don’t just mean how many hours they spend at the gym, or how easy it is for them to sweet-talk their way out of speeding tickets. For the past several decades researchers have been studying the ways in which the body reveals properties of the mind. An important subset of this work has taken this idea a step further: do the ways our bodies relate to one another tell us about the ways in which our minds relate to one another? Consider behavioral mimicry. Many studies have found that we quite readily mimic the nonverbal behavior of those with whom we interact. Furthermore, the degree to which we mimic others is predicted both by our personality traits and by our relationship to those around us. In short, the more empathetic we are, the more we mimic, and the more we like the people we’re interacting with, the more we mimic. The relationship between our bodies reveals something about the relationship between our minds.

The bulk of this research has made use of clever experimental manipulations involving research assistant actors. The actor crosses his legs and then waits to see if the participant crosses his legs, too. If so, we’ve found mimicry, and can now compare the presence of mimicry with self-reports of, say, liking and interpersonal closeness to see if there is a relationship. More naturalistic evidence for this phenomenon has been much harder to come by. That is, to what extent do we see this kind of nonverbal back and forth in the real world and to what extent does it reveal the same properties of minds that seem to hold true in the lab?

A recent study conducted by Ivan Norscia and Elisabetta Palagi and published in the journal PLoS ONE has found such evidence in the unlikeliest of places: yawns. More specifically, yawn contagion, or that annoyingly inevitable phenomenon that follows seeing, hearing (and even reading) about another yawn. You’ve certainly experienced this, but perhaps you have not considered what it might reveal to others (beyond a lack of sleep or your interest level in their conversation). Past work has demonstrated that, similar to behavioral mimicry, contagious yawners tend to be higher in dispositional empathy. That is, they tend to be the type of people who are better at, and more interested in, understanding other people’s internal states. Not only that, but contagious yawning seems to emerge in children at the same time that they develop the cognitive capacities involved in empathizing with others. And children who lack this capacity, such as those with autism, also show deficits in their ability to catch others’ yawns. In short, the link between yawning and empathizing appears strong.

Given that regions of the brain involved in empathizing with others can be influenced by the degree of psychological closeness to those others, Norscia and Palagi wanted to know whether contagious yawning might also reveal information about how we relate to those around us. Specifically, are we more likely to catch the yawns of people to whom we are emotionally closer? Can we deduce something about the quality of the relationships between individuals based solely on their pattern of yawning?  Yawning might tell us the degree to which we empathize with, and by extension care about, the people around us.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Alex Gumerov/iStock / Scientific American.[end-div]

L’Entente Cordiale: Parenting the French Way

French children, it seems, unlike their cousins in the United States, don’t throw temper tantrums; they sit patiently at mealtimes, defer to their parents, eat all their vegetables, respect adults, and are generally happy. Why is this, and should American parents ditch the latest pop psychology handbooks for parenting lessons from La Belle France?

[div class=attrib]From the Wall Street Journal:[end-div]

When my daughter was 18 months old, my husband and I decided to take her on a little summer holiday. We picked a coastal town that’s a few hours by train from Paris, where we were living (I’m American, he’s British), and booked a hotel room with a crib. Bean, as we call her, was our only child at this point, so forgive us for thinking: How hard could it be?

We ate breakfast at the hotel, but we had to eat lunch and dinner at the little seafood restaurants around the old port. We quickly discovered that having two restaurant meals a day with a toddler deserved to be its own circle of hell.

Bean would take a brief interest in the food, but within a few minutes she was spilling salt shakers and tearing apart sugar packets. Then she demanded to be sprung from her high chair so she could dash around the restaurant and bolt dangerously toward the docks.

Our strategy was to finish the meal quickly. We ordered while being seated, then begged the server to rush out some bread and bring us our appetizers and main courses at the same time. While my husband took a few bites of fish, I made sure that Bean didn’t get kicked by a waiter or lost at sea. Then we switched. We left enormous, apologetic tips to compensate for the arc of torn napkins and calamari around our table.

After a few more harrowing restaurant visits, I started noticing that the French families around us didn’t look like they were sharing our mealtime agony. Weirdly, they looked like they were on vacation. French toddlers were sitting contentedly in their high chairs, waiting for their food, or eating fish and even vegetables. There was no shrieking or whining. And there was no debris around their tables.

Though by that time I’d lived in France for a few years, I couldn’t explain this. And once I started thinking about French parenting, I realized it wasn’t just mealtime that was different. I suddenly had lots of questions. Why was it, for example, that in the hundreds of hours I’d clocked at French playgrounds, I’d never seen a child (except my own) throw a temper tantrum? Why didn’t my French friends ever need to rush off the phone because their kids were demanding something? Why hadn’t their living rooms been taken over by teepees and toy kitchens, the way ours had?

Soon it became clear to me that quietly and en masse, French parents were achieving outcomes that created a whole different atmosphere for family life. When American families visited our home, the parents usually spent much of the visit refereeing their kids’ spats, helping their toddlers do laps around the kitchen island, or getting down on the floor to build Lego villages. When French friends visited, by contrast, the grownups had coffee and the children played happily by themselves.

By the end of our ruined beach holiday, I decided to figure out what French parents were doing differently. Why didn’t French children throw food? And why weren’t their parents shouting? Could I change my wiring and get the same results with my own offspring?

Driven partly by maternal desperation, I have spent the last several years investigating French parenting. And now, with Bean 6 years old and twins who are 3, I can tell you this: The French aren’t perfect, but they have some parenting secrets that really do work.

I first realized I was on to something when I discovered a 2009 study, led by economists at Princeton, comparing the child-care experiences of similarly situated mothers in Columbus, Ohio, and Rennes, France. The researchers found that American moms considered it more than twice as unpleasant to deal with their kids. In a different study by the same economists, working mothers in Texas said that even housework was more pleasant than child care.

[div class=attrib]Read the entire article here. This is adapted from “Bringing Up Bébé: One American Mother Discovers the Wisdom of French Parenting,” to be published February 7, 2012 by the Penguin Press.[end-div]

[div class=attrib]Image: That’s the way to do it … a young boy at the Côte d’Or restaurant, Saulieu. Courtesy of Owen Franken/Corbis / Guardian [end-div]

Do We Need Philosophy Outside of the Ivory Tower?

In her song “What I Am”, Edie Brickell reminds us that philosophy is “the talk on a cereal box” and “a walk on the slippery rocks”.

Philosopher Gary Gutting makes the case that the discipline is more important than ever, and yes, it belongs in the mainstream consciousness, and not just within the confines of academia.

[div class=attrib]From the New York Times:[end-div]

Almost every article that appears in The Stone provokes some comments from readers challenging the very idea that philosophy has anything relevant to say to non-philosophers.  There are, in particular, complaints that philosophy is an irrelevant “ivory-tower” exercise, useless to any except those interested in logic-chopping for its own sake.

There is an important conception of philosophy that falls to this criticism.  Associated especially with earlier modern philosophers, particularly René Descartes, this conception sees philosophy as the essential foundation of the beliefs that guide our everyday life.  For example, I act as though there is a material world and other people who experience it as I do.   But how do I know that any of this is true?  Couldn’t I just be dreaming of a world outside my thoughts?  And, since (at best) I see only other human bodies, what reason do I have to think that there are any minds connected to those bodies?  To answer these questions, it would seem that I need rigorous philosophical arguments for my existence and the existence of other thinking humans.

Of course, I don’t actually need any such arguments, if only because I have no practical alternative to believing that I and other people exist.  As soon as we stop thinking weird philosophical thoughts, we immediately go back to believing what skeptical arguments seem to call into question.  And rightly so, since, as David Hume pointed out, we are human beings before we are philosophers.

But what Hume and, by our day, virtually all philosophers are rejecting is only what I’m calling the foundationalist conception of philosophy. Rejecting foundationalism means accepting that we have every right to hold basic beliefs that are not legitimated by philosophical reflection.  More recently, philosophers as different as Richard Rorty and Alvin Plantinga have cogently argued that such basic beliefs include not only the “Humean” beliefs that no one can do without, but also substantive beliefs on controversial questions of ethics, politics and religion.  Rorty, for example, maintained that the basic principles of liberal democracy require no philosophical grounding (“the priority of democracy over philosophy”).

If you think that the only possible “use” of philosophy would be to provide a foundation for beliefs that need no foundation, then the conclusion that philosophy is of little importance for everyday life follows immediately.  But there are other ways that philosophy can be of practical significance.

Even though basic beliefs on ethics, politics and religion do not require prior philosophical justification, they do need what we might call “intellectual maintenance,” which itself typically involves philosophical thinking.  Religious believers, for example, are frequently troubled by the existence of horrendous evils in a world they hold was created by an all-good God.  Some of their trouble may be emotional, requiring pastoral guidance.  But religious commitment need not exclude a commitment to coherent thought. For instance, often enough believers want to know if their belief in God makes sense given the reality of evil.  The philosophy of religion is full of discussions relevant to this question.  Similarly, you may be an atheist because you think all arguments for God’s existence are obviously fallacious. But if you encounter, say, a sophisticated version of the cosmological argument, or the design argument from fine-tuning, you may well need a clever philosopher to see if there’s anything wrong with it.

[div class=attrib]Read the entire article here.[end-div]

Forget the Groupthink: Rise of the Introvert

Author Susan Cain discusses her intriguing book, “Quiet: The Power of Introverts”, in an interview with Gareth Cook over at Mind Matters / Scientific American.

She shows us how social and business interactions and group-driven processes, often led and coordinated by extroverts, may not be the most effective way for introverts to shine creatively.

[div class=attrib]From Mind Matters:[end-div]

Cook: This may be a stupid question, but how do you define an introvert? How can somebody tell whether they are truly introverted or extroverted?

Cain: Not a stupid question at all! Introverts prefer quiet, minimally stimulating environments, while extroverts need higher levels of stimulation to feel their best. Stimulation comes in all forms – social stimulation, but also lights, noise, and so on. Introverts even salivate more than extroverts do if you place a drop of lemon juice on their tongues! So an introvert is more likely to enjoy a quiet glass of wine with a close friend than a loud, raucous party full of strangers.

It’s also important to understand that introversion is different from shyness. Shyness is the fear of negative judgment, while introversion is simply the preference for less stimulation. Shyness is inherently uncomfortable; introversion is not. The traits do overlap, though psychologists debate to what degree.

Cook: You argue that our culture has an extroversion bias. Can you explain what you mean?

Cain: In our society, the ideal self is bold, gregarious, and comfortable in the spotlight. We like to think that we value individuality, but mostly we admire the type of individual who’s comfortable “putting himself out there.” Our schools, workplaces, and religious institutions are designed for extroverts. Introverts are to extroverts what American women were to men in the 1950s — second-class citizens with gigantic amounts of untapped talent.

In my book, I travel the country – from a Tony Robbins seminar to Harvard Business School to Rick Warren’s powerful Saddleback Church – shining a light on the bias against introversion. One of the most poignant moments was when an evangelical pastor I met at Saddleback confided his shame that “God is not pleased” with him because he likes spending time alone.

Cook: How does this cultural inclination affect introverts?

Cain: Many introverts feel there’s something wrong with them, and try to pass as extroverts. But whenever you try to pass as something you’re not, you lose a part of yourself along the way. You especially lose a sense of how to spend your time. Introverts are constantly going to parties and such when they’d really prefer to be home reading, studying, inventing, meditating, designing, thinking, cooking…or any number of other quiet and worthwhile activities.

According to the latest research, one third to one half of us are introverts – that’s one out of every two or three people you know. But you’d never guess that, right? That’s because introverts learn from an early age to act like pretend-extroverts.

[div class=attrib]Read the entire article here.[end-div]

Self-Esteem and Designer Goods

[div class=attrib]From Scientific American:[end-div]

Sellers have long charged a premium for objects that confer some kind of social status, even if they offer few, if any, functional benefits over cheaper products. Designer sunglasses, $200,000 Swiss watches, and many high-end cars often seem to fall into this category. If a marketer can make a mundane item seem like a status symbol—maybe by wrapping it in a fancy package or associating it with wealth, success or beauty—they can charge more for it.

Although this practice may seem like a way to trick consumers out of their hard-earned cash, studies show that people do reap real psychological benefits from the purchase of high status items. Still, some people may gain more than others do, and studies also suggest that buying fancy stuff for yourself is unlikely to be the best way to boost your happiness or self-esteem.

In 2008, two research teams demonstrated that people process social values in the brain’s reward center: the striatum, which also responds to monetary gains. That these two values share a cerebral home suggests we may weigh our reputation in cash terms. Whether we like it or not, attaching a monetary value to social status makes good scientific sense.

Much of what revs up this reward center—food and recreational drugs, for example—is associated with a temporary rush of pleasure or good feeling, rather than long-lasting satisfaction. But when we literally pay for that good feeling, by buying a high-status car or watch, say, the effect may last long enough to unleash profitable behaviors. In a study published last year, researchers at National Sun Yat-Sen University in Taiwan found that the mere use of brand-name products seemed, in one case, to make people feel they deserved higher salaries and, in the other, to make them feel they would be more attractive to a potential date, reports Roger Dooley in his Neuromarketing blog. Thus, even if the boost of good feeling—and self-worth—is short-lived, it might spawn actions that yield lasting benefits.

Other data suggest that owning fancy things might have more direct psychological benefits. In a study published in 2010, psychologist Ed Diener at the University of Illinois and his colleagues found that standard of living, as measured by household income and ownership of luxury goods, predicted a person’s overall satisfaction with life—although it did not seem to enhance positive emotions. That rush of pleasure you get from the purchase probably does fade, but a type of self-esteem effect seems to last.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image of luxury goods. Courtesy of Google search.[end-div]

Political and Social Stability and God

theDiagonal has carried several recent articles (here and here) showing that, particularly in the United States, atheists are lumped into the same category as serial killers and child molesters. Why are atheists so reviled?

A study by Will Gervais and Ara Norenzayan at the University of British Columbia shows that it boils down to trust. Simply put, we are more likely to find someone trustworthy if we believe that person thinks God is watching over them.

Interestingly, their research also showed that atheists are found in greater numbers in populations governed by stable governments with a broad social safety net. Political instability, it seems, drives more citizens to believe in God.

[div class=attrib]From Scientific American:[end-div]

Atheists are one of the most disliked groups in America. Only 45 percent of Americans say they would vote for a qualified atheist presidential candidate, and atheists are rated as the least desirable group for a potential son-in-law or daughter-in-law to belong to. Will Gervais at the University of British Columbia recently published a set of studies looking at why atheists are so disliked. His conclusion: It comes down to trust.

Gervais and his colleagues presented participants with a story about a person who accidentally hits a parked car and then fails to leave behind valid insurance information for the other driver. Participants were asked to choose the probability that the person in question was a Christian, a Muslim, a rapist, or an atheist. They thought it equally probable the culprit was an atheist or a rapist, and unlikely the person was a Muslim or Christian. In a different study, Gervais looked at how atheism influences people’s hiring decisions. People were asked to choose between an atheist or a religious candidate for a job requiring either a high or low degree of trust. For the high-trust job of daycare worker, people were more likely to prefer the religious candidate. For the job of waitress, which requires less trust, the atheists fared much better.

It wasn’t just the highly religious participants who expressed a distrust of atheists. People identifying themselves as having no religious affiliation held similar opinions. Gervais and his colleagues discovered that people distrust atheists because of the belief that people behave better when they think that God is watching over them. This belief may have some truth to it. Gervais and his colleague Ara Norenzayan have found that reminding people about God’s presence has the same effect as telling people they are being watched by others: it increases their feelings of self-consciousness and leads them to behave in more socially acceptable ways.

When we know that somebody believes in the possibility of divine punishment, we seem to assume they are less likely to do something unethical. Based on this logic, Gervais and Norenzayan hypothesized that reminding people about the existence of secular authority figures, such as policemen and judges, might alleviate people’s prejudice towards atheists. In one study, they had people watch either a travel video or a video of a police chief giving an end-of-the-year report. They then asked participants how much they agreed with certain statements about atheists (e.g., “I would be uncomfortable with an atheist teaching my child.”) In addition, they measured participants’ prejudice towards other groups, including Muslims and Jewish people. Their results showed that viewing the video of the police chief resulted in less distrust towards atheists. However, it had no effect on people’s prejudice towards other groups. From a psychological standpoint, God and secular authority figures may be somewhat interchangeable. The existence of either helps us feel more trusting of others.

Gervais and Norenzayan’s findings may shed light on an interesting puzzle: why acceptance towards atheism has grown rapidly in some countries but not others. In many Scandinavian countries, including Norway and Sweden, the number of people who report believing in God has reached an all-time low. This may have something to do with the way these countries have established governments that guarantee a high level of social security for all of their citizens.  Aaron Kay and his colleagues ran a study in Canada which found that political insecurity may push us towards believing in God. They gave participants two versions of a fictitious news story: one describing Canada’s current political situation as stable, the other describing it as potentially unstable. After reading one of the two articles, people’s beliefs in God were measured. People who read the article describing the government as potentially unstable were more likely to agree that God, or some other type of nonhuman entity, is in control of the universe. A common belief in the divine may help people feel more secure. Yet when security is achieved by more secular means, it may remove some of the draw of faith.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: In God We Trust. Courtesy of the Houston Chronicle.[end-div]