Which Couch, the Blue or White? Stubbornness and Social Pressure

Counterintuitive results show that we are more likely to resist changing our minds when more people tell us we are wrong. A team of researchers from HP’s Social Computing Research Group found that humans are more likely to change their minds when fewer, rather than more, people disagree with them.

[div class=attrib]From HP:[end-div]

The research has practical applications for businesses, especially in marketing, suggests co-author Bernardo Huberman,  Senior HP Fellow and director of HP’s Social Computing Research Group.

“What this implies,” he says, “is that rather than overwhelming consumers with strident messages about an alternative product or service, in social media, gentle reporting of a few people having chosen that product or service can be more persuasive.”

The experiment – devised by Huberman along with Haiyi Zhu, an HP Labs summer intern from Carnegie Mellon University, and Yarun Luon of HP Labs – reveals several other factors that determine whether choices can be reversed through social influence, too. It’s the latest product of HP Labs’ pioneering program in social computing, which is dedicated to creating software and algorithms that provide meaningful context to huge sets of unstructured data.

Study results: the power of opinion
Opinions and product ratings are everywhere online. But when do they actually influence our own choices?

To find out, the HP team asked several hundred people to make a series of choices between two different pieces of furniture.  After varying amounts of time, they were asked to choose again between the same items, but this time they were told that a certain number of other people had preferred the opposite item.  (Separately, the experiment also asked subjects to choose between two different baby pictures, to control for variance in subject matter).

Analysis of the resulting choices showed that receiving a small amount of social pressure to reverse one’s opinion (by being told that just a few people had chosen differently) was more likely to produce a reversed vote than when the pressure felt was much greater (i.e. where an overwhelming number of people were shown as having made a different choice).

The team also discovered:

– People were more likely to be influenced if they weren’t prompted to change their mind immediately after they had expressed their original preference.
– The more time that people spent on their choice, the more likely they were to reverse that choice and conform to the opinion of others later on.

[div class=attrib]More of this fascinating article here.[end-div]

Complex Decision To Make? Go With the Gut

Over the last couple of years a number of researchers have upended conventional wisdom by finding that complex decisions, that is, those involving many variables, are better “made” through our emotional system. This flies in the face of the commonly held belief that complexity is best handled by our rational side.

[div class=attrib]Jonah Lehrer over at the Frontal Cortex brings us up to date on current thinking.[end-div]

We live in a world filled with difficult decisions. In fact, we’ve managed to turn even trivial choices – say, picking a toothpaste – into a tortured mental task, as the typical supermarket has more than 200 different dental cleaning options. Should I choose a toothpaste based on fluoride content? Do I need a whitener in my toothpaste? Is Crest different than Colgate? The end result is that the banal selection becomes cognitively demanding, as I have to assess dozens of alternatives and take an array of variables into account. And it’s not just toothpaste: The same thing has happened to nearly every consumption decision, from bottled water to blue jeans to stocks. There are no simple choices left – capitalism makes everything complicated.

How should we make all these hard choices? How does one navigate a world of seemingly infinite alternatives? For thousands of years, the answer has seemed obvious: when faced with a difficult dilemma, we should carefully assess our options and spend a few moments consciously deliberating the information. Then, we should choose the toothpaste that best fits our preferences. This is how we maximize utility and get the most bang for the buck. We are rational agents – we should make decisions in a rational manner.

But what if rationality backfires? What if we make better decisions when we trust our gut instincts? While there is an extensive literature on the potential wisdom of human emotion, it’s only in the last few years that researchers have demonstrated that the emotional system (aka Type 1 thinking) might excel at complex decisions, or those involving lots of variables. If true, this would suggest that the unconscious is better suited for difficult cognitive tasks than the conscious brain, that the very thought process we’ve long disregarded as irrational and impulsive might actually be “smarter” than reasoned deliberation. This is largely because the unconscious is able to handle a surfeit of information, digesting the facts without getting overwhelmed. (Human reason, in contrast, has a very strict bottleneck and can only process about four bits of data at any given moment.) When confused in the toothpaste aisle, bewildered by all the different options, we should go with the product that feels the best.

The most widely cited demonstration of this theory is a 2006 Science paper led by Ap Dijksterhuis. (I wrote about the research in How We Decide.) The experiment went like this: Dijksterhuis got together a group of Dutch car shoppers and gave them descriptions of four different used cars. Each of the cars was rated in four different categories, for a total of sixteen pieces of information. Car number 1, for example, was described as getting good mileage, but had a shoddy transmission and poor sound system. Car number 2 handled poorly, but had lots of legroom. Dijksterhuis designed the experiment so that one car was objectively ideal, with “predominantly positive aspects”. After showing people these car ratings, Dijksterhuis then gave them a few minutes to consciously contemplate their decision. In this “easy” situation, more than fifty percent of the subjects ended up choosing the best car.

[div class=attrib]Read more of the article and Ap Dijksterhuis’ classic experiment here.[end-div]

[div class=attrib]Image courtesy of CustomerSpeak.[end-div]

Movies in the Mind: A Great Leap in Brain Imaging

A common premise of “mad scientists” in science fiction movies: a computer reconstructs video images from someone’s thoughts via a brain-scanning device. Yet this is no longer the realm of fantasy. Researchers from the University of California at Berkeley have successfully decoded and reconstructed people’s dynamic visual experiences – in this case, watching Hollywood movie trailers – using functional Magnetic Resonance Imaging (fMRI) and computer simulation models.

Watch the stunning video clip below showing side-by-side movies of what a volunteer was actually watching and a computer reconstruction of fMRI data from the same volunteer.

[youtube]nsjDnYxJ0bo[/youtube]

The results are a rudimentary first step, with the technology requiring decades of refinement before the fiction of movies, such as Brainstorm, becomes a closer reality. However, this groundbreaking research nonetheless paves the way to a future of tremendous promise in brain science. Imagine the ability to reproduce and share images of our dreams and memories, or peering into the brain of a comatose patient.

[div class=attrib]More from the UC-Berkeley article here.[end-div]

How Will You Die?

Bad news and good news. First, the bad news. If you’re between 45 and 54 years of age, your cause of death will most likely be heart disease, that is, if you’re a male. If you’re a female, on the other hand, you’re more likely to fall prey to cancer. And, interestingly, you are about five times more likely to die falling down stairs than from (accidental) electrocution. Now the good news. While the data may give us a probabilistic notion of how we may perish, no one (yet) knows when.

More vital statistics courtesy of this macabre infographic, derived from data from the National Center for Health Statistics and the National Safety Council.

Chance as a Subjective or Objective Measure

[div class=attrib]From Rationally Speaking:[end-div]

Stop me if you’ve heard this before: suppose I flip a coin, right now. I am not giving you any other information. What odds (or probability, if you prefer) do you assign that it will come up heads?

If you would happily say “Even” or “1 to 1” or “Fifty-fifty” or “probability 50%” — and you’re clear on WHY you would say this — then this post is not aimed at you, although it may pleasantly confirm your preexisting opinions as a Bayesian on probability. Bayesians, broadly, consider probability to be a measure of their state of knowledge about some proposition, so that different people with different knowledge may correctly quote different probabilities for the same proposition.

If you would say something along the lines of “The question is meaningless; probability only has meaning as the many-trials limit of frequency in a random experiment,” or perhaps “50%, but only given that a fair coin and fair flipping procedure is being used,” this post is aimed at you. I intend to try to talk you out of your Frequentist view; the view that probability exists out there and is an objective property of certain physical systems, which we humans, merely fallibly, measure.

My broader aim is therefore to argue that “chance” is always and everywhere subjective — a result of the limitations of minds — rather than objective in the sense of actually existing in the outside world.
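
A brief aside from us rather than from the article: the Bayesian reading of probability as a state of knowledge is easy to make concrete with a toy calculation. The sketch below is a minimal illustration of Bayes’ rule applied to the coin example; the discrete set of candidate biases and the particular sequence of flips are our own hypothetical choices, not anything proposed by the author.

```python
# Minimal Bayesian toy: belief about a coin's bias as a state of knowledge.
# We start with a discrete prior over candidate biases and update it with
# Bayes' rule after each observed flip. (Illustrative only; not from the article.)

biases = [0.1, 0.3, 0.5, 0.7, 0.9]            # candidate values of P(heads)
prior = {b: 1 / len(biases) for b in biases}  # uniform prior: no information yet

def update(belief, outcome):
    """Apply Bayes' rule for a single flip; outcome is 'H' or 'T'."""
    posterior = {}
    for b, p in belief.items():
        likelihood = b if outcome == "H" else (1 - b)
        posterior[b] = p * likelihood
    total = sum(posterior.values())
    return {b: p / total for b, p in posterior.items()}

belief = prior
print("P(heads) before any data:",
      sum(b * p for b, p in belief.items()))   # 0.5, purely from ignorance

for outcome in "HHTHHH":                       # observe six flips
    belief = update(belief, outcome)

print("P(heads) after seeing 'HHTHHH':",
      round(sum(b * p for b, p in belief.items()), 3))
```

Two observers with different priors, or different observed flips, would quote different probabilities for the same next toss, which is precisely the Bayesian point being made.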

[div class=attrib]Much more of this article here.[end-div]

[div class=attrib]Image courtesy of Wikipedia.[end-div]

MondayPoem: When I have Fears That I May Cease to Be

This week’s poem, courtesy of the great Romantic John Keats, delves into the subjects of time and the brevity of our stay on this Earth. Although Keats was frequently scorned by critics during his lifetime, death transformed him into one of England’s most loved poets.

By John Keats:

– When I have Fears That I May Cease to Be

When I have fears that I may cease to be
Before my pen has gleaned my teeming brain,
Before high-pilèd books, in charactery,
Hold like rich garners the full ripened grain;
When I behold, upon the night’s starred face,
Huge cloudy symbols of a high romance,
And think that I may never live to trace
Their shadows with the magic hand of chance;
And when I feel, fair creature of an hour,
That I shall never look upon thee more,
Never have relish in the fairy power
Of unreflecting love—then on the shore
Of the wide world I stand alone, and think
Till love and fame to nothingness do sink.

 

[div class=attrib]Portrait of John Keats by William Hilton. National Portrait Gallery, London, courtesy of Wikipedia.[end-div]

Faster Than Light Travel

The world of particle physics is agog with recent news of an experiment that shows a very unexpected result – sub-atomic particles traveling faster than the speed of light. If verified and independently replicated, the results would violate one of the universe’s fundamental properties described by Einstein in the Special Theory of Relativity. The speed of light — 186,282 miles per second (299,792 kilometers per second) — has long been considered an absolute cosmic speed limit.

Stranger still, over the last couple of days news of this anomalous result has even been broadcast on many cable news shows.

The experiment, known as OPERA, is a collaboration between France’s National Institute for Nuclear and Particle Physics Research and Italy’s Gran Sasso National Laboratory. Over the course of three years, scientists fired a neutrino beam 454 miles (730 kilometers) underground from Geneva to a receiver in Italy. Their measurements show that neutrinos arrived an average of 60 nanoseconds sooner than light would have done. This doesn’t seem like a great amount; after all, it is only 60 billionths of a second. However, the small difference could nonetheless undermine a hundred years of physics.
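
To put those numbers in perspective, here is a quick back-of-the-envelope check (our arithmetic, not OPERA’s analysis) of how large a 60-nanosecond head start is relative to the total flight time over the quoted 730-kilometer baseline:

```python
# Back-of-the-envelope check of the figures quoted above.
c = 299_792_458          # speed of light, in metres per second
distance = 730_000       # Geneva to Gran Sasso baseline, roughly 730 km in metres
early = 60e-9            # neutrinos reported arriving about 60 ns early

light_time = distance / c                  # light's travel time over the baseline
fractional_excess = early / light_time     # how much "faster than light"

print(f"Light travel time: {light_time * 1e3:.3f} ms")        # about 2.4 ms
print(f"Fractional speed excess: {fractional_excess:.1e}")    # about 2.5e-05
```

So the claim amounts to neutrinos beating light by roughly 25 parts per million, a tiny margin, which is why the measurement hinges entirely on nanosecond-level timing and precise distance calibration.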

Understandably, most physicists remain skeptical of the result until further independent experiments confirm or refute the measurements. However, all seem to agree that if the result is confirmed, this would be a monumental finding that would likely reshape modern physics and our understanding of the universe.

[div class=attrib]More on this intriguing story here, courtesy of Ars Technica, which also offers a detailed explanation of several possible sources of error that may have contributed to the faster-than-light measurements.[end-div]

Eurovision

If you grew up in Europe or have spent at least six months there over the last 50 years, you’ll have collided with the Eurovision Song Contest.

A quintessentially European invention, Eurovision, as it is commonly known, has grown from a handful of countries to embrace 43 nations across Europe in 2012. Countries compete for the prize of best song and the honor of hosting the contest the following year. While contestants and songs are not usually guaranteed long-standing commercial success, the winner usually does claim 15 minutes or so in the spotlight and at least a one-hit wonder. A notable exception was the Swedish group ABBA, which went on to generation-spanning superstardom.

Frank Jacobs over at Strange Maps offers his cartographic take on Eurovision.

[div class=attrib]From Strange Maps / Big Think:[end-div]

The Eurovision Song Contest is a resounding success in at least one respect. Set up as a laboratory of European harmony – musically, audiovisually and politically – its first edition [1] featured a mere 7 participating countries, all Western European. The 57th edition, next May in Azerbaijan, will have 43 countries from all over the continent vying for the top prize, and the honour to host the 2013 edition of the event in their capital city.

Mission accomplished, then. But a chorus of critics – swelling, as the turn of phrase suggests [2] – finds the annual event increasingly tacky and irrelevant. The winner is determined by a tally of national votes, which have less to do with the quality of the songs than with the degree of friendliness between the participating countries.

[div class=attrib]More of the article here.[end-div]

London’s Other River

You will have heard of the River Thames, the famous swathe of grey that cuts a watery path through London.  You may even have heard of several of London’s prominent canals, such as the Grand Union Canal and Regent’s Canal. But, you probably will not have heard of the mysterious River Fleet that meanders through eerie tunnels beneath the city.

The Fleet and its Victorian tunnels are available for exploration, but are not for the faint of heart or sensitive of nose.

For more stunning subterranean images follow the full article here.

[div class=attrib]Images courtesy of Environmental Graffiti.[end-div]

The Sins of Isaac Newton

Aside from founding classical mechanics (think universal gravitation and the laws of motion), laying the building blocks of calculus, and inventing the reflecting telescope, Isaac Newton made time for spiritual pursuits. In fact, Newton was a highly religious individual (though a somewhat unorthodox Christian).

So, although Newton is best remembered for his monumental work, Philosophiæ Naturalis Principia Mathematica, he kept a lesser-known, but no less detailed, journal of his sins while a freshman at Cambridge. A list of Newton’s most “heinous” self-confessed moral failings follows below.

[div class=attrib]From io9:[end-div]

10. Making a feather while on Thy day.

Anyone remember the Little House series, where every day they worked their prairie-wind-chapped asses off and risked getting bitten by badgers and nearly lost eyes to exploding potatoes (all true), but never complained about anything until they hit Sunday and literally had to do nothing all day? That was hundreds of years after Newton. And Newton was even more bored than the Little House people, although he was sorry about it later. He confesses everything from making a mousetrap on Sunday, to playing chimes, to helping a roommate with a school project, to making pies, to ‘squirting water’ on the Sabbath.

9. Having uncleane thoughts words and actions and dreamese.

Well, to be fair, he was only a boy at this time. He may have had all the unclean thoughts in the world, but Newton, on his death bed, is well known for saying he is proudest of dying a virgin. And this is from the guy who invented the Laws of Motion.

8. Robbing my mothers box of plums and sugar.

Clearly he needed to compensate for lack of carnal pleasure with some other kind of physical comfort. It seems that Newton had a sweet tooth. There’s this ‘robbery.’ There’s the aforementioned pies, although they might be savory pies. And in another confession he talks about how he had ‘gluttony in his sickness.’ The guy needed to eat.

7. Using unlawful means to bring us out of distresses.

This is a strange sin because it’s so vague. Could it be that the ‘distresses’ were financial, leading to another confessed sin of ‘Striving to cheat with a brass halfe crowne’? Some biographers think that this is a sexual confession and his ‘distresses’ were carnal. Newton isn’t just saying that he used immoral means, but unlawful ones. What law did he break?

6. Using Wilford’s towel to spare my own.

Whatever else Newton was, he was a terrible roommate. Although he was a decent student, he was reputed to be bad at personal relationships with anyone, at any time. This sin, using someone’s towel, was probably more a big deal during a time when plague was running through the countryside. He also confesses to, “Denying my chamberfellow of the knowledge of him that took him for a sot.”

And his sweet tooth still reigned. Any plums anyone left out would probably be gone by the time they got back. He confessed the sin of “Stealing cherry cobs from Eduard Storer.” Just to top it off, Newton confessed to ‘peevishness’ with people over and over in his journal. He was clearly a moody little guy. No word on whether he apologized to them about it, but he apologized to God, and surely that was enough.

[div class=attrib]More of the article here.[end-div]

[div class=attrib]Image courtesy of Wikipedia.[end-div]

Why Are the French Not as Overweight as Americans?

[div class=attrib]From the New York Times:[end-div]

PARIS — You’re reminded hourly, even while walking along the slow-moving Seine or staring at sculpted marble bodies under the Louvre’s high ceilings, that the old continent is crumbling. They’re slouching toward a gerontocracy, these Europeans. Their banks are teetering. They can’t handle immigration. Greece is broke, and three other nations are not far behind. In a half-dozen languages, the papers shout: crisis!

If the euro fails, as Chancellor Angela Merkel of Germany said, then Europe fails. That means a recession here, and a likely one at home, which will be blamed on President Obama, and then Rick Perry will get elected, and the leader of the free world will be somebody who thinks the earth is only a few thousand years old.

You see where it’s all going, this endless “whither the euro” question. So, you think of something else, the Parisian way. You think of what these people can eat on a given day: pain au chocolat for breakfast, soupe à l’oignon gratinée topped by melted gruyère for lunch and foie gras for dinner, as a starter.

And then you look around: how can they live like this? Where are all the fat people? It’s a question that has long tormented visitors. These French, they eat anything they damn well please, drink like Mad Men and are healthier than most Americans. And of course, their medical care is free and universal, and considered by many to be the best in the world.

… Recent studies indicate that the French are, in fact, getting fatter — just not as much as everyone else. On average, they are where Americans were in the 1970s, when the ballooning of a nation was still in its early stages. But here’s the good news: they may have figured out some way to contain the biggest global health threat of our time, for France is now one of a handful of nations where obesity among the young has leveled off.

First, the big picture: Us. We — my fellow Americans — are off the charts on this global pathology. The latest jolt came from papers published last month in The Lancet, projecting that three-fourths of adults in the United States will be overweight or obese by 2020.

Only one state, Colorado, now has an obesity rate under 20 percent (obesity is the higher of the two body-mass indexes, the other being overweight). But that’s not good news. The average bulge of an adult Coloradan has increased 80 percent over the last 15 years. They only stand out by comparison to all other states. Colorado, the least fat state in 2011, would be the heaviest had they reported their current rate of obesity 20 years ago. That’s how much we’ve slipped.

… A study of how the French appear to have curbed childhood obesity shows the issue is not complex. Junk food vending machines were banned in schools. The young were encouraged to exercise more. And school lunches were made healthier.

… But another answer can come from self-discovery. Every kid should experience a fresh peach in August. And an American newly arrived in the City of Light should nibble at a cluster of grapes or some blood-red figs, just as the French do, with that camembert.

[div class=attrib]More from the article here.[end-div]

[div class=attrib]Obesity classification standards illustration courtesy of Wikipedia.[end-div]

Atheism: Scientific or Humanist

[div class=attrib]From The Stone forum, New York Times:[end-div]

Led by the biologist Richard Dawkins, the author of “The God Delusion,” atheism has taken on a new life in popular religious debate. Dawkins’s brand of atheism is scientific in that it views the “God hypothesis” as obviously inadequate to the known facts. In particular, he employs the facts of evolution to challenge the need to postulate God as the designer of the universe. For atheists like Dawkins, belief in God is an intellectual mistake, and honest thinkers need simply to recognize this and move on from the silliness and abuses associated with religion.

Most believers, however, do not come to religion through philosophical arguments. Rather, their belief arises from their personal experiences of a spiritual world of meaning and values, with God as its center.

In the last few years there has emerged another style of atheism that takes such experiences seriously. One of its best exponents is Philip Kitcher, a professor of philosophy at Columbia. (For a good introduction to his views, see Kitcher’s essay in “The Joy of Secularism,” perceptively discussed last month by James Wood in The New Yorker.)

Instead of focusing on the scientific inadequacy of theistic arguments, Kitcher critically examines the spiritual experiences underlying religious belief, particularly noting that they depend on specific and contingent social and cultural conditions. Your religious beliefs typically depend on the community in which you were raised or live. The spiritual experiences of people in ancient Greece, medieval Japan or 21st-century Saudi Arabia do not lead to belief in Christianity. It seems, therefore, that religious belief very likely tracks not truth but social conditioning. This “cultural relativism” argument is an old one, but Kitcher shows that it is still a serious challenge. (He is also refreshingly aware that he needs to show why a similar argument does not apply to his own position, since atheistic beliefs are themselves often a result of the community in which one lives.)

[div class=attrib]More of the article here.[end-div]

[div class=attrib]Image: Ephesians 2,12 – Greek atheos, courtesy of Wikipedia.[end-div]

MondayPoem: Mathematics Considered as a Vice

A poem by Anthony Hecht this week. On Hecht, Poetry Foundation remarks, “[o]ne of the leading voices of his generation, Anthony Hecht’s poetry is known for its masterful use of traditional forms and linguistic control.”

Following Hecht’s death in 2004 the New York Times observed:

It was Hecht’s gift to see into the darker recesses of our complex lives and conjure to his command the exact words to describe what he found there. Hecht remained skeptical about whether pain and contemplation can ultimately redeem us, yet his ravishing poems extend hope to his readers that they can.

By Anthony Hecht:

– Mathematics Considered as a Vice

I would invoke that man
Who chipped for all posterity an ass
(The one that Jesus rode)
Out of hard stone, and set its either wing
Among the wings of the most saintly clan
On Chartres Cathedral, and that it might sing
The praise to all who pass
Of its unearthly load,
Hung from its neck a harp-like instrument.
I would invoke that man
To aid my argument.

The ass smiles on us all,
Being astonished that an ass might rise
To such sure eminence
Not merely among asses but mankind,
Simpers, almost, upon the western wall
In praise of folly, who midst sow and kine,
Saw with its foolish eyes
Gold, Myrrh, and Frankincense
Enter the stable door, against all odds.
The ass smiles on us all.
Our butt at last is God’s.

That man is but an ass—
More perfectly, that ass is but a man
Who struggles to describe
Our rich, contingent and substantial world
In ideal signs: the dunged and pagan grass,
Misted in summer, or the mother-of-pearled
Home of the bachelor-clam.
A cold and toothless tribe
Has he for brothers, who would coldly think.
That man is but an ass
Who smells not his own stink.

For all his abstract style
Speaks not to our humanity, and shows
Neither the purity
Of heaven, nor the impurity beneath,
And cannot see the feasted crocodile
Ringed with St. Francis’ birds to pick its teeth,
Nor can his thought disclose
To normal intimacy,
Siamese twins, the double-beasted back,
For all his abstract style
Utters our chiefest lack.

Despite his abstract style,
Pickerel will dawdle in their summer pools
Lit by the flitterings
Of light dashing the gusty surfaces,
Or lie suspended among shades of bile
And lime in fluent shift, for all he says.
And all the grey-haired mules,
Simple and neuter things,
Will bray hosannas, blessing harp and wing.
For all his abstract style,
The ass will learn to sing.

The Teen Brain: Work In Progress or Adaptive Network?

[div class=attrib]From Wired:[end-div]

Ever since the late 1990s, when researchers discovered that the human brain takes until our mid-20s to fully develop — far longer than previously thought — the teen brain has been getting a bad rap. Teens, the emerging dominant narrative insisted, were “works in progress” whose “immature brains” left them in a state “akin to mental retardation” — all titles from prominent papers or articles about this long developmental arc.

In a National Geographic feature to be published next week, however, I highlight a different take: A growing view among researchers that this prolonged developmental arc is less a matter of delayed development than prolonged flexibility. This account of the adolescent brain — call it the “adaptive adolescent” meme rather than the “immature brain” meme — “casts the teen less as a rough work than as an exquisitely sensitive, highly adaptive creature wired almost perfectly for the job of moving from the safety of home into the complicated world outside.” The teen brain, in short, is not dysfunctional; it’s adaptive.

Carl Zimmer over at Discover gives us some further interesting insights into recent studies of teen behavior.

[div class=attrib]From Discover:[end-div]

Teenagers are a puzzle, and not just to their parents. When kids pass from childhood to adolescence their mortality rate doubles, despite the fact that teenagers are stronger and faster than children as well as more resistant to disease. Parents and scientists alike abound with explanations. It is tempting to put it down to plain stupidity: Teenagers have not yet learned how to make good choices. But that is simply not true. Psychologists have found that teenagers are about as adept as adults at recognizing the risks of dangerous behavior. Something else is at work.

Scientists are finally figuring out what that “something” is. Our brains have networks of neurons that weigh the costs and benefits of potential actions. Together these networks calculate how valuable things are and how far we’ll go to get them, making judgments in hundredths of a second, far from our conscious awareness. Recent research reveals that teen brains go awry because they weigh those consequences in peculiar ways.

… Neuroscientist B. J. Casey and her colleagues at the Sackler Institute of the Weill Cornell Medical College believe the unique way adolescents place value on things can be explained by a biological oddity. Within our reward circuitry we have two separate systems, one for calculating the value of rewards and another for assessing the risks involved in getting them. And they don’t always work together very well.

… The trouble with teens, Casey suspects, is that they fall into a neurological gap. The rush of hormones at puberty helps drive the reward-system network toward maturity, but those hormones do nothing to speed up the cognitive control network. Instead, cognitive control slowly matures through childhood, adolescence, and into early adulthood. Until it catches up, teenagers are stuck with strong responses to rewards without much of a compensating response to the associated risks.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Kitra Cahana, National Geographic.[end-div]

The Universe and Determinism

General scientific consensus suggests that our universe has no pre-defined destiny. While a number of current theories propose anything from a final Big Crunch to an accelerating expansion into cold nothingness, the future of the universe is not pre-determined. Unfortunately, our increasingly sophisticated scientific tools are still too meager to test and answer these questions definitively. So, theorists currently seem to have the upper hand. And now yet another theory turns current cosmological thinking on its head by proposing that the future is pre-destined and that it may even reach back into the past to shape the present. Confused? Read on!

[div class=attrib]From FQXi:[end-div]

The universe has a destiny—and this set fate could be reaching backwards in time and combining with influences from the past to shape the present. It’s a mind-bending claim, but some cosmologists now believe that a radical reformulation of quantum mechanics in which the future can affect the past could solve some of the universe’s biggest mysteries, including how life arose. What’s more, the researchers claim that recent lab experiments are dramatically confirming the concepts underpinning this reformulation.

Cosmologist Paul Davies, at Arizona State University in Tempe, is embarking on a project to investigate the future’s reach into the present, with the help of a $70,000 grant from the Foundational Questions Institute. It is a project that has been brewing for more than 30 years, since Davies first heard of attempts by physicist Yakir Aharonov to get to the root of some of the paradoxes of quantum mechanics. One of these is the theory’s apparent indeterminism: You cannot predict the outcome of experiments on a quantum particle precisely; perform exactly the same experiment on two identical particles and you will get two different results.

While most physicists faced with this have concluded that reality is fundamentally, deeply random, Aharonov argues that there is order hidden within the uncertainty. But to understand its source requires a leap of imagination that takes us beyond our traditional view of time and causality. In his radical reinterpretation of quantum mechanics, Aharonov argues that two seemingly identical particles behave differently under the same conditions because they are fundamentally different. We just do not appreciate this difference in the present because it can only be revealed by experiments carried out in the future.

“It’s a very, very profound idea,” says Davies. Aharonov’s take on quantum mechanics can explain all the usual results that the conventional interpretations can, but with the added bonus that it also explains away nature’s apparent indeterminism. What’s more, a theory in which the future can influence the past may have huge—and much needed—repercussions for our understanding of the universe, says Davies.

[div class=attrib]More from theSource here.[end-div]

Free Will: An Illusion?

Neuroscientists continue to find interesting experimental evidence that we do not have free will. Many philosophers continue to dispute this notion, citing inconclusive results and a lack of holistic understanding of decision-making on the part of brain scientists. An article by Kerri Smith over at Nature lays open this contentious and fascinating debate.

[div class=attrib]From Nature:[end-div]

The experiment helped to change John-Dylan Haynes’s outlook on life. In 2007, Haynes, a neuroscientist at the Bernstein Center for Computational Neuroscience in Berlin, put people into a brain scanner in which a display screen flashed a succession of random letters. He told them to press a button with either their right or left index fingers whenever they felt the urge, and to remember the letter that was showing on the screen when they made the decision. The experiment used functional magnetic resonance imaging (fMRI) to reveal brain activity in real time as the volunteers chose to use their right or left hands. The results were quite a surprise.

“The first thought we had was ‘we have to check if this is real’,” says Haynes. “We came up with more sanity checks than I’ve ever seen in any other study before.”

The conscious decision to push the button was made about a second before the actual act, but the team discovered that a pattern of brain activity seemed to predict that decision by as many as seven seconds. Long before the subjects were even aware of making a choice, it seems, their brains had already decided.

As humans, we like to think that our decisions are under our conscious control — that we have free will. Philosophers have debated that concept for centuries, and now Haynes and other experimental neuroscientists are raising a new challenge. They argue that consciousness of a decision may be a mere biochemical afterthought, with no influence whatsoever on a person’s actions. According to this logic, they say, free will is an illusion. “We feel we choose, but we don’t,” says Patrick Haggard, a neuroscientist at University College London.

You may have thought you decided whether to have tea or coffee this morning, for example, but the decision may have been made long before you were aware of it. For Haynes, this is unsettling. “I’ll be very honest, I find it very difficult to deal with this,” he says. “How can I call a will ‘mine’ if I don’t even know when it occurred and what it has decided to do?”

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Nature.[end-div]

Book Review: Are You Serious? Lee Siegel

“You cannot be serious”, goes the oft-quoted opening to a John McEnroe javelin thrown at an unsuspecting tennis umpire. This leads us to an earnest review of what it means to be serious, from Lee Siegel’s new book, “Are You Serious?” As Michael Agger points out for Slate:

We don’t know what to take seriously anymore. Is Brian Williams a serious news anchor or is he playing at being serious? How about Jon Stewart? The New York Times exudes seriousness, but the satire of The Onion can also be very serious.

Do we indeed need a how-to manual on how to exude the required seriousness in the correct circumstances? Do we need a third-party narrator to tell us when to expect seriousness or irony or serious irony? Perhaps Lee Siegel’s book can shed some light.

[div class=attrib]More from Slate’s review of Siegel’s book:[end-div]

Siegel’s business-casual jaunt through seriosity begins with the Victorian poet Matthew Arnold, who saw the decline of religious spirit and proposed the “high seriousness” of poetry and literature in its place. “Seriousness implied a trustworthy personality,” Siegel writes, “just as faith in God once implied a trustworthy soul.” The way in which Arnold connected morality to cultural refinement soothed the intellectual insecurity of Americans vis-à-vis Europe and appealed to our ethos of self-improvement. The contemporary disciples of Arnold are those friends of yours who read Ulysses along with Ulysses Annotated, actually go to art galleries, and know their way around a Ring cycle. The way they enjoy culture expresses their seriousness of purpose.

I’ve only pulled at a few of the provocative strings in Siegel’s book. His argument that Sarah Palin is someone who has signaled seriousness by being willing to humiliate herself on reality TV makes a wild sort of sense. At other times, Siegel floats some nonsense that he knows to be silly.

But I don’t want to leave you hanging without providing Siegel’s answer to the question of finding seriousness in life. He gives us his “three pillars”: attention, purpose, continuity. That could mean being a really competent lawyer. Or being so skilled at being a pilot that you land a plane on the Hudson and save everyone onboard. Or being like Socrates and drinking the hemlock to prove that you believed in your ideas. Just find the thing that makes you “fully alive” and then you’re set. Which is to say that although the cultural and political figures we should take seriously change, the prospect of becoming a serious person remains dauntingly unchanged.

[div class=attrib]More from theSource here.[end-div]

The Lanier Effect

Twenty or so years ago the economic prognosticators and technology pundits would all have had us believe that the internet would transform society; it would level the playing field; it would help the little guy compete against the corporate behemoth; it would make us all “socially” rich if not financially. Yet, the promise of those early, heady days seems remarkably narrow nowadays. What happened? Or rather, what didn’t happen?

We excerpt a lengthy interview with Jaron Lanier over at the Edge. Lanier, a pioneer in the sphere of virtual reality, offers some well-laid arguments for and against the concentration of market power enabled by information systems and the internet, though he reserves his most powerful criticism for Google, whose (in)famous corporate mantra, “Don’t be evil,” starts to look remarkably disingenuous.

[div class=attrib]From the Edge:[end-div]

I’ve focused quite a lot on how this stealthy component of computation can affect our sense of ourselves, what it is to be a person. But lately I’ve been thinking a lot about what it means to economics.

In particular, I’m interested in a pretty simple problem, but one that is devastating. In recent years, many of us have worked very hard to make the Internet grow, to become available to people, and that’s happened. It’s one of the great topics of mankind of this era.  Everyone’s into Internet things, and yet we have this huge global economic trouble. If you had talked to anyone involved in it twenty years ago, everyone would have said that the ability for people to inexpensively have access to a tremendous global computation and networking facility ought to create wealth. This ought to create wellbeing; this ought to create this incredible expansion in just people living decently, and in personal liberty. And indeed, some of that’s happened. Yet if you look at the big picture, it obviously isn’t happening enough, if it’s happening at all.

The situation reminds me a little bit of something that is deeply connected, which is the way that computer networks transformed finance. You have more and more complex financial instruments, derivatives and so forth, and high frequency trading, all these extraordinary constructions that would be inconceivable without computation and networking technology.

At the start, the idea was, “Well, this is all in the service of the greater good because we’ll manage risk so much better, and we’ll increase the intelligence with which we collectively make decisions.” Yet if you look at what happened, risk was increased instead of decreased.

… We were doing a great job through the turn of the century. In the ’80s and ’90s, one of the things I liked about being in the Silicon Valley community was that we were growing the middle class. The personal computer revolution could have easily been mostly about enterprises. It could have been about just fighting IBM and getting computers on desks in big corporations or something, instead of this notion of the consumer, ordinary person having access to a computer, of a little mom and pop shop having a computer, and owning their own information. When you own information, you have power. Information is power. The personal computer gave people their own information, and it enabled a lot of lives.

… But at any rate, the Apple idea is that instead of the personal computer model where people own their own information, and everybody can be a creator as well as a consumer, we’re moving towards this iPad, iPhone model where it’s not as adequate for media creation as the real media creation tools, and even though you can become a seller over the network, you have to pass through Apple’s gate to accept what you do, and your chances of doing well are very small, and it’s not a person to person thing, it’s a business through a hub, through Apple to others, and it doesn’t create a middle class, it creates a new kind of upper class.

Google has done something that might even be more destructive of the middle class, which is they’ve said, “Well, since Moore’s law makes computation really cheap, let’s just give away the computation, but keep the data.” And that’s a disaster.

What’s happened now is that we’ve created this new regimen where the bigger your computer servers are, the more smart mathematicians you have working for you, and the more connected you are, the more powerful and rich you are. (Unless you own an oil field, which is the old way.) I benefit from it because I’m close to the big servers, but basically wealth is measured by how close you are to one of the big servers, and the servers have started to act like private spying agencies, essentially.

With Google, or with Facebook, if they can ever figure out how to steal some of Google’s business, there’s this notion that you get all of this stuff for free, except somebody else owns the data, and they use the data to sell access to you, and the ability to manipulate you, to third parties that you don’t necessarily get to know about. The third parties tend to be kind of tawdry.

[div class=attrib]Read the entire article.[end-div]

[div class=attrib]Image courtesy of Jaron Lanier.[end-div]

Reading Between the Lines

In his book, “The Secret Life of Pronouns”, professor of psychology James Pennebaker describes how our use of words like “I”, “we”, “he”, “she” and “who” reveals a wealth of detail about ourselves including, very surprisingly, our health and social status.
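
For readers curious about the mechanics, the kind of analysis Pennebaker describes boils down to counting words in categories and comparing their rates across texts. The sketch below is our own minimal illustration, not his LIWC software, and the pronoun lists are deliberately tiny:

```python
import re
from collections import Counter

# A tiny, illustrative pronoun counter in the spirit of the analysis described
# below. This is not Pennebaker's actual program; the category lists are minimal.
CATEGORIES = {
    "first_person_singular": {"i", "me", "my", "mine", "myself"},
    "first_person_plural": {"we", "us", "our", "ours", "ourselves"},
}

def pronoun_rates(text):
    """Return each category's share of all words in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words)
    return {name: sum(counts[w] for w in members) / total
            for name, members in CATEGORIES.items()}

email = "I wondered if you could look at my draft. I think my argument is sound."
print(pronoun_rates(email))   # first-person-singular words dominate this email
```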

[div class=attrib]Excerpts from James Pennebaker’s interview with Scientific American:[end-div]

In the 1980s, my students and I discovered that if people were asked to write about emotional upheavals, their physical health improved. Apparently, putting emotional experiences into language changed the ways people thought about their upheavals. In an attempt to better understand the power of writing, we developed a computerized text analysis program to determine how language use might predict later health improvements.

Much to my surprise, I soon discovered that the ways people used pronouns in their essays predicted whose health would improve the most. Specifically, those people who benefited the most from writing changed in their pronoun use from one essay to another. Pronouns were reflecting people’s abilities to change perspective.

As I pondered these findings, I started looking at how people used pronouns in other texts — blogs, emails, speeches, class writing assignments, and natural conversation. Remarkably, how people used pronouns was correlated with almost everything I studied. For example, use of  first-person singular pronouns (I, me, my) was consistently related to gender, age, social class, honesty, status, personality, and much more.

… In my own work, we have analyzed the collected works of poets, playwrights, and novelists going back to the 1500s to see how their writing changed as they got older. We’ve compared the pronoun use of suicidal versus non-suicidal poets. Basically, poets who eventually commit suicide use I-words more than non-suicidal poets.

The analysis of language style can also serve as a psychological window into authors and their relationships. We have analyzed the poetry of Elizabeth Barrett and Robert Browning and compared it with the history of their marriage. Same thing with Ted Hughes and Sylvia Plath. Using a method we call Language Style Matching, we can isolate changes in the couples’ relationships.

… One of the most interesting results was part of a study my students and I conducted dealing with status in email correspondence. Basically, we discovered that in any interaction, the person with the higher status uses I-words less (yes, less) than people who are low in status. The effects were quite robust and, naturally, I wanted to test this on myself. I always assumed that I was a warm, egalitarian kind of guy who treated people pretty much the same.

I was the same as everyone else. When undergraduates wrote me, their emails were littered with I, me, and my. My response, although quite friendly, was remarkably detached — hardly an I-word graced the page.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Images courtesy of University of Texas at Austin.[end-div]

Once Not So Crazy Ideas About Our Sun

Some wacky ideas about our sun from not so long ago help us realize the importance of a healthy dose of skepticism combined with good science. In fact, as you’ll see from the timestamp on the image from NASA’s Solar and Heliospheric Observatory (SOHO), science can now bring us – the public – near real-time images of our nearest star.

[div class=attrib]From Slate:[end-div]

The sun is hell.

The 18th-century English clergyman Tobias Swinden argued that hell couldn’t lie below Earth’s surface: The fires would soon go out, he reasoned, due to lack of air. Not to mention that the Earth’s interior would be too small to accommodate all the damned, especially after making allowances for future generations of the damned-to-be. Instead, wrote Swinden, it’s obvious that hell stares us in the face every day: It’s the sun.

The sun is made of ice.

In 1798, Charles Palmer—who was not an astronomer, but an accountant—argued that the sun can’t be a source of heat, since Genesis says that light already existed before the day that God created the sun. Therefore, he reasoned, the sun must merely focus light upon Earth—light that exists elsewhere in the universe. Isn’t the sun even shaped like a giant lens? The only natural, transparent substance that it could be made of, Palmer figured, is ice. Palmer’s theory was published in a widely read treatise that, its title crowed, “overturn[ed] all the received systems of the universe hitherto extant, proving the celebrated and indefatigable Sir Isaac Newton, in his theory of the solar system, to be as far distant from the truth, as any of the heathen authors of Greece or Rome.”

Earth is a sunspot.

Sunspots are magnetic regions on the sun’s surface. But in 1775, mathematician and theologian J. Wiedeberg said that the sun’s spots are created by the clumping together of countless solid “heat particles,” which he speculated were constantly being emitted by the sun. Sometimes, he theorized, these heat particles stick together even at vast distances from the sun—and this is how planets form. In other words, he believed that Earth is a sunspot.

The sun’s surface is liquid.

Throughout the 18th and 19th centuries, textbooks and astronomers were torn between two competing ideas about the sun’s nature. Some believed that its dazzling brightness was caused by luminous clouds and that small holes in the clouds, which revealed the cool, dark solar surface below, were the sunspots. But the majority view was that the sun’s body was a hot, glowing liquid, and that the sunspots were solar mountains sticking up through this lava-like substance.

The sun is inhabited.

No less a distinguished astronomer than William Herschel, who discovered the planet Uranus in 1781, often stated that the sun has a cool, solid surface on which human-like creatures live and play. According to him, these solar citizens are shielded from the heat given off by the sun’s “dazzling outer clouds” by an inner protective cloud layer—like a layer of haz-mat material—that perfectly blocks the solar emissions and allows for pleasant grassy solar meadows and idyllic lakes.

Language and Gender

As any Italian speaker would attest, the moon, of course, is utterly feminine. It is “la luna”. Now, to a German it is “der Mond”, and very masculine.

Numerous languages assign a grammatical gender to objects, which in turn influences how people see these objects as either female or male. Yet, researchers have found that sex tends to be ascribed to objects and concepts even in gender-neutral languages. Scientific American reviews this current research.

[div class=attrib]From Scientific American:[end-div]

Gender is so fundamental to the way we understand the world that people are prone to assign a sex to even inanimate objects. We all know someone, or perhaps we are that person, who consistently refers to their computer or car with a gender pronoun (“She’s been running great these past few weeks!”) New research suggests that our tendency to see gender everywhere even applies to abstract ideas such as numbers. Across cultures, people see odd numbers as male and even numbers as female.

Scientists have long known that language can influence how we perceive gender in objects. Some languages consistently refer to certain objects as male or female, and this in turn, influences how speakers of that language think about those objects. Webb Phillips of the Max Planck Institute, Lauren Schmidt of HeadLamp Research, and Lera Boroditsky at Stanford University asked Spanish- and German-speaking bilinguals to rate various objects according to whether they seemed more similar to males or females. They found that people rated each object according to its grammatical gender. For example, Germans see the moon as being more like a man, because the German word for moon is grammatically masculine (“der Mond”). In contrast, Spanish-speakers see the moon as being more like a woman, because in Spanish the word for moon is grammatically feminine (“la Luna”).

Aside from language, objects can also become infused with gender based on their appearance, who typically uses them, and whether they seem to possess the type of characteristics usually associated with men or women. David Gal and James Wilkie of Northwestern University studied how people view gender in everyday objects, such as food and furniture. They found that people see food dishes containing meat as more masculine and salads and sour dairy products as more feminine. People see furniture items, such as tables and trash cans, as more feminine when they feature rounded, rather than sharp, edges.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Scientific American.[end-div]

What do Papua New Guinea and New York City Have In Common?

Well, the simple answer is: around 800 spoken languages. Or to be more precise, Papua New Guinea is home to an astounding 830 different languages. New York City comes in a close second, with around 800 spoken languages – and that’s not counting when the United Nations is in session on Manhattan’s East Side. Sadly, some of the rarer tongues spoken in New York and Papua New Guinea, and around the globe for that matter, are rapidly becoming extinct – at the rate of around one language every two weeks.

As the Economist points out, a group of linguists in New York City is working to codify some of the city’s most endangered tongues.

[div class=attrib]From the Economist:[end-div]

New York is also home, of course, to a lot of academic linguists, and three of them have got together to create an organisation called the Endangered Language Alliance (ELA), which is ferreting out speakers of unusual tongues from the city’s huddled immigrant masses. The ELA, which was set up last year by Daniel Kaufman, Juliette Blevins and Bob Holman, has worked in detail on 12 languages since its inception. It has codified their grammars, their pronunciations and their word-formation patterns, as well as their songs and legends. Among the specimens in its collection are Garifuna, which is spoken by descendants of African slaves who made their homes on St Vincent after a shipwreck unexpectedly liberated them; Mamuju, from Sulawesi in Indonesia; Mahongwe, a language from Gabon; Shughni, from the Pamirian region of Tajikistan; and an unusual variant of a Mexican language called Totonac.

Each volunteer speaker of a language of interest is first tested with what is known as a Swadesh list. This is a set of 207 high-frequency, slow-to-change words such as parts of the body, colours and basic verbs like eat, drink, sleep and kill. The Swadesh list is intended to ascertain an individual’s fluency before he is taken on. Once he has been accepted, Dr Kaufman and his colleagues start chipping away at the language’s phonology (the sounds of which it is composed) and its syntax (how its meaning is changed by the order of words and phrases). This sort of analysis is the bread and butter of linguistics.

Every so often, though, the researchers come across a bit of jam. The Mahongwe word manono, for example, means “I like” when spoken soft and flat, and “I don’t like” when the first syllable is a tad sharper in tone. Similarly, mbaza could be either “chest” or “council house”. In both cases, the two words are nearly indistinguishable to an English speaker, but yield starkly different patterns when run through a spectrograph. Manono is a particular linguistic oddity, since it uses only tone to differentiate an affirmative from a negative—a phenomenon the ELA has since discovered applies to all verbs in Mahongwe.
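
As an aside from us, the tonal contrast the researchers describe is exactly the kind of thing that falls out of routine signal processing. The toy sketch below (ours, with made-up pitch values, not the ELA’s method) synthesizes two vowel-like syllables that differ only in pitch and recovers that difference from a simple spectrum peak, which is roughly what a spectrograph makes visible:

```python
import numpy as np

# Two synthetic "syllables" differing only in pitch, plus a crude
# fundamental-frequency estimate from the FFT. Real pitch tracking is far more
# sophisticated; the point is simply that tone is a measurable frequency difference.
SR = 16_000                      # sample rate, Hz
t = np.arange(0, 0.3, 1 / SR)    # 300 ms of signal

def syllable(f0):
    """A vowel-like tone: a fundamental plus two weaker harmonics."""
    return (np.sin(2 * np.pi * f0 * t)
            + 0.5 * np.sin(2 * np.pi * 2 * f0 * t)
            + 0.25 * np.sin(2 * np.pi * 3 * f0 * t))

def estimate_f0(signal):
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / SR)
    return freqs[np.argmax(spectrum)]

flat = syllable(110.0)    # "soft and flat" first syllable (hypothetical pitch)
sharp = syllable(140.0)   # "a tad sharper in tone" (hypothetical pitch)

print(estimate_f0(flat), estimate_f0(sharp))   # roughly 110 Hz vs 140 Hz
```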

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

MondayPoem: A Little Language

This week theDiagonal triangulates its sights on the topic of language and communication. So, we introduce an apt poem by Robert Duncan. Of Robert Duncan, Poetry Foundation writes:

Though the name Robert Duncan is not well known outside the literary world, within that world it has become associated with a number of superlatives. Kenneth Rexroth, writing in Assays, names Duncan “one of the most accomplished, one of the most influential” of the postwar American poets.

 

By Robert Duncan:

– A Little Language
I know a little language of my cat, though Dante says
that animals have no need of speech and Nature
abhors the superfluous.   My cat is fluent.   He
converses when he wants with me.   To speak
is natural.   And whales and wolves I’ve heard
in choral soundings of the sea and air
know harmony and have an eloquence that stirs
my mind and heart—they touch the soul.   Here
Dante’s religion that would set Man apart
damns the effluence of our life from us
to build therein its powerhouse.
It’s in his animal communication Man is
true, immediate, and
in immediacy, Man is all animal.
His senses quicken in the thick of the symphony,
old circuits of animal rapture and alarm,
attentions and arousals in which an identity rearrives.
He hears
particular voices among
the concert, the slightest
rustle in the undertones,
rehearsing a nervous aptitude
yet to prove his. He sees the flick
of significant red within the rushing mass
of ruddy wilderness and catches the glow
of a green shirt
to delite him in a glowing field of green
—it speaks to him—
and in the arc of the spectrum color
speaks to color.
The rainbow articulates
a promise he remembers
he but imitates
in noises that he makes,
this speech in every sense
the world surrounding him.
He picks up on the fugitive tang of mace
amidst the savory mass,
and taste in evolution is an everlasting key.
There is a pun of scents in what makes sense.
Myrrh it may have been,
the odor of the announcement that filld the house.
He wakes from deepest sleep
upon a distant signal and waits
as if crouching, springs
to life.
[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

Sleep: Defragmenting the Brain

[div class=attrib]From Neuroskeptic:[end-div]

After a period of heavy use, hard disks tend to get ‘fragmented’. Data gets written all over random parts of the disk, and it gets inefficient to keep track of it all.

That’s why you need to run a defragmentation program occasionally. Ideally, you do this overnight, while you’re asleep, so it doesn’t stop you from using the computer.

A new paper from some Stanford neuroscientists argues that the function of sleep is to reorganize neural connections – a bit like a disk defrag for the brain – although it’s also a bit like compressing files to make more room, and a bit like a system reset. The paper is “Synaptic plasticity in sleep: learning, homeostasis and disease”.

The basic idea is simple. While you’re awake, you’re having experiences, and your brain is forming memories. Memory formation involves a process called long-term potentiation (LTP) which is essentially the strengthening of synaptic connections between nerve cells.

Yet if LTP is strengthening synapses, and we’re learning all our lives, wouldn’t the synapses eventually hit a limit? Couldn’t they max out, so that they could never get any stronger?

Worse, the synapses that strengthen during memory are primarily glutamate synapses – and these are dangerous. Glutamate is a common neurotransmitter, and it’s even a flavouring, but it’s also a toxin.

Too much glutamate damages the very cells that receive the messages. Rather like how sound is useful for communication, but stand next to a pneumatic drill for an hour, and you’ll go deaf.

So, if our brains were constantly forming stronger glutamate synapses, we might eventually run into serious problems. This is why we sleep, according to the new paper. Indeed, sleep deprivation is harmful to health, and this theory would explain why.
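
For the mechanically minded, here is a toy simulation in Python of that “strengthen by day, scale back by night” logic. It is our illustration of the general idea, loosely in the spirit of synaptic homeostasis, not code from the paper:

    # Toy model: waking experience potentiates random synapses (LTP-like),
    # and "sleep" rescales the whole set so total strength stays bounded.
    import random

    weights = [0.1] * 100                    # synaptic strengths for one toy neuron

    def wake(hours=16, ltp=0.02):
        """Each waking hour strengthens a random subset of synapses."""
        for _ in range(hours):
            for i in random.sample(range(len(weights)), 10):
                weights[i] += ltp

    def sleep(target_total=10.0):
        """Scale all weights down so their sum returns to a set point;
        relative differences (i.e. what was learned) are preserved."""
        total = sum(weights)
        if total > target_total:
            factor = target_total / total
            for i in range(len(weights)):
                weights[i] *= factor

    for day in range(3):
        wake()
        print(f"day {day}: total strength before sleep = {sum(weights):.2f}")
        sleep()
        print(f"        total strength after sleep  = {sum(weights):.2f}")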

[div class=attrib]More from theSource here.[end-div]

The Rise of Twins

[div class=attrib]From Slate:[end-div]

Twenty-five years ago, almost no one had a cell phone. Very few of us had digital cameras, and laptop computers belonged only to the very rich. But there is something else—not electronic, but also man-made—that has climbed from the margins of the culture in the 1980s to become a standard accoutrement in upscale neighborhoods across the land: twins.

According to the latest data from the Centers for Disease Control and Prevention, the U.S. twin rate has skyrocketed from one pair born out of every 53 live births in 1980 to one out of every 31 births in 2008. Where are all these double-babies coming from? And what’s going to happen in years to come—will the multiple-birth rate continue to grow until America ends up a nation of twins?

The twin boom can be explained by changes in when and how women are getting pregnant. Demographers have in recent years described a “delayer boom,” in which birth rates have risen among the sort of women—college-educated—who tend to put off starting a family into their mid-30s or beyond. There are now more in this group than ever before: In 1980, just 12.8 percent of women had attained a bachelor’s degree or higher; by 2010, that number had almost tripled, to 37 percent. And women in their mid-30s have multiple births at a higher rate than younger women. A mother who is 35, for example, is four times more likely than a mother who is 20 to give birth to twins. That seems to be on account of her producing more follicle-stimulating hormone, or FSH, which boosts ovulation. The more FSH you have in your bloodstream, the greater your chances of producing more than one egg in each cycle, and having fraternal twins as a result.
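
A quick back-of-the-envelope pass over the figures quoted above (a sketch in Python using only the numbers from the article) shows why “skyrocketed” and “almost tripled” are fair descriptions:

    # Back-of-the-envelope check on the figures quoted above.
    rate_1980, rate_2008 = 1 / 53, 1 / 31     # twin deliveries per live birth (CDC, as quoted)
    print(f"1980: {rate_1980:.2%} of births were twin births")              # ~1.9%
    print(f"2008: {rate_2008:.2%} of births were twin births")              # ~3.2%
    print(f"relative rise in the twin rate: {rate_2008 / rate_1980 - 1:.0%}")  # ~71%

    share_1980, share_2010 = 0.128, 0.37      # women with a bachelor's degree or higher
    print(f"growth in degree attainment: {share_2010 / share_1980:.1f}x")    # ~2.9x, i.e. almost tripled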

[div class=attrib]More from theSource here.[end-div]

Science: A Contest of Ideas

[div class=attrib]From Project Syndicate:[end-div]

It was recently discovered that the universe’s expansion is accelerating, not slowing, as was previously thought. Light from distant exploding stars revealed that an unknown force (dubbed “dark energy”) more than outweighs gravity on cosmological scales.

Unexpected by researchers, such a force had nevertheless been predicted in 1917 by a modification that Albert Einstein proposed to his own theory of gravity, the general theory of relativity. But he later dropped the modification, known as the “cosmological term,” calling it the “biggest blunder” of his life.

So the headlines proclaim: “Einstein was right after all,” as though scientists should be judged the way one judges clairvoyants: Who is distinguished from the common herd by knowing the unknowable – such as the outcome of experiments that have yet to be conceived, let alone conducted? Who, with hindsight, has prophesied correctly?

But science is not a competition between scientists; it is a contest of ideas – namely, explanations of what is out there in reality, how it behaves, and why. These explanations are initially tested not by experiment but by criteria of reason, logic, applicability, and uniqueness at solving the mysteries of nature that they address. Predictions are used to test only the tiny minority of explanations that survive these criteria.

The story of why Einstein proposed the cosmological term, why he dropped it, and why cosmologists today have reintroduced it illustrates this process. Einstein sought to avoid the implication of unmodified general relativity that the universe cannot be static – that it can expand (slowing down, against its own gravity), collapse, or be instantaneously at rest, but that it cannot hang unsupported.

This particular prediction cannot be tested (no observation could establish that the universe is at rest, even if it were), but it is impossible to change the equations of general relativity arbitrarily. They are tightly constrained by the explanatory substance of Einstein’s theory, which holds that gravity is due to the curvature of spacetime, that light has the same speed for all observers, and so on.

But Einstein realized that it is possible to add one particular term – the cosmological term – and adjust its magnitude to predict a static universe, without spoiling any other explanation. All other predictions based on the previous theory of gravity – that of Isaac Newton – that were testable at the time were good approximations to those of unmodified general relativity, with that single exception: Newton’s space was an unmoving background against which objects move. There was no evidence yet contradicting Newton’s view – no mystery of expansion to explain. Moreover, anything beyond that traditional conception of space required a considerable conceptual leap, while the cosmological term made no measurable difference to other predictions. So Einstein added it.
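
For readers who want to see what “adding one particular term” looks like on paper, the field equations of general relativity with the cosmological term included read, in standard notation,

    G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}

Here \Lambda is the cosmological constant. Setting \Lambda = 0 recovers unmodified general relativity; Einstein tuned \Lambda so that a static universe became a solution, and today’s cosmologists have reintroduced a small positive \Lambda to account for the observed acceleration.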

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

Creativity and Anger

It turns out that creativity gets a boost from anger. While anger is certainly not beneficial in some contexts, researchers have found that angry people are more likely to be creative.

[div class=attrib]From Scientific American:[end-div]

This counterintuitive idea was pursued by researchers Matthijs Baas, Carsten De Dreu, and Bernard Nijstad in a series of studies recently published in the Journal of Experimental Social Psychology. They found that angry people were more likely to be creative – though this advantage didn’t last for long, as the taxing nature of anger eventually leveled out creativity. This study joins several recent lines of research exploring the relative upside to anger – the ways in which anger is not only less harmful than typically assumed, but may even be helpful (though perhaps in small doses).

In an initial study, the researchers found that feeling angry was indeed associated with brainstorming in a more unstructured manner, consistent with “creative” problem solving. In a second study, the researchers first elicited anger from the study participants (or sadness, or a non-emotional state) and then asked them to engage in a brainstorming session in which they generated ideas to preserve and improve the environment. At the beginning of this task, angry participants generated more ideas (by volume) and more original ideas (those thought of by 1 percent or fewer of the other participants) than the sad or non-emotional participants. However, this benefit was only present at the beginning of the task, and eventually the angry participants generated only as many ideas as the other participants.
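
As a concrete illustration of that originality measure (our reconstruction of the scoring rule as described, not the authors’ materials), an idea counts as original when 1 percent or fewer of the other participants produced it:

    # Score brainstorming output for fluency (how many ideas) and
    # originality (ideas produced by <= 1% of the other participants).
    from collections import Counter

    def score_brainstorm(ideas_by_participant):
        """ideas_by_participant: dict of participant id -> set of normalised idea strings."""
        n = len(ideas_by_participant)
        mentions = Counter(idea for ideas in ideas_by_participant.values() for idea in ideas)
        scores = {}
        for person, ideas in ideas_by_participant.items():
            original = [i for i in ideas if (mentions[i] - 1) / (n - 1) <= 0.01]
            scores[person] = {"fluency": len(ideas), "originality": len(original)}
        return scores

    # Invented data, just to show the shape of the computation:
    print(score_brainstorm({
        "p1": {"ban plastic bags", "solar roads"},
        "p2": {"ban plastic bags"},
        "p3": {"plant trees"},
    }))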

These findings reported by Baas and colleagues make sense, given what we already know about anger. Though anger may be unpleasant to feel, it is associated with a variety of attributes that may facilitate creativity. First, anger is an energizing feeling, important for the sustained attention needed to solve problems creatively. Second, anger leads to more flexible, unstructured thought processes.

Anecdotal evidence from internal meetings at Apple certainly reinforces the notion that creativity may benefit from well-channeled anger. Apple is often cited as one of the world’s most creative companies.

[div class=attrib]From Jonah Lehrer over at Wired:[end-div]

Many of my favorite Steve Jobs stories feature his anger, as he unleashes his incisive temper on those who fail to meet his incredibly high standards. A few months ago, Adam Lashinsky had a fascinating article in Fortune describing life inside the sanctum of 1 Infinite Loop. The article begins with the following scene:

In the summer of 2008, when Apple launched the first version of its iPhone that worked on third-generation mobile networks, it also debuted MobileMe, an e-mail system that was supposed to provide the seamless synchronization features that corporate users love about their BlackBerry smartphones. MobileMe was a dud. Users complained about lost e-mails, and syncing was spotty at best. Though reviewers gushed over the new iPhone, they panned the MobileMe service.

Steve Jobs doesn’t tolerate duds. Shortly after the launch event, he summoned the MobileMe team, gathering them in the Town Hall auditorium in Building 4 of Apple’s campus, the venue the company uses for intimate product unveilings for journalists. According to a participant in the meeting, Jobs walked in, clad in his trademark black mock turtleneck and blue jeans, clasped his hands together, and asked a simple question:

“Can anyone tell me what MobileMe is supposed to do?” Having received a satisfactory answer, he continued, “So why the fuck doesn’t it do that?”

For the next half-hour Jobs berated the group. “You’ve tarnished Apple’s reputation,” he told them. “You should hate each other for having let each other down.” The public humiliation particularly infuriated Jobs. Walt Mossberg, the influential Wall Street Journal gadget columnist, had panned MobileMe. “Mossberg, our friend, is no longer writing good things about us,” Jobs said. On the spot, Jobs named a new executive to run the group.

Brutal, right? But those flashes of intolerant anger have always been an important part of Jobs’ management approach. He isn’t shy about confronting failure, and he doesn’t hold back negative feedback. He is blunt at all costs, a cultural habit that has permeated the company. Jonathan Ive, the lead designer at Apple, describes the tenor of group meetings as “brutally critical.”

[div class=attrib]More from theSource here and here.[end-div]

[div class=attrib]Image of Brandy Norwood, courtesy of Wikipedia / Creative Commons.[end-div]

CEO, COO, CFO, CTO: Acronym Soup Explained

[div class=attrib]From Slate:[end-div]

Steve Jobs resigned from his position as Apple’s CEO, or chief executive officer, Wednesday. Taking his place is Tim Cook, previously the company’s COO, or chief operating officer. Apple also has a CFO, and, at one point or another, the company has had a CIO and CTO, too. When did we start calling corporate bosses C-this-O and C-that-O?

The 1970s. The phrase chief executive officer has been used, if at times rarely, in connection with corporate structures since at least the 19th century. (See, for instance, this 1888 book on banking law in Canada.) About 40 years ago, the phrase began gaining ground on president as the preferred title for the top director in charge of a company’s daily operations. Around the same time, the use of CEO in printed material surged and, if the Google Books database is to be believed, surpassed the long-form chief executive officer in the early 1980s. CFO has gained popularity, too, but at a much slower rate.

The online version of the Oxford English Dictionary published its first entries for CEO and CFO in January of this year. The entries’ first citations are a 1972 article in the Harvard Business Review and a 1971 Boston Globe article, respectively. (Niche publications were using the initials at least a half-decade earlier.) The New York Times seems to have printed its first CEO in a table graphic for a 1972 article, “Executives’ Pay Still Rising,” when space for the full phrase might have been lacking.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image of Steve Jobs and Bill Gates courtesy of Wikipedia / Creative Commons.[end-div]