Global Interconnectedness: Submarine Cables

Apparently only 1 percent of global internet traffic is transmitted via satellite or terrestrial radio. The remaining 99 percent is still carried via cable – fiber optic and copper. Much of this cable is strewn for many thousands of miles across the seabeds of our deepest oceans.

For a fascinating view of these intricate systems, and to learn why and how Brazil is connected to Angola, or Auckland, New Zealand, to Redondo Beach, California, via the 12,750 km long Pacific Fiber, check out the interactive Submarine Cable Map from TeleGeography.

Steve Jobs: The Secular Prophet

The world will miss Steve Jobs.

In early 2010 the U.S. Supreme Court overturned years of legal precedent by assigning First Amendment (free speech) protections to corporations. We could argue the merits and demerits of this staggering ruling until the cows come home. However, one thing is clear if corporations are to be judged as people: the world would in all likelihood benefit more from a corporation with a human, optimistic and passionate face (Apple) than from a faceless one (Exxon), an ideological one (News Corp) or an opaque one (Koch Industries).

That said, we excerpt a fascinating essay on Steve Jobs by Andy Crouch below. We would encourage Mr. Crouch to take this worthy idea further by examining the Fortune 1000 list of corporations. Could he deliver a similar analysis for each of these corporations’ leaders? We believe not.

The world will miss Steve Jobs.

[div class=attrib]By Andy Crouch for the Wall Street Journal:[end-div]

Steve Jobs was extraordinary in countless ways—as a designer, an innovator, a (demanding and occasionally ruthless) leader. But his most singular quality was his ability to articulate a perfectly secular form of hope. Nothing exemplifies that ability more than Apple’s early logo, which slapped a rainbow on the very archetype of human fallenness and failure—the bitten fruit—and turned it into a sign of promise and progress.

That bitten apple was just one of Steve Jobs’s many touches of genius, capturing the promise of technology in a single glance. The philosopher Albert Borgmann has observed that technology promises to relieve us of the burden of being merely human, of being finite creatures in a harsh and unyielding world. The biblical story of the Fall pronounced a curse upon human work—”cursed is the ground for thy sake; in sorrow shalt thou eat of it all the days of thy life.” All technology implicitly promises to reverse the curse, easing the burden of creaturely existence. And technology is most celebrated when it is most invisible—when the machinery is completely hidden, combining godlike effortlessness with blissful ignorance about the mechanisms that deliver our disburdened lives.

Steve Jobs was the evangelist of this particular kind of progress—and he was the perfect evangelist because he had no competing source of hope. He believed so sincerely in the “magical, revolutionary” promise of Apple precisely because he believed in no higher power. In his celebrated Stanford commencement address (which is itself an elegant, excellent model of the genre), he spoke frankly about his initial cancer diagnosis in 2003. It’s worth pondering what Jobs did, and didn’t, say:

“No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because death is very likely the single best invention of life. It’s life’s change agent; it clears out the old to make way for the new. Right now, the new is you. But someday, not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it’s quite true. Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma, which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice, heart and intuition. They somehow already know what you truly want to become.”

This is the gospel of a secular age.

[div class=attrib]Steve Jobs by Tim O’Brien, image courtesy of Wall Street Journal.[end-div]

Googlization of the Globe: For Good (or Evil)

Google’s oft-quoted corporate mantra — don’t be evil — reminds us to remain vigilant even if the company believes it does good and can do no wrong.

Google serves up countless search results to satisfy our never-ending thirst for knowledge, deals, news, quotes, jokes, user manuals, contacts, products and so on. This is clearly of tremendous benefit to us, to Google and to Google’s advertisers. Of course, in fulfilling our searches Google collects equally staggering amounts of information — about us. Increasingly the company will know where we are, what we like and dislike, what we prefer, what we do, where we travel, with whom and why, who our friends are, what we read, what we buy.

As Jaron Lanier remarked in a recent post, there is a fine line between being a global index to the world’s free and open library of information and being the paid gatekeeper to our collective knowledge and hoarder of our collective online (and increasingly offline) behaviors, tracks and memories. We have already seen how Google, and others, can personalize search results based on our previous tracks, thus filtering and biasing what we see and read and limiting our exposure to alternate views and opinions.

It’s quite easy to imagine a rather more dystopian view of a society gone awry, manipulated by a not-so-benevolent Google when, eventually, founders Brin and Page retire to their vacation bases on the moon.

With this in mind, Daniel Soar over at the London Review of Books reviews several recent books about Google and offers some interesting insights.

[div class=attrib]London Review of Books:[end-div]

This spring, the billionaire Eric Schmidt announced that there were only four really significant technology companies: Apple, Amazon, Facebook and Google, the company he had until recently been running. People believed him. What distinguished his new ‘gang of four’ from the generation it had superseded – companies like Intel, Microsoft, Dell and Cisco, which mostly exist to sell gizmos and gadgets and innumerable hours of expensive support services to corporate clients – was that the newcomers sold their products and services to ordinary people. Since there are more ordinary people in the world than there are businesses, and since there’s nothing that ordinary people don’t want or need, or can’t be persuaded they want or need when it flashes up alluringly on their screens, the money to be made from them is virtually limitless. Together, Schmidt’s four companies are worth more than half a trillion dollars. The technology sector isn’t as big as, say, oil, but it’s growing, as more and more traditional industries – advertising, travel, real estate, used cars, new cars, porn, television, film, music, publishing, news – are subsumed into the digital economy. Schmidt, who as the ex-CEO of a multibillion-dollar corporation had learned to take the long view, warned that not all four of his disruptive gang could survive. So – as they all converge from their various beginnings to compete in the same area, the place usually referred to as ‘the cloud’, a place where everything that matters is online – the question is: who will be the first to blink?

If the company that falters is Google, it won’t be because it didn’t see the future coming. Of Schmidt’s four technology juggernauts, Google has always been the most ambitious, and the most committed to getting everything possible onto the internet, its mission being ‘to organise the world’s information and make it universally accessible and useful’. Its ubiquitous search box has changed the way information can be got at to such an extent that ten years after most people first learned of its existence you wouldn’t think of trying to find out anything without typing it into Google first. Searching on Google is automatic, a reflex, just part of what we do. But an insufficiently thought-about fact is that in order to organise the world’s information Google first has to get hold of the stuff. And in the long run ‘the world’s information’ means much more than anyone would ever have imagined it could. It means, of course, the totality of the information contained on the World Wide Web, or the contents of more than a trillion webpages (it was a trillion at the last count, in 2008; now, such a number would be meaningless). But that much goes without saying, since indexing and ranking webpages is where Google began when it got going as a research project at Stanford in 1996, just five years after the web itself was invented. It means – or would mean, if lawyers let Google have its way – the complete contents of every one of the more than 33 million books in the Library of Congress or, if you include slightly varying editions and pamphlets and other ephemera, the contents of the approximately 129,864,880 books published in every recorded language since printing was invented. It means every video uploaded to the public internet, a quantity – if you take the Google-owned YouTube alone – that is increasing at the rate of nearly an hour of video every second.

[div class=attrib]Read more here.[end-div]

MondayPoem: Further In

Tomas Tranströmer is one of Sweden’s leading poets. He studied poetry and psychology at the University of Stockholm. Tranströmer was awarded the 2011 Nobel Prize for Literature “because, through his condensed, translucent images, he gives us fresh access to reality”.

By Tomas Tranströmer:

– Further In

On the main road into the city
when the sun is low.
The traffic thickens, crawls.
It is a sluggish dragon glittering.
I am one of the dragon’s scales.
Suddenly the red sun is
right in the middle of the windscreen
streaming in.
I am transparent
and writing becomes visible
inside me
words in invisible ink
which appear
when the paper is held to the fire!
I know I must get far away
straight through the city and then
further until it is time to go out
and walk far in the forest.
Walk in the footprints of the badger.
It gets dark, difficult to see.
In there on the moss lie stones.
One of the stones is precious.
It can change everything
it can make the darkness shine.
It is a switch for the whole country.
Everything depends on it.
Look at it, touch it…

Human Evolution Marches On

[div class=attrib]From Wired:[end-div]

Though ongoing human evolution is difficult to see, researchers believe they’ve found signs of rapid genetic changes among the recent residents of a small Canadian town.

Between 1800 and 1940, mothers in Ile aux Coudres, Quebec gave birth at steadily younger ages, with the average age of first maternity dropping from 26 to 22. Increased fertility, and thus larger families, could have been especially useful in the rural settlement’s early history.

According to University of Quebec geneticist Emmanuel Milot and colleagues, other possible explanations, such as changing cultural or environmental influences, don’t fit. The changes appear to reflect biological evolution.

“It is often claimed that modern humans have stopped evolving because cultural and technological advancements have annihilated natural selection,” wrote Milot’s team in their Oct. 3 Proceedings of the National Academy of Sciences paper. “Our study supports the idea that humans are still evolving. It also demonstrates that microevolution is detectable over just a few generations.”

Milot’s team based their study on detailed birth, marriage and death records kept by the Catholic church in Ile aux Coudres, a small and historically isolated French-Canadian island town in the Gulf of St. Lawrence. It wasn’t just the fact that average first birth age — a proxy for fertility — dropped from 26 to 22 in 140 years that suggested genetic changes. After all, culture or environment might have been wholly responsible, as nutrition and healthcare are for recent, rapid changes in human height. Rather, it was how ages dropped that caught their eye.

The patterns fit with models of gene-influenced natural selection. Moreover, thanks to the detailed record-keeping, it was possible to look at other possible explanations. Were better nutrition responsible, for example, improved rates of infant and juvenile mortality should have followed; they didn’t. Neither did the late-19th century transition from farming to more diversified professions.

[div class=attrib]Read more here.[end-div]

Misconceptions of Violence

We live in violent times. Or do we?

Despite the seemingly constant flow of human-engineered destruction visited upon our fellow humans, other species and our precious environment, some thoughtful analysis — beyond the headlines of cable news — shows that all may not be lost to our violent nature. An insightful interview with psychologist Steven Pinker, author of “How the Mind Works,” shows us that contemporary humans are not as bad as we may have thought. His latest book, “The Better Angels of Our Nature: Why Violence Has Declined,” analyzes the basis and history of human violence. Perhaps surprisingly, Pinker suggests that we live in remarkably peaceful times, comparatively speaking. Characteristically, he backs up his claims with clear historical evidence.

[div class=attrib]From Gareth Cook for Mind Matters:[end-div]

COOK: What would you say is the biggest misconception people have about violence?
PINKER: That we are living in a violent age. The statistics suggest that this may be the most peaceable time in our species’s existence.

COOK: Can you give a sense for how violent life was 500 or 1000 years ago?
PINKER: Statistics aside, accounts of daily life in medieval and early modern Europe reveal a society soaked in blood and gore. Medieval knights—whom today we would call warlords—fought their numerous private wars with a single strategy: kill as many of the opposing knight’s peasants as possible. Religious instruction included prurient descriptions of how the saints of both sexes were tortured and mutilated in ingenious ways. Corpses broken on the wheel, hanging from gibbets, or rotting in iron cages where the sinner had been left to die of exposure and starvation were a common part of the landscape. For entertainment, one could nail a cat to a post and try to head-butt it to death, or watch a political prisoner get drawn and quartered, which is to say partly strangled, disemboweled, and castrated before being decapitated. So many people had their noses cut off in private disputes that medical textbooks had procedures that were alleged to grow them back.

COOK: How has neuroscience contributed to our understanding of violence and its origins?
PINKER: Neuroscientists have long known that aggression in animals is not a unitary phenomenon driven by a single hormone or center. When they stimulate one part of the brain of a cat, it will lunge for the experimenter in a hissing, fangs-out rage; when they stimulate another, it will silently stalk a hallucinatory mouse. Still another circuit primes a male cat for a hostile confrontation with another male. Similar systems for rage, predatory seeking, and male-male aggression may be found in Homo sapiens, together with uniquely human, cognitively-driven  systems of aggression such as political and religious ideologies and moralistic punishment. Today, even the uniquely human systems can be investigated using functional neuroimaging. So neuroscience has given us the crucial starting point in understanding violence, namely that it is not a single thing. And it has helped us to discover biologically realistic taxonomies of the major motives for violence.

COOK: Is the general trend toward less violence going to continue in the future?
PINKER: It depends. In the arena of custom and institutional practices, it’s a good bet. I suspect that violence against women, the criminalization of homosexuality, the use of capital punishment, the callous treatment of animals on farms, corporal punishment of children, and other violent social practices will continue to decline, based on the fact that worldwide moralistic shaming movements in the past (such as those against slavery, whaling, piracy, and punitive torture) have been effective over long stretches of time. I also don’t expect war between developed countries to make a comeback any time soon. But civil wars, terrorist acts, government repression, and genocides in backward parts of the world are simply too capricious to allow predictions. With six billion people in the world, there’s no predicting what some cunning fanatic or narcissistic despot might do.

[div class=attrib]Read more of the interview here.[end-div]

[div class=attrib]Image courtesy of Scientific American.[end-div]

All Power Corrupts

[div class=attrib]From the Economist:[end-div]

DURING the second world war a new term of abuse entered the English language. To call someone “a little Hitler” meant he was a menial functionary who employed what power he had in order to annoy and frustrate others for his own gratification. From nightclub bouncers to the squaddies at Abu Ghraib prison who tormented their prisoners for fun, little Hitlers plague the world. The phenomenon has not, though, hitherto been subject to scientific investigation.

Nathanael Fast of the University of Southern California has changed that. He observed that lots of psychological experiments have been done on the effects of status and lots on the effects of power. But few, if any, have been done on both combined. He and his colleagues Nir Halevy of Stanford University and Adam Galinsky of Northwestern University, in Chicago, set out to correct this. In particular they wanted to see if it is circumstances that create little Hitlers or, rather, whether people of that type simply gravitate into jobs which allow them to behave badly. Their results have just been published in the Journal of Experimental Social Psychology.

Dr Fast’s experiment randomly assigned each of 213 participants to one of four situations that manipulated their status and power. All participants were informed that they were taking part in a study on virtual organisations and would be interacting with, but not meeting, a fellow student who worked in the same fictional consulting firm. Participants were then assigned either the role of “idea producer”, a job that entailed generating and working with important ideas, or of “worker”, a job that involved menial tasks like checking for typos. A post-experiment questionnaire demonstrated that participants did, as might be expected, look upon the role of idea producer with respect and admiration. Equally unsurprisingly, they looked down on the role of worker.

Participants who had both status and power did not greatly demean their partners. They chose an average of 0.67 demeaning activities for those partners to perform. Low-power/low-status and low-power/high-status participants behaved similarly. They chose, on average, 0.67 and 0.85 demeaning activities. However, participants who were low in status but high in power—the classic “little Hitler” combination—chose an average of 1.12 deeply demeaning tasks for their partners to engage in. That was a highly statistically significant distinction.

Of course, not everybody in the high-power/low-status quadrant of the experiment behaved badly. Underlying personality may still have a role. But as with previous experiments in which random members of the public have been asked to play prison guard or interrogator, Dr Fast’s result suggests that many quite ordinary people will succumb to bad behaviour if the circumstances are right.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image courtesy of the Economist / Getty Images.[end-div]

The Cult of the Super Person

It is undeniable that there is ever-increasing societal pressure on children to perform, compete, achieve and succeed, and to do so at ever younger ages. However, while average college admission test scores have improved, it’s also arguable that admission standards have dropped. So the picture painted by James Atlas in the article below is far from clear. Nonetheless, it’s disturbing that our children get less and less time to dream, play, explore and get dirty.

[div class=attrib]From the New York Times:[end-div]

A BROCHURE arrives in the mail announcing this year’s winners of a prestigious fellowship to study abroad. The recipients are allotted a full page each, with a photo and a thick paragraph chronicling their achievements. It’s a select group to begin with, but even so, there doesn’t seem to be anyone on this list who hasn’t mastered at least one musical instrument; helped build a school or hospital in some foreign land; excelled at a sport; attained fluency in two or more languages; had both a major and a minor, sometimes two, usually in unrelated fields (philosophy and molecular science, mathematics and medieval literature); and yet found time — how do they have any? — to enjoy such arduous hobbies as mountain biking and white-water kayaking.

Let’s call this species Super Person.

Do we have some anomalous cohort here? Achievement freaks on a scale we haven’t seen before? Has our hysterically competitive, education-obsessed society finally outdone itself in its tireless efforts to produce winners whose abilities are literally off the charts? And if so, what convergence of historical, social and economic forces has been responsible for the emergence of this new type? Why does Super Person appear among us now?

Perhaps there’s an evolutionary cause, and these robust intellects reflect the leap in the physical development of humans that we ascribe to better diets, exercise and other forms of health-consciousness. (Stephen Jay Gould called this mechanism “extended scope.”) All you have to do is watch a long rally between Novak Djokovic and Rafael Nadal to recognize — if you’re old enough — how much faster the sport has become over the last half century.

The Super Person training for the college application wars is the academic version of the Super Person slugging it out on the tennis court. For wonks, Harvard Yard is Arthur Ashe Stadium.

Preparing for Super Personhood begins early. “We see kids who’ve been training from an early age,” says Charles Bardes, chairman of admissions at Weill Cornell Medical College. “The bar has been set higher. You have to be at the top of the pile.”

And to clamber up there you need a head start. Thus the well-documented phenomenon of helicopter parents. In her influential book “Perfect Madness: Motherhood in the Age of Anxiety,” Judith Warner quotes a mom who gave up her career to be a full-time parent: “The children are the center of the household and everything goes around them. You want to do everything and be everything for them because this is your job now.” Bursting with pent-up energy, the mothers transfer their shelved career ambitions to their children. Since that book was published in 2005, the situation has only intensified. “One of my daughter’s classmates has a pilot’s license; 12-year-olds are taking calculus,” Ms. Warner said last week.

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Image courtesy of Mark Todd. New York Times.[end-div]

Art Criticism at its Best

[div class=attrib]From Jonathan Jones over at the Guardian:[end-div]

Works of art are not objects. They are … Oh lord, what are they? Take, for convenience, a painting. It is a physical object, obviously, in that it consists of a wooden panel or a stretched canvas covered in daubs of colour. Depending on the light you may be more or less aware of cracks, brush marks, different layers of paint. Turn it around and it is even more obviously a physical object. But as such it is not art. Only when it is experienced as art can it be called art, and the intensity and value of that experience varies according to the way it is made and the way it is seen, that is, the receptiveness of the beholder to that particular work of art.

And this is why critics are the only real art writers. We are the only ones who acknowledge, as a basic principle, that art is an unstable category – it lives or dies according to rules that cannot ever be systematised. If you treat art in a pseudo-scientific way, as some kinds of art history do, you miss everything that makes it matter. Only on the hoof can it be caught, or rather followed on its elusive meanderings in and out of meaning, significance, and beauty.

Equally, an uncritical, purely literary approach to art also risks missing the whole point about it. You have to be critical, not just belle-lettriste, to get to the pulse of art. To respond to a work is to compare it with other works, and that comparison only has meaning if you judge their relative merits.

No such judgment is final. No critic is right, necessarily. It’s just that criticism offers a more honest and realistic understanding of the deep strangeness of our encounters with these mysterious human creations called works of art.

That is why the really great art historians were critics, who never fought shy of judgment. Kenneth Clark and EH Gombrich were extremely opinionated about what is and is not good art. Were they right or wrong? That is irrelevant. The response of one passionate and critical writer is worth a hundred, or a thousand, uncritical surveys that, by refusing to come off the fence, never get anywhere near the life of art.

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Photograph of John Ruskin, circa 1870. Image courtesy of W. & D. Downey / Wikipedia.[end-div]

MondayPoem: Immortal Autumn

The Autumnal Equinox finally ushers in some cooler temperatures for the northern hemisphere, and with that we reflect on this most human of seasons courtesy of a poem by Archibald MacLeish.

By Archibald MacLeish:

– Immortal Autumn

I speak this poem now with grave and level voice
In praise of autumn, of the far-horn-winding fall.

I praise the flower-barren fields, the clouds, the tall
Unanswering branches where the wind makes sullen noise.

I praise the fall: it is the human season.
Now

No more the foreign sun does meddle at our earth,
Enforce the green and bring the fallow land to birth,
Nor winter yet weigh all with silence the pine bough,

But now in autumn with the black and outcast crows
Share we the spacious world: the whispering year is gone:
There is more room to live now: the once secret dawn
Comes late by daylight and the dark unguarded goes.

Between the mutinous brave burning of the leaves
And winter’s covering of our hearts with his deep snow
We are alone: there are no evening birds: we know
The naked moon: the tame stars circle at our eaves.

It is the human season. On this sterile air
Do words outcarry breath: the sound goes on and on.
I hear a dead man’s cry from autumn long since gone.

I cry to you beyond upon this bitter air.

Is Our Children Learning: Testing the Standardized Tests

Test grades once measured student performance. Nowadays test grades are used to measure teacher, parent, educational institution and even national performance. Gary Gutting over at the Stone forum has some instructive commentary.

[div class=attrib]From the New York Times:[end-div]

So what exactly do test scores tell us?

Poor test scores are the initial premises in most current arguments for educational reform.  At the end of last year, reading scores that showed American 15-year-olds in the middle of an international pack, led by Asian countries, prompted calls from researchers and educators for immediate action.  This year two sociologists, Richard Arum and Josipa Roksa, showed that 45 percent of students, after two years of college, have made no significant gains on a test of critical thinking.  Last week’s report of falling SAT scores is the latest example.

Given poor test results, many critics conclude that our schools are failing and propose plans for immediate action. For example, when Arum and Roksa published their results, many concluded that college teachers need to raise standards in their courses, requiring more hours of study and assigning longer papers.

It is, however, not immediately obvious what follows from poor test scores.  Without taking any position about the state of our schools or how, if at all, they need reform, I want to reflect on what we need to add to the fact of poor scores to construct an argument for changing the way we educate.

The first question is whether a test actually tests for things that we want students to know.   We very seldom simply want students to do well on a test for its own sake.

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Image courtesy of U.S. College Search.[end-div]

Map Your Favorite Red (Wine)

This season’s Beaujolais Nouveau is just over a month away, so what better way to pave the road to French wines than with a viticultural map? The wine map is based on Harry Beck’s iconic 1930s design for the London Tube (subway) map.

[div class=attrib]From Frank Jacobs at Strange Maps:[end-div]

The coloured lines on this wine map denote the main wine-producing regions in France, the dots are significant cities or towns in those regions. Names that branch off from the main line via little streaks are the so-called appellations [2].

This schematic approach is illuminating for non-aficionados. In the first place, it clarifies the relation between region and appellation. For example: Médoc, Margaux and St-Emilion are three wines from the same region. So they are all Bordeaux wines, but each with its own appellation.

Secondly, it provides a good indication of the geographic relation between appellations within regions. Chablis and Nuits-St-Georges are northern Burgundy wines, while Beaujolais is a southern one. It also permits some comparison between regions: Beaujolais, although a Burgundy, neighbours Côte Rôtie, a northern Rhône Valley wine.

And lastly, it provides the names of the main grape varieties used in each region (the white ones italicised), like merlot or chardonnay.

Which Couch, the Blue or White? Stubbornness and Social Pressure

Counterintuitive results show that we are more likely to resist changing our minds when more people tell us we are wrong. A team of researchers from HP’s Social Computing Research Group found that humans are more likely to change their minds when fewer, rather than more, people disagree with them.

[div class=attrib]From HP:[end-div]

The research has practical applications for businesses, especially in marketing, suggests co-author Bernardo Huberman,  Senior HP Fellow and director of HP’s Social Computing Research Group.

“What this implies,” he says, “is that rather than overwhelming consumers with strident messages about an alternative product or service, in social media, gentle reporting of a few people having chosen that product or service can be more persuasive.”

The experiment – devised by Huberman along with Haiyi Zhu, an HP labs summer intern from Carnegie Mellon University, and Yarun Luon of HP Labs – reveals several other factors that determine whether choices can be reversed though social influence, too. It’s the latest product of HP Lab’s pioneering program in social computing, which is dedicated to creating software and algorithms that provide meaningful context to huge sets of unstructured data.

Study results: the power of opinion
Opinions and product ratings are everywhere online. But when do they actually influence our own choices?

To find out, the HP team asked several hundred people to make a series of choices between two different pieces of furniture.  After varying amounts of time, they were asked to choose again between the same items, but this time they were told that a certain number of other people had preferred the opposite item.  (Separately, the experiment also asked subjects to choose between two different baby pictures, to control for variance in subject matter).

Analysis of the resulting choices showed that receiving a small amount of social pressure to reverse one’s opinion (by being told that just a few people had chosen differently) was more likely to produce a reversed vote than when the pressure felt was much greater (i.e. where an overwhelming number of people were shown as having made a different choice).

The team also discovered:

– People were more likely to be influenced if they weren’t prompted to change their mind immediately after they had expressed their original preference.
– The more time that people spent on their choice, the more likely they were to reverse that choice and conform to the opinion of others later on.

[div class=attrib]More of this fascinating article here.[end-div]

Complex Decision To Make? Go With the Gut

Over the last couple of years a number of researchers have upended conventional wisdom by finding that complex decisions, for instance those with lots of variables, are better “made” through our emotional system. This flies in the face of the commonly held belief that complexity is best handled by our rational side.

[div class=attrib]Jonah Lehrer over at the Frontal Cortex brings us up to date on current thinking.[end-div]

We live in a world filled with difficult decisions. In fact, we’ve managed to turn even trivial choices – say, picking a toothpaste – into a tortured mental task, as the typical supermarket has more than 200 different dental cleaning options. Should I choose a toothpaste based on fluoride content? Do I need a whitener in my toothpaste? Is Crest different than Colgate? The end result is that the banal selection becomes cognitively demanding, as I have to assess dozens of alternatives and take an array of variables into account. And it’s not just toothpaste: The same thing has happened to nearly every consumption decision, from bottled water to blue jeans to stocks. There are no simple choices left – capitalism makes everything complicated.

How should we make all these hard choices? How does one navigate a world of seemingly infinite alternatives? For thousands of years, the answer has seemed obvious: when faced with a difficult dilemma, we should carefully assess our options and spend a few moments consciously deliberating the information. Then, we should choose the toothpaste that best fits our preferences. This is how we maximize utility and get the most bang for the buck. We are rational agents – we should make decisions in a rational manner.

But what if rationality backfires? What if we make better decisions when we trust our gut instincts? While there is an extensive literature on the potential wisdom of human emotion, it’s only in the last few years that researchers have demonstrated that the emotional system (aka Type 1 thinking) might excel at complex decisions, or those involving lots of variables. If true, this would suggest that the unconscious is better suited for difficult cognitive tasks than the conscious brain, that the very thought process we’ve long disregarded as irrational and impulsive might actually be “smarter” than reasoned deliberation. This is largely because the unconscious is able to handle a surfeit of information, digesting the facts without getting overwhelmed. (Human reason, in contrast, has a very strict bottleneck and can only process about four bits of data at any given moment.) When confused in the toothpaste aisle, bewildered by all the different options, we should go with the product that feels the best.

The most widely cited demonstration of this theory is a 2006 Science paper led by Ap Dijksterhuis. (I wrote about the research in How We Decide.) The experiment went like this: Dijksterhuis got together a group of Dutch car shoppers and gave them descriptions of four different used cars. Each of the cars was rated in four different categories, for a total of sixteen pieces of information. Car number 1, for example, was described as getting good mileage, but had a shoddy transmission and poor sound system. Car number 2 handled poorly, but had lots of legroom. Dijksterhuis designed the experiment so that one car was objectively ideal, with “predominantly positive aspects”. After showing people these car ratings, Dijksterhuis then gave them a few minutes to consciously contemplate their decision. In this “easy” situation, more than fifty percent of the subjects ended up choosing the best car.

[div class=attrib]Read more of the article and Ap Dijksterhuis’ classic experiment here.[end-div]

[div class=attrib]Image courtesy of CustomerSpeak.[end-div]

Movies in the Mind: A Great Leap in Brain Imaging

A common premise of “mad scientist” science fiction movies: a computer reconstructs video images from someone’s thoughts via a brain-scanning device. Yet this is no longer the realm of fantasy. Researchers from the University of California at Berkeley have successfully decoded and reconstructed people’s dynamic visual experiences – in this case, watching Hollywood movie trailers – using functional Magnetic Resonance Imaging (fMRI) and computer simulation models.

Watch the stunning video clip below showing side-by-side movies of what a volunteer was actually watching and a computer reconstruction of fMRI data from the same volunteer.

[youtube]nsjDnYxJ0bo[/youtube]

The results are a rudimentary first step, with the technology requiring decades of refinement before the fiction of movies such as Brainstorm becomes a closer reality. However, this groundbreaking research nonetheless paves the way to a future of tremendous promise in brain science. Imagine the ability to reproduce and share images of our dreams and memories, or to peer into the brain of a comatose patient.

[div class=attrib]More from the UC-Berkeley article here.[end-div]

How Will You Die?

Bad news and good news. First, the bad news. If you’re between 45 and 54 years of age your cause of death will most likely be heart disease, that is, if you’re a male. If you’re a female, on the other hand, you’re more likely to fall prey to cancer. And, interestingly, you are about 5 times more likely to die falling down stairs than from (accidental) electrocution. Now the good news. While the data may give us a probabilistic notion of how we may perish, no one (yet) knows when.

More vital statistics courtesy of this macabre infographic, derived from data from the National Center for Health Statistics and the National Safety Council.

Chance as a Subjective or Objective Measure

[div class=attrib]From Rationally Speaking:[end-div]

Stop me if you’ve heard this before: suppose I flip a coin, right now. I am not giving you any other information. What odds (or probability, if you prefer) do you assign that it will come up heads?

If you would happily say “Even” or “1 to 1” or “Fifty-fifty” or “probability 50%” — and you’re clear on WHY you would say this — then this post is not aimed at you, although it may pleasantly confirm your preexisting opinions as a Bayesian on probability. Bayesians, broadly, consider probability to be a measure of their state of knowledge about some proposition, so that different people with different knowledge may correctly quote different probabilities for the same proposition.

If you would say something along the lines of “The question is meaningless; probability only has meaning as the many-trials limit of frequency in a random experiment,” or perhaps “50%, but only given that a fair coin and fair flipping procedure is being used,” this post is aimed at you. I intend to try to talk you out of your Frequentist view; the view that probability exists out there and is an objective property of certain physical systems, which we humans, merely fallibly, measure.

My broader aim is therefore to argue that “chance” is always and everywhere subjective — a result of the limitations of minds — rather than objective in the sense of actually existing in the outside world.
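
To make the Bayesian reading concrete, here is a toy illustration of our own (not from the article): two observers assign a probability to the very same flip of a possibly biased coin and arrive at different, equally defensible answers, because they hold different information.

```python
# Toy illustration (ours, not the article's): two observers, one coin flip.
# Probability here measures each observer's state of knowledge, not a
# physical property of the coin itself.

def prob_next_heads(prior_a, prior_b, heads_seen, tails_seen):
    """Beta(prior_a, prior_b) prior on the heads probability, updated on data;
    returns the posterior predictive probability that the next flip is heads."""
    a = prior_a + heads_seen
    b = prior_b + tails_seen
    return a / (a + b)

# Observer A knows nothing beyond "it's a coin": uniform Beta(1, 1) prior, no data.
p_a = prob_next_heads(1, 1, 0, 0)

# Observer B starts from the same prior but has watched 8 heads in 10 flips.
p_b = prob_next_heads(1, 1, 8, 2)

print(f"Observer A's probability of heads: {p_a:.2f}")   # 0.50
print(f"Observer B's probability of heads: {p_b:.2f}")   # 0.75
```

On a strict frequentist reading, both observers should decline to answer until a repeatable experiment is specified; on the Bayesian reading the article defends, both answers are correct, because each simply reports a different state of knowledge about the same event.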

[div class=attrib]Much more of this article here.[end-div]

[div class=attrib]Image courtesy of Wikipedia.[end-div]

MondayPoem: When I have Fears That I May Cease to Be

This week’s poem, courtesy of the great Romantic John Keats, delves into the subject of time and the brevity of our days on this Earth. Although Keats was frequently scorned by critics during his lifetime, death transformed him into one of England’s most loved poets.

By John Keats:

– When I have Fears That I May Cease to Be

When I have fears that I may cease to be
Before my pen has gleaned my teeming brain,
Before high-pilèd books, in charactery,
Hold like rich garners the full ripened grain;
When I behold, upon the night’s starred face,
Huge cloudy symbols of a high romance,
And think that I may never live to trace
Their shadows with the magic hand of chance;
And when I feel, fair creature of an hour,
That I shall never look upon thee more,
Never have relish in the fairy power
Of unreflecting love—then on the shore
Of the wide world I stand alone, and think
Till love and fame to nothingness do sink.

[div class=attrib]Portrait of John Keats by William Hilton. National Portrait Gallery, London, courtesy of Wikipedia.[end-div]

Faster Than Light Travel

The world of particle physics is agog with recent news of an experiment that shows a very unexpected result – sub-atomic particles traveling faster than the speed of light. If verified and independently replicated, the results would violate one of the universe’s fundamental properties, described by Einstein in the Special Theory of Relativity. The speed of light — 186,282 miles per second (299,792 kilometers per second) — has long been considered an absolute cosmic speed limit.

Stranger still, over the last couple of days news of this anomalous result has even been broadcast on many cable news shows.

The experiment, known as OPERA, is a collaboration between France’s National Institute for Nuclear and Particle Physics Research and Italy’s Gran Sasso National Laboratory. Over the course of three years, scientists fired a neutrino beam 454 miles (730 kilometers) underground from Geneva to a receiver in Italy. Their measurements show that the neutrinos arrived an average of 60 nanoseconds sooner than light would have done. This doesn’t seem like a great amount; after all, it is only 60 billionths of a second. However, the small difference could nonetheless undermine a hundred years of physics.
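
As a rough back-of-the-envelope check (our own sketch, using only the figures quoted above: a 730 km baseline and a 60-nanosecond early arrival), the implied excess over the speed of light works out to only a few parts in a hundred thousand:

```python
# Back-of-the-envelope check of the reported OPERA anomaly, using only the
# figures quoted above: a 730 km baseline and a ~60 ns early arrival.
c_km_per_s = 299792.458      # speed of light in vacuum, km/s
baseline_km = 730.0          # approximate Geneva to Gran Sasso distance
early_arrival_s = 60e-9      # neutrinos reported ~60 ns ahead of light

light_time_s = baseline_km / c_km_per_s               # ~2.4 milliseconds
fractional_excess = early_arrival_s / light_time_s    # (v - c)/c, to first order

print(f"Light travel time over the baseline: {light_time_s * 1e3:.3f} ms")
print(f"Implied fractional speed excess: {fractional_excess:.2e}")
# Prints roughly 2.5e-05, i.e. about 25 parts per million faster than light.
```

Tiny as that ratio sounds, it is exactly the kind of discrepancy that special relativity forbids outright, which is why independent replication matters so much.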

Understandably, most physicists remain skeptical of the result until further independent experiments confirm or refute the measurements. However, all seem to agree that if the result is confirmed, it would be a monumental finding that would likely reshape modern physics and our understanding of the universe.

[div class=attrib]More on this intriguing story here courtesy of Ars Technica, which also offers a detailed explanation of several possible sources of error that may have contributed to the faster-than-light measurements.[end-div]

Eurovision

If you grew up in Europe, or have spent at least 6 months there over the last 50 years, you’ll have collided with the Eurovision Song Contest.

A quintessentially European invention, Eurovision, as it is commonly known, has grown from a handful of countries to embrace 43 nations across Europe in 2012. Countries compete for the prize of best song and the honor of hosting the contest the following year. While contestant and song are not usually guaranteed long-standing commercial success, the winner usually does claim 15 minutes or so in the spotlight and at least a one-hit wonder. A notable exception was the Swedish group ABBA, which went on to generation-spanning superstardom.

Frank Jacobs over at Strange Maps offers his cartographic take on Eurovision.

[div class=attrib]From Strange Maps / Big Think:[end-div]

The Eurovision Song Contest is a resounding success in at least one respect. Set up as a laboratory of European harmony – musically, audiovisually and politically – its first edition [1] featured a mere 7 participating countries, all Western European. The 57th edition, next May in Azerbaijan, will have 43 countries from all over the continent vying for the top prize, and the honour to host the 2013 edition of the event in their capital city.

Mission accomplished, then. But a chorus of critics – swelling, as the turn of phrase suggests [2] – finds the annual event increasingly tacky and irrelevant. The winner is determined by a tally of national votes, which have less to do with the quality of the songs than with the degree of friendliness between the participating countries.

[div class=attrib]More of the article here.[end-div]

London’s Other River

You will have heard of the River Thames, the famous swathe of grey that cuts a watery path through London. You may even have heard of several of London’s prominent canals, such as the Grand Union Canal and Regent’s Canal. But you probably will not have heard of the mysterious River Fleet that meanders through eerie tunnels beneath the city.

The Fleet and its Victorian tunnels are available for exploration, but are not for the faint of heart or sensitive of nose.

For more stunning subterranean images follow the full article here.

[div class=attrib]Images courtesy of Environmental Graffiti.[end-div]

The Sins of Isaac Newton

Aside from founding classical mechanics (think universal gravitation and the laws of motion), laying the building blocks of calculus, and inventing the reflecting telescope, Isaac Newton made time for spiritual pursuits. In fact, Newton was a highly religious individual (though a somewhat unorthodox Christian).

So, although Newton is best remembered for his monumental work, Philosophiæ Naturalis Principia Mathematica, he kept a lesser-known but no less detailed journal of his sins while a freshman at Cambridge. A list of Newton’s most “heinous” self-confessed moral failings follows below.

[div class=attrib]From io9:[end-div]

10. Making a feather while on Thy day.

Anyone remember the Little House series, where every day they worked their prairie-wind-chapped asses off and risked getting bitten by badgers and nearly lost eyes to exploding potatoes (all true), but never complained about anything until they hit Sunday and literally had to do nothing all day? That was hundreds of years after Newton. And Newton was even more bored than the Little House people, although he was sorry about it later. He confesses everything from making a mousetrap on Sunday, to playing chimes, to helping a roommate with a school project, to making pies, to ‘squirting water’ on the Sabbath.

9. Having uncleane thoughts words and actions and dreamese.

Well, to be fair, he was only a boy at this time. He may have had all the unclean thoughts in the world, but Newton, on his death bed, is well known for saying he is proudest of dying a virgin. And this is from the guy who invented the Laws of Motion.

8. Robbing my mothers box of plums and sugar.

Clearly he needed to compensate for lack of carnal pleasure with some other kind of physical comfort. It seems that Newton had a sweet tooth. There’s this ‘robbery.’ There’s the aforementioned pies, although they might be savory pies. And in another confession he talks about how he had ‘gluttony in his sickness.’ The guy needed to eat.

7. Using unlawful means to bring us out of distresses.

This is a strange sin because it’s so vague. Could it be that the ‘distresses’ were financial, leading to another confessed sin of ‘Striving to cheat with a brass halfe crowne’? Some biographers think that this is a sexual confession and his ‘distresses’ were carnal. Newton isn’t just saying that he used immoral means, but unlawful ones. What law did he break?

6. Using Wilford’s towel to spare my own.

Whatever else Newton was, he was a terrible roommate. Although he was a decent student, he was reputed to be bad at personal relationships with anyone, at any time. This sin, using someone’s towel, was probably more a big deal during a time when plague was running through the countryside. He also confesses to, “Denying my chamberfellow of the knowledge of him that took him for a sot.”

And his sweet tooth still reigned. Any plums anyone left out would probably be gone by the time they got back. He confessed the sin of “Stealing cherry cobs from Eduard Storer.” Just to top it off, Newton confessed to ‘peevishness’ with people over and over in his journal. He was clearly a moody little guy. No word on whether he apologized to them about it, but he apologized to God, and surely that was enough.

[div class=attrib]More of the article here.[end-div]

[div class=attrib]Image courtesy of Wikipedia.[end-div]

Why Are the French Not as Overweight as Americans?

[div class=attrib]From the New York Times:[end-div]

PARIS — You’re reminded hourly, even while walking along the slow-moving Seine or staring at sculpted marble bodies under the Louvre’s high ceilings, that the old continent is crumbling. They’re slouching toward a gerontocracy, these Europeans. Their banks are teetering. They can’t handle immigration. Greece is broke, and three other nations are not far behind. In a half-dozen languages, the papers shout: crisis!

If the euro fails, as Chancellor Angela Merkel of Germany said, then Europe fails. That means a recession here, and a likely one at home, which will be blamed on President Obama, and then Rick Perry will get elected, and the leader of the free world will be somebody who thinks the earth is only a few thousand years old.

You see where it’s all going, this endless “whither the euro question.” So, you think of something else, the Parisian way. You think of what these people can eat on a given day: pain au chocolat for breakfast, soupe à l’oignon gratinée topped by melted gruyère for lunch and foie gras for dinner, as a starter.

And then you look around: how can they live like this? Where are all the fat people? It’s a question that has long tormented visitors. These French, they eat anything they damn well please, drink like Mad Men and are healthier than most Americans. And of course, their medical care is free and universal, and considered by many to be the best in the world.

… Recent studies indicate that the French are, in fact, getting fatter — just not as much as everyone else. On average, they are where Americans were in the 1970s, when the ballooning of a nation was still in its early stages. But here’s the good news: they may have figured out some way to contain the biggest global health threat of our time, for France is now one of a handful of nations where obesity among the young has leveled off.

First, the big picture: Us. We — my fellow Americans — are off the charts on this global pathology. The latest jolt came from papers published last month in The Lancet, projecting that three-fourths of adults in the United States will be overweight or obese by 2020.

Only one state, Colorado, now has an obesity rate under 20 percent (obesity is the higher of the two body-mass indexes, the other being overweight). But that’s not good news. The average bulge of an adult Coloradan has increased 80 percent over the last 15 years. They only stand out by comparison to all other states. Colorado, the least fat state in 2011, would be the heaviest had they reported their current rate of obesity 20 years ago. That’s how much we’ve slipped.

… A study of how the French appear to have curbed childhood obesity shows the issue is not complex. Junk food vending machines were banned in schools. The young were encouraged to exercise more. And school lunches were made healthier.

… But another answer can come from self-discovery. Every kid should experience a fresh peach in August. And an American newly arrived in the City of Light should nibble at a cluster of grapes or some blood-red figs, just as the French do, with that camembert.

[div class=attrib]More from the article here.[end-div]

[div class=attrib]Obesity classification standards illustration courtesy of Wikipedia.[end-div]

Atheism: Scientific or Humanist

[div class=attrib]From The Stone forum, New York Times:[end-div]

Led by the biologist Richard Dawkins, the author of “The God Delusion,” atheism has taken on a new life in popular religious debate. Dawkins’s brand of atheism is scientific in that it views the “God hypothesis” as obviously inadequate to the known facts. In particular, he employs the facts of evolution to challenge the need to postulate God as the designer of the universe. For atheists like Dawkins, belief in God is an intellectual mistake, and honest thinkers need simply to recognize this and move on from the silliness and abuses associated with religion.

Most believers, however, do not come to religion through philosophical arguments. Rather, their belief arises from their personal experiences of a spiritual world of meaning and values, with God as its center.

In the last few years there has emerged another style of atheism that takes such experiences seriously. One of its best exponents is Philip Kitcher, a professor of philosophy at Columbia. (For a good introduction to his views, see Kitcher’s essay in “The Joy of Secularism,” perceptively discussed last month by James Wood in The New Yorker.)

Instead of focusing on the scientific inadequacy of theistic arguments, Kitcher critically examines the spiritual experiences underlying religious belief, particularly noting that they depend on specific and contingent social and cultural conditions. Your religious beliefs typically depend on the community in which you were raised or live. The spiritual experiences of people in ancient Greece, medieval Japan or 21st-century Saudi Arabia do not lead to belief in Christianity. It seems, therefore, that religious belief very likely tracks not truth but social conditioning. This “cultural relativism” argument is an old one, but Kitcher shows that it is still a serious challenge. (He is also refreshingly aware that he needs to show why a similar argument does not apply to his own position, since atheistic beliefs are themselves often a result of the community in which one lives.)

[div class=attrib]More of the article here.[end-div]

[div class=attrib]Image: Ephesians 2,12 – Greek atheos, courtesy of Wikipedia.[end-div]

MondayPoem: Mathematics Considered as a Vice

A poem by Anthony Hecht this week. On Hecht, Poetry Foundation remarks, “[o]ne of the leading voices of his generation, Anthony Hecht’s poetry is known for its masterful use of traditional forms and linguistic control.”

Following Hecht’s death in 2004, the New York Times observed:

It was Hecht’s gift to see into the darker recesses of our complex lives and conjure to his command the exact words to describe what he found there. Hecht remained skeptical about whether pain and contemplation can ultimately redeem us, yet his ravishing poems extend hope to his readers that they can.

By Anthony Hecht:

– Mathematics Considered as a Vice

I would invoke that man
Who chipped for all posterity an ass
(The one that Jesus rode)
Out of hard stone, and set its either wing
Among the wings of the most saintly clan
On Chartres Cathedral, and that it might sing
The praise to all who pass
Of its unearthly load,
Hung from its neck a harp-like instrument.
I would invoke that man
To aid my argument.

The ass smiles on us all,
Being astonished that an ass might rise
To such sure eminence
Not merely among asses but mankind,
Simpers, almost, upon the western wall
In praise of folly, who midst sow and kine,
Saw with its foolish eyes
Gold, Myrrh, and Frankincense
Enter the stable door, against all odds.
The ass smiles on us all.
Our butt at last is God’s.

That man is but an ass—
More perfectly, that ass is but a man
Who struggles to describe
Our rich, contingent and substantial world
In ideal signs: the dunged and pagan grass,
Misted in summer, or the mother-of-pearled
Home of the bachelor-clam.
A cold and toothless tribe
Has he for brothers, who would coldly think.
That man is but an ass
Who smells not his own stink.

For all his abstract style
Speaks not to our humanity, and shows
Neither the purity
Of heaven, nor the impurity beneath,
And cannot see the feasted crocodile
Ringed with St. Francis’ birds to pick its teeth,
Nor can his thought disclose
To normal intimacy,
Siamese twins, the double-beasted back,
For all his abstract style
Utters our chiefest lack.

Despite his abstract style,
Pickerel will dawdle in their summer pools
Lit by the flitterings
Of light dashing the gusty surfaces,
Or lie suspended among shades of bile
And lime in fluent shift, for all he says.
And all the grey-haired mules,
Simple and neuter things,
Will bray hosannas, blessing harp and wing.
For all his abstract style,
The ass will learn to sing.

The Teen Brain: Work In Progress or Adaptive Network?

[div class=attrib]From Wired:[end-div]

Ever since the late 1990s, when researchers discovered that the human brain takes until our mid-20s to fully develop — far longer than previously thought — the teen brain has been getting a bad rap. Teens, the emerging dominant narrative insisted, were “works in progress” whose “immature brains” left them in a state “akin to mental retardation” — all titles from prominent papers or articles about this long developmental arc.

In a National Geographic feature to be published next week, however, I highlight a different take: A growing view among researchers that this prolonged developmental arc is less a matter of delayed development than prolonged flexibility. This account of the adolescent brain — call it the “adaptive adolescent” meme rather than the “immature brain” meme — “casts the teen less as a rough work than as an exquisitely sensitive, highly adaptive creature wired almost perfectly for the job of moving from the safety of home into the complicated world outside.” The teen brain, in short, is not dysfunctional; it’s adaptive.

Carl Zimmer over at Discover gives us some further interesting insights into recent studies of teen behavior.

[div class=attrib]From Discover:[end-div]

Teenagers are a puzzle, and not just to their parents. When kids pass from childhood to adolescence their mortality rate doubles, despite the fact that teenagers are stronger and faster than children as well as more resistant to disease. Parents and scientists alike abound with explanations. It is tempting to put it down to plain stupidity: Teenagers have not yet learned how to make good choices. But that is simply not true. Psychologists have found that teenagers are about as adept as adults at recognizing the risks of dangerous behavior. Something else is at work.

Scientists are finally figuring out what that “something” is. Our brains have networks of neurons that weigh the costs and benefits of potential actions. Together these networks calculate how valuable things are and how far we’ll go to get them, making judgments in hundredths of a second, far from our conscious awareness. Recent research reveals that teen brains go awry because they weigh those consequences in peculiar ways.

… Neuroscientist B. J. Casey and her colleagues at the Sackler Institute of the Weill Cornell Medical College believe the unique way adolescents place value on things can be explained by a biological oddity. Within our reward circuitry we have two separate systems, one for calculating the value of rewards and another for assessing the risks involved in getting them. And they don’t always work together very well.

… The trouble with teens, Casey suspects, is that they fall into a neurological gap. The rush of hormones at puberty helps drive the reward-system network toward maturity, but those hormones do nothing to speed up the cognitive control network. Instead, cognitive control slowly matures through childhood, adolescence, and into early adulthood. Until it catches up, teenagers are stuck with strong responses to rewards without much of a compensating response to the associated risks.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Kitra Cahana, National Geographic.[end-div]

The Universe and Determinism

General scientific consensus suggests that our universe has no pre-defined destiny. While a number of current theories propose anything from a final Big Crunch to an accelerating expansion into cold nothingness, the future plan for the universe is not pre-determined. Unfortunately, our increasingly sophisticated scientific tools are still too meager to test and answer these questions definitively. So, theorists currently seem to have the upper hand. And now yet another theory turns current cosmological thinking on its head by proposing that the future is pre-destined and that it may even reach back into the past to shape the present. Confused? Read on!

[div class=attrib]From FQXi:[end-div]

The universe has a destiny—and this set fate could be reaching backwards in time and combining with influences from the past to shape the present. It’s a mind-bending claim, but some cosmologists now believe that a radical reformulation of quantum mechanics in which the future can affect the past could solve some of the universe’s biggest mysteries, including how life arose. What’s more, the researchers claim that recent lab experiments are dramatically confirming the concepts underpinning this reformulation.

Cosmologist Paul Davies, at Arizona State University in Tempe, is embarking on a project to investigate the future’s reach into the present, with the help of a $70,000 grant from the Foundational Questions Institute. It is a project that has been brewing for more than 30 years, since Davies first heard of attempts by physicist Yakir Aharonov to get to the root of some of the paradoxes of quantum mechanics. One of these is the theory’s apparent indeterminism: You cannot predict the outcome of experiments on a quantum particle precisely; perform exactly the same experiment on two identical particles and you will get two different results.

While most physicists faced with this have concluded that reality is fundamentally, deeply random, Aharonov argues that there is order hidden within the uncertainty. But to understand its source requires a leap of imagination that takes us beyond our traditional view of time and causality. In his radical reinterpretation of quantum mechanics, Aharonov argues that two seemingly identical particles behave differently under the same conditions because they are fundamentally different. We just do not appreciate this difference in the present because it can only be revealed by experiments carried out in the future.

“It’s a very, very profound idea,” says Davies. Aharonov’s take on quantum mechanics can explain all the usual results that the conventional interpretations can, but with the added bonus that it also explains away nature’s apparent indeterminism. What’s more, a theory in which the future can influence the past may have huge—and much needed—repercussions for our understanding of the universe, says Davies.

[div class=attrib]More from theSource here.[end-div]