All posts by Mike

Democracy is Ugly and Petty

While this election cycle in the United States has been especially partisan, it’s worth remembering that politics in an open democracy is sometimes brutal, frequently nasty and often petty. Partisan fights, both metaphorical and physical, have been occurring since the Republic was founded.

[div class=attrib]From the New York Times:[end-div]

As the cable news channels count down the hours before the first polls close on Tuesday, an entire election cycle will have passed since President Obama last sat down with Fox News. The organization’s standing request to interview the president is now almost two years old.

At NBC News, the journalists reporting on the Romney campaign will continue to absorb taunts from their sources about their sister cable channel, MSNBC. “You mean, Al Sharpton’s network,” as Stuart Stevens, a senior Romney adviser, is especially fond of reminding them.

Spend just a little time watching either Fox News or MSNBC, and it is easy to see why such tensions run high. In fact, by some measures, the partisan bitterness on cable news has never been as stark — and in some ways, as silly or small.

Martin Bashir, the host of MSNBC’s 4 p.m. hour, recently tried to assess why Mitt Romney seemed irritable on the campaign trail and offered a provocative theory: that he might have mental problems.

“Mrs. Romney has expressed concerns about her husband’s mental well-being,” Mr. Bashir told one of his guests. “But do you get the feeling that perhaps there’s more to this than she’s saying?”

Over on Fox News, similar psychological evaluations were under way on “Fox & Friends.” Keith Ablow, a psychiatrist and a member of the channel’s “Medical A-Team,” suggested that Joseph R. Biden Jr.’s “bizarre laughter” during the vice-presidential debate might have something to do with a larger mental health issue. “You have to put dementia on the differential diagnosis,” he noted matter-of-factly.

Neither outlet has built its reputation on moderation and restraint, but during this presidential election, research shows that both are pushing their stridency to new levels.

A Pew Research Center study found that of Fox News stories about Mr. Obama from the end of August through the end of October, just 6 percent were positive and 46 percent were negative.

Pew also found that Mr. Obama was covered far more than Mr. Romney. The president was a significant figure in 74 percent of Fox’s campaign stories, compared with 49 percent for Romney. In 2008, Pew found that the channel reported on Mr. Obama and John McCain in roughly equal amounts.

The greater disparity was on MSNBC, which gave Mr. Romney positive coverage just 3 percent of the time, Pew found. It examined 259 segments about Mr. Romney and found that 71 percent were negative.

MSNBC, whose programs are hosted by a new crop of extravagant partisans like Mr. Bashir, Mr. Sharpton and Lawrence O’Donnell, has tested the limits of good taste this year. Mr. O’Donnell was forced to apologize in April after describing the Mormon Church as nothing more than a scheme cooked up by a man who “got caught having sex with the maid and explained to his wife that God told him to do it.”

The channel’s hosts recycle talking points handed out by the Obama campaign, even using them as titles for program segments, as Mr. Bashir did recently with a segment he called “Romnesia,” referring to Mr. Obama’s term for his opponent’s shifting positions.

The hosts insult and mock, as Alex Wagner did recently in describing Mr. Romney’s trip overseas as “National Lampoon’s European Vacation” — a line she borrowed from an Obama spokeswoman. Mr. Romney was not only hapless, Ms. Wagner said, he also looked “disheveled” and “a little bit sweaty” in a recent appearance.

Not that they save their scorn just for their programs. Some MSNBC hosts even use the channel’s own ads promoting its slogan “Lean Forward” to criticize Mr. Romney and the Republicans. Mr. O’Donnell accuses the Republican nominee of basing his campaign on the false notion that Mr. Obama is inciting class warfare. “You have to come up with a lie,” he says, when your campaign is based on empty rhetoric.

In her ad, Rachel Maddow breathlessly decodes the logic behind the push to overhaul state voting laws. “The idea is to shrink the electorate,” she says, “so a smaller number of people get to decide what happens to all of us.”

Such stridency has put NBC News journalists who cover Republicans in awkward and compromised positions, several people who work for the network said. To distance themselves from their sister channel, they have started taking steps to reassure Republican sources, like pointing out that they are reporting for NBC programs like “Today” and “Nightly News” — not for MSNBC.

At Fox News, there is a palpable sense that the White House punishes the outlet for its coverage, not only by withholding the president, who has done interviews with every other major network, but also by denying it access to Michelle Obama.

This fall, Mrs. Obama has done a spate of television appearances, from CNN to “Jimmy Kimmel Live” on ABC. But when officials from Fox News recently asked for an interview with the first lady, they were told no. She has not appeared on the channel since 2010, when she sat down with Mike Huckabee.

Lately the White House and Fox News have been at odds over the channel’s aggressive coverage of the attack on the American diplomatic mission in Benghazi, Libya. Fox initially raised questions over the White House’s explanation of the events that led to the attack — questions that other news organizations have since started reporting on more fully.

But the commentary on the channel quickly and often turns to accusations that the White House played politics with American lives. “Everything they told us was a lie,” Sean Hannity said recently as he and John H. Sununu, a former governor of New Hampshire and a Romney campaign supporter, took turns raising questions about how the Obama administration misled the public. “A hoax,” Mr. Hannity called the administration’s explanation. “A cover-up.”

Mr. Hannity has also taken to selectively fact-checking Mr. Obama’s claims, co-opting a journalistic tool that has proliferated in this election as news outlets sought to bring more accountability to their coverage.

Mr. Hannity’s guest fact-checkers have included hardly objective sources, like Dick Morris, the former Clinton aide turned conservative commentator; Liz Cheney, the daughter of former Vice President Dick Cheney; and Michelle Malkin, the right-wing provocateur.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of University of Maine at Farmington.[end-div]

The Beauty of Ugliness

The endless pursuit of beauty in human affairs probably pre-dates our historical record. We certainly know that ancient Egyptians used cosmetics believing them to offer magical and religious powers, in addition to aesthetic value.

Yet, paradoxically, beauty is rather subjective and often fleeting. The French singer, songwriter, composer and bon viveur Serge Gainsbourg once said that “ugliness is superior to beauty because it lasts longer”. Author Stephen Bayley argues in his new book, “Ugly: The Aesthetics of Everything”, that beauty is downright boring.

[div class=attrib]From the Telegraph:[end-div]

Beauty is boring. And the evidence is piling up. An article in the journal Psychological Science now confirms what partygoers have known forever: that beauty and charm are no more directly linked than a high IQ and a talent for whistling.

A group of scientists set out to discover whether physically attractive people also have appealing character traits and values, and found, according to Lihi Segal-Caspi, who carried out part of the research, that “beautiful people tend to focus more on conformity and self-promotion than independence and tolerance”.

Certainly, while a room full of beautiful people might be impressively stiff with the whiff of Chanel No 5, the intellectual atmosphere will be carrying a very low charge. If positive at all.

The grizzled and gargoyle-like Parisian chanteur, and legendary lover, Serge Gainsbourg always used to pick up the ugliest girls at parties. This was not simply because predatory male folklore insists that ill-favoured women will be more “grateful”, but because Gainsbourg, a stylish contrarian, knew that the conversation would be better, the uglier the girl.

Beauty is a conformist conspiracy. And the conspirators include the fashion, cosmetics and movie businesses: a terrible Greek chorus of brainless idolatry towards abstract form. The conspirators insist that women – and, nowadays, men, too – should be un-creased, smooth, fat-free, tanned and, with the exception of the skull, hairless. Flawlessly dull. Even Hollywood once acknowledged the weakness of this proposition: Marilyn Monroe was made more attractive still by the addition of a “beauty spot”, a blemish turned into an asset.

The red carpet version of beauty is a feeble, temporary construction. Bodies corrode and erode, sag and bulge, just as cars rust and buildings develop a fine patina over time. This is not to be feared, rather to be understood and enjoyed. Anyone wishing to arrest these processes with the aid of surgery, aerosols, paint, glue, drugs, tape and Lycra must be both very stupid and very vain. Hence the problems encountered in conversation with beautiful people: stupidity and vanity rarely contribute much to wit and creativity.

Fine features may be all very well, but the great tragedy of beauty is that it is so ephemeral. Albert Camus said it “drives us to despair, offering for a minute the glimpse of an eternity that we should like to stretch out over the whole of time”. And Gainsbourg agreed when he said: “Ugliness is superior to beauty because it lasts longer.” A hegemony of beautiful perfection would be intolerable: we need a good measure of ugliness to keep our senses keen. If everything were beautiful, nothing would be.

And yet, despite the evidence against, there has been a conviction that beauty and goodness are somehow inextricably and permanently linked. Political propaganda exploited our primitive fear of ugliness, so we had Second World War American posters of Japanese looking like vampire bats. The Greeks believed that beauty had a moral character: beautiful people – discus-throwers and so on – were necessarily good people. Darwin explained our need for “beauty” in saying that breeding attractive children is a survival characteristic: I may feel the need to fuse my premium genetic material with yours, so that humanity continues in the same fine style.

This became a lazy consensus, described as the “beauty premium” by US economists Markus M Mobius and Tanya S Rosenblat. The “beauty premium” insists that as attractive children grow into attractive adults, they may find it easier to develop agreeable interpersonal communications skills because their audience reacts more favourably to them. In this beauty-related employment theory, short people are less likely to get a good job. As Randy Newman sang: “Short people got no reason to live.” So Darwin’s argument that evolutionary forces favour a certain physical type may be proven in the job market as well as the wider world.

But as soon as you try to grasp the concept of beauty, it disappears.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of Google search.[end-div]

Does Evil Exist?

Humans have a peculiar habit of anthropomorphizing anything that moves and, for that matter, most objects that remain static as well. So, it is not surprising that evil is often personified and even stereotyped; it is said that true evil even has a home somewhere below where you currently stand.

[div class=attrib]From the Guardian:[end-div]

The friction between the presence of evil in our world and belief in a loving creator God sparks some tough questions. For many religious people these are primarily existential questions, as their faith contends with doubt and bewilderment. The biblical figure of Job, the righteous man who loses everything that is dear to him, remains a powerful example of this struggle. But the “problem of evil” is also an intellectual puzzle that has taxed the minds of philosophers and theologians for centuries.

One of the most influential responses to the problem of evil comes from St Augustine. As a young man, Augustine followed the teachings of a Christian sect known as the Manichees. At the heart of Manichean theology was the idea of a cosmic battle between the forces of good and evil. This, of course, proposes one possible solution to the problem of evil: all goodness, purity and light comes from God, and the darkness of evil has a different source.

However, Augustine came to regard this cosmic dualism as heretical, since it undermined God’s sovereignty. Of course, he wanted to hold on to the absolute goodness of God. But if God is the source of all things, where did evil come from? Augustine’s radical answer to this question is that evil does not actually come from anywhere. Rejecting the idea that evil is a positive force, he argues that it is merely a “name for nothing other than the absence of good”.

At first glance this looks like a philosophical sleight of hand. Augustine might try to define evil out of existence, but this cannot diminish the reality of the pain, suffering and cruelty that prompt the question of evil in the first place. As the 20th-century Catholic writer Charles Journet put it, the non-being of evil “can have a terrible reality, like letters carved out of stone”. Any defence of Augustine’s position has to begin by pointing out that his account of evil is metaphysical rather than empirical. In other words, he is not saying that our experience of evil is unreal. On the contrary, since a divinely created world is naturally oriented toward the good, any lack of goodness will be felt as painful, wrong and urgently in need of repair. To say that hunger is “merely” the absence of food is not to deny the intense suffering it involves.

One consequence of Augustine’s mature view of evil as “non-being”, a privation of the good, is that evil eludes our understanding. His sophisticated metaphysics of evil confirms our intuitive response of incomprehension in the face of gratuitous brutality, or of senseless “natural” evil like a child’s cancer. Augustine emphasises that evil is ultimately inexplicable, since it has no substantial existence: “No one therefore must try to get to know from me what I know that I do not know, unless, it may be, in order to learn not to know what must be known to be incapable of being known!” Interestingly, by the way, this mysticism about evil mirrors the “negative theology” which insists that God exceeds the limits of our understanding.

So, by his own admission, Augustine’s “solution” to the problem of evil defends belief in God without properly explaining the kinds of acts which exert real pressure on religious faith. He may be right to point out that the effects of evil tend to be destruction and disorder – a twisting or scarring of nature, and of souls. Nevertheless, believers and non-believers alike will feel that this fails to do justice to the power of evil. We may demand a better account of the apparent positivity of evil – of the fact, for example, that holocausts and massacres often involve meticulous planning, technical innovation and creative processes of justification.

Surprisingly, though, the basic insight of Augustinian theodicy finds support in recent science. In his 2011 book Zero Degrees of Empathy, Cambridge psychopathology professor Simon Baron-Cohen proposes “a new theory of human cruelty”. His goal, he writes, is to replace the “unscientific” term “evil” with the idea of “empathy erosion”: “People said to be cruel or evil are simply at one extreme of the empathy spectrum,” he writes. (He points out, though, that some people at this extreme display no more cruelty than those higher up the empathy scale – they are simply socially isolated.)

Loss of empathy resembles the Augustinian concept of evil in that it is a deficiency of goodness – or, to put it less moralistically, a disruption of normal functioning – rather than a positive force. In this way at least, Baron-Cohen’s theory echoes Augustine’s argument, against the Manicheans, that evil is not an independent reality but, in essence, a lack or a loss.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Marvel Comics Vault of Evil. Courtesy of Wikia / Marvel Comics.[end-div]

From Finely Textured Beef to Soylent Pink

Blame corporate euphemisms and branding for the obfuscation of everyday things. More sinister yet is the constant reworking of names for our ever more processed foodstuffs. Only last year, as several influential health studies pointed towards the detrimental health effects of high-fructose corn syrup (HFCS), did the food industry act, but not by removing copious amounts of the addictive additive from many processed foods. Rather, the industry attempted to re-brand HFCS as “corn sugar”. And now, on to the battle over “soylent pink”, also known as “pink slime”.

[div class=attrib]From Slate:[end-div]

What do you call a mash of beef trimmings that have been chopped and then spun in a centrifuge to remove the fatty bits and gristle? According to the government and to the company that invented the process, you call it lean finely textured beef. But to the natural-food crusaders who would have the stuff removed from the nation’s hamburgers and tacos, the protein-rich product goes by another, more disturbing name: Pink slime.

The story of this activist rebranding—from lean finely textured beef to pink slime—reveals just how much these labels matter. It was the latter phrase that, for example, birthed the great ground-beef scare of 2012. In early March, journalists at both the Daily and at ABC began reporting on a burger panic: Lax rules from the U.S. Department of Agriculture allowed producers to fill their ground-beef packs with a slimy, noxious byproduct—a mush the reporters called unsanitary and without much value as a food. Coverage linked back to a New York Times story from 2009 in which the words pink slime had appeared in public for the first time in a quote from an email written by a USDA microbiologist who was frustrated at a decision to leave the additive off labels for ground meat.

The slimy terror spread in the weeks that followed. Less than a month after ABC’s initial reports, almost a quarter million people had signed a petition to get pink slime out of public school cafeterias. Supermarket chains stopped selling burger meat that contained it—all because of a shift from four matter-of-fact words to two visceral ones.

And now that rebranding has become the basis for a 263-page lawsuit. Last month, Beef Products Inc., the first and principal producer of lean/pink/textured/slimy beef, filed a defamation claim against ABC (along with that microbiologist and a former USDA inspector) in a South Dakota court. The company says the network carried out a malicious and dishonest campaign to discredit its ground-beef additive and that this work had grievous consequences. When ABC began its coverage, Beef Products Inc. was selling 5 million pounds of slime/beef/whatever every week. Then three of its four plants were forced to close, and production dropped to 1.6 million pounds. A weekly profit of $2.3 million had turned into a $583,000 weekly loss.

At Reuters, Steven Brill argued that the suit has merit. I won’t try to comment on its legal viability, but the details of the claim do provide some useful background about how we name our processed foods, in both industry and the media. It turns out the paste now known within the business as lean finely textured beef descends from an older, less purified version of the same. Producers have long tried to salvage the trimmings from a cattle carcass by cleaning off the fat and the bacteria that often congregate on these leftover parts. At best they could achieve a not-so-lean class of meat called partially defatted chopped beef, which USDA deemed too low in quality to be a part of hamburger or ground meat.

By the late 1980s, though, Eldon Roth of Beef Products Inc. had worked out a way to make those trimmings a bit more wholesome. He’d found a way, using centrifuges, to separate the fat more fully. In 1991, USDA approved his product as fat reduced beef and signed off on its use in hamburgers. JoAnn Smith, a government official and former president of the National Cattlemen’s Association, signed off on this “euphemistic designation,” writes Marion Nestle in Food Politics. (Beef Products, Inc. maintains that this decision “was not motivated by any official’s so-called ‘links to the beef industry.’ “) So 20 years ago, the trimmings had already been reformulated and rebranded once.

But the government still said that fat reduced beef could not be used in packages marked “ground beef.” (The government distinction between hamburger and ground beef is that the former can contain added fat, while the latter can’t.) So Beef Products Inc. pressed its case, and in 1993 it convinced the USDA to approve the mash for wider use, with a new and better name: lean finely textured beef. A few years later, Roth started killing the microbes on his trimmings with ammonia gas and got approval to do that, too. With government permission, the company went on to sell several billion pounds of the stuff in the next two decades.

In the meantime, other meat processors started making something similar but using slightly different names. AFA Foods (which filed for bankruptcy in April after the recent ground-beef scandal broke), has referred to its products as boneless lean beef trimmings, a more generic term. Cargill, which decontaminates its meat with citric acid in place of ammonia gas, calls its mash of trimmings finely textured beef.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Industrial ground beef. Courtesy of Wikipedia.[end-div]

Lillian Moller Gilbreth: Inventor of the Modern Kitchen

Lillian Moller Gilbreth, industrial engineer and psychologist, mother of 12 children, and yet a non-cook, invented the design of the modern kitchen. Her ideas were codified into what became known as the Kitchen Practical, unveiled in 1929 at a Women’s Exposition.

[div class=attrib]From Slate:[end-div]

The idea that housework is work now seems like a commonplace. We contract it out to housekeepers, laundromats, cleaning services, takeout places. We divvy it up: You cooked dinner, I’ll do the dishes. We count it as a second shift, as well as primary employment. But it wasn’t until the early part of the 20th century that first a literature, and then a science, developed about the best way to cook and clean. The results of this research shape the way we treat housework today, and created a template for the kitchen that remains conceptually unchanged from the 1920s. And the woman who made the kitchen better? She couldn’t cook.

If that sounds like the set-up for a comedy, that’s because it was. Lillian Moller Gilbreth, industrial psychologist and engineer, was the mother of 12 children. She and husband and partner Frank B. Gilbreth, inventors of what is known as motion study, pioneered the use of short films to watch how industrial processes and office tasks were done, breaking them down into component parts (which they called “therbligs,” Gilbreth backward) to determine how to make a job faster and less taxing. They tested many of their ideas on their children, establishing “the one best way” to take a bath, training preteens to touch type, and charting age-appropriate chores for each child. The ensuing hijinks provided enough material for memoirs written by two Gilbreth children, Cheaper by the Dozen and Belles on Their Toes.

While Frank Gilbreth was alive, he and Lillian worked for industry. She wrote or co-wrote many of his books, but often took no credit, as it was Frank with whom the male executives wanted to deal. After his sudden death in 1924, she had to re-establish herself as a solo female practitioner. According to biographer Jane Lancaster, in Making Time, Gilbreth soon saw that combining her professional expertise on motion study with her assumed expertise on women’s work gave her a marketable niche.

Frank B. Gilbreth Jr. and Ernestine Gilbreth Carey write, in Belles on Their Toes:
If the only way to enter a man’s field was through the kitchen door, that’s the way she’d enter… Mother planned, on paper, an efficiency-type kitchenette of the kind used today in a good many apartments. Under her arrangement, a person could mix a cake, put it in the oven, and do the dishes, without taking more than a couple of dozen steps.

It had to be cake, because that was one of the few dishes Gilbreth made well. Gilbreth had grown up in an upper-class household in California with a Chinese chef. She had worked side by side with Frank Gilbreth from the day they married. As she told a group of businesswomen in 1930, “We considered our time too valuable to be devoted to actual labor in the home. We were executives.” And family councils, at the Gilbreth home in Montclair, were run like board meetings.

Even though she did not do it herself, Gilbreth still considered housework unpaid labor, and as such, capable of efficiencies. The worker in the kitchen in the 1920s was often not a servant but the lady of the house, who spent an estimated 50 percent of her day there. The refrigerator had begun to arrive in middle-class homes, but was the subject of a pitched battle between gas and electric companies as to who made the superior chiller. Smaller electric appliances were also in development. “Home economists” raised the bar for domestic health and hygiene. Women became the targets of intense marketing campaigns for products large and small. Gilbreth worked for these manufacturers, and thus is complicit in the rise of consumerism for the home, but she never made explicit endorsements.

She did, however, partner with the Brooklyn Borough Gas Company to develop Gilbreth’s Kitchen Practical, unveiled in 1929 at a Women’s Exposition. The kitchen was intended to showcase the new gas-fueled appliances as well as Gilbreth’s research on motion savings. It was to replace the loose-fit kitchen of many traditional homes (including the Gilbreths’): a large room with discrete pieces of furniture around the edges. These might include a table, a freestanding cupboard or Hoosier cabinet, an icebox, a sink with a drying board and a stove. Ingredients, utensils and cookware might be across the room, or even in a separate pantry.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Kitchen Practical 1929. Courtesy of Gilbreth Network.[end-div]

It’s About Equality, Stupid

[div class=attrib]From Project Syndicate:[end-div]

The king of Bhutan wants to make us all happier. Governments, he says, should aim to maximize their people’s Gross National Happiness rather than their Gross National Product. Does this new emphasis on happiness represent a shift or just a passing fad?

It is easy to see why governments should de-emphasize economic growth when it is proving so elusive. The eurozone is not expected to grow at all this year. The British economy is contracting. Greece’s economy has been shrinking for years. Even China is expected to slow down. Why not give up growth and enjoy what we have?

No doubt this mood will pass when growth revives, as it is bound to. Nevertheless, a deeper shift in attitude toward growth has occurred, which is likely to make it a less important lodestar in the future – especially in rich countries.

The first factor to undermine the pursuit of growth was concern about its sustainability. Can we continue growing at the old rate without endangering our future?

When people started talking about the “natural” limits to growth in the 1970’s, they meant the impending exhaustion of food and non-renewable natural resources. Recently the debate has shifted to carbon emissions. As the Stern Review of 2006 emphasized, we must sacrifice some growth today to ensure that we do not all fry tomorrow.

Curiously, the one taboo area in this discussion is population. The fewer people there are, the less risk we face of heating up the planet. But, instead of accepting the natural decline in their populations, rich-country governments absorb more and more people to hold down wages and thereby grow faster.

A more recent concern focuses on the disappointing results of growth. It is increasingly understood that growth does not necessarily increase our sense of well-being. So why continue to grow?

The groundwork for this question was laid some time ago. In 1974, the economist Richard Easterlin published a famous paper, “Does Economic Growth Improve the Human Lot? Some Empirical Evidence.” After correlating per capita income and self-reported happiness levels across a number of countries, he reached a startling conclusion: probably not.

Above a rather low level of income (enough to satisfy basic needs), Easterlin found no correlation between happiness and GNP per head. In other words, GNP is a poor measure of life satisfaction.

That finding reinforced efforts to devise alternative indexes. In 1972, two economists, William Nordhaus and James Tobin, introduced a measure that they called “Net Economic Welfare,” obtained by deducting from GNP “bad” outputs, like pollution, and adding non-market activities, like leisure. They showed that a society with more leisure and less work could have as much welfare as one with more work – and therefore more GNP – and less leisure.
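
In symbols, the adjustment Nordhaus and Tobin describe amounts to something like the following (a schematic rendering in my own notation, not their published formula):

```latex
\text{Net Economic Welfare}
  = \text{GNP}
  - \text{``bad'' outputs (e.g. pollution)}
  + \text{non-market activities (e.g. leisure)}
```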

More recent metrics have tried to incorporate a wider range of “quality of life” indicators. The trouble is that you can measure quantity of stuff, but not quality of life. How one combines quantity and quality in some index of “life satisfaction” is a matter of morals rather than economics, so it is not surprising that most economists stick to their quantitative measures of “welfare.”

But another finding has also started to influence the current debate on growth: poor people within a country are less happy than rich people. In other words, above a low level of sufficiency, people’s happiness levels are determined much less by their absolute income than by their income relative to some reference group. We constantly compare our lot with that of others, feeling either superior or inferior, whatever our income level; well-being depends more on how the fruits of growth are distributed than on their absolute amount.

Put another way, what matters for life satisfaction is the growth not of mean income but of median income – the income of the typical person. Consider a population of ten people (say, a factory) in which the managing director earns $150,000 a year and the other nine, all workers, earn $10,000 each. The mean average of their incomes is $24,000, but 90% earn $10,000. With this kind of income distribution, it would be surprising if growth increased the typical person’s sense of well-being.
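
As a quick check on that arithmetic, here is a minimal Python sketch; the two income figures are the ones from the hypothetical factory above, and nothing else is taken from the article:

```python
from statistics import mean, median

# The hypothetical factory described above: one managing director on
# $150,000 a year and nine workers on $10,000 each.
incomes = [150_000] + [10_000] * 9

print(mean(incomes))    # 24000   -- the mean is pulled up by the single top earner
print(median(incomes))  # 10000.0 -- the median reflects the typical worker
```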

[div class=attrib]Read the entire article after the jump.[end-div]

The Benefits and Beauty of Blue

[div class=attrib]From the New York Times:[end-div]

For the French Fauvist painter and color gourmand Raoul Dufy, blue was the only color with enough strength of character to remain blue “in all its tones.” Darkened red looks brown and whitened red turns pink, Dufy said, while yellow blackens with shading and fades away in the light. But blue can be brightened or dimmed, the artist said, and “it will always stay blue.”

Scientists, too, have lately been bullish on blue, captivated by its optical purity, complexity and metaphorical fluency. They’re exploring the physics and chemistry of blueness in nature, the evolution of blue ornaments and blue come-ons, and the sheer brazenness of being blue when most earthly life forms opt for earthy raiments of beige, ruddy or taupe.

One research team recently reported the structural analysis of a small, dazzlingly blue fruit from the African Pollia condensata plant that may well be the brightest terrestrial object in nature. Another group working in the central Congo basin announced the discovery of a new species of monkey, a rare event in mammalogy. Rarer still is the most noteworthy trait of the monkey, called the lesula: a patch of brilliant blue skin on the male’s buttocks and scrotal area that stands out from the surrounding fur like neon underpants.

Still other researchers are tracing the history of blue pigments in human culture, and the role those pigments have played in shaping our notions of virtue, authority, divinity and social class. “Blue pigments played an outstanding role in human development,” said Heinz Berke, an emeritus professor of chemistry at the University of Zurich. For some cultures, he said, they were as valuable as gold.

As a raft of surveys has shown, blue love is a global affair. Ask people their favorite color, and in most parts of the world roughly half will say blue, a figure three to four times the support accorded common second-place finishers like purple or green. Just one in six Americans is blue-eyed, but nearly one in two consider blue the prettiest eye color, which could be why some 50 percent of tinted contact lenses sold are the kind that make your brown eyes blue.

Sick children like their caretakers in blue: A recent study at the Cleveland Clinic found that young patients preferred nurses wearing blue uniforms to those in white or yellow. And am I the only person in the United States who doesn’t own a single pair of those permanently popular pants formerly known as dungarees?

“For Americans, bluejeans have a special connotation because of their association with the Old West and rugged individualism,” said Steven Bleicher, author of “Contemporary Color: Theory and Use.” The jeans take their John Wayne reputation seriously. “Because the indigo dye fades during washing, everyone’s blue becomes uniquely different,” said Dr. Bleicher, a professor of visual arts at Coastal Carolina University. “They’re your bluejeans.”

According to psychologists who explore the complex interplay of color, mood and behavior, blue’s basic emotional valence is calmness and open-endedness, in contrast to the aggressive specificity associated with red. Blue is sea and sky, a pocket-size vacation.

In a study that appeared in the journal Perceptual & Motor Skills, researchers at Aichi University in Japan found that subjects who performed a lengthy video game exercise while sitting next to a blue partition reported feeling less fatigued and claustrophobic, and displayed a more regular heart beat pattern, than did people who sat by red or yellow partitions.

In the journal Science, researchers at the University of British Columbia described their study of how computer screen color affected participants’ ability to solve either creative problems — for example, determining the word that best unifies the terms “shelf,” “read” and “end” (answer: book) — or detail-oriented tasks like copy editing. The researchers found that blue screens were superior to red or white backgrounds at enhancing creativity, while red screens worked best for accuracy tasks. Interestingly, when participants were asked to predict which screen color would improve performance on the two categories of problems, big majorities deemed blue the ideal desktop setting for both.

But skies have their limits, and blue can also imply coldness, sorrow and death. On learning of a good friend’s suicide in 1901, Pablo Picasso fell into a severe depression, and he began painting images of beggars, drunks, the poor and the halt, all famously rendered in a palette of blue.

The provenance of using “the blues” to mean sadness isn’t clear, but L. Elizabeth Crawford, a professor of psychology at the University of Richmond in Virginia, suggested that the association arose from the look of the body when it’s in a low energy, low oxygen state. “The lips turn blue, there’s a blue pallor to the complexion,” she said. “It’s the opposite of the warm flushing of the skin that we associate with love, kindness and affection.”

Blue is also known to suppress the appetite, possibly as an adaptation against eating rotten meat, which can have a bluish tinge. “If you’re on a diet, my advice is, take the white bulb out of the refrigerator and put in a blue one instead,” Dr. Bleicher said. “A blue glow makes food look very unappetizing.”

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Morpho didius, dorsal view of male butterfly. Courtesy of Wikipedia.[end-div]

MondayPoem: End of Summer

A month into fall and it really does now seem like autumn — leaves are turning and falling, jackets have reappeared, and brisk morning walks are now shrouded in darkness.

So, we turn to the first Poet Laureate of the United States of the new millennium, Stanley Kunitz, to remind us of summer’s end. Kunitz was anointed Laureate at the age of ninety-five, and died six years later. His published works span almost eight decades of thoughtful creativity.

By Stanley Kunitz

– End of Summer

An agitation of the air,
A perturbation of the light
Admonished me the unloved year
Would turn on its hinge that night.

I stood in the disenchanted field
Amid the stubble and the stones,
Amazed, while a small worm lisped to me
The song of my marrow-bones.

Blue poured into summer blue,
A hawk broke from his cloudless tower,
The roof of the silo blazed, and I knew
That part of my life was over.

Already the iron door of the north
Clangs open: birds, leaves, snows
Order their populations forth,
And a cruel wind blows.

Connectedness: A Force For Good

The internet has the potential to make our current political process obsolete. A review of “The End of Politics” by British politician Douglas Carswell shows how connectedness provides a significant opportunity to reshape the political process and, in some cases, to undermine government entirely, for the good.

[div class=attrib]Charles Moore for the Telegraph:[end-div]

I think I can help you tackle this thought-provoking book. First of all, the title misleads. Enchanting though the idea will sound to many people, this is not about the end of politics. It is, after all, written by a Member of Parliament, Douglas Carswell (Con., Clacton), and he is fascinated by the subject. There’ll always be politics, he is saying, but not as we know it.

Second, you don’t really need to read the first half. It is essentially a passionately expressed set of arguments about why our current political arrangements do not work. It is good stuff, but there is plenty of it in the more independent-minded newspapers most days. The important bit is Part Two, beginning on page 145 and running for a modest 119 pages. It is called “The Birth of iDemocracy”.

Mr Carswell resembles those old barometers in which, in bad weather (Part One), a man with a mackintosh, an umbrella and a scowl comes out of the house. In good weather (Part Two), he pops out wearing a white suit, a straw hat and a broad smile. What makes him happy is the feeling that the digital revolution can restore to the people the power which, in the early days of the universal franchise, they possessed – and much, much more. He believes that the digital revolution has at last harnessed technology to express the “collective brain” of humanity. We develop our collective intelligence by exchanging the properties of our individual ones.

Throughout history, we have been impeded in doing this by physical barriers, such as distance, and by artificial ones, such as priesthoods of bureaucrats and experts. Today, i-this and e-that are cutting out these middlemen. He quotes the internet sage, Clay Shirky: “Here comes everybody”. Mr Carswell directs magnificent scorn at the aides to David Cameron who briefed the media that the Prime Minister now has an iPad app which will allow him, at a stroke of his finger, “to judge the success or failure of ministers with reference to performance-related data”.

The effect of the digital revolution is exactly the opposite of what the aides imagine. Far from now being able to survey everything, always, like God, the Prime Minister – any prime minister – is now in an unprecedentedly weak position in relation to the average citizen: “Digital technology is starting to allow us to choose for ourselves things that until recently Digital Dave and Co decided for us.”

A non-physical business, for instance, can often decide pretty freely where, for the purposes of taxation, it wants to live. Naturally, it will choose benign jurisdictions. Governments can try to ban it from doing so, but they will either fail, or find that they are cutting off their nose to spite their face. The very idea of a “tax base”, on which treasuries depend, wobbles when so much value lies in intellectual property and intellectual property is mobile. So taxes need to be flatter to keep their revenues up. If they are flatter, they will be paid by more people.

Therefore it becomes much harder for government to grow, since most people do not want to pay more.

[div class=attrib]Read the entire article after the jump.[end-div]

The United Swing States of America

Frank Jacobs over at Strange Maps offers a timely reminder of the inordinate influence that a few voters in several crucial swing states have over the rest of us.

[div class=attrib]From Strange Maps:[end-div]

At the stroke of midnight on November 6th, the 21 registered voters of Dixville Notch, gathering in the wood-panelled Ballot Room of the Balsams Grand Resort Hotel, will have just one minute to cast their vote. Speed is of the essence, if the tiny New Hampshire town is to uphold its reputation (est. 1960) as the first place to declare its results in the US presidential elections.

Later that day, well over 200 million other American voters will face the same choice as the good folks of the Notch: returning Barack Obama to the White House for a second and final four-year term, or electing Mitt Romney as the 45th President of the United States.

The winner of that contest will not be determined by whoever wins a simple majority (i.e. 50% of all votes cast, plus at least one). Like many electoral processes across the world, the system to elect the next president of the United States is riddled with idiosyncrasies and peculiarities – the quadrennial quorum in Dixville Notch being just one example.

Most US Presidents have indeed gained office by winning the popular vote, but this is not always the case. What is needed is winning the electoral vote. For the US presidential election is an indirect one: depending on the outcome in each of the 50 states, an Electoral College convenes in Washington DC to elect the President.

The total of 538 electors is distributed across the states in proportion to their population size, and is regularly adjusted to reflect increases or decreases. In 2008 Louisiana had 9 electors and South Carolina had 8; reflecting a relative population decrease and increase respectively, those numbers are now reversed.

Maine and Nebraska are the only states to assign their electors proportionally; the other 48 states (and DC) operate on the ABBA principle (“the winner takes it all”): however slight either candidate’s majority in any of those states, he wins all of its electoral votes. This rather convoluted system underlines the fact that the US presidential election is the sum of 50 state-level contests. It also brings into focus that some states are more important than others.
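
To make the winner-take-all arithmetic concrete, here is a minimal Python sketch. The electoral-vote counts for Louisiana (8) and South Carolina (9) come from the paragraph above; the candidates “A” and “B” and their popular-vote totals are invented purely for illustration:

```python
# A minimal sketch of winner-take-all allocation as described above.
# Electoral-vote counts are the two mentioned in the article; the
# popular-vote figures and candidates are hypothetical.

ELECTORAL_VOTES = {"Louisiana": 8, "South Carolina": 9}

popular_vote = {
    "Louisiana": {"A": 1_000_001, "B": 1_000_000},   # a one-vote margin
    "South Carolina": {"A": 700_000, "B": 1_100_000},
}

def allocate(popular: dict, electoral: dict) -> dict:
    """Give each state's entire slate of electors to its popular-vote winner."""
    totals: dict[str, int] = {}
    for state, votes in popular.items():
        winner = max(votes, key=votes.get)           # plurality winner in the state
        totals[winner] = totals.get(winner, 0) + electoral[state]
    return totals

print(allocate(popular_vote, ELECTORAL_VOTES))
# {'A': 8, 'B': 9} -- A's razor-thin Louisiana majority still yields
# every one of that state's electoral votes.
```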

Obviously, in this system the more populous states carry much more weight than the emptier ones. Consider the map of the United States, and focus on the 17 states west of the straight-ish line of state borders from North Dakota-Minnesota in the north to Texas-Louisiana in the south. Just two states – Texas and California – outweigh the electoral votes of the 15 others.

So presidential candidates concentrate their efforts on the states where they can hope to gain the greatest advantage. This excludes the fairly large number of states that are solidly ‘blue’ (i.e. Democratic) or ‘red’ (Republican). Texas, for example, is reliably Republican, while California can be expected to fall in the Democratic column.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Map courtesy of Strange Maps / Big Think.[end-div]

Teenagers and Time

Parents have long known that the sleep-wake cycles of their adolescent offspring are rather different to those of anyone else in the household.

Several new and detailed studies of teenagers tell us why teens are impossible to awaken at 7 am, suddenly awake at 10 pm, and often able to sleep anywhere for stretches of 16 hours.

[div class=attrib]From the Wall Street Journal:[end-div]

Many parents know the scene: The groggy, sleep-deprived teenager stumbles through breakfast and falls asleep over afternoon homework, only to spring to life, wide-eyed and alert, at 10 p.m.—just as Mom and Dad are nodding off.

Fortunately for parents, science has gotten more sophisticated at explaining why, starting at puberty, a teen’s internal sleep-wake clock seems to go off the rails. Researchers are also connecting the dots between the resulting sleep loss and behavior long chalked up to just “being a teenager.” This includes more risk-taking, less self-control, a drop in school performance and a rise in the incidence of depression.

One 2010 study from the University of British Columbia, for example, found that sleep loss can hamper neuron growth in the brain during adolescence, a critical period for cognitive development.

Findings linking sleep loss to adolescent turbulence are “really revelatory,” says Michael Terman, a professor of clinical psychology and psychiatry at Columbia University Medical Center and co-author of “Chronotherapy,” a forthcoming book on resetting the body clock. “These are reactions to a basic change in the way teens’ physiology and behavior is organized.”

Despite such revelations, there are still no clear solutions for the teen-zombie syndrome. Should a parent try to enforce strict wake-up and bedtimes, even though they conflict with the teen’s body clock? Or try to create a workable sleep schedule around that natural cycle? Coupled with a trend toward predawn school start times and peer pressure to socialize online into the wee hours, the result can upset kids’ health, school performance—and family peace.

Jeremy Kern, 16 years old, of San Diego, gets up at 6:30 a.m. for school and tries to fall asleep by 10 p.m. But a heavy load of homework and extracurricular activities, including playing saxophone in his school marching band and in a theater orchestra, often keep him up later.

“I need 10 hours of sleep to not feel tired, and every single day I have to deal with being exhausted,” Jeremy says. He stays awake during early-afternoon classes “by sheer force of will.” And as research shows, sleep loss makes him more emotionally volatile, Jeremy says, like when he recently broke up with his girlfriend: “You are more irrational when you’re sleep deprived. Your emotions are much harder to control.”

Only 7.6% of teens get the recommended 9 to 10 hours of sleep, 23.5% get eight hours and 38.7% are seriously sleep-deprived at six or fewer hours a night, says a 2011 study by the Centers for Disease Control and Prevention.

It’s a biological 1-2-3 punch. First, the onset of puberty brings a median 1.5-hour delay in the body’s release of the sleep-inducing hormone melatonin, says Mary Carskadon, a professor of psychiatry and human behavior at the Brown University medical school and a leading sleep researcher.

Second, “sleep pressure,” or the buildup of the need to sleep as the day wears on, slows during adolescence. That is, kids don’t become sleepy as early. This sleep delay isn’t just a passing impulse: It continues to increase through adolescence, peaking at age 19.5 in girls and age 20.9 in boys, Dr. Carskadon’s research shows.

Finally, teens lose some of their sensitivity to morning light, the kind that spurs awakening and alertness. And they become more reactive to nighttime light, sparking activity later into the evening.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of the Guardian / Alamy.[end-div]

Human Civilization and Weapons Go Hand in Hand

There is great irony in knowing that we humans would not be as civilized were it not for our passion for lethal, projectile weapons.

[div class=attrib]From the New Scientist:[end-div]

It’s about 2 metres long, made of tough spruce wood and carved into a sharp point at one end. The widest part, and hence its centre of gravity, is in the front third, suggesting it was thrown like a javelin. At 400,000 years old, this is the world’s oldest spear. And, according to a provocative theory, on its carved length rests nothing less than the foundation of human civilisation as we know it, including democracy, class divisions and the modern nation state.

At the heart of this theory is a simple idea: the invention of weapons that could kill at a distance meant that power became uncoupled from physical strength. Even the puniest subordinate could now kill an alpha male, with the right weapon and a reasonable aim. Those who wanted power were forced to obtain it by other means – persuasion, cunning, charm – and so began the drive for the cognitive attributes that make us human. “In short, 400,000 years of evolution in the presence of lethal weapons gave rise to Homo sapiens,” says Herbert Gintis, an economist at the Santa Fe Institute in New Mexico who studies the evolution of social complexity and cooperation.

The puzzle of how humans became civilised has received new impetus from studies of the evolution of social organisation in other primates. These challenge the long-held view that political structure is a purely cultural phenomenon, suggesting that genes play a role too. If they do, the fact that we alone of all the apes have built highly complex societies becomes even more intriguing. Earlier this year, an independent institute called the Ernst Strüngmann Forum assembled a group of scientists in Frankfurt, Germany, to discuss how this complexity came about. Hot debate centred on the possibility that, at pivotal points in history, advances in lethal weapons technology drove human societies to evolve in new directions.

The idea that weapons have catalysed social change came to the fore three decades ago, when British anthropologist James Woodburn spent time with the Hadza hunter-gatherers of Tanzania. Their lifestyle, which has not changed in millennia, is thought to closely resemble that of our Stone Age ancestors, and Woodburn observed that they are fiercely egalitarian. Although the Hadza people include individuals who take a lead in different arenas, no one person has overriding authority. They also have mechanisms for keeping their leaders from growing too powerful – not least, the threat that a bully could be ambushed or killed in his sleep. The hunting weapon, Woodburn suggested, acts as an equaliser.

Some years later, anthropologist Christopher Boehm at the University of Southern California pointed out that the social organisation of our closest primate relative, the chimpanzee, is very different. They live in hierarchical, mixed-sex groups in which the alpha male controls access to food and females. In his 2000 book, Hierarchy in the Forest, Boehm proposed that egalitarianism arose in early hominin societies as a result of the reversal of this strength-based dominance hierarchy – made possible, in part, by projectile weapons. However, in reviving Woodburn’s idea, Boehm also emphasised the genetic heritage that we share with chimps. “We are prone to the formation of hierarchies, but also prone to form alliances in order to keep from being ruled too harshly or arbitrarily,” he says. At the Strüngmann forum, Gintis argued that this inherent tension accounts for much of human history, right up to the present day.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: M777 howitzer. Courtesy of Wikipedia.[end-div]

Want Your Kids to Be Conservative or Liberal?

Researchers have confirmed what we already know: Parents who endorse a more authoritarian parenting style towards their toddlers are more likely to have children who are ideologically conservative when they reach age 18; parents who support more egalitarian parenting are more likely to have children who grow up to be liberal.

[div class=attrib]From the Pacific Standard:[end-div]

Parents: Do you find yourselves arguing with your adult children over who deserves to win the upcoming election? Does it confuse and frustrate you to realize your political viewpoints are so different?

Newly published research suggests you may only have yourself to blame.

Providing the best evidence yet to back up a decades-old theory, researchers writing in the journal Psychological Science report a link between a mother’s attitude toward parenting and the political ideology her child eventually adopts. In short, authoritarian parents are more prone to produce conservatives, while those who gave their kids more latitude are more likely to produce liberals.

This dynamic was theorized as early as 1950. But until now, almost all the research supporting it has been based on retrospective reports, with parents assessing their child-rearing attitudes in hindsight.

This new study, by a team led by psychologist R. Chris Fraley of the University of Illinois at Urbana-Champaign, begins with new mothers describing their intentions and approach in 1991, and ends with a survey of their children 18 years later. In between, it features an assessment of the child’s temperament at age 4.

The study looked at roughly 700 American children and their parents, who were recruited for the National Institute of Child Health and Human Development’s Study of Early Child Care and Youth Development. When each child was one month old, his or her mother completed a 30-item questionnaire designed to reveal her approach to parenting.

Those who strongly agreed with such statements as “the most important thing to teach children is absolute obedience to whoever is in authority” were categorized as holding authoritarian parenting attitudes. Those who robustly endorsed such sentiments as “children should be allowed to disagree with their parents” were categorized as holding egalitarian parenting attitudes.

When their kids were 54 months old, the mothers assessed their child’s temperament by answering 80 questions about their behavior. The children were evaluated for such traits as shyness, restlessness, attentional focusing (determined by their ability to follow directions and complete tasks) and fear.

Finally, at age 18, the youngsters completed a 28-item survey measuring their political attitudes on a liberal-to-conservative scale.

“Parents who endorsed more authoritarian parenting attitudes when their children were one month old were more likely to have children who were conservative in their ideologies at age 18,” the researchers report. “Parents who endorsed more egalitarian parenting attitudes were more likely to have children who were liberal.”

Temperament at age 4—which, of course, was very likely impacted by those parenting styles—was also associated with later ideological leanings.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of the Daily Show with Jon Stewart and the Colbert Report via Wired.[end-div]

LBPD – Love of Books Personality Disorder

Author Joe Queenan suggests that his having read more than 6,000 books may be because, as he puts it, he “find[s] ‘reality’ a bit of a disappointment”.

[div class=attrib]From the Wall Street Journal:[end-div]

I started borrowing books from a roving Quaker City bookmobile when I was 7 years old. Things quickly got out of hand. Before I knew it I was borrowing every book about the Romans, every book about the Apaches, every book about the spindly third-string quarterback who comes off the bench in the fourth quarter to bail out his team. I had no way of knowing it at the time, but what started out as a harmless juvenile pastime soon turned into a lifelong personality disorder.

Fifty-five years later, with at least 6,128 books under my belt, I still organize my daily life—such as it is—around reading. As a result, decades go by without my windows getting washed.

My reading habits sometimes get a bit loopy. I often read dozens of books simultaneously. I start a book in 1978 and finish it 34 years later, without enjoying a single minute of the enterprise. I absolutely refuse to read books that critics describe as “luminous” or “incandescent.” I never read books in which the hero went to private school or roots for the New York Yankees. I once spent a year reading nothing but short books. I spent another year vowing to read nothing but books I picked off the library shelves with my eyes closed. The results were not pretty.

I even tried to spend an entire year reading books I had always suspected I would hate: “Middlemarch,” “Look Homeward, Angel,” “Babbitt.” Luckily, that project ran out of gas quickly, if only because I already had a 14-year-old daughter when I took a crack at “Lolita.”

Six thousand books is a lot of reading, true, but the trash like “Hell’s Belles” and “Kid Colt and the Legend of the Lost Arroyo” and even “Part-Time Harlot, Full-Time Tramp” that I devoured during my misspent teens really puff up the numbers. And in any case, it is nowhere near a record. Winston Churchill supposedly read a book every day of his life, even while he was saving Western Civilization from the Nazis. This is quite an accomplishment, because by some accounts Winston Churchill spent all of World War II completely hammered.

A case can be made that people who read a preposterous number of books are not playing with a full deck. I prefer to think of us as dissatisfied customers. If you have read 6,000 books in your lifetime, or even 600, it’s probably because at some level you find “reality” a bit of a disappointment. People in the 19th century fell in love with “Ivanhoe” and “The Count of Monte Cristo” because they loathed the age they were living through. Women in our own era read “Pride and Prejudice” and “Jane Eyre” and even “The Bridges of Madison County”—a dimwit, hayseed reworking of “Madame Bovary”—because they imagine how much happier they would be if their husbands did not spend quite so much time with their drunken, illiterate golf buddies down at Myrtle Beach. A blind bigamist nobleman with a ruined castle and an insane, incinerated first wife beats those losers any day of the week. Blind, two-timing noblemen never wear belted shorts.

Similarly, finding oneself at the epicenter of a vast, global conspiracy involving both the Knights Templar and the Vatican would be a huge improvement over slaving away at the Bureau of Labor Statistics for the rest of your life or being married to someone who is drowning in dunning notices from Williams-Sonoma. No matter what they may tell themselves, book lovers do not read primarily to obtain information or to while away the time. They read to escape to a more exciting, more rewarding world. A world where they do not hate their jobs, their spouses, their governments, their lives. A world where women do not constantly say things like “Have a good one!” and “Sounds like a plan!” A world where men do not wear belted shorts. Certainly not the Knights Templar.

I read books—mostly fiction—for at least two hours a day, but I also spend two hours a day reading newspapers and magazines, gathering material for my work, which consists of ridiculing idiots or, when they are not available, morons. I read books in all the obvious places—in my house and office, on trains and buses and planes—but I’ve also read them at plays and concerts and prizefights, and not just during the intermissions. I’ve read books while waiting for friends to get sprung from the drunk tank, while waiting for people to emerge from comas, while waiting for the Iceman to cometh.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of Southern Illinois University.[end-div]

The Great Blue Monday Fallacy

A yearlong survey of moodiness shows that the so-called Monday blues may be more a figment of the imagination than fact.

[div class=attrib]From the New York Times:[end-div]

DESPITE the beating that Mondays have taken in pop songs — Fats Domino crooned “Blue Monday, how I hate blue Monday” — the day does not deserve its gloomy reputation.

Two colleagues and I recently published an analysis of a remarkable yearlong survey by the Gallup Organization, which conducted 1,000 live interviews a day, asking people across the United States to recall their mood in the prior day. We scoured the data for evidence that Monday was bluer than Tuesday or Wednesday. We couldn’t find any.

Mood was evaluated with several adjectives measuring positive or negative feelings. Spanish-only speakers were queried in Spanish. Interviewers spoke to people in every state on cellphones and land lines. The data unequivocally showed that Mondays are as pleasant to Americans as the three days that follow, and only a trifle less joyful than Fridays. Perhaps no surprise, people generally felt good on the weekend — though for retirees, the distinction between weekend and weekdays was only modest.

Likewise, day-of-the-week mood was gender-blind. Over all, women assessed their daily moods more negatively than men did, but relative changes from day to day were similar for both sexes.

And yet still, the belief in blue Mondays persists.

Several years ago, in another study, I examined expectations about mood and day of the week: two-thirds of the sample nominated Monday as the “worst” day of the week. Other research has confirmed that this sentiment is widespread, despite the fact that, well, we don’t really feel any gloomier on that day.

The question is, why? Why do we believe something that our own immediate experience indicates simply isn’t true?

As it turns out, the blue Monday mystery highlights a phenomenon familiar to behavioral scientists: that beliefs or judgments about experience can be at odds with actual experience. Indeed, the disconnection between beliefs and experience is common.

Vacations, for example, are viewed more pleasantly after they are over compared with how they were experienced at the time. And motorists who drive fancy cars report having more fun driving than those who own more modest vehicles, though in-car monitoring shows this isn’t the case. The same is often true in reverse as well: we remember pain or symptoms of illness at higher levels than real-time experience suggests, in part because we ignore symptom-free periods in between our aches and pains.

HOW do we make sense of these findings? The human brain has vast, but limited, capacities to store, retrieve and process information. Yet we are often confronted with questions that challenge these capacities. And this is often when the disconnect between belief and experience occurs. When information isn’t available for answering a question — say, when it did not make it into our memories in the first place — we use whatever information is available, even if it isn’t particularly relevant to the question at hand.

When asked about pain for the last week, most people cannot completely remember all of its ups and downs over seven days. However, we are likely to remember it at its worst and may use that as a way of summarizing pain for the entire week. When asked about our current satisfaction with life, we may focus on the first things that come to mind — a recent spat with a spouse or maybe a compliment from the boss at work.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: “I Don’t Like Mondays” single cover. Courtesy of The Boomtown Rats / Ensign Records.[end-div]
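
For readers who like to see the mechanics, the comparison at the heart of the study can be sketched in a few lines of Python. This is my own illustration, not the authors' code; the file name and column names below are hypothetical placeholders rather than the actual Gallup dataset.

# A minimal sketch (not the authors' analysis code) of the comparison the
# study describes: is self-reported mood on Mondays any worse than mid-week?
# "daily_mood.csv", "interview_date" and "mood_score" are hypothetical names.
import pandas as pd
from scipy import stats

df = pd.read_csv("daily_mood.csv", parse_dates=["interview_date"])
df["weekday"] = df["interview_date"].dt.day_name()

# Average mood score by day of the week
print(df.groupby("weekday")["mood_score"].mean())

# Test whether Monday differs measurably from Tuesday and Wednesday
monday = df.loc[df["weekday"] == "Monday", "mood_score"]
midweek = df.loc[df["weekday"].isin(["Tuesday", "Wednesday"]), "mood_score"]
t_stat, p_value = stats.ttest_ind(monday, midweek, equal_var=False)
print(t_stat, p_value)  # the study reports no meaningful Monday effect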

La Serrata: Why the Rise Is Always Followed by the Fall

Humans are supposed to learn from their mistakes. Yet history keeps repeating itself: nations rise, and inevitably they fall. Why? Chrystia Freeland, author of “Plutocrats: The Rise of the New Global Super-Rich and the Fall of Everyone Else,” offers an insightful analysis based in part on 14th-century Venice.

[div class=attrib]From the New York Times:[end-div]

IN the early 14th century, Venice was one of the richest cities in Europe. At the heart of its economy was the colleganza, a basic form of joint-stock company created to finance a single trade expedition. The brilliance of the colleganza was that it opened the economy to new entrants, allowing risk-taking entrepreneurs to share in the financial upside with the established businessmen who financed their merchant voyages.

Venice’s elites were the chief beneficiaries. Like all open economies, theirs was turbulent. Today, we think of social mobility as a good thing. But if you are on top, mobility also means competition. In 1315, when the Venetian city-state was at the height of its economic powers, the upper class acted to lock in its privileges, putting a formal stop to social mobility with the publication of the Libro d’Oro, or Book of Gold, an official register of the nobility. If you weren’t on it, you couldn’t join the ruling oligarchy.

The political shift, which had begun nearly two decades earlier, was so striking a change that the Venetians gave it a name: La Serrata, or the closure. It wasn’t long before the political Serrata became an economic one, too. Under the control of the oligarchs, Venice gradually cut off commercial opportunities for new entrants. Eventually, the colleganza was banned. The reigning elites were acting in their immediate self-interest, but in the longer term, La Serrata was the beginning of the end for them, and for Venetian prosperity more generally. By 1500, Venice’s population was smaller than it had been in 1330. In the 17th and 18th centuries, as the rest of Europe grew, the city continued to shrink.

The story of Venice’s rise and fall is told by the scholars Daron Acemoglu and James A. Robinson, in their book “Why Nations Fail: The Origins of Power, Prosperity, and Poverty,” as an illustration of their thesis that what separates successful states from failed ones is whether their governing institutions are inclusive or extractive. Extractive states are controlled by ruling elites whose objective is to extract as much wealth as they can from the rest of society. Inclusive states give everyone access to economic opportunity; often, greater inclusiveness creates more prosperity, which creates an incentive for ever greater inclusiveness.

The history of the United States can be read as one such virtuous circle. But as the story of Venice shows, virtuous circles can be broken. Elites that have prospered from inclusive systems can be tempted to pull up the ladder they climbed to the top. Eventually, their societies become extractive and their economies languish.

That was the future predicted by Karl Marx, who wrote that capitalism contained the seeds of its own destruction. And it is the danger America faces today, as the 1 percent pulls away from everyone else and pursues an economic, political and social agenda that will increase that gap even further — ultimately destroying the open system that made America rich and allowed its 1 percent to thrive in the first place.

You can see America’s creeping Serrata in the growing social and, especially, educational chasm between those at the top and everyone else. At the bottom and in the middle, American society is fraying, and the children of these struggling families are lagging the rest of the world at school.

Economists point out that the woes of the middle class are in large part a consequence of globalization and technological change. Culture may also play a role. In his recent book on the white working class, the libertarian writer Charles Murray blames the hollowed-out middle for straying from the traditional family values and old-fashioned work ethic that he says prevail among the rich (whom he castigates, but only for allowing cultural relativism to prevail).

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: The Grand Canal and the Church of the Salute (1730) by Canaletto. Courtesy of Museum of Fine Arts, Houston / WikiCommons.[end-div]

The Tubes of the Internets

Google lets the world peek at the many tubes that form a critical part of its search engine infrastructure — functional and pretty too.

[div class=attrib]From the Independent:[end-div]

They are the cathedrals of the information age – with the colour scheme of an adventure playground.

For the first time, Google has allowed cameras into its high security data centres – the beating hearts of its global network that allow the web giant to process 3 billion internet searches every day.

Only a small band of Google employees have ever been inside the doors of the data centres, which are hidden away in remote parts of North America, Belgium and Finland.

Their workplaces glow with the blinking lights of LEDs on internet servers reassuring technicians that all is well with the web, and hum to the sound of hundreds of giant fans and thousands of gallons of water, that stop the whole thing overheating.

“Very few people have stepped inside Google’s data centers [sic], and for good reason: our first priority is the privacy and security of your data, and we go to great lengths to protect it, keeping our sites under close guard,” the company said yesterday. Row upon row of glowing servers send and receive information from 20 billion web pages every day, while towering libraries store all the data that Google has ever processed – in case of a system failure.

With data speeds 200,000 times faster than an ordinary home internet connection, Google’s centres in America can share huge amounts of information with European counterparts like the remote, snow-packed Hamina centre in Finland, in the blink of an eye.

[div class=attrib]Read the entire article after the jump, or take a look at more images from the bowels of Google after the leap.[end-div]

3D Printing Coming to a Home Near You

It seems that not too long ago we were writing about pioneering research into 3D printing and start-up businesses showing off their industrially focused prototype 3D printers. Now, only a couple of years later, there is a growing consumer market, home-based printers for under $3,000, and even a 3D printing expo — 3D Printshow. The future looks bright and very much three-dimensional.

[div class=attrib]From the Independent:[end-div]

It is Star Trek science made reality, with the potential for production-line replacement body parts, aeronautical spares, fashion, furniture and virtually any other object on demand. It is 3D printing, and now people in Britain can try it for themselves.

The cutting-edge technology, which layers plastic resin in a manner similar to an inkjet printer to create 3D objects, is on its way to becoming affordable for home use. Some of its possibilities will be on display at the UK’s first 3D-printing trade show from Friday to next Sunday at The Brewery in central London.

Clothes made using the technique will be exhibited in a live fashion show, which will include the unveiling of a hat designed for the event by the milliner Stephen Jones, and a band playing a specially composed score on 3D-printed musical instruments.

Some 2,000 consumers are expected to join 1,000 people from the burgeoning industry to see what the technique has to offer, including jewellery and art. A 3D body scanner, which can reproduce a “mini” version of the person scanned, will also be on display.

Workshops run by Jason Lopes of Legacy Effects, which provided 3D-printed models and props for cinema blockbusters such as the Iron Man series and Snow White and the Huntsman, will add a sprinkling of Hollywood glamour.

Kerry Hogarth, the woman behind 3D Printshow, said yesterday she aims to showcase the potential of the technology for families. While prices for printers start at around £1,500 – with DIY kits for less – they are expected to drop steadily over the coming year. One workshop, run by the Birmingham-based Black Country Atelier, will invite people to design a model vehicle and then see the result “printed” off for them to take home.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: 3D scanning and printing. Courtesy of Wikipedia.[end-div]

The Promise of Quantum Computation

Advances in quantum physics, and in the associated realm of quantum information, promise to revolutionize computing. Imagine a computer several trillion times faster than present-day supercomputers — well, that’s where we are heading.

[div class=attrib]From the New York Times:[end-div]

THIS summer, physicists celebrated a triumph that many consider fundamental to our understanding of the physical world: the discovery, after a multibillion-dollar effort, of the Higgs boson.

Given its importance, many of us in the physics community expected the event to earn this year’s Nobel Prize in Physics. Instead, the award went to achievements in a field far less well known and vastly less expensive: quantum information.

It may not catch as many headlines as the hunt for elusive particles, but the field of quantum information may soon answer questions even more fundamental — and upsetting — than the ones that drove the search for the Higgs. It could well usher in a radical new era of technology, one that makes today’s fastest computers look like hand-cranked adding machines.

The basis for both the work behind the Higgs search and quantum information theory is quantum physics, the most accurate and powerful theory in all of science. With it we created remarkable technologies like the transistor and the laser, which, in time, were transformed into devices — computers and iPhones — that reshaped human culture.

But the very usefulness of quantum physics masked a disturbing dissonance at its core. There are mysteries — summed up neatly in Werner Heisenberg’s famous adage “atoms are not things” — lurking at the heart of quantum physics suggesting that our everyday assumptions about reality are no more than illusions.

Take the “principle of superposition,” which holds that things at the subatomic level can be literally two places at once. Worse, it means they can be two things at once. This superposition animates the famous parable of Schrödinger’s cat, whereby a wee kitty is left both living and dead at the same time because its fate depends on a superposed quantum particle.

For decades such mysteries were debated but never pushed toward resolution, in part because no resolution seemed possible and, in part, because useful work could go on without resolving them (an attitude sometimes called “shut up and calculate”). Scientists could attract money and press with ever larger supercolliders while ignoring such pesky questions.

But as this year’s Nobel recognizes, that’s starting to change. Increasingly clever experiments are exploiting advances in cheap, high-precision lasers and atomic-scale transistors. Quantum information studies often require nothing more than some equipment on a table and a few graduate students. In this way, quantum information’s progress has come not by bludgeoning nature into submission but by subtly tricking it to step into the light.

Take the superposition debate. One camp claims that a deeper level of reality lies hidden beneath all the quantum weirdness. Once the so-called hidden variables controlling reality are exposed, they say, the strangeness of superposition will evaporate.

Another camp claims that superposition shows us that potential realities matter just as much as the single, fully manifested one we experience. But what collapses the potential electrons in their two locations into the one electron we actually see? According to this interpretation, it is the very act of looking; the measurement process collapses an ethereal world of potentials into the one real world we experience.

And a third major camp argues that particles can be two places at once only because the universe itself splits into parallel realities at the moment of measurement, one universe for each particle location — and thus an infinite number of ever splitting parallel versions of the universe (and us) are all evolving alongside one another.

These fundamental questions might have lived forever at the intersection of physics and philosophy. Then, in the 1980s, a steady advance of low-cost, high-precision lasers and other “quantum optical” technologies began to appear. With these new devices, researchers, including this year’s Nobel laureates, David J. Wineland and Serge Haroche, could trap and subtly manipulate individual atoms or light particles. Such exquisite control of the nano-world allowed them to design subtle experiments probing the meaning of quantum weirdness.

Soon at least one interpretation, the most common sense version of hidden variables, was completely ruled out.

At the same time new and even more exciting possibilities opened up as scientists began thinking of quantum physics in terms of information, rather than just matter — in other words, asking if physics fundamentally tells us more about our interaction with the world (i.e., our information) than the nature of the world by itself (i.e., matter). And so the field of quantum information theory was born, with very real new possibilities in the very real world of technology.

What does this all mean in practice? Take one area where quantum information theory holds promise, that of quantum computing.

Classical computers use “bits” of information that can be either 0 or 1. But quantum-information technologies let scientists consider “qubits,” quantum bits of information that are both 0 and 1 at the same time. Logic circuits, made of qubits directly harnessing the weirdness of superpositions, allow a quantum computer to calculate vastly faster than anything existing today. A quantum machine using no more than 300 qubits would be a million, trillion, trillion, trillion times faster than the most modern supercomputer.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Bloch sphere representation of a qubit, the fundamental building block of quantum computers. Courtesy of Wikipedia.[end-div]
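
For the programmers among you, the scaling problem behind those claims is easy to demonstrate. Below is a small sketch of my own, not taken from the article: it builds the uniform superposition over n qubits with plain numpy and shows that the state vector doubles in size with every qubit added, which is why a 300-qubit machine lies far beyond brute-force classical simulation.

# Toy illustration: n qubits require a state vector of 2**n amplitudes.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard gate

def uniform_superposition(n_qubits):
    """Apply a Hadamard to each of n qubits, starting from |00...0>."""
    state = np.zeros(2 ** n_qubits)
    state[0] = 1.0                       # the all-zeros basis state
    gate = H
    for _ in range(n_qubits - 1):
        gate = np.kron(gate, H)          # tensor product across qubits
    return gate @ state

for n in (1, 2, 10):
    psi = uniform_superposition(n)
    print(n, "qubit(s) ->", psi.size, "amplitudes")

# At 300 qubits the vector would hold 2**300 (roughly 10**90) amplitudes,
# far more numbers than there are atoms in the observable universe.
print(len(str(2 ** 300)), "digits in the amplitude count for 300 qubits")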

The Half Life of Facts

There is no doubting the ever-expanding reach of science and the acceleration of scientific discovery. Yet the accumulation of ever more knowledge, and for that matter the acceleration of that accumulation, comes with a price: many facts we learned as kids are no longer true. This matters especially in areas such as medical research, where new discoveries constantly render our previous notions of disease and treatment obsolete.

Author Samuel Arbesman tells us why facts should have an expiration date in his new book, The Half-Life of Facts. A review follows below.

[div class=attrib]From Reason:[end-div]

Dinosaurs were cold-blooded. Vast increases in the money supply produce inflation. Increased K-12 spending and lower pupil/teacher ratios boost public school student outcomes. Most of the DNA in the human genome is junk. Saccharin causes cancer and a high fiber diet prevents it. Stars cannot be bigger than 150 solar masses. And by the way, what are the ten most populous cities in the United States?

In the past half century, all of the foregoing facts have turned out to be wrong (except perhaps the one about inflation rates). We’ll revisit the ten biggest cities question below. In the modern world facts change all of the time, according to Samuel Arbesman, author of The Half-Life of Facts: Why Everything We Know Has an Expiration Date.

Arbesman, a senior scholar at the Kaufmann Foundation and an expert in scientometrics, looks at how facts are made and remade in the modern world. And since fact-making is speeding up, he worries that most of us don’t keep up to date and base our decisions on facts we dimly remember from school and university classes that turn out to be wrong.

The field of scientometrics – the science of measuring and analyzing science – took off in 1947 when mathematician Derek J. de Solla Price was asked to store a complete set of the Philosophical Transactions of the Royal Society temporarily in his house. He stacked them in order and he noticed that the height of the stacks fit an exponential curve. Price started to analyze all sorts of other kinds of scientific data and concluded in 1960 that scientific knowledge had been growing steadily at a rate of 4.7 percent annually since the 17th century. The upshot was that scientific data was doubling every 15 years.

In 1965, Price exuberantly observed, “All crude measures, however arrived at, show to a first approximation that science increases exponentially, at a compound interest of about 7 percent  per annum, thus doubling in size every 10–15 years, growing by a factor of 10 every half century, and by something like a factor of a million in the 300 years which separate us from the seventeenth-century invention of the scientific paper when the process began.” A 2010 study in the journal Scientometrics looked at data between 1907 and 2007 and concluded that so far the “overall growth rate for science still has been at least 4.7 percent per year.”

Since scientific knowledge is still growing by a factor of ten every 50 years, it should not be surprising that lots of facts people learned in school and universities have been overturned and are now out of date.  But at what rate do former facts disappear? Arbesman applies the concept of half-life, the time required for half the atoms of a given amount of a radioactive substance to disintegrate, to the dissolution of facts. For example, the half-life of the radioactive isotope strontium-90 is just over 29 years. Applying the concept of half-life to facts, Arbesman cites research that looked into the decay in the truth of clinical knowledge about cirrhosis and hepatitis. “The half-life of truth was 45 years,” reported the researchers.

In other words, half of what physicians thought they knew about liver diseases was wrong or obsolete 45 years later. As interesting and persuasive as this example is, Arbesman’s book would have been strengthened by more instances drawn from the scientific literature.

Facts are being manufactured all of the time, and, as Arbesman shows, many of them turn out to be wrong. Checking each by each is how the scientific process is supposed to work, i.e., experimental results need to be replicated by other researchers. How many of the findings in 845,175 articles published in 2009 and recorded in PubMed, the free online medical database, were actually replicated? Not all that many. In 2011, a disheartening study in Nature reported that a team of researchers over ten years was able to reproduce the results of only six out of 53 landmark papers in preclinical cancer research.

[div class=attrib]Read the entire article after the jump.[end-div]
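
The arithmetic behind these growth and decay figures is simple enough to check for yourself. The sketch below is my own back-of-the-envelope illustration, not Arbesman's code: compound growth at 4.7 percent a year doubles roughly every 15 years, and a field with a 45-year half-life has lost half of its received wisdom over that span.

# Back-of-the-envelope sketch of Price's growth rates and Arbesman's half-life.
import math

def doubling_time(annual_growth_rate):
    """Years for a body of knowledge to double at a compound annual rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

def fraction_still_true(years_elapsed, half_life_years):
    """Share of a field's 'facts' still standing after a given time."""
    return 0.5 ** (years_elapsed / half_life_years)

print(doubling_time(0.047))          # ~15.1 years at Price's 4.7 percent
print(doubling_time(0.07))           # ~10.2 years at his exuberant 7 percent
print(fraction_still_true(45, 45))   # 0.5: the cirrhosis/hepatitis example
print(fraction_still_true(90, 45))   # 0.25 after two half-lives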

Remembering the Future

Memory is a very useful cognitive tool. After all, where would we be if we had no recall of our family, friends, foods, words, tasks and dangers?

But it turns out that memory may also help us imagine the future — another very important human trait.

[div class=attrib]From the New Scientist:[end-div]

WHEN thinking about the workings of the mind, it is easy to imagine memory as a kind of mental autobiography – the private book of you. To relive the trepidation of your first day at school, say, you simply dust off the cover and turn to the relevant pages. But there is a problem with this idea. Why are the contents of that book so unreliable? It is not simply our tendency to forget key details. We are also prone to “remember” events that never actually took place, almost as if a chapter from another book has somehow slipped into our autobiography. Such flaws are puzzling if you believe that the purpose of memory is to record your past – but they begin to make sense if it is for something else entirely.

That is exactly what memory researchers are now starting to realise. They believe that human memory didn’t evolve so that we could remember but to allow us to imagine what might be. This idea began with the work of Endel Tulving, now at the Rotman Research Institute in Toronto, Canada, who discovered a person with amnesia who could remember facts but not episodic memories relating to past events in his life. Crucially, whenever Tulving asked him about his plans for that evening, the next day or the summer, his mind went blank – leading Tulving to suspect that foresight was the flipside of episodic memory.

Subsequent brain scans supported the idea, suggesting that every time we think about a possible future, we tear up the pages of our autobiographies and stitch together the fragments into a montage that represents the new scenario. This process is the key to foresight and ingenuity, but it comes at the cost of accuracy, as our recollections become frayed and shuffled along the way. “It’s not surprising that we confuse memories and imagination, considering that they share so many processes,” says Daniel Schacter, a psychologist at Harvard University.

Over the next 10 pages, we will show how this theory has brought about a revolution in our understanding of memory. Given the many survival benefits of being able to imagine the future, for instance, it is not surprising that other creatures show a rudimentary ability to think in this way (“Do animals ever forget?”). Memory’s role in planning and problem solving, meanwhile, suggests that problems accessing the past may lie behind mental illnesses like depression and post-traumatic stress disorder, offering a new approach to treating these conditions (“Boosting your mental fortress”). Equally, a growing understanding of our sense of self can explain why we are so selective in the events that we weave into our life story – again showing definite parallels with the way we imagine the future (“How the brain spins your life story”). The work might even suggest some dieting tips (“Lost in the here and now”).

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: The Persistence of Memory, 1931. Salvador Dalí. Courtesy of Salvador Dalí, Gala-Salvador Dalí Foundation/Artists Rights Society.[end-div]

Mourning the Lost Art of Handwriting

In this age of digital everything, handwriting still matters. Some of you may even still have a treasured fountain pen. Novelist Philip Hensher explains why handwriting retains its import and value in his new book, The Missing Ink.

[div class=attrib]From the Guardian:[end-div]

About six months ago, I realised that I had no idea what the handwriting of a good friend of mine looked like. I had known him for over a decade, but somehow we had never communicated using handwritten notes. He had left voice messages for me, emailed me, sent text messages galore. But I don’t think I had ever had a letter from him written by hand, a postcard from his holidays, a reminder of something pushed through my letter box. I had no idea whether his handwriting was bold or crabbed, sloping or upright, italic or rounded, elegant or slapdash.

It hit me that we are at a moment when handwriting seems to be about to vanish from our lives altogether. At some point in recent years, it has stopped being a necessary and inevitable intermediary between people – a means by which individuals communicate with each other, putting a little bit of their personality into the form of their message as they press the ink-bearing point on to the paper. It has started to become just one of many options, and often an unattractive, elaborate one.

For each of us, the act of putting marks on paper with ink goes back as far as we can probably remember. At some point, somebody comes along and tells us that if you make a rounded shape and then join it to a straight vertical line, that means the letter “a”, just like the ones you see in the book. (But the ones in the book have a little umbrella over the top, don’t they? Never mind that, for the moment: this is how we make them for ourselves.) If you make a different rounded shape, in the opposite direction, and a taller vertical line, then that means the letter “b”. Do you see? And then a rounded shape, in the same direction as the first letter, but not joined to anything – that makes a “c”. And off you go.

Actually, I don’t think I have any memory of this initial introduction to the art of writing letters on paper. Our handwriting, like ourselves, seems always to have been there.

But if I don’t have any memory of first learning to write, I have a clear memory of what followed: instructions in refinements, suggestions of how to purify the forms of your handwriting.

You longed to do “joined-up writing”, as we used to call the cursive hand when we were young. Instructed in print letters, I looked forward to the ability to join one letter to another as a mark of huge sophistication. Adult handwriting was unreadable, true, but perhaps that was its point. I saw the loops and impatient dashes of the adult hand as a secret and untrustworthy way of communicating that one day I would master.

There was, also, wanting to make your handwriting more like other people’s. Often, this started with a single letter or figure. In the second year at school, our form teacher had a way of writing a 7 in the European way, with a cross-bar. A world of glamour and sophistication hung on that cross-bar; it might as well have had a beret on, be smoking Gitanes in the maths cupboard.

Your hand is formed by aspiration to the hand of others – by the beautiful italic strokes of a friend which seem altogether wasted on a mere postcard, or a note on your door reading “Dropped by – will come back later”. It’s formed, too, by anti-aspiration, the desire not to be like Denise in the desk behind who reads with her mouth open and whose writing, all bulging “m”s and looping “p”s, contains the atrocity of a little circle on top of every i. Or still more horrible, on occasion, usually when she signs her name, a heart. (There may be men in the world who use a heart-shaped jot, as the dot over the i is called, but I have yet to meet one. Or run a mile from one.)

Those other writing apparatuses, mobile phones, occupy a little bit more of the same psychological space as the pen. Ten years ago, people kept their mobile phone in their pockets. Now, they hold them permanently in their hand like a small angry animal, gazing crossly into our faces, in apparent need of constant placation. Clearly, people do regard their mobile phones as, in some degree, an extension of themselves. And yet we have not evolved any of those small, pleasurable pieces of behaviour towards them that seem so ordinary in the case of our pens. If you saw someone sucking one while they thought of the next phrase to text, you would think them dangerously insane.

We have surrendered our handwriting for something more mechanical, less distinctively human, less telling about ourselves and less present in our moments of the highest happiness and the deepest emotion. Ink runs in our veins, and shows the world what we are like. The shaping of thought and written language by a pen, moved by a hand to register marks of ink on paper, has for centuries, millennia, been regarded as key to our existence as human beings. In the past, handwriting has been regarded as almost the most powerful sign of our individuality. In 1847, in an American case, a witness testified without hesitation that a signature was genuine, though he had not seen an example of the handwriting for 63 years: the court accepted his testimony.

Handwriting is what registers our individuality, and the mark which our culture has made on us. It has been seen as the unknowing key to our souls and our innermost nature. It has been regarded as a sign of our health as a society, of our intelligence, and as an object of simplicity, grace, fantasy and beauty in its own right. Yet at some point, the ordinary pleasures and dignity of handwriting are going to be replaced permanently.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Stipula fountain pen. Courtesy of Wikipedia.[end-div]

The Rise of Neurobollocks

For readers of thediagonal in North America, “neurobollocks” would roughly translate to “neurobullshit”.

So what is this growing “neuro-trend”, why is there an explosion in “neuro-babble” and all things with a “neuro-” prefix, and is Malcolm Gladwell to blame?

[div class=attrib]From the New Statesman:[end-div]

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Amazon.[end-div]

Power and Baldness

Since behavioral scientists and psychologists first began roaming the globe, we have come to understand how and (sometimes) why visual appearance is so important in human interactions. Of course, anecdotally, humans have known this for thousands of years — that image is everything. After all, it was not Mary Kay or L’Oreal who brought us make-up but the ancient Egyptians. Yet it is still fascinating to see how markedly the perception of an individual can change with a basic, purely surface-level alteration. Witness the profound difference in the characteristics we project onto a man with male-pattern baldness (wimp) once he shaves his head (tough guy). And, of course, corporations can now assign a monetary value to the shaven look. As for comb-overs, well, that is another topic entirely.

[div class=attrib]From the Wall Street Journal:[end-div]

Up for a promotion? If you’re a man, you might want to get out the clippers.

Men with shaved heads are perceived to be more masculine, dominant and, in some cases, to have greater leadership potential than those with longer locks or with thinning hair, according to a recent study out of the University of Pennsylvania’s Wharton School.

That may explain why the power-buzz look has caught on among business leaders in recent years. Venture capitalist and Netscape founder Marc Andreessen, 41 years old, DreamWorks Animation Chief Executive Jeffrey Katzenberg, 61, and Amazon.com Inc. CEO Jeffrey Bezos, 48, all sport some variant of the close-cropped look.

Some executives say the style makes them appear younger—or at least, makes their age less evident—and gives them more confidence than a comb-over or monk-like pate.

“I’m not saying that shaving your head makes you successful, but it starts the conversation that you’ve done something active,” says tech entrepreneur and writer Seth Godin, 52, who has embraced the bare look for two decades. “These are people who decide to own what they have, as opposed to trying to pretend to be something else.”

Wharton management lecturer Albert Mannes conducted three experiments to test peoples’ perceptions of men with shaved heads. In one of the experiments, he showed 344 subjects photos of the same men in two versions: one showing the man with hair and the other showing him with his hair digitally removed, so his head appears shaved.

In all three tests, the subjects reported finding the men with shaved heads as more dominant than their hirsute counterparts. In one test, men with shorn heads were even perceived as an inch taller and about 13% stronger than those with fuller manes. The paper, “Shorn Scalps and Perceptions of Male Dominance,” was published online, and will be included in a coming issue of journal Social Psychological and Personality Science.

The study found that men with thinning hair were viewed as the least attractive and powerful of the bunch, a finding that tracks with other studies showing that people perceive men with typical male-pattern baldness—which affects roughly 35 million Americans—as older and less attractive. For those men, the solution could be as cheap and simple as a shave.

According to Wharton’s Dr. Mannes—who says he was inspired to conduct the research after noticing that people treated him more deferentially when he shaved off his own thinning hair—head shavers may seem powerful because the look is associated with hypermasculine images, such as the military, professional athletes and Hollywood action heroes like Bruce Willis. (Male-pattern baldness, by contrast, conjures images of “Seinfeld” character George Costanza.)

New York image consultant Julie Rath advises her clients to get closely cropped when they start thinning up top. “There’s something really strong, powerful and confident about laying it all bare,” she says, describing the thinning or combed-over look as “kind of shlumpy.”

The look is catching on. A 2010 study from razor maker Gillette, a unit of Procter & Gamble Co., found that 13% of respondents said they shaved their heads, citing reasons as varied as fashion, sports and already thinning hair, according to a company spokesman. HeadBlade Inc., which sells head-shaving accessories, says revenues have grown 30% a year in the past decade.

Shaving his head gave 60-year-old Stephen Carley, CEO of restaurant chain Red Robin Gourmet Burgers Inc., a confidence boost when he was working among 20-somethings at tech start-ups in the 1990s. With his thinning hair shorn, “I didn’t feel like the grandfather in the office anymore.” He adds that the look gave him “the impression that it was much harder to figure out how old I was.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Comb-over patent, 1977. Courtesy of Wikipedia.[end-div]

QTWTAIN: Are there Nazis living on the moon?

QTWTAIN is a Twitterspeak acronym for a Question To Which The Answer Is No.

QTWTAINs are a relatively recent journalistic phenomenon, used to great effect by media organizations as attention-grabbing headlines. Crucially, a QTWTAIN implies that something ridiculous is true: when the headline is posed as a question, no evidence seems to be required. Here’s a recent example:

“Europe: Are there Nazis living on the moon?”

Author and journalist John Rentoul has done all connoisseurs of QTWTAINs a great service by collecting an outstanding selection of his favorites, drawn from many hundreds, into a new book, Questions to Which the Answer is No. Rentoul tells his story in the excerpt below.

[div class=attrib]From the Independent:[end-div]

I have an unusual hobby. I collect headlines in the form of questions to which the answer is no. This is a specialist art form that has long been a staple of “prepare to be amazed” journalism. Such questions allow newspapers, television programmes and websites to imply that something preposterous is true without having to provide the evidence.

If you see a question mark after a headline, ask yourself why it is not expressed as a statement, such as “Church of England threatened by excess of cellulite” or “Revealed: Marlene Dietrich plotted to murder Hitler” or, “This penguin is a communist”.

My collection started with a bishop, a grudge against Marks & Spencer and a theft in broad daylight. The theft was carried out by me: I had been inspired by Oliver Kamm, a friend and hero of mine, who wrote about Great Historical Questions to Which the Answer is No on his blog. Then I came across this long headline in Britain’s second-best-selling newspaper three years ago: “He’s the outcast bishop who denies the Holocaust – yet has been welcomed back by the Pope. But are Bishop Williamson’s repugnant views the result of a festering grudge against Marks & Spencer?” Thus was an internet meme born.

Since then readers of The Independent blog and people on Twitter with nothing better to do have supplied me with a constant stream of QTWTAIN. If this game had a serious purpose, which it does not, it would be to make fun of conspiracy theories. After a while, a few themes recurred: flying saucers, yetis, Jesus, the murder of John F Kennedy, the death of Marilyn Monroe and reincarnation.

An enterprising PhD student could use my series as raw material for a thesis entitled: “A Typology of Popular Irrationalism in Early 21st-Century Media”. But that would be to take it too seriously. The proper use of the series is as a drinking game, to be followed by a rousing chorus of “Jerusalem”, which consists largely of questions to which the answer is no.

My only rule in compiling the series is that the author or publisher of the question has to imply that the answer is yes (“Does Nick Clegg Really Expect Us to Accept His Apology?” for example, would be ruled out of order). So far I have collected 841 of them, and the best have been selected for a book published this week. I hope you like them.

Is the Loch Ness monster on Google Earth?

Daily Telegraph, 26 August 2009

A picture of something that actually looked like a giant squid had been spotted by a security guard as he browsed the digital planet. A similar question had been asked by the Telegraph six months earlier, on 19 February, about a different picture: “Has the Loch Ness Monster emigrated to Borneo?”

Would Boudicca have been a Liberal Democrat?

This one is cheating, because Paul Richards, who asked it in an article in Progress magazine, 12 March 2010, did not imply that the answer was yes. He was actually making a point about the misuse of historical conjecture, comparing Douglas Carswell, the Conservative MP, who suggested that the Levellers were early Tories, to the spiritualist interviewed by The Sun in 1992, who was asked how Winston Churchill, Joseph Stalin, Karl Marx and Chairman Mao would have voted (Churchill was for John Major; the rest for Neil Kinnock, naturally).

Is Tony Blair a Mossad agent?

A question asked by Peza, who appears to be a cat, on an internet forum on 9 April 2010. One reader had a good reply: “Peza, are you drinking that vodka-flavoured milk?”

Could Angelina Jolie be the first female US President?

Daily Express, 24 June 2009

An awkward one this, because one of my early QTWTAIN was “Is the Express a newspaper?” I had formulated an arbitrary rule that its headlines did not count. But what are rules for, if not for changing?

[div class=attrib]Read the entire article after the jump?[end-div]

[div class=attrib]Book Cover: Questions to Which the Answer is No, by John Rentoul. Courtesy of the Independent / John Rentoul.[end-div]

Brilliant! The Brits are Coming

Following decades of one-way cultural osmosis from the United States to the UK, it seems that the trend may be reversing. Well, at least in the linguistic department. Although it may be a while before “blimey” enters the American lexicon, other words and phrases such as “spot on”, “chat up”, “ginger” to describe hair color, and “gormless” are already creeping into everyday American English.

[div class=attrib]From the BBC:[end-div]

There is little that irks British defenders of the English language more than Americanisms, which they see creeping insidiously into newspaper columns and everyday conversation. But bit by bit British English is invading America too.

“Spot on – it’s just ludicrous!” snaps Geoffrey Nunberg, a linguist at the University of California at Berkeley.

“You are just impersonating an Englishman when you say spot on.”

“Will do – I hear that from Americans. That should be put into quarantine,” he adds.

And don’t get him started on “the chattering classes” – its overtones of a distinctly British class system make him quiver.

But not everyone shares his revulsion at the drip, drip, drip of Britishisms – to use an American term – crossing the Atlantic.

“I enjoy seeing them,” says Ben Yagoda, professor of English at the University of Delaware, and author of the forthcoming book, How to Not Write Bad.

“It’s like a birdwatcher. If I find an American saying one, it makes my day!”

Last year Yagoda set up a blog dedicated to spotting the use of British terms in American English.

So far he has found more than 150 – from cheeky to chat-up via sell-by date, and the long game – an expression which appears to date back to 1856, and comes not from golf or chess, but the card game whist. President Barack Obama has used it in at least one speech.

Yagoda notices changes in pronunciation too – for example his students sometimes use “that sort of London glottal stop”, dropping the T in words like “important” or “Manhattan”.

Kory Stamper, Associate Editor for Merriam-Webster, whose dictionaries are used by many American publishers and news organisations, agrees that more and more British words are entering the American vocabulary.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Ngram graph showing online usage of the phrase “chat up”. Courtesy of Google / BBC.[end-div]

Integrated Space Plan

The Integrated Space Plan is a 100-year vision of space exploration drawn up over 20 years ago. It is a beautiful and intricate timeline covering the period 1983 to 2100, developed in 1989 by Ronald M. Jones at Rockwell International using long-range planning data from NASA, the National Space Policy Directive and other Western space agencies.

While optimistic, the plan nonetheless outlined unmanned rover exploration on Mars (done), a comet sample-return mission (done), and an orbiter around Mercury (done). Over the longer term the plan foresaw “human expansion into the inner solar system” by 2018, with “triplanetary, earth-moon-mars infrastructure” in place by 2023, “small martian settlements” following in 2060, and “Venus terraforming operations” in 2080. The plan concludes with “human interstellar travel” sometime after the year 2100. So perhaps there is hope for humans beyond this Pale Blue Dot after all.

More below, from Sean Ragan over at Make magazine, on this fascinating diagram and how it was rediscovered. A detailed, and large, download of the plan follows: Integrated Space Plan.

[div class=attrib]From Make:[end-div]

I first encountered this amazing infographic hanging on a professor’s office wall when I was visiting law schools back in 1999. I’ve been trying, off and on, to run down my own copy ever since. It’s been one of those back-burner projects that I’ll poke at when it comes to mind, every now and again, but until quite recently all my leads had come up dry. All I really knew about the poster was that it had been created in the 80s by analysts at Rockwell International and that it was called the “Integrated Space Plan.”

About a month ago, all the little threads I’d been pulling on suddenly unraveled, and I was able to connect with a generous donor willing to entrust an original copy of the poster to me long enough to have it scanned at high resolution. It’s a large document, at 28" x 45", but fortunately it’s monochrome, and reproduces well using 1-bit color at 600dpi, so even uncompressed bitmaps come in at under 5MB.

[div class=attrib]Read the entire article following the jump.[end-div]

Childhood Injuries on the Rise: Blame Parental Texting

The long-term downward trend in the number of injuries to young children has come to an end. Sadly, urgent-care and emergency-room doctors are now seeing more children aged 0-14 years with unintentional injuries. While the exact causes are yet to be determined, a growing body of anecdotal evidence points to distraction among parents and supervisors — it’s the texting, stupid!

The great irony is that should your child suffer an injury while you are using your smartphone, you’ll be able to contact the emergency room that much more quickly — courtesy of the very same smartphone.

[div class=attrib]From the Wall Street Journal:[end-div]

One sunny July afternoon in a San Francisco park, tech recruiter Phil Tirapelle was tapping away on his cellphone while walking with his 18-month-old son. As he was texting his wife, his son wandered off in front of a policeman who was breaking up a domestic dispute.

“I was looking down at my mobile, and the police officer was looking forward,” and his son “almost got trampled over,” he says. “One thing I learned is that multitasking makes you dumber.”

Yet a few minutes after the incident, he still had his phone out. “I’m a hypocrite. I admit it,” he says. “We all are.”

Is high-tech gadgetry diminishing the ability of adults to give proper supervision to very young children? Faced with an unending litany of newly proclaimed threats to their kids, harried parents might well roll their eyes at this suggestion. But many emergency-room doctors are worried: They see the growing use of hand-held electronic devices as a plausible explanation for the surprising reversal of a long slide in injury rates for young children. There have even been a few extreme cases of death and near drowning.

Nonfatal injuries to children under age five rose 12% between 2007 and 2010, after falling for much of the prior decade, according to the most recent data from the Centers for Disease Control and Prevention, based on emergency-room records. The number of Americans 13 and older who own a smartphone such as an iPhone or BlackBerry has grown from almost 9 million in mid-2007, when Apple introduced its device, to 63 million at the end of 2010 and 114 million in July 2012, according to research firm comScore.

Child-safety experts say injury rates had been declining since at least the 1970s, thanks to everything from safer playgrounds to baby gates on staircases to fences around backyard swimming pools. “It was something we were always fairly proud of,” says Dr. Jeffrey Weiss, a pediatrician at Phoenix Children’s Hospital who serves on an American Academy of Pediatrics working group for injury, violence and poison prevention. “The injuries were going down and down and down.” The recent uptick, he says, is “pretty striking.”

Childhood-injury specialists say there appear to be no formal studies or statistics to establish a connection between so-called device distraction and childhood injury. “What you have is an association,” says Dr. Gary Smith, founder and director of the Center for Injury Research and Policy of the Research Institute at Nationwide Children’s Hospital. “Being able to prove causality is the issue…. It certainly is a question that begs to be asked.”

It is well established that using a smartphone while driving or even crossing a street increases the risk of accident. More than a dozen pediatricians, emergency-room physicians, academic researchers and police interviewed by The Wall Street Journal say that a similar factor could be at play in injuries to young children.

“It’s very well understood within the emergency-medicine community that utilizing devices—hand-held devices—while you are assigned to watch your kids—that resulting injuries could very well be because you are utilizing those tools,” says Dr. Wally Ghurabi, medical director of the emergency center at the Santa Monica-UCLA Medical Center and Orthopaedic Hospital.

Adds Dr. Rahul Rastogi, an emergency-room physician at Kaiser Permanente in Oregon: “We think we’re multitasking and not really feeling like we are truly distracted. But in reality we are.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Science Daily.[end-div]

GigaBytes and TeraWatts

Online social networks have expanded to include hundreds of millions of twitterati and their followers. An ever-increasing volume of data, images, videos and documents continues to move into the expanding virtual “cloud”, hosted in many nameless data centers. Virtual processing and computation on demand are growing by leaps and bounds.

Yet while the business models of these internet service providers remain ethereal, one segment of the ecosystem is salivating at the staggering demand for electrical power: the electricity companies and utilities.

[div class=attrib]From the New York Times:[end-div]

Jeff Rothschild’s machines at Facebook had a problem he knew he had to solve immediately. They were about to melt.

The company had been packing a 40-by-60-foot rental space here with racks of computer servers that were needed to store and process information from members’ accounts. The electricity pouring into the computers was overheating Ethernet sockets and other crucial components.

Thinking fast, Mr. Rothschild, the company’s engineering chief, took some employees on an expedition to buy every fan they could find — “We cleaned out all of the Walgreens in the area,” he said — to blast cool air at the equipment and prevent the Web site from going down.

That was in early 2006, when Facebook had a quaint 10 million or so users and the one main server site. Today, the information generated by nearly one billion people requires outsize versions of these facilities, called data centers, with rows and rows of servers spread over hundreds of thousands of square feet, and all with industrial cooling systems.

They are a mere fraction of the tens of thousands of data centers that now exist to support the overall explosion of digital information. Stupendous amounts of data are set in motion each day as, with an innocuous click or tap, people download movies on iTunes, check credit card balances through Visa’s Web site, send Yahoo e-mail with files attached, buy products on Amazon, post on Twitter or read newspapers online.

A yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness.

Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found.
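
To see how a waste figure that high is even arithmetically possible, consider a facility that draws roughly the same power whether its servers are busy or idle. The sketch below is purely illustrative; the utilization number is a hypothetical assumption of mine, not a figure from The Times.

```python
# Purely illustrative sketch of the "waste 90 percent or more" claim above.
# Assumption (mine, not The Times'): the facility draws a constant 10 MW
# and only ~8% of that power goes toward useful work.

facility_draw_mw = 10.0   # hypothetical constant draw from the grid, in megawatts
useful_fraction = 0.08    # hypothetical share of that power doing useful work

useful_mw = facility_draw_mw * useful_fraction
wasted_share = 1 - useful_fraction

print(f"Useful work: ~{useful_mw:.1f} MW of a {facility_draw_mw:.0f} MW draw")
print(f"Wasted: ~{wasted_share:.0%} of the electricity pulled off the grid")
```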

To guard against a power failure, they further rely on banks of generators that emit diesel exhaust. The pollution from data centers has increasingly been cited by the authorities for violating clean air regulations, documents show. In Silicon Valley, many data centers appear on the state government’s Toxic Air Contaminant Inventory, a roster of the area’s top stationary diesel polluters.

Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centers in the United States account for one-quarter to one-third of that load, the estimates show.
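
A little back-of-the-envelope arithmetic puts those estimates in perspective. The sketch below is only a rough check; the assumption that a typical nuclear plant delivers on the order of one gigawatt is mine, not part of The Times’ estimate.

```python
# Back-of-the-envelope check on the figures quoted above.
# Assumption (mine): a typical nuclear power plant produces roughly 1 GW.

total_data_center_watts = 30e9   # ~30 billion watts worldwide, per The Times' estimate
watts_per_nuclear_plant = 1e9    # assumed ~1 gigawatt per plant
hours_per_year = 365 * 24

equivalent_plants = total_data_center_watts / watts_per_nuclear_plant
annual_energy_twh = total_data_center_watts * hours_per_year / 1e12  # terawatt-hours

# The Times puts the U.S. share at one-quarter to one-third of the worldwide load.
us_low_gw = total_data_center_watts * 0.25 / 1e9
us_high_gw = total_data_center_watts / 3 / 1e9

print(f"Equivalent nuclear plants: ~{equivalent_plants:.0f}")
print(f"Annual consumption: ~{annual_energy_twh:.0f} TWh per year")
print(f"U.S. data-center load: roughly {us_low_gw:.1f} to {us_high_gw:.0f} GW")
```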

“It’s staggering for most people, even people in the industry, to understand the numbers, the sheer size of these systems,” said Peter Gross, who helped design hundreds of data centers. “A single data center can take more power than a medium-size town.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of the AP / Thanassis Stavrakis.[end-div]