Testosterone and the Moon

While the United States military makes no comment, a number of corroborated reports suggest that the country had a plan to drop an atomic bomb on the moon during the height of the Cold War. Apparently, a Hiroshima-like explosion on our satellite would have been seen by the Soviets as a “show of force”. The sheer absurdity of this Dr. Strangelove story makes it all the more real.

[div class=attrib]From the Independent:[end-div]

US military chiefs, keen to intimidate Russia during the Cold War, plotted to blow up the moon with a nuclear bomb, according to project documents kept secret for nearly 45 years.

The army chiefs allegedly developed a top-secret project called ‘A Study of Lunar Research Flights’ – or ‘Project A119’ – in the hope that their Soviet rivals would be intimidated by a display of America’s Cold War muscle.

According to The Sun newspaper, the military bosses developed a classified plan to launch a nuclear weapon 238,000 miles to the moon, where it would be detonated upon impact.

The planners reportedly opted for an atom bomb, rather than a hydrogen bomb, because the latter would be too heavy for the missile.

Physicist Leonard Reiffel, who says he was involved in the project, claims the hope was that the flash from the bomb would intimidate the Russians following their successful launching of the Sputnik satellite in October 1957.

The planning of the explosion reportedly included calculations by astronomer Carl Sagan, who was then a young graduate.

Documents reportedly show the plan was abandoned because of fears it would have an adverse effect on Earth should the explosion fail.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of NASA.[end-div]

UX and the Untergunther: Underground (Literally) Art

Many cities around the globe are home to underground art movements — those whose participants eschew the strictures of modern-day gallery wine and cheese, curated exhibits, and formal public art shows. Paris has gone a step further (though deeper would be more correct) in providing a subterranean home for some truly underground art and the groups of dedicated, clandestine artists, hackers and art restorers who make it.

Wired spent some quality time with a leading group of Parisian underground artists, known as UX, for Underground eXperiment. Follow Wired’s fascinating and lengthy article here.

[div class=attrib]From the BBC:[end-div]

The obsessively secretive members of an underground art collective have spent the last 30 years surreptitiously staging events in tunnels beneath Paris. They say they never ask permission – and never ask for subsidies.

We’re standing nervously on the pavement, trying not to feel self-conscious as we furtively scrutinise each passer-by.

After weeks of negotiation, we have a meeting with someone who says he is a member of the highly secretive French artists’ collective – UX, as they are known for short – outside a town hall in the south of Paris. It is late on a Sunday night but the street is still quite busy.

Finally I notice a young man dressed entirely in black apart from a red beret and a small rucksack on his back. He hovers for a moment and then motions us to follow him. Our destination is the catacombs, the tunnels that run beneath the pavements of Paris.

A few minutes later Tristan (not his real name) and two companions are pulling the heavy steel cover off a manhole. “Quick, quick,” he says, “before the police come.”

I stare down a seemingly endless black hole before stepping gingerly on to a rusty ladder and start to clamber down.

There are several more ladders after that before we finally reach the bottom. To my great relief, there are no rats – we go deeper than the rats ever do – but it is pitch black and very wet.

The water is ankle deep and my shoes are soaked through. “It’s fine, if you’re properly dressed,” laughs Tristan as he splashes ahead in his rubber boots.

Using the flashlight on my phone, we do our best to follow him. Along the way I notice some colourful graffiti and a painting of an evil looking cat.

After a few minutes, we reach a dry, open space with intricate carvings on the wall and it is here that we finally sit down to interrogate our mysterious companions.

Tristan explains that he gets a kick out of getting to places that are normally off-limits. He is a “cataphile” – somebody who loves to roam the catacombs of Paris.

UX are not the only people who go underground. There is a rap song about cataphiles, people who would rather don the rubber boots of a sewer worker (egoutier) than go clubbing in a normal night spot.

There have been a number of raves underground – some chambers are said to be big enough to hold 1,000 people.

The galleries are turned into makeshift night clubs, with a bar, lighting effects, and DJ turntables, using electricity diverted from the Parisian metro.

He also climbs on the roofs of churches. “You get a great view of the city, especially at night and it’s a cool place for a picnic,” he says.

Tristan, who is originally from Lyon, says his group is called the Lyonnaise des Os – a reference to the piles of bones (“os” is French for “bone”) in the catacombs – but also a pun on France’s famous water company, Lyonnaise des Eaux. He and his group spend their time exploring the tunnels and carving sculptures.

The UX are a loose collective of people from a variety of backgrounds. Not just artists but also engineers, civil servants, lawyers and even a state prosecutor. They divide into different groups depending on their interests.

The Untergunther specialise in clandestine acts of restoration of parts of France’s heritage which they believe the state has neglected. There is also an all-women group, nicknamed The Mouse House, who are experts at infiltration.

Another group, called La Mexicaine de Perforation, or The Mexican Consolidated Drilling Authority, stages arts events like film festivals underground. They once created an entire cinema under the Palais de Chaillot, by the Trocadero, with seats cut out of the rock.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Hacker-artists below Paris. Courtesy of Wired / UX.[end-div]

Antifragile

One of our favorite thinkers (and authors) here at theDiagonal is Nassim Taleb. His new work, entitled Antifragile, expands on ideas that he first described in his bestseller The Black Swan.

Given humanity’s need to find order and patterns in chaos, and its proclivity to seek causality where none exists, we’ll need several more books from him before his profound yet common-sense ideas sink in. In his latest work, Taleb shows how the improbable and unpredictable lie at the foundation of our universe.

[div class=attrib]From the Guardian:[end-div]

How much does Nassim Taleb dislike journalists? Let me count the ways. “An erudite is someone who displays less than he knows; a journalist or consultant the opposite.” “This business of journalism is about pure entertainment, not the search for the truth.” “Most so-called writers keep writing and writing with the hope, some day, to find something to say.” He disliked them before, but after he predicted the financial crash in his 2007 book, The Black Swan, a book that became a global bestseller, his antipathy reached new heights. He has dozens and dozens of quotes on the subject, and if that’s too obtuse for us non-erudites, his online home page puts it even plainer: “I beg journalists and members of the media to leave me alone.”

He’s not wildly keen on appointments either. In his new book, Antifragile, he writes that he never makes them because a date in the calendar “makes me feel like a prisoner”.

So imagine, if you will, how keenly he must be looking forward to the prospect of a pre-arranged appointment to meet me, a journalist. I approach our lunch meeting, at the Polytechnic Institute of New York University where he’s the “distinguished professor of risk engineering”, as one might approach a sleeping bear: gingerly. And with a certain degree of fear. And yet there he is, striding into the faculty lobby in a jacket and Steve Jobs turtleneck (“I want you to write down that I started wearing them before he did. I want that to be known.”), smiling and effusive.

First, though, he has to have his photo taken. He claims it’s the first time he’s allowed it in three years, and has allotted just 10 minutes for it, though in the end it’s more like five. “The last guy I had was a fucking dick. He wanted to be artsy fartsy,” he tells the photographer, Mike McGregor. “You’re OK.”

Being artsy fartsy, I will learn, is even lower down the scale of Nassim Taleb pet hates than journalists. But then, being contradictory about what one hates and despises and loves and admires is actually another key Nassim Taleb trait.

In print, the hating and despising is there for all to see: he’s forever having spats and fights. When he’s not slagging off the Nobel prize for economics (a “fraud”), bankers (“I have a physical allergy to them”) and the academic establishment (he has it in for something he calls the “Soviet-Harvard illusion”), he’s trading blows with Steven Pinker (“clueless”), and a random reviewer on Amazon, who he took to his Twitter stream to berate. And this is just in the last week.

And yet here he is, chatting away, surprisingly friendly and approachable. When I say as much as we walk to the restaurant, he asks, “What do you mean?”

“In your book, you’re quite…” and I struggle to find the right word, “grumpy”.

He shrugs. “When you write, you don’t have the social constraints of having people in front of you, so you talk about abstract matters.”

Social constraints, it turns out, have their uses. And he’s an excellent host. We go to his regular restaurant, a no-nonsense, Italian-run, canteen-like place, a few yards from his faculty in central Brooklyn, and he insists that I order a glass of wine.

“And what’ll you have?” asks the waitress.

“I’ll take a coffee,” he says.

“What?” I say. “No way! You can’t trick me into ordering a glass of wine and then have coffee.” It’s like flunking lesson #101 at interviewing school, though in the end he relents and has not one but two glasses and a plate of “pasta without pasta” (though strictly speaking you could call it “mixed vegetables and chicken”), and attacks the bread basket “because it doesn’t have any calories here in Brooklyn”.

But then, having read his latest book, I actually know an awful lot about his diet. How he doesn’t eat sugar, any fruits which “don’t have a Greek or Hebrew name” or any liquid which is less than 1,000 years old. Just as I know that he doesn’t like air-conditioning, soccer moms, sunscreen and copy editors. That he believes the “non-natural” has to prove its harmlessness. That America tranquillises its children with drugs and pathologises sadness. That he values honour above all things, banging on about it so much that at times he comes across as a medieval knight who’s got lost somewhere in the space-time continuum. And that several times a week he goes and lifts weights in a basement gym with a bunch of doormen.

He says that after the financial crisis he received “all manner of threats” and at one time was advised to “stock up on bodyguards”. Instead, “I found it more appealing to look like one”. Now, he writes, when he’s harassed by limo drivers in the arrival hall at JFK, “I calmly tell them to fuck off.”

Taleb started out as a trader, worked as a quantitative analyst and ran his own investment firm, but the more he studied statistics, the more he became convinced that the entire financial system was a keg of dynamite that was ready to blow. In The Black Swan he argued that modernity is too complex to understand, and “Black Swan” events – hitherto unknown and unpredicted shocks – will always occur.

What’s more, because of the complexity of the system, if one bank went down, they all would. The book sold 3m copies. And months later, of course, this was more or less exactly what happened. Overnight, he went from lone-voice-in-the-wilderness, spouting off-the-wall theories, to the great seer of the modern age.

Antifragile, the follow-up, is his most important work so far, he says. It takes the central idea of The Black Swan and expands it to encompass almost every other aspect of life, from the 19th century rise of the nation state to what to eat for breakfast (fresh air, as a general rule).

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Black Swan, the movie, not the book by the same name by Nassim Taleb. Courtesy of Wikipedia.[end-div]

Telomere Test: A Date With Death

In 1977, molecular biologists Elizabeth Blackburn and Joseph Gall discovered the structure of telomeres, the protective end caps of chromosomes. In 2009, Blackburn and colleagues Carol Greider and Jack Szostak shared the Nobel Prize in Physiology or Medicine for discovering telomerase, the enzyme responsible for replenishing telomeres.

It turns out that telomeres are rather important. Studies show that telomeres regulate cell division and, as a consequence, directly influence aging and life span. When a cell divides, the length of its chromosomal telomeres shortens. Once a telomere is depleted, its chromosome and DNA can no longer be replicated accurately, and the cell stops dividing, hastening cell death.
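
The arithmetic of this mechanism is simple enough to sketch. The toy model below is ours, not from any of the studies cited; the starting length, loss per division and critical threshold are illustrative assumptions, and the code simply counts how many divisions fit before replication stops.

```python
# A minimal sketch (not from the article) of telomere shortening: each division
# erodes the telomere until it falls below a critical length and the cell stops
# dividing. All parameter values below are illustrative, not measured.

def divisions_until_senescence(telomere_bp=10_000, loss_per_division_bp=100,
                               critical_bp=4_000):
    """Count divisions before the telomere drops below a critical length."""
    divisions = 0
    while telomere_bp - loss_per_division_bp >= critical_bp:
        telomere_bp -= loss_per_division_bp   # each division erodes the cap
        divisions += 1
    return divisions

if __name__ == "__main__":
    # In the spirit of a finite division budget (the Hayflick limit).
    print(divisions_until_senescence())  # -> 60 with these toy parameters
```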

[div class=attrib]From the Independent:[end-div]

A blood test to determine how fast someone is ageing has been shown to work on a population of wild birds, the first time the ageing test has been used successfully on animals living outside a laboratory setting.

The test measures the average length of tiny structures on the tips of chromosomes called telomeres which are known to get shorter each time a cell divides during an organism’s lifetime.

Telomeres are believed to act like internal clocks by providing a more accurate estimate of a person’s true biological age rather than their actual chronological age.

This has led some experts to suggest that telomere tests could be used to estimate not only how fast someone is ageing, but possibly how long they have left to live if they die of natural causes.

Telomere tests have been widely used on experimental animals and at least one company is offering a £400 blood test in the UK for people interested in seeing how fast they are ageing based on their average telomere length.

Now scientists have performed telomere tests on an isolated population of songbirds living on an island in the Seychelles and found that the test does indeed accurately predict an animal’s likely lifespan.

“We saw that telomere length is a better indicator of life expectancy than chronological age. So by measuring telomere length we have a way of estimating the biological age of an individual – how much of its life it has used up,” said David Richardson of the University of East Anglia.

The researchers tested the average telomere lengths of a population of 320 Seychelles Warblers living on the remote Cousin Island, which ornithologists have studied for 20 years, documenting the life history of each bird.

“Our results provide the first clear and unambiguous evidence of a relationship between telomere length and mortality in the wild, and substantiate the prediction that telomere length and shortening rate can act as an indicator of biological age further to chronological age,” says the study published in the journal Molecular Ecology.

Studying an island population of wild birds was important because there were no natural predators and little migration, meaning that the scientists could accurately study the link between telomere length and a bird’s natural lifespan.

“We wanted to understand what happens over an entire lifetime, so the Seychelles warbler is an ideal research subject. They are naturally confined to an isolated tropical island, without any predators, so we can follow individuals throughout their lives, right into old age,” Dr Richardson said.

“We investigated whether, at any given age, their telomere lengths could predict imminent death. We found that short and rapidly shortening telomeres were a good indication that the bird would die within a year,” he said.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Infographic courtesy of Independent.[end-div]

Lead a Congressional Committee on Science: No Grasp of Science Required

[div class=attrib]From ars technica:[end-div]

[div class=attrib]Image: The House Committee on Space, Science, and Technology hears testimony on climate change in March 2011.[end-div]

If you had the chance to ask questions of one of the world’s leading climatologists, would you select a set of topics that would be at home in the heated discussions that take place in the Ars forums? If you watch the video below, you’d find that’s precisely what Dana Rohrabacher (R-CA) chose to do when Penn State’s Richard Alley (a fellow Republican) was called before the House Science Committee, which has already had issues with its grasp of science. Rohrabacher took Alley on a tour of some of the least convincing arguments about climate change, all trying to convince him changes in the Sun were to blame for a changing climate. (Alley, for his part, noted that we have actually measured the Sun, and we’ve seen no such changes.)

Now, if he has his way, Rohrabacher will be chairing the committee once the next Congress is seated. Even if he doesn’t get the job, the alternatives aren’t much better.

There has been some good news for the Science Committee to come out of the last election. Representative Todd Akin (R-MO), whose lack of understanding of biology was made clear by his comments on “legitimate rape,” had to give up his seat to run for the Senate, a race he lost. Meanwhile, Paul Broun (R-GA), who said that evolution and cosmology are “lies straight from the pit of Hell,” won reelection, but he received a bit of a warning in the process: dead English naturalist Charles Darwin, who is ineligible to serve in Congress, managed to draw thousands of write-in votes. And, thanks to limits on chairmanships, Ralph Hall (R-TX), who accused climate scientists of being in it for the money (if so, they’re doing it wrong), will have to step down.

In addition to Rohrabacher, the other Representatives that are vying to lead the Committee are Wisconsin’s James Sensenbrenner and Texas’ Lamar Smith. They all suggest that they will focus on topics like NASA’s budget and the Department of Energy’s plans for future energy tech. But all of them have been embroiled in the controversy over climate change in the past.

In an interview with Science Insider about his candidacy, Rohrabacher engaged in a bit of triumphalism and suggested that his beliefs were winning out. “There were a lot of scientists who were just going along with the flow on the idea that mankind was causing a change in the world’s climate,” he said. “I think that after 10 years of debate, we can show that there are hundreds if not thousands of scientists who have come over to being skeptics, and I don’t know anyone [who was a skeptic] who became a believer in global warming.”

[div class=attrib]Read the entire article following the jump.[end-div]

The Rise of the Industrial Internet

As the internet that connects humans reaches a stable saturation point, the industrial internet — the network that connects things — continues to grow in both size and reach.

[div class=attrib]From the New York Times:[end-div]

When Sharoda Paul finished a postdoctoral fellowship last year at the Palo Alto Research Center, she did what most of her peers do — considered a job at a big Silicon Valley company, in her case, Google. But instead, Ms. Paul, a 31-year-old expert in social computing, went to work for General Electric.

Ms. Paul is one of more than 250 engineers recruited in the last year and a half to G.E.’s new software center here, in the East Bay of San Francisco. The company plans to increase that work force of computer scientists and software developers to 400, and to invest $1 billion in the center by 2015. The buildup is part of G.E.’s big bet on what it calls the “industrial Internet,” bringing digital intelligence to the physical world of industry as never before.

The concept of Internet-connected machines that collect data and communicate, often called the “Internet of Things,” has been around for years. Information technology companies, too, are pursuing this emerging field. I.B.M. has its “Smarter Planet” projects, while Cisco champions the “Internet of Everything.”

But G.E.’s effort, analysts say, shows that Internet-era technology is ready to sweep through the industrial economy much as the consumer Internet has transformed media, communications and advertising over the last decade.

In recent months, Ms. Paul has donned a hard hat and safety boots to study power plants. She has ridden on a rail locomotive and toured hospital wards. “Here, you get to work with things that touch people in so many ways,” she said. “That was a big draw.”

G.E. is the nation’s largest industrial company, a producer of aircraft engines, power plant turbines, rail locomotives and medical imaging equipment. It makes the heavy-duty machinery that transports people, heats homes and powers factories, and lets doctors diagnose life-threatening diseases.

G.E. resides in a different world from the consumer Internet. But the major technologies that animate Google and Facebook are also vital ingredients in the industrial Internet — tools from artificial intelligence, like machine-learning software, and vast streams of new data. In industry, the data flood comes mainly from smaller, more powerful and cheaper sensors on the equipment.

Smarter machines, for example, can alert their human handlers when they will need maintenance, before a breakdown. It is the equivalent of preventive and personalized care for equipment, with less downtime and more output.

“These technologies are really there now, in a way that is practical and economic,” said Mark M. Little, G.E.’s senior vice president for global research.

G.E.’s embrace of the industrial Internet is a long-term strategy. But if its optimism proves justified, the impact could be felt across the economy.

The outlook for technology-led economic growth is a subject of considerable debate. In a recent research paper, Robert J. Gordon, a prominent economist at Northwestern University, argues that the gains from computing and the Internet have petered out in the last eight years.

Since 2000, Mr. Gordon asserts, invention has focused mainly on consumer and communications technologies, including smartphones and tablet computers. Such devices, he writes, are “smaller, smarter and more capable, but do not fundamentally change labor productivity or the standard of living” in the way that electric lighting or the automobile did.

But others say such pessimism misses the next wave of technology. “The reason I think Bob Gordon is wrong is precisely because of the kind of thing G.E. is doing,” said Andrew McAfee, principal research scientist at M.I.T.’s Center for Digital Business.

Today, G.E. is putting sensors on everything, be it a gas turbine or a hospital bed. The mission of the engineers in San Ramon is to design the software for gathering data, and the clever algorithms for sifting through it for cost savings and productivity gains. Across the industries it covers, G.E. estimates such efficiency opportunities at as much as $150 billion.
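
To make the predictive-maintenance idea above concrete, here is a minimal sketch of one common approach: flag a sensor reading that drifts well outside its recent rolling baseline. The window size, threshold and simulated readings are our own illustrative assumptions, not anything G.E. has published.

```python
# A hypothetical sketch of "alert before breakdown": raise a maintenance flag
# when a sensor value deviates sharply from its rolling mean. Parameters and
# data are invented for illustration only.

from collections import deque
from statistics import mean, stdev

def maintenance_alerts(readings, window=20, n_sigmas=3.0):
    """Yield (index, value) for readings that deviate from the rolling baseline."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > n_sigmas * sigma:
                yield i, value          # candidate early-warning signal
        history.append(value)

if __name__ == "__main__":
    # Simulated vibration readings: stable operation, then a drift before failure.
    normal = [1.0 + 0.01 * (i % 5) for i in range(100)]
    drifting = [1.0 + 0.05 * j for j in range(1, 21)]
    for idx, val in maintenance_alerts(normal + drifting):
        print(f"check machine near reading {idx}: value {val:.2f}")
```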

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Internet of Things. Courtesy of Intel.[end-div]

Startup Culture: New is the New New

Starting up a new business was once a demanding and complex process, often undertaken in anonymity in the long shadows between the hours of a regular job. It still is, of course. However, nowadays “the startup” has become more of an event. The tech sector has raised this to a fine art by spawning an entire self-sustaining and self-promoting industry around startups.

You’ll find startup gurus, serial entrepreneurs and digital prophets — yes, AOL has a digital prophet on its payroll — strutting around on stage, tweeting tips in the digital world, leading business plan bootcamps, pontificating on accelerator panels, hosting incubator love-ins in coffee shops, or being splashed across the covers of Entrepreneur, Inc or FastCompany magazines on an almost daily basis. Beware! The back of your cereal box may be next.

[div class=attrib]From the Telegraph:[end-div]

I’ve seen the best minds of my generation destroyed by marketing, shilling for ad clicks, dragging themselves through the strip-lit corridors of convention centres looking for a venture capitalist. Just as X Factor has convinced hordes of tone deaf kids they can be pop stars, the startup industry has persuaded thousands that they can be the next rockstar entrepreneur. What’s worse is that while X Factor clogs up the television schedules for a couple of months, tech conferences have proliferated to such an extent that not a week goes by without another excuse to slope off. Some founders spend more time on panels pontificating about their business plans than actually executing them.

Earlier this year, I witnessed David Shing, AOL’s Digital Prophet – that really is his job title – delivering the opening remarks at a tech conference. The show summed up the worst elements of the self-obsessed, hyperactive world of modern tech. A 42-year-old man with a shock of Russell Brand hair, expensive spectacles and paint-splattered trousers, Shingy paced the stage spouting buzzwords: “Attention is the new currency, man…the new new is providing utility, brothers and sisters…speaking on the phone is completely cliche.” The audience lapped it all up. At these rallies in praise of the startup, enthusiasm and energy matter much more than making sense.

Startup culture is driven by slinging around superlatives – every job is an “incredible opportunity”, every product is going to “change lives” and “disrupt” an established industry. No one wants to admit that most startups stay stuck right there at the start, pub singers pining for their chance in the spotlight. While the startups and hangers-on milling around in the halls bring in stacks of cash for the event organisers, it’s the already successful entrepreneurs on stage and the investors who actually benefit from these conferences. They meet up at exclusive dinners and in the speakers’ lounge where the real deals are made. It’s Studio 54 for geeks.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Startup, WA. Courtesy of Wikipedia.[end-div]

Us: Perhaps It’s All Due to Gene miR-941

Geneticists have discovered a gene that helps explain how humans and apes diverged from their common ancestor around 6 million years ago.

[div class=attrib]From the Guardian:[end-div]

Researchers have discovered a new gene they say helps explain how humans evolved from chimpanzees.

The gene, called miR-941, appears to have played a crucial role in human brain development and could shed light on how we learned to use tools and language, according to scientists.

A team at the University of Edinburgh compared it to 11 other species of mammals, including chimpanzees, gorillas, mice and rats.

The results, published in Nature Communications, showed that the gene is unique to humans.

The team believe it emerged between six and one million years ago, after humans evolved from apes.

Researchers said it is the first time a new gene carried by humans and not by apes has been shown to have a specific function in the human body.

Martin Taylor, who led the study at the Institute of Genetics and Molecular Medicine at the University of Edinburgh, said: “As a species, humans are wonderfully inventive – we are socially and technologically evolving all the time.

“But this research shows that we are innovating at a genetic level too.

“This new molecule sprang from nowhere at a time when our species was undergoing dramatic changes: living longer, walking upright, learning how to use tools and how to communicate.

“We’re now hopeful that we will find more new genes that help show what makes us human.”

The gene is highly active in two areas of the brain, controlling decision-making and language abilities, with the study suggesting it could have a role in the advanced brain functions that make us human.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of ABCNews.[end-div]

Pluralistic Ignorance

Why study the science of climate change when you can study the complexities of climate change deniers themselves? That was the question that led several groups of independent researchers to study why some groups of people cling to mistaken beliefs and hold inaccurate views of the public consensus.

[div class=attrib]From ars technica:[end-div]

By just about every measure, the vast majority of scientists in general—and climate scientists in particular—have been convinced by the evidence that human activities are altering the climate. However, in several countries, a significant portion of the public has concluded that this consensus doesn’t exist. That has prompted a variety of studies aimed at understanding the large disconnect between scientists and the public, with results pointing the finger at everything from the economy to the weather. Other studies have noted societal influences on acceptance, including ideology and cultural identity.

Those studies have generally focused on the US population, but the public acceptance of climate change is fairly similar in Australia. There, a new study has looked at how societal tendencies can play a role in maintaining mistaken beliefs. The authors of the study have found evidence that two well-known behaviors—the “false consensus” and “pluralistic ignorance”—are helping to shape public opinion in Australia.

False consensus is the tendency of people to think that everyone else shares their opinions. This can arise from the fact that we tend to socialize with people who share our opinions, but the authors note that the effect is even stronger “when we hold opinions or beliefs that are unpopular, unpalatable, or that we are uncertain about.” In other words, our social habits tend to reinforce the belief that we’re part of a majority, and we have a tendency to cling to the sense that we’re not alone in our beliefs.

Pluralistic ignorance is similar, but it’s not focused on our own beliefs. Instead, sometimes the majority of people come to believe that most people think a certain way, even though the majority opinion actually resides elsewhere.

As it turns out, the authors found evidence of both these effects. They performed two identical surveys of over 5,000 Australians, done a year apart; about 1,350 people took the survey both times, which let the researchers track how opinions evolve. Participants were asked to describe their own opinion on climate change, with categories including “don’t know,” “not happening,” “a natural occurrence,” and “human-induced.” After voicing their own opinion, people were asked to estimate what percentage of the population would fall into each of these categories.

In aggregate, over 90 percent of those surveyed accepted that climate change was occurring (a rate much higher than we see in the US), with just over half accepting that humans were driving the change. Only about five percent felt it wasn’t happening, and even fewer said they didn’t know. The numbers changed only slightly between the two polls.

The false consensus effect became obvious when the researchers looked at what these people thought everyone else believed: every single group believed that its opinion represented the plurality view of the population. This was most dramatic among those who don’t think that the climate is changing; even though they represent far less than 10 percent of the population, they believed that over 40 percent of Australians shared their views. Those who profess ignorance also believed they had lots of company, estimating that their view was shared by a quarter of the populace.

Among those who took the survey twice, the effect became even more pronounced. In the year between the surveys, these respondents went from estimating that 30 percent of the population agreed with them to thinking that 45 percent did. And, in general, this group was the least likely to change its opinion between the two surveys.

But there was also evidence of pluralistic ignorance. Every single group grossly overestimated the number of people who were unsure about climate change or convinced it wasn’t occurring. Even those who were convinced that humans were changing the climate put 20 percent of Australians into each of these two groups.
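
The two effects described above are easy to see in miniature. The sketch below uses our own invented toy data, not the study’s, and simply tallies each group’s actual share of respondents against the share that group estimates agrees with it; a small group with inflated estimates is the false-consensus signature.

```python
# Toy illustration of reading false consensus out of survey responses.
# Each record: (own_opinion, estimated_percent_agreeing_with_own_opinion).
# All numbers are invented for illustration.

from collections import defaultdict

responses = [
    ("human-induced", 40), ("human-induced", 55), ("human-induced", 50),
    ("natural", 45), ("natural", 50),
    ("not happening", 40),          # a small minority...
    ("not happening", 45),          # ...that believes it is much larger
]

counts = defaultdict(int)
estimates = defaultdict(list)
for opinion, estimate in responses:
    counts[opinion] += 1
    estimates[opinion].append(estimate)

total = len(responses)
for opinion in counts:
    actual = 100 * counts[opinion] / total
    perceived = sum(estimates[opinion]) / len(estimates[opinion])
    print(f"{opinion:>14}: actual {actual:4.1f}%  self-estimated {perceived:4.1f}%")
```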

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Flood victims. Courtesy of NRDC.[end-div]

USANIT

Ever-present in Europe, nationalism continues to grow as austerity measures across the continent catalyze xenophobia. And now it’s spreading westwards across the Atlantic to the United States of America. Well, to be more precise, nationalistic fervor is spreading to Texas. Perhaps in our lifetimes we’ll have to contend with USANIT — the United States of America Not Including Texas. Seventy-seven thousand Texans, so far, want the Lone Star to fly again over their nascent nation.

[div class=attrib]From the Guardian:[end-div]

Less than a week after Barack Obama was re-elected president, a slew of petitions have appeared on the White House’s We the People site, asking for states to be granted the right to peacefully withdraw from the union.

On Tuesday, all but one of the 33 states listed were far from reaching the 25,000 signature mark needed to get a response from the White House. Texas, however, had gained more than 77,000 online signatures in three days.

People from other states had signed the Texas petition. Another petition on the website was titled: “Deport everyone that signed a petition to withdraw their state from the United States of America.” It had 3,536 signatures.

The Texas petition reads:

Given that the state of Texas maintains a balanced budget and is the 15th largest economy in the world, it is practically feasible for Texas to withdraw from the union, and to do so would protect it’s citizens’ standard of living and re-secure their rights and liberties in accordance with the original ideas and beliefs of our founding fathers which are no longer being reflected by the federal government.

Activists across the country have advocated for independent statehood since the union was restored after the end of the Civil War in 1865. Texas has been host to some of the most fervent fights for independence.

Daniel Miller is the president of the Texas Nationalist Movement, which supports Texan independence and has its own online petition.

“We want to be able to govern ourselves without having some government a thousand-plus miles away that we have to go ask ‘mother may I’ to,” Miller said. “We want to protect our political, our cultural and our economic identities.”

Miller is not a fan of the word “secession”, because he views it as an over-generalization of what his group hopes to accomplish, but he encourages advocates for Texan independence to show their support when they can, including by signing the White House website petition.

“Given the political, cultural and economic pressures the United States is under, it’s not beyond the pale where one could envision the break up of the United States,” he said. “I don’t look at it as possibility, I look at it as an inevitability.”

Miller has been working for Texas independence for 16 years. He pointed to last week’s federal elections as evidence that a state independence movement is gaining traction, citing the legalization of the sale of marijuana in Colorado and Washington in defiance of federal mandate.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]State Flag of Texas courtesy of Wikipedia.[end-div]

Socialism and Capitalism Share the Same Parent

Expanding on the work of Immanuel Kant in the late 18th century, German philosopher Georg Wilhelm Friedrich Hegel laid the foundations for what would later become two opposing political systems: socialism and free-market capitalism. His comprehensive framework of Absolute Idealism influenced numerous philosophers and thinkers of all shades, including Karl Marx and Ralph Waldo Emerson. While many thinkers later rounded on Hegel’s world view as nothing but a thinly veiled attempt to justify totalitarianism in his own nation, there is no argument as to the profound influence of his works on later thinkers from both the left and the right wings of the political spectrum.

[div class=attrib]From FairObserver:[end-div]

It is common knowledge that among developed western countries the two leading socioeconomic systems are socialism and capitalism. The former is often associated more closely with European systems of governance and the latter with the American free market economy. It is also generally known that these two systems are rooted in two fundamentally different assumptions about how a healthy society progresses. What is not as well known is that they both stem from the same philosophical roots, namely the evolutionary philosophy of Georg Wilhelm Friedrich Hegel.

Georg Wilhelm Friedrich Hegel was a leading figure in the movement known as German Idealism that had its beginnings in the late 18th century. That philosophical movement was initiated by another prominent German thinker, Immanuel Kant. Kant published “The Critique of Pure Reason” in 1781, offering a radical new way to understand how we as human beings get along in the world. Hegel expanded on Kant’s theory of knowledge by adding a theory of social and historical progress. Both socialism and capitalism were inspired by different, and to some extent opposing, interpretations of Hegel’s philosophical system.

Immanuel Kant recognized that human beings create their view of reality by incorporating new information into their previous understanding of reality using the laws of reason. As this integrative process unfolds we are compelled to maintain a coherent picture of what is real in order to operate effectively in the world. The coherent picture of reality that we maintain Kant called a necessary transcendental unity. It can be understood as the overarching picture of reality, or worldview, that helps us make sense of the world and against which we interpret and judge all new experiences and information.

Hegel realized that not only must individuals maintain a cohesive picture of reality, but societies and cultures must also maintain a collectively held and unified understanding of what is real. To use a gross example, it is not enough for me to know what a dollar bill is and what it is worth. If I am to be able to buy something with my money, then other people must agree on its value. Reality is not merely an individual event; it is a collective affair of shared agreement. Hegel further saw that the collective understanding of reality that is held in common by many human beings in any given society develops over the course of history. In his book “The Philosophy of History”, Hegel outlines his theory of how this development occurs. Karl Marx started with Hegel’s philosophy and then added his own profound insights – especially in regards to how oppression and class struggle drive the course of history.

Across the Atlantic in America, there was another thinker, Ralph Waldo Emerson, who was strongly influenced by German Idealism and especially the philosophy of Hegel. In the development of the American mind one cannot overstate the role that Emerson played as the pathfinder who marked trails of thought that continue to guide the current American worldview. His ideas became grooves in consciousness set so deeply in the American psyche that they are often simply experienced as truth. What excited Emerson about Hegel was his description of how reality emerged from a universal mind. Emerson similarly believed that what we as human beings experience as real has emerged through time from a universal source of intelligence. This distinctly Hegelian tone in Emerson can be heard clearly in this passage from his essay entitled “History”:

“There is one mind common to all individual men. Of the works of this mind history is the record. Man is explicable by nothing less than all his history. All the facts of history pre-exist as laws. Each law in turn is made by circumstances predominant. The creation of a thousand forests is in one acorn, and Egypt, Greece, Rome, Gaul, Britain, America, lie folded already in the first man. Epoch after epoch, camp, kingdom, empire, republic, democracy, are merely the application of this manifold spirit to the manifold world.”

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The portrait of G.W.F. Hegel (1770-1831); Steel engraving by Lazarus Sichling after a lithograph by Julius L. Sebbers. Courtesy of Wikipedia.[end-div]

Computers in the Movies

Most of us now carry around inside our smartphones more computing power than NASA once had in the Apollo command module. So, it’s interesting to look back at old movies to see how celluloid fiction portrayed computers. Most from the 1950s and 60s were replete with spinning tape drives and enough lights to resemble the Manhattan skyline. Our favorite here at theDiagonal is the first “Bat Computer” from the original 1960s TV series, which could be found churning away in Batman’s crime-fighting nerve center beneath Wayne Manor.

[div class=attrib]From Wired:[end-div]

The United States government powered up its SAGE defense system in July 1958, at an Air Force base near Trenton, New Jersey. Short for Semi-Automatic Ground Environment, SAGE would eventually span 24 command and control stations across the US and Canada, warning against potential air attacks via radar and an early IBM computer called the AN/FSQ-7.

“It automated air defense,” says Mike Loewen, who worked with SAGE while serving with the Air Force in the 1980s. “It used a versatile, programmable, digital computer to process all this incoming radar data from various sites around the region and display it in a format that made sense to people. It provided a computer display of the digitally processed radar information.”

Fronted by a wall of dials, switches, neon lights, and incandescent lamps — and often plugged into spinning tape drives stretching from floor to ceiling — the AN/FSQ-7 looked like one of those massive computing systems that turned up in Hollywood movies and prime time TV during the ’60s and the ’70s. This is mainly because it is one of those massive computing systems that turned up in Hollywood movies and TV during the ’60s and ’70s — over and over and over again. Think Lost In Space. Get Smart. Fantastic Voyage. In Like Flint. Or our personal favorite: The Towering Inferno.

That’s the AN/FSQ-7 in The Towering Inferno at the top of this page, operated by a man named OJ Simpson, trying to track a fire that’s threatening to bring down the world’s tallest building.

For decades, the AN/FSQ-7 — Q7 for short — helped define the image of a computer in the popular consciousness. Never mind that it was just a radar system originally backed by tens of thousands of vacuum tubes. For moviegoers everywhere, this was the sort of thing that automated myriad tasks not only in modern-day America but also in the distant future.

It never made much sense. But sometimes, it made even less sense. In the ’60s and ’70s, some films didn’t see the future all that clearly. Woody Allen’s Sleeper is set in 2173, and it shows the AN/FSQ-7 helping 22nd-century Teamsters make repairs to robotic man servants. Other films just didn’t see the present all that clearly. Independence Day was made in 1996, and apparently, its producers were unaware that the Air Force decommissioned SAGE 13 years earlier.

Of course, the Q7 is only part of the tale. The history of movies and TV is littered with big, beefy, photogenic machines that make absolutely no sense whatsoever. Sometimes they’re real machines doing unreal tasks. And sometimes they’re unreal machines doing unreal tasks. But we love them all. Oh so very much.

Mike Loewen first noticed the Q7 in a mid-’60s prime time TV series called The Time Tunnel. Produced by the irrepressible Irwin Allen, Time Tunnel concerned a secret government project to build a time machine beneath a trap door in the Arizona desert. A Q7 powered this subterranean time machine, complete with all those dials, switches, neon lights, and incandescent lamps.

No, an AN/FSQ-7 couldn’t really power a time machine. But time machines don’t exist. So it all works out quite nicely.

At first, Loewen didn’t know it was a Q7. But then, after he wound up in front of a SAGE system while in the Air Force many years later, it all came together. “I realized that these computer banks running the Time Tunnel were large sections of panels from the SAGE computer,” Loewen says. “And that’s where I got interested.”

He noticed the Q7 in TV show after TV show, movie after movie — and he started documenting these SAGE star turns on his personal homepage. In each case, the Q7 was seen doing stuff it couldn’t possibly do, but there was no doubt this was the Q7 — or at least part of it.

Here’s that subterranean time machine that caught the eye of Mike Loewen in The Time Tunnel (1966). The cool thing about the Time Tunnel AN/FSQ-7 is that even when it traps two government scientists in an endless time warp, it always sends them to dates of extremely important historical significance. Otherwise, you’d have one boring TV show on your hands.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The Time Tunnel (1966). Courtesy of Wired.[end-div]

The Most Annoying Technology? The Winner Is…

We have all owned, used, or come far too close to a technology that we absolutely abhor, wishing numerous curses upon its inventors. Said gizmo may be the unfathomable VCR, the forever lost TV remote, the tinny-sounding Sony Walkman replete with unraveling cassette tape, the Blackberry, or even Facebook.

Ours over here at theDiagonal is the voice recognition system used by 99 percent of so-called customer service organizations. You know how it goes, something like this: “please say ‘one’ for new accounts”, “please say ‘two’ if you are an existing customer”, “please say ‘three’ for returns”, “please say ‘Kyrgyzstan’ to speak with a customer service representative”.

Wired recently listed their least favorite, most hated technologies. No surprises here — winners of this dubious award include the Bluetooth headset, the CD-ROM, and the Apple TV remote.

[div class=attrib]From Wired:[end-div]

Bluetooth Headsets

Look, here’s a good rule of thumb: Once you get out of the car, or leave your desk, take off the headset. Nobody wants to hear your end of the conversation. That’s not idle speculation, it’s science! Headsets just make it worse. At least when there’s a phone involved, there are visual cues that say “I’m on the phone.” I mean, other than hearing one end of a shouted conversation.

Leaf Blower

Is your home set on a large wooded lot with acreage to spare between you and your closest neighbor? Did a tornado power through your yard last night, leaving your property covered in limbs and leaves? No? Then get a rake, dude. Leaf blowers are so irritating, they have been outlawed in some towns. Others should follow suit.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of the Sun/Mercury News.[end-div]

Charles Darwin Runs for Office

British voters may recall Screaming Lord Sutch, 3rd Earl of Harrow, of the Official Monster Raving Loony Party, who ran in over 40 parliamentary elections during the 1980s and 90s. He never won, but garnered a respectable number of votes and many fans (he was also a musician).

The United States followed a more dignified path in the 2012 elections, when Charles Darwin ran for a Congressional seat in Georgia. Darwin failed to win, but collected a respectable 4,000 votes. His opponent, Paul Broun, believes that the Earth “is but about 9,000 years old”. Interestingly, Representative Broun serves on the United States House Committee on Science, Space and Technology.

[div class=attrib]From Slate:[end-div]

Anti-evolution Congressman Paul Broun (R-Ga.) ran unopposed in Tuesday’s election, but nearly 4,000 voters wrote in Charles Darwin to protest their representative’s views. (Broun called evolution “lies straight from the pit of hell.”) Darwin fell more than 205,000 votes short of victory, but what would have happened if the father of evolution had out-polled Broun?

Broun still would have won. Georgia, like many other states, doesn’t count votes for write-in candidates who have not filed a notice of intent to stand for election. Even if the final tally had been reversed, with Charles Darwin winning 209,000 votes and Paul Broun 4,000, Broun would have kept his job.

That’s not to say dead candidates can’t win elections. It happens all the time, but only when the candidate dies after being placed on the ballot. In Tuesday’s election, Orange County, Fla., tax collector Earl Wood won more than 56 percent of the vote, even though he died in October at the age of 96 after holding the office for more than 40 years. Florida law allowed the Democratic Party, of which Wood was a member, to choose a candidate to receive Wood’s votes. In Alabama, Charles Beasley won a seat on the Bibb County Commission despite dying on Oct. 12. (Beasley’s opponent lamented the challenge of running a negative campaign against a dead man.) The governor will appoint a replacement.

[div class=attrib]Read the entire article after the jump.[end-div]

The Myth of Social Mobility

There is a commonly held myth in the United States that anyone can make it; that is, even if you’re at the bottom of the income distribution curve you have the opportunity to climb up to a wealthier future. Independent research over the last couple of decades debunks this myth and paints a rather different and more disturbing reality. For instance, it shows how Americans are now less socially mobile — in the upward sense — than citizens of Canada and most countries in Europe.

[div class=attrib]From the Economist:[end-div]

The Hamptons, a string of small towns on the south shore of Long Island, have long been a playground for America’s affluent. Nowadays the merely rich are being crimped by the ultra-wealthy. In August it can cost $400,000 to rent a fancy house there. The din of helicopters and private jets is omnipresent. The “Quiet Skies Coalition”, formed by a group of angry residents, protests against the noise, particularly of one billionaire’s military-size Chinook. “You can’t even play tennis,” moans an old-timer who stays near the East Hampton airport. “It’s like the third world war with GIV and GV jets.”

Thirty years ago, Loudoun County, just outside Washington, DC, in Northern Virginia, was a rural backwater with a rich history. During the war of 1812 federal documents were kept safe there from the English. Today it is the wealthiest county in America. Rolling pastures have given way to technology firms, swathes of companies that thrive on government contracts and pristine neighbourhoods with large houses. The average household income, at over $130,000, is twice the national level. The county also marks the western tip of the biggest cluster of affluence in the country. Between Loudoun County and north-west Washington, DC, there are over 800,000 people in exclusive postcodes that are home to the best-educated and wealthiest 5% of the population, dubbed “superzips” by Charles Murray, a libertarian social scientist.

[div class=attrib]Read the entire article following the jump.[end-div]

Hearing and Listening

Auditory neuroscientist Seth Horowitz guides us through the science of hearing and listening in his new book, “The Universal Sense: How Hearing Shapes the Mind.” He clarifies the important distinction between attentive listening with the mind and the more passive act of hearing, and laments the many modern distractions that threaten our ability to listen effectively.

[div class=attrib]From the New York Times:[end-div]

HERE’S a trick question. What do you hear right now?

If your home is like mine, you hear the humming sound of a printer, the low throbbing of traffic from the nearby highway and the clatter of plastic followed by the muffled impact of paws landing on linoleum — meaning that the cat has once again tried to open the catnip container atop the fridge and succeeded only in knocking it to the kitchen floor.

The slight trick in the question is that, by asking you what you were hearing, I prompted your brain to take control of the sensory experience — and made you listen rather than just hear. That, in effect, is what happens when an event jumps out of the background enough to be perceived consciously rather than just being part of your auditory surroundings. The difference between the sense of hearing and the skill of listening is attention.

Hearing is a vastly underrated sense. We tend to think of the world as a place that we see, interacting with things and people based on how they look. Studies have shown that conscious thought takes place at about the same rate as visual recognition, requiring a significant fraction of a second per event. But hearing is a quantitatively faster sense. While it might take you a full second to notice something out of the corner of your eye, turn your head toward it, recognize it and respond to it, the same reaction to a new or sudden sound happens at least 10 times as fast.

This is because hearing has evolved as our alarm system — it operates out of line of sight and works even while you are asleep. And because there is no place in the universe that is totally silent, your auditory system has evolved a complex and automatic “volume control,” fine-tuned by development and experience, to keep most sounds off your cognitive radar unless they might be of use as a signal that something dangerous or wonderful is somewhere within the kilometer or so that your ears can detect.

This is where attention kicks in.

Attention is not some monolithic brain process. There are different types of attention, and they use different parts of the brain. The sudden loud noise that makes you jump activates the simplest type: the startle. A chain of five neurons from your ears to your spine takes that noise and converts it into a defensive response in a mere tenth of a second — elevating your heart rate, hunching your shoulders and making you cast around to see if whatever you heard is going to pounce and eat you. This simplest form of attention requires almost no brains at all and has been observed in every studied vertebrate.

More complex attention kicks in when you hear your name called from across a room or hear an unexpected birdcall from inside a subway station. This stimulus-directed attention is controlled by pathways through the temporoparietal and inferior frontal cortex regions, mostly in the right hemisphere — areas that process the raw, sensory input, but don’t concern themselves with what you should make of that sound. (Neuroscientists call this a “bottom-up” response.)

But when you actually pay attention to something you’re listening to, whether it is your favorite song or the cat meowing at dinnertime, a separate “top-down” pathway comes into play. Here, the signals are conveyed through a dorsal pathway in your cortex, part of the brain that does more computation, which lets you actively focus on what you’re hearing and tune out sights and sounds that aren’t as immediately important.

In this case, your brain works like a set of noise-suppressing headphones, with the bottom-up pathways acting as a switch to interrupt if something more urgent — say, an airplane engine dropping through your bathroom ceiling — grabs your attention.

Hearing, in short, is easy. You and every other vertebrate that hasn’t suffered some genetic, developmental or environmental accident have been doing it for hundreds of millions of years. It’s your life line, your alarm system, your way to escape danger and pass on your genes. But listening, really listening, is hard when potential distractions are leaping into your ears every fifty-thousandth of a second — and pathways in your brain are just waiting to interrupt your focus to warn you of any potential dangers.

Listening is a skill that we’re in danger of losing in a world of digital distraction and information overload.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The Listener (TV series). Courtesy of Shaftsbury Films, CTV / Wikipedia.[end-div]

Big Data Versus Talking Heads

With the election in the United States now decided, the dissection of the result is well underway. And perhaps the biggest winner of all is the science of big data. Yes, mathematical analysis of vast quantities of demographic and polling data triumphed over the voodoo proclamations and gut-feel predictions of the punditocracy. Now, that’s a result truly worth celebrating.

[div class=attrib]From ReadWriteWeb:[end-div]

Political pundits, mostly Republican, went into a frenzy when Nate Silver, a New York Times pollster and stats blogger, predicted that Barack Obama would win reelection.

But Silver was right and the pundits were wrong – and the impact of this goes way beyond politics.

Silver won because, um, science. As ReadWrite’s own Dan Rowinski noted,  Silver’s methodology is all based on data. He “takes deep data sets and applies logical analytical methods” to them. It’s all just numbers.

Silver runs a blog called FiveThirtyEight, which is licensed by the Times. In 2008 he called the presidential election with incredible accuracy, getting 49 out of 50 states right. But this year he rolled a perfect score, 50 out of 50, even nailing the margins in many cases. His uncanny accuracy on this year’s election represents what Rowinski calls a victory of “logic over punditry.”
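To make “logic over punditry” concrete, here is a minimal, deliberately simplified sketch of the data-driven approach: average each state’s recent polls and call the state for whoever leads. The states and numbers below are invented for illustration, and this is nothing like Silver’s actual FiveThirtyEight model, which, among other things, weights polls by quality and recency and simulates thousands of election outcomes.

```python
# Toy poll aggregator: invented numbers, not Silver's model.
# Each state maps to a list of (obama_pct, romney_pct) poll results.
state_polls = {
    "Ohio": [(50, 48), (51, 47), (49, 48)],
    "Florida": [(49, 50), (50, 49), (50, 48)],
}

for state, polls in state_polls.items():
    obama = sum(o for o, _ in polls) / len(polls)
    romney = sum(r for _, r in polls) / len(polls)
    leader = "Obama" if obama > romney else "Romney"
    print(f"{state}: {leader} leads by {abs(obama - romney):.1f} points")
```

The point is not the arithmetic, which is trivial, but the discipline: the call comes from the data rather than from the gut.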

In fact it’s bigger than that. Bear in mind that before turning his attention to politics in 2007 and 2008, Silver was using computer models to make predictions about baseball. What does it mean when some punk kid baseball nerd can just wade into politics and start kicking butt on all these long-time “experts” who have spent their entire lives covering politics?

It means something big is happening.

Man Versus Machine

This is about the triumph of machines and software over gut instinct.

The age of voodoo is over. The era of talking about something as a “dark art” is done. In a world with big computers and big data, there are no dark arts.

And thank God for that. One by one, computers and the people who know how to use them are knocking off these crazy notions about gut instinct and intuition that humans like to cling to. For far too long we’ve applied this kind of fuzzy thinking to everything, from silly stuff like sports to important stuff like medicine.

Someday, and I hope it’s soon, we will enter the age of intelligent machines, when true artificial intelligence becomes a reality, and when we look back on the late 20th and early 21st century it will seem medieval in its simplicity and reliance on superstition.

What most amazes me is the backlash and freak-out that occurs every time some “dark art” gets knocked over in a particular domain. Watch Moneyball (or read the book) and you’ll see the old guard (in that case, baseball scouts) grow furious as they realize that computers can do their job better than they can. (Of course it’s not computers; it’s people who know how to use computers.)

We saw the same thing when IBM’s Deep Blue defeated Garry Kasparov in 1997. We saw it when Watson beat humans at Jeopardy.

It’s happening in advertising, which used to be a dark art but is increasingly a computer-driven numbers game. It’s also happening in my business, the news media, prompting the same kind of furor as happened with the baseball scouts in Moneyball.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Political pundits, Left to right: Mark Halperin, David Brooks, Jon Stewart, Tim Russert, Matt Drudge, John Harris & Jim VandeHei, Rush Limbaugh, Sean Hannity, Chris Matthews, Karl Rove. Courtesy of Telegraph.[end-div]

Dragons of the Mind

[div class=attrib]From the Wall Street Journal:[end-div]

Peter Jackson’s “Hobbit” movie is on its way, and with it will come the resurrection of the vile dragon Smaug. With fiery breath, razor-sharp claws, scales as hard as shields and a vast underground lair, Smaug is portrayed in J.R.R. Tolkien’s text as a merciless killer. But where did the idea for such a bizarre beast—with such an odd mixture of traits—come from in the first place?

Historically, most monsters were spawned not from pure imagination but from aspects of the natural world that our ancestors did not understand. Whales seen breaking the surface of the ancient oceans were sea monsters, the fossilized bones of prehistoric humans were the victims of Medusa, the roars of earthquakes were thought to emanate from underground beasts. The list goes on. But tracing Smaug’s draconic heritage is more complicated.

At first glance, dinosaurs seem the obvious source for the dragon myth. Our ancestors simply ran into Tyrannosaur skulls, became terrified and came up with the idea that such monsters must still be around. It all sounds so logical, but it’s unlikely to be true.

Dragon myths were alive and well in the ancient Mediterranean world, despite the fact that the region is entirely bereft of dinosaur fossils. The Assyrians had Tiamat, a giant snake with horns (leading some to dispute whether it even qualifies as a dragon). The Greeks, for their part, had a fierce reptilian beast that guarded the golden fleece. In depicting it, they oscillated between a tiny viper and a huge snake capable of swallowing people whole. But even in this latter case, there was no fire-breathing or underground hoard, just a big reptile.

For decades, zoologists have argued that the only snakes humans ever had to seriously worry about were of the venomous variety. Last year, however, a study published in the Proceedings of the National Academy of Sciences revealed that members of Indonesian tribes are regularly eaten by enormous constrictors and that this was likely a common problem throughout human evolution. Moreover, reports by Pliny the Elder and others describe snakes of such size existing in the ancient Mediterranean world and sometimes attacking people. It seems likely that the early dragon myths were based on these real reptilian threats.

But Tolkien’s Smaug lives below the Lonely Mountain and breathes fire. Some reptiles live below ground, but none breathes anything that looks remotely like flame. Yet as strange as this trait may seem, it too may have natural origins.

Among the earliest mythical dragons that lived underground are those found in the 12th-century tales of Geoffrey of Monmouth. Monmouth recounts the story of Vortigern, an ancient British king who was forced to flee to the hills of Wales as Saxons invaded. Desperate to make a final stand, Vortigern orders a fortress to be built, but its walls keep falling over. Baffled, Vortigern seeks the advice of his wise men, who tell him that the ground must be consecrated with the blood of a child who is not born from the union between a man and a woman. Vortigern agrees and sends the wise men off to find such a child.

Not far away, in the town of Carmarthen, they come across two boys fighting. One insults the other as a bastard who has no father, and the wise men drag him back to Vortigern.

When the boy learns that he is to be killed, he tells Vortigern that his advisers have got things wrong. He declares that there are dragons below the ground and that their wrestling with one another is what keeps the walls from standing. Vortigern tests the boy’s theory out, and sure enough, as his men dig deeper, they discover the dragons’ “panting” flames.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Zmey Gorynych, the Russian three-headed dragon. Courtesy of Wikipedia.[end-div]

Black and White or Color

Please forget Instagram, Photoshop filters, redeye elimination, automatic camera shake reduction systems and high dynamic range apps. If you’re a true photographer or simply a lover of great photography, the choice is much simpler: black and white or color.

A new photography exhibit in London sets these contrasting media alongside each other and lets you decide. The great Henri Cartier-Bresson would have you believe that black and white images live in a class of their own, far above the lowly color snap. He was vociferous in his opinion that, for technical and aesthetic reasons, only black and white photography could be considered art.

So, the curators of the exhibition, Cartier-Bresson: A Question of Colour, have juxtaposed 10 of Cartier-Bresson’s prints alongside the colorful works of 15 international contemporary photographers. The results show that “the decisive moment”, so integral to Cartier-Bresson’s memorable black and white images, can be adapted to great, if not equal, effect in color.

The exhibit can be seen at Somerset House, London and runs from 8 November 2012 to 27 January 2013.

[div class=attrib]From Somerset House:[end-div]

Positive View Foundation announces its inaugural exhibition Cartier-Bresson: A Question of Colour, to be held at Somerset House, 8 November 2012 – 27 January 2013. Curated by William A. Ewing, the exhibition will feature 10 Henri Cartier-Bresson photographs never before exhibited in the UK alongside over 75 works by 15 international contemporary photographers, including: Karl Baden (US), Carolyn Drake (US), Melanie Einzig (US), Andy Freeberg (US), Harry Gruyaert (Belgium), Ernst Haas (Austrian), Fred Herzog (Canadian), Saul Leiter (US), Helen Levitt (US), Jeff Mermelstein (US), Joel Meyerowitz (US), Trent Parke (Australian), Boris Savelev (Ukrainian), Robert Walker (Canadian), and Alex Webb (US).

The extensive showcase will illustrate how photographers working in Europe and North America adopted and adapted the master’s ethos famously known as  ‘the decisive moment’ to their work in colour. Though they often departed from the concept in significant ways, something of that challenge remained: how to seize something that happens and capture it in the very moment that it takes place.

It is well-known that Cartier-Bresson was disparaging towards colour photography, which in the 1950s was in its early years of development, and his reasoning was based both on the technical and aesthetic limitations of the medium at the time.

Curator William A. Ewing has conceived the exhibition in terms of, as he puts it, ‘challenge and response’. “This exhibition will show how Henri Cartier-Bresson, in spite of his skeptical attitude regarding the artistic value of colour photography, nevertheless exerted a powerful influence over photographers who took up the new medium and who were determined to put a personal stamp on it. In effect, his criticisms of colour spurred on a new generation, determined to overcome the obstacles and prove him wrong. A Question of Colour simultaneously pays homage to a master who felt that black and white photography was the ideal medium, and could not be bettered, and to a group of photographers of the 20th and 21st centuries who chose the path of colour and made, and continue to make, great strides.”

Cartier-Bresson: A Question of Colour will feature a selection of photographers whose commitment to expression in colour was – or is – wholehearted and highly sophisticated, and which measured up to Cartier-Bresson’s essential requirement that content and form were in perfect balance. Some of these artists were Cartier-Bresson’s contemporaries, like Helen Levitt, or even, as with Ernst Haas, his friends; others, such as Fred Herzog in Vancouver, knew the artist’s seminal work across vast distances; others were junior colleagues, such as Harry Gruyaert, who found himself debating colour ferociously with the master; and others still, like Andy Freeberg or Carolyn Drake, never knew the man first-hand, but were deeply influenced by his example.

[div class=attrib]Find out more about the exhibit here.[end-div]

[div class=attrib]Image Henri Cartier-Bresson. Courtesy of Wikipedia.[end-div]

How We Die (In Britain)

This handy infographic is compiled from data published by the Office for National Statistics in the United Kingdom. So, if you live in the British Isles it will give you an inkling of your likely cause of death. Interestingly, if you live in the United States you are more likely to die of a gunshot wound than a Briton is to die from falling from a building.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Infographic courtesy of the Guardian.[end-div]

The Military-Industrial Complex

[tube]8y06NSBBRtY[/tube]

In his op-ed, author Aaron B. O’Connell reminds us of Eisenhower’s prescient warning to the nation about the growing power of the military-industrial complex in national affairs.

[div class=attrib]From the New York Times:[end-div]

IN 1961, President Dwight D. Eisenhower left office warning of the growing power of the military-industrial complex in American life. Most people know the term the president popularized, but few remember his argument.

In his farewell address, Eisenhower called for a better equilibrium between military and domestic affairs in our economy, politics and culture. He worried that the defense industry’s search for profits would warp foreign policy and, conversely, that too much state control of the private sector would cause economic stagnation. He warned that unending preparations for war were incongruous with the nation’s history. He cautioned that war and warmaking took up too large a proportion of national life, with grave ramifications for our spiritual health.

The military-industrial complex has not emerged in quite the way Eisenhower envisioned. The United States spends an enormous sum on defense — over $700 billion last year, about half of all military spending in the world — but in terms of our total economy, it has steadily declined to less than 5 percent of gross domestic product from 14 percent in 1953. Defense-related research has not produced an ossified garrison state; in fact, it has yielded a host of beneficial technologies, from the Internet to civilian nuclear power to GPS navigation. The United States has an enormous armaments industry, but it has not hampered employment and economic growth. In fact, Congress’s favorite argument against reducing defense spending is the job loss such cuts would entail.

Nor has the private sector infected foreign policy in the way that Eisenhower warned. Foreign policy has become increasingly reliant on military solutions since World War II, but we are a long way from the Marines’ repeated occupations of Haiti, Nicaragua and the Dominican Republic in the early 20th century, when commercial interests influenced military action. Of all the criticisms of the 2003 Iraq war, the idea that it was done to somehow magically decrease the cost of oil is the least credible. Though it’s true that mercenaries and contractors have exploited the wars of the past decade, hard decisions about the use of military force are made today much as they were in Eisenhower’s day: by the president, advised by the Joint Chiefs of Staff and the National Security Council, and then more or less rubber-stamped by Congress. Corporations do not get a vote, at least not yet.

But Eisenhower’s least heeded warning — concerning the spiritual effects of permanent preparations for war — is more important now than ever. Our culture has militarized considerably since Eisenhower’s era, and civilians, not the armed services, have been the principal cause. From lawmakers’ constant use of “support our troops” to justify defense spending, to TV programs and video games like “NCIS,” “Homeland” and “Call of Duty,” to NBC’s shameful and unreal reality show “Stars Earn Stripes,” Americans are subjected to a daily diet of stories that valorize the military while the storytellers pursue their own opportunistic political and commercial agendas. Of course, veterans should be thanked for serving their country, as should police officers, emergency workers and teachers. But no institution — particularly one financed by the taxpayers — should be immune from thoughtful criticism.

[div class=attrib]Read the entire article after the jump.[end-div]

Prodigies and the Rest of Us

[div class=attrib]From the New York Times:[end-div]

Drew Petersen didn’t speak until he was 3½, but his mother, Sue, never believed he was slow. When he was 18 months old, in 1994, she was reading to him and skipped a word, whereupon Drew reached over and pointed to the missing word on the page. Drew didn’t produce much sound at that stage, but he already cared about it deeply. “Church bells would elicit a big response,” Sue told me. “Birdsong would stop him in his tracks.”

Sue, who learned piano as a child, taught Drew the basics on an old upright, and he became fascinated by sheet music. “He needed to decode it,” Sue said. “So I had to recall what little I remembered, which was the treble clef.” As Drew told me, “It was like learning 13 letters of the alphabet and then trying to read books.” He figured out the bass clef on his own, and when he began formal lessons at 5, his teacher said he could skip the first six months’ worth of material. Within the year, Drew was performing Beethoven sonatas at the recital hall at Carnegie Hall. “I thought it was delightful,” Sue said, “but I also thought we shouldn’t take it too seriously. He was just a little boy.”

On his way to kindergarten one day, Drew asked his mother, “Can I just stay home so I can learn something?” Sue was at a loss. “He was reading textbooks this big, and they’re in class holding up a blowup M,” she said. Drew, who is now 18, said: “At first, it felt lonely. Then you accept that, yes, you’re different from everyone else, but people will be your friends anyway.” Drew’s parents moved him to a private school. They bought him a new piano, because he announced at 7 that their upright lacked dynamic contrast. “It cost more money than we’d ever paid for anything except a down payment on a house,” Sue said. When Drew was 14, he discovered a home-school program created by Harvard; when I met him two years ago, he was 16, studying at the Manhattan School of Music and halfway to a Harvard bachelor’s degree.

Prodigies are able to function at an advanced adult level in some domain before age 12. “Prodigy” derives from the Latin “prodigium,” a monster that violates the natural order. These children have differences so evident as to resemble a birth defect, and it was in that context that I came to investigate them. Having spent 10 years researching a book about children whose experiences differ radically from those of their parents and the world around them, I found that stigmatized differences — having Down syndrome, autism or deafness; being a dwarf or being transgender — are often clouds with silver linings. Families grappling with these apparent problems may find profound meaning, even beauty, in them. Prodigiousness, conversely, looks from a distance like silver, but it comes with banks of clouds; genius can be as bewildering and hazardous as a disability. Despite the past century’s breakthroughs in psychology and neuroscience, prodigiousness and genius are as little understood as autism. “Genius is an abnormality, and can signal other abnormalities,” says Veda Kaplinsky of Juilliard, perhaps the world’s pre-eminent teacher of young pianists. “Many gifted kids have A.D.D. or O.C.D. or Asperger’s. When the parents are confronted with two sides of a kid, they’re so quick to acknowledge the positive, the talented, the exceptional; they are often in denial over everything else.”

We live in ambitious times. You need only to go through the New York preschool application process, as I recently did for my son, to witness the hysteria attached to early achievement, the widespread presumption that a child’s destiny hinges on getting a baby foot on a tall ladder. Parental obsessiveness on this front reflects the hegemony of developmental psychiatry, with its insistence that first experience is formative. We now know that brain plasticity diminishes over time; it is easier to mold a child than to reform an adult. What are we to do with this information? I would hate for my children to feel that their worth is contingent on sustaining competitive advantage, but I’d also hate for them to fall short of their potential. Tiger mothers who browbeat their children into submission overemphasize a narrow category of achievement over psychic health. Attachment parenting, conversely, often sacrifices accomplishment to an ideal of unboundaried acceptance that can be equally pernicious. It’s tempting to propose some universal answer, but spending time with families of remarkably talented children showed me that what works for one child can be disastrous for another.

Children who are pushed toward success and succeed have a very different trajectory from that of children who are pushed toward success and fail. I once told Lang Lang, a prodigy par excellence and now perhaps the most famous pianist in the world, that by American standards, his father’s brutal methods — which included telling him to commit suicide, refusing any praise, browbeating him into abject submission — would count as child abuse. “If my father had pressured me like this and I had not done well, it would have been child abuse, and I would be traumatized, maybe destroyed,” Lang responded. “He could have been less extreme, and we probably would have made it to the same place; you don’t have to sacrifice everything to be a musician. But we had the same goal. So since all the pressure helped me become a world-famous star musician, which I love being, I would say that, for me, it was in the end a wonderful way to grow up.”

While it is true that some parents push their kids too hard and give them breakdowns, others fail to support a child’s passion for his own gift and deprive him of the only life that he would have enjoyed. You can err in either direction. Given that there is no consensus about how to raise ordinary children, it is not surprising that there is none about how to raise remarkable children. Like parents of children who are severely challenged, parents of exceptionally talented children are custodians of young people beyond their comprehension.

Spending time with the Petersens, I was struck not only by their mutual devotion but also by the easy way they avoided the snobberies that tend to cling to classical music. Sue is a school nurse; her husband, Joe, works in the engineering department of Volkswagen. They never expected the life into which Drew has led them, but they have neither been intimidated by it nor brash in pursuing it; it remains both a diligence and an art. “How do you describe a normal family?” Joe said. “The only way I can describe a normal one is a happy one. What my kids do brings a lot of joy into this household.” When I asked Sue how Drew’s talent had affected how they reared his younger brother, Erik, she said: “It’s distracting and different. It would be similar if Erik’s brother had a disability or a wooden leg.”

Prodigiousness manifests most often in athletics, mathematics, chess and music. A child may have a brain that processes chess moves or mathematical equations like some dream computer, which is its own mystery, but how can the mature emotional insight that is necessary to musicianship emerge from someone who is immature? “Young people like romance stories and war stories and good-and-evil stories and old movies because their emotional life mostly is and should be fantasy,” says Ken Noda, a great piano prodigy in his day who gave up public performance and now works at the Metropolitan Opera. “They put that fantasized emotion into their playing, and it is very convincing. I had an amazing capacity for imagining these feelings, and that’s part of what talent is. But it dries up, in everyone. That’s why so many prodigies have midlife crises in their late teens or early 20s. If our imagination is not replenished with experience, the ability to reproduce these feelings in one’s playing gradually diminishes.”

Musicians often talked to me about whether you achieve brilliance on the violin by practicing for hours every day or by reading Shakespeare, learning physics and falling in love. “Maturity, in music and in life, has to be earned by living,” the violinist Yehudi Menuhin once said. Who opens up or blocks access to such living? A musical prodigy’s development hinges on parental collaboration. Without that support, the child would never gain access to an instrument, the technical training that even the most devout genius requires or the emotional nurturance that enables a musician to achieve mature expression. As David Henry Feldman and Lynn T. Goldsmith, scholars in the field, have said, “A prodigy is a group enterprise.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Portrait of Wolfgang Amadeus Mozart aged six years old, by anonymous. Courtesy of Wikipedia.[end-div]

Democracy is Ugly and Petty

While this election cycle in the United States has been especially partisan, it’s worth remembering that politics in an open democracy is sometimes brutal, frequently nasty and often petty. Partisan fights, both metaphorical and physical, have been occurring since the Republic was founded.

[div class=attrib]From the New York Times:[end-div]

As the cable news channels count down the hours before the first polls close on Tuesday, an entire election cycle will have passed since President Obama last sat down with Fox News. The organization’s standing request to interview the president is now almost two years old.

At NBC News, the journalists reporting on the Romney campaign will continue to absorb taunts from their sources about their sister cable channel, MSNBC. “You mean, Al Sharpton’s network,” as Stuart Stevens, a senior Romney adviser, is especially fond of reminding them.

Spend just a little time watching either Fox News or MSNBC, and it is easy to see why such tensions run high. In fact, by some measures, the partisan bitterness on cable news has never been as stark — and in some ways, as silly or small.

Martin Bashir, the host of MSNBC’s 4 p.m. hour, recently tried to assess why Mitt Romney seemed irritable on the campaign trail and offered a provocative theory: that he might have mental problems.

“Mrs. Romney has expressed concerns about her husband’s mental well-being,” Mr. Bashir told one of his guests. “But do you get the feeling that perhaps there’s more to this than she’s saying?”

Over on Fox News, similar psychological evaluations were under way on “Fox & Friends.” Keith Ablow, a psychiatrist and a member of the channel’s “Medical A-Team,” suggested that Joseph R. Biden Jr.’s “bizarre laughter” during the vice-presidential debate might have something to do with a larger mental health issue. “You have to put dementia on the differential diagnosis,” he noted matter-of-factly.

Neither outlet has built its reputation on moderation and restraint, but during this presidential election, research shows that both are pushing their stridency to new levels.

A Pew Research Center study found that of Fox News stories about Mr. Obama from the end of August through the end of October, just 6 percent were positive and 46 percent were negative.

Pew also found that Mr. Obama was covered far more than Mr. Romney. The president was a significant figure in 74 percent of Fox’s campaign stories, compared with 49 percent for Romney. In 2008, Pew found that the channel reported on Mr. Obama and John McCain in roughly equal amounts.

The greater disparity was on MSNBC, which gave Mr. Romney positive coverage just 3 percent of the time, Pew found. It examined 259 segments about Mr. Romney and found that 71 percent were negative.

MSNBC, whose programs are hosted by a new crop of extravagant partisans like Mr. Bashir, Mr. Sharpton and Lawrence O’Donnell, has tested the limits of good taste this year. Mr. O’Donnell was forced to apologize in April after describing the Mormon Church as nothing more than a scheme cooked up by a man who “got caught having sex with the maid and explained to his wife that God told him to do it.”

The channel’s hosts recycle talking points handed out by the Obama campaign, even using them as titles for program segments, like Mr. Bashir did recently with a segment he called “Romnesia,” referring to Mr. Obama’s term to explain his opponent’s shifting positions.

The hosts insult and mock, like Alex Wagner did in recently describing Mr. Romney’s trip overseas as “National Lampoon’s European Vacation” — a line she borrowed from an Obama spokeswoman. Mr. Romney was not only hapless, Ms. Wagner said, he also looked “disheveled” and “a little bit sweaty” in a recent appearance.

Not that they save their scorn just for their programs. Some MSNBC hosts even use the channel’s own ads promoting its slogan “Lean Forward,” to criticize Mr. Romney and the Republicans. Mr. O’Donnell accuses the Republican nominee of basing his campaign on the false notion that Mr. Obama is inciting class warfare. “You have to come up with a lie,” he says, when your campaign is based on empty rhetoric.

In her ad, Rachel Maddow breathlessly decodes the logic behind the push to overhaul state voting laws. “The idea is to shrink the electorate,” she says, “so a smaller number of people get to decide what happens to all of us.”

Such stridency has put NBC News journalists who cover Republicans in awkward and compromised positions, several people who work for the network said. To distance themselves from their sister channel, they have started taking steps to reassure Republican sources, like pointing out that they are reporting for NBC programs like “Today” and “Nightly News” — not for MSNBC.

At Fox News, there is a palpable sense that the White House punishes the outlet for its coverage, not only by withholding the president, who has done interviews with every other major network, but also by denying them access to Michelle Obama.

This fall, Mrs. Obama has done a spate of television appearances, from CNN to “Jimmy Kimmel Live” on ABC. But when officials from Fox News recently asked for an interview with the first lady, they were told no. She has not appeared on the channel since 2010, when she sat down with Mike Huckabee.

Lately the White House and Fox News have been at odds over the channel’s aggressive coverage of the attack on the American diplomatic mission in Benghazi, Libya. Fox initially raised questions over the White House’s explanation of the events that led to the attack — questions that other news organizations have since started reporting on more fully.

But the commentary on the channel quickly and often turns to accusations that the White House played politics with American lives. “Everything they told us was a lie,” Sean Hannity said recently as he and John H. Sununu, a former governor of New Hampshire and a Romney campaign supporter, took turns raising questions about how the Obama administration misled the public. “A hoax,” Mr. Hannity called the administration’s explanation. “A cover-up.”

Mr. Hannity has also taken to selectively fact-checking Mr. Obama’s claims, co-opting a journalistic tool that has proliferated in this election as news outlets sought to bring more accountability to their coverage.

Mr. Hannity’s guest fact-checkers have included hardly objective sources, like Dick Morris, the former Clinton aide turned conservative commentator; Liz Cheney, the daughter of former Vice President Dick Cheney; and Michelle Malkin, the right-wing provocateur.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of University of Maine at Farmington.[end-div]

The Beauty of Ugliness

The endless pursuit of beauty in human affairs probably pre-dates our historical record. We certainly know that ancient Egyptians used cosmetics believing them to offer magical and religious powers, in addition to aesthetic value.

Yet, paradoxically, beauty is rather subjective and often fleeting. The French singer, songwriter, composer and bon viveur Serge Gainsbourg once said that “ugliness is superior to beauty because it lasts longer”. Author Stephen Bayley argues in his new book, “Ugly: The Aesthetics of Everything”, that beauty is downright boring.

[div class=attrib]From the Telegraph:[end-div]

Beauty is boring. And the evidence is piling up. An article in the journal Psychological Science now confirms what partygoers have known forever: that beauty and charm are no more directly linked than a high IQ and a talent for whistling.

A group of scientists set out to discover whether physically attractive people also have appealing character traits and values, and found, according to Lihi Segal-Caspi, who carried out part of the research, that “beautiful people tend to focus more on conformity and self-promotion than independence and tolerance”.

Certainly, while a room full of beautiful people might be impressively stiff with the whiff of Chanel No 5, the intellectual atmosphere will be carrying a very low charge. If positive at all.

The grizzled and gargoyle-like Parisian chanteur, and legendary lover, Serge Gainsbourg always used to pick up the ugliest girls at parties. This was not simply because predatory male folklore insists that ill-favoured women will be more “grateful”, but because Gainsbourg, a stylish contrarian, knew that the conversation would be better, the uglier the girl.

Beauty is a conformist conspiracy. And the conspirators include the fashion, cosmetics and movie businesses: a terrible Greek chorus of brainless idolatry towards abstract form. The conspirators insist that women – and, nowadays, men, too – should be un-creased, smooth, fat-free, tanned and, with the exception of the skull, hairless. Flawlessly dull. Even Hollywood once acknowledged the weakness of this proposition: Marilyn Monroe was made more attractive still by the addition of a “beauty spot”, a blemish turned into an asset.

The red carpet version of beauty is a feeble, temporary construction. Bodies corrode and erode, sag and bulge, just as cars rust and buildings develop a fine patina over time. This is not to be feared, rather to be understood and enjoyed. Anyone wishing to arrest these processes with the aid of surgery, aerosols, paint, glue, drugs, tape and Lycra must be both very stupid and very vain. Hence the problems encountered in conversation with beautiful people: stupidity and vanity rarely contribute much to wit and creativity.

Fine features may be all very well, but the great tragedy of beauty is that it is so ephemeral. Albert Camus said it “drives us to despair, offering for a minute the glimpse of an eternity that we should like to stretch out over the whole of time”. And Gainsbourg agreed when he said: “Ugliness is superior to beauty because it lasts longer.” A hegemony of beautiful perfection would be intolerable: we need a good measure of ugliness to keep our senses keen. If everything were beautiful, nothing would be.

And yet, despite the evidence against, there has been a conviction that beauty and goodness are somehow inextricably and permanently linked. Political propaganda exploited our primitive fear of ugliness, so we had Second World War American posters of Japanese looking like vampire bats. The Greeks believed that beauty had a moral character: beautiful people – discus-throwers and so on – were necessarily good people. Darwin explained our need for “beauty” in saying that breeding attractive children is a survival characteristic: I may feel the need to fuse my premium genetic material with yours, so that humanity continues in the same fine style.

This became a lazy consensus, described as the “beauty premium” by US economists Markus M Mobius and Tanya S Rosenblat. The “beauty premium” insists that as attractive children grow into attractive adults, they may find it easier to develop agreeable interpersonal communications skills because their audience reacts more favourably to them. In this beauty-related employment theory, short people are less likely to get a good job. As Randy Newman sang: “Short people got no reason to live.” So Darwin’s argument that evolutionary forces favour a certain physical type may be proven in the job market as well as the wider world.

But as soon as you try to grasp the concept of beauty, it disappears.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of Google search.[end-div]

Does Evil Exist?

Humans have a peculiar habit of anthropomorphizing anything that moves, and for that matter, most objects that remain static as well. So, it is not surprising that evil is often personified and even stereotyped; it is said that true evil even has a home somewhere below where you currently stand.

[div class=attrib]From the Guardian:[end-div]

The friction between the presence of evil in our world and belief in a loving creator God sparks some tough questions. For many religious people these are primarily existential questions, as their faith contends with doubt and bewilderment. The biblical figure of Job, the righteous man who loses everything that is dear to him, remains a powerful example of this struggle. But the “problem of evil” is also an intellectual puzzle that has taxed the minds of philosophers and theologians for centuries.

One of the most influential responses to the problem of evil comes from St Augustine. As a young man, Augustine followed the teachings of a Christian sect known as the Manichees. At the heart of Manichean theology was the idea of a cosmic battle between the forces of good and evil. This, of course, proposes one possible solution to the problem of evil: all goodness, purity and light comes from God, and the darkness of evil has a different source.

However, Augustine came to regard this cosmic dualism as heretical, since it undermined God’s sovereignty. Of course, he wanted to hold on to the absolute goodness of God. But if God is the source of all things, where did evil come from? Augustine’s radical answer to this question is that evil does not actually come from anywhere. Rejecting the idea that evil is a positive force, he argues that it is merely a “name for nothing other than the absence of good”.

At first glance this looks like a philosophical sleight of hand. Augustine might try to define evil out of existence, but this cannot diminish the reality of the pain, suffering and cruelty that prompt the question of evil in the first place. As the 20th-century Catholic writer Charles Journet put it, the non-being of evil “can have a terrible reality, like letters carved out of stone”. Any defence of Augustine’s position has to begin by pointing out that his account of evil is metaphysical rather than empirical. In other words, he is not saying that our experience of evil is unreal. On the contrary, since a divinely created world is naturally oriented toward the good, any lack of goodness will be felt as painful, wrong and urgently in need of repair. To say that hunger is “merely” the absence of food is not to deny the intense suffering it involves.

One consequence of Augustine’s mature view of evil as “non-being”, a privation of the good, is that evil eludes our understanding. His sophisticated metaphysics of evil confirms our intuitive response of incomprehension in the face of gratuitous brutality, or of senseless “natural” evil like a child’s cancer. Augustine emphasises that evil is ultimately inexplicable, since it has no substantial existence: “No one therefore must try to get to know from me what I know that I do not know, unless, it may be, in order to learn not to know what must be known to be incapable of being known!” Interestingly, by the way, this mysticism about evil mirrors the “negative theology” which insists that God exceeds the limits of our understanding.

So, by his own admission, Augustine’s “solution” to the problem of evil defends belief in God without properly explaining the kinds of acts which exert real pressure on religious faith. He may be right to point out that the effects of evil tend to be destruction and disorder – a twisting or scarring of nature, and of souls. Nevertheless, believers and non-believers alike will feel that this fails to do justice to the power of evil. We may demand a better account of the apparent positivity of evil – of the fact, for example, that holocausts and massacres often involve meticulous planning, technical innovation and creative processes of justification.

Surprisingly, though, the basic insight of Augustinian theodicy finds support in recent science. In his 2011 book Zero Degrees of Empathy, Cambridge psychopathology professor Simon Baron-Cohen proposes “a new theory of human cruelty”. His goal, he writes, is to replace the “unscientific” term “evil” with the idea of “empathy erosion”: “People said to be cruel or evil are simply at one extreme of the empathy spectrum,” he writes. (He points out, though, that some people at this extreme display no more cruelty than those higher up the empathy scale – they are simply socially isolated.)

Loss of empathy resembles the Augustinian concept of evil in that it is a deficiency of goodness – or, to put it less moralistically, a disruption of normal functioning – rather than a positive force. In this way at least, Baron-Cohen’s theory echoes Augustine’s argument, against the Manicheans, that evil is not an independent reality but, in essence, a lack or a loss.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Marvel Comics Vault of Evil. Courtesy of Wikia / Marvel Comics.[end-div]

From Finely Textured Beef to Soylent Pink

Blame corporate euphemisms and branding for the obfuscation of everyday things. More sinister yet is the constant re-working of names for our ever more processed foodstuffs. Only last year, as several influential health studies pointed towards the detrimental health effects of high-fructose corn syrup (HFCS), did the food industry act, but not by removing copious amounts of the addictive additive from many processed foods. Rather, the industry attempted to re-brand HFCS as “corn sugar”. And now, on to the battle over “soylent pink”, also known as “pink slime”.

[div class=attrib]From Slate:[end-div]

What do you call a mash of beef trimmings that have been chopped and then spun in a centrifuge to remove the fatty bits and gristle? According to the government and to the company that invented the process, you call it lean finely textured beef. But to the natural-food crusaders who would have the stuff removed from the nation’s hamburgers and tacos, the protein-rich product goes by another, more disturbing name: Pink slime.

The story of this activist rebranding—from lean finely textured beef to pink slime—reveals just how much these labels matter. It was the latter phrase that, for example, birthed the great ground-beef scare of 2012. In early March, journalists at both the Daily and at ABC began reporting on a burger panic: Lax rules from the U.S. Department of Agriculture allowed producers to fill their ground-beef packs with a slimy, noxious byproduct—a mush the reporters called unsanitary and without much value as a food. Coverage linked back to a New York Times story from 2009 in which the words pink slime had appeared in public for the first time in a quote from an email written by a USDA microbiologist who was frustrated at a decision to leave the additive off labels for ground meat.

The slimy terror spread in the weeks that followed. Less than a month after ABC’s initial reports, almost a quarter million people had signed a petition to get pink slime out of public school cafeterias. Supermarket chains stopped selling burger meat that contained it—all because of a shift from four matter-of-fact words to two visceral ones.

And now that rebranding has become the basis for a 263-page lawsuit. Last month, Beef Products Inc., the first and principal producer of lean/pink/textured/slimy beef, filed a defamation claim against ABC (along with that microbiologist and a former USDA inspector) in a South Dakota court. The company says the network carried out a malicious and dishonest campaign to discredit its ground-beef additive and that this work had grievous consequences. When ABC began its coverage, Beef Products Inc. was selling 5 million pounds of slime/beef/whatever every week. Then three of its four plants were forced to close, and production dropped to 1.6 million pounds. A weekly profit of $2.3 million had turned into a $583,000 weekly loss.

At Reuters, Steven Brill argued that the suit has merit. I won’t try to comment on its legal viability, but the details of the claim do provide some useful background about how we name our processed foods, in both industry and the media. It turns out the paste now known within the business as lean finely textured beef descends from an older, less purified version of the same. Producers have long tried to salvage the trimmings from a cattle carcass by cleaning off the fat and the bacteria that often congregate on these leftover parts. At best they could achieve a not-so-lean class of meat called partially defatted chopped beef, which USDA deemed too low in quality to be a part of hamburger or ground meat.

By the late 1980s, though, Eldon Roth of Beef Products Inc. had worked out a way to make those trimmings a bit more wholesome. He’d found a way, using centrifuges, to separate the fat more fully. In 1991, USDA approved his product as fat reduced beef and signed off on its use in hamburgers. JoAnn Smith, a government official and former president of the National Cattlemen’s Association, signed off on this “euphemistic designation,” writes Marion Nestle in Food Politics. (Beef Products, Inc. maintains that this decision “was not motivated by any official’s so-called ‘links to the beef industry.’ “) So 20 years ago, the trimmings had already been reformulated and rebranded once.

But the government still said that fat reduced beef could not be used in packages marked “ground beef.” (The government distinction between hamburger and ground beef is that the former can contain added fat, while the latter can’t.) So Beef Products Inc. pressed its case, and in 1993 it convinced the USDA to approve the mash for wider use, with a new and better name: lean finely textured beef. A few years later, Roth started killing the microbes on his trimmings with ammonia gas and got approval to do that, too. With government permission, the company went on to sell several billion pounds of the stuff in the next two decades.

In the meantime, other meat processors started making something similar but using slightly different names. AFA Foods (which filed for bankruptcy in April after the recent ground-beef scandal broke), has referred to its products as boneless lean beef trimmings, a more generic term. Cargill, which decontaminates its meat with citric acid in place of ammonia gas, calls its mash of trimmings finely textured beef.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Industrial ground beef. Courtesy of Wikipedia.[end-div]

Lillian Moller Gilbreth: Inventor of the Modern Kitchen

Lillian Moller Gilbreth, industrial engineer and psychologist, mother of 12 children, and yet a non-cook, invented the design of the modern kitchen. Her ideas were codified into what became known as the Kitchen Practical, unveiled in 1929 at a Women’s Exposition.

[div class=attrib]From Slate:[end-div]

The idea that housework is work now seems like a commonplace. We contract it out to housekeepers, laundromats, cleaning services, takeout places. We divvy it up: You cooked dinner, I’ll do the dishes. We count it as a second shift, as well as primary employment. But it wasn’t until the early part of the 20th century that first a literature, and then a science, developed about the best way to cook and clean. The results of this research shape the way we treat housework today, and created a template for the kitchen that remains conceptually unchanged from the 1920s. And the woman who made the kitchen better? She couldn’t cook.

If that sounds like the set-up for a comedy, that’s because it was. Lillian Moller Gilbreth, industrial psychologist and engineer, was the mother of 12 children. She and husband and partner Frank B. Gilbreth, inventors of what is known as motion study, pioneered the use of short films to watch how industrial processes and office tasks were done, breaking them down into component parts (which they called “therbligs,” Gilbreth backward) to determine how to make a job faster and less taxing. They tested many of their ideas on their children, establishing “the one best way” to take a bath, training preteens to touch type, and charting age-appropriate chores for each child. The ensuing hijinks provided enough material for memoirs written by two Gilbreth children, Cheaper by the Dozen and Belles on Their Toes.

While Frank Gilbreth was alive, he and Lillian worked for industry. She wrote or co-wrote many of his books, but often took no credit, as it was Frank with whom the male executives wanted to deal. After his sudden death in 1924, she had to re-establish herself as a solo female practitioner. According to biographer Jane Lancaster, in Making Time, Gilbreth soon saw that combining her professional expertise on motion study with her assumed expertise on women’s work gave her a marketable niche.

Frank B. Gilbreth Jr. and Ernestine Gilbreth Carey write, in Belles on Their Toes:
If the only way to enter a man’s field was through the kitchen door, that’s the way she’d enter… Mother planned, on paper, an efficiency-type kitchenette of the kind used today in a good many apartments. Under her arrangement, a person could mix a cake, put it in the oven, and do the dishes, without taking more than a couple of dozen steps.

It had to be cake, because that was one of the few dishes Gilbreth made well. Gilbreth had grown up in an upper-class household in California with a Chinese chef. She had worked side-by-side with Frank Gilbreth from the day they married. As she told a group of businesswomen in 1930, “We considered our time too valuable to be devoted to actual labor in the home. We were executives.” And family councils, at the Gilbreth home in Montclair, were run like board meetings.

Even though she did not do it herself, Gilbreth still considered housework unpaid labor, and as such, capable of efficiencies. The worker in the kitchen in the 1920s was often not a servant but the lady of the house, who spent an estimated 50 percent of her day there. The refrigerator had begun to arrive in middle-class homes, but was the subject of a pitched battle between gas and electric companies as to who made the superior chiller. Smaller electric appliances were also in development. “Home economists” raised the bar for domestic health and hygiene. Women became the targets of intense marketing campaigns for products large and small. Gilbreth worked for these manufacturers, and thus is complicit in the rise of consumerism for the home, but she never made explicit endorsements.

She did, however, partner with the Brooklyn Borough Gas Company to develop Gilbreth’s Kitchen Practical, unveiled in 1929 at a Women’s Exposition. The kitchen was intended to showcase the new gas-fueled appliances as well as Gilbreth’s research on motion savings. It was to replace the loose-fit kitchen of many traditional homes (including the Gilbreths’): a large room with discrete pieces of furniture around the edges. These might include a table, a freestanding cupboard or Hoosier cabinet, an icebox, a sink with a drying board and a stove. Ingredients, utensils and cookware might be across the room, or even in a separate pantry.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Kitchen Practical 1929. Courtesy of Gilbreth Network.[end-div]

It’s About Equality, Stupid

[div class=attrib]From Project Syndicate:[end-div]

The king of Bhutan wants to make us all happier. Governments, he says, should aim to maximize their people’s Gross National Happiness rather than their Gross National Product. Does this new emphasis on happiness represent a shift or just a passing fad?

It is easy to see why governments should de-emphasize economic growth when it is proving so elusive. The eurozone is not expected to grow at all this year. The British economy is contracting. Greece’s economy has been shrinking for years. Even China is expected to slow down. Why not give up growth and enjoy what we have?

No doubt this mood will pass when growth revives, as it is bound to. Nevertheless, a deeper shift in attitude toward growth has occurred, which is likely to make it a less important lodestar in the future – especially in rich countries.

The first factor to undermine the pursuit of growth was concern about its sustainability. Can we continue growing at the old rate without endangering our future?

When people started talking about the “natural” limits to growth in the 1970’s, they meant the impending exhaustion of food and non-renewable natural resources. Recently the debate has shifted to carbon emissions. As the Stern Review of 2006 emphasized, we must sacrifice some growth today to ensure that we do not all fry tomorrow.

Curiously, the one taboo area in this discussion is population. The fewer people there are, the less risk we face of heating up the planet. But, instead of accepting the natural decline in their populations, rich-country governments absorb more and more people to hold down wages and thereby grow faster.

A more recent concern focuses on the disappointing results of growth. It is increasingly understood that growth does not necessarily increase our sense of well-being. So why continue to grow?

The groundwork for this question was laid some time ago. In 1974, the economist Richard Easterlin published a famous paper, “Does Economic Growth Improve the Human Lot? Some Empirical Evidence.” After correlating per capita income and self-reported happiness levels across a number of countries, he reached a startling conclusion: probably not.

Above a rather low level of income (enough to satisfy basic needs), Easterlin found no correlation between happiness and GNP per head. In other words, GNP is a poor measure of life satisfaction.

That finding reinforced efforts to devise alternative indexes. In 1972, two economists, William Nordhaus and James Tobin, introduced a measure that they called “Net Economic Welfare,” obtained by deducting from GNP “bad” outputs, like pollution, and adding non-market activities, like leisure. They showed that a society with more leisure and less work could have as much welfare as one with more work – and therefore more GNP – and less leisure.
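Spelled out as a toy sketch (my gloss on the idea, not Nordhaus and Tobin’s actual accounting, which involves many more adjustments), the measure simply starts from GNP, subtracts the “bad” outputs and adds back the value of unpriced activities:

```python
def net_economic_welfare(gnp, bad_outputs, non_market_value):
    """Toy version of the Nordhaus-Tobin idea: GNP, minus 'bads'
    such as pollution costs, plus the estimated value of non-market
    activities such as leisure."""
    return gnp - bad_outputs + non_market_value

# Invented figures, in billions, purely for illustration.
print(net_economic_welfare(gnp=1_000, bad_outputs=50, non_market_value=200))  # prints 1150
```

Two societies with the same measured welfare can therefore differ in how much of it comes from market output and how much from leisure, which is exactly the point the authors were making.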

More recent metrics have tried to incorporate a wider range of “quality of life” indicators. The trouble is that you can measure quantity of stuff, but not quality of life. How one combines quantity and quality in some index of “life satisfaction” is a matter of morals rather than economics, so it is not surprising that most economists stick to their quantitative measures of “welfare.”

But another finding has also started to influence the current debate on growth: poor people within a country are less happy than rich people. In other words, above a low level of sufficiency, peoples’ happiness levels are determined much less by their absolute income than by their income relative to some reference group. We constantly compare our lot with that of others, feeling either superior or inferior, whatever our income level; well-being depends more on how the fruits of growth are distributed than on their absolute amount.

Put another way, what matters for life satisfaction is the growth not of mean income but of median income – the income of the typical person. Consider a population of ten people (say, a factory) in which the managing director earns $150,000 a year and the other nine, all workers, earn $10,000 each. The mean of their incomes is $24,000, but the median is just $10,000, since 90% earn $10,000. With this kind of income distribution, it would be surprising if growth increased the typical person’s sense of well-being.
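For readers who want to check the arithmetic, here is a quick Python sketch using the illustrative salaries above:

```python
# Mean vs. median for the ten-person factory described above
# (illustrative figures only: one director at $150,000, nine workers at $10,000).
from statistics import mean, median

incomes = [150_000] + [10_000] * 9

print(f"mean income:   ${mean(incomes):,.0f}")    # $24,000 - pulled up by the director
print(f"median income: ${median(incomes):,.0f}")  # $10,000 - what the typical worker earns
```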

[div class=attrib]Read the entire article after the jump.[end-div]

The Benefits and Beauty of Blue

[div class=attrib]From the New York Times:[end-div]

For the French Fauvist painter and color gourmand Raoul Dufy, blue was the only color with enough strength of character to remain blue “in all its tones.” Darkened red looks brown and whitened red turns pink, Dufy said, while yellow blackens with shading and fades away in the light. But blue can be brightened or dimmed, the artist said, and “it will always stay blue.”

Scientists, too, have lately been bullish on blue, captivated by its optical purity, complexity and metaphorical fluency. They’re exploring the physics and chemistry of blueness in nature, the evolution of blue ornaments and blue come-ons, and the sheer brazenness of being blue when most earthly life forms opt for earthy raiments of beige, ruddy or taupe.

One research team recently reported the structural analysis of a small, dazzlingly blue fruit from the African Pollia condensata plant that may well be the brightest terrestrial object in nature. Another group working in the central Congo basin announced the discovery of a new species of monkey, a rare event in mammalogy. Rarer still is the most noteworthy trait of the monkey, called the lesula: a patch of brilliant blue skin on the male’s buttocks and scrotal area that stands out from the surrounding fur like neon underpants.

Still other researchers are tracing the history of blue pigments in human culture, and the role those pigments have played in shaping our notions of virtue, authority, divinity and social class. “Blue pigments played an outstanding role in human development,” said Heinz Berke, an emeritus professor of chemistry at the University of Zurich. For some cultures, he said, they were as valuable as gold.

As a raft of surveys has shown, blue love is a global affair. Ask people their favorite color, and in most parts of the world roughly half will say blue, a figure three to four times the support accorded common second-place finishers like purple or green. Just one in six Americans is blue-eyed, but nearly one in two considers blue the prettiest eye color, which could be why some 50 percent of tinted contact lenses sold are the kind that make your brown eyes blue.

Sick children like their caretakers in blue: A recent study at the Cleveland Clinic found that young patients preferred nurses wearing blue uniforms to those in white or yellow. And am I the only person in the United States who doesn’t own a single pair of those permanently popular pants formerly known as dungarees?

“For Americans, bluejeans have a special connotation because of their association with the Old West and rugged individualism,” said Steven Bleicher, author of “Contemporary Color: Theory and Use.” The jeans take their John Wayne reputation seriously. “Because the indigo dye fades during washing, everyone’s blue becomes uniquely different,” said Dr. Bleicher, a professor of visual arts at Coastal Carolina University. “They’re your bluejeans.”

According to psychologists who explore the complex interplay of color, mood and behavior, blue’s basic emotional valence is calmness and open-endedness, in contrast to the aggressive specificity associated with red. Blue is sea and sky, a pocket-size vacation.

In a study that appeared in the journal Perceptual & Motor Skills, researchers at Aichi University in Japan found that subjects who performed a lengthy video game exercise while sitting next to a blue partition reported feeling less fatigued and claustrophobic, and displayed a more regular heartbeat pattern, than did people who sat by red or yellow partitions.

In the journal Science, researchers at the University of British Columbia described their study of how computer screen color affected participants’ ability to solve either creative problems — for example, determining the word that best unifies the terms “shelf,” “read” and “end” (answer: book) — or detail-oriented tasks like copy editing. The researchers found that blue screens were superior to red or white backgrounds at enhancing creativity, while red screens worked best for accuracy tasks. Interestingly, when participants were asked to predict which screen color would improve performance on the two categories of problems, big majorities deemed blue the ideal desktop setting for both.

But skies have their limits, and blue can also imply coldness, sorrow and death. On learning of a good friend’s suicide in 1901, Pablo Picasso fell into a severe depression, and he began painting images of beggars, drunks, the poor and the halt, all famously rendered in a palette of blue.

The provenance of using “the blues” to mean sadness isn’t clear, but L. Elizabeth Crawford, a professor of psychology at the University of Richmond in Virginia, suggested that the association arose from the look of the body when it’s in a low energy, low oxygen state. “The lips turn blue, there’s a blue pallor to the complexion,” she said. “It’s the opposite of the warm flushing of the skin that we associate with love, kindness and affection.”

Blue is also known to suppress the appetite, possibly as an adaptation against eating rotten meat, which can have a bluish tinge. “If you’re on a diet, my advice is, take the white bulb out of the refrigerator and put in a blue one instead,” Dr. Bleicher said. “A blue glow makes food look very unappetizing.”

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Morpho didius, dorsal view of male butterfly. Courtesy of Wikipedia.[end-div]