All posts by Mike

Voyager: A Gift that Keeps on Giving

The little space probe that could — Voyager 1 — is close to leaving our solar system and entering the relative void of interstellar space. As it does so, from a distance of around 18.4 billion kilometers (today), it continues to send back signals of what it finds. And the surprises continue.

[div class=attrib]From ars technica:[end-div]

Several years ago the Voyager spacecraft neared the edge of the Solar System, where the solar wind and magnetic field started to be influenced by the pressure from the interstellar medium that surrounds them. But the expected breakthrough to interstellar space appeared to be indefinitely put on hold; instead, the particles and magnetic field lines in the area seemed to be sending mixed signals about the Voyagers’ escape. At today’s meeting of the American Geophysical Union, scientists offered an explanation: the durable spacecraft ran into a region that nobody predicted.

The Voyager probes were sent on a grand tour of the outer planets over 35 years ago. After a series of staggeringly successful visits to the planets, the probes shot out beyond the most distant of them toward the edges of the Solar System. Scientists expected that as they neared the edge, we’d see the charged particles of the solar wind changing direction as the interstellar medium alters the direction of the Sun’s magnetic field. But while some aspects of the Voyager’s environment have changed, we’ve not seen any clear indication that it has left the Solar System. The solar wind actually seems to be grinding to a halt.

Today’s announcement clarifies that the confusion was caused by the fact that nature didn’t think much of physicists’ expectations. Instead, there’s an additional region near our Solar System’s boundary that hadn’t been predicted.

Within the Solar System, the environment is dominated by the solar magnetic field and a flow of charged particles sent out by the Sun (called the solar wind). Interstellar space has its own flow of particles in the form of low-energy cosmic rays, which the Sun’s magnetic field deflects away from us. There’s also an interstellar magnetic field with field lines oriented in different directions to our Sun’s.

Researchers expected the Voyagers would reach a relatively clear boundary between the Solar System and interstellar space. The Sun’s magnetic field would first shift directions, then be left behind and the interstellar one would be detected. At the same time, we’d see the loss of the solar wind and start seeing the first low-energy cosmic rays.

As expected, a few years back, the Voyagers reached a region where the interstellar medium forced the Sun’s magnetic field lines to curve north. But the solar wind refused to follow suit. Instead of flowing north, the solar wind slowed to a halt while the cosmic rays were missing in action.

Over the summer, as Voyager 1 approached 122 astronomical units from the Sun, that started to change. Arik Posner of the Voyager team said that, starting in late July, Voyager 1 detected a sudden drop in the presence of particles from the solar wind, which went down by half. At the same time, the first low-energy cosmic rays filtered in. A few days later things returned to normal. A second drop occurred on August 15 and then, on August 28, things underwent a permanent shift. According to Tom Krimigis, particles originating from the Sun dropped by about 1,000-fold. Low-energy cosmic rays rose and stayed elevated.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Voyager II. Courtesy of NASA / JPL.[end-div]

National Emotions Mapped

Are Canadians as a people more emotional than Brazilians? Are Brits as emotional as Mexicans? While generalizing and mapping a nation’s emotionality is dubious at best, this map is nonetheless fascinating.

[div class=attrib]From the Washington Post:[end-div]

Since 2009, the Gallup polling firm has surveyed people in 150 countries and territories on, among other things, their daily emotional experience. Their survey asks five questions, meant to gauge whether the respondent felt significant positive or negative emotions the day prior to the survey. The more times that people answer “yes” to questions such as “Did you smile or laugh a lot yesterday?”, the more emotional they’re deemed to be.

Gallup has tallied up the average “yes” responses from respondents in almost every country on Earth. The results, which I’ve mapped out above, are as fascinating as they are indecipherable. The color-coded key in the map indicates the average percentage of people who answered “yes.” Dark purple countries are the most emotional, yellow the least. Here are a few takeaways.

Singapore is the least emotional country in the world. “Singaporeans recognize they have a problem,” Bloomberg Businessweek writes of the country’s “emotional deficit,” citing a culture in which schools “discourage students from thinking of themselves as individuals.” They also point to low work satisfaction, competitiveness, and the urban experience: “Staying emotionally neutral could be a way of coping with the stress of urban life in a place where 82 percent of the population lives in government-built housing.”

The Philippines is the world’s most emotional country. It’s not even close; the heavily Catholic, Southeast Asian nation, a former colony of Spain and the U.S., scores well above second-ranked El Salvador.

Post-Soviet countries are consistently among the most stoic. Other than Singapore (and, for some reason, Madagascar and Nepal), the least emotional countries in the world are all former members of the Soviet Union. They are also the greatest consumers of cigarettes and alcohol. This could be what you call a chicken-or-egg problem: if the two trends are related, which one came first? Europe appears almost like a gradient here, with emotions increasing as you move West.

People in the Americas are just exuberant. Every nation on the North and South American continents ranked highly on the survey. The United States and Canada are both among the 15 most emotional countries in the world, as are ten Latin American nations. The only countries in the top 15 from outside the Americas, other than the Philippines, are the Arab nations of Oman and Bahrain, both of which rank very highly.

[div class=attrib]Read the entire article following the jump.[end-div]

The Immortal Jellyfish

In 1988 a marine-biology student made a stunning discovery, though it was little publicized at the time. In the coral blooms of the Italian Mediterranean, Christian Sommer found a small creature that resembled a jellyfish. It showed a very odd attribute — it refused to die. The true importance of this discovery did not become fully apparent until 1996, when a group of researchers found that this invertebrate, a hydrozoan now known by its scientific name Turritopsis dohrnii, could at any point during its life cycle revert to an earlier stage and then begin its development all over again. It was to all intents and purposes immortal.

For scientists seeking to unravel the mechanisms that underlie the aging process, Turritopsis dohrnii — the immortal jellyfish — represents a truly significant finding. Might our progress in slowing or even halting aging in humans come from a lowly jellyfish? Time will tell.

[div class=attrib]From the New York Times:[end-div]

After more than 4,000 years — almost since the dawn of recorded time, when Utnapishtim told Gilgamesh that the secret to immortality lay in a coral found on the ocean floor — man finally discovered eternal life in 1988. He found it, in fact, on the ocean floor. The discovery was made unwittingly by Christian Sommer, a German marine-biology student in his early 20s. He was spending the summer in Rapallo, a small city on the Italian Riviera, where exactly one century earlier Friedrich Nietzsche conceived “Thus Spoke Zarathustra”: “Everything goes, everything comes back; eternally rolls the wheel of being. Everything dies, everything blossoms again. . . .”

Sommer was conducting research on hydrozoans, small invertebrates that, depending on their stage in the life cycle, resemble either a jellyfish or a soft coral. Every morning, Sommer went snorkeling in the turquoise water off the cliffs of Portofino. He scanned the ocean floor for hydrozoans, gathering them with plankton nets. Among the hundreds of organisms he collected was a tiny, relatively obscure species known to biologists as Turritopsis dohrnii. Today it is more commonly known as the immortal jellyfish.

Sommer kept his hydrozoans in petri dishes and observed their reproduction habits. After several days he noticed that his Turritopsis dohrnii was behaving in a very peculiar manner, for which he could hypothesize no earthly explanation. Plainly speaking, it refused to die. It appeared to age in reverse, growing younger and younger until it reached its earliest stage of development, at which point it began its life cycle anew.

Sommer was baffled by this development but didn’t immediately grasp its significance. (It was nearly a decade before the word “immortal” was first used to describe the species.) But several biologists in Genoa, fascinated by Sommer’s finding, continued to study the species, and in 1996 they published a paper called “Reversing the Life Cycle.” The scientists described how the species — at any stage of its development — could transform itself back to a polyp, the organism’s earliest stage of life, “thus escaping death and achieving potential immortality.” This finding appeared to debunk the most fundamental law of the natural world — you are born, and then you die.

One of the paper’s authors, Ferdinando Boero, likened the Turritopsis to a butterfly that, instead of dying, turns back into a caterpillar. Another metaphor is a chicken that transforms into an egg, which gives birth to another chicken. The anthropomorphic analogy is that of an old man who grows younger and younger until he is again a fetus. For this reason Turritopsis dohrnii is often referred to as the Benjamin Button jellyfish.

Yet the publication of “Reversing the Life Cycle” barely registered outside the academic world. You might expect that, having learned of the existence of immortal life, man would dedicate colossal resources to learning how the immortal jellyfish performs its trick. You might expect that biotech multinationals would vie to copyright its genome; that a vast coalition of research scientists would seek to determine the mechanisms by which its cells aged in reverse; that pharmaceutical firms would try to appropriate its lessons for the purposes of human medicine; that governments would broker international accords to govern the future use of rejuvenating technology. But none of this happened.

Some progress has been made, however, in the quarter-century since Christian Sommer’s discovery. We now know, for instance, that the rejuvenation of Turritopsis dohrnii and some other members of the genus is caused by environmental stress or physical assault. We know that, during rejuvenation, it undergoes cellular transdifferentiation, an unusual process by which one type of cell is converted into another — a skin cell into a nerve cell, for instance. (The same process occurs in human stem cells.) We also know that, in recent decades, the immortal jellyfish has rapidly spread throughout the world’s oceans in what Maria Pia Miglietta, a biology professor at Notre Dame, calls “a silent invasion.” The jellyfish has been “hitchhiking” on cargo ships that use seawater for ballast. Turritopsis has now been observed not only in the Mediterranean but also off the coasts of Panama, Spain, Florida and Japan. The jellyfish seems able to survive, and proliferate, in every ocean in the world. It is possible to imagine a distant future in which most other species of life are extinct but the ocean will consist overwhelmingly of immortal jellyfish, a great gelatin consciousness everlasting.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image of Turritopsis dohrnii, courtesy of Discovery News.[end-div]

Steam Without Boiling Water

Despite what seems to be an overwhelmingly digital shift in our lives, we still live in a world of steam. Steam plays a vital role in generating most of the world’s electricity, steam heats our buildings (especially if you live in New York City), steam sterilizes our medical supplies.

So, in a research discovery with far-reaching implications, scientists have succeeded in making steam at room temperature without actually boiling water. All courtesy of some ingenious nanoparticles.

[div class=attrib]From Technology Review:[end-div]

Steam is a key ingredient in a wide range of industrial and commercial processes—including electricity generation, water purification, alcohol distillation, and medical equipment sterilization.

Generating that steam, however, typically requires vast amounts of energy to heat and eventually boil water or another fluid. Now researchers at Rice University have found a shortcut. Using light-absorbing nanoparticles suspended in water, the group was able to turn the water molecules surrounding the nanoparticles into steam while scarcely raising the temperature of the remaining water. The trick could dramatically reduce the cost of many steam-reliant processes.

The Rice team used a Fresnel lens to focus sunlight on a small tube of water containing high concentrations of nanoparticles suspended in the fluid. The water, which had been cooled to near freezing, began generating steam within five to 20 seconds, depending on the type of nanoparticles used. Changes in temperature, pressure, and mass revealed that 82 percent of the sunlight absorbed by the nanoparticles went directly to generating steam while only 18 percent went to heating water.

“It’s a new way to make steam without boiling water,” says Naomi Halas, director of the Laboratory for Nanophotonics at Rice University. Halas says that the work “opens up a lot of interesting doors in terms of what you can use steam for.”

The new technique could, for instance, lead to inexpensive steam-generation devices for small-scale water purification, sterilization of medical instruments, and sewage treatment in developing countries with limited resources and infrastructure.

The use of nanoparticles to increase heat transfer in water and other fluids has been well studied, but few researchers have looked at using the particles to absorb light and generate steam.

In the current study, Halas and colleagues used nanoparticles optimized to absorb the widest possible spectrum of sunlight. When light hits the particles, their temperature quickly rises to well above 100 °C, the boiling point of water, causing surrounding water molecules to vaporize.

Precisely how the particles and water molecules interact remains somewhat of a mystery. Conventional heat-transfer models suggest that the absorbed sunlight should dissipate into the surrounding fluid before causing any water to boil. “There seems to be some nanoscale thermal barrier, because it’s clearly making steam like crazy,” Halas says.

The system devised by Halas and colleagues exhibited an efficiency of 24 percent in converting sunlight to steam.

Todd Otanicar, a mechanical engineer at the University of Tulsa who was not involved in the current study, says the findings could have significant implications for large-scale solar thermal energy generation. Solar thermal power stations typically use concentrated sunlight to heat a fluid such as oil, which is then used to heat water to generate steam. Otanicar estimates that by generating steam directly with nanoparticles in water, such a system could see an increased efficiency of 3 to 5 percent and a cost savings of 10 percent because a less complex design could be used.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Stott Park Bobbin Mill Steam Engine. Courtesy of Wikipedia.[end-div]

Sleep Myths

Chronobiologist Till Roenneberg debunks five commonly held beliefs about sleep. He is the author of “Internal Time: Chronotypes, Social Jet Lag, and Why You’re So Tired.”

[div class=attrib]From the Washington Post:[end-div]

If shopping on Black Friday leaves you exhausted, or if your holiday guests keep you up until the wee hours, a long Thanksgiving weekend should offer an opportunity for some serious shut-eye. We spend between a quarter and a third of our lives asleep, but that doesn’t make us experts on how much is too much, how little is too little, or how many hours of rest the kids need to be sharp in school. Let’s tackle some popular myths about Mr. Sandman.

1. You need eight hours of sleep per night.

That’s the cliche. Napoleon, for one, didn’t believe it. His prescription went something like this: “Six hours for a man, seven for a woman and eight for a fool.”

But Napoleon’s formula wasn’t right, either. The ideal amount of sleep is different for everyone and depends on many factors, including age and genetic makeup.

In the past 10 years, my research team has surveyed sleep behavior in more than 150,000 people. About 11 percent slept six hours or less, while only 27 percent clocked eight hours or more. The majority fell in between. Women tended to sleep longer than men, but only by 14 minutes.

Bigger differences are seen when comparing various age groups. Ten-year-olds needed about nine hours of sleep, while adults older than 30, including senior citizens, averaged about seven hours. We recently identified the first gene associated with sleep duration — if you have one variant of this gene, you need more sleep than if you have another.

2. Early to bed and early to rise makes a man healthy, wealthy and wise.

Benjamin Franklin’s proverbial praise of early risers made sense in the second half of the 18th century, when his peers were exposed to much more daylight and to very dark nights. Their body clocks were tightly synchronized to this day-night cycle. This changed as work gradually moved indoors, performed under the far weaker intensity of artificial light during the day and, if desired, all night long.

The timing of sleep — earlier or later — is controlled by our internal clocks, which determine what researchers call our optimal “sleep window.” With the widespread use of electric light, our body clocks have shifted later while the workday has essentially remained the same. We fall asleep according to our (late) body clock, and are awakened early for work by the alarm clock. We therefore suffer from chronic sleep deprivation, and then we try to compensate by sleeping in on free days. Many of us sleep more than an hour longer on weekends than we do on workdays.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of Google search.[end-div]

The Science (and Benefit) of Fasting

For thousands of years people have fasted to cleanse the body and the spirit. And, of course, many fast to lose (some) weight. Recently, a growing body of scientific research seems to suggest that fasting may slow the aging process.

[div class=attrib]From the New Scientist:[end-div]

THERE’S a fuzz in my brain and an ache in my gut. My legs are leaden and my eyesight is blurry. But I have only myself to blame. Besides, I have been assured that these symptoms will pass. Between 10 days and three weeks from now, my body will adjust to the new regime, which entails fasting for two days each week. In the meantime, I just need to keep my eyes on the prize. Forget breakfast and second breakfast, ignore the call of multiple afternoon snacks, because the pay-offs of doing without could be enormous.

Fasting is most commonly associated with religious observation. It is the fourth of the Five Pillars of Islam. Buddhists consider it a means to practise self-control and advocate abstaining from food after the noon meal. For some Christians, temporary fasts are seen as a way of getting closer to God. But the benefits I am hoping for are more corporeal.

The idea that fasting might be good for your health has a long, if questionable, history. Back in 1908, “Dr” Linda Hazzard, an American with some training as a nurse, published a book called Fasting for the Cure of Disease, which claimed that minimal food was the route to recovery from a variety of illnesses including cancer. Hazzard was jailed after one of her patients died of starvation. But what if she was, at least partly, right?

A new surge of interest in fasting suggests that it might indeed help people with cancer. It could also reduce the risk of developing cancer, guard against diabetes and heart disease, help control asthma and even stave off Parkinson’s disease and dementia. Many of the scientists who study fasting practise what they research, and they tell me that at my age (39) it could be vital that I start now. “We know from animal models,” says Mark Mattson at the US National Institute on Aging, “that if we start an intermittent fasting diet at what would be the equivalent of middle age in people, we can delay the onset of Alzheimer’s and Parkinson’s.” Surely it’s worth a try?

Until recently, most studies linking diet with health and longevity focused on calorie restriction. They have had some impressive results, with the lifespan of various lab animals lengthened by up to 50 per cent after their daily calorie intake was cut in half. But these effects do not seem to extend to primates. A 23-year-long study of macaques found that although calorie restriction delayed the onset of age-related diseases, it had no impact on lifespan. So other factors such as genetics may be more important for human longevity too (Nature, vol 489, p 318).

That’s bad news for anyone who has gone hungry for decades in the hope of living longer, but the finding has not deterred fasting researchers. They point out that although fasting obviously involves cutting calories – at least on the fast days – it brings about biochemical and physiological changes that daily dieting does not. Besides, calorie restriction may leave people susceptible to infections and biological stress, whereas fasting, done properly, should not. Some even argue that we are evolutionarily adapted to going without food intermittently. “The evidence is pretty strong that our ancestors did not eat three meals a day plus snacks,” says Mattson. “Our genes are geared to being able to cope with periods of no food.”

What’s in a fast?

As I sit here, hungry, it certainly doesn’t feel like that. But researchers do agree that fasting will leave you feeling crummy in the short term because it takes time for your body to break psychological and biological habits. Less reassuring is their lack of agreement on what fasting entails. I have opted for the “5:2” diet, which allows me 600 calories in a single meal on each of two weekly “fast” days. The normal recommended intake is about 2000 calories for a woman and 2500 for a man, and I am allowed to eat whatever I want on the five non-fast days, underlining the fact that fasting is not necessarily about losing weight. A more draconian regimen has similar restricted-calorie “fasts” every other day. Then there’s total fasting, in which participants go without food for anything from one to five days – longer than about a week is considered potentially dangerous. Fasting might be a one-off, or repeated weekly or monthly.

Different regimens have different effects on the body. A fast is considered to start about 10 to 12 hours after a meal, when you have used up all the available glucose in your blood and start converting glycogen stored in liver and muscle cells into glucose to use for energy. If the fast continues, there is a gradual move towards breaking down stored body fat, and the liver produces “ketone bodies” – short molecules that are by-products of the breakdown of fatty acids. These can be used by the brain as fuel. This process is in full swing three to four days into a fast. Various hormones are also affected. For example, production of insulin-like growth factor 1 (IGF-1) drops early and reaches very low levels by day three or four. It is similar in structure to insulin, which also becomes scarcer with fasting, and high levels of both have been linked to cancer.

[div class=attrib]Read the entire article following the jump.[end-div]

Safety and Paranoia Go Hand in Hand

Brooke Allen reviews a handy new tome for those who live in comfort and safety but who perceive threats large and small from all crevices and all angles. Paradoxically, most people in the West are safer than those of any previous generation, and yet they imagine existential threats ranging from viral pandemics to hemispheric mega-storms.

[div class=attrib]From WSJ:[end-div]

Never in our history have Americans been so fearful; never, objectively speaking, have we been so safe. Except for the bombing of Pearl Harbor and the destruction of the World Trade Center, war has not touched our shores in a century and a half. Despite relative decline, we are still militarily No. 1. We have antibiotics, polio vaccines, airbags; our children need no longer suffer even measles or chicken pox. So what are we all so frightened of?

In “Encyclopedia Paranoiaca,” Henry Beard and Christopher Cerf—in association, supposedly, with the staff of something called the Cassandra Institute—try to answer that question in some detail. The result is an amusing and cruelly accurate cultural critique, offering a “comprehensive and authoritative inventory of the perils, menaces, threats, blights, banes, and other assorted pieces of Damoclean cutlery” that hover over our collective head.

There’s the big stuff, of course: global warming and nuclear warfare, not to mention super-volcanoes and mega-tsunamis “capable of crossing entire oceans at jet-airplane speed and wreaking almost unimaginable damage.” The authors don’t even bother to list terror attacks or hurricanes, both high on the list of national obsessions after the events of recent years. But they do dwell on financial perils. “Investments, domestic” and “investments, overseas” are both listed as dangers, as are “gold, failure to invest in” and “gold, investing in.” Damned if you do, damned if you don’t—as with so many of life’s decisions.

Our understandable fear of outsize disasters is matched, oddly enough, by an equally paralyzing terror of the microscopic. American germophobia has only intensified in recent years, as we can see from the sudden ubiquity of hand sanitizers. Messrs. Beard and Cerf gleefully fan the flames of our paranoia. Toilets, flushing of: You’d do well to keep the seat down when engaging in this hazardous activity, because toilet water and all its contents are vaporized by the flushing action and settle upon everything in your bathroom—including your toothbrush. A lovely hot bath turns out to be, according to a scientist at NYU Medical Center, a foul stew of pathogens, with up to 100,000 bacteria per square inch. But showers are not much better—they distribute the scary Mycobacterium avium. And your kitchen is even yuckier than your bathroom! Dishwashers carry fungi on the rubber band in the door. Kitchen sinks: According to one scientist consulted by the authors, “if an alien came from space and studied bacteria counts in the typical home, he would probably conclude he should wash his hands in the toilet, and pee in your sink.” Sponges: Their “damp, porous environment serves as a perfect breeding ground in which the microbes can flourish and multiply until there are literally billions of them.” Cutting boards—let’s not even go there.

But don’t pull out the cleaning products too fast. Through a clever system of cross-referencing, the authors demonstrate that the cure is likely to be as harmful as the malady. Room air purifiers: “The ozone spewed out by these machines is more hazardous than any substances they may remove.” Antibacterial products: Their overuse is creating a new breed of “superbugs” resistant to the original agents and to antibiotics as well. Paper towels might be bad for the environment, but hand-drying machines are actually scary: In one study, “people who used a hot-air hand-drying machine to dry their hands had two to three times as many bacteria on their hands as they did before they washed them.”

And what about toxins? Some of the book’s entries might surprise you. You could probably guess that the popular Brazilian blowout hair-straightening treatment might contain stuff you wouldn’t want to breathe in (it does—formaldehyde), but what about the natural-stone kitchen countertops so beloved by design-conscious Bobos? Their granite emits “a continuous stream of radioactive radon gas.” And those compact fluorescent light bulbs touted by environmentalists? The average CFL bulb “contains enough mercury,” the authors tell us, “to contaminate as many as six thousand gallons of water to a point beyond safe drinking levels. The bulbs are harmless enough unless they break, but if they do, you and your family face the immediate danger of mercury poisoning.”

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Encyclopedia Paranoiaca book cover courtesy of Amazon.com.[end-div]

Testosterone and the Moon

While the United States military makes no comment, a number of corroborated reports suggest that the country had a plan to drop an atomic bomb on the moon during the height of the Cold War. Apparently, a Hiroshima-like explosion on our satellite would have been seen by the Soviets as a “show of force.” The sheer absurdity of this Dr. Strangelove story makes it all the more real.

[div class=attrib]From the Independent:[end-div]

US military chiefs, keen to intimidate Russia during the Cold War, plotted to blow up the moon with a nuclear bomb, according to project documents kept secret for nearly 45 years.

The army chiefs allegedly developed a top-secret project called ‘A Study of Lunar Research Flights’ – or ‘Project A119’ – in the hope that their Soviet rivals would be intimidated by a display of America’s Cold War muscle.

According to The Sun newspaper the military bosses developed a classified plan to launch a nuclear weapon 238,000 miles to the moon where it would be detonated upon impact.

The planners reportedly opted for an atom bomb, rather than a hydrogen bomb, because the latter would be too heavy for the missile.

Physicist Leonard Reiffel, who says he was involved in the project, claims the hope was that the flash from the bomb would intimidate the Russians following their successful launching of the Sputnik satellite in October 1957.

The planning of the explosion reportedly included calculations by astronomer Carl Sagan, who was then a young graduate.

Documents reportedly show the plan was abandoned because of fears it would have an adverse effect on Earth should the explosion fail.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of NASA.[end-div]

UX and the Untergunther: Underground (Literally) Art

Many cities around the globe are home to underground art movements — those whose participants eschew the strictures of modern-day gallery wine and cheese, curated exhibits, and formal public art shows. Paris has gone a step further — though deeper would be more correct — in providing a subterranean home for some truly underground art and the groups of dedicated, clandestine artists, hackers and art restorers.

Wired spent some quality time with a leading group of Parisian underground artists, known as UX, for Urban eXperiment. Follow Wired’s fascinating and lengthy article here.

[div class=attrib]From the BBC:[end-div]

The obsessively secretive members of an underground art collective have spent the last 30 years surreptitiously staging events in tunnels beneath Paris. They say they never ask permission – and never ask for subsidies.

We’re standing nervously on the pavement, trying not to feel self-conscious as we furtively scrutinise each passer-by.

After weeks of negotiation, we have a meeting with someone who says he is a member of the highly secretive French artists’ collective – UX, as they are known for short – outside a town hall in the south of Paris. It is late on a Sunday night but the street is still quite busy.

Finally I notice a young man dressed entirely in black apart from a red beret and a small rucksack on his back. He hovers for a moment and then motions us to follow him. Our destination is the catacombs, the tunnels that run beneath the pavements of Paris.

A few minutes later Tristan (not his real name) and two companions are pulling the heavy steel cover off a manhole. “Quick, quick,” he says, “before the police come.”

I stare down a seemingly endless black hole before stepping gingerly on to a rusty ladder and start to clamber down.

There are several more ladders after that before we finally reach the bottom. To my great relief, there are no rats – we go deeper than the rats ever do – but it is pitch black and very wet.

The water is ankle deep and my shoes are soaked through. “It’s fine, if you’re properly dressed,” laughs Tristan as he splashes ahead in his rubber boots.

Using the flashlight on my phone, we do our best to follow him. Along the way I notice some colourful graffiti and a painting of an evil looking cat.

After a few minutes, we reach a dry, open space with intricate carvings on the wall and it is here that we finally sit down to interrogate our mysterious companions.

Tristan explains that he gets a kick out of getting to places that are normally off-limits. He is a “cataphile” – somebody who loves to roam the catacombs of Paris.

UX are not the only people who go underground. There is a rap song about cataphiles, people who would rather don the rubber boots of a sewer worker (egoutier) than go clubbing in a normal night spot.

There have been a number of raves underground – some chambers are said to be big enough to hold 1,000 people.

The galleries are turned into makeshift night clubs, with a bar, lighting effects, and DJ turntables, using electricity diverted from the Parisian metro.

He also climbs on the roofs of churches. “You get a great view of the city, especially at night and it’s a cool place for a picnic,” he says.

Tristan, who is originally from Lyon, says his group is called the Lyonnaise des Os – a reference to the piles of bones (“os” is French for “bone”) in the catacombs – but also a pun on France’s famous water company, Lyonnaise des Eaux. He and his group spend their time exploring the tunnels and carving sculptures.

The UX are a loose collective of people from a variety of backgrounds. Not just artists but also engineers, civil servants, lawyers and even a state prosecutor. They divide into different groups depending on their interests.

The Untergunther specialise in clandestine acts of restoration of parts of France’s heritage which they believe the state has neglected. There is also an all-women group, nicknamed The Mouse House, who are experts at infiltration.

Another group, called La Mexicaine de Perforation, or The Mexican Consolidated Drilling Authority, stages arts events like film festivals underground. They once created an entire cinema under the Palais de Chaillot, by the Trocadero, with seats cut out of the rock.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Hacker-artists below Paris. Courtesy of Wired / UX.[end-div]

Antifragile

One of our favorite thinkers (and authors) here at theDiagonal is Nassim Taleb. His new work, entitled Antifragile, expands on ideas that he first described in his bestseller The Black Swan.

Given humanity’s need to find order and patterns in chaos, and its proclivity to seek causality where none exists, we’ll need several more books from him before his profound yet common-sense ideas sink in. In his latest work, Taleb shows how the improbable and unpredictable lie at the foundation of our universe.

[div class=attrib]From the Guardian:[end-div]

How much does Nassim Taleb dislike journalists? Let me count the ways. “An erudite is someone who displays less than he knows; a journalist or consultant the opposite.” “This business of journalism is about pure entertainment, not the search for the truth.” “Most so-called writers keep writing and writing with the hope, some day, to find something to say.” He disliked them before, but after he predicted the financial crash in his 2007 book, The Black Swan, a book that became a global bestseller, his antipathy reached new heights. He has dozens and dozens of quotes on the subject, and if that’s too obtuse for us non-erudites, his online home page puts it even plainer: “I beg journalists and members of the media to leave me alone.”

He’s not wildly keen on appointments either. In his new book, Antifragile, he writes that he never makes them because a date in the calendar “makes me feel like a prisoner”.

So imagine, if you will, how keenly he must be looking forward to the prospect of a pre-arranged appointment to meet me, a journalist. I approach our lunch meeting, at the Polytechnic Institute of New York University where he’s the “distinguished professor of risk engineering”, as one might approach a sleeping bear: gingerly. And with a certain degree of fear. And yet there he is, striding into the faculty lobby in a jacket and Steve Jobs turtleneck (“I want you to write down that I started wearing them before he did. I want that to be known.”), smiling and effusive.

First, though, he has to have his photo taken. He claims it’s the first time he’s allowed it in three years, and has allotted just 10 minutes for it, though in the end it’s more like five. “The last guy I had was a fucking dick. He wanted to be artsy fartsy,” he tells the photographer, Mike McGregor. “You’re OK.”

Being artsy fartsy, I will learn, is even lower down the scale of Nassim Taleb pet hates than journalists. But then, being contradictory about what one hates and despises and loves and admires is actually another key Nassim Taleb trait.

In print, the hating and despising is there for all to see: he’s forever having spats and fights. When he’s not slagging off the Nobel prize for economics (a “fraud”), bankers (“I have a physical allergy to them”) and the academic establishment (he has it in for something he calls the “Soviet-Harvard illusion”), he’s trading blows with Steven Pinker (“clueless”), and a random reviewer on Amazon, who he took to his Twitter stream to berate. And this is just in the last week.

And yet here he is, chatting away, surprisingly friendly and approachable. When I say as much as we walk to the restaurant, he asks, “What do you mean?”

“In your book, you’re quite…” and I struggle to find the right word, “grumpy”.

He shrugs. “When you write, you don’t have the social constraints of having people in front of you, so you talk about abstract matters.”

Social constraints, it turns out, have their uses. And he’s an excellent host. We go to his regular restaurant, a no-nonsense, Italian-run, canteen-like place, a few yards from his faculty in central Brooklyn, and he insists that I order a glass of wine.

“And what’ll you have?” asks the waitress.

“I’ll take a coffee,” he says.

“What?” I say. “No way! You can’t trick me into ordering a glass of wine and then have coffee.” It’s like flunking lesson #101 at interviewing school, though in the end he relents and has not one but two glasses and a plate of “pasta without pasta” (though strictly speaking you could call it “mixed vegetables and chicken”), and attacks the bread basket “because it doesn’t have any calories here in Brooklyn”.

But then, having read his latest book, I actually know an awful lot about his diet. How he doesn’t eat sugar, any fruits which “don’t have a Greek or Hebrew name” or any liquid which is less than 1,000 years old. Just as I know that he doesn’t like air-conditioning, soccer moms, sunscreen and copy editors. That he believes the “non-natural” has to prove its harmlessness. That America tranquillises its children with drugs and pathologises sadness. That he values honour above all things, banging on about it so much that at times he comes across as a medieval knight who’s got lost somewhere in the space-time continuum. And that several times a week he goes and lifts weights in a basement gym with a bunch of doormen.

He says that after the financial crisis he received “all manner of threats” and at one time was advised to “stock up on bodyguards”. Instead, “I found it more appealing to look like one”. Now, he writes, when he’s harassed by limo drivers in the arrival hall at JFK, “I calmly tell them to fuck off.”

Taleb started out as a trader, worked as a quantitative analyst and ran his own investment firm, but the more he studied statistics, the more he became convinced that the entire financial system was a keg of dynamite that was ready to blow. In The Black Swan he argued that modernity is too complex to understand, and “Black Swan” events – hitherto unknown and unpredicted shocks – will always occur.

What’s more, because of the complexity of the system, if one bank went down, they all would. The book sold 3m copies. And months later, of course, this was more or less exactly what happened. Overnight, he went from lone-voice-in-the-wilderness, spouting off-the-wall theories, to the great seer of the modern age.

Antifragile, the follow-up, is his most important work so far, he says. It takes the central idea of The Black Swan and expands it to encompass almost every other aspect of life, from the 19th century rise of the nation state to what to eat for breakfast (fresh air, as a general rule).

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Black Swan, the movie, not the book of the same name by Nassim Taleb. Courtesy of Wikipedia.[end-div]

Telomere Test: A Date With Death

In 1977, molecular biologists Elizabeth Blackburn and Joseph Gall discovered the structure of the end caps of chromosomes, known as telomeres. In 2009, Blackburn and colleagues Carol Greider and Jack Szostak shared the Nobel Prize in Physiology or Medicine for discovering telomerase, the enzyme responsible for replenishing telomeres.

It turns out that telomeres are rather important. Studies show that telomeres regulate cell division and, as a consequence, directly influence aging and life span. When a cell divides, the length of its chromosomal telomeres shortens. Once a telomere is depleted, its chromosome and DNA can no longer be replicated accurately, and the cell no longer divides, hastening cell death.
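As a rough illustration of this “internal clock” idea, here is a minimal toy sketch (in Python) of a cell line that loses a fixed amount of telomere with each division and stops dividing once the telomere is used up. The starting length and per-division loss below are made-up placeholder numbers, not measured values, and the model ignores telomerase entirely.

    # Toy model: telomere length as a countdown of cell divisions.
    # All numbers are illustrative placeholders, not measured biology.
    def divisions_until_senescence(telomere_bp, loss_per_division_bp):
        """Count divisions before the telomere is effectively used up."""
        divisions = 0
        while telomere_bp > loss_per_division_bp:
            telomere_bp -= loss_per_division_bp  # each division trims the telomere
            divisions += 1
        return divisions

    # Hypothetical cell line: 10,000 base pairs of telomere, losing 100 per division.
    print(divisions_until_senescence(10_000, 100))  # prints 99

In real cells the picture is complicated by telomerase, which can rebuild telomeres; that is the discovery recognized by the 2009 Nobel Prize mentioned above.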

[div class=attrib]From the Independent:[end-div]

A blood test to determine how fast someone is ageing has been shown to work on a population of wild birds, the first time the ageing test has been used successfully on animals living outside a laboratory setting.

The test measures the average length of tiny structures on the tips of chromosomes called telomeres which are known to get shorter each time a cell divides during an organism’s lifetime.

Telomeres are believed to act like internal clocks by providing a more accurate estimate of a person’s true biological age rather than their actual chronological age.

This has led some experts to suggest that telomere tests could be used to estimate not only how fast someone is ageing, but possibly how long they have left to live if they die of natural causes.

Telomere tests have been widely used on experimental animals and at least one company is offering a £400 blood test in the UK for people interested in seeing how fast they are ageing based on their average telomere length.

Now scientists have performed telomere tests on an isolated population of songbirds living on an island in the Seychelles and found that the test does indeed accurately predict an animal’s likely lifespan.

“We saw that telomere length is a better indicator of life expectancy than chronological age. So by measuring telomere length we have a way of estimating the biological age of an individual – how much of its life it has used up,” said David Richardson of the University of East Anglia.

The researchers tested the average telomere lengths of a population of 320 Seychelles Warblers living on the remote Cousin Island, which ornithologists have studied for 20 years, documenting the life history of each bird.

“Our results provide the first clear and unambiguous evidence of a relationship between telomere length and mortality in the wild, and substantiate the prediction that telomere length and shortening rate can act as an indicator of biological age further to chronological age,” says the study published in the journal Molecular Ecology.

Studying an island population of wild birds was important because there were no natural predators and little migration, meaning that the scientists could accurately study the link between telomere length and a bird’s natural lifespan.

“We wanted to understand what happens over an entire lifetime, so the Seychelles warbler is an ideal research subject. They are naturally confined to an isolated tropical island, without any predators, so we can follow individuals throughout their lives, right into old age,” Dr Richardson said.

“We investigated whether, at any given age, their telomere lengths could predict imminent death. We found that short and rapidly shortening telomeres were a good indication that the bird would die within a year,” he said.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Infographic courtesy of Independent.[end-div]

Lead a Congressional Committee on Science: No Grasp of Science Required

[div class=attrib]From ars technica:[end-div]

[div class=attrib]Image: The House Committee on Science, Space, and Technology hears testimony on climate change in March 2011.[end-div]

If you had the chance to ask questions of one of the world’s leading climatologists, would you select a set of topics that would be at home in the heated discussions that take place in the Ars forums? If you watch the video below, you’d find that’s precisely what Dana Rohrabacher (R-CA) chose to do when Penn State’s Richard Alley (a fellow Republican) was called before the House Science Committee, which has already had issues with its grasp of science. Rohrabacher took Alley on a tour of some of the least convincing arguments about climate change, all trying to convince him changes in the Sun were to blame for a changing climate. (Alley, for his part, noted that we have actually measured the Sun, and we’ve seen no such changes.)

Now, if he has his way, Rohrabacher will be chairing the committee once the next Congress is seated. Even if he doesn’t get the job, the alternatives aren’t much better.

There has been some good news for the Science Committee to come out of the last election. Representative Todd Akin (R-MO), whose lack of understanding of biology was made clear by his comments on “legitimate rape,” had to give up his seat to run for the Senate, a race he lost. Meanwhile, Paul Broun (R-GA), who said that evolution and cosmology are “lies straight from the pit of Hell,” won reelection, but he received a bit of a warning in the process: dead English naturalist Charles Darwin, who is ineligible to serve in Congress, managed to draw thousands of write-in votes. And, thanks to limits on chairmanships, Ralph Hall (R-TX), who accused climate scientists of being in it for the money (if so, they’re doing it wrong), will have to step down.

In addition to Rohrabacher, the other Representatives that are vying to lead the Committee are Wisconsin’s James Sensenbrenner and Texas’ Lamar Smith. They all suggest that they will focus on topics like NASA’s budget and the Department of Energy’s plans for future energy tech. But all of them have been embroiled in the controversy over climate change in the past.

In an interview with Science Insider about his candidacy, Rohrabacher engaged in a bit of triumphalism and suggested that his beliefs were winning out. “There were a lot of scientists who were just going along with the flow on the idea that mankind was causing a change in the world’s climate,” he said. “I think that after 10 years of debate, we can show that there are hundreds if not thousands of scientists who have come over to being skeptics, and I don’t know anyone [who was a skeptic] who became a believer in global warming.”

[div class=attrib]Read the entire article following the jump.[end-div]

The Rise of the Industrial Internet

As the internet that connects humans reaches a stable saturation point, the industrial internet — the network that connects things — continues to grow in scale and reach.

[div class=attrib]From the New York Times:[end-div]

When Sharoda Paul finished a postdoctoral fellowship last year at the Palo Alto Research Center, she did what most of her peers do — considered a job at a big Silicon Valley company, in her case, Google. But instead, Ms. Paul, a 31-year-old expert in social computing, went to work for General Electric.

Ms. Paul is one of more than 250 engineers recruited in the last year and a half to G.E.’s new software center here, in the East Bay of San Francisco. The company plans to increase that work force of computer scientists and software developers to 400, and to invest $1 billion in the center by 2015. The buildup is part of G.E.’s big bet on what it calls the “industrial Internet,” bringing digital intelligence to the physical world of industry as never before.

The concept of Internet-connected machines that collect data and communicate, often called the “Internet of Things,” has been around for years. Information technology companies, too, are pursuing this emerging field. I.B.M. has its “Smarter Planet” projects, while Cisco champions the “Internet of Everything.”

But G.E.’s effort, analysts say, shows that Internet-era technology is ready to sweep through the industrial economy much as the consumer Internet has transformed media, communications and advertising over the last decade.

In recent months, Ms. Paul has donned a hard hat and safety boots to study power plants. She has ridden on a rail locomotive and toured hospital wards. “Here, you get to work with things that touch people in so many ways,” she said. “That was a big draw.”

G.E. is the nation’s largest industrial company, a producer of aircraft engines, power plant turbines, rail locomotives and medical imaging equipment. It makes the heavy-duty machinery that transports people, heats homes and powers factories, and lets doctors diagnose life-threatening diseases.

G.E. resides in a different world from the consumer Internet. But the major technologies that animate Google and Facebook are also vital ingredients in the industrial Internet — tools from artificial intelligence, like machine-learning software, and vast streams of new data. In industry, the data flood comes mainly from smaller, more powerful and cheaper sensors on the equipment.

Smarter machines, for example, can alert their human handlers when they will need maintenance, before a breakdown. It is the equivalent of preventive and personalized care for equipment, with less downtime and more output.

“These technologies are really there now, in a way that is practical and economic,” said Mark M. Little, G.E.’s senior vice president for global research.

G.E.’s embrace of the industrial Internet is a long-term strategy. But if its optimism proves justified, the impact could be felt across the economy.

The outlook for technology-led economic growth is a subject of considerable debate. In a recent research paper, Robert J. Gordon, a prominent economist at Northwestern University, argues that the gains from computing and the Internet have petered out in the last eight years.

Since 2000, Mr. Gordon asserts, invention has focused mainly on consumer and communications technologies, including smartphones and tablet computers. Such devices, he writes, are “smaller, smarter and more capable, but do not fundamentally change labor productivity or the standard of living” in the way that electric lighting or the automobile did.

But others say such pessimism misses the next wave of technology. “The reason I think Bob Gordon is wrong is precisely because of the kind of thing G.E. is doing,” said Andrew McAfee, principal research scientist at M.I.T.’s Center for Digital Business.

Today, G.E. is putting sensors on everything, be it a gas turbine or a hospital bed. The mission of the engineers in San Ramon is to design the software for gathering data, and the clever algorithms for sifting through it for cost savings and productivity gains. Across the industries it covers, G.E. estimates such efficiency opportunities at as much as $150 billion.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Internet of Things. Courtesy of Intel.[end-div]

Startup Culture: New is the New New

Starting up a new business was once a demanding and complex process, often undertaken in anonymity in the long shadows between the hours of a regular job. It still is, of course. However, nowadays “the startup” has become more of an event. The tech sector has raised this to a fine art by spawning an entire self-sustaining and self-promoting industry around startups.

You’ll find startup gurus, serial entrepreneurs and digital prophets — yes, AOL has a digital prophet on its payroll — strutting around on stage, twittering tips in the digital world, leading business plan bootcamps, pontificating on accelerator panels, hosting incubator love-ins in coffee shops or splashed across the covers of Entrepreneur or Inc or FastCompany magazines on an almost daily basis. Beware! The back of your cereal box may be next.

[div class=attrib]From the Telegraph:[end-div]

I’ve seen the best minds of my generation destroyed by marketing, shilling for ad clicks, dragging themselves through the strip-lit corridors of convention centres looking for a venture capitalist. Just as X Factor has convinced hordes of tone deaf kids they can be pop stars, the startup industry has persuaded thousands that they can be the next rockstar entrepreneur. What’s worse is that while X Factor clogs up the television schedules for a couple of months, tech conferences have proliferated to such an extent that not a week goes by without another excuse to slope off. Some founders spend more time on panels pontificating about their business plans than actually executing them.

Earlier this year, I witnessed David Shing, AOL’s Digital Prophet – that really is his job title – delivering the opening remarks at a tech conference. The show summed up the worst elements of the self-obsessed, hyperactive world of modern tech. A 42-year-old man with a shock of Russell Brand hair, expensive spectacles and paint-splattered trousers, Shingy paced the stage spouting buzzwords: “Attention is the new currency, man…the new new is providing utility, brothers and sisters…speaking on the phone is completely cliche.” The audience lapped it all up. At these rallies in praise of the startup, enthusiasm and energy matter much more than making sense.

Startup culture is driven by slinging around superlatives – every job is an “incredible opportunity”, every product is going to “change lives” and “disrupt” an established industry. No one wants to admit that most startups stay stuck right there at the start, pub singers pining for their chance in the spotlight. While the startups and hangers-on milling around in the halls bring in stacks of cash for the event organisers, it’s the already successful entrepreneurs on stage and the investors who actually benefit from these conferences. They meet up at exclusive dinners and in the speakers’ lounge where the real deals are made. It’s Studio 54 for geeks.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Startup, WA. Courtesy of Wikipedia.[end-div]

Us: Perhaps It’s All Due to Gene miR-941

Geneticists have discovered a gene that helps explain how humans and apes diverged from their common ancestor around 6 million years ago.

[div class=attrib]From the Guardian:[end-div]

Researchers have discovered a new gene they say helps explain how humans evolved from chimpanzees.

The gene, called miR-941, appears to have played a crucial role in human brain development and could shed light on how we learned to use tools and language, according to scientists.

A team at the University of Edinburgh compared it to 11 other species of mammals, including chimpanzees, gorillas, mice and rats.

The results, published in Nature Communications, showed that the gene is unique to humans.

The team believe it emerged between one and six million years ago, after humans diverged from apes.

Researchers said it is the first time a new gene carried by humans and not by apes has been shown to have a specific function in the human body.

Martin Taylor, who led the study at the Institute of Genetics and Molecular Medicine at the University of Edinburgh, said: “As a species, humans are wonderfully inventive – we are socially and technologically evolving all the time.

“But this research shows that we are innovating at a genetic level too.

“This new molecule sprang from nowhere at a time when our species was undergoing dramatic changes: living longer, walking upright, learning how to use tools and how to communicate.

“We’re now hopeful that we will find more new genes that help show what makes us human.”

The gene is highly active in two areas of the brain, controlling decision-making and language abilities, with the study suggesting it could have a role in the advanced brain functions that make us human.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of ABCNews.[end-div]

Pluralistic Ignorance

Why study the science of climate change when you can study the complexities of climate change deniers themselves? That was the question that led several groups of independent researchers to study why some groups of people cling to mistaken beliefs and hold inaccurate views of the public consensus.

[div class=attrib]From ars technica:[end-div]

By just about every measure, the vast majority of scientists in general—and climate scientists in particular—have been convinced by the evidence that human activities are altering the climate. However, in several countries, a significant portion of the public has concluded that this consensus doesn’t exist. That has prompted a variety of studies aimed at understanding the large disconnect between scientists and the public, with results pointing the finger at everything from the economy to the weather. Other studies have noted societal influences on acceptance, including ideology and cultural identity.

Those studies have generally focused on the US population, but the public acceptance of climate change is fairly similar in Australia. There, a new study has looked at how societal tendencies can play a role in maintaining mistaken beliefs. The authors of the study have found evidence that two well-known behaviors—the “false consensus” and “pluralistic ignorance”—are helping to shape public opinion in Australia.

False consensus is the tendency of people to think that everyone else shares their opinions. This can arise from the fact that we tend to socialize with people who share our opinions, but the authors note that the effect is even stronger “when we hold opinions or beliefs that are unpopular, unpalatable, or that we are uncertain about.” In other words, our social habits tend to reinforce the belief that we’re part of a majority, and we have a tendency to cling to the sense that we’re not alone in our beliefs.

Pluralistic ignorance is similar, but it’s not focused on our own beliefs. Instead, sometimes the majority of people come to believe that most people think a certain way, even though the majority opinion actually resides elsewhere.

As it turns out, the authors found evidence of both these effects. They performed two identical surveys of over 5,000 Australians, done a year apart; about 1,350 people took the survey both times, which let the researchers track how opinions evolve. Participants were asked to describe their own opinion on climate change, with categories including “don’t know,” “not happening,” “a natural occurrence,” and “human-induced.” After voicing their own opinion, people were asked to estimate what percentage of the population would fall into each of these categories.

In aggregate, over 90 percent of those surveyed accepted that climate change was occurring (a rate much higher than we see in the US), with just over half accepting that humans were driving the change. Only about five percent felt it wasn’t happening, and even fewer said they didn’t know. The numbers changed only slightly between the two polls.

The false consensus effect became obvious when the researchers looked at what these people thought everyone else believed: every single group believed that their opinion represented the plurality view of the population. This was most dramatic among those who don’t think that the climate is changing; even though they represent far less than 10 percent of the population, they believed that over 40 percent of Australians shared their views. Those who profess ignorance also believed they had lots of company, estimating that their view was shared by a quarter of the populace.

Among those who took the survey twice, the effect became even more pronounced. In the year between the surveys, the respondents went from estimating that 30 percent of the population agreed with them to thinking that 45 percent did. And, in general, this group was the least likely to change its opinion between the two surveys.

But there was also evidence of pluralistic ignorance. Every single group grossly overestimated the number of people who were unsure about climate change or convinced it wasn’t occurring. Even those who were convinced that humans were changing the climate put 20 percent of Australians into each of these two groups.
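
The arithmetic behind the false-consensus measurement is simple enough to sketch in a few lines of Python. This is purely illustrative: the figures below are the approximate percentages quoted above rather than the study’s raw data, and the function is our own shorthand, not the researchers’ code.

```python
# A minimal illustrative sketch, not the study's actual code: a group's
# false-consensus gap is its average estimate of how widely its view is shared,
# minus its actual share of the population.

def false_consensus_gap(actual_share, estimated_share):
    """How far a group overestimates its own prevalence, in percentage points."""
    return estimated_share - actual_share

# Rough figures as reported above (percent of respondents).
groups = {
    "not happening": {"actual": 5, "estimated": 40},  # "far less than 10 percent"
    "don't know":    {"actual": 4, "estimated": 25},  # "even fewer"; about a quarter
}

for name, g in groups.items():
    gap = false_consensus_gap(g["actual"], g["estimated"])
    print(f"'{name}' group overestimates its own prevalence by roughly {gap} points")
```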

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Flood victims. Courtesy of NRDC.[end-div]

USANIT

Ever-present in Europe, nationalism continues to grow as austerity measures across the continent catalyze xenophobia. And now it’s spreading westwards across the Atlantic to the United States of America. Well, to be more precise, nationalistic fervor is spreading to Texas. Perhaps in our lifetimes we’ll have to contend with USANIT — the United States of America Not Including Texas. Seventy-seven thousand Texans, so far, want the Lone Star to fly once more over their nascent nation.

[div class=attrib]From the Guardian:[end-div]

Less than a week after Barack Obama was re-elected president, a slew of petitions have appeared on the White House’s We the People site, asking for states to be granted the right to peacefully withdraw from the union.

On Tuesday, all but one of the 33 states listed were far from reaching the 25,000 signature mark needed to get a response from the White House. Texas, however, had gained more than 77,000 online signatures in three days.

People from other states had signed the Texas petition. Another petition on the website was titled: “Deport everyone that signed a petition to withdraw their state from the United States of America.” It had 3,536 signatures.

The Texas petition reads:

Given that the state of Texas maintains a balanced budget and is the 15th largest economy in the world, it is practically feasible for Texas to withdraw from the union, and to do so would protect it’s citizens’ standard of living and re-secure their rights and liberties in accordance with the original ideas and beliefs of our founding fathers which are no longer being reflected by the federal government.

Activists across the country have advocated for independent statehood since the union was restored after the end of the Civil War in 1865. Texas has been host to some of the most fervent fights for independence.

Daniel Miller is the president of the Texas Nationalist Movement, which supports Texan independence and has its own online petition.

“We want to be able to govern ourselves without having some government a thousand-plus miles away that we have to go ask ‘mother may I’ to,” Miller said. “We want to protect our political, our cultural and our economic identities.”

Miller is not a fan of the word “secession”, because he views it as an over-generalization of what his group hopes to accomplish, but he encourages advocates for Texan independence to show their support when they can, including by signing the White House website petition.

“Given the political, cultural and economic pressures the United States is under, it’s not beyond the pale where one could envision the break up of the United States,” he said. “I don’t look at it as possibility, I look at it as an inevitability.”

Miller has been working for Texas independence for 16 years. He pointed to last week’s federal elections as evidence that state independence movements are gaining traction, citing the legalization of the sale of marijuana in Colorado and Washington in defiance of federal law.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]State Flag of Texas courtesy of Wikipedia.[end-div]

Socialism and Capitalism Share the Same Parent

Expanding on the work of Immanuel Kant in the late 18th century, German philosopher Georg Wilhelm Friedrich Hegel laid the foundations for what would later become two opposing political systems, socialism and free market capitalism. His comprehensive framework of Absolute Idealism influenced numerous philosophers and thinkers of all shades, including Karl Marx and Ralph Waldo Emerson. While many thinkers later rounded on Hegel’s world view as nothing but a thinly veiled attempt to justify totalitarianism in his own nation, there is no disputing the profound influence of his works on later thinkers from both the left and the right wings of the political spectrum.

[div class=attrib]From FairObserver:[end-div]

It is common knowledge that among developed western countries the two leading socioeconomic systems are socialism and capitalism. The former is often associated more closely with European systems of governance and the latter with the American free market economy. It is also generally known that these two systems are rooted in two fundamentally different assumptions about how a healthy society progresses. What is not as well known is that they both stem from the same philosophical roots, namely the evolutionary philosophy of Georg Wilhelm Friedrich Hegel.

Georg Wilhelm Friedrich Hegel was a leading figure in the movement known as German Idealism that had its beginnings in the late 18th century. That philosophical movement was initiated by another prominent German thinker, Immanuel Kant. Kant published “The Critique of Pure Reason” in 1781, offering a radical new way to understand how we as human beings get along in the world. Hegel expanded on Kant’s theory of knowledge by adding a theory of social and historical progress. Both socialism and capitalism were inspired by different, and to some extent opposing, interpretations of Hegel’s philosophical system.

Immanuel Kant recognized that human beings create their view of reality by incorporating new information into their previous understanding of reality using the laws of reason. As this integrative process unfolds we are compelled to maintain a coherent picture of what is real in order to operate effectively in the world. The coherent picture of reality that we maintain Kant called a necessary transcendental unity. It can be understood as the overarching picture of reality, or worldview, that helps us make sense of the world and against which we interpret and judge all new experiences and information.

Hegel realized that not only must individuals maintain a cohesive picture of reality, but societies and cultures must also maintain a collectively held and unified understanding of what is real. To use a gross example, it is not enough for me to know what a dollar bill is and what it is worth. If I am to be able to buy something with my money, then other people must agree on its value. Reality is not merely an individual event; it is a collective affair of shared agreement. Hegel further saw that the collective understanding of reality that is held in common by many human beings in any given society develops over the course of history. In his book “The Philosophy of History”, Hegel outlines his theory of how this development occurs. Karl Marx started with Hegel’s philosophy and then added his own profound insights – especially in regard to how oppression and class struggle drive the course of history.

Across the Atlantic in America, there was another thinker, Ralph Waldo Emerson, who was strongly influenced by German Idealism and especially the philosophy of Hegel. In the development of the American mind one cannot overstate the role that Emerson played as the pathfinder who marked trails of thought that continue to guide the current American worldview. His ideas became grooves in consciousness set so deeply in the American psyche that they are often simply experienced as truth. What excited Emerson about Hegel was his description of how reality emerged from a universal mind. Emerson similarly believed that what we as human beings experience as real has emerged through time from a universal source of intelligence. This distinctly Hegelian tone in Emerson can be heard clearly in this passage from his essay entitled “History”:

“There is one mind common to all individual men. Of the works of this mind history is the record. Man is explicable by nothing less than all his history. All the facts of history pre-exist as laws. Each law in turn is made by circumstances predominant. The creation of a thousand forests is in one acorn, and Egypt, Greece, Rome, Gaul, Britain, America, lie folded already in the first man. Epoch after epoch, camp, kingdom, empire, republic, democracy, are merely the application of this manifold spirit to the manifold world.”

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The portrait of G.W.F. Hegel (1770-1831); Steel engraving by Lazarus Sichling after a lithograph by Julius L. Sebbers. Courtesy of Wikipedia.[end-div]

Computers in the Movies

Most of us now carry around inside our smartphones more computing power than NASA once had in the Apollo command module. So, it’s interesting to look back at old movies to see how celluloid fiction portrayed computers. Most from the 1950s and 60s were replete with spinning tape drives and enough lights to resemble the Manhattan skyline. Our favorite here at theDiagonal is the first “Bat Computer” from the original 1960s TV series, which could be found churning away in Batman’s crime-fighting nerve center beneath Wayne Manor.

[div class=attrib]From Wired:[end-div]

The United States government powered up its SAGE defense system in July 1958, at an Air Force base near Trenton, New Jersey. Short for Semi-Automatic Ground Environment, SAGE would eventually span 24 command and control stations across the US and Canada, warning against potential air attacks via radar and an early IBM computer called the AN/FSQ-7.

“It automated air defense,” says Mike Loewen, who worked with SAGE while serving with the Air Force in the 1980s. “It used a versatile, programmable, digital computer to process all this incoming radar data from various sites around the region and display it in a format that made sense to people. It provided a computer display of the digitally processed radar information.”

Fronted by a wall of dials, switches, neon lights, and incandescent lamps — and often plugged into spinning tape drives stretching from floor to ceiling — the AN/FSQ-7 looked like one of those massive computing systems that turned up in Hollywood movies and prime time TV during the ’60s and the ’70s. This is mainly because it is one of those massive computing systems that turned up in Hollywood movies and TV during the ’60s and ’70s — over and over and over again. Think Lost In Space. Get Smart. Fantastic Voyage. In Like Flint. Or our personal favorite: The Towering Inferno.

That’s the AN/FSQ-7 in The Towering Inferno at the top of this page, operated by a man named OJ Simpson, trying to track a fire that’s threatening to bring down the world’s tallest building.

For decades, the AN/FSQ-7 — Q7 for short — helped define the image of a computer in the popular consciousness. Never mind that it was just a radar system originally backed by tens of thousands of vacuum tubes. For moviegoers everywhere, this was the sort of thing that automated myriad tasks not only in modern-day America but also in the distant future.

It never made much sense. But sometimes, it made even less sense. In the ’60s and ’70s, some films didn’t see the future all that clearly. Woody Allen’s Sleeper is set in 2173, and it shows the AN/FSQ-7 helping 22nd-century Teamsters make repairs to robotic man servants. Other films just didn’t see the present all that clearly. Independence Day was made in 1996, and apparently, its producers were unaware that the Air Force decommissioned SAGE 13 years earlier.

Of course, the Q7 is only part of the tale. The history of movies and TV is littered with big, beefy, photogenic machines that make absolutely no sense whatsoever. Sometimes they’re real machines doing unreal tasks. And sometimes they’re unreal machines doing unreal tasks. But we love them all. Oh so very much.

Mike Loewen first noticed the Q7 in a mid-’60s prime time TV series called The Time Tunnel. Produced by the irrepressible Irwin Allen, Time Tunnel concerned a secret government project to build a time machine beneath a trap door in the Arizona desert. A Q7 powered this subterranean time machine, complete with all those dials, switches, neon lights, and incandescent lamps.

No, an AN/FSQ-7 couldn’t really power a time machine. But time machines don’t exist. So it all works out quite nicely.

At first, Loewen didn’t know it was a Q7. But then, after he wound up in front of a SAGE system while in the Air Force many years later, it all came together. “I realized that these computer banks running the Time Tunnel were large sections of panels from the SAGE computer,” Loewen says. “And that’s where I got interested.”

He noticed the Q7 in TV show after TV show, movie after movie — and he started documenting these SAGE star turns on his personal homepage. In each case, the Q7 was seen doing stuff it couldn’t possibly do, but there was no doubt this was the Q7 — or at least part of it.

Here’s that subterranean time machine that caught the eye of Mike Loewen in The Time Tunnel (1966). The cool thing about the Time Tunnel AN/FSQ-7 is that even when it traps two government scientists in an endless time warp, it always sends them to dates of extremely important historical significance. Otherwise, you’d have one boring TV show on your hands.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The Time Tunnel (1966). Courtesy of Wired.[end-div]

The Most Annoying Technology? The Winner Is…

We have all owned, used, or come far too close to a technology that we absolutely abhor and wish numerous curses upon its inventors. Said gizmo may be the unfathomable VCR, the forever-lost TV remote, the tinny-sounding Sony Walkman replete with unraveling cassette tape, the Blackberry, or even Facebook.

Ours over here at theDiagonal is the voice recognition system used by 99 percent of so-called customer service organizations. You know how it goes, something like this: "please say 'one' for new accounts", "please say 'two' if you are an existing customer", "please say 'three' for returns", "please say 'Kyrgyzstan' to speak with a customer service representative".

Wired recently listed its least favorite, most hated technologies. No surprises here — winners of this dubious award include the Bluetooth headset, CD-ROM, and Apple TV remote.

[div class=attrib]From Wired:[end-div]

Bluetooth Headsets

Look, here’s a good rule of thumb: Once you get out of the car, or leave your desk, take off the headset. Nobody wants to hear your end of the conversation. That’s not idle speculation, it’s science! Headsets just make it worse. At least when there’s a phone involved, there are visual cues that say “I’m on the phone.” I mean, other than hearing one end of a shouted conversation.

Leaf Blower

Is your home set on a large wooded lot with acreage to spare between you and your closest neighbor? Did a tornado power through your yard last night, leaving your property covered in limbs and leaves? No? Then get a rake, dude. Leaf blowers are so irritating, they have been outlawed in some towns. Others should follow suit.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of the Sun/Mercury News.[end-div]

Charles Darwin Runs for Office

British voters may recall Screaming Lord Sutch, 3rd Earl of Harrow, of the Official Monster Raving Loony Party, who ran in over 40 parliamentary elections during the 1980s and 90s. He never won, but garnered a respectable number of votes and many fans (he was also a musician).

The United States followed a more dignified path in the 2012 elections, when Charles Darwin ran for a Congressional seat in Georgia. Darwin failed to win, but collected a respectable 4,000 votes. His opponent, Paul Broun, believes that the Earth “is but about 9,000 years old”. Interestingly, Representative Broun serves on the United States House Committee on Science, Space and Technology.

[div class=attrib]From Slate:[end-div]

Anti-evolution Congressman Paul Broun (R-Ga.) ran unopposed in Tuesday’s election, but nearly 4,000 voters wrote in Charles Darwin to protest their representative’s views. (Broun called evolution “lies straight from the pit of hell.”) Darwin fell more than 205,000 votes short of victory, but what would have happened if the father of evolution had out-polled Broun?

Broun still would have won. Georgia, like many other states, doesn’t count votes for write-in candidates who have not filed a notice of intent to stand for election. Even if the final tally had been reversed, with Charles Darwin winning 209,000 votes and Paul Broun 4,000, Broun would have kept his job.

That’s not to say dead candidates can’t win elections. It happens all the time, but only when the candidate dies after being placed on the ballot. In Tuesday’s election, Orange County, Fla., tax collector Earl Wood won more than 56 percent of the vote, even though he died in October at the age of 96 after holding the office for more than 40 years. Florida law allowed the Democratic Party, of which Wood was a member, to choose a candidate to receive Wood’s votes. In Alabama, Charles Beasley won a seat on the Bibb County Commission despite dying on Oct. 12. (Beasley’s opponent lamented the challenge of running a negative campaign against a dead man.) The governor will appoint a replacement.

[div class=attrib]Read the entire article after the jump.[end-div]

The Myth of Social Mobility

There is a commonly held myth in the United States that anyone can make it; that is, even if you’re at the bottom of the income distribution curve you have the opportunity to climb to a wealthier future. Independent research over the last couple of decades debunks this myth and paints a rather different and more disturbing picture. For instance, it shows that Americans are now less socially mobile — in the upward sense — than citizens of Canada and most countries in Europe.

[div class=attrib]From the Economist:[end-div]

THE HAMPTONS, a string of small towns on the south shore of Long Island, have long been a playground for America’s affluent. Nowadays the merely rich are being crimped by the ultra-wealthy. In August it can cost $400,000 to rent a fancy house there. The din of helicopters and private jets is omnipresent. The “Quiet Skies Coalition”, formed by a group of angry residents, protests against the noise, particularly of one billionaire’s military-size Chinook. “You can’t even play tennis,” moans an old-timer who stays near the East Hampton airport. “It’s like the third world war with GIV and GV jets.”

Thirty years ago, Loudoun County, just outside Washington, DC, in Northern Virginia, was a rural backwater with a rich history. During the war of 1812 federal documents were kept safe there from the English. Today it is the wealthiest county in America. Rolling pastures have given way to technology firms, swathes of companies that thrive on government contracts and pristine neighbourhoods with large houses. The average household income, at over $130,000, is twice the national level. The county also marks the western tip of the biggest cluster of affluence in the country. Between Loudoun County and north-west Washington, DC, there are over 800,000 people in exclusive postcodes that are home to the best-educated and wealthiest 5% of the population, dubbed “superzips” by Charles Murray, a libertarian social scientist.

[div class=attrib]Read the entire article following the jump.[end-div]

Hearing and Listening

Auditory neuroscientist Seth Horowitz guides us through the science of hearing and listening in his new book, “The Universal Sense: How Hearing Shapes the Mind.” He clarifies the important distinction between attentive listening with the mind and the more passive act of hearing, and laments the many modern distractions that threaten our ability to listen effectively.

[div class=attrib]From the New York Times:[end-div]

HERE’S a trick question. What do you hear right now?

If your home is like mine, you hear the humming sound of a printer, the low throbbing of traffic from the nearby highway and the clatter of plastic followed by the muffled impact of paws landing on linoleum — meaning that the cat has once again tried to open the catnip container atop the fridge and succeeded only in knocking it to the kitchen floor.

The slight trick in the question is that, by asking you what you were hearing, I prompted your brain to take control of the sensory experience — and made you listen rather than just hear. That, in effect, is what happens when an event jumps out of the background enough to be perceived consciously rather than just being part of your auditory surroundings. The difference between the sense of hearing and the skill of listening is attention.

Hearing is a vastly underrated sense. We tend to think of the world as a place that we see, interacting with things and people based on how they look. Studies have shown that conscious thought takes place at about the same rate as visual recognition, requiring a significant fraction of a second per event. But hearing is a quantitatively faster sense. While it might take you a full second to notice something out of the corner of your eye, turn your head toward it, recognize it and respond to it, the same reaction to a new or sudden sound happens at least 10 times as fast.

This is because hearing has evolved as our alarm system — it operates out of line of sight and works even while you are asleep. And because there is no place in the universe that is totally silent, your auditory system has evolved a complex and automatic “volume control,” fine-tuned by development and experience, to keep most sounds off your cognitive radar unless they might be of use as a signal that something dangerous or wonderful is somewhere within the kilometer or so that your ears can detect.

This is where attention kicks in.

Attention is not some monolithic brain process. There are different types of attention, and they use different parts of the brain. The sudden loud noise that makes you jump activates the simplest type: the startle. A chain of five neurons from your ears to your spine takes that noise and converts it into a defensive response in a mere tenth of a second — elevating your heart rate, hunching your shoulders and making you cast around to see if whatever you heard is going to pounce and eat you. This simplest form of attention requires almost no brains at all and has been observed in every studied vertebrate.

More complex attention kicks in when you hear your name called from across a room or hear an unexpected birdcall from inside a subway station. This stimulus-directed attention is controlled by pathways through the temporoparietal and inferior frontal cortex regions, mostly in the right hemisphere — areas that process the raw, sensory input, but don’t concern themselves with what you should make of that sound. (Neuroscientists call this a “bottom-up” response.)

But when you actually pay attention to something you’re listening to, whether it is your favorite song or the cat meowing at dinnertime, a separate “top-down” pathway comes into play. Here, the signals are conveyed through a dorsal pathway in your cortex, part of the brain that does more computation, which lets you actively focus on what you’re hearing and tune out sights and sounds that aren’t as immediately important.

In this case, your brain works like a set of noise-suppressing headphones, with the bottom-up pathways acting as a switch to interrupt if something more urgent — say, an airplane engine dropping through your bathroom ceiling — grabs your attention.

Hearing, in short, is easy. You and every other vertebrate that hasn’t suffered some genetic, developmental or environmental accident have been doing it for hundreds of millions of years. It’s your life line, your alarm system, your way to escape danger and pass on your genes. But listening, really listening, is hard when potential distractions are leaping into your ears every fifty-thousandth of a second — and pathways in your brain are just waiting to interrupt your focus to warn you of any potential dangers.

Listening is a skill that we’re in danger of losing in a world of digital distraction and information overload.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The Listener (TV series). Courtesy of Shaftsbury Films, CTV / Wikipedia.[end-div]

Big Data Versus Talking Heads

With the election in the United States now decided, the dissection of the result is well underway. And, perhaps the biggest winner of all is the science of big data. Yes, mathematical analysis of vast quantities of demographic and polling data won out over the voodoo proclamations and gut-feel predictions of the punditocracy. Now, that’s a result truly worth celebrating.

[div class=attrib]From ReadWriteWeb:[end-div]

Political pundits, mostly Republican, went into a frenzy when Nate Silver, a New York Times pollster and stats blogger, predicted that Barack Obama would win reelection.

But Silver was right and the pundits were wrong – and the impact of this goes way beyond politics.

Silver won because, um, science. As ReadWrite’s own Dan Rowinski noted, Silver’s methodology is all based on data. He “takes deep data sets and applies logical analytical methods” to them. It’s all just numbers.

Silver runs a blog called FiveThirtyEight, which is licensed by the Times. In 2008 he called the presidential election with incredible accuracy, getting 49 out of 50 states right. But this year he rolled a perfect score, 50 out of 50, even nailing the margins in many cases. His uncanny accuracy on this year’s election represents what Rowinski calls a victory of “logic over punditry.”
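
For readers curious what “applying logical analytical methods to deep data sets” can look like in practice, here is a minimal sketch of poll aggregation in Python. It is emphatically not Silver’s model: the polls are hypothetical, the uncertainty estimate is crude, and the simulation is only meant to show how averaged polls turn into a win probability.

```python
# A toy poll-aggregation sketch, not Nate Silver's model. It averages hypothetical
# poll margins for one state and simulates outcomes to turn the lead and its
# spread into a rough win probability.

import random
import statistics

def win_probability(poll_margins, n_sims=10_000):
    """Estimate the chance of winning a state from poll margins (percentage points)."""
    mean = statistics.mean(poll_margins)
    # Use the poll-to-poll spread, with a floor, as a crude stand-in for uncertainty.
    spread = max(statistics.pstdev(poll_margins), 2.0)
    wins = sum(1 for _ in range(n_sims) if random.gauss(mean, spread) > 0)
    return wins / n_sims

# Hypothetical polls: the candidate leads by 1, 3, and 2 points.
print(round(win_probability([1.0, 3.0, 2.0]), 2))
```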

In fact it’s bigger than that. Bear in mind that before turning his attention to politics in 2007 and 2008, Silver was using computer models to make predictions about baseball. What does it mean when some punk kid baseball nerd can just wade into politics and start kicking butt on all these long-time “experts” who have spent their entire lives covering politics?

It means something big is happening.

Man Versus Machine

This is about the triumph of machines and software over gut instinct.

The age of voodoo is over. The era of talking about something as a “dark art” is done. In a world with big computers and big data, there are no dark arts.

And thank God for that. One by one, computers and the people who know how to use them are knocking off these crazy notions about gut instinct and intuition that humans like to cling to. For far too long we’ve applied this kind of fuzzy thinking to everything, from silly stuff like sports to important stuff like medicine.

Someday, and I hope it’s soon, we will enter the age of intelligent machines, when true artificial intelligence becomes a reality, and when we look back on the late 20th and early 21st century it will seem medieval in its simplicity and reliance on superstition.

What most amazes me is the backlash and freak-out that occurs every time some “dark art” gets knocked over in a particular domain. Watch Moneyball (or read the book) and you’ll see the old guard (in that case, baseball scouts) grow furious as they realize that computers can do their job better than they can. (Of course it’s not computers; it’s people who know how to use computers.)

We saw the same thing when IBM’s Deep Blue defeated Garry Kasparov in 1997. We saw it when Watson beat humans at Jeopardy.

It’s happening in advertising, which used to be a dark art but is increasingly a computer-driven numbers game. It’s also happening in my business, the news media, prompting the same kind of furor as happened with the baseball scouts in Moneyball.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Political pundits, Left to right: Mark Halperin, David Brooks, Jon Stewart, Tim Russert, Matt Drudge, John Harris & Jim VandeHei, Rush Limbaugh, Sean Hannity, Chris Matthews, Karl Rove. Courtesy of Telegraph.[end-div]

Dragons of the Mind

[div class=attrib]From the Wall Street Journal:[end-div]

Peter Jackson’s “Hobbit” movie is on its way, and with it will come the resurrection of the vile dragon Smaug. With fiery breath, razor-sharp claws, scales as hard as shields and a vast underground lair, Smaug is portrayed in J.R.R. Tolkien’s text as a merciless killer. But where did the idea for such a bizarre beast—with such an odd mixture of traits—come from in the first place?

Historically, most monsters were spawned not from pure imagination but from aspects of the natural world that our ancestors did not understand. Whales seen breaking the surface of the ancient oceans were sea monsters, the fossilized bones of prehistoric humans were the victims of Medusa, the roars of earthquakes were thought to emanate from underground beasts. The list goes on. But tracing Smaug’s draconic heritage is more complicated.

At first glance, dinosaurs seem the obvious source for the dragon myth. Our ancestors simply ran into Tyrannosaur skulls, became terrified and came up with the idea that such monsters must still be around. It all sounds so logical, but it’s unlikely to be true.

Dragon myths were alive and well in the ancient Mediterranean world, despite the fact that the region is entirely bereft of dinosaur fossils. The Assyrians had Tiamat, a giant snake with horns (leading some to dispute whether it even qualifies as a dragon). The Greeks, for their part, had a fierce reptilian beast that guarded the golden fleece. In depicting it, they oscillated between a tiny viper and a huge snake capable of swallowing people whole. But even in this latter case, there was no fire-breathing or underground hoard, just a big reptile.

For decades, zoologists have argued that the only snakes humans ever had to seriously worry about were of the venomous variety. Last year, however, a study published in the Proceedings of the National Academy of Sciences revealed that members of Indonesian tribes are regularly eaten by enormous constrictors and that this was likely a common problem throughout human evolution. Moreover, reports by Pliny the Elder and others describe snakes of such size existing in the ancient Mediterranean world and sometimes attacking people. It seems likely that the early dragon myths were based on these real reptilian threats.

But Tolkien’s Smaug lives below the Lonely Mountain and breathes fire. Some reptiles live below ground, but none breathes anything that looks remotely like flame. Yet as strange as this trait may seem, it too may have natural origins.

Among the earliest mythical dragons that lived underground are those found in the 12th-century tales of Geoffrey of Monmouth. Monmouth recounts the story of Vortigern, an ancient British king who was forced to flee to the hills of Wales as Saxons invaded. Desperate to make a final stand, Vortigern orders a fortress to be built, but its walls keep falling over. Baffled, Vortigern seeks the advice of his wise men, who tell him that the ground must be consecrated with the blood of a child who is not born from the union between a man and a woman. Vortigern agrees and sends the wise men off to find such a child.

Not far away, in the town of Carmarthen, they come across two boys fighting. One insults the other as a bastard who has no father, and the wise men drag him back to Vortigern.

When the boy learns that he is to be killed, he tells Vortigern that his advisers have got things wrong. He declares that there are dragons below the ground and that their wrestling with one another is what keeps the walls from standing. Vortigern tests the boy’s theory out, and sure enough, as his men dig deeper, they discover the dragons’ “panting” flames.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Zmey Gorynych, the Russian three-headed dragon. Courtesy of Wikipedia.[end-div]

Black and White or Color

Please forget Instagram, Photoshop filters, red-eye elimination, automatic camera-shake reduction systems and high dynamic range apps. If you’re a true photographer or simply a lover of great photography, the choice is much simpler: black and white or color.

A new photography exhibit in London pits these contrasting media against each other for you to decide. The great Henri Cartier-Bresson would have you believe that black and white images live in a class of their own, far above the lowly form of color snaps. He was vociferous in his opinion — that for technical and aesthetic reasons only black and white photography could be considered art.

So, the curators of the exhibition — Cartier-Bresson: A Question of Colour — have juxtaposed 10 of Cartier-Bresson’s prints alongside the colorful works of 15 international contemporary photographers. The results show that “the decisive moment”, so integral to Cartier-Bresson’s memorable black and white images, can be adapted to great, if not equal, effect in color.

The exhibit can be seen at Somerset House, London and runs from 8 November 2012 to 27 January 2013.

[div class=attrib]From Somerset House:[end-div]

Positive View Foundation announces its inaugural exhibition Cartier-Bresson: A Question of Colour, to be held at Somerset House, 8 November 2012 – 27 January 2013. Curated by William A. Ewing, the exhibition will feature 10 Henri Cartier-Bresson photographs never before exhibited in the UK alongside over 75 works by 15 international contemporary photographers, including: Karl Baden (US), Carolyn Drake (US), Melanie Einzig (US), Andy Freeberg (US), Harry Gruyaert (Belgium), Ernst Haas (Austrian), Fred Herzog (Canadian), Saul Leiter (US), Helen Levitt (US), Jeff Mermelstein (US), Joel Meyerowitz (US), Trent Parke (Australian), Boris Savelev (Ukranian), Robert Walker (Canadian), and Alex Webb (US).

The extensive showcase will illustrate how photographers working in Europe and North America adopted and adapted the master’s ethos famously known as ‘the decisive moment’ to their work in colour. Though they often departed from the concept in significant ways, something of that challenge remained: how to seize something that happens and capture it in the very moment that it takes place.

It is well-known that Cartier-Bresson was disparaging towards colour photography, which in the 1950s was in its early years of development, and his reasoning was based both on the technical and aesthetic limitations of the medium at the time.

Curator William A. Ewing has conceived the exhibition in terms of, as he puts it, ‘challenge and response’. “This exhibition will show how Henri Cartier-Bresson, in spite of his skeptical attitude regarding the artistic value of colour photography, nevertheless exerted a powerful influence over photographers who took up the new medium and who were determined to put a personal stamp on it. In effect, his criticisms of colour spurred on a new generation, determined to overcome the obstacles and prove him wrong. A Question of Colour simultaneously pays homage to a master who felt that black and white photography was the ideal medium, and could not be bettered, and to a group of photographers of the 20th and 21st centuries who chose the path of colour and made, and continue to make, great strides.”

Cartier-Bresson: A Question of Colour will feature a selection of photographers whose commitment to expression in colour was – or is – wholehearted and highly sophisticated, and which measured up to Cartier-Bresson’s essential requirement that content and form were in perfect balance. Some of these artists were Cartier-Bresson’s contemporaries, like Helen Levitt, or even, as with Ernst Haas, his friends; others, such as Fred Herzog in Vancouver, knew the artist’s seminal work across vast distances; others were junior colleagues, such as Harry Gruyaert, who found himself debating colour ferociously with the master; and others still, like Andy Freeberg or Carolyn Drake, never knew the man first-hand, but were deeply influenced by his example.

[div class=attrib]Find out more about the exhibit here.[end-div]

[div class=attrib]Image: Henri Cartier-Bresson. Courtesy of Wikipedia.[end-div]

How We Die (In Britain)

This handy infographic is compiled from data published by the Office for National Statistics in the United Kingdom. So, if you live in the British Isles this will give you an inkling of your likely cause of death. Interestingly, if you live in the United States you are more likely to die of a gunshot wound than a Briton is to die from falling from a building.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Infographic courtesy of the Guardian.[end-div]

The Military-Industrial Complex

[tube]8y06NSBBRtY[/tube]

In his op-ed, author Aaron B. O’Connell reminds us of Eisenhower’s prescient warning to the nation about the growing power of the military-industrial complex in national affairs.

[div class=attrib]From the New York Times:[end-div]

IN 1961, President Dwight D. Eisenhower left office warning of the growing power of the military-industrial complex in American life. Most people know the term the president popularized, but few remember his argument.

In his farewell address, Eisenhower called for a better equilibrium between military and domestic affairs in our economy, politics and culture. He worried that the defense industry’s search for profits would warp foreign policy and, conversely, that too much state control of the private sector would cause economic stagnation. He warned that unending preparations for war were incongruous with the nation’s history. He cautioned that war and warmaking took up too large a proportion of national life, with grave ramifications for our spiritual health.

The military-industrial complex has not emerged in quite the way Eisenhower envisioned. The United States spends an enormous sum on defense — over $700 billion last year, about half of all military spending in the world — but in terms of our total economy, it has steadily declined to less than 5 percent of gross domestic product from 14 percent in 1953. Defense-related research has not produced an ossified garrison state; in fact, it has yielded a host of beneficial technologies, from the Internet to civilian nuclear power to GPS navigation. The United States has an enormous armaments industry, but it has not hampered employment and economic growth. In fact, Congress’s favorite argument against reducing defense spending is the job loss such cuts would entail.

Nor has the private sector infected foreign policy in the way that Eisenhower warned. Foreign policy has become increasingly reliant on military solutions since World War II, but we are a long way from the Marines’ repeated occupations of Haiti, Nicaragua and the Dominican Republic in the early 20th century, when commercial interests influenced military action. Of all the criticisms of the 2003 Iraq war, the idea that it was done to somehow magically decrease the cost of oil is the least credible. Though it’s true that mercenaries and contractors have exploited the wars of the past decade, hard decisions about the use of military force are made today much as they were in Eisenhower’s day: by the president, advised by the Joint Chiefs of Staff and the National Security Council, and then more or less rubber-stamped by Congress. Corporations do not get a vote, at least not yet.

But Eisenhower’s least heeded warning — concerning the spiritual effects of permanent preparations for war — is more important now than ever. Our culture has militarized considerably since Eisenhower’s era, and civilians, not the armed services, have been the principal cause. From lawmakers’ constant use of “support our troops” to justify defense spending, to TV programs and video games like “NCIS,” “Homeland” and “Call of Duty,” to NBC’s shameful and unreal reality show “Stars Earn Stripes,” Americans are subjected to a daily diet of stories that valorize the military while the storytellers pursue their own opportunistic political and commercial agendas. Of course, veterans should be thanked for serving their country, as should police officers, emergency workers and teachers. But no institution — particularly one financed by the taxpayers — should be immune from thoughtful criticism.

[div class=attrib]Read the entire article after the jump.[end-div]

Prodigies and the Rest of Us

[div class=attrib]From the New York Times:[end-div]

Drew Petersen didn’t speak until he was 3½, but his mother, Sue, never believed he was slow. When he was 18 months old, in 1994, she was reading to him and skipped a word, whereupon Drew reached over and pointed to the missing word on the page. Drew didn’t produce much sound at that stage, but he already cared about it deeply. “Church bells would elicit a big response,” Sue told me. “Birdsong would stop him in his tracks.”

Sue, who learned piano as a child, taught Drew the basics on an old upright, and he became fascinated by sheet music. “He needed to decode it,” Sue said. “So I had to recall what little I remembered, which was the treble clef.” As Drew told me, “It was like learning 13 letters of the alphabet and then trying to read books.” He figured out the bass clef on his own, and when he began formal lessons at 5, his teacher said he could skip the first six months’ worth of material. Within the year, Drew was performing Beethoven sonatas at the recital hall at Carnegie Hall. “I thought it was delightful,” Sue said, “but I also thought we shouldn’t take it too seriously. He was just a little boy.”

On his way to kindergarten one day, Drew asked his mother, “Can I just stay home so I can learn something?” Sue was at a loss. “He was reading textbooks this big, and they’re in class holding up a blowup M,” she said. Drew, who is now 18, said: “At first, it felt lonely. Then you accept that, yes, you’re different from everyone else, but people will be your friends anyway.” Drew’s parents moved him to a private school. They bought him a new piano, because he announced at 7 that their upright lacked dynamic contrast. “It cost more money than we’d ever paid for anything except a down payment on a house,” Sue said. When Drew was 14, he discovered a home-school program created by Harvard; when I met him two years ago, he was 16, studying at the Manhattan School of Music and halfway to a Harvard bachelor’s degree.

Prodigies are able to function at an advanced adult level in some domain before age 12. “Prodigy” derives from the Latin “prodigium,” a monster that violates the natural order. These children have differences so evident as to resemble a birth defect, and it was in that context that I came to investigate them. Having spent 10 years researching a book about children whose experiences differ radically from those of their parents and the world around them, I found that stigmatized differences — having Down syndrome, autism or deafness; being a dwarf or being transgender — are often clouds with silver linings. Families grappling with these apparent problems may find profound meaning, even beauty, in them. Prodigiousness, conversely, looks from a distance like silver, but it comes with banks of clouds; genius can be as bewildering and hazardous as a disability. Despite the past century’s breakthroughs in psychology and neuroscience, prodigiousness and genius are as little understood as autism. “Genius is an abnormality, and can signal other abnormalities,” says Veda Kaplinsky of Juilliard, perhaps the world’s pre-eminent teacher of young pianists. “Many gifted kids have A.D.D. or O.C.D. or Asperger’s. When the parents are confronted with two sides of a kid, they’re so quick to acknowledge the positive, the talented, the exceptional; they are often in denial over everything else.”

We live in ambitious times. You need only to go through the New York preschool application process, as I recently did for my son, to witness the hysteria attached to early achievement, the widespread presumption that a child’s destiny hinges on getting a baby foot on a tall ladder. Parental obsessiveness on this front reflects the hegemony of developmental psychiatry, with its insistence that first experience is formative. We now know that brain plasticity diminishes over time; it is easier to mold a child than to reform an adult. What are we to do with this information? I would hate for my children to feel that their worth is contingent on sustaining competitive advantage, but I’d also hate for them to fall short of their potential. Tiger mothers who browbeat their children into submission overemphasize a narrow category of achievement over psychic health. Attachment parenting, conversely, often sacrifices accomplishment to an ideal of unboundaried acceptance that can be equally pernicious. It’s tempting to propose some universal answer, but spending time with families of remarkably talented children showed me that what works for one child can be disastrous for another.

Children who are pushed toward success and succeed have a very different trajectory from that of children who are pushed toward success and fail. I once told Lang Lang, a prodigy par excellence and now perhaps the most famous pianist in the world, that by American standards, his father’s brutal methods — which included telling him to commit suicide, refusing any praise, browbeating him into abject submission — would count as child abuse. “If my father had pressured me like this and I had not done well, it would have been child abuse, and I would be traumatized, maybe destroyed,” Lang responded. “He could have been less extreme, and we probably would have made it to the same place; you don’t have to sacrifice everything to be a musician. But we had the same goal. So since all the pressure helped me become a world-famous star musician, which I love being, I would say that, for me, it was in the end a wonderful way to grow up.”

While it is true that some parents push their kids too hard and give them breakdowns, others fail to support a child’s passion for his own gift and deprive him of the only life that he would have enjoyed. You can err in either direction. Given that there is no consensus about how to raise ordinary children, it is not surprising that there is none about how to raise remarkable children. Like parents of children who are severely challenged, parents of exceptionally talented children are custodians of young people beyond their comprehension.

Spending time with the Petersens, I was struck not only by their mutual devotion but also by the easy way they avoided the snobberies that tend to cling to classical music. Sue is a school nurse; her husband, Joe, works in the engineering department of Volkswagen. They never expected the life into which Drew has led them, but they have neither been intimidated by it nor brash in pursuing it; it remains both a diligence and an art. “How do you describe a normal family?” Joe said. “The only way I can describe a normal one is a happy one. What my kids do brings a lot of joy into this household.” When I asked Sue how Drew’s talent had affected how they reared his younger brother, Erik, she said: “It’s distracting and different. It would be similar if Erik’s brother had a disability or a wooden leg.”

Prodigiousness manifests most often in athletics, mathematics, chess and music. A child may have a brain that processes chess moves or mathematical equations like some dream computer, which is its own mystery, but how can the mature emotional insight that is necessary to musicianship emerge from someone who is immature? “Young people like romance stories and war stories and good-and-evil stories and old movies because their emotional life mostly is and should be fantasy,” says Ken Noda, a great piano prodigy in his day who gave up public performance and now works at the Metropolitan Opera. “They put that fantasized emotion into their playing, and it is very convincing. I had an amazing capacity for imagining these feelings, and that’s part of what talent is. But it dries up, in everyone. That’s why so many prodigies have midlife crises in their late teens or early 20s. If our imagination is not replenished with experience, the ability to reproduce these feelings in one’s playing gradually diminishes.”

Musicians often talked to me about whether you achieve brilliance on the violin by practicing for hours every day or by reading Shakespeare, learning physics and falling in love. “Maturity, in music and in life, has to be earned by living,” the violinist Yehudi Menuhin once said. Who opens up or blocks access to such living? A musical prodigy’s development hinges on parental collaboration. Without that support, the child would never gain access to an instrument, the technical training that even the most devout genius requires or the emotional nurturance that enables a musician to achieve mature expression. As David Henry Feldman and Lynn T. Goldsmith, scholars in the field, have said, “A prodigy is a group enterprise.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Portrait of Wolfgang Amadeus Mozart aged six years old, by anonymous. Courtesy of Wikipedia.[end-div]