All posts by Mike

What’s Next at the LHC: Parallel Universe?

The Large Hadron Collider (LHC) at CERN made headlines in 2012 with the announcement of a probable discovery of the Higgs boson. Scientists are collecting and analyzing more data before they declare an outright discovery in 2013. In the meantime, they plan to use the giant machine to examine even more interesting science — at very small and very large scales — in the new year.

[div class=attrib]From the Guardian:[end-div]

When it comes to shutting down the most powerful atom smasher ever built, it’s not simply a question of pressing the off switch.

In the French-Swiss countryside on the far side of Geneva, staff at the Cern particle physics laboratory are taking steps to wind down the Large Hadron Collider. After the latest run of experiments ends next month, the huge superconducting magnets that line the LHC’s 27km-long tunnel must be warmed up, slowly and gently, from -271 Celsius to room temperature. Only then can engineers descend into the tunnel to begin their work.

The machine that last year helped scientists snare the elusive Higgs boson – or a convincing subatomic impostor – faces a two-year shutdown while engineers perform repairs that are needed for the collider to ramp up to its maximum energy in 2015 and beyond. The work will beef up electrical connections in the machine that were identified as weak spots after an incident four years ago that knocked the collider out for more than a year.

The accident happened days after the LHC was first switched on in September 2008, when a short circuit blew a hole in the machine and sprayed six tonnes of helium into the tunnel that houses the collider. Soot was scattered over 700 metres. Since then, the machine has been forced to run at near half its design energy to avoid another disaster.

The particle accelerator, which reveals new physics at work by crashing together the innards of atoms at close to the speed of light, fills a circular, subterranean tunnel a staggering eight kilometres in diameter. Physicists will not sit around idle while the collider is down. There is far more to know about the new Higgs-like particle, and clues to its identity are probably hidden in the piles of raw data the scientists have already gathered, but have had too little time to analyse.

But the LHC was always more than a Higgs hunting machine. There are other mysteries of the universe that it may shed light on. What is the dark matter that clumps invisibly around galaxies? Why are we made of matter, and not antimatter? And why is gravity such a weak force in nature? “We’re only a tiny way into the LHC programme,” says Pippa Wells, a physicist who works on the LHC’s 7,000-tonne Atlas detector. “There’s a long way to go yet.”

The hunt for the Higgs boson, which helps explain the masses of other particles, dominated the publicity around the LHC for the simple reason that it was almost certainly there to be found. The lab fast-tracked the search for the particle, but cannot say for sure whether it has found it, or some more exotic entity.

“The headline discovery was just the start,” says Wells. “We need to make more precise measurements, to refine the particle’s mass and understand better how it is produced, and the ways it decays into other particles.” Scientists at Cern expect to have a more complete identikit of the new particle by March, when repair work on the LHC begins in earnest.

By its very nature, dark matter will be tough to find, even when the LHC switches back on at higher energy. The label “dark” refers to the fact that the substance neither emits nor reflects light. The only way dark matter has revealed itself so far is through the pull it exerts on galaxies.

Studies of spinning galaxies show they rotate with such speed that they would tear themselves apart were there not some invisible form of matter holding them together through gravity. There is so much dark matter, it outweighs by five times the normal matter in the observable universe.

The search for dark matter on Earth has failed to reveal what it is made of, but the LHC may be able to make the substance. If the particles that constitute it are light enough, they could be thrown out from the collisions inside the LHC. While they would zip through the collider’s detectors unseen, they would carry energy and momentum with them. Scientists could then infer their creation by totting up the energy and momentum of all the particles produced in a collision, and looking for signs of the missing energy and momentum.
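
For readers who want to see the bookkeeping, here is a minimal sketch, in Python, of the missing-momentum argument described above. The event, the particle values and the threshold are invented purely for illustration; real LHC analyses are vastly more sophisticated.

[code]
import math

def missing_transverse_momentum(particles):
    """Vector-sum the transverse momenta (px, py) of all visible particles.

    Momentum in the plane transverse to the beam should balance out; a large
    leftover imbalance hints that something invisible carried momentum away.
    """
    sum_px = sum(p["px"] for p in particles)
    sum_py = sum(p["py"] for p in particles)
    return math.hypot(sum_px, sum_py)

# A made-up collision event (momenta in GeV/c).
event = [
    {"px": 40.0, "py": 12.0},    # a jet
    {"px": -25.0, "py": 5.0},    # another jet
    {"px": -3.0, "py": -80.0},   # a lepton
]

met = missing_transverse_momentum(event)
THRESHOLD = 30.0  # arbitrary cut, for this sketch only
print(f"Missing transverse momentum: {met:.1f} GeV/c")
if met > THRESHOLD:
    print("Large imbalance: candidate for production of invisible (e.g. dark matter) particles")
[/code]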

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The eight toroidal magnets can be seen on the huge ATLAS detector with the calorimeter before it is moved into the middle of the detector. This calorimeter will measure the energies of particles produced when protons collide in the centre of the detector. ATLAS will work alongside the CMS experiment to search for new physics at the 14 TeV level. Courtesy of CERN.[end-div]

From 7 Up to 56 Up

[tube]ngSGIjwwc4U[/tube]

The classic documentary and social experiment continues with the release this week of “56 Up”. Michael Apted began this remarkable process with a documentary called “7 Up” in 1964. It followed the lives of 14 British children aged 7, from different socio-economic backgrounds. Although the 7 Up documentary was initially planned to be a one-off, subsequent installments followed in seven-year cycles. Each time, Apted would bring us up to date with the lives of his growing subjects. Now, they are all turning 56 years old. Fifty-six years on, the personal stories are poignant and powerful, yet class divisions remain.

[div class=attrib]From the Telegraph:[end-div]

Life rushes by so fast, it flickers today and is gone tomorrow. In “56 Up” — the latest installment in Michael Apted’s remarkable documentary project that has followed a group of Britons since 1964, starting when they were 7 — entire lifetimes race by with a few edits. One minute, a boy is merrily bobbing along. The next, he is 56 years old, with a wife or an ex, a few children or none, a career, a job or just dim prospects. Rolls of fat girdle his middle and thicken his jowls. He has regrets, but their sting has usually softened, along with everything else.

In a lot of documentaries you might not care that much about this boy and what became of him. But if you have watched any of the previous episodes in Mr. Apted’s series, you will care, and deeply, partly because you watched that boy grow up, suffer and triumph in a project that began as a news gimmick and social experiment and turned into a plangent human drama. Conceived as a one-off for a current-affairs program on Granada Television, the first film, “Seven Up!,” was a 40-minute look at the lives of 14 children from different backgrounds. Britain was changing, or so went the conventional wisdom, with postwar affluence having led the working class to adopt middle-class attitudes and lifestyles.

In 1963, though, the sociologists John H. Goldthorpe and David Lockwood disputed this widely held “embourgeoisement thesis,” arguing that the erosion of social class had not been as great as believed. In its deeply personal fashion, the “Up” series went on to make much the same point by checking in with many of the same boys and girls, men and women, every seven years. Despite some dropouts, the group has remained surprisingly intact. For better and sometimes worse, and even with their complaints about the series, participants like Tony Walker, who wanted to be a jockey and found his place as a cabby, have become cyclical celebrities. For longtime viewers they have become something more, including mirrors.

It’s this mirroring that helps make the series so poignant. As in the earlier movies, Mr. Apted again folds in older material from the ages of 7, 14 and so on, to set the scene and jog memories. The abrupt juxtapositions of epochs can be jarring, unnerving or touching — sometimes all three — as bright-faced children bloom and sometimes fade within seconds. An analogous project in print or even still photographs wouldn’t be as powerful, because what gives the “Up” series its punch is not so much its longevity or the human spectacle it offers, but that these are moving images of touchingly vibrant lives at certain moments in time and space. The more you watch, the more the movies transform from mirrors into memory machines, ones that inevitably summon reflections of your own life.

Save for “Seven Up!,” filmed in gorgeous black and white, the documentaries are aesthetically unremarkable. Shot in digital, “56 Up” pretty much plays like the earlier movies, with its mix of interviews and location shooting. Every so often you hear someone off screen, presumably Mr. Apted, make a comment, though mostly he lets his choice of what to show — the subjects at work or play, with family or friends — and his editing do his editorializing. In the past he has brought participants together, but he doesn’t here, which feels like a missed opportunity. Have the three childhood friends from the East End of London, Jackie Bassett, Lynn Johnson and Sue Sullivan, two of whom have recently endured heart-rendingly bad times, remained in contact? Mr. Apted doesn’t say.

With few exceptions and despite potential path-changing milestones like marriages and careers, everyone seems to have remained fairly locked in his or her original social class. At 7, Andrew Brackfield and John Brisby already knew which universities they would or should attend. “We think,” John said in “Seven Up!,” “I’m going to Cambridge and Trinity Hall,” though he landed at Oxford. Like Mr. Brackfield, who did attend Cambridge, Mr. Brisby became a lawyer and still sounds to the manner born, with an accent that evokes old-fashioned news readers and Bond villains. The two hold instructively different views about whether the series corroborates the first film’s thesis about the rigidity of the British class structure, never mind that their lives are strong evidence that little has changed.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Video: 7 Up – Part 1. Courtesy of World in Action, Granada TV.[end-div]

Plagiarism is the Sincerest Form of Capitalism

Plagiarism is a fine art in China. But it’s also very big business. The nation knocks off everything, from Hollywood and Bollywood movies to software, electronics, appliances, drugs, and military equipment. Now, it’s moved on to copying architectural plans.

[div class=attrib]From the Telegraph:[end-div]

China is famous for its copy-cat architecture: you can find replicas of everything from the Eiffel Tower and the White House to an Austrian village across its vast land. But now they have gone one step further: recreating a building that hasn’t even been finished yet. A building designed by the Iraqi-British architect Dame Zaha Hadid for Beijing has been copied by a developer in Chongqing, south-west China, and now the two projects are racing to be completed first.

Dame Zaha, whose Wangjing Soho complex consists of three pebble-like constructions and will house an office and retail complex, unveiled her designs in August 2011 and hopes to complete the project next year.

Meanwhile, a remarkably similar project called Meiquan 22nd Century is being constructed in Chongqing, which experts (and anyone with eyes, really) deem a rip-off. The developers of the Soho complex are concerned that the other is being built at a much faster rate than their own.

“It is possible that the Chongqing pirates got hold of some digital files or renderings of the project,” Satoshi Ohashi, project director at Zaha Hadid Architects, told Der Spiegel online. “[From these] you could work out a similar building if you are technically very capable, but this would only be a rough simulation of the architecture.”

So where does the law stand? Reporting on the intriguing case, China Intellectual Property magazine commented, “Up to now, there is no special law in China which has specific provisions on IP rights related to architecture.” They added that if it went to court, the likely outcome would be payment of compensation to Dame Zaha’s firm, rather than the defendant being forced to pull the building down. However, Dame Zaha seems somewhat unfazed about the structure, simply remarking that if the finished building contains a certain amount of innovation then “that could be quite exciting”. One of the world’s most celebrated architects, Dame Zaha – who recently designed the Aquatics Centre for the London Olympics – has 11 current projects in China. She is quite the star over there: 15,000 fans flocked to see her give a talk at the unveiling of the designs for the complex.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Wangjing Soho Architecture. Courtesy of Zaha Hadid Architects.[end-div]

You Are Different From Yourself

The next time your spouse tells you that you’re “just not the same person anymore” there may be some truth to it. After all, we are not who we thought we would become, nor are we likely to become what we think. That’s the overall result of a recent study of human personality changes in around 20,000 people over time.

[div class=attrib]From the Independent:[end-div]

When we remember our past selves, they seem quite different. We know how much our personalities and tastes have changed over the years. But when we look ahead, somehow we expect ourselves to stay the same, a team of psychologists said Thursday, describing research they conducted of people’s self-perceptions.

They called this phenomenon the “end of history illusion,” in which people tend to “underestimate how much they will change in the future.” According to their research, which involved more than 19,000 people ages 18 to 68, the illusion persists from teenage years into retirement.

“Middle-aged people — like me — often look back on our teenage selves with some mixture of amusement and chagrin,” said one of the authors, Daniel T. Gilbert, a psychologist at Harvard. “What we never seem to realize is that our future selves will look back and think the very same thing about us. At every age we think we’re having the last laugh, and at every age we’re wrong.”

Other psychologists said they were intrigued by the findings, published Thursday in the journal Science, and were impressed with the amount of supporting evidence. Participants were asked about their personality traits and preferences — their favorite foods, vacations, hobbies and bands — in years past and present, and then asked to make predictions for the future. Not surprisingly, the younger people in the study reported more change in the previous decade than did the older respondents.

But when asked to predict what their personalities and tastes would be like in 10 years, people of all ages consistently played down the potential changes ahead.

Thus, the typical 20-year-old woman’s predictions for her next decade were not nearly as radical as the typical 30-year-old woman’s recollection of how much she had changed in her 20s. This sort of discrepancy persisted among respondents all the way into their 60s.

And the discrepancy did not seem to be because of faulty memories, because the personality changes recalled by people jibed quite well with independent research charting how personality traits shift with age. People seemed to be much better at recalling their former selves than at imagining how much they would change in the future.

Why? Dr. Gilbert and his collaborators, Jordi Quoidbach of Harvard and Timothy D. Wilson of the University of Virginia, had a few theories, starting with the well-documented tendency of people to overestimate their own wonderfulness.

“Believing that we just reached the peak of our personal evolution makes us feel good,” Dr. Quoidbach said. “The ‘I wish that I knew then what I know now’ experience might give us a sense of satisfaction and meaning, whereas realizing how transient our preferences and values are might lead us to doubt every decision and generate anxiety.”

Or maybe the explanation has more to do with mental energy: predicting the future requires more work than simply recalling the past. “People may confuse the difficulty of imagining personal change with the unlikelihood of change itself,” the authors wrote in Science.

The phenomenon does have its downsides, the authors said. For instance, people make decisions in their youth — about getting a tattoo, say, or a choice of spouse — that they sometimes come to regret.

And that illusion of stability could lead to dubious financial expectations, as the researchers showed in an experiment asking people how much they would pay to see their favorite bands.

When asked about their favorite band from a decade ago, respondents were typically willing to shell out $80 to attend a concert of the band today. But when they were asked about their current favorite band and how much they would be willing to spend to see the band’s concert in 10 years, the price went up to $129. Even though they realized that favorites from a decade ago like Creed or the Dixie Chicks have lost some of their luster, they apparently expect Coldplay and Rihanna to blaze on forever.

“The end-of-history effect may represent a failure in personal imagination,” said Dan P. McAdams, a psychologist at Northwestern who has done separate research into the stories people construct about their past and future lives. He has often heard people tell complex, dynamic stories about the past but then make vague, prosaic projections of a future in which things stay pretty much the same.

[div class=attrib]Read the entire article after the jump.[end-div]

Planets From Stardust

Stunning images captured by the Atacama Large Millimetre/submillimetre Array (ALMA) radio telescope in Chile show the early stages of a planet forming from stardust around a star located 450 light-years from Earth. This is the first time that astronomers have snapped such a clear picture of the process, confirming long-held theories of planetary formation.

[div class=attrib]From the Independent:[end-div]

The world’s highest radio telescope, built on a Chilean plateau in the Andes 5,000 metres above sea level, has captured the first image of a new planet being formed as it gobbles up the cosmic dust and gas surrounding a distant star.

Astronomers have long predicted that giant “gas” planets similar to Jupiter would form by collecting the dust and debris that forms around a young star. Now they have the first visual evidence to support the phenomenon, scientists said.

The image taken by the Atacama Large Millimetre/submillimetre Array (ALMA) in Chile shows two streams of gas connecting the inner and outer disks of cosmic material surrounding the star HD 142527, which is about 450 light-years from Earth.

Astronomers believe the gas streamers are the result of two giant planets – too small to be visible in this image – exerting a gravitational pull on the cloud of surrounding dust and gas, causing the material to flow from the outer to inner stellar disks, said Simon Casassus of the University of Chile in Santiago.

“The most natural interpretation for the flows seen by ALMA is that the putative proto-planets are pulling streams of gas inward towards them that are channelled by their gravity. Much of the gas then overshoots the planets and continues inward to the portion of the disk close to the star, where it can eventually fall onto the star itself,” Dr Casassus said.

“Astronomers have been predicting that these streams exist, but this is the first time we’ve been able to see them directly. Thanks to the new ALMA telescope, we’ve been able to get direct observations to illuminate current theories of how planets are formed,” he said.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Observations (left) made with the ALMA telescope of the young star HD 142527. The dust in the outer disc is shown in red. Dense gas in the streams flowing across the gap, as well as in the outer disc, is shown in green. Diffuse gas in the central gap is shown in blue. The gas filaments can be seen at the three o’clock and ten o’clock positions, flowing from the outer disc towards the centre. And (right) an artist’s impression. Courtesy of Independent.[end-div]

Curiosity’s 10K Hike

Scientists and engineers at JPL have Mount Sharp in their sights. It’s no ordinary mountain — it’s situated on Mars. The 5,000-meter-high mountain is home to exposed layers of some promising sedimentary rocks, which hold clues to Mars’ geologic, and perhaps biological, history. Unfortunately, Mount Sharp is 10 km away from the current home of the Curiosity rover. So, at a top speed of around 100 meters per day, it will take Curiosity until the fall of 2013 to reach its destination.

[div class=attrib]From the New Scientist:[end-div]

NASA’S Curiosity rover is about to have its cake and eat it too. Around September, the rover should get its first taste of layered sediments at Aeolis Mons, a mountain over 5 kilometres tall that may hold preserved signs of life on Mars.

Previous rovers uncovered ample evidence of ancient water, a key ingredient for life as we know it. With its sophisticated on-board chemistry lab, Curiosity is hunting for more robust signs of habitability, including organic compounds – the carbon-based building blocks of life as we know it.

Observations from orbit show that the layers in Aeolis Mons – also called Mount Sharp – contain minerals thought to have formed in the presence of water. That fits with theories that the rover’s landing site, Gale crater, was once a large lake. Even better, the layers were probably laid down quickly enough that the rocks could have held on to traces of microorganisms, if they existed there.

If the search for organics turns up empty, Aeolis Mons may hold other clues to habitability, says project scientist John Grotzinger of the California Institute of Technology in Pasadena. The layers will reveal which minerals and chemical processes were present in Mars’s past. “We’re going to find all kinds of good stuff down there, I’m sure,” he says.

Curiosity will explore a region called Glenelg until early February, and then hit the gas. The base of the mountain is 10 kilometres away, and the rover can drive at about 100 metres a day at full speed. The journey should take between six and nine months, but will include stops to check out any interesting landmarks. After all, some of the most exciting discoveries from Mars rovers were a result of serendipity.
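
Those quoted figures invite a quick back-of-envelope check. The sketch below is a rough calculation rather than anything from mission planning: the 10-kilometre distance and 100-metres-per-day pace come from the article, while the assumed fraction of days actually spent driving is ours, and shows how stops for science stretch a 100-day drive into the quoted six to nine months.

[code]
# Figures from the article: ~10 km to the base of Aeolis Mons,
# and a best pace of roughly 100 metres per day.
distance_m = 10_000
top_speed_m_per_day = 100

pure_driving_days = distance_m / top_speed_m_per_day
print(f"Non-stop driving: about {pure_driving_days:.0f} days (roughly 3.3 months)")

# Assumption (ours, for illustration only): the rover drives on just a
# fraction of days, pausing for science at interesting landmarks.
for driving_fraction in (0.55, 0.40):
    days = pure_driving_days / driving_fraction
    print(f"Driving on {driving_fraction:.0%} of days: ~{days:.0f} days (~{days/30:.1f} months)")
[/code]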

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Base of Mount Sharp, Mars. Courtesy of NASA/JPL-Caltech/MSSS.[end-div]

Evolution and Autocatalysis

A clever idea about the process of emergence from mathematicians at the University of Vermont has some evolutionary biologists thinking.

[div class=attrib]From MIT Technology Review:[end-div]

One of the most puzzling questions about the origin of life is how the rich chemical landscape that makes life possible came into existence.

This landscape would have consisted among other things of amino acids, proteins and complex RNA molecules. What’s more, these molecules must have been part of a rich network of interrelated chemical reactions which generated them in a reliable way.

Clearly, all that must have happened before life itself emerged. But how?

One idea is that groups of molecules can form autocatalytic sets. These are self-sustaining chemical factories, in which the product of one reaction is the feedstock or catalyst for another. The result is a virtuous, self-contained cycle of chemical creation.
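
To make the idea concrete, here is a minimal sketch in Python, loosely in the spirit of the RAF (“reflexively autocatalytic and food-generated”) algorithm of Hordijk and Steel rather than anything from the paper itself. The toy molecules and reactions are invented; the point is the pruning logic that leaves behind only a self-sustaining core.

[code]
# Food molecules: raw materials assumed to be always available.
food = {"a", "b"}

# Each reaction turns reactants into products and needs a catalyst molecule.
reactions = {
    "r1": {"in": {"a", "b"},  "out": {"ab"},  "cat": "aab"},
    "r2": {"in": {"ab", "a"}, "out": {"aab"}, "cat": "ab"},
    "r3": {"in": {"ab", "b"}, "out": {"abb"}, "cat": "xyz"},  # catalyst never produced
}

def closure(food, rxns):
    """All molecules reachable from the food set via the given reactions."""
    produced = set(food)
    changed = True
    while changed:
        changed = False
        for r in rxns.values():
            if r["in"] <= produced and not r["out"] <= produced:
                produced |= r["out"]
                changed = True
    return produced

def self_sustaining_core(food, reactions):
    """Prune reactions until every survivor has producible reactants and a
    producible catalyst; the survivors form an autocatalytic (RAF-like) set."""
    rxns = dict(reactions)
    while True:
        producible = closure(food, rxns)
        keep = {name: r for name, r in rxns.items()
                if r["in"] <= producible and r["cat"] in producible}
        if keep.keys() == rxns.keys():
            return keep
        rxns = keep

print(sorted(self_sustaining_core(food, reactions)))  # -> ['r1', 'r2']
[/code]

Reactions r1 and r2 survive because each one's product catalyses the other; r3 drops out because nothing in the network ever produces its catalyst.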

Today, Stuart Kauffman at the University of Vermont in Burlington and a couple of pals take a look at the broader mathematical properties of autocatalytic sets. In examining this bigger picture, they come to an astonishing conclusion that could have remarkable consequences for our understanding of complexity, evolution and the phenomenon of emergence.

They begin by deriving some general mathematical properties of autocatalytic sets, showing that such a set can be made up of many autocatalytic subsets of different types, some of which can overlap.

In other words, autocatalytic sets can have a rich complex structure of their own.

They go on to show how evolution can work on a single autocatalytic set, producing new subsets within it that are mutually dependent on each other.  This process sets up an environment in which newer subsets can evolve.

“In other words, self-sustaining, functionally closed structures can arise at a higher level (an autocatalytic set of autocatalytic sets), i.e., true emergence,” they say.

That’s an interesting view of emergence and certainly seems a sensible approach to the problem of the origin of life. It’s not hard to imagine groups of molecules operating together like this. And indeed, biochemists have recently discovered simple autocatalytic sets that behave in exactly this way.

But what makes the approach so powerful is that the mathematics does not depend on the nature of chemistry–it is substrate independent. So the building blocks in an autocatalytic set need not be molecules at all but any units that can manipulate other units in the required way.

These units can be complex entities in themselves. “Perhaps it is not too far-fetched to think, for example, of the collection of bacterial species in your gut (several hundreds of them) as one big autocatalytic set,” say Kauffman and co.

And they go even further. They point out that the economy is essentially the process of transforming raw materials into products such as hammers and spades that themselves facilitate further transformation of raw materials and so on. “Perhaps we can also view the economy as an (emergent) autocatalytic set, exhibiting some sort of functional closure,” they speculate.

[div class=attrib]Read the entire article after the jump.[end-div]

Best Science Stories of 2012

As the year comes to a close it’s fascinating to look back at some of the most breathtaking science of 2012.

The image above is of Saturn’s moon Enceladus. Evidence from the Cassini spacecraft, which took this remarkable image, suggests a deep salty ocean beneath the frozen surface that periodically spews out icy particles into space. Many scientists believe that Enceladus is the best place to look for signs of life beyond Earth within our Solar System.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of Cassini Imaging Team/SSI/JPL/ESA/NASA.[end-div]

So You Wanna Be a Rockstar?

Many of us harbor dreams, often secret ones, of becoming a famous rockstar. Well, if you want to live well past middle age, think again. Being a rockstar and living a long life are not statistically compatible, especially if you’re American. You choose.

[div class=attrib]From ars technica:[end-div]

Hedonism. Substance abuse. Risky behavior. Rock stars from Elvis Presley to Amy Winehouse have ended up famous not only for their music but for the decadent lifestyle it enabled, one that eventually contributed to their deaths. But how much does the rock lifestyle really hurt?

Quite a bit. That’s the conclusion of a new study that tracked nearly 1,500 chart-topping musicians and found that their life expectancy after fame really was lower than that of the general population. North American solo musicians seem to have it especially bad.

This wasn’t necessarily what you’d expect. A huge number of studies have shown that wealth is generally associated with greater longevity, possibly as a result of better health care, better diet, and lower stress. Not only are rock musicians dying faster than the general populace, but they’re completely negating the impact of any wealth that their fame brought to them.

To get a collection of rock stars for their study, the authors combed the charts and took advantage of a large poll that listed the top 1,000 albums of all time. Altogether, their subjects reached fame between the years of 1956 and 2006 and included everyone from Elvis Presley to Regina Spektor to the Arctic Monkeys. From there, the authors searched the news and Wikipedia, looking for reports of death. With that information in hand, they compared the artists’ life expectancies to those of the general population.

Only about two-thirds of North American stars were still alive 40 years after their first brush with fame, compared with about 80 percent of a matched population—and there was never a point at which they outlived their non-famous peers. Typically, Europeans have greater life expectancies, but European stars did not, tracking the longevity of average North Americans for the first few decades.

Oddly, however, once they survived 20 years after hitting the big time, European rock stars started to do better, outliving the typical North American. And, by 35 years, they caught up with the average European’s life expectancy. (No word from the authors on whether this trend would stay the same if the analysis excluded the members of the Rolling Stones.) On both continents, solo performers did worse than members of a band.

So what’s killing the famous? The authors identified cause of death wherever possible and classified it as either “other” or “substance use or risk-related deaths.” The latter category included “drug or alcohol-related chronic disorder, overdose or accident, and other risk-related causes that may or may not have been related to substance use, i.e., suicide and violence.” They also tried to determine (using biographical data) whether any of the deceased stars had suffered adverse childhood experiences, such as a substance-abusing or mentally ill parent.

Of those without any obvious childhood issues, under a third died of substance abuse or other risky behavior. Adding a single adverse childhood influence raised that rate to 42 percent. Two or more adverse events, and the rate shot up to about 80 percent.

These same sorts of childhood problems tend to lead to substance abuse and other troubles in the general population as well, and the authors conclude that the hedonism we associate with rock stars is less a lifestyle choice and more an outcome of early life issues.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Spinal Tap backstage at CBGB’s in New York City. Photograph: Ebet Roberts/Redferns / Guardian.[end-div]

For Sale – Year in Review

Now is the time of year to review all that has passed during 2012. You know how it goes: celebrity marriages, celebrity divorces, extreme weather records, deaths, best and worst movies. Our favorite moments come courtesy of postings on Craigslist. Annually, Craigslist users nominate their favorites for inclusion in the “Best Of” category. A recent favorite of ours from Pensacola, Florida:

guy with skid mark, bought gallon of whole milk, circle k – w4m

i was in my bikini at the circle k, you came in with your short shirt and your bike shorts on. they were white and you had a pretty sexy skid mark staining your behind. you got 11 sticks of beef jerky and a gallon of whole milk, then rode off on your bicycle. i will know its you because you paid in pennies.

[div class=attrib]From Wired:[end-div]

Homer Simpson’s famous ode to alcohol—”The cause of, and solution to, all of life’s problems”—might apply in equal measure to Craigslist, the wildly popular, barebones site where one can find all of life’s problems and solutions, including: a freelance writing gig, roommates, a sex partner, a man-sized fiberglass chili pepper, a lifetime supply of hot sauce, and coffee beans that have been ingested, digested, and excreted by someone living in Portland.

Each year, Craigslist users across the country flag their favorite classified ads for inclusion in the “best of” category. The bar to inclusion is high, but somehow each year America comes through with memorable postings that remind us just why we went ahead with this whole Web 2.0 thing.

This year was no exception. Here are a few of our favorites.

Paging Michelangelo

“Artist needed. Must love owls,” said one September post, which had something quite specific in mind.

We need an artist to depict the following: an owl skeleton with a parrot on its shoulder. The parrot is not a skeleton and is very colorful. The parrot has a peg leg, with a pirate hat on. The owl has an eye patch and a gold chain necklace with a skull on the pendant of said necklace. The skull in the pendant has an eye patch on the opposite eye of the owl (long story there don’t ask). The owl skeleton also has on a wizard’s hat with that typical wizard hat wrinkle. The owl is standing on a cowboy hat from a whale’s spout. This all is within a snow globe. That santa is holding with his only good hand because his other hand is a hook. Mrs. Clause is pulling on Mr. Clause’s coat with one of those dinosaur mouth grabbers that all 80’s children know.

The artist who could handle the commission would get both some cash and “a prize.”

(Side note: the oddly specific nature of this image request parallels those often received by our own creative director, Aurich Lawson, who has fielded article image suggestions that make this one look absolutely normal by comparison.)

Needed: one lap for aging cat

Next up, the “feline lap surrogate,” which I want to believe is a joke but fear is not. This job post is exactly what it sounds like, viz., the surrogate goes to a home each morning from 8am-12pm and gets paid $15 an hour to sit in a chair and “allow my cat to sit on their lap (the cat is attention seeking, and has been decreasing my productivity as of late).” The ideal candidate must have cat handling experience and no allergies.

“I do not need anyone in the afternoon since the sun warms the window sill by that point, and the cat will prefer the window sill to a lap,” the ad concludes. “Breakfast and lunch will be provided each day.”

[div class=attrib]Read the entire article after the jump.[end-div]

The Missing Linc

LincRNA that is. Recent discoveries hint at the potentially crucial role of this new class of genetic material in embryonic development, cell and tissue differentiation and even speciation and evolution.

[div class=attrib]From the Economist:[end-div]

THE old saying that where there’s muck, there’s brass has never proved more true than in genetics. Once, and not so long ago, received wisdom was that most of the human genome—perhaps as much as 99% of it—was “junk”. If this junk had a role, it was just to space out the remaining 1%, the genes in which instructions about how to make proteins are encoded, in a useful way in the cell nucleus.

That, it now seems, was about as far from the truth as it is possible to be. The decade or so since the completion of the Human Genome Project has shown that lots of the junk must indeed have a function. The culmination of that demonstration was the publication, in September, of the results of the ENCODE project. This suggested that almost two-thirds of human DNA, rather than just 1% of it, is being copied into molecules of RNA, the chemical that carries protein-making instructions to the sub-cellular factories which turn those proteins out, and that as a consequence, rather than there being just 23,000 genes (namely, the bits of DNA that encode proteins), there may be millions of them.

The task now is to work out what all these extra genes are up to. And a study just published in Genome Biology, by David Kelley and John Rinn of Harvard University, helps do that for one new genetic class, a type known as lincRNAs. In doing so, moreover, Dr Kelley and Dr Rinn show just how complicated the modern science of genetics has become, and hint also at how animal species split from one another.

Lincs in the chain

Molecules of lincRNA are similar to the messenger-RNA molecules which carry protein blueprints. They do not, however, encode proteins. More than 9,000 sorts are known, and most of those whose job has been tracked down are involved in the regulation of other genes, for example by attaching themselves to the DNA switches that control those genes.

LincRNA is rather odd, though. It often contains members of a second class of weird genetic object. These are called transposable elements (or, colloquially, “jumping genes”, because their DNA can hop from one place to another within the genome). Transposable elements come in several varieties, but one group of particular interest are known as endogenous retroviruses. These are the descendants of ancient infections that have managed to hide away in the genome and get themselves passed from generation to generation along with the rest of the genes.

Dr Kelley and Dr Rinn realised that the movement within the genome of transposable elements is a sort of mutation, and wondered if it has evolutionary consequences. Their conclusion is that it does, for when they looked at the relation between such elements and lincRNA genes, they found some intriguing patterns.

In the first place, lincRNAs are much more likely to contain transposable elements than protein-coding genes are. More than 83% do so, in contrast to only 6% of protein-coding genes.

Second, those transposable elements are particularly likely to be endogenous retroviruses, rather than any of the other sorts of element.

Third, the interlopers are usually found in the bit of the gene where the process of copying RNA from the DNA template begins, suggesting they are involved in switching genes on or off.

And fourth, lincRNAs containing one particular type of endogenous retrovirus are especially active in pluripotent stem cells, the embryonic cells that are the precursors of all other cell types. That indicates these lincRNAs have a role in the early development of the embryo.

Previous work suggests lincRNAs are also involved in creating the differences between various sorts of tissue, since many lincRNA genes are active in only one or a few cell types. Given that their principal job is regulating the activities of other genes, this makes sense.

Even more intriguingly, studies of lincRNA genes from species as diverse as people, fruit flies and nematode worms, have found they differ far more from one species to another than do protein-coding genes. They are, in other words, more species specific. And that suggests they may be more important than protein-coding genes in determining the differences between those species.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Darwin’s finches or Galapagos finches. Darwin, 1845. Courtesy of Wikipedia.[end-div]

British? May the Force be With You

Recent census figures from the United Kingdom show that Jedi is the seventh most popular faith overall, with just over 176,000 followers.

While this is down from a high of around 400,000 in the previous census (2001), it does suggest that George Lucas, creator of the Star Wars franchise, would still be a good stand-in for God in some parts of the U.K.

To learn more about Jediism, point your browser here.

[div class=attrib]From the Telegraph:[end-div]

The new figures reveal that the lightsabre-wielding disciples are only behind Christianity, Islam, Hinduism, Sikhism, Judaism and Buddhism in the popularity stakes, excluding non-religious people and people who did not answer.

Following a nationwide campaign, Jedi made it onto the 2001 census, with 390,127 people identifying themselves a decade ago as followers of the fictional Star Wars creed.

Although the number of Jedis has dropped by more than 50 per cent over the past 10 years, they are still the most selected “alternative” faith on the Census, and constitute 0.31% of all people’s stated religious affiliation in England and Wales.

The latest official population survey also revealed 6,242 people subscribe to the Heavy Metal religion, which was set up in 2010 by the rock magazine Metal Hammer.

The number of people specifically identifying as Atheists was 29,267, while over 13.8 million refused to identify with a faith at all, ticking the “No religion” box on the census form.

Norwich was revealed as the area with the highest proportion of non-religious people, with 41.5% of residents refusing to identify with a faith. The city also possesses the highest proportion of Heavy Metal followers and the 3rd highest proportion of Jedi Knights.

Other non-mainstream religions that had followers in significant numbers included 56,620 Paganists, 39,061 Spiritualists, 2,418 Scientologists and 20,288 Jainists, some of whom sweep the floor with a broom made of cotton threads as they walk along so as not to kill any insects.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Star Wars Jedi Knights, Qui-Gon Jinn and Obi-Wan Kenobi. Courtesy of Wikipedia / Lucas Films.[end-div]

Rivers of Methane

The image shows what looks like a satellite picture of a river delta, complete with tributaries. It could be the Nile or the Amazon as seen from space.

However, the image is not of an earthbound river at all. It’s a recently discovered river on Titan, Saturn’s largest moon. And the river’s contents are not even water, but probably a mixture of liquid ethane and methane.

[div class=attrib]From NASA:[end-div]

This image from NASA’s Cassini spacecraft shows a vast river system on Saturn’s moon Titan. It is the first time images from space have revealed a river system so vast and in such high resolution anywhere other than Earth. The image was acquired on Sept. 26, 2012, on Cassini’s 87th close flyby of Titan. The river valley crosses Titan’s north polar region and runs into Ligeia Mare, one of the three great seas in the high northern latitudes of Saturn’s moon Titan. It stretches more than 200 miles (400 kilometers).

Scientists deduce that the river is filled with liquid because it appears dark along its entire extent in the high-resolution radar image, indicating a smooth surface. That liquid is presumably ethane mixed with methane, the former having been positively identified in 2008 by Cassini’s visual and infrared mapping spectrometer at the lake known as Ontario Lacus in Titan’s southern hemisphere. Though there are some short, local meanders, the relative straightness of the river valley suggests it follows the trace of at least one fault, similar to other large rivers running into the southern margin of Ligeia Mare (see PIA10008). Such faults may lead to the opening of basins and perhaps to the formation of the giant seas themselves.

North is toward the top of this image.

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and ASI, the Italian Space Agency. NASA’s Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA’s Science Mission Directorate, Washington. The Cassini orbiter was designed, developed and assembled at JPL. The RADAR instrument was built by JPL and the Italian Space Agency, working with team members from the US and several European countries. JPL is a division of the California Institute of Technology in Pasadena.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of NASA/JPL-Caltech/ASI.[end-div]

The Future of the Grid

Two common complaints dog the sustainable energy movement: first, energy generated from the sun and wind is not always available; second, renewable energy is too costly. A new study debunks these notions, and shows that cost-effective renewable energy could power our needs 99.9 percent of the time by 2030.

[div class=attrib]From ars technica:[end-div]

You’ve probably heard the argument: wind and solar power are well and good, but what about when the wind doesn’t blow and the sun doesn’t shine? But it’s always windy and sunny somewhere. Given a sufficient distribution of energy resources and a large enough network of electrically conducting tubes, plus a bit of storage, these problems can be overcome—technologically, at least.

But is it cost-effective to do so? A new study from the University of Delaware finds that renewable energy sources can, with the help of storage, power a large regional grid for up to 99.9 percent of the time using current technology. By 2030, the cost of doing so will hit parity with current methods. Further, if you can live with renewables meeting your energy needs for only 90 percent of the time, the economics become positively compelling.

“These results break the conventional wisdom that renewable energy is too unreliable and expensive,” said study co-author Willett Kempton, a professor at the University of Delaware’s School of Marine Science and Policy. “The key is to get the right combination of electricity sources and storage—which we did by an exhaustive search—and to calculate costs correctly.”

By exhaustive, Kempton is referring to the 28 billion combinations of inland and offshore wind and photovoltaic solar sources combined with centralized hydrogen, centralized batteries, and grid-integrated vehicles analyzed in the study. The researchers deliberately overlooked constant renewable sources of energy such as geothermal and hydro power on the grounds that they are less widely available geographically.

These technologies were applied to a real-world test case: that of the PJM Interconnection regional grid, which covers parts of states from New Jersey to Indiana, and south to North Carolina. The model used hourly consumption data from the years 1999 to 2002; during that time, the grid had a generating capacity of 72GW catering to an average demand of 31.5GW. Taking in 13 states, either whole or in part, the PJM Interconnection constitutes one fifth of the USA’s grid. “Large” is no overstatement, even before considering more recent expansions that don’t apply to the dataset used.

The researchers constructed a computer model using standard solar and wind analysis tools. They then fed in hourly weather data from the region for the whole four-year period—35,040 hours worth. The goal was to find the minimum cost at which the energy demand could be met entirely by renewables for a given proportion of the time, based on the following game plan:

  1. When there’s enough renewable energy direct from source to meet demand, use it. Store any surplus.
  2. When there is not enough renewable energy direct from source, meet the shortfall with the stored energy.
  3. When there is not enough renewable energy direct from source, and the stored energy reserves are insufficient to bridge the shortfall, top up the remaining few percent of the demand with fossil fuels.
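
A toy hourly dispatch loop makes that three-step rule concrete. The numbers and the miniature "weather" series below are invented, and this is a sketch of the logic only, not the researchers' actual model.

[code]
def dispatch(renewable, demand, capacity):
    """Apply the three rules above, hour by hour.

    renewable: hourly renewable generation (GWh)
    demand:    hourly load (GWh)
    capacity:  storage capacity (GWh)
    Returns (hours met without fossil fuel, total fossil GWh used).
    """
    stored = 0.0
    covered_hours = 0
    fossil_used = 0.0
    for gen, load in zip(renewable, demand):
        if gen >= load:
            # Rule 1: meet demand directly and store any surplus.
            stored = min(capacity, stored + (gen - load))
            covered_hours += 1
        else:
            shortfall = load - gen
            if stored >= shortfall:
                # Rule 2: bridge the gap from storage.
                stored -= shortfall
                covered_hours += 1
            else:
                # Rule 3: top up the remainder with fossil fuels.
                fossil_used += shortfall - stored
                stored = 0.0
    return covered_hours, fossil_used

# Four invented days of hourly data, heavily simplified.
demand = [31.5] * 96                                  # flat ~31.5 GW load
renewable = ([20] * 8 + [60] * 8 + [25] * 8) * 4      # crude day/night swings
hours, fossil = dispatch(renewable, demand, capacity=200.0)
print(f"{hours}/{len(demand)} hours met by renewables plus storage, "
      f"{fossil:.0f} GWh from fossil backup")
[/code]

With enough over-built renewable capacity and storage, rule 3 is needed only a sliver of the time, which is exactly the trade-off the study prices out.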

Perhaps unsurprisingly, the precise mix required depends upon exactly how much time you want renewables to meet the full load. Much more surprising is the amount of excess renewable infrastructure the model proposes as the most economic. To achieve a 90-percent target, the renewable infrastructure should be capable of generating 180 percent of the load. To meet demand 99.9 percent of the time, that rises to 290 percent.

“So much excess generation of renewables is a new idea, but it is not problematic or inefficient, any more than it is problematic to build a thermal power plant requiring fuel input at 250 percent of the electrical output, as we do today,” the study argues.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Bangui Windfarm, Ilocos Norte, Philippines. Courtesy of Wikipedia.[end-div]

Places to Visit Before World’s End

In case you missed all the apocalyptic hoopla, the world is supposed to end today. Now, if you’re reading this, you obviously still have a little time, since the Mayans apparently did not specify a precise time for the prophesied end. So, we highly recommend that you visit one or more of these beautiful places, immediately. Of course, if we’re all still here tomorrow, you will have some extra time to take in these breathtaking sights before the next planned doomsday.

[div class=attrib]Check out the top 100 places according to the Telegraph after the jump.[end-div]

[div class=attrib]Image: Lapland for the northern lights. Courtesy of ALAMY / Telegraph.[end-div]

E or I, T or F: 50 Years of Myers-Briggs

Two million people annually take the Myers-Briggs Type Indicator assessment. Over 10,000 businesses and 2,500 colleges in the United States use the test.

It’s very likely that you have taken the test at some point in your life: during high school, or to get into university or to secure your first job. The test categorizes humans along four discrete axes (or dichotomies) of personality types: Extraversion (E) and Introversion (I); Sensing (S) and Intuition (N); Thinking (T) and Feeling (F); Judging (J) and Perceiving (P). If you have a partner, it’s likely that he or she has, at some time or another, (mis-)labeled you as an E or an I, and as a “feeler” rather than a “thinker”, and so on. Countless arguments will have ensued.

[div class=attrib]From the Washington Post:[end-div]

Some grandmothers pass down cameo necklaces. Katharine Cook Briggs passed down the world’s most widely used personality test.

Chances are you’ve taken the Myers-Briggs Type Indicator, or will. Roughly 2 million people a year do. It has become the gold standard of psychological assessments, used in businesses, government agencies and educational institutions. Along the way, it has spawned a multimillion-dollar business around its simple concept that everyone fits one of 16 personality types.

Now, 50 years after the first time anyone paid money for the test, the Myers-Briggs legacy is reaching the end of the family line. The youngest heirs don’t want it. And it’s not clear whether organizations should, either.

That’s not to say it hasn’t had a major influence.

More than 10,000 companies, 2,500 colleges and universities and 200 government agencies in the United States use the test. From the State Department to McKinsey & Co., it’s a rite of passage. It’s estimated that 50 million people have taken the Myers-Briggs personality test since the Educational Testing Service first added the research to its portfolio in 1962.

The test, whose first research guinea pigs were George Washington University students, has seen financial success commensurate to this cultlike devotion among its practitioners. CPP, the private company that publishes Myers-Briggs, brings in roughly $20 million a year from it and the 800 other products, such as coaching guides, that it has spawned.

Yet despite its widespread use and vast financial success, and although it was derived from the work of Carl Jung, one of the most famous psychologists of the 20th century, the test is highly questioned by the scientific community.

To begin even before its arrival in Washington: Myers-Briggs traces its history to 1921, when Jung, a Swiss psychiatrist, published his theory of personality types in the book “Psychologische Typen.” Jung had become well known for his pioneering work in psychoanalysis and close collaboration with Sigmund Freud, though by the 1920s the two had severed ties.

Psychoanalysis was a young field and one many regarded skeptically. Still, it had made its way across the Atlantic not only to the university offices of scientists but also to the home of a mother in Washington.

Katharine Cook Briggs was a voracious reader of the new psychology books coming out in Europe, and she shared her fascination with Jung’s latest work — in which he developed the concepts of introversion and extroversion — with her daughter, Isabel Myers. They would later use Jung’s work as a basis for their own theory, which would become the Myers-Briggs Type Indicator. MBTI is their framework for classifying personality types along four distinct axes: introversion vs. extroversion, sensing vs. intuition, thinking vs. feeling and judging vs. perceiving. A person, according to their hypothesis, has one dominant preference in each of the four pairs. For example, he might be introverted, a sensor, a thinker and a perceiver. Or, in Myers-Briggs shorthand, an “ISTP.”
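
The arithmetic behind the 16 types is simple: four binary choices give 2 × 2 × 2 × 2 = 16 combinations. A quick enumeration, purely for illustration:

[code]
from itertools import product

# The four Myers-Briggs dichotomies described above.
dichotomies = [
    ("E", "I"),  # Extraversion / Introversion
    ("S", "N"),  # Sensing / Intuition
    ("T", "F"),  # Thinking / Feeling
    ("J", "P"),  # Judging / Perceiving
]

types = ["".join(combo) for combo in product(*dichotomies)]
print(len(types))  # 16
print(types)       # ['ESTJ', 'ESTP', ..., 'INFP'] -- the "ISTP" of the example is in there
[/code]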

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Keirsey Temperament Sorter, which utilizes Myers-Briggs dichotomies to group personalities into 16 types. Courtesy of Wikipedia.[end-div]

Single-tasking is Human

If you’re an office worker you will relate. Recently, you will have participated in a team meeting or conference call only to have at least one person say, when asked a question, “sorry can you please repeat that, I was multitasking.”

Many of us believe, or have been tricked into believing, that doing multiple things at once makes us more productive. This phenomenon was branded by business as multitasking. After all, if computers could do it, then why not humans? Yet, experience shows that humans are woefully inadequate at performing multiple concurrent tasks that require dedicated attention. Of course, humans are experts at walking and chewing gum at the same time. However, in the majority of cases these activities require very little involvement from the higher functions of the brain. There is a growing body of anecdotal and experimental evidence that shows poorer performance on multiple tasks done concurrently versus the same tasks performed sequentially. In fact, for quite some time, researchers have shown that dealing with multiple streams of information at once is a real problem for our limited brains.

Yet, most businesses seem to demand or reward multitasking behavior. And damagingly, the multitasking epidemic now seems to be the norm in the home as well.

[div class=attrib]From the WSJ:[end-div]

In the few minutes it takes to read this article, chances are you’ll pause to check your phone, answer a text, switch to your desktop to read an email from the boss’s assistant, or glance at the Facebook or Twitter messages popping up in the corner of your screen. Off-screen, in your open-plan office, crosstalk about a colleague’s preschooler might lure you away, or a co-worker may stop by your desk for a quick question.

And bosses wonder why it is tough to get any work done.

Distraction at the office is hardly new, but as screens multiply and managers push frazzled workers to do more with less, companies say the problem is worsening and is affecting business.

While some firms make noises about workers wasting time on the Web, companies are realizing the problem is partly their own fault.

Even though digital technology has led to significant productivity increases, the modern workday seems custom-built to destroy individual focus. Open-plan offices and an emphasis on collaborative work leave workers with little insulation from colleagues’ chatter. A ceaseless tide of meetings and internal emails means that workers increasingly scramble to get their “real work” done on the margins, early in the morning or late in the evening. And the tempting lure of social-networking streams and status updates make it easy for workers to interrupt themselves.

“It is an epidemic,” says Lacy Roberson, a director of learning and organizational development at eBay Inc. At most companies, it’s a struggle “to get work done on a daily basis, with all these things coming at you,” she says.

Office workers are interrupted—or self-interrupt—roughly every three minutes, academic studies have found, with numerous distractions coming in both digital and human forms. Once thrown off track, it can take some 23 minutes for a worker to return to the original task, says Gloria Mark, a professor of informatics at the University of California, Irvine, who studies digital distraction.

Companies are experimenting with strategies to keep workers focused. Some are limiting internal emails—with one company moving to ban them entirely—while others are reducing the number of projects workers can tackle at a time.

Last year, Jamey Jacobs, a divisional vice president at Abbott Vascular, a unit of health-care company Abbott Laboratories, learned that his 200 employees had grown stressed trying to squeeze in more heads-down, focused work amid the daily thrum of email and meetings.

“It became personally frustrating that they were not getting the things they wanted to get done,” he says. At meetings, attendees were often checking email, trying to multitask and in the process obliterating their focus.

Part of the solution for Mr. Jacobs’s team was that oft-forgotten piece of office technology: the telephone.

Mr. Jacobs and productivity consultant Daniel Markovitz found that employees communicated almost entirely over email, whether the matter was mundane, such as cake in the break room, or urgent, like an equipment issue.

The pair instructed workers to let the importance and complexity of their message dictate whether to use cellphones, office phones or email. Truly urgent messages and complex issues merited phone calls or in-person conversations, while email was reserved for messages that could wait.

Workers now pick up the phone more, log fewer internal emails and say they’ve got clarity on what’s urgent and what’s not, although Mr. Jacobs says staff still have to stay current with emails from clients or co-workers outside the group.

[div class=attrib]Read the entire article after the jump, and learn more in this insightful article on multitasking over at Big Think.[end-div]

[div class=attrib]Image courtesy of Big Think.[end-div]

Guns, Freedom and the Uncivil Society

Associate professor of philosophy, Firmin DeBrabander, argues that guns have no place in a civil society. Guns hinder free speech and free assembly for those at either end of the barrel. Guns fragment our society and undermine the sense and mechanisms of community. He is right.

[div class=attrib]From the New York Times:[end-div]

The night of the shootings at Sandy Hook Elementary School in Newtown, Conn., I was in the car with my wife and children, working out details for our eldest son’s 12th birthday the following Sunday — convening a group of friends at a showing of the film “The Hobbit.” The memory of the Aurora movie theatre massacre was fresh in his mind, so he was concerned that it not be a late-night showing. At that moment, like so many families, my wife and I were weighing whether to turn on the radio and expose our children to coverage of the school shootings in Connecticut. We did. The car was silent in the face of the flood of gory details. When the story was over, there was a long thoughtful pause in the back of the car. Then my eldest son asked if he could be homeschooled.

That incident brought home to me what I have always suspected, but found difficult to articulate: an armed society — especially as we prosecute it at the moment in this country — is the opposite of a civil society.

The Newtown shootings occurred at a peculiar time in gun rights history in this nation. On one hand, since the mid-1970s, fewer households each year on average have had a gun. Gun control advocates should be cheered by that news, but it is eclipsed by a flurry of contrary developments. As has been well publicized, gun sales have steadily risen over the past few years, and spiked with each of Obama’s election victories.

Furthermore, of the weapons that proliferate amongst the armed public, an increasing number are high caliber weapons (the weapon of choice in the goriest shootings in recent years). Then there is the legal landscape, which looks bleak for the gun control crowd.

Every state except for Illinois has a law allowing the carrying of concealed weapons — and just last week, a federal court struck down Illinois’ ban. States are now lining up to allow guns on college campuses. In September, Colorado joined four other states in such a move, and statehouses across the country are preparing similar legislation. And of course, there was Oklahoma’s ominous Open Carry Law approved by voters this election day — the fifteenth of its kind, in fact — which, as the name suggests, allows those with a special permit to carry weapons in the open, with a holster on their hip.

Individual gun ownership — and gun violence — has long been a distinctive feature of American society, setting us apart from the other industrialized democracies of the world. Recent legislative developments, however, are progressively bringing guns out of the private domain, with the ultimate aim of enshrining them in public life. Indeed, the N.R.A. strives for a day when the open carry of powerful weapons might be normal, a fixture even, of any visit to the coffee shop or grocery store — or classroom.

As N.R.A. president Wayne LaPierre expressed in a recent statement on the organization’s Web site, more guns equal more safety, by their account. A favorite gun rights saying is “an armed society is a polite society.” If we allow ever more people to be armed, at any time, in any place, this will provide a powerful deterrent to potential criminals. Or if more citizens were armed — like principals and teachers in the classroom, for example — they could halt senseless shootings ahead of time, or at least early on, and save society a lot of heartache and bloodshed.

As ever more people are armed in public, however — even brandishing weapons on the street — this is no longer recognizable as a civil society. Freedom has vanished at that point.

And yet, gun rights advocates famously maintain that individual gun ownership, even of high caliber weapons, is the defining mark of our freedom as such, and the ultimate guarantee of our enduring liberty. Deeper reflection on their argument exposes basic fallacies.

In her book “The Human Condition,” the philosopher Hannah Arendt states that “violence is mute.” According to Arendt, speech dominates and distinguishes the polis, the highest form of human association, which is devoted to the freedom and equality of its component members. Violence — and the threat of it — is a pre-political manner of communication and control, characteristic of undemocratic organizations and hierarchical relationships. For the ancient Athenians who practiced an incipient, albeit limited form of democracy (one that we surely aim to surpass), violence was characteristic of the master-slave relationship, not that of free citizens.

Arendt offers two points that are salient to our thinking about guns: for one, they insert a hierarchy of some kind, subtle perhaps, but fundamental nonetheless, and thereby undermine equality. But furthermore, guns pose a monumental challenge to freedom, and in particular, the liberty that is the hallmark of any democracy worthy of the name — that is, freedom of speech. Guns do communicate, after all, but in a way that is contrary to free speech aspirations: for guns chasten speech.

This becomes clear if only you pry a little more deeply into the N.R.A.’s logic behind an armed society. An armed society is polite, by their thinking, precisely because guns would compel everyone to tamp down eccentric behavior, and refrain from actions that might seem threatening. The suggestion is that guns liberally interspersed throughout society would cause us all to walk gingerly — not make any sudden, unexpected moves — and watch what we say, how we act, whom we might offend.

As our Constitution provides, however, liberty entails precisely the freedom to be reckless, within limits, and also the freedom to insult and offend as the case may be. The Supreme Court has repeatedly upheld our right to experiment in offensive language and ideas, and in some cases, offensive action and speech. Such experimentation is inherent to our freedom as such. But guns by their nature do not mix with this experiment — they don’t mix with taking offense. They are combustible ingredients in assembly and speech.

I often think of the armed protestor who showed up to one of the famously raucous town hall hearings on Obamacare in the summer of 2009. The media was very worked up over this man, who bore a sign that invoked a famous quote of Thomas Jefferson, accusing the president of tyranny. But no one engaged him at the protest; no one dared approach him even, for discussion or debate — though this was a town hall meeting, intended for just such purposes. Such is the effect of guns on speech — and assembly. Like it or not, they transform the bearer, and end the conversation in some fundamental way. They announce that the conversation is not completely unbounded, unfettered and free; there is or can be a limit to negotiation and debate — definitively.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Wikipedia.[end-div]

Blind Loyalty and the Importance of Critical Thinking

Two landmark studies in the 1960s and ’70s put behavioral psychology squarely in the public consciousness. The obedience experiments by Stanley Milgram and the Stanford Prison Experiment demonstrated how regular individuals could be made, quite simply, to obey figures in authority and to subject others to humiliation, suffering and pain.

A re-examination of these experiments and several recent similar studies have prompted a number of psychologists to offer a reinterpretation of the original conclusions. They suggest that humans may not be inherently evil after all. However, we remain dangerously flawed — our willingness to follow those in authority, especially those with whom we identify, makes us susceptible to believing in the virtue of actions that by all standards would be monstrous. It turns out that an open mind able to think critically may be the best antidote.

[div class=attrib]From the Pacific Standard:[end-div]

They are among the most famous of all psychological studies, and together they paint a dark portrait of human nature. Widely disseminated in the media, they spread the belief that people are prone to blindly follow authority figures—and will quickly become cruel and abusive when placed in positions of power.

It’s hard to overstate the impact of Stanley Milgram’s obedience experiments of 1961, or the Stanford Prison Experiment of 1971. Yet in recent years, the conclusions derived from those studies have been, if not debunked, radically reinterpreted.

A new perspective—one that views human nature in a more nuanced light—is offered by psychologists Alex Haslam of the University of Queensland, Australia, and Stephen Reicher of the University of St. Andrews in Scotland.

In an essay published in the open-access journal PLoS Biology, they argue that people will indeed comply with the questionable demands of authority figures—but only if they strongly identify with that person, and buy into the rightness of those beliefs.

In other words, we’re not unthinking automatons. Nor are we monsters waiting for permission for our dark sides to be unleashed. However, we are more susceptible to psychological manipulation than we may realize.

In Milgram’s study, members of the general public were placed in the role of “teacher” and told that a “learner” was in a nearby room. Each time the “learner” failed to correctly recall a word as part of a memory experiment, the “teacher” was told to administer an electrical shock.

As the “learner” kept making mistakes, the “teacher” was ordered to give him stronger and stronger jolts of electricity. If a participant hesitated, the experimenter—an authority figure wearing a white coat—instructed him to continue.

Somewhat amazingly, most people did so: 65 percent of participants continued to give stronger and stronger shocks until the experiment ended with the “learner” apparently unconscious. (The torture was entirely fictional; no actual shocks were administered.)

To a world still reeling from the question of why so many Germans obeyed orders and carried out Nazi atrocities, here was a clear answer: We are predisposed to obey authority figures.

The Stanford Prison Experiment, conducted a few years later, was equally unnerving. Students were randomly assigned to assume the role of either prisoner or guard in a “prison” set up in the university’s psychology department. As Haslam and Reicher note, “such was the abuse meted out to the prisoners by the guards that the study had to be terminated after just six days.”

Lead author Philip Zimbardo, who assumed the role of “prison superintendent” with a level of zeal he later found frightening, concluded that brutality was “a natural consequence of being in the uniform of a guard and asserting the power inherent in that role.”

So is all this proof of the “banality of evil,” to use philosopher Hannah Arendt’s memorable phrase? Not really, argue Haslam and Reicher. They point to their own work on the BBC Prison Study, which mimicked the seminal Stanford study.

They found that participants “did not conform automatically to their assigned role” as prisoner or guard. Rather, there was a period of resistance, which ultimately gave way to a “draconian” new hierarchy. Before becoming brutal, the participants needed time to assume their new identities, and internalize their role in the system.

Once they did so, “the hallmark of the tyrannical regime was not conformity, but creative leadership and engaged followership within a group of true believers,” they write. “This analysis mirrors recent conclusions about the Nazi tyranny.”

[div class=attrib]Read the entire article after the jump.[end-div]

The Habitable Exoplanets Catalog

The Habitable Exoplanets Catalog is a fascinating resource for those who dream of starting a new life on a distant world. Barely a year old, the catalog now lists seven planets outside our solar system and within our own Milky Way galaxy that could become a future home for adventurous humans — complaints from existing inhabitants notwithstanding. The closest at the moment, Gliese 581g, at just over 20 light years away, would nonetheless take around 200,000 years to reach using current technology.
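
For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch. It is not from the catalog or the article; the distance of roughly 20.4 light years and the unit conversions are assumptions, and the cruise speed is whatever the reader supplies. It shows that the 200,000-year figure corresponds to a constant speed of roughly 30 km/s, in the same ballpark as today's fastest chemical-rocket probes.

```python
# Back-of-envelope check of the "200,000 years to Gliese 581g" figure.
# Assumptions (not from the article): a distance of ~20.4 light years
# and the unit conversions below.

LIGHT_YEAR_KM = 9.4607e12     # kilometres in one light year
SECONDS_PER_YEAR = 3.156e7    # seconds in one year

def travel_time_years(distance_ly: float, speed_km_s: float) -> float:
    """Years needed to cover `distance_ly` light years at a constant speed."""
    return distance_ly * LIGHT_YEAR_KM / speed_km_s / SECONDS_PER_YEAR

def implied_speed_km_s(distance_ly: float, years: float) -> float:
    """Constant cruise speed implied by covering `distance_ly` in `years`."""
    return distance_ly * LIGHT_YEAR_KM / (years * SECONDS_PER_YEAR)

if __name__ == "__main__":
    # The post's figures: just over 20 light years, about 200,000 years.
    print(f"Implied speed: {implied_speed_km_s(20.4, 200_000):.0f} km/s")
    # Sanity check: a probe cruising at a steady 30 km/s takes roughly this long.
    print(f"Trip at 30 km/s: {travel_time_years(20.4, 30):,.0f} years")
```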

[div class=attrib]From the Independent:[end-div]

An ambitious project to catalogue every habitable planet has discovered seven worlds inside the Milky Way that could possibly harbour life.

Marking its first anniversary, the Habitable Exoplanets Catalog said it had far exceeded its expectation of adding one or two new planets this year in its search for a new earth.

In recent years scientists from the Puerto Rico-based Planetary Habitability Laboratory that runs the catalogue have sharpened their techniques for finding new planets outside our solar system.

Chile’s High Accuracy Radial Velocity Planet Searcher and the orbiting Kepler Space Telescope are two of the many tools that have increased the pace of discoveries.

The Planetary Habitability Laboratory launched the Habitable Exoplanets Catalog last year to measure the suitability for life of these emerging worlds and as a way to organise them for the public.

It has found nearly 80 confirmed exoplanets with a similar size to Earth but only a few of those have the right distance from their star to support liquid surface water – the presence of which is considered essential to sustain life.

Seven potentially habitable exoplanets are now listed by the Habitable Exoplanets Catalog, including the disputed Gliese 581g, plus some 27 more from NASA Kepler candidates waiting for confirmation.

Although all of these exoplanets, known as superterrans, are considered potentially habitable, scientists have not yet found a true Earth analogue.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Current Potential Habitable Exoplanets. Courtesy of PHL @ UPR Arecibo.[end-div]

Apocalypse Now… First, Brew Some Tea

We love stories of dystopian futures, apocalyptic prophecies and nightmarish visions here at theDiagonal. For some of our favorite articles on the end of days, check out end of world predictions, and how the world may end.

The next impending catastrophe is due a mere week from now, on December 21st, 2012, according to Mayan-watchers. So, of course, it’s time to make final preparations for the end of the world, again. Not to be outdone by the Mayans, the British, guardians of the stiff upper lip, have some timely advice for doomsayers and doomsday aficionados. After all, only the British could have come up with a Second World War propaganda poster emblazoned with “Keep Calm and Carry On”. While there is some very practical advice, such as “leave extra time for journeys”, we find fault with the British authorities for not suggesting “take time to make a good, strong cup of tea”.

[div class=attrib]From the Independent:[end-div]

With the world edging ever closer to what some believe could be an end of days catastrophe that will see the planet and its inhabitants destroyed, British authorities have been issuing tongue in cheek advice on how to prepare.

The advice comes just two weeks ahead of the day that some believe will mark the end of the world.

According to some interpretations of the ancient Mayan calendar the 21st of December will signal the end of a 5,125-year cycle known as the Long Count – and will bring about the apocalypse.

There have been scattered reports of panic buying of candles and essentials in China and Russia. There has also been a reported hike in the sales of survival shelters in America.

An official US government blog was published last week saying it was “just rumours” and insisting that “the world will not end on December 21, 2012, or any day in 2012”.

In France, authorities have even taken steps to prevent access to Bugarach mountain, which is thought by some to be a sacred place that will protect them from the end of the world.

Reports claimed websites in the US were selling tickets to access the mountain on the 21st.

In the UK, however, the impending apocalypse is being treated with dead-pan humour by some organisations.

The AA has advised: “Before heading off, take time to do the basic checks on your car and allow extra time for your journey.

“Local radio is a good source of traffic and weather updates and for any warnings of an impending apocalypse. Should the announcer break such solemn news, try to remain focused on the road ahead and keep your hands on the wheel.”

A London Fire Brigade spokesman issued the following advice: “Fit a smoke alarm on each level of your home, then at least you might stand a chance of knowing that the end of the world is nigh ahead of those who don’t.

“If you survive the apocalypse you’ll be alerted to a fire more quickly should one ever break out.”

An RSPCA [Royal Society for the Prevention of Cruelty to Animals] spokesman offered advice for animal lovers ahead of apocalypse saying: “Luckily for animals, they do not have the same fears of the future – or its imminent destruction – as us humans, so it is unlikely that our pets will be worrying about the end of the world.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Digital scan of original KEEP CALM AND CARRY ON poster owned by wartimeposters.co.uk. Courtesy of Wikipedia.[end-div]

We The People… Want a Twinkie

The old adage “be careful what you wish for, lest it come true” reminds us that desires may well come to fruition, but often with unintended consequences. In this case, for the White House. A couple of years ago the administration launched an online drive to foster dialogue and participation in civic affairs. Known as “We the People: Your Voice in Our Government”, the program allows individuals to petition the government on any important issue of the day. And, while White House officials may have had in mind a discussion of substantive issues, many petitions are somewhat more off the wall. Some of our favorite colorful petitions, many of which have garnered thousands of signatures to date, include:

“Legalize home distillation for home spirits!”

“Secure resources and funding, and begin construction of a Death Star by 2016.”

“Nationalize the Twinkie industry.”

“Peacefully grant the State of Texas to withdraw from the United States of America and create its own NEW government.”

“Peacefully grant the city of Austin Texas to withdraw from the state of Texas & remain part of the United States.”

“Allow the city of El Paso to secede from the state of Texas. El Paso is tired of being a second class city within Texas.”

“Legalize the use of DMT, magic mushrooms, and mescaline for all people.”

“Outlaw offending prophets of major religions.”

“Legally recognize the tea party as a hate group and remove them from office for treason against the United States.”

“Give us back our incandescent lightbulbs! We, the undersigned, want the freedom to choose our own lightbulbs.”

“Create and Approve The MICHAEL JOSEPH JACKSON National Holiday.”

[div class=attrib]From the Washington Post:[end-div]

Forget the “fiscal cliff”: When it comes to the nation’s most pressing concerns, other matters trump financial calamity.

Several thousand Americans, for example, are calling on President Obama to nationalize the troubled Twinkies industry to prevent the loss of the snack cake’s “sweet creamy center.”

Thousands more have signed petitions calling on the White House to replace the courts with a single Hall of Justice, remove Jerry Jones as owner of the Dallas Cowboys, give federal workers a holiday on Christmas Eve, allow members of the military to put their hands in their pockets and begin construction of a “Star Wars”-style Death Star by 2016.

And that’s just within the past month.

The people have spoken, but it might not be what the Obama administration expected to hear. More than a year after it was launched, an ambitious White House online petition program aimed at encouraging civic participation has become cluttered with thousands of demands that are often little more than extended Internet jokes. Interest has escalated in the wake of Obama’s reelection, which spurred more than a dozen efforts from tens of thousands of petitioners seeking permission for their states to secede from the union.

The idea, dubbed “We the People” and modeled loosely on a British government program, was meant to encourage people to exercise their First Amendment rights by collecting enough electronic signatures to meet a threshold that would guarantee an official administration response. (The level was initially set at 5,000 signatures, but that was quickly raised to 25,000 after the public responded a little too enthusiastically.)

Administration officials have spent federal time and tax dollars answering petitioner demands that the government recognize extraterrestrial life, allow online poker, legalize marijuana, remove “under God” from the Pledge of Allegiance and ban Rush Limbaugh from Armed Forces Network radio.

The last issue merited a formal response from the Defense Department: “AFN does not censor content, and we believe it is important that service members have access to a variety of viewpoints,” spokesman Bryan G. Whitman wrote to the more than 29,000 people who signed the anti-Limbaugh petition.

The “We the People” program emerged in the news last week when petitioners demanded that Obama block an appearance at Sunday’s “Christmas in Washington” concert by Psy, the South Korean “Gangnam Style” singer who is under fire for anti-American lyrics. The program’s rules require that petitions relate to “current or potential actions or policies of the federal government,” prompting the White House to pull down the petition because Obama has no authority over booking at the privately run charitable event.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: We The People. U.S. Constitution. Courtesy of Wikipedia.[end-div]

What Thomas Jefferson Never Said

Commentators of all political persuasions often cite Jefferson to add weight and gravitas to a particular point or position. Yet scholarly analysis shows that many quotes are incorrectly attributed to the Founding Father and 3rd President. Some examples of words never spoken or written by Jefferson:

“Dissent is the highest form of patriotism.”

“The democracy will cease to exist when you take away from those who are willing to work and give to those who would not.”

“My reading of history convinces me that most bad government results from too much government.”

“The beauty of the Second Amendment is that it will not be needed until they try to take it.”

[div class=attrib]From the WSJ:[end-div]

Thomas Jefferson once famously wrote, “All tyranny needs to gain a foothold is for people of good conscience to remain silent.”

Or did he? Numerous social movements attribute the quote to him. “The Complete Idiot’s Guide to U.S. Government and Politics” cites it in a discussion of American democracy. Actor Chuck Norris’s 2010 treatise “Black Belt Patriotism: How to Reawaken America” uses it to urge conservatives to become more involved in politics. It is even on T-shirts and decals.

Yet the founding father and third U.S. president never wrote it or said it, insists Anna Berkes, a 33-year-old research librarian at the Jefferson Library at Monticello, his grand estate just outside Charlottesville, Va. Nor does he have any connection to many of the “Jeffersonian” quotes that politicians on both sides of the aisle have slung back and forth in recent years, she says.

“People will see a quote and it appeals to an opinion that they have and if it has Jefferson’s name attached to it that gives it more weight,” she says. “He’s constantly being invoked by people when they are making arguments about politics and actually all sorts of topics.”

A spokeswoman for the Guide’s publisher said it was looking into the quote. Mr. Norris’s publicist didn’t respond to requests for comment.

To counter what she calls rampant misattribution, Ms. Berkes is fighting the Internet with the Internet. She has set up a “Spurious Quotations” page on the Monticello website listing bogus quotes attributed to the founding father, a prolific writer and rhetorician who was the principal author of the Declaration of Independence.

The fake quotes posted and dissected on Monticello.org include “My reading of history convinces me that most bad government has grown out of too much government.” In detailed footnotes, Ms. Berkes says it resembles a line Jefferson wrote in an 1807 letter: “History, in general, only informs us what bad government is.” But she can’t find that exact quotation in any of his writings.

Another that graces many epicurean websites: “On a hot day in Virginia, I know nothing more comforting than a fine spiced pickle, brought up trout-like from the sparkling depths of the aromatic jar below the stairs of Aunt Sally’s cellar.”

Jefferson never said that either, says Ms. Berkes. The earliest reference to the quote comes from a 1922 speech by a man extolling the benefits of pickles, she says.

Jefferson is a “flypaper figure,” like Abraham Lincoln, Mark Twain, Winston Churchill and baseball player and manager Yogi Berra—larger-than-life figures who have fake or misattributed quotes stick to them all the time, says Ralph Keyes, an author of books about quotes wrongly credited to famous or historical figures.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Reproduction of the 1805 Rembrandt Peale painting of Thomas Jefferson, New York Historical Society. Courtesy of Wikipedia.[end-div]

Climate Change: Not in My Neighborhood

It’s no surprise that in our daily lives we seek information that reinforces our perceptions, opinions and beliefs of the world around us. It’s also the case that if we do not believe in a particular position, we will overlook any evidence in our immediate surroundings that challenges that disbelief — climate change is no different.

[div class=attrib]From ars technica:[end-div]

We all know it’s hard to change someone’s mind. In an ideal, rational world, a person’s opinion about some topic would be based on several pieces of evidence. If you were to supply that person with several pieces of stronger evidence that point in another direction, you might expect them to accept the new information and agree with you.

However, this is not that world, and rarely do we find ourselves in a debate with Star Trek’s Spock. There are a great many reasons that we behave differently. One is the way we rate incoming information for trustworthiness and importance. Once we form an opinion, we rate information that confirms our opinion more highly than information that challenges it. This is one form of “motivated reasoning.” We like to think we’re right, and so we are motivated to come to the conclusion that the facts are still on our side.

Publicly contentious issues often put a spotlight on these processes—issues like climate change, for example. In a recent paper published in Nature Climate Change, researchers from George Mason and Yale explore how motivated reasoning influences whether people believe they have personally experienced the effects of climate change.

When it comes to communicating the science of global warming, a common strategy is to focus on the concrete here-and-now rather than the abstract and distant future. The former is easier for people to relate to and connect with. Glazed eyes are the standard response to complicated graphs of projected sea level rise, with ranges of uncertainty and several scenarios of future emissions. Show somebody that their favorite ice fishing spot is iced over for several fewer weeks each winter than it was in the late 1800s, though, and you might have their attention.

Public polls show that acceptance of a warming climate correlates with agreement that one has personally experienced its effects. That could be affirmation that personal experience is a powerful force for the acceptance of climate science. Obviously, there’s another possibility—that those who accept that the climate is warming are more likely to believe they’ve experienced the effects themselves, whereas those who deny that warming is taking place are unlikely to see evidence of it in daily life. That’s, at least partly, motivated reasoning at work. (And of course, this cuts both ways. Individuals who agree that the Earth is warming may erroneously interpret unrelated events as evidence of that fact.)

The survey used for this study was unique in that the same people were polled twice, two and a half years apart, to see how their views changed over time. For the group as a whole, there was evidence for both possibilities—experience affected acceptance, and acceptance predicted statements about experience.

Fortunately, the details were a bit more interesting than that. When you categorize individuals by engagement—essentially how confident and knowledgeable they feel about the facts of the issue—differences are revealed. For the highly-engaged groups (on both sides), opinions about whether climate is warming appeared to drive reports of personal experience. That is, motivated reasoning was prevalent. On the other hand, experience really did change opinions for the less-engaged group, and motivated reasoning took a back seat.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of: New York Times / Steen Ulrik Johannessen / Agence France-Presse — Getty Images.[end-div]

Big Brother is Mapping You

One hopes that Google’s intention to “organize the world’s information” will remain benign for the foreseeable future. Yet, as more and more of our surroundings and moves are mapped and tracked online, and increasingly offline, it would be wise to remain ever vigilant. Many put up with the encroachment of advertisers and promoters into almost every facet of their daily lives as a necessary, modern evil. But where is the dividing line that separates an ignorable irritation from an intrusion of privacy and a grab for control? For the paranoid amongst us, it may only be a matter of time before our digital footprints come under the increasing scrutiny, and control, of organizations with grander designs.

[div class=attrib]From the Guardian:[end-div]

Eight years ago, Google bought a cool little graphics business called Keyhole, which had been working on 3D maps. Along with the acquisition came Brian McClendon, aka “Bam”, a tall and serious Kansan who in a previous incarnation had supplied high-end graphics software that Hollywood used in films including Jurassic Park and Terminator 2. It turned out to be a very smart move.

Today McClendon is Google’s Mr Maps – presiding over one of the fastest-growing areas in the search giant’s business, one that has recently left arch-rival Apple red-faced and threatens to make Google the most powerful company in mapping the world has ever seen.

Google is throwing its considerable resources into building arguably the most comprehensive map ever made. It’s all part of the company’s self-avowed mission to organize all the world’s information, says McClendon.

“You need to have the basic structure of the world so you can place the relevant information on top of it. If you don’t have an accurate map, everything else is inaccurate,” he says.

It’s a message that will make Apple cringe. Apple triggered howls of outrage when it pulled Google Maps off the latest iteration of its iPhone software for its own bug-riddled and often wildly inaccurate map system. “We screwed up,” Apple boss Tim Cook said earlier this week.

McClendon won’t comment on when and if Apple will put Google’s application back on the iPhone. Talks are ongoing and he’s at pains to point out what a “great” product the iPhone is. But when – or if – Apple caves, it will be a huge climbdown. In the meantime, what McClendon really cares about is building a better map.

This is not the first time Google has made a landgrab in the real world, as the publishing industry will attest. Unhappy that online search was missing all the good stuff inside old books, Google – controversially – set about scanning the treasures of Oxford’s Bodleian library and some of the world’s other most respected collections.

Its ambitions in maps may be bigger, more far reaching and perhaps more controversial still. For a company developing driverless cars and glasses that are wearable computers, maps are a serious business. There’s no doubting the scale of McClendon’s vision. His license plate reads: ITLLHPN.

Until the 1980s, maps were still largely a pen and ink affair. Then mainframe computers allowed the development of geographic information system software (GIS), which was able to display and organise geographic information in new ways. By 2005, when Google launched Google Maps, computing power allowed GIS to go mainstream. Maps were about to change the way we find a bar, a parcel or even a story. Washington DC’s homicidewatch.org, for example, uses Google Maps to track and follow deaths across the city. Now the rise of mobile devices has pushed mapping into everyone’s hands and to the front line in the battle of the tech giants.

It’s easy to see why Google is so keen on maps. Some 20% of Google’s queries are now “location specific”. The company doesn’t split the number out but on mobile the percentage is “even higher”, says McClendon, who believes maps are set to unfold themselves ever further into our lives.

Google’s approach to making better maps is about layers. Starting with an aerial view, in 2007 Google added Street View, an on-the-ground photographic map snapped from its own fleet of specially designed cars that now covers 5 million of the 27.9 million miles of roads on Google Maps.

Google isn’t stopping there. The company has put cameras on bikes to cover harder-to-reach trails, and you can tour the Great Barrier Reef thanks to diving mappers. Luc Vincent, the Google engineer known as “Mr Street View”, carried a 40lb pack of snapping cameras down to the bottom of the Grand Canyon and then back up along another trail as fellow hikers excitedly shouted “Google, Google” at the man with the space-age backpack. McClendon has also played his part. He took his camera to Antarctica, taking 500 or more photos of a penguin-filled island to add to Google Maps. “The penguins were pretty oblivious. They just don’t care about people,” he says.

Now the company has projects called Ground Truth, which corrects errors online, and Map Maker, a service that lets people make their own maps. In the western world the product has been used to add a missing road or correct a one-way street that is pointing the wrong way, and to generally improve what’s already there. In Africa, Asia and other less well covered areas of the world, Google is – literally – helping people put themselves on the map.

In 2008, it could take six to 18 months for Google to update a map. The company would have to go back to the firm that provided its map information and get them to check the error, correct it and send it back. “At that point we decided we wanted to bring that information in house,” says McClendon. Google now updates its maps hundreds of times a day. Anyone can correct errors with roads signs or add missing roads and other details; Google double checks and relies on other users to spot mistakes.

Thousands of people use Google’s Map Maker daily to recreate their world online, says Michael Weiss-Malik, engineering director at Google Maps. “We have some Pakistanis living in the UK who have basically built the whole map,” he says. Using aerial shots and local information, people have created the most detailed, and certainly most up-to-date, maps of cities like Karachi that have probably ever existed. Regions of Africa and Asia have been added by map-mad volunteers.

[div class=attrib]Read the entire article following the jump.[end-div]

Art Basel: Cheese Expo, Pool Party or Art Show?

Simon Doonan, over at Slate, posits a simple question:

“How did the art world become such a vapid hell-hole of investment-crazed pretentiousness?”

In his scathing attack on the contemporary art scene replete with Twitter feeds, pool parties, and gallery-curated designer cheese, Doonan quite rightly asks why window dressing and marketing have replaced artistry and craftsmanship. And, more importantly, has big money replaced great, new art?

As an example, the biggest news from Art Basel, the largest art show in the United States, is not art at all. Celebrity contemporary artist Jeff Koons has defected to a rival gallery from his previous home with Larry Gagosian. Gagosian, to the art cognoscenti, is the “world’s most powerful art dealer”.

[div class=attrib]From Slate:[end-div]

Freud said the goals of the artist are fame, money, and beautiful lovers. Based on my artist acquaintances, I would say this holds true today. What have changed, however, are the goals of the art itself. Do any exist?

How did the art world become such a vapid hell-hole of investment-crazed pretentiousness? How did it become, as Camille Paglia has recently described it, a place where “too many artists have lost touch with the general audience and have retreated to an airless echo chamber”? (More from her in a moment.)

There are sundry problems bedeviling the contemporary art scene. Here are eight that spring readily to mind:

1. Art Basel Miami.

It’s baaa-ack, and I, for one, will not be attending. The overblown art fair in Miami—an offshoot of the original, held in Basel, Switzerland—has become a promo-party cheese-fest. All that craven socializing and trendy posing epitomize the worst aspects of today’s scene, provoking in me a strong desire to start a Thomas Kinkade collection. Whenever some hapless individual innocently asks me if I will be attending Art Basel—even though the shenanigans don’t start for another two weeks, I am already getting e-vites for pre-Basel parties—I invariably respond in Tourette’s mode:

“No. In fact, I would rather jump in a river of boiling snot, which is ironic since that could very well be the title of a faux-conceptual installation one might expect to see at Art Basel. Have you seen Svetlana’s new piece? It’s a river of boiling snot. No, I’m not kidding. And, guess what, Charles Saatchi wants to buy it and is duking it out with some Russian One Percent-er.”

2. Blood, poo, sacrilege, and porn.

Old-school ’70s punk shock tactics are so widespread in today’s art world that they have lost any resonance. As a result, twee paintings like Gainsborough’s Blue Boy and Constable’s Hay Wain now appear mesmerizing, mysterious, and wildly transgressive. And, as Camille Paglia brilliantly argues in her must-read new book, Glittering Images, this torrent of penises, elephant dung, and smut has not served the broader interests of art. By providing fuel for the Rush Limbaugh-ish prejudice that the art world is full of people who are shoving yams up their bums and doing horrid things to the Virgin Mary, art has, quoting Camille again, “allowed itself to be defined in the public eye as an arrogant, insular fraternity with frivolous tastes and debased standards.” As a result, the funding of school and civic arts programs has screeched to a halt and “American schoolchildren are paying the price for the art world’s delusional sense of entitlement.” Thanks a bunch, Karen Finley, Chris Ofili, Andres Serrano, Damien Hirst, and the rest of you naughty pranksters!

Any taxpayers not yet fully aware of the level of frivolity and debasement to which art has plummeted need look no further than the Museum of Modern Art, which recently hosted a jumbo garage-sale-cum-performance piece created by one Martha Rosler titled “Meta-Monumental Garage Sale.” Maybe this has some reverse-chic novelty for chi-chi arty insiders, but for the rest of us out here in the real world, a garage sale is just a garage sale.

8. Cool is corrosive.

The dorky uncool ’80s was a great time for art. The Harings, Cutrones, Scharfs, and Basquiats—life-enhancing, graffiti-inspired painters—communicated a simple, relevant, populist message of hope and flava during the darkest years of the AIDS crisis. Then, in the early ‘90s, grunge arrived, and displaced the unpretentious communicative culture of the ‘80s with the dour obscurantism of COOL. Simple fun and emotional sincerity were now seen as embarrassing and deeply uncool. Enter artists like Rachel barrel-of-laughs Whiteread, who makes casts of the insides of cardboard boxes. (Nice work if you can get it!)

A couple of decades on, art has become completely pickled in the vinegar of COOL, and that is why it is so irrelevant to the general population.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Untitled acrylic and mixed media on canvas by Jean-Michel Basquiat, 1984. Courtesy of Wikipedia.[end-div]

Fly Me to the Moon: Mere Millionaires Need Not Apply

Golden Spike, a Boulder, Colorado-based company, has an interesting proposition for the world’s restless billionaires. It is offering a two-seat trip to the Moon, and back, for a tidy sum of $1.5 billion. And, the company is even throwing in a moonwalk. The first trip is planned for 2020.

[div class=attrib]From the Washington Post:[end-div]

It had to happen: A start-up company is offering rides to the moon. Book your seat now — though it’s going to set you back $750 million (it’s unclear if that includes baggage fees).

At a news conference scheduled for Thursday afternoon in Washington, former NASA science administrator Alan Stern plans to announce the formation of Golden Spike, which, according to a news release, is “the first company planning to offer routine exploration expeditions to the surface of the Moon.”

“We can do this,” an excited Stern said Thursday morning during a brief phone interview.

The gist of the company’s strategy is that it’ll repurpose existing space hardware for commercial lunar missions and take advantage of NASA-sanctioned commercial rockets that, in a few years, are supposed to put astronauts in low Earth orbit. Stern said a two-person lunar mission, complete with moonwalking and, perhaps best of all, a return to Earth, would cost $1.5 billion.

“Two seats, 750 each,” Stern said. “The trick is 40 years old. We know how to do this. The difference is now we have rockets and space capsules in the inventory. … They’re already developed. … We don’t have to invent them from a clean sheet of paper. We don’t have to start over.”

The statement says, “The company’s plan is to maximize use of existing rockets and to market the resulting system to nations, individuals, and corporations with lunar exploration objectives and ambitions.” Golden Spike says its plans have been vetted by a former space shuttle commander, a space shuttle program manager and a member of the National Academy of Engineering.

And Newt Gingrich is involved: The former speaker of the House, who was widely mocked this year when, campaigning for president, he talked at length about ambitious plans for a permanent moon base by 2021, is listed as a member of Golden Spike’s board of advisers.

Also on that list is Bill Richardson, the former New Mexico governor and secretary of the Department of Energy. The chairman of the board is Gerry Griffin, a former Apollo mission flight director and former director of NASA’s Johnson Space Center.

The private venture fills a void, as it were, in the wake of President Obama’s decision to cancel NASA’s Constellation program, which was initiated during the George W. Bush years as the next step in space exploration after the retirement of the space shuttle. Constellation aimed to put astronauts back on the moon by 2020 for what would become extended stays at a lunar base.

A sweeping review from a presidential committee led by retired aerospace executive Norman Augustine concluded that NASA didn’t have the money to achieve Constellation’s goals. The administration and Congress have given NASA new marching orders that require the building of a heavy-lift rocket that would give the agency the ability to venture far beyond low Earth orbit.

Routine access to space is being shifted to companies operating under commercial contracts. But as those companies try to develop commercial spaceflight, the United States lacks the ability to launch astronauts directly and must purchase flights to the international space station from the Russians.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of The Golden Spike Company.[end-div]

A Star is Born, and its Solar System

A diminutive stellar blob some 450 light years away seems to be a young star giving birth to a planetary system much like our very own Solar System. The developing protostar and its surrounding gas cloud are being tracked by astronomers at the National Radio Astronomy Observatory in Charlottesville, Virginia. Stellar and planetary evolution in action.

[div class=attrib]From New Scientist:[end-div]

Swaddled in a cloud of dust and gas, the baby star shows a lot of potential. It is quietly sucking in matter from the cloud, which holds enough cosmic nourishment for the infant to grow as big and bright as our sun. What’s more, the star is surrounded by enough raw material to build at least seven planetary playmates.

Dubbed L1527, the star is still in the earliest stages of development, so it offers one of the best peeks yet at what our solar system may have looked like as it was taking shape.

The young star is currently one-fifth of the mass of the sun, but it is growing. If it has been bulking up at the same rate all its life, the star should be just 300,000 years old – a mere tyke compared to our 4.6-billion-year-old sun. But the newfound star may be even younger, because some theories say stars initially grow at a faster rate.
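
The age estimate is simple division: current mass over accretion rate. As a rough illustration only, and assuming a constant accretion rate of about 6.6e-7 solar masses per year (a value chosen here to be consistent with the article's figures, not one quoted in it), the sums work out as in this short sketch:

```python
# Rough check of the age estimate for protostar L1527.
# The accretion rate below is an assumed, illustrative value consistent
# with the article's numbers; it is not quoted in the article.

current_mass = 0.2        # solar masses: "one-fifth of the mass of the sun"
accretion_rate = 6.6e-7   # solar masses per year (assumed)

# If the star has accreted at this constant rate since birth,
# its age is simply its current mass divided by the rate.
age_years = current_mass / accretion_rate
print(f"Implied age: {age_years:,.0f} years")   # roughly 300,000 years
```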

Diminutive sun

The cloud feeding the protostar contains at least as much material as our sun, says John Tobin of the National Radio Astronomy Observatory in Charlottesville, Virginia.

“The key factor in determining a star’s characteristics is the mass, so L1527 could potentially grow to become similar to the sun,” says Tobin.

Material from the cloud is being funnelled to the star through a swirling disc that contains roughly 0.5 per cent the mass of the sun. That might not sound like a lot, but that’s enough mass to make up at least seven Jupiter-sized planets.

Previous observations of L1527 had hinted that a disc encircled the star, but it was not clear that the disc was rotating, which is an essential ingredient for planet formation. So Tobin and his colleagues took a closer look.

Good rotations

The team used radio observations to detect the presence of carbon monoxide around the star and watched how the material swirled around in the disc to trace its overall motion. They found that matter nearest to the star is rotating faster than material near the edge of the disc – a pattern that mirrors the way planets orbit a star.

“The dust and gas are orbiting the protostar much like how planets orbit the sun,” says Tobin. “Unfortunately there is no telling how many planets might form or how large they will be.”

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Protostar L1527. Courtesy of NASA / JPL, via tumblr.[end-div]

From Man’s Best Friend to a Girl’s Best Friend

Chances are that you have a pet. And, whether you’re a dog person or a cat person, a bird fancier or a lover of lizards, you’d probably mourn if you were to lose your furry, or feathery or scaly, friend. So, when your pet crosses over to the other side, why not pulverize her or him, filter out any non-carbon remains and then compress the results into, well, a diamond!

[div class=attrib]From WSJ:[end-div]

Natalie Pilon’s diamond is her best friend.

Every time she looks into the ring on her finger, Ms. Pilon sees Meowy, her late beloved silver cat. Meowy really is there: The ring’s two diamonds were made from her cremated remains.

“It’s a little eccentric—not something everyone would do,” says Ms. Pilon, a biotech sales representative in Boston, whose cat passed away last year. “It’s a way for me to remember my cat, and have her with me all the time.”

Americans have a long tradition of pampering and memorializing their pets. Now, technology lets precious friends become precious gems.

The idea of turning the carbon in ashes into man-made diamonds emerged a decade ago as a way to memorialize humans. Today, departed pets are fueling the industry’s growth, with a handful of companies selling diamonds, gemstones and other jewelry out of pet remains, including hair and feathers.

Some gems start at about $250, while pet diamonds cost at least $1,400, with prices based on color and size. The diamonds have the same physical properties as mined diamonds, purveyors say.

LifeGem, an Elk Grove Village, Ill., company, says it has made more than 1,000 animal diamonds in the past decade, mostly from dogs and cats but also a few birds, rabbits, horses and one armadillo. Customers truly can see facets of their pets, says Dean VandenBiesen, LifeGem’s co-founder, because “remains have some unique characteristics in terms of the ratios of elements, so no two diamonds are exactly alike.”

Jennifer Durante, 42 years old, of St. Petersburg, Fla., commissioned another company, Pet Gems, to create a light-blue zircon gemstone out of remains from her teacup Chihuahua, Tetley. “It reminds me of his eyes when the sun would shine into them,” she says.

Sonya Zofrea, a 42-year-old police officer in San Fernando, Calif., has two yellow diamonds to memorialize Baby, a black cat with yellow eyes who wandered into her life as a stray. The first contained a blemish, so maker LifeGem created another one free of charge with the cat’s ashes. But Ms. Zofrea felt the first reminded her most of her occasionally naughty kitty. “When I saw the imperfection, I thought, that’s just her,” says Ms. Zofrea. “She’s an imperfect little soul, aren’t we all?”

A spokesman for the Gemological Institute of America declined to comment on specific companies or processes, but said that synthetic diamonds, like naturally occurring ones, are made of carbon. “That carbon could come from the remains of a deceased pet,” he said.

Producing a one-carat diamond requires less than a cup of ashes or unpacked hair. Sometimes, companies add outside carbon if there isn’t enough.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of Google search.[end-div]