Best Science Stories of 2012

As the year comes to a close, it’s fascinating to look back at some of the most breathtaking science of 2012.

The image above is of Saturn’s moon Enceladus. Evidence from the Cassini spacecraft, which took this remarkable image, suggests a deep, salty ocean beneath the frozen surface that periodically spews icy particles into space. Many scientists believe that Enceladus is the best place in our Solar System to look for signs of life beyond Earth.

Read the entire article following the jump.

Image courtesy of Cassini Imaging Team/SSI/JPL/ESA/NASA.


So You Wanna Be a Rockstar?

Many of us harbor dreams, often secret ones, of becoming a famous rockstar. Well, if you want to live well past middle age, think again. Being a rockstar and living a long life are not statistically compatible, especially if you’re American. You choose.

From ars technica:

Hedonism. Substance abuse. Risky behavior. Rock stars from Elvis Presley to Amy Winehouse have ended up famous not only for their music but for the decadent lifestyle it enabled, one that eventually contributed to their deaths. But how much does the rock lifestyle really hurt?

Quite a bit. That’s the conclusion of a new study that tracked nearly 1,500 chart-topping musicians and found that their life expectancy after fame really was lower than that of the general population. North American solo musicians seem to have it especially bad.

This wasn’t necessarily what you’d expect. A huge number of studies have shown that wealth is generally associated with greater longevity, possibly as a result of better health care, better diet, and lower stress. Not only are rock musicians dying faster than the general populace, but they’re completely negating the impact of any wealth that their fame brought to them.

To get a collection of rock stars for their study, the authors combed the charts and took advantage of a large poll that listed the top 1,000 albums of all time. Altogether, their subjects reached fame between the years of 1956 and 2006 and included everyone from Elvis Presley to Regina Spektor to the Arctic Monkeys. From there, the authors searched the news and Wikipedia, looking for reports of death. With that information in hand, they compared the artists’ life expectancies to those of the general population.

Only about two-thirds of North American stars were still alive 40 years after their first brush with fame, compared with about 80 percent of a matched population—and there was never a point at which they outlived their non-famous peers. Typically, Europeans have greater life expectancies, but European stars did not, tracking the longevity of average North Americans for the first few decades.

Oddly, however, once they survived 20 years after hitting the big time, European rock stars started to do better, outliving the typical North American. And, by 35 years, they caught up with the average European’s life expectancy. (No word from the authors on whether this trend would stay the same if the analysis excluded the members of the Rolling Stones.) On both continents, solo performers did worse than members of a band.

So what’s killing the famous? The authors identified cause of death wherever possible and classified it as either “other” or “substance use or risk-related deaths.” The latter category included “drug or alcohol-related chronic disorder, overdose or accident, and other risk-related causes that may or may not have been related to substance use, i.e., suicide and violence.” They also tried to determine (using biographical data) whether any of the deceased stars had suffered adverse childhood experiences, such as a substance-abusing or mentally ill parent.

Of those without any obvious childhood issues, under a third died of substance abuse or other risky behavior. Adding a single adverse childhood influence raised that rate to 42 percent. Two or more adverse events, and the rate shot up to about 80 percent.

These same sorts of childhood problems tend to lead to substance abuse and other troubles in the general population as well, and the authors conclude that the hedonism we associate with rock stars is less a lifestyle choice and more an outcome of early life issues.

Read the entire article after the jump.

Image: Spinal Tap backstage at CBGB’s in New York City. Photograph: Ebet Roberts/Redferns / Guardian.


For Sale – Year in Review

Now is the time of year to review all that has passed during 2012. You know how it goes: celebrity marriages, celebrity divorces, extreme weather records, deaths, best and worst movies. Our favorite moments come courtesy of postings on Craigslist. Annually, Craigslist users nominate their favorites for inclusion in the “Best Of” category. A recent favorite of ours from Pensacola, Florida:

guy with skid mark, bought gallon of whole milk, circle k – w4m

i was in my bikini at the circle k, you came in with your short shirt and your bike shorts on. they were white and you had a pretty sexy skid mark staining your behind. you got 11 sticks of beef jerky and a gallon of whole milk, then rode off on your bicycle. i will know its you because you paid in pennies.

From Wired:

Homer Simpson’s famous ode to alcohol—”The cause of, and solution to, all of life’s problems”—might apply in equal measure to Craigslist, the wildly popular, barebones site where one can find all of life’s problems and solutions, including: a freelance writing gig, roommates, a sex partner, a man-sized fiberglass chili pepper, a lifetime supply of hot sauce, and coffee beans that have been ingested, digested, and excreted by someone living in Portland.

Each year, Craigslist users across the country flag their favorite classified ads for inclusion in the “best of” category. The bar to inclusion is high, but somehow each year America comes through with memorable postings that remind us just why we went ahead with this whole Web 2.0 thing.

This year was no exception. Here are a few of our favorites.

Paging Michelangelo

“Artist needed. Must love owls,” said one September post, which had something quite specific in mind.

We need an artist to depict the following: an owl skeleton with a parrot on its shoulder. The parrot is not a skeleton and is very colorful. The parrot has a peg leg, with a pirate hat on. The owl has an eye patch and a gold chain necklace with a skull on the pendant of said necklace. The skull in the pendant has an eye patch on the opposite eye of the owl (long story there don’t ask). The owl skeleton also has on a wizard’s hat with that typical wizard hat wrinkle. The owl is standing on a cowboy hat from a whale’s spout. This all is within a snow globe. That santa is holding with his only good hand because his other hand is a hook. Mrs. Clause is pulling on Mr. Clause’s coat with one of those dinosaur mouth grabbers that all 80’s children know.

The artist who could handle the commission would get both some cash and “a prize.”

(Side note: the oddly specific nature of this image request parallels those often received by our own creative director, Aurich Lawson, who has fielded article image suggestions that make this one look absolutely normal by comparison.)

Needed: one lap for aging cat

Next up, the “feline lap surrogate,” which I want to believe is a joke but fear is not. This job post is exactly what it sounds like, viz., the surrogate goes to a home each morning from 8am-12pm and gets paid $15 an hour to sit in a chair and “allow my cat to sit on their lap (the cat is attention seeking, and has been decreasing my productivity as of late).” The ideal candidate must have cat handling experience and no allergies.

“I do not need anyone in the afternoon since the sun warms the window sill by that point, and the cat will prefer the window sill to a lap,” the ad concludes. “Breakfast and lunch will be provided each day.”

Read the entire article after the jump.


The Missing Linc

LincRNA that is. Recent discoveries hint at the potentially crucial role of this new class of genetic material in embryonic development, cell and tissue differentiation and even speciation and evolution.

From the Economist:

THE old saying that where there’s muck, there’s brass has never proved more true than in genetics. Once, and not so long ago, received wisdom was that most of the human genome—perhaps as much as 99% of it—was “junk”. If this junk had a role, it was just to space out the remaining 1%, the genes in which instructions about how to make proteins are encoded, in a useful way in the cell nucleus.

That, it now seems, was about as far from the truth as it is possible to be. The decade or so since the completion of the Human Genome Project has shown that lots of the junk must indeed have a function. The culmination of that demonstration was the publication, in September, of the results of the ENCODE project. This suggested that almost two-thirds of human DNA, rather than just 1% of it, is being copied into molecules of RNA, the chemical that carries protein-making instructions to the sub-cellular factories which turn those proteins out, and that as a consequence, rather than there being just 23,000 genes (namely, the bits of DNA that encode proteins), there may be millions of them.

The task now is to work out what all these extra genes are up to. And a study just published in Genome Biology, by David Kelley and John Rinn of Harvard University, helps do that for one new genetic class, a type known as lincRNAs. In doing so, moreover, Dr Kelley and Dr Rinn show just how complicated the modern science of genetics has become, and hint also at how animal species split from one another.

Lincs in the chain

Molecules of lincRNA are similar to the messenger-RNA molecules which carry protein blueprints. They do not, however, encode proteins. More than 9,000 sorts are known, and most of those whose job has been tracked down are involved in the regulation of other genes, for example by attaching themselves to the DNA switches that control those genes.

LincRNA is rather odd, though. It often contains members of a second class of weird genetic object. These are called transposable elements (or, colloquially, “jumping genes”, because their DNA can hop from one place to another within the genome). Transposable elements come in several varieties, but one group of particular interest are known as endogenous retroviruses. These are the descendants of ancient infections that have managed to hide away in the genome and get themselves passed from generation to generation along with the rest of the genes.

Dr Kelley and Dr Rinn realised that the movement within the genome of transposable elements is a sort of mutation, and wondered if it has evolutionary consequences. Their conclusion is that it does, for when they looked at the relation between such elements and lincRNA genes, they found some intriguing patterns.

In the first place, lincRNAs are much more likely to contain transposable elements than protein-coding genes are. More than 83% do so, in contrast to only 6% of protein-coding genes.

Second, those transposable elements are particularly likely to be endogenous retroviruses, rather than any of the other sorts of element.

Third, the interlopers are usually found in the bit of the gene where the process of copying RNA from the DNA template begins, suggesting they are involved in switching genes on or off.

And fourth, lincRNAs containing one particular type of endogenous retrovirus are especially active in pluripotent stem cells, the embryonic cells that are the precursors of all other cell types. That indicates these lincRNAs have a role in the early development of the embryo.

Previous work suggests lincRNAs are also involved in creating the differences between various sorts of tissue, since many lincRNA genes are active in only one or a few cell types. Given that their principal job is regulating the activities of other genes, this makes sense.

Even more intriguingly, studies of lincRNA genes from species as diverse as people, fruit flies and nematode worms, have found they differ far more from one species to another than do protein-coding genes. They are, in other words, more species specific. And that suggests they may be more important than protein-coding genes in determining the differences between those species.

Read the entire article after the jump.

Image: Darwin’s finches or Galapagos finches. Darwin, 1845. Courtesy of Wikipedia.


British? May the Force be With You

Recent census figures from the United Kingdom show that Jedi is the seventh most popular faith overall, with just over 176,000 followers.

While this is down from a high of around 400,000 in the previous census (2001) it does suggest that George Lucas, creator of the Star Wars franchise, would still be a good stand-in for God in some parts of the U.K.

To learn more about Jediism point your browser here.

From the Telegraph:

The new figures reveal that the lightsabre-wielding disciples are only behind Christianity, Islam, Hinduism, Sikhism, Judaism and Buddhism in the popularity stakes, excluding non-religious people and people who did not answer.

Following a nationwide campaign, Jedi made it onto the 2001 census, with 390,127 people identifying themselves a decade ago as followers of the fictional Star Wars creed.

Although the number of Jedis has dropped by more than 50 per cent over the past 10 years, they are still the most selected “alternative” faith on the Census, and constitute 0.31% of all people’s stated religious affiliation in England and Wales.

The latest official population survey also revealed 6,242 people subscribe to the Heavy Metal religion, which was set up in 2010 by the Rock magazine, Metal Hammer.

The number of people specifically identifying as Atheists was 29,267, while over 13.8 million refused to identify with a faith at all, ticking the “No religion” box on the census form.

Norwich was revealed as the area with the highest proportion of non-religious people, with 41.5% of residents refusing to identify with a faith. The city also possesses the highest proportion of Heavy Metal followers and the 3rd highest proportion of Jedi Knights.

Other non-mainstream religions that had followers in significant numbers included 56,620 Paganists, 39,061 Spiritualists, 2,418 Scientologists and 20,288 Jainists, some of whom sweep the floor with a broom made of cotton threads as they walk along so as not to kill any insects.

Read the entire article after the jump.

Image: Star Wars Jedi Knights, Qui-Gon Jinn and Obi-Wan Kenobi. Courtesy of Wikipedia / Lucas Films.


Rivers of Methane

The image shows what looks like a satellite picture of a river delta, complete with tributaries. It could be the Nile or the Amazon river system as seen from space.

However, the image is not of an earthbound river at all. It’s a recently discovered river on Titan, Saturn’s largest moon. And the river’s contents are not water, but probably a mixture of liquid ethane and methane.

From NASA:

This image from NASA’s Cassini spacecraft shows a vast river system on Saturn’s moon Titan. It is the first time images from space have revealed a river system so vast and in such high resolution anywhere other than Earth. The image was acquired on Sept. 26, 2012, on Cassini’s 87th close flyby of Titan. The river valley crosses Titan’s north polar region and runs into Ligeia Mare, one of the three great seas in the high northern latitudes of Saturn’s moon Titan. It stretches more than 200 miles (400 kilometers).

Scientists deduce that the river is filled with liquid because it appears dark along its entire extent in the high-resolution radar image, indicating a smooth surface. That liquid is presumably ethane mixed with methane, the former having been positively identified in 2008 by Cassini’s visual and infrared mapping spectrometer at the lake known as Ontario Lacus in Titan’s southern hemisphere. Though there are some short, local meanders, the relative straightness of the river valley suggests it follows the trace of at least one fault, similar to other large rivers running into the southern margin of Ligeia Mare (see PIA10008). Such faults may lead to the opening of basins and perhaps to the formation of the giant seas themselves.

North is toward the top of this image.

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and ASI, the Italian Space Agency. NASA’s Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA’s Science Mission Directorate, Washington. The Cassini orbiter was designed, developed and assembled at JPL. The RADAR instrument was built by JPL and the Italian Space Agency, working with team members from the US and several European countries. JPL is a division of the California Institute of Technology in Pasadena.

Read the entire article following the jump.

Image courtesy of NASA/JPL-Caltech/ASI.


The Future of the Grid

Two common complaints dog the sustainable energy movement: first, energy from the sun and wind is intermittent; second, renewable energy is too costly. A new study debunks both notions, showing that cost-effective renewable energy could power our needs up to 99.9 percent of the time by 2030.

From ars technica:

You’ve probably heard the argument: wind and solar power are well and good, but what about when the wind doesn’t blow and the sun doesn’t shine? But it’s always windy and sunny somewhere. Given a sufficient distribution of energy resources and a large enough network of electrically conducting tubes, plus a bit of storage, these problems can be overcome—technologically, at least.

But is it cost-effective to do so? A new study from the University of Delaware finds that renewable energy sources can, with the help of storage, power a large regional grid for up to 99.9 percent of the time using current technology. By 2030, the cost of doing so will hit parity with current methods. Further, if you can live with renewables meeting your energy needs for only 90 percent of the time, the economics become positively compelling.

“These results break the conventional wisdom that renewable energy is too unreliable and expensive,” said study co-author Willett Kempton, a professor at the University of Delaware’s School of Marine Science and Policy. “The key is to get the right combination of electricity sources and storage—which we did by an exhaustive search—and to calculate costs correctly.”

By exhaustive, Kempton is referring to the 28 billion combinations of inland and offshore wind and photovoltaic solar sources combined with centralized hydrogen, centralized batteries, and grid-integrated vehicles analyzed in the study. The researchers deliberately overlooked constant renewable sources of energy such as geothermal and hydro power on the grounds that they are less widely available geographically.

These technologies were applied to a real-world test case: that of the PJM Interconnection regional grid, which covers parts of states from New Jersey to Indiana, and south to North Carolina. The model used hourly consumption data from the years 1999 to 2002; during that time, the grid had a generational capacity of 72GW catering to an average demand of 31.5GW. Taking in 13 states, either whole or in part, the PJM Interconnection constitutes one fifth of the USA’s grid. “Large” is no overstatement, even before considering more recent expansions that don’t apply to the dataset used.

The researchers constructed a computer model using standard solar and wind analysis tools. They then fed in hourly weather data from the region for the whole four-year period—35,040 hours worth. The goal was to find the minimum cost at which the energy demand could be met entirely by renewables for a given proportion of the time, based on the following game plan:

  1. When there’s enough renewable energy direct from source to meet demand, use it. Store any surplus.
  2. When there is not enough renewable energy direct from source, meet the shortfall with the stored energy.
  3. When there is not enough renewable energy direct from source, and the stored energy reserves are insufficient to bridge the shortfall, top up the remaining few percent of the demand with fossil fuels.
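The three-step game plan above amounts to a simple hourly dispatch rule. Here is a minimal, purely illustrative Python sketch of that rule, with made-up numbers; the actual study ran four years of hourly PJM weather and load data through its own cost model, so everything below (the function, the storage capacity, the toy inputs) is our assumption, not the researchers’ code.

```python
def dispatch(renewable, demand, store=0.0, capacity=50.0):
    """Apply the three-step rule for one hour.

    All quantities are in GW(h); returns the updated storage level
    and the fossil-fuel generation needed this hour.
    """
    if renewable >= demand:
        # Step 1: meet demand directly; store any surplus (up to capacity).
        store = min(capacity, store + (renewable - demand))
        fossil = 0.0
    else:
        shortfall = demand - renewable
        drawn = min(store, shortfall)   # Step 2: draw down stored energy
        store -= drawn
        fossil = shortfall - drawn      # Step 3: fossil fuels cover the rest
    return store, fossil

# Toy run: three hours of renewable generation against a flat 30 GW demand.
store = 0.0
for gen in (45.0, 20.0, 10.0):
    store, fossil = dispatch(gen, 30.0, store)
    print(f"gen={gen:5.1f}  store={store:5.1f}  fossil={fossil:5.1f}")
```

In the toy run, the first hour’s 15 GW surplus is banked, the second hour’s shortfall is covered entirely from storage, and only in the third hour, once storage is drained, do fossil fuels kick in.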

Perhaps unsurprisingly, the precise mix required depends upon exactly how much time you want renewables to meet the full load. Much more surprising is the amount of excess renewable infrastructure the model proposes as the most economic. To achieve a 90-percent target, the renewable infrastructure should be capable of generating 180 percent of the load. To meet demand 99.9 percent of the time, that rises to 290 percent.

“So much excess generation of renewables is a new idea, but it is not problematic or inefficient, any more than it is problematic to build a thermal power plant requiring fuel input at 250 percent of the electrical output, as we do today,” the study argues.

Read the entire article after the jump.

Image: Bangui Windfarm, Ilocos Norte, Philippines. Courtesy of Wikipedia.


Places to Visit Before World’s End

In case you missed all the apocalyptic hoopla, the world is supposed to end today. Now, if you’re reading this, you obviously still have a little time, since the Mayans apparently did not specify a precise time for the prophesied end. So, we highly recommend that you visit one or more of these beautiful places, immediately. Of course, if we’re all still here tomorrow, you will have some extra time to take in these breathtaking sights before the next planned doomsday.

Check out the top 100 places according to the Telegraph after the jump.

Image: Lapland for the northern lights. Courtesy of ALAMY / Telegraph.


E or I, T or F: 50 Years of Myers-Briggs

Two million people annually take the Myers-Briggs Type Indicator assessment. Over 10,000 businesses and 2,500 colleges in the United States use the test.

It’s very likely that you have taken the test at some point in your life: during high school, to get into university, or to secure your first job. The test categorizes people along four discrete axes (or dichotomies) of personality type: Extraversion (E) and Introversion (I); Sensing (S) and Intuition (N); Thinking (T) and Feeling (F); Judging (J) and Perceiving (P). If you have a partner, it’s likely that he or she has, at some time or another, (mis-)labeled you as an E or an I, as a “feeler” rather than a “thinker”, and so on. Countless arguments will have ensued.

From the Washington Post:

Some grandmothers pass down cameo necklaces. Katharine Cook Briggs passed down the world’s most widely used personality test.

Chances are you’ve taken the Myers-Briggs Type Indicator, or will. Roughly 2 million people a year do. It has become the gold standard of psychological assessments, used in businesses, government agencies and educational institutions. Along the way, it has spawned a multimillion-dollar business around its simple concept that everyone fits one of 16 personality types.

Now, 50 years after the first time anyone paid money for the test, the Myers-Briggs legacy is reaching the end of the family line. The youngest heirs don’t want it. And it’s not clear whether organizations should, either.

That’s not to say it hasn’t had a major influence.

More than 10,000 companies, 2,500 colleges and universities and 200 government agencies in the United States use the test. From the State Department to McKinsey & Co., it’s a rite of passage. It’s estimated that 50 million people have taken the Myers-Briggs personality test since the Educational Testing Service first added the research to its portfolio in 1962.

The test, whose first research guinea pigs were George Washington University students, has seen financial success commensurate to this cultlike devotion among its practitioners. CPP, the private company that publishes Myers-Briggs, brings in roughly $20 million a year from it and the 800 other products, such as coaching guides, that it has spawned.

Yet despite its widespread use and vast financial success, and although it was derived from the work of Carl Jung, one of the most famous psychologists of the 20th century, the test is highly questioned by the scientific community.

To begin even before its arrival in Washington: Myers-Briggs traces its history to 1921, when Jung, a Swiss psychiatrist, published his theory of personality types in the book “Psychologische Typen.” Jung had become well known for his pioneering work in psychoanalysis and close collaboration with Sigmund Freud, though by the 1920s the two had severed ties.

Psychoanalysis was a young field and one many regarded skeptically. Still, it had made its way across the Atlantic not only to the university offices of scientists but also to the home of a mother in Washington.

Katharine Cook Briggs was a voracious reader of the new psychology books coming out in Europe, and she shared her fascination with Jung’s latest work — in which he developed the concepts of introversion and extroversion — with her daughter, Isabel Myers. They would later use Jung’s work as a basis for their own theory, which would become the Myers-Briggs Type Indicator. MBTI is their framework for classifying personality types along four distinct axes: introversion vs. extroversion, sensing vs. intuition, thinking vs. feeling and judging vs. perceiving. A person, according to their hypothesis, has one dominant preference in each of the four pairs. For example, he might be introverted, a sensor, a thinker and a perceiver. Or, in Myers-Briggs shorthand, an “ISTP.”

Read the entire article following the jump.

Image: Keirsey Temperament Sorter, which utilizes Myers-Briggs dichotomies to group personalities into 16 types. Courtesy of Wikipedia.


Single-tasking is Human

If you’re an office worker, you will relate. Recently, you will have participated in a team meeting or conference call, only to have at least one person say, when asked a question, “sorry, can you please repeat that, I was multitasking.”

Many of us believe, or have been tricked into believing, that doing multiple things at once makes us more productive. This phenomenon was branded by the business world as multitasking. After all, if computers could do it, then why not humans? Yet experience shows that humans are woefully inadequate at performing multiple concurrent tasks that require dedicated attention. Of course, humans are experts at walking and chewing gum at the same time; however, in the majority of cases such activities require very little involvement from the higher functions of the brain. There is a growing body of anecdotal and experimental evidence showing poorer performance on multiple tasks done concurrently versus the same tasks performed sequentially. In fact, for quite some time, researchers have shown that dealing with multiple streams of information at once is a real problem for our limited brains.

Yet, most businesses seem to demand or reward multitasking behavior. And damagingly, the multitasking epidemic now seems to be the norm in the home as well.

From the WSJ:

In the few minutes it takes to read this article, chances are you’ll pause to check your phone, answer a text, switch to your desktop to read an email from the boss’s assistant, or glance at the Facebook or Twitter messages popping up in the corner of your screen. Off-screen, in your open-plan office, crosstalk about a colleague’s preschooler might lure you away, or a co-worker may stop by your desk for a quick question.

And bosses wonder why it is tough to get any work done.

Distraction at the office is hardly new, but as screens multiply and managers push frazzled workers to do more with less, companies say the problem is worsening and is affecting business.

While some firms make noises about workers wasting time on the Web, companies are realizing the problem is partly their own fault.

Even though digital technology has led to significant productivity increases, the modern workday seems custom-built to destroy individual focus. Open-plan offices and an emphasis on collaborative work leave workers with little insulation from colleagues’ chatter. A ceaseless tide of meetings and internal emails means that workers increasingly scramble to get their “real work” done on the margins, early in the morning or late in the evening. And the tempting lure of social-networking streams and status updates make it easy for workers to interrupt themselves.

“It is an epidemic,” says Lacy Roberson, a director of learning and organizational development at eBay Inc. At most companies, it’s a struggle “to get work done on a daily basis, with all these things coming at you,” she says.

Office workers are interrupted—or self-interrupt—roughly every three minutes, academic studies have found, with numerous distractions coming in both digital and human forms. Once thrown off track, it can take some 23 minutes for a worker to return to the original task, says Gloria Mark, a professor of informatics at the University of California, Irvine, who studies digital distraction.

Companies are experimenting with strategies to keep workers focused. Some are limiting internal emails—with one company moving to ban them entirely—while others are reducing the number of projects workers can tackle at a time.

Last year, Jamey Jacobs, a divisional vice president at Abbott Vascular, a unit of health-care company Abbott Laboratories, learned that his 200 employees had grown stressed trying to squeeze in more heads-down, focused work amid the daily thrum of email and meetings.

“It became personally frustrating that they were not getting the things they wanted to get done,” he says. At meetings, attendees were often checking email, trying to multitask and in the process obliterating their focus.

Part of the solution for Mr. Jacobs’s team was that oft-forgotten piece of office technology: the telephone.

Mr. Jacobs and productivity consultant Daniel Markovitz found that employees communicated almost entirely over email, whether the matter was mundane, such as cake in the break room, or urgent, like an equipment issue.

The pair instructed workers to let the importance and complexity of their message dictate whether to use cellphones, office phones or email. Truly urgent messages and complex issues merited phone calls or in-person conversations, while email was reserved for messages that could wait.

Workers now pick up the phone more, logging fewer internal emails and say they’ve got clarity on what’s urgent and what’s not, although Mr. Jacobs says staff still have to stay current with emails from clients or co-workers outside the group.

Read the entire article after the jump, and learn more in this insightful article on multitasking over at Big Think.

Image courtesy of Big Think.


Guns, Freedom and the Uncivil Society

Associate professor of philosophy Firmin DeBrabander argues that guns have no place in a civil society. Guns hinder free speech and free assembly for those at either end of the barrel. Guns fragment our society and undermine the sense and mechanisms of community. He is right.

From the New York Times:

The night of the shootings at Sandy Hook Elementary School in Newtown, Conn., I was in the car with my wife and children, working out details for our eldest son’s 12th birthday the following Sunday — convening a group of friends at a showing of the film “The Hobbit.” The memory of the Aurora movie theatre massacre was fresh in his mind, so he was concerned that it not be a late night showing. At that moment, like so many families, my wife and I were weighing whether to turn on the radio and expose our children to coverage of the school shootings in Connecticut. We did. The car was silent in the face of the flood of gory details. When the story was over, there was a long thoughtful pause in the back of the car. Then my eldest son asked if he could be homeschooled.

That incident brought home to me what I have always suspected, but found difficult to articulate: an armed society — especially as we prosecute it at the moment in this country — is the opposite of a civil society.

The Newtown shootings occurred at a peculiar time in gun rights history in this nation. On one hand, since the mid-1970s, fewer households each year on average have had a gun. Gun control advocates should be cheered by that news, but it is eclipsed by a flurry of contrary developments. As has been well publicized, gun sales have steadily risen over the past few years, and spiked with each of Obama’s election victories.

Furthermore, of the weapons that proliferate amongst the armed public, an increasing number are high caliber weapons (the weapon of choice in the goriest shootings in recent years). Then there is the legal landscape, which looks bleak for the gun control crowd.

Every state except for Illinois has a law allowing the carrying of concealed weapons — and just last week, a federal court struck down Illinois’ ban. States are now lining up to allow guns on college campuses. In September, Colorado joined four other states in such a move, and statehouses across the country are preparing similar legislation. And of course, there was Oklahoma’s ominous Open Carry Law approved by voters this election day — the fifteenth of its kind, in fact — which, as the name suggests, allows those with a special permit to carry weapons in the open, with a holster on their hip.

Individual gun ownership — and gun violence — has long been a distinctive feature of American society, setting us apart from the other industrialized democracies of the world. Recent legislative developments, however, are progressively bringing guns out of the private domain, with the ultimate aim of enshrining them in public life. Indeed, the N.R.A. strives for a day when the open carry of powerful weapons might be normal, a fixture even, of any visit to the coffee shop or grocery store — or classroom.

As N.R.A. president Wayne LaPierre expressed in a recent statement on the organization’s Web site, more guns equal more safety, by their account. A favorite gun rights saying is “an armed society is a polite society.” If we allow ever more people to be armed, at any time, in any place, this will provide a powerful deterrent to potential criminals. Or if more citizens were armed — like principals and teachers in the classroom, for example — they could halt senseless shootings ahead of time, or at least early on, and save society a lot of heartache and bloodshed.

As ever more people are armed in public, however — even brandishing weapons on the street — this is no longer recognizable as a civil society. Freedom vanishes at that point.

And yet, gun rights advocates famously maintain that individual gun ownership, even of high caliber weapons, is the defining mark of our freedom as such, and the ultimate guarantee of our enduring liberty. Deeper reflection on their argument exposes basic fallacies.

In her book “The Human Condition,” the philosopher Hannah Arendt states that “violence is mute.” According to Arendt, speech dominates and distinguishes the polis, the highest form of human association, which is devoted to the freedom and equality of its component members. Violence — and the threat of it — is a pre-political manner of communication and control, characteristic of undemocratic organizations and hierarchical relationships. For the ancient Athenians who practiced an incipient, albeit limited form of democracy (one that we surely aim to surpass), violence was characteristic of the master-slave relationship, not that of free citizens.

Arendt offers two points that are salient to our thinking about guns: for one, they insert a hierarchy of some kind, but fundamental nonetheless, and thereby undermine equality. But furthermore, guns pose a monumental challenge to freedom, and in particular, the liberty that is the hallmark of any democracy worthy of the name — that is, freedom of speech. Guns do communicate, after all, but in a way that is contrary to free speech aspirations: for, guns chasten speech.

This becomes clear if only you pry a little more deeply into the N.R.A.’s logic behind an armed society. An armed society is polite, by their thinking, precisely because guns would compel everyone to tamp down eccentric behavior, and refrain from actions that might seem threatening. The suggestion is that guns liberally interspersed throughout society would cause us all to walk gingerly — not make any sudden, unexpected moves — and watch what we say, how we act, whom we might offend.

As our Constitution provides, however, liberty entails precisely the freedom to be reckless, within limits, also the freedom to insult and offend as the case may be. The Supreme Court has repeatedly upheld our right to experiment in offensive language and ideas, and in some cases, offensive action and speech. Such experimentation is inherent to our freedom as such. But guns by their nature do not mix with this experiment — they don’t mix with taking offense. They are combustible ingredients in assembly and speech.

I often think of the armed protestor who showed up to one of the famously raucous town hall hearings on Obamacare in the summer of 2009. The media was very worked up over this man, who bore a sign that invoked a famous quote of Thomas Jefferson, accusing the president of tyranny. But no one engaged him at the protest; no one dared approach him even, for discussion or debate — though this was a town hall meeting, intended for just such purposes. Such is the effect of guns on speech — and assembly. Like it or not, they transform the bearer, and end the conversation in some fundamental way. They announce that the conversation is not completely unbounded, unfettered and free; there is or can be a limit to negotiation and debate — definitively.

Read the entire article after the jump.

Image courtesy of Wikipedia.

Blind Loyalty and the Importance of Critical Thinking

Two landmark studies in the 1960s and ’70s put behavioral psychology squarely in the public consciousness. The obedience experiments by Stanley Milgram and the Stanford Prison Experiment demonstrated how regular individuals could be made, quite simply, to obey figures in authority and to subject others to humiliation, suffering and pain.

A re-examination of these experiments and several recent similar studies has prompted a number of psychologists to offer a reinterpretation of the original conclusions. They suggest that humans may not be inherently evil after all. However, we remain dangerously flawed — our willingness to follow those in authority, especially those with whom we identify, makes us susceptible to believing in the virtue of actions that by all standards would be monstrous. It turns out that an open mind able to think critically may be the best antidote.

From the Pacific Standard:

They are among the most famous of all psychological studies, and together they paint a dark portrait of human nature. Widely disseminated in the media, they spread the belief that people are prone to blindly follow authority figures—and will quickly become cruel and abusive when placed in positions of power.

It’s hard to overstate the impact of Stanley Milgram’s obedience experiments of 1961, or the Stanford Prison Experiment of 1971. Yet in recent years, the conclusions derived from those studies have been, if not debunked, radically reinterpreted.

A new perspective—one that views human nature in a more nuanced light—is offered by psychologists Alex Haslam of the University of Queensland, Australia, and Stephen Reicher of the University of St. Andrews in Scotland.

In an essay published in the open-access journal PLoS Biology, they argue that people will indeed comply with the questionable demands of authority figures—but only if they strongly identify with that person, and buy into the rightness of those beliefs.

In other words, we’re not unthinking automatons. Nor are we monsters waiting for permission for our dark sides to be unleashed. However, we are more susceptible to psychological manipulation than we may realize.

In Milgram’s study, members of the general public were placed in the role of “teacher” and told that a “learner” was in a nearby room. Each time the “learner” failed to correctly recall a word as part of a memory experiment, the “teacher” was told to administer an electrical shock.

As the “learner” kept making mistakes, the “teacher” was ordered to give him stronger and stronger jolts of electricity. If a participant hesitated, the experimenter—an authority figure wearing a white coat—instructed him to continue.

Somewhat amazingly, most people did so: 65 percent of participants continued to give stronger and stronger shocks until the experiment ended with the “learner” apparently unconscious. (The torture was entirely fictional; no actual shocks were administered.)

To a world still reeling from the question of why so many Germans obeyed orders and carried out Nazi atrocities, here was a clear answer: We are predisposed to obey authority figures.

The Stanford Prison Experiment, conducted a few years later, was equally unnerving. Students were randomly assigned to assume the role of either prisoner or guard in a “prison” set up in the university’s psychology department. As Haslam and Reicher note, “such was the abuse meted out to the prisoners by the guards that the study had to be terminated after just six days.”

Lead author Philip Zimbardo, who assumed the role of “prison superintendent” with a level of zeal he later found frightening, concluded that brutality was “a natural consequence of being in the uniform of a guard and asserting the power inherent in that role.”

So is all this proof of the “banality of evil,” to use historian Hannah Arendt’s memorable phrase? Not really, argue Haslam and Reicher. They point to their own work on the BBC Prison Study, which mimicked the seminal Stanford study.

They found that participants “did not conform automatically to their assigned role” as prisoner or guard. Rather, there was a period of resistance, which ultimately gave way to a “draconian” new hierarchy. Before becoming brutal, the participants needed time to assume their new identities, and internalize their role in the system.

Once they did so, “the hallmark of the tyrannical regime was not conformity, but creative leadership and engaged followership within a group of true believers,” they write. “This analysis mirrors recent conclusions about the Nazi tyranny.”

Read the entire article after the jump.

The Habitable Exoplanets Catalog

The Habitable Exoplanets Catalog is a fascinating resource for those who dream of starting a new life on a distant world. Only a year into its existence, the catalog now lists 7 planets outside of our solar system and within our own Milky Way galaxy that could become a future home for adventurous humans — complaints from existing inhabitants notwithstanding. The closest at the moment — Gliese 581g, at a distance of just over 20 light years — would take around 200,000 years to reach using current technology.
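That 200,000-year figure is easy to sanity-check with a back-of-the-envelope calculation. The sketch below assumes a cruise speed of about 30 km/s — an illustrative value in the range of the fastest spacecraft flown to date, not a figure from the catalog:

```python
# Rough travel-time estimate for a probe to Gliese 581g.
# All values below are illustrative assumptions, not catalog data.
LIGHT_YEAR_KM = 9.4607e12    # kilometres in one light year
DISTANCE_LY = 20.3           # approximate distance to Gliese 581g
PROBE_SPEED_KM_S = 30.0      # assumed cruise speed in km/s, roughly
                             # the fastest achieved by space probes
SECONDS_PER_YEAR = 3.156e7

distance_km = DISTANCE_LY * LIGHT_YEAR_KM
travel_seconds = distance_km / PROBE_SPEED_KM_S
travel_years = travel_seconds / SECONDS_PER_YEAR

print(f"{travel_years:,.0f} years")  # on the order of 200,000 years
```

A faster assumed speed shortens the trip proportionally, but even at double this speed the journey still takes roughly a hundred thousand years — which is the article’s point.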

From the Independent:

An ambitious project to catalogue every habitable planet has discovered seven worlds inside the Milky Way that could possibly harbour life.

Marking its first anniversary, the Habitable Exoplanets Catalog said it had far exceeded its expectation of adding one or two new planets this year in its search for a new earth.

In recent years scientists from the Puerto Rico-based Planetary Habitability Laboratory that runs the catalogue have sharpened their techniques for finding new planets outside our solar system.

Chile’s High Accuracy Radial Velocity Planet Searcher and the orbiting Kepler Space Telescope are two of the many tools that have increased the pace of discoveries.

The Planetary Habitability Laboratory launched the Habitable Exoplanets Catalog last year to measure the suitability for life of these emerging worlds and as a way to organise them for the public.

It has found nearly 80 confirmed exoplanets of a similar size to Earth, but only a few of those are at the right distance from their star to support liquid surface water – the presence of which is considered essential to sustain life.

Seven potentially habitable exoplanets are now listed by the Habitable Exoplanets Catalog, including the disputed Gliese 581g, plus some 27 more from NASA Kepler candidates waiting for confirmation.

Although all these exoplanets are superterrans and are considered potentially habitable, scientists have not yet found a true Earth analogue.

Read the entire article following the jump.

Image: Current Potential Habitable Exoplanets. Courtesy of PHL @ UPR Arecibo.

Apocalypse Now… First, Brew Some Tea

We love stories of dystopian futures, apocalyptic prophecies and nightmarish visions here at theDiagonal. For some of our favorite articles on the end of days, check out end of world predictions, and how the world may end.

The next impending catastrophe is due a mere week from now, on December 21st, 2012, according to Mayan-watchers. So, of course, it’s time to make final preparations for the end of the world, again. Not to be outdone by the Mayans, the British, guardians of the famously stiff upper lip, have some timely advice for doomsayers and doomsday aficionados. After all, only the British could come up with a propaganda poster during the Second World War emblazoned with “Keep Calm and Carry On”. While there is some very practical advice, such as “leave extra time for journeys”, we find fault with the British authorities for not suggesting “take time to make a good, strong cup of tea”.

From the Independent:

With the world edging ever closer to what some believe could be an end of days catastrophe that will see the planet and its inhabitants destroyed, British authorities have been issuing tongue in cheek advice on how to prepare.

The advice comes just two weeks ahead of the day that some believe will mark the end of the world.

According to some interpretations of the ancient Mayan calendar the 21st of December will signal the end of a 5,125-year cycle known as the Long Count – and will bring about the apocalypse.

There have been scattered reports of panic buying of candles and essentials in China and Russia. There has also been a reported hike in the sales of survival shelters in America.

An official US government blog was published last week saying it was “just rumours” and insisting that “the world will not end on December 21, 2012, or any day in 2012”.

In France, authorities have even taken steps to prevent access to Bugarach mountain, which is thought by some to be a sacred place that will protect them from the end of the world.

Reports claimed websites in the US were selling tickets to access the mountain on the 21st.

In the UK, however, the impending apocalypse is being treated with deadpan humour by some organisations.

The AA has advised: “Before heading off, take time to do the basic checks on your car and allow extra time for your journey.

“Local radio is a good source of traffic and weather updates and for any warnings of an impending apocalypse. Should the announcer break such solemn news, try to remain focused on the road ahead and keep your hands on the wheel.”

A London Fire Brigade spokesman issued the following advice: “Fit a smoke alarm on each level of your home, then at least you might stand a chance of knowing that the end of the world is nigh ahead of those who don’t.

“If you survive the apocalypse you’ll be alerted to a fire more quickly should one ever break out.”

An RSPCA [Royal Society for the Prevention of Cruelty to Animals] spokesman offered advice for animal lovers ahead of apocalypse saying: “Luckily for animals, they do not have the same fears of the future – or its imminent destruction – as us humans, so it is unlikely that our pets will be worrying about the end of the world.

Read the entire article after the jump.

Image: Digital scan of original KEEP CALM AND CARRY ON poster owned by wartimeposters.co.uk. Courtesy of Wikipedia.

We The People… Want a Twinkie

The old adage, “be careful what you wish for, lest it come true”, shows that desires may well come to fruition, but often have unintended consequences. In this case, for the White House. A couple of years ago the administration launched an online drive to foster dialogue and participation in civic affairs. Known as “We the People: Your Voice in Our Government” the program allows individuals to petition the government on any important issue of the day. And, while White House officials may have had in mind a discussion of substantive issues, many petitions are somewhat more off the wall. Some of our favorite, colorful petitions, many of which have garnered thousands of signatures to date, include:

“Legalize home distillation for home spirits!”

“Secure resources and funding, and begin construction of a Death Star by 2016.”

“Nationalize the Twinkie industry.”

“Peacefully grant the State of Texas to withdraw from the United States of America and create its own NEW government.”

“Peacefully grant the city of Austin Texas to withdraw from the state of Texas & remain part of the United States.”

“Allow the city of El Paso to secede from the state of Texas. El Paso is tired of being a second class city within Texas.”

“Legalize the use of DMT, magic mushrooms, and mescaline for all people.”

“Outlaw offending prophets of major religions.”

“Legally recognize the tea party as a hate group and remove them from office for treason against the United States.”

“Give us back our incandescent lightbulbs! We, the undersigned, want the freedom to choose our own lightbulbs.”

“Create and Approve The MICHAEL JOSEPH JACKSON National Holiday.”

From the Washington Post:

Forget the “fiscal cliff”: When it comes to the nation’s most pressing concerns, other matters trump financial calamity.

Several thousand Americans, for example, are calling on President Obama to nationalize the troubled Twinkies industry to prevent the loss of the snack cake’s “sweet creamy center.”

Thousands more have signed petitions calling on the White House to replace the courts with a single Hall of Justice, remove Jerry Jones as owner of the Dallas Cowboys, give federal workers a holiday on Christmas Eve, allow members of the military to put their hands in their pockets and begin construction of a “Star Wars”-style Death Star by 2016.

And that’s just within the past month.

The people have spoken, but it might not be what the Obama administration expected to hear. More than a year after it was launched, an ambitious White House online petition program aimed at encouraging civic participation has become cluttered with thousands of demands that are often little more than extended Internet jokes. Interest has escalated in the wake of Obama’s reelection, which spurred more than a dozen efforts from tens of thousands of petitioners seeking permission for their states to secede from the union.

The idea, dubbed “We the People” and modeled loosely on a British government program, was meant to encourage people to exercise their First Amendment rights by collecting enough electronic signatures to meet a threshold that would guarantee an official administration response. (The level was initially set at 5,000 signatures, but that was quickly raised to 25,000 after the public responded a little too enthusiastically.)

Administration officials have spent federal time and tax dollars answering petitioner demands that the government recognize extraterrestrial life, allow online poker, legalize marijuana, remove “under God” from the Pledge of Allegiance and ban Rush Limbaugh from Armed Forces Network radio.

The last issue merited a formal response from the Defense Department: “AFN does not censor content, and we believe it is important that service members have access to a variety of viewpoints,” spokesman Bryan G. Whitman wrote to the more than 29,000 people who signed the anti-Limbaugh petition.

The “We the People” program emerged in the news last week when petitioners demanded that Obama block an appearance at Sunday’s “Christmas in Washington” concert by Psy, the South Korean “Gangnam Style” singer who is under fire for anti-American lyrics. The program’s rules require that petitions relate to “current or potential actions or policies of the federal government,” prompting the White House to pull down the petition because Obama has no authority over booking at the privately run charitable event.

Read the entire article after the jump.

Image: We The People. U.S. Constitution. Courtesy of Wikipedia.

What Thomas Jefferson Never Said

Commentators of all political persuasions often cite Jefferson to add weight and gravitas to a particular point or position. Yet scholarly analysis shows that many quotes are incorrectly attributed to the Founding Father and 3rd President. Some examples of words never spoken or written by Jefferson:

“Dissent is the highest form of patriotism.”

“The democracy will cease to exist when you take away from those who are willing to work and give to those who would not.”

“My reading of history convinces me that most bad government results from too much government.”

“The beauty of the Second Amendment is that it will not be needed until they try to take it.”

From the WSJ:

Thomas Jefferson once famously wrote, “All tyranny needs to gain a foothold is for people of good conscience to remain silent.”

Or did he? Numerous social movements attribute the quote to him. “The Complete Idiot’s Guide to U.S. Government and Politics” cites it in a discussion of American democracy. Actor Chuck Norris’s 2010 treatise “Black Belt Patriotism: How to Reawaken America” uses it to urge conservatives to become more involved in politics. It is even on T-shirts and decals.

Yet the founding father and third U.S. president never wrote it or said it, insists Anna Berkes, a 33-year-old research librarian at the Jefferson Library at Monticello, his grand estate just outside Charlottesville, Va. Nor does he have any connection to many of the “Jeffersonian” quotes that politicians on both sides of the aisle have slung back and forth in recent years, she says.

“People will see a quote and it appeals to an opinion that they have and if it has Jefferson’s name attached to it that gives it more weight,” she says. “He’s constantly being invoked by people when they are making arguments about politics and actually all sorts of topics.”

A spokeswoman for the Guide’s publisher said it was looking into the quote. Mr. Norris’s publicist didn’t respond to requests for comment.

To counter what she calls rampant misattribution, Ms. Berkes is fighting the Internet with the Internet. She has set up a “Spurious Quotations” page on the Monticello website listing bogus quotes attributed to the founding father, a prolific writer and rhetorician who was the principal author of the Declaration of Independence.

The fake quotes posted and dissected on Monticello.org include “My reading of history convinces me that most bad government has grown out of too much government.” In detailed footnotes, Ms. Berkes says it resembles a line Jefferson wrote in an 1807 letter: “History, in general, only informs us what bad government is.” But she can’t find that exact quotation in any of his writings.

Another that graces many epicurean websites: “On a hot day in Virginia, I know nothing more comforting than a fine spiced pickle, brought up trout-like from the sparkling depths of the aromatic jar below the stairs of Aunt Sally’s cellar.”

Jefferson never said that either, says Ms. Berkes. The earliest reference to the quote comes from a 1922 speech by a man extolling the benefits of pickles, she says.

Jefferson is a “flypaper figure,” like Abraham Lincoln, Mark Twain, Winston Churchill and baseball player and manager Yogi Berra—larger-than-life figures who have fake or misattributed quotes stick to them all the time, says Ralph Keyes, an author of books about quotes wrongly credited to famous or historical figures.

Read the entire article after the jump.

Reproduction of the 1805 Rembrandt Peale painting of Thomas Jefferson, New York Historical Society. Courtesy of Wikipedia.

Climate Change: Not in My Neighborhood

It’s no surprise that in our daily lives we seek information that reinforces our perceptions, opinions and beliefs about the world around us. It’s also the case that if we do not believe in a particular position, we will overlook any evidence in our immediate surroundings that contradicts that disbelief — climate change is no different.

From ars technica:

We all know it’s hard to change someone’s mind. In an ideal, rational world, a person’s opinion about some topic would be based on several pieces of evidence. If you were to supply that person with several pieces of stronger evidence that point in another direction, you might expect them to accept the new information and agree with you.

However, this is not that world, and rarely do we find ourselves in a debate with Star Trek’s Spock. There are a great many reasons that we behave differently. One is the way we rate incoming information for trustworthiness and importance. Once we form an opinion, we rate information that confirms our opinion more highly than information that challenges it. This is one form of “motivated reasoning.” We like to think we’re right, and so we are motivated to come to the conclusion that the facts are still on our side.

Publicly contentious issues often put a spotlight on these processes—issues like climate change, for example. In a recent paper published in Nature Climate Change, researchers from George Mason and Yale explore how motivated reasoning influences whether people believe they have personally experienced the effects of climate change.

When it comes to communicating the science of global warming, a common strategy is to focus on the concrete here-and-now rather than the abstract and distant future. The former is easier for people to relate to and connect with. Glazed eyes are the standard response to complicated graphs of projected sea level rise, with ranges of uncertainty and several scenarios of future emissions. Show somebody that their favorite ice fishing spot is iced over for several fewer weeks each winter than it was in the late 1800s, though, and you might have their attention.

Public polls show that acceptance of a warming climate correlates with agreement that one has personally experienced its effects. That could be affirmation that personal experience is a powerful force for the acceptance of climate science. Obviously, there’s another possibility—that those who accept that the climate is warming are more likely to believe they’ve experienced the effects themselves, whereas those who deny that warming is taking place are unlikely to see evidence of it in daily life. That’s, at least partly, motivated reasoning at work. (And of course, this cuts both ways. Individuals who agree that the Earth is warming may erroneously interpret unrelated events as evidence of that fact.)

The survey used for this study was unique in that the same people were polled twice, two and a half years apart, to see how their views changed over time. For the group as a whole, there was evidence for both possibilities—experience affected acceptance, and acceptance predicted statements about experience.

Fortunately, the details were a bit more interesting than that. When you categorize individuals by engagement—essentially how confident and knowledgeable they feel about the facts of the issue—differences are revealed. For the highly-engaged groups (on both sides), opinions about whether climate is warming appeared to drive reports of personal experience. That is, motivated reasoning was prevalent. On the other hand, experience really did change opinions for the less-engaged group, and motivated reasoning took a back seat.

Read the entire article following the jump.

Image courtesy of the New York Times / Steen Ulrik Johannessen / Agence France-Presse — Getty Images.

Big Brother is Mapping You

One hopes that Google’s intention to “organize the world’s information” will remain benign for the foreseeable future. Yet, as more and more of our surroundings and movements are mapped and tracked online, and increasingly offline, it would be wise to remain ever vigilant. Many put up with the encroachment of advertisers and promoters into almost every facet of their daily lives as a necessary, modern evil. But where is the dividing line that separates an ignorable irritation from an invasion of privacy and a grab for control? For the paranoid amongst us, it may only be a matter of time before our digital footprints come under the increasing scrutiny, and control, of organizations with grander designs.

From the Guardian:

Eight years ago, Google bought a cool little graphics business called Keyhole, which had been working on 3D maps. Along with the acquisition came Brian McClendon, aka “Bam”, a tall and serious Kansan who in a previous incarnation had supplied high-end graphics software that Hollywood used in films including Jurassic Park and Terminator 2. It turned out to be a very smart move.

Today McClendon is Google’s Mr Maps – presiding over one of the fastest-growing areas in the search giant’s business, one that has recently left arch-rival Apple red-faced and threatens to make Google the most powerful company in mapping the world has ever seen.

Google is throwing its considerable resources into building arguably the most comprehensive map ever made. It’s all part of the company’s self-avowed mission to organize all the world’s information, says McClendon.

“You need to have the basic structure of the world so you can place the relevant information on top of it. If you don’t have an accurate map, everything else is inaccurate,” he says.

It’s a message that will make Apple cringe. Apple triggered howls of outrage when it pulled Google Maps off the latest iteration of its iPhone software for its own bug-riddled and often wildly inaccurate map system. “We screwed up,” Apple boss Tim Cook said earlier this week.

McClendon won’t comment on when and if Apple will put Google’s application back on the iPhone. Talks are ongoing and he’s at pains to point out what a “great” product the iPhone is. But when – or if – Apple caves, it will be a huge climbdown. In the meantime, what McClendon really cares about is building a better map.

This is not the first time Google has made a land grab in the real world, as the publishing industry will attest. Unhappy that online search was missing all the good stuff inside old books, Google – controversially – set about scanning the treasures of Oxford’s Bodleian library and some of the world’s other most respected collections.

Its ambitions in maps may be bigger, more far reaching and perhaps more controversial still. For a company developing driverless cars and glasses that are wearable computers, maps are a serious business. There’s no doubting the scale of McClendon’s vision. His license plate reads: ITLLHPN.

Until the 1980s, maps were still largely a pen and ink affair. Then mainframe computers allowed the development of geographic information system software (GIS), which was able to display and organise geographic information in new ways. By 2005, when Google launched Google Maps, computing power allowed GIS to go mainstream. Maps were about to change the way we find a bar, a parcel or even a story. Washington DC’s homicidewatch.org, for example, uses Google Maps to track and follow deaths across the city. Now the rise of mobile devices has pushed mapping into everyone’s hands and to the front line in the battle of the tech giants.

It’s easy to see why Google is so keen on maps. Some 20% of Google’s queries are now “location specific”. The company doesn’t split the number out but on mobile the percentage is “even higher”, says McClendon, who believes maps are set to unfold themselves ever further into our lives.

Google’s approach to making better maps is about layers. Starting with an aerial view, in 2007 Google added Street View, an on-the-ground photographic map snapped from its own fleet of specially designed cars that now covers 5 million of the 27.9 million miles of roads on Google Maps.

Google isn’t stopping there. The company has put cameras on bikes to cover harder-to-reach trails, and you can tour the Great Barrier Reef thanks to diving mappers. Luc Vincent, the Google engineer known as “Mr Street View”, carried a 40lb pack of snapping cameras down to the bottom of the Grand Canyon and then back up along another trail as fellow hikers excitedly shouted “Google, Google” at the man with the space-age backpack. McClendon has also played his part. He took his camera to Antarctica, taking 500 or more photos of a penguin-filled island to add to Google Maps. “The penguins were pretty oblivious. They just don’t care about people,” he says.

Now the company has projects called Ground Truth, which corrects errors online, and Map Maker, a service that lets people make their own maps. In the western world the product has been used to add a missing road or correct a one-way street that is pointing the wrong way, and to generally improve what’s already there. In Africa, Asia and other less well covered areas of the world, Google is – literally – helping people put themselves on the map.

In 2008, it could take six to 18 months for Google to update a map. The company would have to go back to the firm that provided its map information and get them to check the error, correct it and send it back. “At that point we decided we wanted to bring that information in house,” says McClendon. Google now updates its maps hundreds of times a day. Anyone can correct errors with road signs or add missing roads and other details; Google double checks and relies on other users to spot mistakes.

Thousands of people use Google’s Map Maker daily to recreate their world online, says Michael Weiss-Malik, engineering director at Google Maps. “We have some Pakistanis living in the UK who have basically built the whole map,” he says. Using aerial shots and local information, people have created the most detailed, and certainly most up-to-date, maps of cities like Karachi that have probably ever existed. Regions of Africa and Asia have been added by map-mad volunteers.

Read the entire article following the jump.

Send to Kindle

Art Basel: Cheese Expo, Pool Party or Art Show?

Simon Doonan, writing in Slate, posits a simple question:

“How did the art world become such a vapid hell-hole of investment-crazed pretentiousness?”

In his scathing attack on the contemporary art scene, replete with Twitter feeds, pool parties, and gallery-curated designer cheese, Doonan quite rightly asks why window dressing and marketing have replaced artistry and craftsmanship. And, more importantly, has big money replaced great, new art?

As an example, the biggest news from Art Basel, the biggest art show in the United States, is not art at all. Celebrity contemporary artist Jeff Koons has defected to a rival gallery from his previous home with Larry Gagosian, known to the art cognoscenti as the “world’s most powerful art dealer”.

From Slate:

Freud said the goals of the artist are fame, money, and beautiful lovers. Based on my artist acquaintances, I would say this holds true today. What have changed, however, are the goals of the art itself. Do any exist?

How did the art world become such a vapid hell-hole of investment-crazed pretentiousness? How did it become, as Camille Paglia has recently described it, a place where “too many artists have lost touch with the general audience and have retreated to an airless echo chamber”? (More from her in a moment.)

There are sundry problems bedeviling the contemporary art scene. Here are eight that spring readily to mind:

1. Art Basel Miami.

It’s baaa-ack, and I, for one, will not be attending. The overblown art fair in Miami—an offshoot of the original, held in Basel, Switzerland—has become a promo-party cheese-fest. All that craven socializing and trendy posing epitomize the worst aspects of today’s scene, provoking in me a strong desire to start a Thomas Kinkade collection. Whenever some hapless individual innocently asks me if I will be attending Art Basel—even though the shenanigans don’t start for another two weeks, I am already getting e-vites for pre-Basel parties—I invariably respond in Tourette’s mode:

“No. In fact, I would rather jump in a river of boiling snot, which is ironic since that could very well be the title of a faux-conceptual installation one might expect to see at Art Basel. Have you seen Svetlana’s new piece? It’s a river of boiling snot. No, I’m not kidding. And, guess what, Charles Saatchi wants to buy it and is duking it out with some Russian One Percent-er.”

2. Blood, poo, sacrilege, and porn.

Old-school ’70s punk shock tactics are so widespread in today’s art world that they have lost any resonance. As a result, twee paintings like Gainsborough’s Blue Boy and Constable’s Hay Wain now appear mesmerizing, mysterious, and wildly transgressive. And, as Camille Paglia brilliantly argues in her must-read new book, Glittering Images, this torrent of penises, elephant dung, and smut has not served the broader interests of art. By providing fuel for the Rush Limbaugh-ish prejudice that the art world is full of people who are shoving yams up their bums and doing horrid things to the Virgin Mary, art has, quoting Camille again, “allowed itself to be defined in the public eye as an arrogant, insular fraternity with frivolous tastes and debased standards.” As a result, the funding of school and civic arts programs has screeched to a halt and “American schoolchildren are paying the price for the art world’s delusional sense of entitlement.” Thanks a bunch, Karen Finley, Chris Ofili, Andres Serrano, Damien Hirst, and the rest of you naughty pranksters!

Any taxpayers not yet fully aware of the level of frivolity and debasement to which art has plummeted need look no further than the Museum of Modern Art, which recently hosted a jumbo garage-sale-cum-performance piece created by one Martha Rosler titled “Meta-Monumental Garage Sale.” Maybe this has some reverse-chic novelty for chi-chi arty insiders, but for the rest of us out here in the real world, a garage sale is just a garage sale.

8. Cool is corrosive.

The dorky uncool ’80s was a great time for art. The Harings, Cutrones, Scharfs, and Basquiats—life-enhancing, graffiti-inspired painters—communicated a simple, relevant, populist message of hope and flava during the darkest years of the AIDS crisis. Then, in the early ‘90s, grunge arrived, and displaced the unpretentious communicative culture of the ‘80s with the dour obscurantism of COOL. Simple fun and emotional sincerity were now seen as embarrassing and deeply uncool. Enter artists like Rachel barrel-of-laughs Whiteread, who makes casts of the insides of cardboard boxes. (Nice work if you can get it!)

A couple of decades on, art has become completely pickled in the vinegar of COOL, and that is why it is so irrelevant to the general population.

Read the entire article following the jump.

Image: Untitled acrylic and mixed media on canvas by Jean-Michel Basquiat, 1984. Courtesy of Wikipedia.

Send to Kindle

Fly Me to the Moon: Mere Millionaires Need Not Apply

Golden Spike, a Boulder, Colorado-based company, has an interesting proposition for the world’s restless billionaires. It is offering a two-seat trip to the Moon, and back, for a tidy sum of $1.5 billion. And the company is even throwing in a moonwalk. The first trip is planned for 2020.

From the Washington Post:

It had to happen: A start-up company is offering rides to the moon. Book your seat now — though it’s going to set you back $750 million (it’s unclear if that includes baggage fees).

At a news conference scheduled for Thursday afternoon in Washington, former NASA science administrator Alan Stern plans to announce the formation of Golden Spike, which, according to a news release, is “the first company planning to offer routine exploration expeditions to the surface of the Moon.”

“We can do this,” an excited Stern said Thursday morning during a brief phone interview.

The gist of the company’s strategy is that it’ll repurpose existing space hardware for commercial lunar missions and take advantage of NASA-sanctioned commercial rockets that, in a few years, are supposed to put astronauts in low Earth orbit. Stern said a two-person lunar mission, complete with moonwalking and, perhaps best of all, a return to Earth, would cost $1.5 billion.

“Two seats, 750 each,” Stern said. “The trick is 40 years old. We know how to do this. The difference is now we have rockets and space capsules in the inventory. … They’re already developed. … We don’t have to invent them from a clean sheet of paper. We don’t have to start over.”

The statement says, “The company’s plan is to maximize use of existing rockets and to market the resulting system to nations, individuals, and corporations with lunar exploration objectives and ambitions.” Golden Spike says its plans have been vetted by a former space shuttle commander, a space shuttle program manager and a member of the National Academy of Engineering.

And Newt Gingrich is involved: The former speaker of the House, who was widely mocked this year when, campaigning for president, he talked at length about ambitious plans for a permanent moon base by 2021, is listed as a member of Golden Spike’s board of advisers.

Also on that list is Bill Richardson, the former New Mexico governor and secretary of the Department of Energy. The chairman of the board is Gerry Griffin, a former Apollo mission flight director and former director of NASA’s Johnson Space Center.

The private venture fills a void, as it were, in the wake of President Obama’s decision to cancel NASA’s Constellation program, which was initiated during the George W. Bush years as the next step in space exploration after the retirement of the space shuttle. Constellation aimed to put astronauts back on the moon by 2020 for what would become extended stays at a lunar base.

A sweeping review from a presidential committee led by retired aerospace executive Norman Augustine concluded that NASA didn’t have the money to achieve Constellation’s goals. The administration and Congress have given NASA new marching orders that require the building of a heavy-lift rocket that would give the agency the ability to venture far beyond low Earth orbit.

Routine access to space is being shifted to companies operating under commercial contracts. But as those companies try to develop commercial spaceflight, the United States lacks the ability to launch astronauts directly and must purchase flights to the international space station from the Russians.

Read the entire article after the jump.

Image courtesy of The Golden Spike Company.

Send to Kindle

A Star is Born, and its Solar System

A diminutive stellar blob some 450 light years away seems to be a young star giving birth to a planetary system much like our very own Solar System. The developing protostar and its surrounding gas cloud are being tracked by astronomers at the National Radio Astronomy Observatory in Charlottesville, Virginia. Stellar and planetary evolution in action.

From New Scientist:

Swaddled in a cloud of dust and gas, the baby star shows a lot of potential. It is quietly sucking in matter from the cloud, which holds enough cosmic nourishment for the infant to grow as big and bright as our sun. What’s more, the star is surrounded by enough raw material to build at least seven planetary playmates.

Dubbed L1527, the star is still in the earliest stages of development, so it offers one of the best peeks yet at what our solar system may have looked like as it was taking shape.

The young star is currently one-fifth of the mass of the sun, but it is growing. If it has been bulking up at the same rate all its life, the star should be just 300,000 years old – a mere tyke compared to our 4.6-billion-year-old sun. But the newfound star may be even younger, because some theories say stars initially grow at a faster rate.
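The age estimate above is simple arithmetic: if the star has accreted at a constant rate since birth, its age is just its current mass divided by that rate. A back-of-envelope sketch, assuming the article's figures of 0.2 solar masses and 300,000 years (the implied accretion rate is my own illustration, not a number from the article):

```python
# Back-of-envelope check of the protostar age estimate for L1527.
M_SUN_KG = 1.989e30            # one solar mass in kilograms

current_mass = 0.2 * M_SUN_KG  # L1527's current mass (from the article)
assumed_age_years = 300_000    # the article's age estimate

# Implied average accretion rate, assuming constant growth since birth.
rate_msun_per_year = (current_mass / assumed_age_years) / M_SUN_KG

print(f"Implied accretion rate: {rate_msun_per_year:.1e} solar masses/year")
# If early accretion was faster, as some theories suggest, the same mass
# accumulates in less time, so the star would be younger than 300,000 years.
```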

Diminutive sun

The cloud feeding the protostar contains at least as much material as our sun, says John Tobin of the National Radio Astronomy Observatory in Charlottesville, Virginia.

“The key factor in determining a star’s characteristics is the mass, so L1527 could potentially grow to become similar to the sun,” says Tobin.

Material from the cloud is being funnelled to the star through a swirling disc that contains roughly 0.5 per cent of the mass of the sun. That might not sound like a lot, but that’s enough mass to make up at least seven Jupiter-sized planets.

Previous observations of L1527 had hinted that a disc encircled the star, but it was not clear that the disc was rotating, which is an essential ingredient for planet formation. So Tobin and his colleagues took a closer look.

Good rotations

The team used radio observations to detect the presence of carbon monoxide around the star and watched how the material swirled around in the disc to trace its overall motion. They found that matter nearest to the star is rotating faster than material near the edge of the disc – a pattern that mirrors the way planets orbit a star.

“The dust and gas are orbiting the protostar much like how planets orbit the sun,” says Tobin. “Unfortunately there is no telling how many planets might form or how large they will be.”
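The rotation pattern the team detected is the Keplerian signature: circular orbital speed falls off as the inverse square root of distance from the star, v = sqrt(GM/r), so inner material moves faster than outer material. A minimal sketch, assuming the article's 0.2 solar masses for the star (the radii are arbitrary illustrative values, not measurements from the study):

```python
import math

G = 6.674e-11                 # gravitational constant, m^3 kg^-1 s^-2
M_SUN_KG = 1.989e30           # one solar mass in kilograms
AU_M = 1.496e11               # one astronomical unit in metres

star_mass = 0.2 * M_SUN_KG    # L1527's current mass (from the article)

def keplerian_speed(r_au):
    """Circular orbital speed at radius r_au around the protostar, in km/s."""
    return math.sqrt(G * star_mass / (r_au * AU_M)) / 1000.0

# Material closer to the star orbits faster, just as planets do around the Sun.
inner, outer = keplerian_speed(10), keplerian_speed(100)
print(f"v(10 AU) = {inner:.2f} km/s, v(100 AU) = {outer:.2f} km/s")
```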

Read the entire article following the jump.

Protostar L1527. Courtesy of NASA / JPL, via tumblr.

Send to Kindle

From Man’s Best Friend to a Girl’s Best Friend

Chances are that you have a pet. And, whether you’re a dog person or a cat person, or a bird fancier or a lover of lizards, you’d probably mourn if you were to lose your furry, or feathery or scaly, friend. So, when your pet crosses over to the other side, why not pulverize her or him, filter out any non-carbon remains and then compress the results into, well, a diamond!

From WSJ:

Natalie Pilon’s diamond is her best friend.

Every time she looks into the ring on her finger, Ms. Pilon sees Meowy, her late beloved silver cat. Meowy really is there: The ring’s two diamonds were made from her cremated remains.

“It’s a little eccentric—not something everyone would do,” says Ms. Pilon, a biotech sales representative in Boston, whose cat passed away last year. “It’s a way for me to remember my cat, and have her with me all the time.”

Americans have a long tradition of pampering and memorializing their pets. Now, technology lets precious friends become precious gems.

The idea of turning the carbon in ashes into man-made diamonds emerged a decade ago as a way to memorialize humans. Today, departed pets are fueling the industry’s growth, with a handful of companies selling diamonds, gemstones and other jewelry out of pet remains, including hair and feathers.

Some gems start at about $250, while pet diamonds cost at least $1,400, with prices based on color and size. The diamonds have the same physical properties as mined diamonds, purveyors say.

LifeGem, an Elk Grove Village, Ill., company, says it has made more than 1,000 animal diamonds in the past decade, mostly from dogs and cats but also a few birds, rabbits, horses and one armadillo. Customers truly can see facets of their pets, says Dean VandenBiesen, LifeGem’s co-founder, because “remains have some unique characteristics in terms of the ratios of elements, so no two diamonds are exactly alike.”

Jennifer Durante, 42 years old, of St. Petersburg, Fla., commissioned another company, Pet Gems, to create a light-blue zircon gemstone out of remains from her teacup Chihuahua, Tetley. “It reminds me of his eyes when the sun would shine into them,” she says.

Sonya Zofrea, a 42-year-old police officer in San Fernando, Calif., has two yellow diamonds to memorialize Baby, a black cat with yellow eyes who wandered into her life as a stray. The first contained a blemish, so maker LifeGem created another one free of charge with the cat’s ashes. But Ms. Zofrea felt the first reminded her most of her occasionally naughty kitty. “When I saw the imperfection, I thought, that’s just her,” says Ms. Zofrea. “She’s an imperfect little soul, aren’t we all?”

A spokesman for the Gemological Institute of America declined to comment on specific companies or processes, but said that synthetic diamonds, like naturally occurring ones, are made of carbon. “That carbon could come from the remains of a deceased pet,” he said.

Producing a one-carat diamond requires less than a cup of ashes or unpacked hair. Sometimes, companies add outside carbon if there isn’t enough.

Read the entire article following the jump.

Image courtesy of Google search.

Send to Kindle

Voyager: A Gift that Keeps on Giving

The little space probe that could — Voyager I — is close to leaving our solar system and entering the relative void of interstellar space. As it does so, from a distance of around 18.4 billion kilometers (today), it continues to send back signals of what it finds. And, surprises continue.

From ars technica:

Several years ago the Voyager spacecraft neared the edge of the Solar System, where the solar wind and magnetic field started to be influenced by the pressure from the interstellar medium that surrounds them. But the expected breakthrough to interstellar space appeared to be indefinitely put on hold; instead, the particles and magnetic field lines in the area seemed to be sending mixed signals about the Voyagers’ escape. At today’s meeting of the American Geophysical Union, scientists offered an explanation: the durable spacecraft ran into a region that nobody predicted.

The Voyager probes were sent on a grand tour of the outer planets over 35 years ago. After a series of staggeringly successful visits to the planets, the probes shot out beyond the most distant of them toward the edges of the Solar System. Scientists expected that as they neared the edge, we’d see the charged particles of the solar wind changing direction as the interstellar medium alters the direction of the Sun’s magnetic field. But while some aspects of the Voyager’s environment have changed, we’ve not seen any clear indication that it has left the Solar System. The solar wind actually seems to be grinding to a halt.

Today’s announcement clarifies that the confusion was caused by the fact that nature didn’t think much of physicists’ expectations. Instead, there’s an additional region near our Solar System’s boundary that hadn’t been predicted.

Within the Solar System, the environment is dominated by the solar magnetic field and a flow of charged particles sent out by the Sun (called the solar wind). Interstellar space has its own flow of particles in the form of low-energy cosmic rays, which the Sun’s magnetic field deflects away from us. There’s also an interstellar magnetic field with field lines oriented in different directions to our Sun’s.

Researchers expected the Voyagers would reach a relatively clear boundary between the Solar System and interstellar space. The Sun’s magnetic field would first shift directions, then be left behind and the interstellar one would be detected. At the same time, we’d see the loss of the solar wind and start seeing the first low-energy cosmic rays.

As expected, a few years back, the Voyagers reached a region where the interstellar medium forced the Sun’s magnetic field lines to curve north. But the solar wind refused to follow suit. Instead of flowing north, the solar wind slowed to a halt while the cosmic rays were missing in action.

Over the summer, as Voyager 1 approached 122 astronomical units from the Sun, that started to change. Arik Posner of the Voyager team said that, starting in late July, Voyager 1 detected a sudden drop in the presence of particles from the solar wind, which went down by half. At the same time, the first low-energy cosmic rays filtered in. A few days later things returned to normal. A second drop occurred on August 15 and then, on August 28, things underwent a permanent shift. According to Tom Krimigis, particles originating from the Sun dropped by about 1,000-fold. Low-energy cosmic rays rose and stayed elevated.

Read the entire article following the jump.

Image: Voyager II. Courtesy of NASA / JPL.

Send to Kindle

National Emotions Mapped

Are Canadians as a people more emotional than Brazilians? Are Brits as emotional as Mexicans? While generalizing and mapping a nation’s emotionality is dubious at best, this map is nonetheless fascinating.

From the Washington Post:

Since 2009, the Gallup polling firm has surveyed people in 150 countries and territories on, among other things, their daily emotional experience. Their survey asks five questions, meant to gauge whether the respondent felt significant positive or negative emotions the day prior to the survey. The more times that people answer “yes” to questions such as “Did you smile or laugh a lot yesterday?”, the more emotional they’re deemed to be.
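Gallup's scoring method boils down to a simple average: the share of "yes" answers across the five daily-experience questions, pooled over all respondents. A minimal sketch of that tally (the question wording is from the article; the sample responses are made up for illustration):

```python
# Sketch of Gallup's "emotionality" score: the average percentage of "yes"
# answers across five daily-experience questions, pooled over respondents.

def emotionality_score(respondents):
    """Percentage of 'yes' (1) answers across all respondents' questions."""
    yes = sum(sum(r) for r in respondents)
    total = sum(len(r) for r in respondents)
    return 100.0 * yes / total

# Each inner list: one respondent's yes (1) / no (0) answers to five
# questions such as "Did you smile or laugh a lot yesterday?"
sample = [
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 0],
    [1, 1, 1, 1, 1],
]
print(f"{emotionality_score(sample):.1f}% yes")  # higher = 'more emotional'
```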

Gallup has tallied up the average “yes” responses from respondents in almost every country on Earth. The results, which I’ve mapped out above, are as fascinating as they are indecipherable. The color-coded key in the map indicates the average percentage of people who answered “yes.” Dark purple countries are the most emotional, yellow the least. Here are a few takeaways.

Singapore is the least emotional country in the world. “Singaporeans recognize they have a problem,” Bloomberg Businessweek writes of the country’s “emotional deficit,” citing a culture in which schools “discourage students from thinking of themselves as individuals.” They also point to low work satisfaction, competitiveness, and the urban experience: “Staying emotionally neutral could be a way of coping with the stress of urban life in a place where 82 percent of the population lives in government-built housing.”

The Philippines is the world’s most emotional country. It’s not even close; the heavily Catholic, Southeast Asian nation, a former colony of Spain and the U.S., scores well above second-ranked El Salvador.

Post-Soviet countries are consistently among the most stoic. Other than Singapore (and, for some reason, Madagascar and Nepal), the least emotional countries in the world are all former members of the Soviet Union. They are also the greatest consumers of cigarettes and alcohol. This could be what you call a chicken-or-egg problem: if the two trends are related, which one came first? Europe appears almost like a gradient here, with emotions increasing as you move West.

People in the Americas are just exuberant. Every nation on the North and South American continents ranked highly on the survey. The United States and Canada are both among the 15 most emotional countries in the world, as are ten Latin American nations. The only countries in the top 15 from outside the Americas, other than the Philippines, are the Arab nations of Oman and Bahrain, both of which rank very highly.

Read the entire article following the jump.

Send to Kindle

The Immortal Jellyfish

In 1988 a marine-biology student made a stunning discovery, though it was little publicized at the time. In the waters of the Italian Mediterranean, Christian Sommer found a small creature that resembled a jellyfish. It showed a very odd attribute — it refused to die. The true importance of this discovery did not become fully apparent until 1996, when a group of researchers found that this invertebrate, a hydrozoan known by its scientific name Turritopsis dohrnii, could at any point during its life cycle revert to an earlier stage, and then begin its development all over again. It was to all intents and purposes immortal.

For scientists seeking to unravel the mechanisms that underlie the aging process Turritopsis dohrnii — the immortal jellyfish — represents a truly significant finding. Might our progress in slowing or even halting aging in humans come from a lowly jellyfish? Time will tell.

From the New York Times:

After more than 4,000 years — almost since the dawn of recorded time, when Utnapishtim told Gilgamesh that the secret to immortality lay in a coral found on the ocean floor — man finally discovered eternal life in 1988. He found it, in fact, on the ocean floor. The discovery was made unwittingly by Christian Sommer, a German marine-biology student in his early 20s. He was spending the summer in Rapallo, a small city on the Italian Riviera, where exactly one century earlier Friedrich Nietzsche conceived “Thus Spoke Zarathustra”: “Everything goes, everything comes back; eternally rolls the wheel of being. Everything dies, everything blossoms again. . . .”

Sommer was conducting research on hydrozoans, small invertebrates that, depending on their stage in the life cycle, resemble either a jellyfish or a soft coral. Every morning, Sommer went snorkeling in the turquoise water off the cliffs of Portofino. He scanned the ocean floor for hydrozoans, gathering them with plankton nets. Among the hundreds of organisms he collected was a tiny, relatively obscure species known to biologists as Turritopsis dohrnii. Today it is more commonly known as the immortal jellyfish.

Sommer kept his hydrozoans in petri dishes and observed their reproduction habits. After several days he noticed that his Turritopsis dohrnii was behaving in a very peculiar manner, for which he could hypothesize no earthly explanation. Plainly speaking, it refused to die. It appeared to age in reverse, growing younger and younger until it reached its earliest stage of development, at which point it began its life cycle anew.

Sommer was baffled by this development but didn’t immediately grasp its significance. (It was nearly a decade before the word “immortal” was first used to describe the species.) But several biologists in Genoa, fascinated by Sommer’s finding, continued to study the species, and in 1996 they published a paper called “Reversing the Life Cycle.” The scientists described how the species — at any stage of its development — could transform itself back to a polyp, the organism’s earliest stage of life, “thus escaping death and achieving potential immortality.” This finding appeared to debunk the most fundamental law of the natural world — you are born, and then you die.

One of the paper’s authors, Ferdinando Boero, likened the Turritopsis to a butterfly that, instead of dying, turns back into a caterpillar. Another metaphor is a chicken that transforms into an egg, which gives birth to another chicken. The anthropomorphic analogy is that of an old man who grows younger and younger until he is again a fetus. For this reason Turritopsis dohrnii is often referred to as the Benjamin Button jellyfish.

Yet the publication of “Reversing the Life Cycle” barely registered outside the academic world. You might expect that, having learned of the existence of immortal life, man would dedicate colossal resources to learning how the immortal jellyfish performs its trick. You might expect that biotech multinationals would vie to copyright its genome; that a vast coalition of research scientists would seek to determine the mechanisms by which its cells aged in reverse; that pharmaceutical firms would try to appropriate its lessons for the purposes of human medicine; that governments would broker international accords to govern the future use of rejuvenating technology. But none of this happened.

Some progress has been made, however, in the quarter-century since Christian Sommer’s discovery. We now know, for instance, that the rejuvenation of Turritopsis dohrnii and some other members of the genus is caused by environmental stress or physical assault. We know that, during rejuvenation, it undergoes cellular transdifferentiation, an unusual process by which one type of cell is converted into another — a skin cell into a nerve cell, for instance. (The same process occurs in human stem cells.) We also know that, in recent decades, the immortal jellyfish has rapidly spread throughout the world’s oceans in what Maria Pia Miglietta, a biology professor at Notre Dame, calls “a silent invasion.” The jellyfish has been “hitchhiking” on cargo ships that use seawater for ballast. Turritopsis has now been observed not only in the Mediterranean but also off the coasts of Panama, Spain, Florida and Japan. The jellyfish seems able to survive, and proliferate, in every ocean in the world. It is possible to imagine a distant future in which most other species of life are extinct but the ocean will consist overwhelmingly of immortal jellyfish, a great gelatin consciousness everlasting.

Read the entire article following the jump.

Image of Turritopsis dohrnii, courtesy of Discovery News.

Send to Kindle

Steam Without Boiling Water

Despite what seems to be an overwhelmingly digital shift in our lives, we still live in a world of steam. Steam plays a vital role in generating most of the world’s electricity, steam heats our buildings (especially if you live in New York City), steam sterilizes our medical supplies.

So, in a research discovery with far-reaching implications, scientists have succeeded in making steam at room temperature without actually boiling water. All courtesy of some ingenious nanoparticles.

From Technology Review:

Steam is a key ingredient in a wide range of industrial and commercial processes—including electricity generation, water purification, alcohol distillation, and medical equipment sterilization.

Generating that steam, however, typically requires vast amounts of energy to heat and eventually boil water or another fluid. Now researchers at Rice University have found a shortcut. Using light-absorbing nanoparticles suspended in water, the group was able to turn the water molecules surrounding the nanoparticles into steam while scarcely raising the temperature of the remaining water. The trick could dramatically reduce the cost of many steam-reliant processes.

The Rice team used a Fresnel lens to focus sunlight on a small tube of water containing high concentrations of nanoparticles suspended in the fluid. The water, which had been cooled to near freezing, began generating steam within five to 20 seconds, depending on the type of nanoparticles used. Changes in temperature, pressure, and mass revealed that 82 percent of the sunlight absorbed by the nanoparticles went directly to generating steam while only 18 percent went to heating water.

“It’s a new way to make steam without boiling water,” says Naomi Halas, director of the Laboratory for Nanophotonics at Rice University. Halas says that the work “opens up a lot of interesting doors in terms of what you can use steam for.”

The new technique could, for instance, lead to inexpensive steam-generation devices for small-scale water purification, sterilization of medical instruments, and sewage treatment in developing countries with limited resources and infrastructure.

The use of nanoparticles to increase heat transfer in water and other fluids has been well studied, but few researchers have looked at using the particles to absorb light and generate steam.

In the current study, Halas and colleagues used nanoparticles optimized to absorb the widest possible spectrum of sunlight. When light hits the particles, their temperature quickly rises to well above 100 °C, the boiling point of water, causing surrounding water molecules to vaporize.

Precisely how the particles and water molecules interact remains somewhat of a mystery. Conventional heat-transfer models suggest that the absorbed sunlight should dissipate into the surrounding fluid before causing any water to boil. “There seems to be some nanoscale thermal barrier, because it’s clearly making steam like crazy,” Halas says.

The system devised by Halas and colleagues exhibited an efficiency of 24 percent in converting sunlight to steam.

Todd Otanicar, a mechanical engineer at the University of Tulsa who was not involved in the current study, says the findings could have significant implications for large-scale solar thermal energy generation. Solar thermal power stations typically use concentrated sunlight to heat a fluid such as oil, which is then used to heat water to generate steam. Otanicar estimates that by generating steam directly with nanoparticles in water, such a system could see an increased efficiency of 3 to 5 percent and a cost savings of 10 percent because a less complex design could be used.

Read the entire article after the jump.

Image: Stott Park Bobbin Mill Steam Engine. Courtesy of Wikipedia.

Send to Kindle

Sleep Myths

Chronobiologist Till Roenneberg debunks five commonly held beliefs about sleep. He is the author of “Internal Time: Chronotypes, Social Jet Lag, and Why You’re So Tired.”

From the Washington Post:

If shopping on Black Friday leaves you exhausted, or if your holiday guests keep you up until the wee hours, a long Thanksgiving weekend should offer an opportunity for some serious shut-eye. We spend between a quarter and a third of our lives asleep, but that doesn’t make us experts on how much is too much, how little is too little, or how many hours of rest the kids need to be sharp in school. Let’s tackle some popular myths about Mr. Sandman.

1. You need eight hours of sleep per night.

That’s the cliche. Napoleon, for one, didn’t believe it. His prescription went something like this: “Six hours for a man, seven for a woman and eight for a fool.”

But Napoleon’s formula wasn’t right, either. The ideal amount of sleep is different for everyone and depends on many factors, including age and genetic makeup.

In the past 10 years, my research team has surveyed sleep behavior in more than 150,000 people. About 11 percent slept six hours or less, while only 27 percent clocked eight hours or more. The majority fell in between. Women tended to sleep longer than men, but only by 14 minutes.
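The survey leaves the size of the middle group implicit; taking the two quoted percentages at face value, it falls out directly:

```python
# Share of the ~150,000 survey respondents in each sleep-duration band,
# using only the percentages quoted above.
short_sleepers = 11   # percent sleeping six hours or less
long_sleepers = 27    # percent sleeping eight hours or more
in_between = 100 - short_sleepers - long_sleepers

print(f"{in_between}% slept between six and eight hours")
```

So the "majority" in between is about 62 percent of respondents, assuming the bands do not overlap.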

Bigger differences are seen when comparing various age groups. Ten-year-olds needed about nine hours of sleep, while adults older than 30, including senior citizens, averaged about seven hours. We recently identified the first gene associated with sleep duration — if you have one variant of this gene, you need more sleep than if you have another.

2. Early to bed and early to rise makes a man healthy, wealthy and wise.

Benjamin Franklin’s proverbial praise of early risers made sense in the second half of the 18th century, when his peers were exposed to much more daylight and to very dark nights. Their body clocks were tightly synchronized to this day-night cycle. This changed as work gradually moved indoors, performed under the far weaker intensity of artificial light during the day and, if desired, all night long.

The timing of sleep — earlier or later — is controlled by our internal clocks, which determine what researchers call our optimal “sleep window.” With the widespread use of electric light, our body clocks have shifted later while the workday has essentially remained the same. We fall asleep according to our (late) body clock, and are awakened early for work by the alarm clock. We therefore suffer from chronic sleep deprivation, and then we try to compensate by sleeping in on free days. Many of us sleep more than an hour longer on weekends than we do on workdays.

Read the entire article following the jump.

Image courtesy of Google search.

Send to Kindle

The Science (and Benefit) of Fasting

For thousands of years people have fasted to cleanse the body and the spirit. And, of course, many fast to lose (some) weight. Recently, a growing body of scientific research seems to suggest that fasting may slow the aging process.

From the New Scientist:

THERE’S a fuzz in my brain and an ache in my gut. My legs are leaden and my eyesight is blurry. But I have only myself to blame. Besides, I have been assured that these symptoms will pass. Between 10 days and three weeks from now, my body will adjust to the new regime, which entails fasting for two days each week. In the meantime, I just need to keep my eyes on the prize. Forget breakfast and second breakfast, ignore the call of multiple afternoon snacks, because the payoffs of doing without could be enormous.

Fasting is most commonly associated with religious observation. It is the fourth of the Five Pillars of Islam. Buddhists consider it a means to practise self-control and advocate abstaining from food after the noon meal. For some Christians, temporary fasts are seen as a way of getting closer to God. But the benefits I am hoping for are more corporeal.

The idea that fasting might be good for your health has a long, if questionable, history. Back in 1908, “Dr” Linda Hazzard, an American with some training as a nurse, published a book called Fasting for the Cure of Disease, which claimed that minimal food was the route to recovery from a variety of illnesses including cancer. Hazzard was jailed after one of her patients died of starvation. But what if she was, at least partly, right?

A new surge of interest in fasting suggests that it might indeed help people with cancer. It could also reduce the risk of developing cancer, guard against diabetes and heart disease, help control asthma and even stave off Parkinson’s disease and dementia. Many of the scientists who study fasting practise what they research, and they tell me that at my age (39) it could be vital that I start now. “We know from animal models,” says Mark Mattson at the US National Institute on Aging, “that if we start an intermittent fasting diet at what would be the equivalent of middle age in people, we can delay the onset of Alzheimer’s and Parkinson’s.” Surely it’s worth a try?

Until recently, most studies linking diet with health and longevity focused on calorie restriction. They have had some impressive results, with the lifespan of various lab animals lengthened by up to 50 per cent after their daily calorie intake was cut in half. But these effects do not seem to extend to primates. A 23-year-long study of macaques found that although calorie restriction delayed the onset of age-related diseases, it had no impact on lifespan. So other factors such as genetics may be more important for human longevity too (Nature, vol 489, p 318).

That’s bad news for anyone who has gone hungry for decades in the hope of living longer, but the finding has not deterred fasting researchers. They point out that although fasting obviously involves cutting calories – at least on the fast days – it brings about biochemical and physiological changes that daily dieting does not. Besides, calorie restriction may leave people susceptible to infections and biological stress, whereas fasting, done properly, should not. Some even argue that we are evolutionarily adapted to going without food intermittently. “The evidence is pretty strong that our ancestors did not eat three meals a day plus snacks,” says Mattson. “Our genes are geared to being able to cope with periods of no food.”

What’s in a fast?

As I sit here, hungry, it certainly doesn’t feel like that. But researchers do agree that fasting will leave you feeling crummy in the short term because it takes time for your body to break psychological and biological habits. Less reassuring is their lack of agreement on what fasting entails. I have opted for the “5:2” diet, which allows me 600 calories in a single meal on each of two weekly “fast” days. The normal recommended intake is about 2000 calories for a woman and 2500 for a man, and I am allowed to eat whatever I want on the five non-fast days, underlining the fact that fasting is not necessarily about losing weight. A more draconian regimen has similar restricted-calorie “fasts” every other day. Then there’s total fasting, in which participants go without food for anything from one to five days – longer than about a week is considered potentially dangerous. Fasting might be a one-off, or repeated weekly or monthly.
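The weekly arithmetic of the 5:2 regimen is simple enough to sketch. The recommended intakes and the 600-calorie fast-day meal come from the article; the rest is plain bookkeeping.

```python
# Weekly calorie bookkeeping for the "5:2" regimen described above.
NORMAL_INTAKE = {"woman": 2000, "man": 2500}   # kcal/day, typical recommendation
FAST_DAY_INTAKE = 600                          # kcal on each of 2 weekly fast days

def weekly_deficit(sex: str) -> int:
    """Calories cut per week versus eating normally all seven days."""
    return 2 * (NORMAL_INTAKE[sex] - FAST_DAY_INTAKE)

for sex in ("woman", "man"):
    weekly_total = 5 * NORMAL_INTAKE[sex] + 2 * FAST_DAY_INTAKE
    print(f"{sex}: {weekly_total} kcal/week, deficit {weekly_deficit(sex)} kcal")
```

The deficit of 2,800 to 3,800 kcal per week is real but modest, which supports the article's point that 5:2 fasting is not primarily a weight-loss scheme, especially since eating is unrestricted on the other five days.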

Different regimens have different effects on the body. A fast is considered to start about 10 to 12 hours after a meal, when you have used up all the available glucose in your blood and start converting glycogen stored in liver and muscle cells into glucose to use for energy. If the fast continues, there is a gradual move towards breaking down stored body fat, and the liver produces “ketone bodies” – short molecules that are by-products of the breakdown of fatty acids. These can be used by the brain as fuel. This process is in full swing three to four days into a fast. Various hormones are also affected. For example, production of insulin-like growth factor 1 (IGF-1) drops early and reaches very low levels by day three or four. It is similar in structure to insulin, which also becomes scarcer with fasting, and high levels of both have been linked to cancer.
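The metabolic timeline above can be summarized as a rough lookup from hours fasted to the body's dominant fuel source. The hour boundaries are approximations taken from the article, not clinical thresholds.

```python
# Rough timeline of the body's fuel sources during a fast, per the article.
# Boundary hours are approximate, not clinical thresholds.
def dominant_fuel(hours_fasted: float) -> str:
    if hours_fasted < 11:
        # Blood glucose lasts roughly 10-12 hours after a meal.
        return "blood glucose"
    elif hours_fasted < 72:
        # Glycogen from liver and muscle, gradually shifting to body fat.
        return "glycogen, then body fat"
    else:
        # Ketone bodies from fat breakdown are in full swing by day 3-4.
        return "ketone bodies from fat breakdown"
```

For example, `dominant_fuel(24)` returns the glycogen-to-fat stage, which is where a one-day fast such as a 5:2 fast day would sit.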

Read the entire article following the jump.

Send to Kindle