So, You Want to Be a Brit?

The United Kingdom government has just published its updated 180-page handbook for new residents. So those seeking to become subjects of Her Majesty will need to brush up on more than Admiral Nelson, Churchill, Spitfires, Chaucer and the Black Death. Now, if you are one of the approximately 150,000 new residents each year, you may well have to learn about Morecambe and Wise, Roald Dahl, and Monty Python. Nudge-nudge, wink-wink!

[div class=attrib]From the Telegraph:[end-div]

It has been described as “essential reading” for migrants and takes readers on a whirlwind historical tour of Britain from Stone Age hunter-gatherers to Morecambe and Wise, skipping lightly through the Black Death and Tudor England.

The latest Home Office citizenship handbook, Life in the United Kingdom: A Guide for New Residents, has scrapped sections on claiming benefits, written under the Labour government in 2007, for a triumphalist vision of events and people that helped make Britain a “great place to live”.

The Home Office said it had stripped out “mundane information” about water meters, how to find train timetables, and using the internet.

The guide’s 180 pages, filled with pictures of the Queen, Spitfires and Churchill, are a primer for citizenship tests taken by around 150,000 migrants a year.

Comedies such as Monty Python and The Morecambe and Wise Show are highlighted as examples of British people’s “unique sense of humour and satire”, while Olympic athletes including Jessica Ennis and Sir Chris Hoy are included for the first time.

Previously, historical information was included in the handbook but was not tested. Now the book features sections on Roman, Anglo-Saxon and Viking Britain to give migrants an “understanding of how modern Britain has evolved”.

They can equally expect to be quizzed on the children’s author Roald Dahl, the Harrier jump jet and the Turing machine – a theoretical device proposed by Alan Turing and seen as a precursor to the modern computer.

The handbook also refers to the works of William Shakespeare, Geoffrey Chaucer and Jane Austen alongside Coronation Street. Meanwhile, Christmas pudding, the Last Night of the Proms and cricket matches are described as typical “indulgences”.

The handbook goes on sale today and forms the basis of the 45-minute exam in which migrants must gain marks of 75 per cent to pass.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Group shot of the Monty Python crew in 1969. Courtesy of Wikipedia.[end-div]

Someone Has to Stand Up to Experts

[tube]pzrUt9CHtpY[/tube]

“Someone has to stand up to experts!” This is what Don McLeroy would have you believe about scientists. We all indulge in senseless rants once in a while, so we should give McLeroy the benefit of the doubt – perhaps he had slept poorly the night before this impassioned, irrational plea. On the other hand, when you learn that McLeroy made the statement in 2010 as chairman of the Texas State Board of Education (SBOE), you may wish to think again, especially if you have children in the school system of the Lone Star State.

McLeroy and his fellow young-Earth creationists, including Cynthia Dunbar, are the subject of a documentary out this week titled The Revisionaries. It looks at the messy yet successful efforts of the SBOE to revise the curriculum standards and the contents of science and social studies textbooks in their favor. Among a list of over 100 significant amendments, the non-experts did the following: marginalized Thomas Jefferson for being a secular humanist; watered down the historically accepted rationale for the separation of church and state; stressed the positive side of the McCarthyist witch-hunts; removed references to Hispanics having fought against Santa Anna at the battle of the Alamo; added the National Rifle Association as a key element in the recent conservative resurgence; and, of course, re-opened the entire debate over the validity of evolutionary theory.

While McLeroy and some of his fellow non-experts lost re-election bids, their influence on young minds is likely to be far-reaching — textbooks in Texas are next revised in 2020, and because of Texas’ market power many publishers across the nation tend to follow Texas standards.

[div class=attrib]Video clip courtesy of The Revisionaries, PBS.[end-div]

Orphan Genes

DNA is a remarkable substance. It is the fundamental blueprint for biological systems and the basis for all complex life on our planet, and it enables parents to pass characteristics, both good and bad, to their children. Yet the more geneticists learn about the functions of DNA, the more mysteries it presents. One such conundrum is posed by so-called junk DNA and orphan genes — seemingly useless sequences of DNA that perform no function. Or so researchers previously believed.

[div class=attrib]From New Scientist:[end-div]

NOT having any family is tough. Often unappreciated and uncomfortably different, orphans have to fight to fit in and battle against the odds to realise their potential. Those who succeed, from Aristotle to Steve Jobs, sometimes change the world.

Who would have thought that our DNA plays host to a similar cast of foundlings? When biologists began sequencing genomes, they discovered that up to a third of genes in each species seemed to have no parents or family of any kind. Nevertheless, some of these “orphan genes” are high achievers, and a few even seem to have played a part in the evolution of the human brain.

But where do they come from? With no obvious ancestry, it was as if these genes had appeared from nowhere, but that couldn’t be true. Everyone assumed that as we learned more, we would discover what had happened to their families. But we haven’t – quite the opposite, in fact.

Ever since we discovered genes, biologists have been pondering their origins. At the dawn of life, the very first genes must have been thrown up by chance. But life almost certainly began in an RNA world, so back then, genes weren’t just blueprints for making enzymes that guide chemical reactions – they themselves were the enzymes. If random processes threw up a piece of RNA that could help make more copies of itself, natural selection would have kicked in straight away.

As living cells evolved, though, things became much more complex. A gene became a piece of DNA coding for a protein. For a protein to be made, an RNA copy of the DNA has to be created. This cannot happen without “DNA switches”, which are actually just extra bits of DNA alongside the protein-coding bits saying “copy this DNA into RNA”. Next, the RNA has to get to the protein-making factories. In complex cells, this requires the presence of yet more extra sequences, which act as labels saying “export me” and “start making the protein from here”.

The upshot is that the chances of random mutations turning a bit of junk DNA into a new gene seem infinitesimally small. As the French biologist François Jacob famously wrote 35 years ago, “the probability that a functional protein would appear de novo by random association of amino acids is practically zero”.

Instead, back in the 1970s it was suggested that the accidental copying of genes can result in a single gene giving rise to a whole family of genes, rather like the way animals branch into families of related species over time. It’s common for entire genes to be inadvertently duplicated. Spare copies are usually lost, but sometimes the duplicates come to share the function of the original gene between them, or one can diverge and take on a new function.

Take the light-sensing pigments known as opsins. The various opsins in our eyes are not just related to each other, they are also related to the opsins found in all other animals, from jellyfish to insects. The thousands of different opsin genes found across the animal kingdom all evolved by duplication, starting with a single gene in a common ancestor living around 700 million years ago (see diagram).

Most genes belong to similar families, and their ancestry can be traced back many millions of years. But when the yeast genome was sequenced around 15 years ago, it was discovered that around a third of yeast genes appeared to have no family. The term orphans (sometimes spelt ORFans) was used to describe individual genes, or small groups of very similar genes, with no known relatives.

“If you see a gene and you can’t find a relative you get suspicious,” says Ken Weiss, who studies the evolution of complex traits at Penn State University. Some suggested orphans were the genetic equivalent of living fossils like the coelacanth, the last surviving members of an ancient family. Others thought they were nothing special, just normal genes whose family hadn’t been found yet. After all, the sequencing of entire genomes had only just begun.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: DNA structure. Courtesy of Wikipedia.[end-div]

Letters of Love to Strangers

[tube]LVFVaWCV1TE[/tube]

It seems impossible to halt the spread of random acts of senseless kindness. This is a good thing. The latest good deeds come courtesy of Hannah Brencher and her army, now over 10,000 strong. What Hannah does is simple — she writes happy letters to strangers. What began as a solo endeavor is now a growing movement, replete with a starter kit for novice letter-writers, a TED (Technology Entertainment and Design) presentation and, of course, its own website at The World Needs More Love Letters.

[div class=attrib]From the Guardian:[end-div]

When 24-year-old Hannah Brencher moved to New York after college, she was hit by depression and overwhelming loneliness. One day she felt so alone, she wanted to reach out to someone. And so she put pen to paper and started writing letters. Letters to complete strangers.

But these weren’t sad letters about how she was feeling. They were happy letters, all about the other person, not her. She would write messages for people to have a “bright day” and tell strangers how brilliant they were, even if they thought no one else had noticed. Brencher began dropping the notes all over New York, in cafes, in library books, in parks and on the subway. It made her feel better, knowing that she might be making somebody’s day through just a few short, sweet words. It gave her something to focus on. And so, The World Needs More Love Letters was born.

The World Needs More Love Letters is all about writing letters – not emails, but proper, handwritten letters. Not conventional love letters, written to a real beloved, but surprise letters for strangers. They don’t necessarily say “I love you”, but they are full of kindness (that’s the love Brencher’s talking about) – telling people they are remarkable and special and all-round amazing. It’s the sort of stuff that most people don’t really say out loud even to the people they care about, let alone a total stranger.

Brencher’s initiative has now exploded. She has personally written hundreds, if not thousands of letters. Last year, she did a Ted talk. In it, she talks about a woman whose husband, a soldier, comes back from Afghanistan and they struggle to reconnect – “So she tucks love letters throughout the house as a way to say: ‘Come back to me. Find me when you can'” – and a university student who slips letters around her campus, only to suddenly find everyone is writing them and there are love letters hanging from the trees.

Now there are more than 10,000 people who join in all over the world. Sometimes, they write letters to order, to people who are lonely and down and just want someone to tell them that everything will be OK. Mostly, though, they scribble notes and leave them somewhere unlikely, for somebody to find.

It’s a very cute idea. It also sounds, well, a bit American touchy-feely. I’m not sure that’s something us Brits do well (although this chap from Aberdeen did it for a while, to some success judging by the feedback on his blog. Even if his notes were printouts and not charmingly done by hand). But I know that if I was on the receiving end of a letter like that, it would almost certainly put a smile on my face. So I decide to give it a try and see if I might do the same for someone else.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Video courtesy of The World Needs More Love Letters / TED.[end-div]

Multi-hub-agnostic

Each year the mega-rich rub shoulders with the super-powerful and the hyper-popular at the World Economic Forum in, where else, Davos, Switzerland. What concrete actions are taken during this event is anybody’s guess. But we suspect attendees sample some tasty hors d’oeuvres while they tweet to the rest of us.

One positive outcome is this interactive Davos Hotphrase Generator, available from our friends at the Guardian. We recommend you give it a click to get a taste for next year’s critical corporate strategy or Wall Street innovation.

Our 5 favorites:

Post-serendipity-influence

Micro-austerity-capital

Supra-platform-mash

Multi-hub-agnostic

Ur-forward-ability

[div class=attrib]Image: Bobsled team in Davos, 1910. Courtesy of Wikipedia.[end-div]

Shakespearian Sonnets Now Available on DNA

Shakespeare, meet thy DNA. The most famous literary figure in the English language had a recent rendezvous with that most famous and studied of molecules. Together, chemists, cell biologists, geneticists and computer scientists are doing some amazing things: storing information in the sequences of nucleotide bases along the DNA molecule.

[div class=attrib]From ars technica:[end-div]

It’s easy to get excited about the idea of encoding information in single molecules, which seems to be the ultimate end of the miniaturization that has been driving the electronics industry. But it’s also easy to forget that we’ve been beaten there—by a few billion years. The chemical information present in biomolecules was critical to the origin of life and probably dates back to whatever interesting chemical reactions preceded it.

It’s only within the past few decades, however, that humans have learned to speak DNA. Even then, it took a while to develop the technology needed to synthesize and determine the sequence of large populations of molecules. But we’re there now, and people have started experimenting with putting binary data in biological form. Now, a new study has confirmed the flexibility of the approach by encoding everything from an MP3 to the decoding algorithm into fragments of DNA. The cost analysis done by the authors suggests that the technology may soon be suitable for decade-scale storage, provided current trends continue.

Trinary encoding

Computer data is in binary, while each location in a DNA molecule can hold any one of four bases (A, T, C, and G). Rather than using all that extra information capacity, however, the authors used it to avoid a technical problem. Stretches of a single type of base (say, TTTTT) are often not sequenced properly by current techniques—in fact, this was the biggest source of errors in the previous DNA data storage effort. So for this new encoding, they used one of the bases to break up long runs of any of the other three.

(To explain how this works practically, let’s say the A, T, and C encoded information, while G represents “more of the same.” If you had a run of four A’s, you could represent it as AAGA. But since the G doesn’t encode for anything in particular, TTGT can be used to represent four T’s. The only thing that matters is that there are no more than two identical bases in a row.)

That leaves three bases to encode information, so the authors converted their information into trinary. In all, they encoded a large number of works: all 154 Shakespeare sonnets, a PDF of a scientific paper, a photograph of the lab some of them work in, and an MP3 of part of Martin Luther King’s “I have a dream” speech. For good measure, they also threw in the algorithm they use for converting binary data into trinary.
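To make the run-breaking trick concrete, here is a toy Python sketch of the simplified scheme described above. It is not the authors’ actual encoding, and every name in it is illustrative: trits 0, 1 and 2 map to A, T and C, and a G is written whenever a third identical base in a row would otherwise appear, standing for “more of the same”.

```python
# Toy illustration of the run-breaking DNA code described in the article.
# NOT the authors' actual scheme; all names and choices here are illustrative.
TRIT_TO_BASE = {0: "A", 1: "T", 2: "C"}
BASE_TO_TRIT = {v: k for k, v in TRIT_TO_BASE.items()}

def to_trits(data: bytes) -> list[int]:
    """Convert bytes to a base-3 digit stream (binary -> trinary).
    Leading zero bytes are ignored in this toy version."""
    number = int.from_bytes(data, "big")
    trits = []
    while number:
        number, digit = divmod(number, 3)
        trits.append(digit)
    return list(reversed(trits)) or [0]

def encode(trits: list[int]) -> str:
    """Map trits to DNA, inserting G so no base appears three times in a row."""
    seq = []
    for t in trits:
        base = TRIT_TO_BASE[t]
        if len(seq) >= 2 and seq[-1] == seq[-2] == base:
            seq.append("G")          # placeholder meaning "more of the same"
        else:
            seq.append(base)
    return "".join(seq)

def decode(seq: str) -> list[int]:
    """Undo the encoding: each G stands for the base that precedes it."""
    trits, prev = [], None
    for base in seq:
        if base == "G":
            base = prev
        trits.append(BASE_TO_TRIT[base])
        prev = base
    return trits

message = b"Shall I compare thee to a summer's day?"
dna = encode(to_trits(message))
assert "AAA" not in dna and "TTT" not in dna and "CCC" not in dna
assert decode(dna) == to_trits(message)
```

The round trip at the end checks that no information base ever appears three times in a row and that decoding recovers the original trit stream.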

Once in trinary, the results were encoded into the error-avoiding DNA code described above. The resulting sequence was then broken into chunks that were easy to synthesize. Each chunk came with parity information (for error correction), a short file ID, and some data that indicates the offset within the file (so, for example, that the sequence holds digits 500-600). To provide an added level of data security, 100-base-long DNA inserts were staggered by 25 bases so that consecutive fragments had a 75-base overlap. Thus, many sections of the file were carried by four different DNA molecules.
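The redundancy scheme is also easy to picture. Below is a hedged sketch (illustrative names again) of how a long sequence could be cut into 100-base fragments whose start positions step forward by 25 bases, so that consecutive fragments share a 75-base overlap and each interior base is covered four times. The real chunks also carried a file ID, an offset and parity information, which this sketch omits.

```python
def fragment(seq: str, length: int = 100, step: int = 25) -> list[str]:
    """Cut seq into overlapping chunks; consecutive chunks share length - step bases."""
    return [seq[i:i + length] for i in range(0, max(len(seq) - length, 0) + 1, step)]

chunks = fragment("ACGT" * 100)        # a 400-base stand-in sequence
print(len(chunks), len(chunks[0]))     # 13 fragments of 100 bases each
```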

And it all worked brilliantly—mostly. For most of the files, the authors’ sequencing and analysis protocol could reconstruct an error-free version of the file without any intervention. One, however, ended up with two 25-base-long gaps, presumably resulting from a particular sequence that is very difficult to synthesize. Based on parity and other data, they were able to reconstruct the contents of the gaps, but understanding why things went wrong in the first place would be critical to understanding how well suited this method is to long-term archiving of data.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Title page of Shakespeare’s Sonnets (1609). Courtesy of Wikipedia / Public Domain.[end-div]

Orwell Lives On

George Orwell passed away on January 21, 1950 — an untimely death. He was only 46 years old. The anniversary of his death leads some to wonder what the great author would be doing if he were still alive. Some believe that he would be a food/restaurant critic. Or perhaps he would still, at the age of 109, be writing about injustice, falsehood and hypocrisy. One suspects that he might still be speaking truth to power as he did back in the 1940s, the difference being that this time power is in private hands versus the public sector. Corporate Big Brother is now watching you.

[div class=attrib]From the Guardian:[end-div]

What if George Orwell hadn’t died of tuberculosis in 1950? What if, instead of expiring aged 46 in University College hospital, he had climbed from his sick-bed, taken the fishing rod a friend had brought him for his convalescence and checked out? What if today he was alive and well (perhaps after a period in cryogenic storage – the details aren’t important now)? What would he think of 2013? What, if anything, would he be writing about?

In many respects Orwell is ubiquitous and more relevant than ever. His once-visionary keywords have grotesque afterlives: Big Brother is a TV franchise to make celebrities of nobodies and Room 101 a light-entertainment show on BBC2 currently hosted by Frank Skinner for celebrities to witter about stuff that gets their goat. Meanwhile, Orwellian is the second-most-overused literary-generated adjective (after Kafkaesque). And now St Vince of Cable has been busted down from visionary analyst of recession to turncoat enabler of George Osborne’s austerity measures. Orwell is the go-to thinker to account for our present woes – even though he is 63 years dead. Which, in the Newspeak of 1984, is doubleplusgood.

As we celebrate the first Orwell Day this week, it’s irresistible to play the game of “what if”? If Orwell was fighting in a war akin to the Spanish civil war in 2012, where would he be – Syria? Would he write Homage to Aleppo, perhaps? Or would he have written Homage to Zuccotti Park or Tottenham? If he was writing Down and Out in Paris and London today would it be very different – and, if so, how? If he took a journey to Wigan pier in 2013, what would he find that would resemble the original trip and what would be different? Would there still be a full chamber pot under his hosts’ breakfast table? Let’s hope not.

Would he be working in a call centre rather than going down a mine? Would he feel as patriotic as he did in some of his essays? Would the man born Eric Arthur Blair have spent much of the past decade tilting at the man born Anthony Charles Lynton Blair? The answers to the last three questions are, you’d hope: yes, probably not, and oh, please God, yes.

“It’s almost impossible to imagine,” says Orwell’s biographer, the novelist and critic DJ Taylor. “One of his closest friends, the novelist Anthony Powell, suggested in his journals that Orwell’s politics would have drifted rightwards. He would have been anti-CND, in favour of the Falklands war, disapproved of the miners’ strikes. Powell was a high Tory right winger, but he was very close to Orwell and so those possibilities of what he would have been like had he lived on shouldn’t be dismissed.”

Adam Stock, an Orwell scholar at Newcastle University who did his PhD on mid-20th-century dystopian fiction and political thought, says: “If he were alive today, then Orwell would surely be writing about many of the sorts of areas you identify, bringing to light inequalities, injustices and arguing for what he termed ‘democratic socialism’, and I would like to think – though this may be projection on my part – that at this moment he would be writing specifically in defence of the welfare state.”

You’d hope. But Stock reckons that in 2013 Orwell would also be writing about the politics of food. “Orwell’s novels are marked by their rich detailing of taste, touch and especially smell. Tinned and processed food is a recurring image in his fiction, and it often represents a smoothing out of difference and individuality, a process which mirrors political attempts to make people conform to certain ideological visions of the world in the 1930s and 1940s,” says Stock.

Indeed, during last week’s horsemeat scandal, Stock says a passage from Orwell’s 1939 novel Coming Up for Air came to mind. The character George Bowling bites into a frankfurter he has bought in a milk bar decorated in chrome and mirrors: “The thing burst in my mouth like a rotten pear. A sort of horrible soft stuff was oozing all over my tongue. But the taste! For a moment I just couldn’t believe it. Then I rolled my tongue round it again and had another try. It was fish! A sausage, a thing calling itself a frankfurter, filled with fish! I got up and walked straight out without touching my coffee. God knows what that might have tasted of.”

What’s the present-day significance of that? “The point, I think, is that appearances mask quite different realities in the milk-bar modernity of mirrors in which the character is sitting, trapped between endless reflections,” says Stock. “Orwell had an abiding interest in the countryside, rural life and growing his own food. One thing I suspect he would be campaigning vociferously about in our time is issues surrounding big agribusiness and the provenance of our food, the biological commons, and particularly the patenting of GM crops.”

[div class=attrib]Read more after the jump.[end-div]

[div class=attrib]Image: George Orwell. Courtesy of the BBC.[end-div]

Las Vegas, Tianducheng and Paris: Cultural Borrowing

These three locations, in Nevada, China (near Hangzhou) and France, have something in common. People the world over travel to all three to see what they share. But only one of them has the original. In this case, we’re talking about the Eiffel Tower.

Now, this architectural grand theft is the subject of a lengthy debate — the merits of mimicry on a vast scale. There is even a fascinating coffee-table book dedicated to this growing trend: Original Copies: Architectural Mimicry in Contemporary China, by Bianca Bosker.

Interestingly, the copycat trend only seems worrisome if those doing the copying are in a powerful and growing nation, and the copying is done on a national scale, perhaps for some form of cultural assimilation. After all, we don’t hear similar cries when developers put up a copy of Venice in Las Vegas — that’s just for entertainment we are told.

Yet haven’t civilizations borrowed, and stolen, ideas both good and bad throughout the ages? The answer of course is an unequivocal yes. Humans are avaricious collectors of memes that work — it’s more efficient to borrow than to invent. The Greeks borrowed from the Egyptians; the Romans borrowed from the Greeks; the Turks borrowed from the Romans; the Arabs borrowed from the Turks; the Spanish from the Arabs, the French from the Spanish, the British from the French, and so on. Of course what seems to be causing a more recent stir is that China is doing the borrowing, and on such a rapid and grand scale — the nation is copying not just buildings (and most other products) but entire urban landscapes. However, this is one way that empires emerge and evolve. In this case, China’s acquisitive impulses could, perhaps, be tempered if most nations of the world borrowed less from the Chinese — money that is. But that’s another story.

[div class=attrib]From the Atlantic:[end-div]

The latest and most famous case of Chinese architectural mimicry doesn’t look much like its predecessors. On December 28, German news weekly Der Spiegel reported that the Wangjing Soho, Zaha Hadid’s soaring new office and retail development under construction in Beijing, is being replicated, wall for wall and window for window, in Chongqing, a city in central China.

To most outside observers, this bold and quickly commissioned counterfeit represents a familiar form of piracy. In fashion, technology, and architecture, great ideas trickle down, often against the wishes of their progenitors. But in China, architectural copies don’t usually ape the latest designs.

In the vast space between Beijing and Chongqing lies a whole world of Chinese architectural simulacra that quietly aspire to a different ideal. In suburbs around China’s booming cities, developers build replicas of towns like Hallstatt, Austria, and Dorchester, England. Individual homes and offices, too, are designed to look like Versailles or the Chrysler Building. The most popular facsimile in China is the White House. The fastest-urbanizing country in history isn’t scanning design magazines for inspiration; it’s watching movies.

At Beijing’s Palais de Fortune, two hundred chateaus sit behind gold-tipped fences. At Chengdu’s British Town, pitched roofs and cast-iron street lamps dot the streets. At Shanghai’s Thames Town, a Gothic cathedral has become a tourist attraction in itself. Other developments have names like “Top Aristocrat,” (Beijing), “the Garden of Monet” (Shanghai), and “Galaxy Dante,” (Shenzhen).

Architects and critics within and beyond China have treated these derivative designs with scorn, as shameless kitsch or simply trash. Others cite China’s larger knock-off culture, from handbags to housing, as evidence of the innovation gap between China and the United States. For a larger audience on the Internet, they are merely a punchline, another example of China’s endlessly entertaining wackiness.

In short, the majority of Chinese architectural imitation, oozing with historical romanticism, is not taken seriously.

But perhaps it ought to be.

In Original Copies: Architectural Mimicry in Contemporary China, the first detailed book on the subject, Bianca Bosker argues that the significance of these constructions has been unfairly discounted. Bosker, a senior technology editor at the Huffington Post, has been visiting copycat Chinese villages for some six years, and in her view, these distorted impressions of the West offer a glance at the hopes, dreams and contradictions of China’s middle class.

“Clearly there’s an acknowledgement that there’s something great about Paris,” says Bosker. “But it’s also: ‘We can do it ourselves.'”

Armed with firsthand observation, field research, interviews, and a solid historical background, Bosker’s book is an attempt to change the way we think about Chinese duplitecture. “We’re seeing the Chinese dream in action,” she says. “It has to do with this ability to take control of your life. There’s now this plethora of options to choose from.” That is something new in China, as is the role that private enterprise is taking in molding built environments that will respond to people’s fantasies.

While the experts scoff, the people who build and inhabit these places are quite proud of them. As the saying goes, “The way to live best is to eat Chinese food, drive an American car, and live in a British house. That’s the ideal life.” The Chinese middle class is living in Orange County, Beijing, the same way you listen to reggae music or lounge in Danish furniture.

In practice, though, the depth and scale of this phenomenon has few parallels. No one knows how many facsimile communities there are in China, but the number is increasing every day. “Every time I go looking for more,” Bosker says, “I find more.”

How many are there?

“At least hundreds.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Tianducheng, 13th arrondissement, Paris in China. Courtesy of Bianca Bosker/University of Hawaii Press.[end-div]

Lego Expressionism: But is it Art?

Lego as we know it — think brightly colored, interlocking, metamorphic bricks — has been around for over 60 years. Even in this high-tech, electronic age it is still likely that most kids around the world have made a little house or a robot with Lego bricks. It satisfies our need to create and to build (and of course, to destroy). But is it art? Jonathan Jones has some ideas.

[div class=attrib]From the Guardian:[end-div]

Lego is the clay of the modern world, the stuff of creativity. You can shape it, unshape it, make worlds and smash them up to be replaced by new ideas. It’s a perpetual-motion machine of kids’ imaginations.

Today’s Lego is very different from the Lego I played with when I was eight. For adults like me who grew up with simple Lego bricks and no instructions, just a free-for-all, the kits that now dazzle in their bright impressive boxes take some adjusting to. A puritan might well be troubled that this year’s new Christmas Lego recreates the film The Hobbit in yet another addition to a popular culture repertoire that includes Marvel Superheroes Lego and the ever-popular Star Wars range.

The Danish toymaker is ruthless in its pursuit of mass entertainment. Harry Potter Lego was a major product – until the film series finished. This summer, it suddenly vanished from shops. I had to go to the Harry Potter Studios to get a Knight Bus.

Cool bus, though. Purple Lego! And it fits together in such a way that, when dropped or otherwise subjected to the rigours of play, the three floors of the bus neatly separate and can easily be reconnected. It is a kit, a toy, and a stimulus to story-telling.

Do not doubt the creative value of modern Lego. Making these kits isn’t a fetishistic, sterile enterprise – children don’t think like that. Rather, the ambition of the kits inspires children to aim high with their own crazy designs – the scenarios Lego provides stimulate inventive play. Children can tell stories with Lego, invest the fantastic mini-figures with names and characters, and build what they like after the models disintegrate. Above all, there is something innately humorous about Lego.

But is it art? It definitely teaches something about art. Like a three-dimensional sketchpad, Lego allows you to doodle in bright colours. It is “virtual”, but real and solid. It has practical limits and potentials that have to be respected, while teaching that anyone can create anything. You can be a representational Lego artist, meticulously following instructions and making accurate models, or an abstract one. It really is liberating stuff: shapeshifting, metamorphic. And now I am off to play with it.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of Nathan Sawaya, the Lego brick artist.[end-div]

Your City as an Information Warehouse

Big data keeps getting bigger and computers keep getting faster. Some theorists believe that the universe is a giant computer or a computer simulation; that principles of information science govern the cosmos. While this notion is one of the most recent radical ideas to explain our existence, there is no doubt that information is our future. Data surrounds us: we are becoming data points, and our cities are becoming information-rich databases.

[div class=attrib]From the Economist:[end-div]

IN 1995 GEORGE GILDER, an American writer, declared that “cities are leftover baggage from the industrial era.” Electronic communications would become so easy and universal that people and businesses would have no need to be near one another. Humanity, Mr Gilder thought, was “headed for the death of cities”.

It hasn’t turned out that way. People are still flocking to cities, especially in developing countries. Cisco’s Mr Elfrink reckons that in the next decade 100 cities, mainly in Asia, will reach a population of more than 1m. In rich countries, to be sure, some cities are sad shadows of their old selves (Detroit, New Orleans), but plenty are thriving. In Silicon Valley and the newer tech hubs what Edward Glaeser, a Harvard economist, calls “the urban ability to create collaborative brilliance” is alive and well.

Cheap and easy electronic communication has probably helped rather than hindered this. First, connectivity is usually better in cities than in the countryside, because it is more lucrative to build telecoms networks for dense populations than for sparse ones. Second, electronic chatter may reinforce rather than replace the face-to-face kind. In his 2011 book, “Triumph of the City”, Mr Glaeser theorises that this may be an example of what economists call “Jevons’s paradox”. In the 19th century the invention of more efficient steam engines boosted rather than cut the consumption of coal, because they made energy cheaper across the board. In the same way, cheap electronic communication may have made modern economies more “relationship-intensive”, requiring more contact of all kinds.

Recent research by Carlo Ratti, director of the SENSEable City Laboratory at the Massachusetts Institute of Technology, and colleagues, suggests there is something to this. The study, based on the geographical pattern of 1m mobile-phone calls in Portugal, found that calls between phones far apart (a first contact, perhaps) are often followed by a flurry within a small area (just before a meeting).

Data deluge

A third factor is becoming increasingly important: the production of huge quantities of data by connected devices, including smartphones. These are densely concentrated in cities, because that is where the people, machines, buildings and infrastructures that carry and contain them are packed together. They are turning cities into vast data factories. “That kind of merger between physical and digital environments presents an opportunity for us to think about the city almost like a computer in the open air,” says Assaf Biderman of the SENSEable lab. As those data are collected and analysed, and the results are recycled into urban life, they may turn cities into even more productive and attractive places.

Some of these “open-air computers” are being designed from scratch, most of them in Asia. At Songdo, a South Korean city built on reclaimed land, Cisco has fitted every home and business with video screens and supplied clever systems to manage transport and the use of energy and water. But most cities are stuck with the infrastructure they have, at least in the short term. Exploiting the data they generate gives them a chance to upgrade it. Potholes in Boston, for instance, are reported automatically if the drivers of the cars that hit them have an app called Street Bump on their smartphones. And, particularly in poorer countries, places without a well-planned infrastructure have the chance of a leap forward. Researchers from the SENSEable lab have been working with informal waste-collecting co-operatives in São Paulo whose members sift the city’s rubbish for things to sell or recycle. By attaching tags to the trash, the researchers have been able to help the co-operatives work out the best routes through the city so they can raise more money and save time and expense.

Exploiting data may also mean fewer traffic jams. A few years ago Alexandre Bayen, of the University of California, Berkeley, and his colleagues ran a project (with Nokia, then the leader of the mobile-phone world) to collect signals from participating drivers’ smartphones, showing where the busiest roads were, and feed the information back to the phones, with congested routes glowing red. These days this feature is common on smartphones. Mr Bayen’s group and IBM Research are now moving on to controlling traffic and thus easing jams rather than just telling drivers about them. Within the next three years the team is due to build a prototype traffic-management system for California’s Department of Transportation.

Cleverer cars should help, too, by communicating with each other and warning drivers of unexpected changes in road conditions. Eventually they may not even have drivers at all. And thanks to all those data they may be cleaner, too. At the Fraunhofer FOKUS Institute in Berlin, Ilja Radusch and his colleagues show how hybrid cars can be automatically instructed to switch from petrol to electric power if local air quality is poor, say, or if they are going past a school.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Images of cities courtesy of Google search.[end-div]

Atwood on Orwell

One great writer reflects on the influences of another.

[div class=attrib]From the Guardian:[end-div]

I grew up with George Orwell. I was born in 1939, and Animal Farm was published in 1945. I read it at age nine. It was lying around the house, and I mistook it for a book about talking animals. I knew nothing about the kind of politics in the book – the child’s version of politics then, just after the war, consisted of the simple notion that Hitler was bad but dead. To say that I was horrified by this book would be an understatement. The fate of the farm animals was so grim, the pigs were so mean and mendacious and treacherous, the sheep were so stupid. Children have a keen sense of injustice, and this was the thing that upset me the most: the pigs were so unjust.

The whole experience was deeply disturbing, but I am forever grateful to Orwell for alerting me early to the danger flags I’ve tried to watch out for since. As Orwell taught, it isn’t the labels – Christianity, socialism, Islam, democracy, two legs bad, four legs good, the works – that are definitive, but the acts done in their names.

Animal Farm is one of the most spectacular emperor-has-no-clothes books of the 20th century, and it got Orwell into trouble accordingly. People who run counter to the current popular wisdom, who point out the uncomfortably obvious, are likely to be strenuously baa-ed at by herds of angry sheep. I didn’t have all that figured out at the age of nine, of course – not in any conscious way. But we learn the patterns of stories before we learn their meanings, and Animal Farm has a very clear pattern.

Then along came Nineteen Eighty-Four, which was published in 1949. I read it in paperback (the copy of which is pictured here) a couple of years later, when I was in high school. Then I read it again, and again. It struck me as more realistic, probably because Winston Smith was more like me, a skinny person who got tired a lot and was subjected to physical education under chilly conditions – a feature of my school – and who was silently at odds with the ideas and the manner of life proposed for him. (This may be one of the reasons Nineteen Eighty-Four is best read when you are an adolescent; most adolescents feel like that.) I sympathised particularly with his desire to write his forbidden thoughts down in a secret blank book. I had not yet started to write, but I could see the attractions of it. I could also see the dangers, because it’s this scribbling of his – along with illicit sex, another item with considerable allure for a teenager of the 1950s – that gets Winston into such a mess.

Orwell became a direct model for me much later in my life – in the real 1984, the year in which I began writing a somewhat different dystopia, The Handmaid’s Tale. By that time I was 44, and I’d learned enough about real despotisms that I didn’t need to rely on Orwell alone.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]First edition cover of The Handmaid’s Tale by Margaret Atwood, and first edition cover of Nineteen-Eighty-Four by George Orwell. Courtesy of Wikipedia and respective publishers.[end-div]

Light From Gravity

Often the best creative ideas and the most elegant solutions are the simplest. GravityLight is an example of this type of innovation. Here’s the problem: replace damaging and expensive kerosene lamps in Africa with a less harmful and cheaper alternative. And the solution:

[tube]1dd9NIlhvlI[/tube]

[div class=attrib]From ars technica:[end-div]

A London design consultancy has developed a cheap, clean, and safer alternative to the kerosene lamp. Kerosene burning lamps are thought to be used by over a billion people in developing nations, often in remote rural parts where electricity is either prohibitively expensive or simply unavailable. Kerosene’s potential replacement, GravityLight, is powered by gravity without the need of a battery—it’s also seen by its creators as a superior alternative to solar-powered lamps.

Kerosene lamps are problematic in three ways: they release pollutants which can contribute to respiratory disease; they pose a fire risk; and, thanks to the ongoing need to buy kerosene fuel, they are expensive to run. Research out of Brown University from July of last year called kerosene lamps a “significant contributor to respiratory diseases, which kill over 1.5 million people every year” in developing countries. The same paper found that kerosene lamps were responsible for 70 percent of fires (which cause 300,000 deaths every year) and 80 percent of burns. The World Bank has compared the indoor use of a kerosene lamp with smoking two packs of cigarettes per day.

The economics of the kerosene lamps are nearly as problematic, with the fuel costing many rural families a significant proportion of their money. The designers of the GravityLight say 10 to 20 percent of household income is typical, and they describe kerosene as a poverty trap, locking people into a “permanent state of subsistence living.” Considering that the median rural price of kerosene in Tanzania, Mali, Ghana, Kenya, and Senegal is $1.30 per liter, and the average rural income in Tanzania is under $9 per month, the designers’ figures seem depressingly plausible.

Approached by the charity Solar Aid to design a solar-powered LED alternative, London design consultancy Therefore shifted the emphasis away from solar, which requires expensive batteries that degrade over time. The company’s answer is both more simple and more radical: an LED lamp driven by a bag of sand, earth, or stones, pulled toward the Earth by gravity.

It takes only seconds to hoist the bag into place, after which the lamp provides up to half an hour of ambient light, or about 18 minutes of brighter task lighting. Though it isn’t clear quite how much light the GravityLight emits, its makers insist it is more than a kerosene lamp. Also unclear are the precise inner workings of the device, though clearly the weighted bag pulls a cord, driving an inner mechanism with a low-powered dynamo, with the aid of some robust plastic gearing. Talking to Ars by telephone, Therefore’s Jim Fullalove was loath to divulge details, but did reveal the gearing took the kinetic energy from a weighted bag descending at a rate of a millimeter per second to power a dynamo spinning at 2000rpm.
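Those few numbers let us make a rough estimate of the power on tap. The sketch below assumes a 10 kg bag (the article does not give a weight, so that figure is purely illustrative) and uses the quoted descent rate of one millimetre per second.

```python
# Back-of-the-envelope power estimate for a gravity-driven lamp.
# Only the 1 mm/s descent rate comes from the article; the bag mass is assumed.
mass = 10.0         # kg, assumed weight of the bag of sand, earth or stones
g = 9.81            # m/s^2, gravitational acceleration
velocity = 0.001    # m/s, descent rate quoted by the designers

power_watts = mass * g * velocity       # mechanical power before any losses
print(f"roughly {power_watts * 1000:.0f} mW available")   # ~100 mW
```

A tenth of a watt sounds tiny, but efficient LEDs can turn much of it into usable ambient light, which is presumably why the design works at all.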

[div class=attrib]Read more about GravityLight after the jump.[end-div]

[div class=attrib]Video courtesy of GravityLight.[end-div]

Gun Deaths in the U.S.

Despite the recent atrocity in Newtown, Connecticut, at the hands of a madman carrying an assault weapon, death by gun continues unabated in the United States. Yet accurate statistics are hard to come by, so Slate and the Twitter feed @GunDeaths are collecting data to put this in perspective. Just over a month has passed since 20 children and 6 adults were gunned down at Sandy Hook Elementary School, and since then at least 1,019 more people have died by gunfire in the United States. That’s more than most other civilized countries record in a decade.

You can follow the interactive chart as it is updated daily here; another 4 deaths just today, January 17, 2013. According to the map, North Dakota and Wyoming have been the best states in which to avoid getting shot — both have recorded no deaths from gun violence since mid-December.

[div class=attrib]Image: partial snapshot of Slate and @GunDeaths interactive graphic. Courtesy of Slate.[end-div]

Politics Driven by Science

Imagine a nation, or even a world, where political decisions and policy are driven by science rather than emotion. Well, small experiments are underway, so this may not be as far off as many would believe, or even dare to hope.

[div class=attrib]From the New Scientist:[end-div]

In your wildest dreams, could you imagine a government that builds its policies on carefully gathered scientific evidence? One that publishes the rationale behind its decisions, complete with data, analysis and supporting arguments? Well, dream no longer: that’s where the UK is heading.

It has been a long time coming, according to Chris Wormald, permanent secretary at the Department for Education. The civil service is not short of clever people, he points out, and there is no lack of desire to use evidence properly. More than 20 years of serving politicians has convinced him that they are as keen as anyone to create effective policies. “I’ve never met a minister who didn’t want to know what worked,” he says. What has changed now is that informed policy-making is at last becoming a practical possibility.

That is largely thanks to the abundance of accessible data and the ease with which new, relevant data can be created. This has supported a desire to move away from hunch-based politics.

Last week, for instance, Rebecca Endean, chief scientific advisor and director of analytical services at the Ministry of Justice, announced that the UK government is planning to open up its data for analysis by academics, accelerating the potential for use in policy planning.

At the same meeting, hosted by innovation-promoting charity NESTA, Wormald announced a plan to create teaching schools based on the model of teaching hospitals. In education, he said, the biggest single problem is a culture that often relies on anecdotal experience rather than systematically reported data from practitioners, as happens in medicine. “We want to move teacher training and research and practice much more onto the health model,” Wormald said.

Test, learn, adapt

In June last year the Cabinet Office published a paper called “Test, Learn, Adapt: Developing public policy with randomised controlled trials”. One of its authors, the doctor and campaigning health journalist Ben Goldacre, has also been working with the Department of Education to compile a comparison of education and health research practices, to be published in the BMJ.

In education, the evidence-based revolution has already begun. A charity called the Education Endowment Foundation is spending £1.4 million on a randomised controlled trial of reading programmes in 50 British schools.

There are reservations though. The Ministry of Justice is more circumspect about the role of such trials. Where it has carried out randomised controlled trials, they often failed to change policy, or even irked politicians with conclusions that were obvious. “It is not a panacea,” Endean says.

Power of prediction

The biggest need is perhaps foresight. Ministers often need instant answers, and sometimes the data are simply not available. Bang goes any hope of evidence-based policy.

“The timescales of policy-making and evidence-gathering don’t match,” says Paul Wiles, a criminologist at the University of Oxford and a former chief scientific adviser to the Home Office. Wiles believes that to get round this we need to predict the issues that the government is likely to face over the next decade. “We can probably come up with 90 per cent of them now,” he says.

Crucial to the process will be convincing the public about the value and use of data, so that everyone is on-board. This is not going to be easy. When the government launched its Administrative Data Taskforce, which set out to look at data in all departments and opening it up so that it could be used for evidence-based policy, it attracted minimal media interest.

The taskforce’s remit includes finding ways to increase trust in data security. Then there is the problem of whether different departments are legally allowed to exchange data. There are other practical issues: many departments format data in incompatible ways. “At the moment it’s incredibly difficult,” says Jonathan Breckon, manager of the Alliance for Useful Evidence, a collaboration between NESTA and the Economic and Social Research Council.

[div class=attrib]Read the entire article after the jump.[end-div]

Map as Illusion

We love maps here at theDiagonal. We also love ideas that challenge the status quo. And this latest Strange Map, courtesy of Frank Jacobs over at Big Think, does both. What we appreciate about this cartographic masterpiece is that it challenges our visual perception and, more importantly, our assumed hemispheric worldview.

[div class=attrib]Read more of this article after the jump.[end-div]

National Geographic Hits 125

Chances are that if you don’t have some ancient National Geographic magazines hidden in a box in your attic, then you know someone who does. If not, it’s time to see what you have been missing all these years. National Geographic celebrates 125 years in 2013, and what better way to do this than to look back through some of its glorious photographic archives.

[div class=attrib]See more classic images after the jump.[end-div]

[div class=attrib]Image: 1964, Tanzania: a touching moment between the primatologist and National Geographic grantee Jane Goodall and a young chimpanzee called Flint at Tanzania’s Gombe Stream reserve. Courtesy of Guardian / National Geographic.[end-div]

Time for Some Pigovian Taxes

Leaving the merits of capitalism or socialism aside for a moment, let’s consider the case for taxing bad behavior versus good. Adam Davidson, economics columnist and founder of NPR’s Planet Money, reviews the case now being made by a growing number of economists on both the left and the right. They all come to a similar conclusion: Forget about taxing good or constructive behavior such as entrepreneurialism. Rather, it’s time to tax people for doing destructive and damaging things.

Arthur Pigou, the early-20th-century economist after whom Pigovian taxes are named, argued that people should face the consequences of externalities. An externality is an action we take that affects others but to which the market cannot, yet, assign a price. Here’s an example. Say on your morning commute to work your bad habit of driving while using a mobile phone causes an accident followed by an hour-long traffic jam — the lost productivity from all those stuck behind you on the highway is an externality. So, the thinking goes, what if we were to tax such errant behavior? Not only would governments secure an alternate, or — sigh — yet another, form of revenue, but we could also collectively discourage bad behavior through monetary means. Taxes on tobacco are a good example — more so given the addictive nature of nicotine.

Perhaps it’s time for a tax on burgers and fries, a tax on sneezing and coughing in public, and, why not, a tax on those who sing out of tune.

[div class=attrib]From the New York Times:[end-div]

Driving home during the holidays, I found myself trapped in the permanent traffic jam on I-95 near Bridgeport, Conn. In the back seat, my son was screaming. All around, drivers had the menaced, lifeless expressions that people get when they see cars lined up to the horizon. It was enough to make me wish for congestion pricing — a tax paid by drivers to enter crowded areas at peak times. After all, it costs drivers about $16 to enter central London during working hours. A few years ago, it nearly caught on in New York. And on that drive home, I would have happily paid whatever it cost to persuade some other drivers that it wasn’t worth it for them to be on the road.

Instead, we all suffered. Each car added an uncharged burden to every other person. In fact, everyone on the road was doing all sorts of harm to society without paying the cost. I drove about 150 miles that day and emitted, according to E.P.A. data, about 140 pounds of carbon dioxide. My very presence also increased (albeit infinitesimally) the likelihood of a traffic accident, further dependence on foreign oil and the proliferation of urban sprawl. According to an influential study by the I.M.F. economist Ian Parry, my hours on the road cost society around $10. Add up all the cars in all the traffic jams across the country, and it’s clear that drivers are costing hundreds of billions of dollars a year that we don’t pay for.

This is how economists think, anyway. And that’s why a majority of them support some form of Pigovian tax, named after Arthur Pigou, the early-20th-century British economist. Pigou developed the idea of externalities: the things we do that affect others and that the market is unable to price. A negative externality is like the national equivalent of what happens when you go to dinner with three friends and, knowing that you’ll pay only a fourth of the bill, decide to order an expensive entree. Pigou argued that there are so many damaging things that we do — play music too loudly, drive aggressively — and that we’d probably do less if we had to pay for them.

The $10 I cost the economy was based on Parry’s algorithm, which calculates that drivers should pay a tax of at least $1.25 a gallon. Forty percent of that price, he says, is the cost that each vehicle adds to congestion. Another 40 cents or so offsets the price of accidents if we divided the full cost — more than $400 billion annually — by each gallon of gas consumed. (Only about 32 cents would be needed to offset the impact on the environment.) According to Parry’s logic, if we paid a tax of $1.25 per gallon instead of the current average of 50 cents, the price of gas would increase by about 25 percent to around $4 a gallon, which is still well below what much of Europe pays. But it would still encourage us to drive less, pollute less, crash less, lower the country’s dependence on foreign oil and make cities more livable. Not surprisingly, several studies have found that people — especially in Europe, where the gas tax is around $3 a gallon — drive a lot less when they have to pay a lot more for gas.
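Parry’s arithmetic is simple enough to check against the figures quoted above. In the hedged sketch below, only the pump price is an assumption (the article never states it); everything else is taken from the paragraph.

```python
# Recomputing the externality tax from the figures quoted in the article.
congestion  = 0.50   # 40% of the proposed $1.25-per-gallon tax
accidents   = 0.40   # "another 40 cents or so"
environment = 0.32   # "about 32 cents"
proposed_tax = congestion + accidents + environment   # ~$1.22, i.e. "at least $1.25"

current_tax = 0.50   # current average US gas tax per gallon, per the article
pump_price  = 3.25   # assumed pre-increase pump price; not stated in the article
increase = proposed_tax - current_tax
print(f"extra tax ${increase:.2f}/gal -> "
      f"${pump_price + increase:.2f}/gal ({increase / pump_price:.0%} more)")
```

With those inputs the pump price lands just under $4 a gallon, broadly in line with the article’s “about 25 percent” figure.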

The idea of raising taxes to help society might sound like the ravings of a left-wing radical, or an idea that would destroy American industry. Yet the nation’s leading proponent of a Pigovian gas tax is N. Gregory Mankiw, chairman of President George W. Bush’s Council of Economic Advisers and a consultant to Mitt Romney’s 2012 campaign. Mankiw keeps track of others who support Pigovian taxes, and his unofficial Pigou Club is surely the only group that counts Ralph Nader and Al Gore along with leading conservatives like Charles Krauthammer, Alan Greenspan and Gary Becker as members.

Republican economists, like Mankiw, normally oppose tax increases, but many support Pigovian taxes because, in some sense, we are already paying them. We pay the tax in the form of the overcrowded roads, higher insurance premiums, smog and global warming. Adding an extra fee at the pump simply makes the cost explicit. Pigou’s approach, Mankiw argues, also converts a burden into a benefit. Imposing taxes on income and capital gains, he notes, punishes the work and investment that improve society; taxing negative externalities allows the government to make money while discouraging activity that hurts the overall economy.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Arthur Cecil Pigou, 1943. Courtesy of Ramsey and Muspratt Collection.[end-div]

Climate Change Report

No pithy headline. The latest U.S. National Climate Assessment makes for sobering reading. The full 1,146-page report is available for download here.

Over the next 30 years (and beyond), it warns of projected sea-level rises along the Eastern Seaboard of the United States, warmer temperatures across much of the nation, and generally warmer and more acidic oceans. More worrying still are the less direct consequences of climate change: increased threats to human health due to severe weather such as storms, drought and wildfires; more vulnerable infrastructure in regions subject to increasingly volatile weather; and rising threats to regional stability and national security due to a less reliable national and global water supply.

[div class=attrib]From Scientific American:[end-div]

The consequences of climate change are now hitting the United States on several fronts, including health, infrastructure, water supply, agriculture and especially more frequent severe weather, a congressionally mandated study has concluded.

A draft of the U.S. National Climate Assessment, released on Friday, said observable change to the climate in the past half-century “is due primarily to human activities, predominantly the burning of fossil fuel,” and that no areas of the United States were immune to change.

“Corn producers in Iowa, oyster growers in Washington State, and maple syrup producers in Vermont have observed changes in their local climate that are outside of their experience,” the report said.

Months after Superstorm Sandy hurtled into the U.S. East Coast, causing billions of dollars in damage, the report concluded that severe weather was the new normal.

“Certain types of weather events have become more frequent and/or intense, including heat waves, heavy downpours, and, in some regions, floods and droughts,” the report said, days after scientists at the National Oceanic and Atmospheric Administration declared 2012 the hottest year ever in the United States.

Some environmentalists looked for the report to energize climate efforts by the White House or Congress, although many Republican lawmakers are wary of declaring a definitive link between human activity and evidence of a changing climate.

The U.S. Congress has been mostly silent on climate change since efforts to pass “cap-and-trade” legislation collapsed in the Senate in mid-2010.

The advisory committee behind the report was established by the U.S. Department of Commerce to integrate federal research on environmental change and its implications for society. It made two earlier assessments, in 2000 and 2009.

Thirteen departments and agencies, from the Agriculture Department to NASA, are part of the committee, which also includes academics, businesses, nonprofits and others.

‘A WARNING TO ALL OF US’

The report noted that of an increase in average U.S. temperatures of about 1.5 degrees F (.83 degree C) since 1895, when reliable national record-keeping began, more than 80 percent had occurred in the past three decades.

With heat-trapping gases already in the atmosphere, temperatures could rise by a further 2 to 4 degrees F (1.1 to 2.2 degrees C) in most parts of the country over the next few decades, the report said.

[div class=attrib]Read the entire article following the jump.[end-div]

Consumer Electronics Gone Mad

If you eat too quickly, then HAPIfork is the new eating device for you. If you have trouble seeing text on your palm-sized iPad, then Lenovo’s 27-inch tablet is for you. If you need musical motivation from One Direction to get your children to brush their teeth, then the Brush Buddies toothbrush is for you and your kids. If you’re tired of technology, then stay away from this year’s Consumer Electronics Show (CES 2013).

If you’d like to see other strange products looking for a buyer, follow this jump.

[div class=attrib]Image: The HAPIfork monitors how fast its user is eating and vibrates to alert them if they exceed a predetermined rate, which altogether sounds like an incredibly strange eating experience. Courtesy of CES / Telegraph.[end-div]

Photography is Now Our Art

Over at the Guardian’s art and culture blog, Jonathan Jones argues that photography has become our de facto medium for contemporary artistic expression. Some may argue that the creative process underlying photography comes up short when compared with the skills and techniques required to produce art in more traditional media. However, Jones seems right in one respect: today’s photography captures the drama of the human condition in a way that no other medium can; it’s not even close. We are in awe of the skills demonstrated by the Old Masters, but the fact that it took Rembrandt months to paint a single canvas misses the point. It still takes an eye, empathy and a desire to tell a unique story as the photographer clicks the digital shutter in one five-hundredth of a second.

[div class=attrib]From the Guardian:[end-div]

It has taken me a long time to see this, and you can laugh at me if you like. But here goes.

Photography is the serious art of our time. It also happens to be the most accessible and democratic way of making art that has ever been invented. But first, let’s define photography.

A photograph is an image captured on film, paper or – most commonly now – in digital memory. Photography also includes moving images captured on film or video. Moving or still, we all know a photograph is not a pure record of the visual world: it can be edited and transformed in infinite ways.

Moving or still, and however it is taken, whether by pinhole camera or phone, the photographic image is the successor to the great art of the past. It is in pictures by Don McCullin or films by Martin Scorsese that we see the real old master art of our time. Why? Because photography relishes human life. The greatness of art lies in human insight. What matters most is not the oil paints Rembrandt used, but his compassion. Photography is the quickest, most exact tool ever invented to record our lives and deaths – 17th-century painters would have loved it.

Or if David Hockney is right, they did love it. Vermeer almost certainly used a camera obscura to compose his scenes. Hockney believes that Caravaggio and many more artists used a “secret knowledge” of early cameras to perfect their almost hallucinatory understanding of the visual world.

However they did it, they painted the flux and drama of real life. From birth to death, great art is a sequence of moving pictures of the human condition.

Today, photography is the only art that seriously maintains this attention to the stuff that matters. Just look (as the world is looking) at this week’s incredible photographs of a family surviving a wild fire in Tasmania. Here is the human creature, vulnerable and heroic.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Tim Holmes (not pictured) and his wife Tammy (second from left) huddled under a jetty for three hours with their grandchildren while their hometown in Tasmania was destroyed by wildfires. Courtesy of Tim Holmes/AP.[end-div]

Shedding Some Light On Dark Matter

Cosmologists invoked dark matter to account for the hidden mass in our universe. Yet, as the name implies, it is proving rather hard to find. Now astronomers believe they see hints of it in ancient galactic collisions.

[div class=attrib]From New Scientist:[end-div]

Colliding clusters of galaxies may hold clues to a mysterious dark force at work in the universe. This force would act only on invisible dark matter, the enigmatic stuff that makes up 86 per cent of the mass in the universe.

Dark matter famously refuses to interact with ordinary matter except via gravity, so theorists had assumed that its particles would be just as aloof with each other. But new observations suggest that dark matter interacts significantly with itself, while leaving regular matter out of the conversation.

“There could be a whole class of dark particles that don’t interact with normal matter but do interact with themselves,” says James Bullock of the University of California, Irvine. “Dark matter could be doing all sorts of interesting things, and we’d never know.”

Some of the best evidence for dark matter’s existence came from the Bullet cluster, a smash-up in which a small galaxy cluster plunged through a larger one about 100 million years ago. Separated by hundreds of light years, the individual galaxies sailed right past each other, and the two clusters parted ways. But intergalactic gas collided and pooled on the trailing ends of each cluster.

Mass maps of the Bullet cluster showed that dark matter stayed in line with the galaxies instead of pooling with the gas, proving that it can separate from ordinary matter. This also hinted that dark matter wasn’t interacting with itself, and was affected by gravity alone.

Musket shot

Last year William Dawson of the University of California, Davis, and colleagues found an older set of clusters seen about 700 million years after their collision. Nicknamed the Musket Ball cluster, this smash-up told a different tale. When Dawson’s team analysed the concentration of matter in the Musket Ball, they found that galaxies are separated from dark matter by about 19,000 light years.

“The galaxies outrun the dark matter. That’s what creates the offset,” Dawson said. “This is fitting that picture of self-interacting dark matter.” If dark matter particles do interact, perhaps via a dark force, they would slow down like the gas.

This new picture could solve some outstanding mysteries in cosmology, Dawson said this week during a meeting of the American Astronomical Society in Long Beach, California. Non-interacting dark matter should sink to the cores of star clusters and dwarf galaxies, but observations show that it is more evenly distributed. If it interacts with itself, it could puff up and spread outward like a gas.

So why doesn’t the Bullet cluster show the same separation between dark matter and galaxies? Dawson thinks it’s a question of age – dark matter in the younger Bullet simply hasn’t had time to separate.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: An overlay of an optical image of a cluster of galaxies with an x-ray image of hot gas lying within the cluster. Courtesy of NASA.[end-div]

Next Potential Apocalypse: 2036

Having missed out on the recent apocalypse said to have been predicted by the Mayans, we can now look ahead to the next possible end of the world, set for 2036. This time it’s courtesy of the aptly named asteroid Apophis.

[div class=attrib]From the Guardian:[end-div]

In Egyptian myth, Apophis was the ancient spirit of evil and destruction, a demon that was determined to plunge the world into eternal darkness.

A fitting name, astronomers reasoned, for a menace now hurtling towards Earth from outer space. Scientists are monitoring the progress of a 390-metre-wide asteroid discovered last year that is potentially on a collision course with the planet, and are imploring governments to decide on a strategy for dealing with it.

Nasa has estimated that an impact from Apophis, which has an outside chance of hitting the Earth in 2036, would release more than 100,000 times the energy released in the nuclear blast over Hiroshima. Thousands of square kilometres would be directly affected by the blast but the whole of the Earth would see the effects of the dust released into the atmosphere.

And, scientists insist, there is actually very little time left to decide. At a recent meeting of experts in near-Earth objects (NEOs) in London, scientists said it could take decades to design, test and build the required technology to deflect the asteroid. Monica Grady, an expert in meteorites at the Open University, said: “It’s a question of when, not if, a near Earth object collides with Earth. Many of the smaller objects break up when they reach the Earth’s atmosphere and have no impact. However, a NEO larger than 1km [wide] will collide with Earth every few hundred thousand years and a NEO larger than 6km, which could cause mass extinction, will collide with Earth every hundred million years. We are overdue for a big one.”

Apophis had been intermittently tracked since its discovery in June last year but, in December, it started causing serious concern. Projecting the orbit of the asteroid into the future, astronomers had calculated that the odds of it hitting the Earth in 2029 were alarming. As more observations came in, the odds got higher.

Having more than 20 years’ warning of potential impact might seem plenty of time. But, at last week’s meeting, Andrea Carusi, president of the Spaceguard Foundation, said that the time for governments to make decisions on what to do was now, to give scientists time to prepare mitigation missions. At the peak of concern, the Apophis asteroid was placed at four out of 10 on the Torino scale – a measure of the threat posed by an NEO where 10 is a certain collision which could cause a global catastrophe. This was the highest of any asteroid in recorded history and it had a 1 in 37 chance of hitting the Earth. The threat of a collision in 2029 was eventually ruled out at the end of last year.
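
As a back-of-the-envelope check on the energy figure quoted above, here is a minimal sketch in Python. The 390-metre diameter comes from the article; the density and impact velocity are assumed values typical for a stony asteroid, so this is an order-of-magnitude illustration only.

import math

# Order-of-magnitude check on the impact-energy figure quoted above.
# The diameter comes from the article; density and velocity are assumed typical values.
diameter_m = 390.0            # from the article
density_kg_m3 = 3000.0        # assumed: typical stony asteroid
velocity_m_s = 12.6e3         # assumed: plausible Earth-impact speed

radius = diameter_m / 2
mass = density_kg_m3 * (4 / 3) * math.pi * radius**3
energy_j = 0.5 * mass * velocity_m_s**2

hiroshima_j = 6.3e13          # roughly 15 kilotons of TNT
print(f"Impact energy: {energy_j:.2e} J, about {energy_j / hiroshima_j:,.0f} times Hiroshima")
# With these assumptions the answer comes out around 10^5 Hiroshimas,
# consistent with the "more than 100,000 times" estimate quoted above.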

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Graphic: This graphic shows the orbit of the asteroid Apophis in relation to the paths of Earth and other planets in the inner solar system. Courtesy of MSNBC.[end-div]

What’s Next at the LHC: Parallel Universe?

The Large Hadron Collider (LHC) at CERN made headlines in 2012 with the announcement of the probable discovery of the Higgs boson. Scientists are collecting and analyzing more data before they declare an outright discovery in 2013. In the meantime, they plan to use the giant machine to examine even more interesting science — at very small and very large scales — in the new year.

[div class=attrib]From the Guardian:[end-div]

When it comes to shutting down the most powerful atom smasher ever built, it’s not simply a question of pressing the off switch.

In the French-Swiss countryside on the far side of Geneva, staff at the Cern particle physics laboratory are taking steps to wind down the Large Hadron Collider. After the latest run of experiments ends next month, the huge superconducting magnets that line the LHC’s 27km-long tunnel must be warmed up, slowly and gently, from -271 Celsius to room temperature. Only then can engineers descend into the tunnel to begin their work.

The machine that last year helped scientists snare the elusive Higgs boson – or a convincing subatomic impostor – faces a two-year shutdown while engineers perform repairs that are needed for the collider to ramp up to its maximum energy in 2015 and beyond. The work will beef up electrical connections in the machine that were identified as weak spots after an incident four years ago that knocked the collider out for more than a year.

The accident happened days after the LHC was first switched on in September 2008, when a short circuit blew a hole in the machine and sprayed six tonnes of helium into the tunnel that houses the collider. Soot was scattered over 700 metres. Since then, the machine has been forced to run at near half its design energy to avoid another disaster.

The particle accelerator, which reveals new physics at work by crashing together the innards of atoms at close to the speed of light, fills a circular, subterranean tunnel a staggering eight kilometres in diameter. Physicists will not sit around idle while the collider is down. There is far more to know about the new Higgs-like particle, and clues to its identity are probably hidden in the piles of raw data the scientists have already gathered, but have had too little time to analyse.

But the LHC was always more than a Higgs hunting machine. There are other mysteries of the universe that it may shed light on. What is the dark matter that clumps invisibly around galaxies? Why are we made of matter, and not antimatter? And why is gravity such a weak force in nature? “We’re only a tiny way into the LHC programme,” says Pippa Wells, a physicist who works on the LHC’s 7,000-tonne Atlas detector. “There’s a long way to go yet.”

The hunt for the Higgs boson, which helps explain the masses of other particles, dominated the publicity around the LHC for the simple reason that it was almost certainly there to be found. The lab fast-tracked the search for the particle, but cannot say for sure whether it has found it, or some more exotic entity.

“The headline discovery was just the start,” says Wells. “We need to make more precise measurements, to refine the particle’s mass and understand better how it is produced, and the ways it decays into other particles.” Scientists at Cern expect to have a more complete identikit of the new particle by March, when repair work on the LHC begins in earnest.

By its very nature, dark matter will be tough to find, even when the LHC switches back on at higher energy. The label “dark” refers to the fact that the substance neither emits nor reflects light. The only way dark matter has revealed itself so far is through the pull it exerts on galaxies.

Studies of spinning galaxies show they rotate with such speed that they would tear themselves apart were there not some invisible form of matter holding them together through gravity. There is so much dark matter, it outweighs by five times the normal matter in the observable universe.

The search for dark matter on Earth has failed to reveal what it is made of, but the LHC may be able to make the substance. If the particles that constitute it are light enough, they could be thrown out from the collisions inside the LHC. While they would zip through the collider’s detectors unseen, they would carry energy and momentum with them. Scientists could then infer their creation by totting up the energy and momentum of all the particles produced in a collision, and looking for signs of the missing energy and momentum.
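
To make the missing-momentum bookkeeping described above concrete, here is a toy sketch in Python. The event and its numbers are invented for illustration; this is not the analysis code actually used at the LHC.

import math

# Toy illustration of the missing-momentum bookkeeping described above.
# Each visible particle is (transverse momentum in GeV, azimuthal angle in radians).
visible_particles = [
    (45.0, 0.3),   # e.g. a jet
    (60.0, 2.8),   # another jet
    (25.0, -1.2),  # a lepton
]

# Momentum must balance in the plane transverse to the beam,
# so sum the visible px and py components...
px = sum(pt * math.cos(phi) for pt, phi in visible_particles)
py = sum(pt * math.sin(phi) for pt, phi in visible_particles)

# ...and whatever is needed to cancel that sum is the "missing" transverse momentum,
# a possible signature of invisible particles such as dark matter candidates.
missing_pt = math.hypot(px, py)
print(f"Missing transverse momentum: {missing_pt:.1f} GeV")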

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The eight toroidal magnets can be seen on the huge ATLAS detector with the calorimeter before it is moved into the middle of the detector. This calorimeter will measure the energies of particles produced when protons collide in the centre of the detector. ATLAS will work alongside the CMS experiment to search for new physics at the 14 TeV level. Courtesy of CERN.[end-div]

From 7 Up to 56 Up

[tube]ngSGIjwwc4U[/tube]

The classic documentary and social experiment continues with the release this week of “56 Up”. Michael Apted began this remarkable project with the documentary “Seven Up!” in 1964, which followed the lives of 14 British children, aged 7, from different socio-economic backgrounds. Although it was initially planned as a one-off, subsequent installments have followed in seven-year cycles, each bringing us up to date with the lives of Apted’s growing subjects. Now they are all turning 56. Forty-nine years on, the personal stories are poignant and powerful, yet class divisions remain.

[div class=attrib]From the Telegraph:[end-div]

Life rushes by so fast, it flickers today and is gone tomorrow. In “56 Up” — the latest installment in Michael Apted’s remarkable documentary project that has followed a group of Britons since 1964, starting when they were 7 — entire lifetimes race by with a few edits. One minute, a boy is merrily bobbing along. The next, he is 56 years old, with a wife or an ex, a few children or none, a career, a job or just dim prospects. Rolls of fat girdle his middle and thicken his jowls. He has regrets, but their sting has usually softened, along with everything else.

In a lot of documentaries you might not care that much about this boy and what became of him. But if you have watched any of the previous episodes in Mr. Apted’s series, you will care, and deeply, partly because you watched that boy grow up, suffer and triumph in a project that began as a news gimmick and social experiment and turned into a plangent human drama. Conceived as a one-off for a current-affairs program on Granada Television, the first film, “Seven Up!,” was a 40-minute look at the lives of 14 children from different backgrounds. Britain was changing, or so went the conventional wisdom, with postwar affluence having led the working class to adopt middle-class attitudes and lifestyles.

In 1963, though, the sociologists John H. Goldthorpe and David Lockwood disputed this widely held “embourgeoisement thesis,” arguing that the erosion of social class had not been as great as believed. In its deeply personal fashion, the “Up” series went on to make much the same point by checking in with many of the same boys and girls, men and women, every seven years. Despite some dropouts, the group has remained surprisingly intact. For better and sometimes worse, and even with their complaints about the series, participants like Tony Walker, who wanted to be a jockey and found his place as a cabby, have become cyclical celebrities. For longtime viewers they have become something more, including mirrors.

It’s this mirroring that helps make the series so poignant. As in the earlier movies, Mr. Apted again folds in older material from the ages of 7, 14 and so on, to set the scene and jog memories. The abrupt juxtapositions of epochs can be jarring, unnerving or touching — sometimes all three — as bright-faced children bloom and sometimes fade within seconds. An analogous project in print or even still photographs wouldn’t be as powerful, because what gives the “Up” series its punch is not so much its longevity or the human spectacle it offers, but that these are moving images of touchingly vibrant lives at certain moments in time and space. The more you watch, the more the movies transform from mirrors into memory machines, ones that inevitably summon reflections of your own life.

Save for “Seven Up!,” filmed in gorgeous black and white, the documentaries are aesthetically unremarkable. Shot in digital, “56 Up” pretty much plays like the earlier movies, with its mix of interviews and location shooting. Every so often you hear someone off screen, presumably Mr. Apted, make a comment, though mostly he lets his choice of what to show — the subjects at work or play, with family or friends — and his editing do his editorializing. In the past he has brought participants together, but he doesn’t here, which feels like a missed opportunity. Have the three childhood friends from the East End of London, Jackie Bassett, Lynn Johnson and Sue Sullivan, two of whom have recently endured heart-rendingly bad times, remained in contact? Mr. Apted doesn’t say.

With few exceptions and despite potential path-changing milestones like marriages and careers, everyone seems to have remained fairly locked in his or her original social class. At 7, Andrew Brackfield and John Brisby already knew which universities they would or should attend. “We think,” John said in “Seven Up!,” “I’m going to Cambridge and Trinity Hall,” though he landed at Oxford. Like Mr. Brackfield, who did attend Cambridge, Mr. Brisby became a lawyer and still sounds to the manner born, with an accent that evokes old-fashioned news readers and Bond villains. The two hold instructively different views about whether the series corroborates the first film’s thesis about the rigidity of the British class structure, never mind that their lives are strong evidence that little has changed.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Video: 7 Up – Part 1. Courtesy of World in Action, Granada TV.[end-div]

Plagiarism is the Sincerest Form of Capitalism

Plagiarism is a fine art in China. But it’s also very big business. The nation knocks off everything, from Hollywood and Bollywood movies to software, electronics, appliances, drugs, and military equipment. Now, it’s moved on to copying architectural plans.

[div class=attrib]From the Telegraph:[end-div]

China is famous for its copy-cat architecture: you can find replicas of everything from the Eiffel Tower and the White House to an Austrian village across its vast land. But now they have gone one step further: recreating a building that hasn’t even been finished yet. A building designed by the Iraqi-British architect Dame Zaha Hadid for Beijing has been copied by a developer in Chongqing, south-west China, and now the two projects are racing to be completed first.

Dame Zaha, whose Wangjing Soho complex consists of three pebble-like constructions and will house an office and retail complex, unveiled her designs in August 2011 and hopes to complete the project next year.

Meanwhile, a remarkably similar project called Meiquan 22nd Century is being constructed in Chongqing, which experts (and anyone with eyes, really) deem a rip-off. The developers of the Soho complex are concerned that the other is being built at a much faster rate than their own.

“It is possible that the Chongqing pirates got hold of some digital files or renderings of the project,” Satoshi Ohashi, project director at Zaha Hadid Architects, told Der Spiegel online. “[From these] you could work out a similar building if you are technically very capable, but this would only be a rough simulation of the architecture.”

So where does the law stand? Reporting on the intriguing case, China Intellectual Property magazine commented, “Up to now, there is no special law in China which has specific provisions on IP rights related to architecture.” They added that if it went to court, the likely outcome would be payment of compensation to Dame Zaha’s firm, rather than the defendant being forced to pull the building down. However, Dame Zaha seems somewhat unfazed about the structure, simply remarking that if the finished building contains a certain amount of innovation then “that could be quite exciting”. One of the world’s most celebrated architects, Dame Zaha – who recently designed the Aquatics Centre for the London Olympics – has 11 current projects in China. She is quite the star over there: 15,000 fans flocked to see her give a talk at the unveiling of the designs for the complex.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Wangjing Soho Architecture. Courtesy of Zaha Hadid Architects.[end-div]

You Are Different From Yourself

The next time your spouse tells you that you’re “just not the same person anymore” there may be some truth to it. After all, we have already changed more than we once predicted we would, and we will likely change more than we now expect. That’s the overall result of a recent study of personality change over time in roughly 19,000 people.

[div class=attrib]From the Independent:[end-div]

When we remember our past selves, they seem quite different. We know how much our personalities and tastes have changed over the years. But when we look ahead, somehow we expect ourselves to stay the same, a team of psychologists said Thursday, describing research they conducted of people’s self-perceptions.

They called this phenomenon the “end of history illusion,” in which people tend to “underestimate how much they will change in the future.” According to their research, which involved more than 19,000 people ages 18 to 68, the illusion persists from teenage years into retirement.

“Middle-aged people — like me — often look back on our teenage selves with some mixture of amusement and chagrin,” said one of the authors, Daniel T. Gilbert, a psychologist at Harvard. “What we never seem to realize is that our future selves will look back and think the very same thing about us. At every age we think we’re having the last laugh, and at every age we’re wrong.”

Other psychologists said they were intrigued by the findings, published Thursday in the journal Science, and were impressed with the amount of supporting evidence. Participants were asked about their personality traits and preferences — their favorite foods, vacations, hobbies and bands — in years past and present, and then asked to make predictions for the future. Not surprisingly, the younger people in the study reported more change in the previous decade than did the older respondents.

But when asked to predict what their personalities and tastes would be like in 10 years, people of all ages consistently played down the potential changes ahead.

Thus, the typical 20-year-old woman’s predictions for her next decade were not nearly as radical as the typical 30-year-old woman’s recollection of how much she had changed in her 20s. This sort of discrepancy persisted among respondents all the way into their 60s.

And the discrepancy did not seem to be because of faulty memories, because the personality changes recalled by people jibed quite well with independent research charting how personality traits shift with age. People seemed to be much better at recalling their former selves than at imagining how much they would change in the future.

Why? Dr. Gilbert and his collaborators, Jordi Quoidbach of Harvard and Timothy D. Wilson of the University of Virginia, had a few theories, starting with the well-documented tendency of people to overestimate their own wonderfulness.

“Believing that we just reached the peak of our personal evolution makes us feel good,” Dr. Quoidbach said. “The ‘I wish that I knew then what I know now’ experience might give us a sense of satisfaction and meaning, whereas realizing how transient our preferences and values are might lead us to doubt every decision and generate anxiety.”

Or maybe the explanation has more to do with mental energy: predicting the future requires more work than simply recalling the past. “People may confuse the difficulty of imagining personal change with the unlikelihood of change itself,” the authors wrote in Science.

The phenomenon does have its downsides, the authors said. For instance, people make decisions in their youth — about getting a tattoo, say, or a choice of spouse — that they sometimes come to regret.

And that illusion of stability could lead to dubious financial expectations, as the researchers showed in an experiment asking people how much they would pay to see their favorite bands.

When asked about their favorite band from a decade ago, respondents were typically willing to shell out $80 to attend a concert of the band today. But when they were asked about their current favorite band and how much they would be willing to spend to see the band’s concert in 10 years, the price went up to $129. Even though they realized that favorites from a decade ago like Creed or the Dixie Chicks have lost some of their luster, they apparently expect Coldplay and Rihanna to blaze on forever.

“The end-of-history effect may represent a failure in personal imagination,” said Dan P. McAdams, a psychologist at Northwestern who has done separate research into the stories people construct about their past and future lives. He has often heard people tell complex, dynamic stories about the past but then make vague, prosaic projections of a future in which things stay pretty much the same.

[div class=attrib]Read the entire article after the jump.[end-div]

Planets From Stardust

Stunning images captured by the Atacama Large Millimetre/submillimetre Array (ALMA) radio telescope in Chile show the early stages of a planet forming from stardust around a star located 450 light-years from Earth. This is the first time that astronomers have snapped such a clear picture of the process, confirming long-held theories of planetary formation.

[div class=attrib]From the Independent:[end-div]

The world’s highest radio telescope, built on a Chilean plateau in the Andes 5,000 metres above sea level, has captured the first image of a new planet being formed as it gobbles up the cosmic dust and gas surrounding a distant star.

Astronomers have long predicted that giant “gas” planets similar to Jupiter would form by collecting the dust and debris that forms around a young star. Now they have the first visual evidence to support the phenomenon, scientists said.

The image taken by the Atacama Millimetre-submillimetre Array (ALMA) in Chile shows two streams of gas connecting the inner and outer disks of cosmic material surrounding the star HD 142527, which is about 450 light-years from Earth.

Astronomers believe the gas streamers are the result of two giant planets – too small to be visible in this image – exerting a gravitational pull on the cloud of surrounding dust and gas, causing the material to flow from the outer to inner stellar disks, said Simon Casassus of the University of Chile in Santiago.

“The most natural interpretation for the flows seen by ALMA is that the putative proto-planets are pulling streams of gas inward towards them that are channelled by their gravity. Much of the gas then overshoots the planets and continues inward to the portion of the disk close to the star, where it can eventually fall onto the star itself,” Dr Casassus said.

“Astronomers have been predicting that these streams exist, but this is the first time we’ve been able to see them directly. Thanks to the new ALMA telescope, we’ve been able to get direct observations to illuminate current theories of how planets are formed,” he said.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Observations (left) made with the ALMA telescope of the young star HD 142527. The dust in the outer disc is shown in red. Dense gas in the streams flowing across the gap, as well as in the outer disc, is shown in green. Diffuse gas in the central gap is shown in blue. The gas filaments can be seen at the three o’clock and ten o’clock positions, flowing from the outer disc towards the centre. And (right) an artist’s impression. Courtesy of Independent.[end-div]

Curiosity’s 10K Hike

Scientists and engineers at JPL have Mount Sharp in their sights. It’s no ordinary mountain — it’s situated on Mars. The 5,000-meter-high mountain is home to exposed layers of some promising sedimentary rocks, which hold clues to Mars’ geologic, and perhaps biological, history. Unfortunately, Mount Sharp is about 10 kilometers away from the Curiosity rover’s current position. So, at a top speed of around 100 meters per day, it will take Curiosity until the fall of 2013 to reach its destination.

[div class=attrib]From the New Scientist:[end-div]

NASA’S Curiosity rover is about to have its cake and eat it too. Around September, the rover should get its first taste of layered sediments at Aeolis Mons, a mountain over 5 kilometres tall that may hold preserved signs of life on Mars.

Previous rovers uncovered ample evidence of ancient water, a key ingredient for life as we know it. With its sophisticated on-board chemistry lab, Curiosity is hunting for more robust signs of habitability, including organic compounds – the carbon-based building blocks of life as we know it.

Observations from orbit show that the layers in Aeolis Mons – also called Mount Sharp – contain minerals thought to have formed in the presence of water. That fits with theories that the rover’s landing site, Gale crater, was once a large lake. Even better, the layers were probably laid down quickly enough that the rocks could have held on to traces of microorganisms, if they existed there.

If the search for organics turns up empty, Aeolis Mons may hold other clues to habitability, says project scientist John Grotzinger of the California Institute of Technology in Pasadena. The layers will reveal which minerals and chemical processes were present in Mars’s past. “We’re going to find all kinds of good stuff down there, I’m sure,” he says.

Curiosity will explore a region called Glenelg until early February, and then hit the gas. The base of the mountain is 10 kilometres away, and the rover can drive at about 100 metres a day at full speed. The journey should take between six and nine months, but will include stops to check out any interesting landmarks. After all, some of the most exciting discoveries from Mars rovers were a result of serendipity.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Base of Mount Sharp, Mars. Courtesy of NASA/JPL-Caltech/MSSS.[end-div]

Evolution and Autocatalysis

A clever idea about the process of emergence, from Stuart Kauffman and colleagues at the University of Vermont, has some evolutionary biologists thinking.

[div class=attrib]From MIT Technology Review:[end-div]

One of the most puzzling questions about the origin of life is how the rich chemical landscape that makes life possible came into existence.

This landscape would have consisted among other things of amino acids, proteins and complex RNA molecules. What’s more, these molecules must have been part of a rich network of interrelated chemical reactions which generated them in a reliable way.

Clearly, all that must have happened before life itself emerged. But how?

One idea is that groups of molecules can form autocatalytic sets. These are self-sustaining chemical factories, in which the product of one reaction is the feedstock or catalyst for another. The result is a virtuous, self-contained cycle of chemical creation.

Today, Stuart Kauffman at the University of Vermont in Burlington and a couple of pals take a look at the broader mathematical properties of autocatalytic sets. In examining this bigger picture, they come to an astonishing conclusion that could have remarkable consequences for our understanding of complexity, evolution and the phenomenon of emergence.

They begin by deriving some general mathematical properties of autocatalytic sets, showing that such a set can be made up of many autocatalytic subsets of different types, some of which can overlap.

In other words, autocatalytic sets can have a rich complex structure of their own.

They go on to show how evolution can work on a single autocatalytic set, producing new subsets within it that are mutually dependent on each other.  This process sets up an environment in which newer subsets can evolve.

“In other words, self-sustaining, functionally closed structures can arise at a higher level (an autocatalytic set of autocatalytic sets), i.e., true emergence,” they say.

That’s an interesting view of emergence and certainly seems a sensible approach to the problem of the origin of life. It’s not hard to imagine groups of molecules operating together like this. And indeed, biochemists have recently discovered simple autocatalytic sets that behave in exactly this way.

But what makes the approach so powerful is that the mathematics does not depend on the nature of chemistry–it is substrate independent. So the building blocks in an autocatalytic set need not be molecules at all but any units that can manipulate other units in the required way.

These units can be complex entities in themselves. “Perhaps it is not too far-fetched to think, for example, of the collection of bacterial species in your gut (several hundreds of them) as one big autocatalytic set,” say Kauffman and co.

And they go even further. They point out that the economy is essentially the process of transforming raw materials into products such as hammers and spades that themselves facilitate further transformation of raw materials and so on. “Perhaps we can also view the economy as an (emergent) autocatalytic set, exhibiting some sort of functional closure,” they speculate.
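
For a computational feel for what an autocatalytic set is, here is a toy sketch in Python. It repeatedly discards reactions whose reactants or catalyst cannot be produced from a "food set" of freely available molecules, in the spirit of the RAF ("reflexively autocatalytic and food-generated") formalism associated with this line of work; the little reaction network itself is invented for illustration.

# Toy sketch of finding a maximal autocatalytic (RAF) subset of a reaction network.
# Each reaction turns a set of reactants into products and requires a catalyst.
# The network below is invented for illustration.
food = {"a", "b"}  # simple molecules assumed to be freely available

reactions = {
    "r1": {"reactants": {"a", "b"}, "products": {"ab"}, "catalyst": "abb"},
    "r2": {"reactants": {"ab", "b"}, "products": {"abb"}, "catalyst": "ab"},
    "r3": {"reactants": {"ab"}, "products": {"abab"}, "catalyst": "zz"},  # catalyst never produced
}

def closure(food, rxns):
    """All molecules reachable from the food set, ignoring catalysis."""
    produced = set(food)
    changed = True
    while changed:
        changed = False
        for r in rxns.values():
            if r["reactants"] <= produced and not r["products"] <= produced:
                produced |= r["products"]
                changed = True
    return produced

def max_raf(food, rxns):
    """Discard reactions whose reactants or catalyst cannot be produced; repeat to a fixed point."""
    current = dict(rxns)
    while True:
        reachable = closure(food, current)
        kept = {name: r for name, r in current.items()
                if r["reactants"] <= reachable and r["catalyst"] in reachable}
        if kept.keys() == current.keys():
            return kept
        current = kept

print(sorted(max_raf(food, reactions)))  # ['r1', 'r2']: each produces the other's catalyst; r3 is discarded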

[div class=attrib]Read the entire article after the jump.[end-div]