Category Archives: Book Reviews

Zebra Stripes

Why do zebras have stripes? Well, we’ve all learned from an early age that their peculiar black-and-white stripes are an adaptation to combat predators. One theory suggests that the stripes are camouflage. Another suggests that the stripes confuse predators. Yet another proposes that the stripes are a vivid warning signal.

But Tim Caro, professor of wildlife biology at the University of California, Davis, has a thoroughly different idea, conveyed in his new book, Zebra Stripes. After twenty years of study, he’s convinced that the zebra’s stripes serve a more mundane purpose — deterring pesky biting flies.

From Wired:

At four in the morning, Tim Caro roused his colleagues. Bleary-eyed and grumbling, they followed him to the edge of the village, where the beasts were hiding. He sat them down in chairs, and after letting their eyes adjust for a minute, he asked them if they saw anything. And if so, would they please point where?

Not real beasts. Despite being camped in Tanzania’s Katavi National Park, Caro was asking his colleagues to identify pelts—from a wildebeest, an impala, and a zebra—that he had draped over chairs or clotheslines. Caro wanted to know if the zebra’s stripes gave it any sort of camouflage in the pre-dawn, when many predators hunt, and he needed the sort of replicability he could not count on from the animals roaming the savannah. “I lost a lot of social capital on that experiment,” says Caro. “If you’re going to be woken up at all, it’s important to be woken up for something exciting or unpredictable, and this was neither.”

The experiment was one of hundreds Caro performed over a twenty year scientific odyssey to discover why zebras have stripes—a question that nearly every major biologist since Alfred Russel Wallace has tried to answer. “It became sort of a challenge to me to try and investigate all the existing hypotheses so I could not only identify the right one,” he says, “but just as importantly kill all those remaining.” His new book, Zebra Stripes, chronicles every detail.

Read the entire story here.

Image: Zebras, Botswana. Courtesy: Paul Maritz, 2002. Creative Commons Attribution-Share Alike 3.0.

What Up With That: Nationalism

The recent political earthquake in the US is just one example of a nationalistic wave that swept across Western democracies in 2015-2016. The election result seemed to surprise many political talking-heads, since the nation was, and still is, on a continuing path towards greater liberalism (mostly due to demographics).

So, what exactly is up with that? Can American liberals enter a coma for the next 4 years, sure to awaken refreshed and ready for a new left-of-center regime? Or, is the current nationalistic mood — albeit courtesy of a large minority — likely to prevail for a while longer? Well, there’s no clear answer, and political scientists and researchers are baffled.

Care to learn more about theories of nationalism and the historical underpinnings of nationalism? Visit my reading list over at Goodreads. But make sure you start with: Imagined Communities: Reflections on the Origin and Spread of Nationalism by Benedict Anderson. It’s been the global masterwork on the analysis of nationalism since it was first published in 1983.

I tend to agree with Anderson’s thesis, that a nation is mostly a collective figment of people’s imagination facilitated by modern communications networks. So, I have to believe that eventually our networks will help us overcome the false strictures of our many national walls and borders.

From Scientific American:

Waves of nationalist sentiment are reshaping the politics of Western democracies in unexpected ways — carrying Donald Trump to a surprise victory last month in the US presidential election, and pushing the United Kingdom to vote in June to exit the European Union. And nationalist parties are rising in popularity across Europe.

Many economists see this political shift as a consequence of globalization and technological innovation over the past quarter of a century, which have eliminated many jobs in the West. And political scientists are tracing the influence of cultural tensions arising from immigration and from ethnic, racial and sexual diversity. But researchers are struggling to understand why these disparate forces have combined to drive an unpredictable brand of populist politics.

“We have to start worrying about the stability of our democracies,” says Yascha Mounk, a political scientist at Harvard University in Cambridge, Massachusetts. He notes that the long-running World Values Survey shows that people are increasingly disaffected with their governments — and more willing to support authoritarian leaders.

Some academics have explored potential parallels between the roots of the current global political shift and the rise of populism during the Great Depression, including in Nazi Germany. But Helmut Anheier, president of the Hertie School of Governance in Berlin, cautions that the economic struggles of middle-class citizens across the West today are very different, particularly in mainland Europe.

The Nazis took advantage of the extreme economic hardship that followed the First World War and a global depression, but today’s populist movements are growing powerful in wealthy European countries with strong social programmes. “What brings about a right-wing movement when there are no good reasons for it?” Anheier asks.

In the United States, some have suggested that racism motivated a significant number of Trump voters. But that is too simplistic an explanation, says Theda Skocpol, a sociologist at Harvard University.  “Trump dominated the news for more than a year, and did so with provocative statements that were meant to exacerbate every tension in the US,” she says.

Read the entire story here.

p.s. What Up With That is my homage to the recurring Saturday Night Live (SNL) sketch of the same name.

Heroes Only Die at the Top of Hills

We all need heroes. So, if you wish to become one, you would stand a better chance if you took your dying breaths atop a hill. Also, it would really help your cause if you arrived via virgin birth.

Accordingly, please refer to the Rank-Raglan Mythotype — a list of 22 cross-cultural traits that mark a hero of mythological proportions (far beyond being a YouTube sensation):

  1. Hero’s mother is a royal virgin;
  2. His father is a king, and
  3. Often a near relative of his mother, but
  4. The circumstances of his conception are unusual, and
  5. He is also reputed to be the son of a god.
  6. At birth an attempt is made, usually by his father or his maternal grandfather, to kill him, but
  7. He is spirited away, and
  8. Reared by foster-parents in a far country.
  9. We are told nothing of his childhood, but
  10. On reaching manhood he returns or goes to his future Kingdom.
  11. After a victory over the king and/or a giant, dragon, or wild beast,
  12. He marries a princess, often the daughter of his predecessor and
  13. Becomes king.
  14. For a time he reigns uneventfully and
  15. Prescribes laws, but
  16. Later he loses favor with the gods and/or his subjects, and
  17. Is driven from the throne and city, after which
  18. He meets with a mysterious death,
  19. Often at the top of a hill,
  20. His children, if any, do not succeed him.
  21. His body is not buried, but nevertheless
  22. He has one or more holy sepulchres.

By far the most heroic fit to date is Mithradates the Great with 22 out of a possible 22 cross-cultural traits. Jesus comes in with a score of 18-20 (based on interpretation) out of 22, beaten by Krishna with 21, while Robin Hood only manages a paltry 13. Interestingly, Buddha collects 15 points, followed closely by Czar Nicholas II with 14.

The mythotype comes from the book The Hero: A Study in Tradition, Myth and Drama by Lord Raglan.

List courtesy of Professor Thomas J. Sienkewicz, Monmouth College, Monmouth, Illinois. It is based upon material used in his mythology classes for many years, first at Howard University in Washington, D.C., and then at Monmouth College in Monmouth, Illinois.

Image courtesy of Google Search.

What’s Up With Middle-Aged White Males?

Not too long ago I came across a number of articles describing the high and growing incidence of suicide among middle-aged white males. Indeed, the suicide rate has skyrocketed 40 percent since the early 2000s.

Understandably, and no less sadly, the increase in suicides seems to be driven by acute financial distress, chronic pain and/or illness, alcoholism and drug addiction.

Now, it seems that there is a corresponding increase in the number of white males faking their disappearance or fantasizing about it. A classic example is John Darwin from the UK, also known as “canoe man”, who faked his own death in 2002. But a key difference between this group and those who take their own lives is that the men looking to disappear tend to be financially and (reasonably) emotionally stable.

So what on earth is going on?

A soon-to-be-published book — Playing Dead: A Journey Through the World of Death Fraud, by Elizabeth Greenwood — examines what it’s like to fake your own death and the burgeoning “disappearance” industry.

Here’s an excerpt:

Perhaps Todd’s plan for faking his death will remain in the realm of pure fantasy. But were he to put his plan into motion, Todd fits the prime demographic for a death fraudster. As a middle-aged, middle-class, heterosexual white man with a family, Todd represents the person most likely to fake his death. I’d noticed this disproportion in the demographics, and I wondered if there was anything to it. Privacy consultant Frank Ahearn, author of How to Disappear, told me that the majority of his clients who sought to leave their lives behind were men, and J. J. Luna, author of How to Be Invisible: Protect Your Home, Your Children, Your Assets, and Your Life, told me that “far more men than women!” seek his “invisibility” services. In the 1996 guidebook How to Disappear Completely and Never Be Found, disappearance enthusiast Doug Richmond writes, “To a man of a certain age, there’s a bit of magic in the very thought of cutting all ties, of getting away from it all, of changing names and jobs and women and living happily ever after in a more salubrious clime!”

But why do these seemingly privileged men, who enjoy every perk that DNA has to offer, feel so hemmed in that they must go off the radar entirely? Perhaps it’s because although men still out-earn women, they then entangle themselves in financial trouble trying to enhance their fortunes. Maybe they shrug off because they feel less responsibility to see their children grow and flourish. Women shoulder the burdens of family and community—they take care of dying parents, snotty kids, shut-in neighbors—anyone before themselves. Though that might be relying too heavily on conventional wisdom about gender roles, the numbers speak for themselves: faking death seems to be a heavily male phenomenon. After combing through the stories and examining the traits that men like Todd share, I noticed that they all seemed to feel emasculated, made impotent, by their mundane lives. So, not earning enough money, they invest in a harebrained scheme. Underwhelmed with their monogamous sex lives, they take up with other women. Faking death seems to be not only a way out but also, counterintuitively, a way to be brave.

Read more here.

Image: Actor Leonard Rossiter plays Reginald Iolanthe Perrin, from The Fall and Rise of Reginald Perrin, a mid-1970s BBC sitcom. Courtesy: BBC.

MondayMap: Connectography

I have had a peculiar affinity for luscious atlases and maps since childhood. They held promises of future explorations and adventures over ancient peaks, within new cultures, beyond borders. I also have a strange fascination with data, patterns in data, trends, probabilities, statistics (though I’m no mathematician).

So when I see someone combining maps and data, especially in fundamentally new ways, I have to take notice. Enter stage left: Parag Khanna. He’s a global strategist, author and a true cartophile. His new book “Connectography: Mapping the Future of Global Civilization,” uses grand cartographic visualizations to show how the world is steadily integrating.

Even for a reasonably geo-savvy person like me it’s eye-opening to see maps being used in insightful new ways — especially to draw attention to our global neighborhood and its common challenges.

One striking example shows the ties of railways, cables, pipelines and trade that further bind nations, rather than the borders, often arbitrarily drawn, that once divided them.

Dive into a recent interview with Parag Khanna here.

Map: The emerging silk roads of commerce interlinking 60 Asian nations. Courtesy: Parag Khanna, “Connectography: Mapping the Future of Global Civilization”.

Dune At Fifty

Quite coincidentally, and with no prescience at work, I had a half-read Dune Messiah (the second installment of the Dune chronicles) at my side when this article spun its way across the ether. So, it made me put digital pen to digital paper. It’s hard to believe that this master work is now well into middle age. And like a fine wine maturing over time, rather than bursting into our collective consciousness when first published, Dune and its successors took decades to reach a critical mass of appeal.

In crafting this epic work of imagination, Frank Herbert takes us on a voyage that goes beyond the narrow genres so prized by our literary establishment. Is Dune science fiction? Is Dune space opera? Is Dune fantasy or literary fiction? Is Dune a thriller or a romance? Or is Dune a treatise on politics and religion? The answer is yes.

But rather than seek to pigeonhole the work and thus limit its initial appeal to a new audience, I think it would be wise to look at Dune in an entirely different way. Dune is an evolutionary tale, and at many levels — it tells us of the evolution of ecological philosophy; the evolution of the self and of the state; the evolution of ideas and religion; the evolution of consciousness and culture.

I have to hope that younger generations, evolving fifty years from now and beyond, will be reading and contemplating Herbert’s work with as much awe.

From the Guardian:

In 1959, if you were walking the sand dunes near Florence, Oregon, you might have encountered a burly, bearded extrovert, striding about in Ray-Ban Aviators and practical army surplus clothing. Frank Herbert, a freelance writer with a feeling for ecology, was researching a magazine story about a US Department of Agriculture programme to stabilise the shifting sands by introducing European beach grass. Pushed by strong winds off the Pacific, the dunes moved eastwards, burying everything in their path. Herbert hired a Cessna light aircraft to survey the scene from the air. “These waves [of sand] can be every bit as devastating as a tidal wave … they’ve even caused deaths,” he wrote in a pitch to his agent. Above all he was intrigued by the idea that it might be possible to engineer an ecosystem, to green a hostile desert landscape.

About to turn 40, Herbert had been a working writer since the age of 19, and his fortunes had always been patchy. After a hard childhood in a small coastal community near Tacoma, Washington, where his pleasures had been fishing and messing about in boats, he’d worked for various regional newspapers in the Pacific northwest and sold short stories to magazines. He’d had a relatively easy war, serving eight months as a naval photographer before receiving a medical discharge. More recently he’d spent a weird interlude in Washington as a speechwriter for a Republican senator. There (his only significant time living on the east coast) he attended the daily Army-McCarthy hearings, watching his distant relative senator Joseph McCarthy root out communism. Herbert was a quintessential product of the libertarian culture of the Pacific coast, self-reliant and distrustful of centralised authority, yet with a mile-wide streak of utopian futurism and a concomitant willingness to experiment. He was also chronically broke. During the period he wrote Dune, his wife Beverly Ann was the main bread-winner, her own writing career sidelined by a job producing advertising copy for department stores.

Soon, Herbert’s research into dunes became research into deserts and desert cultures. It overpowered his article about the heroism of the men of the USDA (proposed title “They Stopped the Moving Sands”) and became two short SF novels, serialised in Analog Science Fact & Fiction, one of the more prestigious genre magazines. Unsatisfied, Herbert industriously reworked his two stories into a single, giant epic. The prevailing publishing wisdom of the time had it that SF readers liked their stories short. Dune (400 pages in its first hardcover edition, almost 900 in the paperback on my desk) was rejected by more than 20 houses before being accepted by Chilton, a Philadelphia operation known for trade and hobby magazines such as Motor Age, Jewelers’ Circular and the no-doubt-diverting Dry Goods Economist.

Though Dune won the Nebula and Hugo awards, the two most prestigious science fiction prizes, it was not an overnight commercial success. Its fanbase built through the 60s and 70s, circulating in squats, communes, labs and studios, anywhere where the idea of global transformation seemed attractive. Fifty years later it is considered by many to be the greatest novel in the SF canon, and has sold in millions around the world.

***

Dune is set in a far future, where warring noble houses are kept in line by a ruthless galactic emperor. As part of a Byzantine political intrigue, the noble duke Leto, head of the Homerically named House Atreides, is forced to move his household from their paradisiacal home planet of Caladan to the desert planet Arrakis, colloquially known as Dune. The climate on Dune is frighteningly hostile. Water is so scarce that whenever its inhabitants go outside, they must wear stillsuits, close-fitting garments that capture body moisture and recycle it for drinking.

The great enemy of House Atreides is House Harkonnen, a bunch of sybaritic no-goods who torture people for fun, and whose head, Baron Vladimir, is so obese that he has to use little anti-gravity “suspensors” as he moves around. The Harkonnens used to control Dune, which despite its awful climate and grubby desert nomad people, has incalculable strategic significance: its great southern desert is the only place in the galaxy where a fantastically valuable commodity called “melange” or “spice” is mined. Spice is a drug whose many useful properties include the induction of a kind of enhanced space-time perception in pilots of interstellar spacecraft. Without it, the entire communication and transport system of the Imperium will collapse. It is highly addictive, and has the side effect of turning the eye of the user a deep blue. Spice mining is dangerous, not just because of sandstorms and nomad attacks, but because the noise attracts giant sandworms, behemoths many hundreds of metres in length that travel through the dunes like whales through the ocean.

Have the Harkonnens really given up Dune, this source of fabulous riches? Of course not. Treachery and tragedy duly ensue, and young Paul survives a general bloodbath to go on the run in the hostile open desert, accompanied, unusually for an adventure story, by his mum. Paul is already showing signs of a kind of cosmic precociousness, and people suspect that he may even be the messiah figure foretold in ancient prophecies. His mother, Jessica, is an initiate of the great female powerbase in an otherwise patriarchal galactic order, a religious sisterhood called the Bene Gesserit. Witchy and psychically powerful, the sisters have engaged in millennia of eugenic programming, of which Paul may be the culmination.

This setup owes something to the Mars stories of Edgar Rice Burroughs and Isaac Asimov’s Foundation books, as well as the tales written by Idaho-born food chemist Elmer Edward “Doc” Smith, creator of the popular Lensman space operas of the 1940s and 50s, in which eugenically bred heroes are initiated into a “galactic patrol” of psychically enhanced supercops. For Smith, altered states of consciousness were mainly tools for the whiteous and righteous to vaporise whole solar systems of subversives, aliens and others with undesirable traits. Herbert, by contrast, was no friend of big government. He had also taken peyote and read Jung. In 1960, a sailing buddy introduced him to the Zen thinker Alan Watts, who was living on a houseboat in Sausalito. Long conversations with Watts, the main conduit by which Zen was permeating the west-coast counterculture, helped turn Herbert’s pacy adventure story into an exploration of temporality, the limits of personal identity and the mind’s relationship to the body.

Every fantasy reflects the place and time that produced it. If The Lord of the Rings is about the rise of fascism and the trauma of the second world war, and Game of Thrones, with its cynical realpolitik and cast of precarious, entrepreneurial characters is a fairytale of neoliberalism, then Dune is the paradigmatic fantasy of the Age of Aquarius. Its concerns – environmental stress, human potential, altered states of consciousness and the developing countries’ revolution against imperialism – are blended together into an era-defining vision of personal and cosmic transformation.

Read the entire article here.

Image: The Oregon Dunes, near Florence, Oregon, served as an inspiration for the Dune saga. Courtesy of Rebecca Kennison. Creative Commons.

The Bibliotherapist

No, the bibliotherapist is not a character from Jasper Fforde’s literary detective novels. And yes, there is such a profession. So, perhaps if you’re a committed bibliophile this may be the career for you. The catch: well, you need to get along well with people and books. Count me out.

From the New Yorker:

Several years ago, I was given as a gift a remote session with a bibliotherapist at the London headquarters of the School of Life, which offers innovative courses to help people deal with the daily emotional challenges of existence. I have to admit that at first I didn’t really like the idea of being given a reading “prescription.” I’ve generally preferred to mimic Virginia Woolf’s passionate commitment to serendipity in my personal reading discoveries, delighting not only in the books themselves but in the randomly meaningful nature of how I came upon them (on the bus after a breakup, in a backpackers’ hostel in Damascus, or in the dark library stacks at graduate school, while browsing instead of studying). I’ve long been wary of the peculiar evangelism of certain readers: You must read this, they say, thrusting a book into your hands with a beatific gleam in their eyes, with no allowance for the fact that books mean different things to people—or different things to the same person—at various points in our lives. I loved John Updike’s stories about the Maples in my twenties, for example, and hate them in my thirties, and I’m not even exactly sure why.

But the session was a gift, and I found myself unexpectedly enjoying the initial questionnaire about my reading habits that the bibliotherapist, Ella Berthoud, sent me. Nobody had ever asked me these questions before, even though reading fiction is and always has been essential to my life. I love to gorge on books over long breaks—I’ll pack more books than clothes, I told Berthoud. I confided my dirty little secret, which is that I don’t like buying or owning books, and always prefer to get them from the library (which, as I am a writer, does not bring me very good book-sales karma). In response to the question “What is preoccupying you at the moment?,” I was surprised by what I wanted to confess: I am worried about having no spiritual resources to shore myself up against the inevitable future grief of losing somebody I love, I wrote. I’m not religious, and I don’t particularly want to be, but I’d like to read more about other people’s reflections on coming to some sort of early, weird form of faith in a “higher being” as an emotional survival tactic. Simply answering the questions made me feel better, lighter.

We had some satisfying back-and-forths over e-mail, with Berthoud digging deeper, asking about my family’s history and my fear of grief, and when she sent the final reading prescription it was filled with gems, none of which I’d previously read. Among the recommendations was “The Guide,” by R. K. Narayan. Berthoud wrote that it was “a lovely story about a man who starts his working life as a tourist guide at a train station in Malgudi, India, but then goes through many other occupations before finding his unexpected destiny as a spiritual guide.” She had picked it because she hoped it might leave me feeling “strangely enlightened.” Another was “The Gospel According to Jesus Christ,” by José Saramago: “Saramago doesn’t reveal his own spiritual stance here but portrays a vivid and compelling version of the story we know so well.” “Henderson the Rain King,” by Saul Bellow, and “Siddhartha,” by Hermann Hesse, were among other prescribed works of fiction, and she included some nonfiction, too, such as “The Case for God,” by Karen Armstrong, and “Sum,” by the neuroscientist David Eagleman, a “short and wonderful book about possible afterlives.”

I worked my way through the books on the list over the next couple of years, at my own pace—interspersed with my own “discoveries”—and while I am fortunate enough to have my ability to withstand terrible grief untested, thus far, some of the insights I gleaned from these books helped me through something entirely different, when, over several months, I endured acute physical pain. The insights themselves are still nebulous, as learning gained through reading fiction often is—but therein lies its power. In a secular age, I suspect that reading fiction is one of the few remaining paths to transcendence, that elusive state in which the distance between the self and the universe shrinks. Reading fiction makes me lose all sense of self, but at the same time makes me feel most uniquely myself. As Woolf, the most fervent of readers, wrote, a book “splits us into two parts as we read,” for “the state of reading consists in the complete elimination of the ego,” while promising “perpetual union” with another mind.

Bibliotherapy is a very broad term for the ancient practice of encouraging reading for therapeutic effect. The first use of the term is usually dated to a jaunty 1916 article in The Atlantic Monthly, “A Literary Clinic.” In it, the author describes stumbling upon a “bibliopathic institute” run by an acquaintance, Bagster, in the basement of his church, from where he dispenses reading recommendations with healing value. “Bibliotherapy is…a new science,” Bagster explains. “A book may be a stimulant or a sedative or an irritant or a soporific. The point is that it must do something to you, and you ought to know what it is. A book may be of the nature of a soothing syrup or it may be of the nature of a mustard plaster.” To a middle-aged client with “opinions partially ossified,” Bagster gives the following prescription: “You must read more novels. Not pleasant stories that make you forget yourself. They must be searching, drastic, stinging, relentless novels.” (George Bernard Shaw is at the top of the list.) Bagster is finally called away to deal with a patient who has “taken an overdose of war literature,” leaving the author to think about the books that “put new life into us and then set the life pulse strong but slow.”

Read the entire story here.

Image courtesy of Google Search.

Real Magic

[tube]UibfDUPJAEU[/tube]

Literary, social, moral and philanthropic leadership. These are all very admirable qualities. We might strive to embody just one of these in our daily lives. Author J.K. Rowling seems to demonstrate all four. In her new book, Very Good Lives: The Fringe Benefits of Failure and the Importance of Imagination, published in April 2015, she distills advice from her self-effacing but powerful Harvard University commencement speech, delivered in 2008.

A couple of my favorite quotes:

Many prefer not to exercise their imaginations at all. They choose to remain comfortably within the bounds of their own experience, never troubling to wonder how it would feel to have been born other than they are.

Some failure in life is inevitable. It is impossible to live without failing at something, unless you live so cautiously that you might as well not have lived at all.

Video: J.K. Rowling Harvard Commencement Speech, 2008. Courtesy of Harvard University.

The Demise of the Language of Landscape

In his new book, Landmarks, author Robert Macfarlane ponders the relationship of words to our natural landscape. Reviewers describe the book as a “field guide to the literature of nature”. Sadly, Macfarlane’s detailed research for the book chronicles a disturbing trend: the culling of many words from our everyday lexicon that describe our natural world to make way for the buzzwords of progress. This substitution comes in the form of newer memes that describe our narrow, urbanized and increasingly virtual world circumscribed by technology. Macfarlane cites the Oxford Junior Dictionary (OJD) as a vivid example of the evisceration of our language of landscape. The OJD has removed words such as acorn, beech, conker, dandelion, heather, heron, kingfisher, pasture and willow. In their place we now find words like attachment, blog, broadband, bullet-point, celebrity, chatroom, cut-and-paste, MP3 player and voice-mail. Get the idea?

I’m no fundamentalist Luddite — I’m writing a blog after all — but surely some aspects of our heritage warrant protection. We are an intrinsic part of the natural environment despite our increasing urbanization. Don’t we all crave the escape to a place where we can lounge under a drooping willow surrounded by nothing more than the buzzing of insects and the babbling of a stream? I’d rather that than deal with the next attachment or voice-mail.

What a loss it would be for our children, and a double-edged loss at that. We, the preceding generation, continue to preside over the systematic destruction of our natural landscape. And, in doing so, we remove the words as well — the words that once described what we still crave.

From the Guardian:

Eight years ago, in the coastal township of Shawbost on the Outer Hebridean island of Lewis, I was given an extraordinary document. It was entitled “Some Lewis Moorland Terms: A Peat Glossary”, and it listed Gaelic words and phrases for aspects of the tawny moorland that fills Lewis’s interior. Reading the glossary, I was amazed by the compressive elegance of its lexis, and its capacity for fine discrimination: a caochan, for instance, is “a slender moor-stream obscured by vegetation such that it is virtually hidden from sight”, while a feadan is “a small stream running from a moorland loch”, and a fèith is “a fine vein-like watercourse running through peat, often dry in the summer”. Other terms were striking for their visual poetry: rionnach maoim means “the shadows cast on the moorland by clouds moving across the sky on a bright and windy day”; èit refers to “the practice of placing quartz stones in streams so that they sparkle in moonlight and thereby attract salmon to them in the late summer and autumn”, and teine biorach is “the flame or will-o’-the-wisp that runs on top of heather when the moor burns during the summer”.

The “Peat Glossary” set my head a-whirr with wonder-words. It ran to several pages and more than 120 terms – and as that modest “Some” in its title acknowledged, it was incomplete. “There’s so much language to be added to it,” one of its compilers, Anne Campbell, told me. “It represents only three villages’ worth of words. I have a friend from South Uist who said her grandmother would add dozens to it. Every village in the upper islands would have its different phrases to contribute.” I thought of Norman MacCaig’s great Hebridean poem “By the Graveyard, Luskentyre”, where he imagines creating a dictionary out of the language of Donnie, a lobster fisherman from the Isle of Harris. It would be an impossible book, MacCaig concluded:

A volume thick as the height of the Clisham,

A volume big as the whole of Harris,

A volume beyond the wit of scholars.

The same summer I was on Lewis, a new edition of the Oxford Junior Dictionary was published. A sharp-eyed reader noticed that there had been a culling of words concerning nature. Under pressure, Oxford University Press revealed a list of the entries it no longer felt to be relevant to a modern-day childhood. The deletions included acorn, adder, ash, beech, bluebell, buttercup, catkin, conker, cowslip, cygnet, dandelion, fern, hazel, heather, heron, ivy, kingfisher, lark, mistletoe, nectar, newt, otter, pasture and willow. The words taking their places in the new edition included attachment, block-graph, blog, broadband, bullet-point, celebrity, chatroom, committee, cut-and-paste, MP3 player and voice-mail. As I had been entranced by the language preserved in the prose-poem of the “Peat Glossary”, so I was dismayed by the language that had fallen (been pushed) from the dictionary. For blackberry, read Blackberry.

I have long been fascinated by the relations of language and landscape – by the power of strong style and single words to shape our senses of place. And it has become a habit, while travelling in Britain and Ireland, to note down place words as I encounter them: terms for particular aspects of terrain, elements, light and creaturely life, or resonant place names. I’ve scribbled these words in the backs of notebooks, or jotted them down on scraps of paper. Usually, I’ve gleaned them singly from conversations, maps or books. Now and then I’ve hit buried treasure in the form of vernacular word-lists or remarkable people – troves that have held gleaming handfuls of coinages, like the Lewisian “Peat Glossary”.

Not long after returning from Lewis, and spurred on by the Oxford deletions, I resolved to put my word-collecting on a more active footing, and to build up my own glossaries of place words. It seemed to me then that although we have fabulous compendia of flora, fauna and insects (Richard Mabey’s Flora Britannica and Mark Cocker’s Birds Britannica chief among them), we lack a Terra Britannica, as it were: a gathering of terms for the land and its weathers – terms used by crofters, fishermen, farmers, sailors, scientists, miners, climbers, soldiers, shepherds, poets, walkers and unrecorded others for whom particularised ways of describing place have been vital to everyday practice and perception. It seemed, too, that it might be worth assembling some of this terrifically fine-grained vocabulary – and releasing it back into imaginative circulation, as a way to rewild our language. I wanted to answer Norman MacCaig’s entreaty in his Luskentyre poem: “Scholars, I plead with you, / Where are your dictionaries of the wind … ?”

Read the entire article here and then buy the book, which is published in March 2015.

Image: Sunset over the Front Range. Courtesy of the author.

Creative Destruction

Author Andrew Keen ponders the true value of the internet in his new book The Internet is Not the Answer. Quite rightly, he asserts that billions of consumers have benefited from the improved convenience and usually lower prices of every product imaginable, delivered through a couple of clicks online. But there is a higher price to pay — one that touches on the values we want for our society and the deeper costs to our culture.

From the Guardian:

During every minute of every day of 2014, according to Andrew Keen’s new book, the world’s internet users – all three billion of them – sent 204m emails, uploaded 72 hours of YouTube video, undertook 4m Google searches, shared 2.46m pieces of Facebook content, published 277,000 tweets, posted 216,000 new photos on Instagram and spent $83,000 on Amazon.

By any measure, for a network that has existed recognisably for barely 20 years (the first graphical web browser, Mosaic, was released in 1993), those are astonishing numbers: the internet, plainly, has transformed all our lives, making so much of what we do every day – communicating, shopping, finding, watching, booking – unimaginably easier than it was. A Pew survey in the United States found last year that 90% of Americans believed the internet had been good for them.

So it takes a brave man to argue that there is another side to the internet; that stratospheric numbers and undreamed-of personal convenience are not the whole story. Keen (who was once so sure the internet was the answer that he sank all he had into a startup) is now a thoughtful and erudite contrarian who believes the internet is actually doing untold damage. The net, he argues, was meant to be “power to the people, a platform for equality”: an open, decentralised, democratising technology that liberates as it empowers as it informs.

Instead, it has handed extraordinary power and wealth to a tiny handful of people, while simultaneously, for the rest of us, compounding and often aggravating existing inequalities – cultural, social and economic – whenever and wherever it has found them. Individually, it may work wonders for us. Collectively, it’s doing us no good at all. “It was supposed to be win-win,” Keen declares. “The network’s users were supposed to be its beneficiaries. But in a lot of ways, we are its victims.”

This is not, Keen acknowledges, a very popular view, especially in Silicon Valley, where he has spent the best part of the past 30-odd years after an uneventful north London childhood (the family was in the rag trade). But The Internet is Not the Answer – Keen’s third book (the first questioned the value of user-generated content, the second the point of social media; you get where he’s coming from) – has been “remarkably well received”, he says. “I’m not alone in making these points. Moderate opinion is starting to see that this is a problem.”

What seems most unarguable is that, whatever else it has done, the internet – after its early years as a network for academics and researchers from which vulgar commercial activity was, in effect, outlawed – has been largely about the money. The US government’s decision, in 1991, to throw the nascent network open to private enterprise amounted, as one leading (and now eye-wateringly wealthy) Californian venture capitalist has put it, to “the largest creation of legal wealth in the history of the planet”.

The numbers Keen reels off are eye-popping: Google, which now handles 3.5bn searches daily and controls more than 90% of the market in some countries, including Britain, was valued at $400bn last year – more than seven times General Motors, which employs nearly four times more people. Its two founders, Larry Page and Sergey Brin, are worth $30bn apiece. Facebook’s Mark Zuckerberg, head of the world’s second biggest internet site – used by 19% of people in the world, half of whom access it six days a week or more – is sitting on a similar personal pile, while at $190bn in July last year, his company was worth more than Coca-Cola, Disney and AT&T.

Jeff Bezos of Amazon also has $30bn in his bank account. And even more recent online ventures look to be headed the same way: Uber, a five-year-old startup employing about 1,000 people and once succinctly described as “software that eats taxis”, was valued last year at more than $18bn – roughly the same as Hertz and Avis combined. The 700-staff lodging rental site Airbnb was valued at $10bn in February last year, not far off half as much as the Hilton group, which owns nearly 4,000 hotels and employs 150,000 people. The messaging app WhatsApp, bought by Facebook for $19bn, employs just 55, while the payroll of Snapchat – which turned down an offer of $3bn – numbers barely 20.

Part of the problem here, argues Keen, is that the digital economy is, by its nature, winner-takes-all. “There’s no inevitable or conspiratorial logic here; no one really knew it would happen,” he says. “There are just certain structural qualities that mean the internet lends itself to monopolies. The internet is a perfect global platform for free-market capitalism – a pure, frictionless, borderless economy … It’s a libertarian’s wet dream. Digital Milton Friedman.”

Nor are those monopolies confined to just one business. Keen cites San Francisco-based writer Rebecca Solnit’s incisive take on Google: imagine it is 100 years ago, and the post office, the phone company, the public libraries, the printing houses, Ordnance Survey maps and the cinemas were all controlled by the same secretive and unaccountable organisation. Plus, he adds, almost as an afterthought: “Google doesn’t just own the post office – it has the right to open everyone’s letters.”

This, Keen argues, is the net economy’s natural tendency: “Google is the search and information monopoly and the largest advertising company in history. It is incredibly strong, joining up the dots across more and more industries. Uber’s about being the transport monopoly; Airbnb the hospitality monopoly; TaskRabbit the labour monopoly. These are all, ultimately, monopoly plays – that’s the logic. And that should worry people.”

It is already having consequences, Keen says, in the real world. Take – surely the most glaring example – Amazon. Keen’s book cites a 2013 survey by the US Institute for Local Self-Reliance, which found that while it takes, on average, a regular bricks-and-mortar store 47 employees to generate $10m in turnover, Bezos’s many-tentacled, all-consuming and completely ruthless “Everything Store” achieves the same with 14. Amazon, that report concluded, probably destroyed 27,000 US jobs in 2012.

“And we love it,” Keen says. “We all use Amazon. We strike this Faustian deal. It’s ultra-convenient, fantastic service, great interface, absurdly cheap prices. But what’s the cost? Truly appalling working conditions; we know this. Deep hostility to unions. A massive impact on independent retail; in books, savage bullying of publishers. This is back to the early years of the 19th century. But we’re seduced into thinking it’s good; Amazon has told us what we want to hear. Bezos says, ‘This is about you, the consumer.’ The problem is, we’re not just consumers. We’re citizens, too.”

Read the entire article here.

Image: Visualization of routing paths through a portion of the Internet. Courtesy of the Opte Project.

Clothing Design by National Sub-Committee

It’s probably safe to assume that clothing designed by committee will be more utilitarian and drab than that from the colored pencils of, say, Yves Saint Laurent, Tom Ford, Giorgio Armani or Coco Chanel.

So, imagine what clothing would look like if it were designed by the Apparel Research Center, a sub-subcommittee of the Clothing Industry Department, itself a sub-committee of the National Light Industry Committee. Yes, welcome to the strange, centrally planned and tightly controlled world of our favorite rogue nation, North Korea. Imagine no more, as Paul French takes us on a journey through daily life in North Korea, excerpted from his new book, North Korea: State of Paranoia. It makes for sobering reading.

From the Guardian:

6am The day starts early in Pyongyang, the city described by the North Korean government as the “capital of revolution”. Breakfast is usually corn or maize porridge, possibly a boiled egg and sour yoghurt, with perhaps powdered milk for children.

Then it is time to get ready for work. North Korea has a large working population: approximately 59% of the total in 2010. A growing number of women work in white-collar office jobs; they make up around 90% of workers in light industry and 80% of the rural workforce. Many women are now the major wage-earner in the family – though still housewife, mother and cook as well as a worker, or perhaps a soldier.

Makeup is increasingly common in Pyongyang, though it is rarely worn until after college graduation. Chinese-made skin lotions, foundation, eyeliner and lipstick are available and permissible in the office. Many women suffer from blotchy skin caused by the deteriorating national diet, so are wearing more makeup. Long hair is common, but untied hair is frowned upon.

Men’s hairstyles could not be described as radical. In the 1980s, when Kim Jong-il first came to public prominence, his trademark crewcut, known as a “speed battle cut”, became popular, while the more bouffant style favoured by Kim Il-sung, and then Kim Jong-il, in their later years, is also popular. Kim Jong-un’s trademark short-back-and-sides does not appear to have inspired much imitation so far. Hairdressers and barbers are run by the local Convenience Services Management Committee; at many, customers can wash their hair themselves.

Fashion is not really an applicable term in North Korea, as the Apparel Research Centre under the Clothing Industry Department of the National Light Industry Committee designs most clothing. However, things have loosened up somewhat, with bright colours now permitted as being in accordance with a “socialist lifestyle”. Pyongyang offers some access to foreign styles. A Japanese watch denotes someone in an influential position; a foreign luxury watch indicates a very senior position. The increasing appearance of Adidas, Disney and other brands (usually fake) indicates that access to goods smuggled from China is growing. Jeans have at times been fashionable, though risky – occasionally they have been banned as “decadent”, along with long hair on men, which can lead to arrest and a forced haircut.

One daily ritual of all North Koreans is making sure they have their Kim Il-sung badge attached to their lapel. The badges have been in circulation since the late 1960s, when the Mansudae Art Studio started producing them for party cadres. Desirable ones can change hands on the black market for several hundred NKW. In a city where people rarely carry cash, jewellery or credit cards, Kim badges are one of the most prized targets of Pyongyang’s pickpockets.

Most streets are boulevards of utilitarian high-rise blocks. Those who live on higher floors may have to set out for work or school a little earlier than those lower down. Due to chronic power cuts, many elevators work only intermittently, if at all. Many buildings are between 20 and 40 storeys tall – there are stories of old people who have never been able to leave. Even in the better blocks elevators can be sporadic and so people just don’t take the chance. Families make great efforts to relocate older relatives on lower floors, but this is difficult and a bribe is sometimes required. With food shortages now constant, many older people share their meagre rations with their grandchildren, weakening themselves further and making the prospect of climbing stairs even more daunting.

Some people do drive to work, but congestion is not a major problem. Despite the relative lack of cars, police enforce traffic regulations strictly and issue tickets. Fines can be equivalent to two weeks’ salary. Most cars belong to state organisations, but are often used as if they were privately owned. All vehicles entering Pyongyang must be clean; owners of dirty cars may be fined. Those travelling out of Pyongyang require a travel certificate. There are few driving regulations; however, on hills ascending vehicles have the right of way, and trucks cannot pass passenger cars under any circumstances. Drunk-driving is punished with hard labour. Smoking while driving is banned on the grounds that a smoking driver cannot smell a problem with the car.

Those who have a bicycle usually own a Sea Gull, unless they are privileged and own an imported second-hand Japanese bicycle. But even a Sea Gull costs several months’ wages and requires saving.

7.30am For many North Koreans the day starts with a 30-minute reading session and exercises before work begins. The reading includes receiving instructions and studying the daily editorial in the party papers. This is followed by directives on daily tasks and official announcements.

For children, the school day starts with exercises to a medley of populist songs before a session of marching on the spot and saluting the image of the leader. The curriculum is based on Kim Il-sung’s 1977 Thesis on Socialist Education, emphasising the political role of education in developing revolutionary spirit. All children study Kim Il-sung’s life closely. Learning to read means learning to read about Kim Il-sung; music class involves singing patriotic songs. Rote learning and memorising political tracts is integral and can bring good marks, which help in getting into university – although social rank is a more reliable determinant of college admission. After graduation, the state decides where graduates will work.

8am Work begins. Pyongyang is the centre of the country’s white-collar workforce, though a Pyongyang office would appear remarkably sparse to most outsiders. Banks, industrial enterprises and businesses operate almost wholly without computers, photocopiers and modern office technology. Payrolls and accounting are done by hand.

12pm Factories, offices and workplaces break for lunch for an hour. Many workers bring a packed lunch, or, if they live close by, go home to eat. Larger workplaces have a canteen serving cheap lunches, such as corn soup, corn cake and porridge. The policy of eating in work canteens, combined with the lack of food shops and restaurants, means that Pyongyang remains strangely empty during the working day with no busy lunchtime period, as seen in other cities around the world.

Shopping is an as-and-when activity. If a shop has stock, then returning later is not an option as it will be sold out. According to defectors, North Koreans want “five chests and seven appliances”. The chests are a quilt chest, wardrobe, bookshelf, cupboard and shoe closet, while the appliances comprise a TV, refrigerator, washing machine, electric fan, sewing machine, tape recorder and camera. Most ordinary people only have a couple of appliances, usually a television and a sewing machine.

Food shopping is equally problematic. Staples such as soy sauce, soybean paste, salt and oil, as well as toothpaste, soap, underwear and shoes, sell out fast. The range of food items available is highly restricted. White cabbage, cucumber and tomato are the most common; meat is rare, and eggs increasingly so. Fruit is largely confined to apples and pears. The main staple of the North Korean diet is rice, though bread is sometimes available, accompanied by a form of butter that is often rancid. Corn, maize and mushrooms also appear sometimes.

Read the entire excerpt here.

Image: Soldiers from the Korean People’s Army look south while on duty in the Joint Security Area, 2008. Courtesy of U.S. government.

 

Wolfgang Pauli’s Champagne

Austrian theoretical physicist Wolfgang Pauli dreamed up neutrinos in 1930, and famously bet a case of fine champagne that these ghostly elementary particles would never be found. Pauli lost the bet in 1956. Since then researchers have made great progress, both theoretically and experimentally, in delving into the neutrino’s secrets. Two new books describe the ongoing quest.

From the Economist:

Neutrinos are weird. The wispy particles are far more abundant than the protons and electrons that make up atoms. Billions of them stream through every square centimetre of Earth’s surface each second, but they leave no trace and rarely interact with anything. Yet scientists increasingly agree that they could help unravel one of the biggest mysteries in physics: why the cosmos is made of matter.

Neutrinos’ scientific history is also odd, as two new books explain. The first is “Neutrino Hunters” by Ray Jayawardhana, a professor of astrophysics at the University of Toronto (and a former contributor to The Economist). The second, “The Perfect Wave”, is by Heinrich Päs, a neutrino theorist from the Technical University in the German city of Dortmund.

The particles were dreamed up in 1930 by Wolfgang Pauli, an Austrian, to account for energy that appeared to go missing in a type of radioactivity known as beta decay. Pauli apologised for what was a bold idea at a time when physicists knew of just two subatomic particles (protons and electrons), explaining that the missing energy was carried away by a new, electrically neutral and, he believed, undetectable subatomic species. He bet a case of champagne that it would never be found.

Pauli lost the wager in 1956 to two Americans, Frederick Reines and Clyde Cowan. The original experiment they came up with to test the hypothesis was unorthodox. It involved dropping a detector down a shaft within 40 metres of an exploding nuclear bomb, which would act as a source of neutrinos. Though Los Alamos National Laboratory approved the experiment, the pair eventually chose a more practical approach and buried a detector near a powerful nuclear reactor at Savannah River, South Carolina, instead. (Most neutrino detectors are deep underground to shield them from cosmic rays, which can cause similar signals.)

However, as other experiments, in particular those looking for neutrinos in the physical reactions which power the sun, strove to replicate Reines’s and Cowan’s result, they hit a snag. The number of solar neutrinos they recorded was persistently just one third of what theory said the sun ought to produce. Either the theorists had made a mistake, the thinking went, or the experiments had gone awry.

In fact, both were right all along. It was the neutrinos that, true to form, behaved oddly. As early as 1957 Bruno Pontecorvo, an Italian physicist who had defected to the Soviet Union seven years earlier, suggested that neutrinos could come in different types, known to physicists as “flavours”, and that they morph from one type to another on their way from the sun to Earth. Other scientists were sceptical. Their blueprint for how nature works at the subatomic level, called the Standard Model, assumed that neutrinos have no mass. This, as Albert Einstein showed, is the same as saying they travel at the speed of light. On reaching that speed time stops. If neutrinos switch flavours they would have to experience change, and thus time. That means they would have to be slower than light. In other words, they would have mass. (A claim in 2011 by Italian physicists working with CERN, Europe’s main physics laboratory, that neutrinos broke Einstein’s speed limit turned out to be the result of a loose cable.)

Pontecorvo’s hypothesis was proved only in 1998, in Japan. Others have since confirmed the phenomenon known as “oscillation”. The Standard Model had to be tweaked to make room for neutrino mass. But scientists still have little idea about how much any of the neutrinos actually weigh, besides being at least 1m times lighter than an electron.
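
The reasoning behind “oscillation implies mass” has a compact textbook form (a standard two-flavour sketch, not a formula quoted from either book under review): the probability that a neutrino born with flavour α is detected as flavour β after travelling a distance L with energy E is

\[
P(\nu_\alpha \rightarrow \nu_\beta) = \sin^2(2\theta)\,\sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right),
\]

where θ is a mixing angle and Δm² is the difference of the squared masses. If Δm² were zero, the probability would vanish at every distance, which is why the flavour changes confirmed in 1998 require neutrinos to have mass; and because only the mass difference appears, oscillation experiments alone cannot pin down the absolute masses.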

The answer to the weight question, as well as a better understanding of neutrino oscillations, may help solve the puzzle of why the universe is full of matter. One explanation boffins like a lot because of its elegant maths invokes a whole new category of “heavy” neutrino decaying more readily into matter than antimatter. If that happened a lot when the universe began, then there would have been more matter around than antimatter, and when the matter and antimatter annihilated each other, as they are wont to do, some matter (ie, everything now visible) would be left over. The lighter the known neutrinos, according to this “seesaw” theory, the heftier the heavy sort would have to be. A heavy neutrino has yet to be observed, and may well, as Pauli described it, be unobservable. But a better handle on the light variety, Messrs Jayawardhana and Päs both agree, may offer important clues.
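
For the curious, the “seesaw” nickname comes from the shape of the relation itself. In its simplest (type I) textbook form, again not a formula drawn from either book, the light neutrino mass and its hypothetical heavy partner are tied together roughly as

\[
m_\nu \approx \frac{m_D^2}{M},
\]

where m_D is an ordinary (Dirac) mass scale comparable to that of other particles and M is the mass of the heavy neutrino. With m_D fixed, pushing the observed neutrino masses down pushes the heavy partner’s mass up, which is exactly the inverse relationship described above.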

These two books complement each other. Mr Jayawardhana’s is stronger on the history (though his accounts of the neutrino hunters’ personal lives can read a little too much like a professional CV). It is also more comprehensive on the potential use of neutrinos in examining the innards of the sun, of distant exploding stars or of Earth, as well as more practical uses such as fingering illicit nuclear-enrichment programmes (since they spew out a telltale pattern of the particles).

Read the entire article here.

Image: Wolfgang Pauli, c1945. Courtesy of Wikipedia.

Biological Transporter

Molecular-biology entrepreneur and genome-engineering pioneer Craig Venter is at it again. In his new book, Life at the Speed of Light: From the Double Helix to the Dawn of Digital Life, Venter explains his grand ideas and the coming era of discovery.

From ars technica:

J Craig Venter has been a molecular-biology pioneer for two decades. After developing expressed sequence tags in the 90s, he led the private effort to map the human genome, publishing the results in 2001. In 2010, the J Craig Venter Institute manufactured the entire genome of a bacterium, creating the first synthetic organism.

Now Venter, author of Life at the Speed of Light: From the Double Helix to the Dawn of Digital Life, explains the coming era of discovery.

Wired: In Life at the Speed of Light, you argue that humankind is entering a new phase of evolution. How so?

J Craig Venter: As the industrial age is drawing to a close, I think that we’re witnessing the dawn of the era of biological design. DNA, as digitized information, is accumulating in computer databases. Thanks to genetic engineering, and now the field of synthetic biology, we can manipulate DNA to an unprecedented extent, just as we can edit software in a computer. We can also transmit it as an electromagnetic wave at or near the speed of light and, via a “biological teleporter,” use it to recreate proteins, viruses, and living cells at another location, changing forever how we view life.

So you view DNA as the software of life?

All the information needed to make a living, self-replicating cell is locked up within the spirals of DNA’s double helix. As we read and interpret that software of life, we should be able to completely understand how cells work, then change and improve them by writing new cellular software.

The software defines the manufacture of proteins that can be viewed as its hardware, the robots and chemical machines that run a cell. The software is vital because the cell’s hardware wears out. Cells will die in minutes to days if they lack their genetic-information system. They will not evolve, they will not replicate, and they will not live.

Of all the experiments you have done over the past two decades involving the reading and manipulation of the software of life, which are the most important?

I do think the synthetic cell is my most important contribution. But if I were to select a single study, paper, or experimental result that has really influenced my understanding of life more than any other, I would choose one that my team published in 2007, in a paper with the title Genome Transplantation in Bacteria: Changing One Species to Another. The research that led to this paper in the journal Science not only shaped my view of the fundamentals of life but also laid the groundwork to create the first synthetic cell. Genome transplantation not only provided a way to carry out a striking transformation, converting one species into another, but would also help prove that DNA is the software of life.

What has happened since your announcement in 2010 that you created a synthetic cell, JCVI-syn1.0?

At the time, I said that the synthetic cell would give us a better understanding of the fundamentals of biology and how life works, help develop techniques and tools for vaccine and pharmaceutical development, enable development of biofuels and biochemicals, and help to create clean water, sources of food, textiles, bioremediation. Three years on that vision is being borne out.

Your book contains a dramatic account of the slog and setbacks that led to the creation of this first synthetic organism. What was your lowest point?

When we started out creating JCVI-syn1.0 in the lab, we had selected M. genitalium because of its extremely small genome. That decision we would come to really regret: in the laboratory, M. genitalium grows slowly. So whereas E. coli divides into daughter cells every 20 minutes, M. genitalium requires 12 hours to make a copy of itself. With logarithmic growth, it’s the difference between having an experimental result in 24 hours versus several weeks. It felt like we were working really hard to get nowhere at all. I changed the target to the M. mycoides genome. It’s twice as large as that of genitalium, but it grows much faster. In the end, that move made all the difference.

Some of your peers were blown away by the synthetic cell; others called it a technical tour de force. But there were also those who were underwhelmed because it was not “life from scratch.”

They haven’t thought much about what they are actually trying to say when they talk about “life from scratch.” How about baking a cake “from scratch”? You could buy one and then ice it at home. Or buy a cake mix, to which you add only eggs, water and oil. Or combine the individual ingredients, such as baking powder, sugar, salt, eggs, milk, shortening and so on. But I doubt that anyone would mean formulating his own baking powder by combining sodium, hydrogen, carbon, and oxygen to produce sodium bicarbonate, or producing homemade corn starch. If we apply the same strictures to creating life “from scratch,” it could mean producing all the necessary molecules, proteins, lipids, organelles, DNA, and so forth from basic chemicals or perhaps even from the fundamental elements carbon, hydrogen, oxygen, nitrogen, phosphate, iron, and so on.

There’s a parallel effort to create virtual life, which you go into in the book. How sophisticated are these models of cells in silico?

In the past year we have really seen how virtual cells can help us understand the real things. This work dates back to 1996 when Masaru Tomita and his students at the Laboratory for Bioinformatics at Keio started investigating the molecular biology of Mycoplasma genitalium—which we had sequenced in 1995—and by the end of that year had established the E-Cell Project. The most recent work on Mycoplasma genitalium has been done in America, by the systems biologist Markus W Covert, at Stanford University. His team used our genome data to create a virtual version of the bacterium that came remarkably close to its real-life counterpart.

You’ve discussed the ethics of synthetic organisms for a long time—where is the ethical argument today?

The Janus-like nature of innovation—its responsible use and so on—was evident at the very birth of human ingenuity, when humankind first discovered how to make fire on demand. (Do I use it to burn down a rival’s settlement, or to keep warm?) Every few months, another meeting is held to discuss how powerful technology cuts both ways. It is crucial that we invest in underpinning technologies, science, education, and policy in order to ensure the safe and efficient development of synthetic biology. Opportunities for public debate and discussion on this topic must be sponsored, and the lay public must engage. But it is important not to lose sight of the amazing opportunities that this research presents. Synthetic biology can help address key challenges facing the planet and its population. Research in synthetic biology may lead to new things such as programmed cells that self-assemble at the sites of disease to repair damage.

What worries you more: bioterror or bioerror?

I am probably more concerned about an accidental slip. Synthetic biology increasingly relies on the skills of scientists who have little experience in biology, such as mathematicians and electrical engineers. The democratization of knowledge and the rise of “open-source biology,” together with the availability of kitchen-sink versions of key laboratory tools, such as the DNA-copying method PCR, make it easier for anyone—including those outside the usual networks of government, commercial, and university laboratories and the culture of responsible training and biosecurity—to play with the software of life.

Following the precautionary principle, should we abandon synthetic biology?

My greatest fear is not the abuse of technology, but that we will not use it at all, and turn our backs to an amazing opportunity at a time when we are over-populating our planet and changing environments forever.

You’re bullish about where this is headed.

I am—and a lot of that comes from seeing the next generation of synthetic biologists. We can get a view of what the future holds from a series of contests that culminate in a yearly event in Cambridge, Massachusetts—the International Genetically Engineered Machine (iGEM) competition. High-school and college students shuffle a standard set of DNA subroutines into something new. It gives me hope for the future.

You’ve been working to convert DNA into a digital signal that can be transmitted to a unit which then rebuilds an organism.

At Synthetic Genomics, Inc [which Venter founded with his long-term collaborator, the Nobel laureate Ham Smith], we can feed digital DNA code into a program that works out how to re-synthesize the sequence in the lab. This automates the process of designing overlapping pieces of DNA base-pairs, called oligonucleotides, adding watermarks, and then feeding them into the synthesizer. The synthesizer makes the oligonucleotides, which are pooled and assembled using what we call our Gibson-assembly robot (named after my talented colleague Dan Gibson). NASA has funded us to carry out experiments at its test site in the Mojave Desert. We will be using the JCVI mobile lab, which is equipped with soil-sampling, DNA-isolation and DNA sequencing equipment, to test the steps for autonomously isolating microbes from soil, sequencing their DNA and then transmitting the information to the cloud with what we call a “digitized-life-sending unit”. The receiving unit, where the transmitted DNA information can be downloaded and reproduced anew, has a number of names at present, including “digital biological converter,” “biological teleporter,” and—the preference of former US Wired editor-in-chief and CEO of 3D Robotics, Chris Anderson—”life replicator”.

Read the entire article here.

Image: J Craig Venter. Courtesy of Wikipedia.

Are You An H-less Socialist?

If you’re British and you drop your Hs while speaking, then you’re likely to be considered of inferior breeding stock by the snootier classes. Or, as the Times newspaper put it at the start of the 20th century, you would be considered an “h-less socialist”. Of course, a mere fifty years earlier it was generally acceptable to drop aitches, so you would have been correct in pronouncing “hotel” as “otel” or “horse” as “orse”. And, farther back still, in ancient Rome adding Hs would have earned the scorn of the ruling classes for appearing too Greek. So, who’s right?

If you’re wondering how this all came about and who if anybody is right, check out the new book Alphabetical: How Every Letter Tells A Story by Michael Rosen.

From the Guardian:

The alphabet is something not to be argued with: there are 26 letters in as fixed a sequence as the numbers 1-26; once learned in order and for the “sounds they make”, you have the key to reading and the key to the way the world is classified. Or perhaps not.

Actually, in the course of writing my book about the history of the letters we use, Alphabetical, I discovered that the alphabet is far from neutral. Debates about power and class surround every letter, and H is the most contentious of all. No other letter has had such power to divide people into opposing camps.

In Britain, H owes its name to the Normans, who brought their letter “hache” with them in 1066. Hache is the source of our word “hatchet”: probably because a lower-case H looks a lot like an axe. It has certainly caused a lot of trouble over the years. A century ago people dropping their h’s were described in the Times as “h-less socialists.” In ancient Rome, they were snooty not about people who dropped their Hs but about those who picked up extra ones. Catullus wrote a nasty little poem about Arrius (H’arrius he called him), who littered his sentences with Hs because he wanted to sound more Greek. Almost two thousand years later we are still split, and pronouncing H two ways: “aitch”, which is posh and “right”; and “haitch”, which is not posh and thus “wrong”. The two variants used to mark the religious divide in Northern Ireland – aitch was Protestant, haitch was Catholic, and getting it wrong could be a dangerous business.

Perhaps the letter H was doomed from the start: given that the sound we associate with H is so slight (a little outbreath), there has been debate since at least AD 500 whether it was a true letter or not. In England, the most up-to-date research suggests that some 13th-century dialects were h-dropping, but by the time elocution experts came along in the 18th century, they were pointing out what a crime it is. And then received wisdom shifted, again: by 1858, if I wanted to speak correctly, I should have said “erb”, “ospital” and “umble”.

The world is full of people laying down the law about the “correct” choice: is it “a hotel” or “an otel”; is it “a historian” or “an historian”? But there is no single correct version. You choose. We have no academy to rule on these matters and, even if we did, it would have only marginal effect. When people object to the way others speak, it rarely has any linguistic logic. It is nearly always because of the way that a particular linguistic feature is seen as belonging to a cluster of disliked social features. Writing this book has been a fascinating journey: the story of our alphabet turns out to be a complex tug of war between the people who want to own our language and the people who use it. I know which side I’m on.

Read the (h)entire (h)article ‘ere.

Image: Alphabetical book cover. Courtesy of Michael Rosen.

First, Build A Blue Box; Second, Build Apple

Edward Tufte built the first little blue box in 1962. The blue box contained home-made circuitry and a tone generator that let its builders place free calls over the phone network to anywhere in the world.

This electronic revelation spawned groups of “phone phreaks” (hackers) who would build their own blue boxes to fight Ma Bell (AT&T), illegally, of course. The phreaks assumed suitably disguised names, such as Captain Crunch and Cheshire Cat, to hide from the long arm of the FBI.

This later caught the attention of a pair of new recruits to the subversive cause, Berkeley Blue and Oaf Tobar, who would go on to found Apple under their rather better-known real names, Steve Wozniak and Steve Jobs. The rest, as the saying goes, is history.

Put it down to curiosity, an anti-authoritarian streak and a quest to ever-improve.

From Slate:

One of the most heartfelt—and unexpected—remembrances of Aaron Swartz, who committed suicide last month at the age of 26, came from Yale professor Edward Tufte. During a speech at a recent memorial service for Swartz in New York City, Tufte reflected on his secret past as a hacker—50 years ago.

“In 1962, my housemate and I invented the first blue box,” Tufte said to the crowd. “That’s a device that allows for undetectable, unbillable long distance telephone calls. We played around with it and the end of our research came when we completed what we thought was the longest long-distance phone call ever made, which was from Palo Alto to New York … via Hawaii.”

Tufte was never busted for his youthful forays into phone hacking, also known as phone phreaking. He rose to become one of Yale’s most famous professors, a world authority on data visualization and information design. One can’t help but think that Swartz might have followed in the distinguished footsteps of a professor like Tufte, had he lived.

Swartz faced 13 felony charges and up to 35 years in prison for downloading 4.8 million academic articles from the digital repository JSTOR, using MIT’s network. In the face of the impending trial, Swartz—a brilliant young hacker and activist who was a key force behind many worthy projects, including the RSS 1.0 specification and Creative Commons—killed himself on Jan. 11.

“Aaron’s unique quality was that he was marvelously and vigorously different,” Tufte said, a tear in his eye, as he closed his speech. “There is a scarcity of that. Perhaps we can all be a little more different, too.”

Swartz was too young to be a phone phreak like Tufte. In our present era of Skype and smartphones, the old days of outsmarting Ma Bell with 2600 Hertz sine wave tones and homemade “blue boxes” seems quaint, charmingly retro. But there is a thread that connects these old-school phone hackers to Swartz—common traits that Tufte recognized. It’s not just that, like Swartz, many phone phreaks faced trumped-up charges (wire fraud, in their cases). The best of these proto-computer hackers possessed Swartz’s enterprising spirit, his penchant for questioning authority, and his drive to figure out how a complicated system works from the inside. They were nerds, they were misfits; like Swartz, they were a little more different.

In his new history of phone phreaking, Exploding the Phone, engineer and consultant Phil Lapsley details the story of the 1960s and 1970s culture of hackers who, like Tufte, devised numerous ways to outwit the phone system. The foreword of the book is by Steve Wozniak, co-founder of Apple—and, as it happens, an old-school hacker himself. Before Wozniak and Steve Jobs built Apple in the 1970s, they were phone phreaks. (Wozniak’s hacker name was Berkeley Blue; Jobs’ handle was Oaf Tobar.)

In 1971, Esquire published an article about phone phreaking called “Secrets of the Little Blue Box,” by Ron Rosenbaum (a Slate columnist). It chronicled a ragtag crew sporting names like Captain Crunch and the Cheshire Cat, who prided themselves on using ingenuity and rudimentary electronics to outsmart the many-tentacled monstrosities of Ma Bell and the FBI. A blind 22-year-old named Joe Engressia was one of the scene’s heroes; according to Rosenbaum, Engressia could whistle at exactly the right frequency to place a free phone call.

Wozniak, age 20 in ’71, devoured the now-legendary article. “You know how some articles just grab you from the first paragraph?” he wrote in his 2006 memoir, iWoz, quoted in Lapsley’s book. “Well, it was one of those articles. It was the most amazing article I’d ever read!” Wozniak was entranced by the way these hackers seemed so much like himself. “I could tell that the characters being described were really tech people, much like me, people who liked to design things just to see what was possible, and for no other reason, really.” Building a blue box—a device that could generate the same tones that the phone system used to route phone calls, in a certain sequence—required technical smarts, and Wozniak loved nerdy challenges. Plus, the payoff—and the potential for epic pranks—was irresistible. (Wozniak once used a blue box to call the Vatican; impersonating Henry Kissinger he asked to talk to the pope.)

Wozniak immediately called Jobs, who was then a 17-year-old senior in high school. The friends drove to the technical library at Stanford’s Linear Accelerator Center to find a phone manual that listed tone frequencies. That same day, as Lapsley details in the book, Wozniak and Jobs bought analog tone generator kits, but were soon frustrated that the generators weren’t good enough for really high-quality phone phreaking.

Wozniak had a better, geekier idea: They needed to build their own blue boxes, but make them with digital circuits, which were more precise and easier to control than the usual analog ones. Wozniak and Jobs didn’t just build one blue box—they went on to build dozens of them, which they sold for about $170 apiece. In a way, their sophisticated, compact design foreshadowed the Apple products to come. Their digital circuitry incorporated several smart tricks, including a method to make the battery last longer. “I have never designed a circuit I was prouder of,” Wozniak says.
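As a purely illustrative aside (this is the reviewer’s sketch, not anything from Lapsley’s book or the Slate piece), the short Python script below shows what “generating the right tones” amounts to: it synthesizes a one-second 2600 Hz supervisory tone followed by a single multi-frequency digit and writes the result to a WAV file. The 700 Hz + 900 Hz pairing for the digit 1, and the rough durations, are assumptions drawn from commonly cited descriptions of MF signalling rather than from the book.

    # Illustrative sketch only (Python, standard library). The 2600 Hz figure
    # comes from the article above; the 700 + 900 Hz pair for digit "1" and the
    # rough durations are assumptions based on commonly cited MF signalling.
    import math
    import struct
    import wave

    RATE = 8000  # samples per second; ample for voice-band tones

    def tone(freqs, seconds):
        """Sum the given sine frequencies into 16-bit PCM samples."""
        samples = []
        for i in range(int(RATE * seconds)):
            t = i / RATE
            value = sum(math.sin(2 * math.pi * f * t) for f in freqs) / len(freqs)
            samples.append(int(value * 0.8 * 32767))
        return samples

    # One second of 2600 Hz, a short pause, then the assumed MF pair for "1".
    signal = tone([2600], 1.0) + [0] * (RATE // 2) + tone([700, 900], 0.07)

    with wave.open("bluebox_demo.wav", "wb") as out:
        out.setnchannels(1)   # mono
        out.setsampwidth(2)   # 16-bit samples
        out.setframerate(RATE)
        out.writeframes(struct.pack("<%dh" % len(signal), *signal))

Played into a handset, tones like these are essentially all a blue box ever produced; the cleverness Wozniak describes lay in generating them precisely and reliably with digital circuitry.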

Read the entire article here.

Image: Exploding the Phone by Phil Lapsley, book cover. Courtesy of Barnes & Noble.

Is It Good That Money Can Buy (Almost) Anything?

Money is a curious invention. It enables efficient and almost frictionless commerce and it allows us to assign tangible value to our time. Yet it poses enormous societal challenges and ethical dilemmas. For instance, should we bribe our children with money in return for better grades? Should we allow a chronically ill kidney patient to purchase a replacement organ from a donor?

Raghuram Rajan, professor of finance at the University of Chicago, reviews a fascinating new book that attempts to answer some of these questions. The book, “What Money Can’t Buy: The Moral Limits of Markets”, is written by the noted Harvard philosopher Michael Sandel.

From Project Syndicate:

In an interesting recent book, What Money Can’t Buy: The Moral Limits of Markets, the Harvard philosopher Michael Sandel points to the range of things that money can buy in modern societies and gently tries to stoke our outrage at the market’s growing dominance. Is he right that we should be alarmed?

While Sandel worries about the corrupting nature of some monetized transactions (do kids really develop a love of reading if they are bribed to read books?), he is also concerned about unequal access to money, which makes trades using money inherently unequal. More generally, he fears that the expansion of anonymous monetary exchange erodes social cohesion, and argues for reducing money’s role in society.

Sandel’s concerns are not entirely new, but his examples are worth reflecting upon. In the United States, some companies pay the unemployed to stand in line for free public tickets to congressional hearings. They then sell the tickets to lobbyists and corporate lawyers who have a business interest in the hearing but are too busy to stand in line.

Clearly, public hearings are an important element of participatory democracy. All citizens should have equal access. So selling access seems to be a perversion of democratic principles.

The fundamental problem, though, is scarcity. We cannot accommodate everyone in the room who might have an interest in a particularly important hearing. So we have to “sell” entry. We can either allow people to use their time (standing in line) to bid for seats, or we can auction seats for money. The former seems fairer, because all citizens seemingly start with equal endowments of time. But is a single mother with a high-pressure job and three young children as well endowed with spare time as a student on summer vacation? And is society better off if she, the chief legal counsel for a large corporation, spends much of her time standing in line?

Whether it is better to sell entry tickets for time or for money thus depends on what we hope to achieve. If we want to increase society’s productive efficiency, people’s willingness to pay with money is a reasonable indicator of how much they will gain if they have access to the hearing. Auctioning seats for money makes sense – the lawyer contributes more to society by preparing briefs than by standing in line.

On the other hand, if it is important that young, impressionable citizens see how their democracy works, and that we build social solidarity by making corporate executives stand in line with jobless teenagers, it makes sense to force people to bid with their time and to make entry tickets non-transferable. But if we think that both objectives – efficiency and solidarity – should play some role, perhaps we should turn a blind eye to hiring the unemployed to stand in line in lieu of busy lawyers, so long as they do not corner all of the seats.

What about the sale of human organs, another example Sandel worries about? Something seems wrong when a lung or a kidney is sold for money. Yet we celebrate the kindness of a stranger who donates a kidney to a young child. So, clearly, it is not the transfer of the organ that outrages us – we do not think that the donor is misinformed about the value of a kidney or is being fooled into parting with it. Nor, I think, do we have concerns about the scruples of the person selling the organ – after all, they are parting irreversibly with something that is dear to them for a price that few of us would accept.

I think part of our discomfort has to do with the circumstances in which the transaction takes place. What kind of society do we live in if people have to sell their organs to survive?

Read the entire article here.

Image courtesy of Google.

Book Review: Thinking, Fast and Slow. Daniel Kahneman

Daniel Kahneman brings together for the first time his decades of groundbreaking research and profound thinking in social psychology and cognitive science in his new book, Thinking, Fast and Slow. He presents his current understanding of judgment and decision making and offers insight into how we make choices in our daily lives. Importantly, Kahneman describes how we can identify and overcome the cognitive biases that frequently lead us astray. This is an important work by one of our leading thinkers.

From Skeptic:

The ideas of the Princeton University psychologist Daniel Kahneman, recipient of the Nobel Prize in Economic Sciences for his seminal work that challenged the rational model of judgment and decision making, have had a profound and widely recognized impact on psychology, economics, business, law and philosophy. Until now, however, he has never brought together his many years of research and thinking in one book. In the highly anticipated Thinking, Fast and Slow, Kahneman introduces the “machinery of the mind.” Two systems drive the way we think and make choices: System One is fast, intuitive, and emotional; System Two is slower, more deliberative, and more logical. Examining how both systems function within the mind, Kahneman exposes the extraordinary capabilities and also the faults and biases of fast thinking, and the pervasive influence of intuitive impressions on our thoughts and our choices. Kahneman shows where we can trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and personal lives, and how we can guard against the mental glitches that often get us into trouble. Kahneman will change the way you think about thinking.

Image: Thinking, Fast and Slow, Daniel Kahneman. Courtesy of Publishers Weekly.

Book Review: The Big Thirst. Charles Fishman

Charles Fishman has a fascinating new book entitled The Big Thirst: The Secret Life and Turbulent Future of Water. In it, Fishman examines the origins of water on our planet and postulates an all too probable future in which water becomes an increasingly limited and precious resource.

A brief excerpt from a recent interview, courtesy of NPR:

For most of us, even the most basic questions about water turn out to be stumpers.

Where did the water on Earth come from?

Is water still being created or added somehow?

How old is the water coming out of the kitchen faucet?

For that matter, how did the water get to the kitchen faucet?

And when we flush, where does the water in the toilet actually go?

The things we think we know about water — things we might have learned in school — often turn out to be myths.

We think of Earth as a watery planet, indeed, we call it the Blue Planet; but for all of water’s power in shaping our world, Earth turns out to be surprisingly dry. A little water goes a long way.

We think of space as not just cold and dark and empty, but as barren of water. In fact, space is pretty wet. Cosmic water is quite common.

At the most personal level, there is a bit of bad news. Not only don’t you need to drink eight glasses of water every day, you cannot in any way make your complexion more youthful by drinking water. Your body’s water-balance mechanisms are tuned with the precision of a digital chemistry lab, and you cannot possibly “hydrate” your skin from the inside by drinking an extra bottle or two of Perrier. You just end up with pee sourced in France.

In short, we know nothing of the life of water — nothing of the life of the water inside us, around us, or beyond us. But it’s a great story — captivating and urgent, surprising and funny and haunting. And if we’re going to master our relationship to water in the next few decades — really, if we’re going to remaster our relationship to water — we need to understand the life of water itself.

Read more of this article and Charles Fishman’s interview with NPR here.

Book Review: Are You Serious? Lee Siegel

“You cannot be serious”, goes the oft-quoted opening of many a John McEnroe javelin thrown at an unsuspecting tennis umpire. This leads us to an earnest examination of what it means to be serious, drawn from Lee Siegel’s new book, “Are You Serious?” As Michael Agger points out for Slate:

We don’t know what to take seriously anymore. Is Brian Williams a serious news anchor or is he playing at being serious? How about Jon Stewart? The New York Times exudes seriousness, but the satire of The Onion can also be very serious.

Do we indeed need a how-to manual on exuding the required seriousness in the right circumstances? Do we need a third-party narrator to tell us when to expect seriousness, irony or serious irony? Perhaps Lee Siegel’s book can shed some light.

More from Slate’s review of Siegel’s book:

Siegel’s business-casual jaunt through seriosity begins with the Victorian poet Matthew Arnold, who saw the decline of religious spirit and proposed the “high seriousness” of poetry and literature in its place. “Seriousness implied a trustworthy personality,” Siegel writes, “just as faith in God once implied a trustworthy soul.” The way in which Arnold connected morality to cultural refinement soothed the intellectual insecurity of Americans vis-à-vis Europe and appealed to our ethos of self-improvement. The contemporary disciples of Arnold are those friends of yours who read Ulysses along with Ulysses Annotated, actually go to art galleries, and know their way around a Ring cycle. The way they enjoy culture expresses their seriousness of purpose.

I’ve only pulled at a few of the provocative strings in Siegel’s book. His argument that Sarah Palin is someone who has signaled seriousness by being willing to humiliate herself on reality TV makes a wild sort of sense. At other times, Siegel floats some nonsense that he knows to be silly.

But I don’t want to leave you hanging without providing Siegel’s answer to the question of finding seriousness in life. He gives us his “three pillars”: attention, purpose, continuity. That could mean being a really competent lawyer. Or being so skilled at being a pilot that you land a plane on the Hudson and save everyone onboard. Or being like Socrates and drinking the hemlock to prove that you believed in your ideas. Just find the thing that makes you “fully alive” and then you’re set. Which is to say that although the cultural and political figures we should take seriously change, the prospect of becoming a serious person remains dauntingly unchanged.

More from theSource here.

Book Review: The Believing Brain. Michael Shermer

Skeptic-in-chief Michael Shermer has an important and fascinating new book. The Believing Brain: From Ghosts and Gods to Politics and Conspiracies – How We Construct Beliefs and Reinforce Them as Truths – describes how our beliefs arise from patterns, and how those beliefs come first, with explanations for them coming second.

Shermer reviews 30 years of leading research in cognitive science, neurobiology, evolutionary psychology and anthropology and numerous real-world examples to show how the belief mechanism works. This holds for our beliefs in all manner of important spheres: religion, politics, economics, superstition and the supernatural.

Shermer proposes that our brains are “belief engines” that “look for and find patterns” quite naturally, and it is only following this that our brains assign these patterns with meaning. It is these meaningful patterns that form what Shermer terms “belief-dependent reality.” Additionally, our brains tend to gravitate towards information that further reinforces our beliefs, and ignore data that contradicts these beliefs. This becomes a self-reinforcing loop where beliefs drive explanation seeking behaviors to confirm those beliefs which are further reinforced, and drive further confirmation seeking behavior.

In fact, the human brain is so adept at looking for patterns that it “sees” them in places where none exist. Shermer calls this “illusory correlation”. Birds do it, rats do it; humans are masters at it. B.F. Skinner’s groundbreaking experiments on partial reinforcement in animals show this “patternicity” exquisitely. As Shermer describes:

Skinner discovered that if he randomly delivered the food reinforcement, whatever the pigeon happened to be doing just before the delivery of the food would be repeated the next time, such as spinning around once to the left before pecking at the key. This is pigeon patternicity or the learning of a superstition.

. . . If you doubt its potency as a force in human behavior, just visit a Las Vegas casino and observe people playing the slots with their varied attempts to find a pattern between (A) pulling the slot machine handle and (B) the payoff.

This goes a long way to describing all manner of superstitious behaviors in humans. But Shermer doesn’t stop there. He also describes how and why we look for patterns in the behaviors of others and assign meaning to these as well. Shermer calls this “agenticity”. This is “the tendency to infuse patterns with meaning, intention and agency”. As he goes on to describe:

… we often impart the patterns we find with agency and intention, and believe that these intentional agents control the world, sometimes invisibly from the top down, instead of bottom-up causal laws and randomness that makes up much of our world. Souls, spirits, ghosts, gods, demons, angels, aliens, intelligent designers, government conspiracists, and all manner of invisible agents with power and intention are believed to haunt our world and control our lives. Combined with our propensity to find meaningful patterns in both meaningful and meaningless noise, patternicity and agenticity form the cognitive basis of shamanism, paganism, animism, polytheism, monotheism, and all modes of Old and New Age spiritualisms.

Backed with the results of numerous cross-disciplinary scientific studies, Shermer’s arguments are thoroughly engrossing and objectively difficult to refute. This is by far Shermer’s best book to date.

(By the way, in the interest of full disclosure this book thoroughly validated the reviewer’s own beliefs.)

Book Review: Linchpin. Seth Godin

Phew! Another heartfelt call to action from business blogger Seth Godin to become indispensable.

Author, public speaker, orthogonal thinker and internet marketing maven, Seth Godin makes a compelling case to the artist within us all to get off our backsides, ignore the risk averse “lizard brain” as he puts it, get creative, and give the gift of art. After all there is no way to win the “race to the bottom” wrought by commoditization of both product and labor.

Bear in mind, Godin uses “art” in its most widely used sense, not merely a canvas or a sculpture. Here, art is anything that its maker so creates; it may be a service just as well as an object. Importantly also, to be art it has to be given with the correct intent — as a gift (a transcendent, unexpected act that surpasses expectation).

Critics maintain that his latest bestseller is short on specifics, but indeed it should be. After all, if the process of creating art could be decomposed into an instruction manual, it wouldn’t deliver art; it would deliver a Big Mac. So while we do not get a “7-point plan” that leads to creative nirvana, Godin does a good job, through his tireless combination of anecdote, repetition, historical analysis and social science, of convincing the “anonymous cogs in the machine” to think and act more like the insightful innovators we can all become.

Godin rightly believes that the new world of work is rife with opportunity to add value through creativity, human connection and generosity, and this is the area where the indispensable artist gets to create his or her art, and to become a linchpin in the process. Godin’s linchpin is a rule-breaker, not a follower; a map-maker, not an order taker; a doer not a whiner.

In reading Linchpin we are reminded of the other side of the economy, in which we all unfortunately participate as well: the domain of commoditization, homogeneity and anonymity. This is the domain that artists do their utmost to avoid, and better still, subvert. Of course, this economy provides a benefit too – lower price. However, a “Volkswagen-sized jar of pickles for $3” can only go so far. Commoditization undermines our very social fabric: it undermines our desire for uniqueness and special connection in a service or product that we purchase; it removes our dignity and respect when we allow ourselves to become a disposable part, a human cog, in the job machine. So, jettison the bland, the average, and the subservient, learn to take risks, face fear and become an indispensable, passionate, discerning artist – one who creates and one who gives.

Book Review: America Pacifica

Classic dystopian novels from the likes of Aldous Huxley, George Orwell, Philip K. Dick, Ursula K. Le Guin, and Margaret Atwood appeal for their fantastic narrative journeys. More so, they resonate because it often seems that contemporary society is precariously close to this fictional chaos, dysfunction and destruction; one small step in the wrong direction and over the precipice we go. America Pacifica continues this tradition.

From The Barnes & Noble Review:

Anna North is both a graduate of the Iowa Writers’ Workshop and a writer for the feminist Web site Jezebel. It’s no surprise, then, that her debut novel, America Pacifica, is overflowing with big ideas about revolution, ecology, feminism, class, and poverty. But by the end of page one, when a teenage daughter, Darcy, watches her beloved mother, Sarah, emerge from a communal bathroom down the hall carrying “their” toothbrush, one also knows that this novel, like, say, the dystopic fiction of Margaret Atwood or Ursula K. Le Guin, aims not only to transmit those ideas in the form of an invented narrative, but also to give them the animating, detailed, and less predictable life of literature.

The “America Pacifica” of the title is an unnamed island upon which a generation of North American refugees have attempted to create a simulacrum of their old home–complete with cities named Manhattanville and Little Los Angeles–after an environmental calamity rendered “the mainland” too frigid for human life. Daniel, a mainland scientist, argued that the humans should adapt themselves to the changing climate, while a man named Tyson insisted that they look for a warmer climate and use technology and dirty industrial processes to continue human life as it was once lived. The island’s population is comprised entirely of those who took Tyson’s side of the argument.

But this haven can only sustain enough luxuries for a tiny few. Every aspect of island life is governed by a brutal caste system which divides people into rigid hierarchies based on the order in which they and their families arrived by boat. The rich eat strawberries and fresh tomatoes, wear real fiber, and live in air-conditioned apartments. The poor subsist on meat products fabricated from jellyfish and seaweed, wear synthetic SeaFiber clothing, and dream of somehow getting into college (which isn’t open to them) so they can afford an apartment with their own bathroom and shower.

More from theSource here.

Richard Feynman on the Ascendant

Genius – The Life and Science of Richard Feynman by James Gleick was a good first course for those fascinated by Richard Feynman’s significant contributions to physics, cosmology (and percussion).

Now, eight years later come two more biographies that observe Richard Feynman from very different perspectives, reviewed in the New York Review of Books. The first, Lawrence Krauss’s book, Quantum Man, is the weighty main course; the second, by Jim Ottaviani and artist Leland Myrick, is a graphic-book (as in comic) biography, and a delicious dessert.

In his review — The ‘Dramatic Picture’ of Richard Feynman — Freeman Dyson rightly posits that Richard Feynman’s star may now, or soon, be in the same exalted sphere as Einstein and Hawking. Though, type “Richard” in Google search and wait for its predictive text to fill in the rest and you’ll find that Richard Nixon, Richard Dawkins and Richard Branson rank higher than this giant of physics.

Freeman Dyson for the New York Review of Books:

In the last hundred years, since radio and television created the modern worldwide mass-market entertainment industry, there have been two scientific superstars, Albert Einstein and Stephen Hawking. Lesser lights such as Carl Sagan and Neil Tyson and Richard Dawkins have a big public following, but they are not in the same class as Einstein and Hawking. Sagan, Tyson, and Dawkins have fans who understand their message and are excited by their science. Einstein and Hawking have fans who understand almost nothing about science and are excited by their personalities.

On the whole, the public shows good taste in its choice of idols. Einstein and Hawking earned their status as superstars, not only by their scientific discoveries but by their outstanding human qualities. Both of them fit easily into the role of icon, responding to public adoration with modesty and good humor and with provocative statements calculated to command attention. Both of them devoted their lives to an uncompromising struggle to penetrate the deepest mysteries of nature, and both still had time left over to care about the practical worries of ordinary people. The public rightly judged them to be genuine heroes, friends of humanity as well as scientific wizards.

Two new books now raise the question of whether Richard Feynman is rising to the status of superstar. The two books are very different in style and in substance. Lawrence Krauss’s book, Quantum Man, is a narrative of Feynman’s life as a scientist, skipping lightly over the personal adventures that have been emphasized in earlier biographies. Krauss succeeds in explaining in nontechnical language the essential core of Feynman’s thinking.

… The other book, by writer Jim Ottaviani and artist Leland Myrick, is very different. It is a comic-book biography of Feynman, containing 266 pages of pictures of Feynman and his legendary adventures. In every picture, bubbles of text record Feynman’s comments, mostly taken from stories that he and others had told and published in earlier books.

More from theSource here.

Image courtesy of Shelley Gazin/Corbis.

Book Review: The First Detective

A new book by James Morton examines the life and times of cross-dressing burglar, prison-escapee and snitch turned super-detective Eugène-François Vidocq.

From The Barnes & Noble Review:

The daring costumed escapes and bedsheet-rope prison breaks of the old romances weren’t merely creaky plot devices; they were also the objective correlatives of the lost politics of early modern Europe. Not yet susceptible to legislative amelioration, rules and customs that seemed both indefensible and unassailable had to be vaulted over like collapsing bridges or tunneled under like manor walls. Not only fictional musketeers but such illustrious figures as the young Casanova and the philosopher Jean-Jacques Rousseau spent their early years making narrow escapes from overlapping orthodoxies, swimming moats to marriages of convenience and digging their way  out of prisons of privilege by dressing in drag or posing as noblemen’s sons. If one ran afoul of the local clergy or some aristocratic cuckold, there were always new bishops and magistrates to charm in the next diocese or département.

In 1775–roughly a generation after the exploits of Rousseau and Casanova–a prosperous baker’s son named Eugène-François Vidocq was born in Arras, in northern France. Indolent and adventuresome, he embarked upon a career that in its early phase looked even more hapless and disastrous than those of his illustrious forebears. An indifferent soldier in the chaotic, bloody interregnum of revolutionary France, Vidocq quickly fell into petty crime (at one point, he assumed the name Rousseau for a time as an alias and nom de guerre). A hapless housebreaker and a credulous co-conspirator, his criminal misadventures were equaled only by his skill escaping from the dungeons and bagnes that passed for a penal system in the pre-Napoleonic era.

By 1809, his canniness as an informer landed him a job with the police; with his old criminal comrades as willing foot soldiers, Vidocq organized a brigade de sûreté, a unit of plainclothes police, which in 1813 Napoleon made an official organ of state security. Throughout his subsequent career he would lay much of the foundation of modern policing, and may be considered a forebear not only to the Dupins and the Holmes of modern detective literature but of swashbuckling, above-the-law policemen like Eliot Ness and J. Edgar Hoover as well.

More from theSource here.

Book Review: “Millennium People”: J.G. Ballard’s last hurrah

From Salon:

In this, his last novel, the darkly comic “Millennium People,” J.G. Ballard returns to many of the themes that have established him as one of the 20th century’s principal chroniclers of modernity as dystopia. Throughout his career Ballard, who died in 2009, wrote many different variations on the same theme: A random act of violence propels a somewhat affectless protagonist into a violent pathology lurking just under the tissue-thin layer of postmodern civilization. As in “Crash” (1973) and “Concrete Island” (1974), the car parks, housing estates, motorways and suburban sprawl of London in “Millennium People” form a psychological geography. At its center, Heathrow Airport — a recurrent setting for Ballard — exerts its subtly malevolent pull on the bored lives and violent dreams of the alienated middle class.

“Millennium People” begins with the explosion of a bomb at Heathrow, which kills the ex-wife of David Markham, an industrial psychologist. The normally passive Markham sets out to investigate the anonymous bombing and the gated community of Chelsea Marina, a middle-class neighborhood that has become ground zero for a terrorist group and a burgeoning rebellion of London’s seemingly docile middle class. Exploited not so much for their labor as for their deeply ingrained and self-policing sense of social responsibility and good manners, the educated and professional residents of Chelsea Marina regard themselves as the “new proletariat,” with their exorbitant maintenance and parking fees as the new form of oppression, their careers, cultured tastes and education the new gulag.

In the company of a down-and-out priest and a film professor turned Che Guevara of the Volvo set, Markham quickly discovers that the line between amateur detective and amateur terrorist is not so clear, as he is drawn deeper into acts of sabotage and violence against the symbols and institutions of his own safe and sensible life. Targets include travel agencies, video stores, the Tate Modern, the BBC and National Film Theater — all “soporifics” designed to con people into believing their lives are interesting or going somewhere.

More from theSource here.

Book Review: The Psychopath Test. Jon Ronson

Hilarious and disturbing. I suspect Jon Ronson would strike a couple of checkmarks in the Hare PCL-R Checklist against my name for finding his latest work both hilarious and disturbing. Would this, perhaps, make me a psychopath?

Jon Ronson is the author of The Psychopath Test. The Hare PCL-R, named for its inventor, Canadian psychologist Bob Hare, is the gold standard in personality-trait measurement for psychopathy (officially known as Antisocial Personality Disorder).

Ronson’s book is a fascinating journey through the “madness industry”, covering psychiatrists, clinical psychologists, criminal scientists, criminal profilers, and of course their clients: patients, criminals and the “insane” at large. Fascinated by the psychopathic traits that the industry applied to the criminally insane, Ronson goes on to explore these behavioral and personality traits in the general population. And, perhaps to no surprise, he finds that a not insignificant proportion of business leaders and others in positions of authority could be classified as “psychopaths” based on the standard PCL-R checklist.

Ronson’s stories are poignant. He tells us the tale of Tony, who feigned madness to avoid what he believed would have been a harsher prison sentence for a violent crime. Instead, Tony found himself in Broadmoor, a notorious maximum-security institution for the criminally insane. Twelve years on, Tony, still incarcerated, finds it impossible to convince anyone of his sanity, despite behaving quite normally. His doctors now admit that he was sane at the time of admission, but agree that he must have been nuts to feign insanity in the first place, and furthermore that only someone who is insane could behave so “sanely” while surrounded by the insane!

Tony’s story and the other characters that Ronson illuminates in this work are thoroughly memorable, especially Al Dunlap, the empathy-poor former CEO of Sunbeam — perhaps one of the high-functioning psychopaths living in our midst. Peppered throughout Ronson’s interviews with madmen and madwomen are his perpetual anxiety and self-reflection; now versed in tools such as the PCL-R checklist, he has considerable diagnostic power and insight. As a result, Ronson begins seeing “psychopaths” everywhere.

My only criticism of the book is that Jon Ronson should have made it 200 pages longer and focused much more on the “psychopathic” personalities that roam amongst us, not just those who live behind bars, and on the madness industry itself, now seemingly led by the major pharmaceutical companies.

Book Review: Solar. Ian McEwan

Solar is a timely, hilarious novel from the author of Atonement that examines the self-absorption and (self-)deceptions of Nobel Prize-winning physicist Michael Beard. With his best work many decades behind him, Beard trades on his professional reputation to earn continuing financial favor and maintain influence and respect amongst his peers. And, with his personal life in an ever-tightening downward spiral and his fifth marriage coming to an end, Beard manages to entangle himself in an improbable accident which has the power to re-shape his own world, and the planet in the process.

Ian McEwan’s depiction of Michael Beard is engaging and thoroughly entertaining. Beard hops from relationship to relationship in his singular quest for “love”, but very much on his own terms. And, this very centric view of himself extends to his own science, where his personal contributions don’t seem to be all that they appear. Satire and climate science makes a stylish and witty combination in the hands of McEwan.

Book Review: The Social Animal. David Brooks

David Brooks brings us a detailed journey through the building blocks of the self in his new book, The Social Animal: A Story of Love, Character and Achievement. With his insight and gift for narrative, Brooks weaves an engaging and compelling story of Erica and Harold, using the two characters as platforms on which to visualize the results of numerous psychological, social and cultural studies. Set in the present day, the pair show us, in practical terms, a holistic picture of the unconscious effects of physical and social context on behavior and character. The narrative takes us through typical life events and stages: infancy, childhood, school, parenting, work life, attachment, aging. At each stage, Brooks illustrates his views of the human condition with a flurry of facts and anecdotal studies.

The psychologist in me would say that this is a rather shallow attempt at synthesizing profoundly complex issues. Brooks certainly makes use of many studies from the brain and social sciences, but never dwells long enough to give us a detailed sense of major underlying implications or competing scientific positions. So too, the character development of Erica and Harold lacks the depth and breadth one would expect — Brooks fails to explore much of what typically seems to motivate human behavior: greed, ambition, lust, violence, empathy. Despite these flaws in the execution of the idea, Brooks’ attempt is praiseworthy; perhaps in the hands of a more skilled social scientist, or Rousseau, who used this technique much more effectively, this type of approach would earn a better grade.

Book Review: The Drunkard’s Walk: How Randomness Rules Our Lives. Leonard Mlodinow

Leonard Mlodinow weaves a compelling path through the world of statistical probability, showing us how the laws of chance affect our lives on both personal and grand scales. Mlodinow skillfully illustrates randomness and its profound implications by presenting complex mathematical constructs in language for the rest of us (non-mathematicians), without dumbing down this important subject.

The book defines many of the important mathematical concepts behind randomness and exposes the key fallacies that often blind us as we wander through life on our “drunkard’s walk”. The law of large numbers, the prosecutor’s fallacy, conditional probability, the availability bias and bell curves were never so approachable.

Whether it’s a deluded gambler, a baseball star on a “winning streak” or a fortunate CEO wallowing in the good times, Mlodinow debunks the common conception that skill, planning and foresight, rather than pure chance, lie behind such outsized results. With the skill of a storyteller, Mlodinow shows us how polls, grades, ratings and even measures of corporate success are far less objective and reliable than we ought to believe. Lords of Wall Street, take notice: the secrets of your successes are not all that they seem.
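To make the “winning streak” point concrete, here is a minimal Python sketch (the reviewer’s own illustration, not an example from the book): it simulates thousands of 500-attempt seasons for a purely 50/50 shooter and counts how often a long hot streak turns up by chance alone.

    # Illustrative only -- not from Mlodinow's book. A quick Monte Carlo showing
    # how easily "hot streaks" emerge from nothing but fair coin flips.
    import random

    def longest_streak(flips):
        """Length of the longest run of consecutive successes."""
        best = run = 0
        for hit in flips:
            run = run + 1 if hit else 0
            best = max(best, run)
        return best

    random.seed(42)
    trials = 10_000
    streaks = [
        longest_streak([random.random() < 0.5 for _ in range(500)])
        for _ in range(trials)
    ]

    # In the large majority of these purely random "seasons", the longest streak
    # is at least seven in a row -- exactly the sort of run fans read as skill.
    print(sum(s >= 7 for s in streaks) / trials)

The point is Mlodinow’s, even if the code is not: streaks, clusters and “patterns” of this length are simply what randomness looks like, and attributing them to skill is the drunkard’s walk in action.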