The Middleman is Dead; Long Live the Middleman

In another sign of Amazon’s unquenchable thirst for all things commerce, the company is now moving more aggressively into publishing.

From the New York Times:

Amazon.com has taught readers that they do not need bookstores. Now it is encouraging writers to cast aside their publishers.

Amazon will publish 122 books this fall in an array of genres, in both physical and e-book form. It is a striking acceleration of the retailer’s fledgling publishing program that will place Amazon squarely in competition with the New York houses that are also its most prominent suppliers.

It has set up a flagship line run by a publishing veteran, Laurence Kirshbaum, to bring out brand-name fiction and nonfiction. It signed its first deal with the self-help author Tim Ferriss. Last week it announced a memoir by the actress and director Penny Marshall, for which it paid $800,000, a person with direct knowledge of the deal said.

Publishers say Amazon is aggressively wooing some of their top authors. And the company is gnawing away at the services that publishers, critics and agents used to provide.

Several large publishers declined to speak on the record about Amazon’s efforts. “Publishers are terrified and don’t know what to do,” said Dennis Loy Johnson of Melville House, who is known for speaking his mind.

“Everyone’s afraid of Amazon,” said Richard Curtis, a longtime agent who is also an e-book publisher. “If you’re a bookstore, Amazon has been in competition with you for some time. If you’re a publisher, one day you wake up and Amazon is competing with you too. And if you’re an agent, Amazon may be stealing your lunch because it is offering authors the opportunity to publish directly and cut you out.”

Read more here.


The World Wide Web of Terrorism

From Eurozine:

There are clear signs that Internet radicalization was behind the terrorism of Anders Behring Breivik. Though most research on the phenomenon focuses on jihadism, it can teach us a lot about how Internet radicalization of all kinds can be fought.

On 21 September 2010, Interpol released a press statement on their homepage warning against extremist websites. They pointed out that this is a global threat and that ever more terrorist groups use the Internet to radicalize young people.

“Terrorist recruiters exploit the web to their full advantage as they target young, middle class vulnerable individuals who are usually not on the radar of law enforcement”, said Secretary General Ronald K. Noble. He continued: “The threat is global; it is virtual; and it is on our doorsteps. It is a global threat that only international police networks can fully address.”

Noble pointed out that the Internet has made the radicalization process easier and the war on terror more difficult. Part of the reason, he claimed, is that much of what takes place is not really criminal.

Much research has been done on Internet radicalization over the last few years but the emphasis has been on Islamist terror. The phenomenon can be summarized thus: young boys and men of Muslim background have, via the Internet, been exposed to propaganda, films from war zones, horrifying images of war in Afghanistan, Iraq and Chechnya, and also extreme interpretations of Islam. They are, so to speak, caught in the web, and some have resorted to terrorism, or at least planned it. The BBC documentary Generation Jihad gives an interesting and frightening insight into the phenomenon.

Researchers Tim Stevens and Peter Neumann write in a report focused on Islamist Internet radicalization that Islamist groups are hardly unique in putting the Internet in the service of political extremism:

Although Al Qaeda-inspired Islamist militants represented the most significant terrorist threat to the United Kingdom at the time of writing, Islamist militants are not the only – or even the predominant – group of political extremists engaged in radicalization and recruitment on the internet. Visitor numbers are notoriously difficult to verify, but some of the most popular Islamist militant web forums (for example, Al Ekhlaas, Al Hesbah, or Al Boraq) are easily rivalled in popularity by white supremacist websites such as Stormfront.

Strikingly, Stormfront – an international Internet forum advocating “white nationalism” and dominated by neo-Nazis – is one of the websites visited by the terrorist Anders Behring Breivik, and a forum where he also left comments. In one place he writes about his hope that “the various fractured rightwing movements in Europe and the US reach a common consensus regarding the ‘Islamification of Europe/US’ can try and reach a consensus regarding the issue”. He continues: “After all, we all want the best for our people, and we owe it to them to try to create the most potent alliance which will have the strength to overthrow the governments which support multiculturalism.”

Read more of this article here.

Image courtesy of Eurozine.


Corporations As People And the Threat to Truth

In 2010 the U.S. Supreme Court ruled that corporations can be treated as people, assigning companies First Amendment rights under the Constitution. So, it’s probably only a matter of time before a real person legally marries (and divorces) a corporation. And we’re probably not too far from a future where an American corporate CEO can take the life of a competing company’s boss and “rightfully” declare that it was in competitive self-defense.

In the meantime, the growing, and much needed, debate over corporate power, corporate responsibility and corporate consciousness rolls on. A timely opinion piece by Gary Gutting over at the New York Times gives us more on which to chew.

From the New York Times:

The Occupy Wall Street protest movement has raised serious questions about the role of capitalist institutions, particularly corporations, in our society. Well before the first protester set foot in Zuccotti Park, a heckler urged Mitt Romney to tax corporations rather than people. Romney’s response — “Corporations are people” — stirred a brief but intense controversy. Now thousands of demonstrators have in effect joined the heckler, denouncing corporations as “enemies of the people.”

Who’s right? Thinking pedantically, we can see ways in which Romney was literally correct; for example, corporations are nothing other than the people who own, run and work for them, and they are recognized as “persons” in some technical legal sense.  But it is also obvious that corporations are not people in a full moral sense: they cannot, for example, fall in love, write poetry or be depressed.

Far more important than questions about what corporations are (ontological questions, as philosophers say) is the question of what attitude we should have toward them.  Should we, as corporate public relations statements often suggest, think of them as friends (if we buy and are satisfied with their products) or as family (if we work for them)?  Does it make sense to be loyal to a corporation as either a customer or as an employee?  More generally, even granted that corporations are not fully persons in the way that individuals are, do they have some important moral standing in our society?

My answer to all these questions is no, because corporations have no core dedication to fundamental human values.  (To be clear, I am speaking primarily of large, for-profit, publicly owned corporations.)  Such corporations exist as instruments of profit for their shareholders.  This does not mean that they are inevitably evil or that they do not make essential economic contributions to society.  But it does mean that their moral and social value is entirely instrumental.   There are ways we can use corporations as means to achieve fundamental human values, but corporations do not of themselves work for these values. In fact, left to themselves, they can be serious threats to human values that conflict with the goal of corporate profit.

Corporations are a particular threat to truth, a value essential in a democracy, which places a premium on the informed decisions of individual citizens.  The corporate threat is most apparent in advertising, which explicitly aims at convincing us to prefer a product regardless of its actual merit.

Read more here.

Time Saving Truth from Falsehood and Envy by François Lemoyne. Image courtesy of Wikipedia / Wallace Collection, London.


The Myth of Bottled Water

In 2010 the world spent around $50 billion on bottled water, with over a third accounted for by the United States alone. During this period the United States House of Representatives spent $860,000 on bottled water for its 435 members. This is close to $2,000 per member per year. (Figures according to Corporate Accountability International.)

This is despite the fact that on average bottled water costs around 1,900 times more than its cheaper, less glamorous sibling — tap water. Bottled water has become a truly big business even though science shows no discernible benefit of bottled water over that from the faucet. In fact, around 40 percent of bottled water comes from municipal water supplies anyway.
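The numbers above are easy to sanity-check. A quick sketch, using the House spending figure quoted from Corporate Accountability International, plus assumed illustrative prices (the per-gallon figures below are not from the article) to show where a roughly 1,900x multiple can come from:

```python
# Sanity-check the bottled-water figures quoted above.
house_spend = 860_000        # annual House bottled-water spend, USD
members = 435
per_member = house_spend / members
print(f"Per member: ${per_member:,.0f}/year")  # ≈ $1,977, i.e. "close to $2,000"

# Price multiple of bottled vs. tap water.
# Assumed illustrative prices, not from the article:
bottled_per_gallon = 7.60    # roughly $1 per half-liter bottle
tap_per_gallon = 0.004       # roughly $4 per 1,000 gallons of municipal water
print(f"Bottled costs ~{bottled_per_gallon / tap_per_gallon:,.0f}x tap")  # ≈ 1,900x
```

Different assumed retail prices shift the multiple, but any plausible pair lands in the hundreds-to-thousands range.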

In 2007 Charles Fishman wrote a ground-breaking cover story on the bottled water industry for Fast Company. We excerpt part of the article, Message in a Bottle, below.

By Charles Fishman:

The largest bottled-water factory in North America is located on the outskirts of Hollis, Maine. In the back of the plant stretches the staging area for finished product: 24 million bottles of Poland Spring water. As far as the eye can see, there are double-stacked pallets packed with half-pint bottles, half-liters, liters, “Aquapods” for school lunches, and 2.5-gallon jugs for the refrigerator.

Really, it is a lake of Poland Spring water, conveniently celled off in plastic, extending across 6 acres, 8 feet high. A week ago, the lake was still underground; within five days, it will all be gone, to supermarkets and convenience stores across the Northeast, replaced by another lake’s worth of bottles.

Looking at the piles of water, you can have only one thought: Americans sure are thirsty.

Bottled water has become the indispensable prop in our lives and our culture. It starts the day in lunch boxes; it goes to every meeting, lecture hall, and soccer match; it’s in our cubicles at work; in the cup holder of the treadmill at the gym; and it’s rattling around half-finished on the floor of every minivan in America. Fiji Water shows up on the ABC show Brothers & Sisters; Poland Spring cameos routinely on NBC’s The Office. Every hotel room offers bottled water for sale, alongside the increasingly ignored ice bucket and drinking glasses. At Whole Foods, the upscale emporium of the organic and exotic, bottled water is the number-one item by units sold.

Thirty years ago, bottled water barely existed as a business in the United States. Last year, we spent more on Poland Spring, Fiji Water, Evian, Aquafina, and Dasani than we spent on iPods or movie tickets–$15 billion. It will be $16 billion this year.

Bottled water is the food phenomenon of our times. We–a generation raised on tap water and water fountains–drink a billion bottles of water a week, and we’re raising a generation that views tap water with disdain and water fountains with suspicion. We’ve come to pay good money–two or three or four times the cost of gasoline–for a product we have always gotten, and can still get, for free, from taps in our homes.

When we buy a bottle of water, what we’re often buying is the bottle itself, as much as the water. We’re buying the convenience–a bottle at the 7-Eleven isn’t the same product as tap water, any more than a cup of coffee at Starbucks is the same as a cup of coffee from the Krups machine on your kitchen counter. And we’re buying the artful story the water companies tell us about the water: where it comes from, how healthy it is, what it says about us. Surely among the choices we can make, bottled water isn’t just good, it’s positively virtuous.

Except for this: Bottled water is often simply an indulgence, and despite the stories we tell ourselves, it is not a benign indulgence. We’re moving 1 billion bottles of water around a week in ships, trains, and trucks in the United States alone. That’s a weekly convoy equivalent to 37,800 18-wheelers delivering water. (Water weighs 8 1/3 pounds a gallon. It’s so heavy you can’t fill an 18-wheeler with bottled water–you have to leave empty space.)
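Fishman’s convoy figure can be checked with rough arithmetic. A sketch, assuming an average bottle size of half a liter (an assumption on our part, not a figure from the article):

```python
# Rough check of the "37,800 18-wheelers a week" figure.
# Assumption (not from the article): average bottle = 0.5 liter.
bottles_per_week = 1_000_000_000
liters = bottles_per_week * 0.5
gallons = liters / 3.785                 # liters per US gallon
pounds = gallons * (8 + 1/3)             # water weighs ~8 1/3 lb per gallon
implied_payload = pounds / 37_800        # lb of water carried per truck
print(f"{implied_payload:,.0f} lb of water per truck")  # ≈ 29,000 lb under these assumptions
```

The implied per-truck load depends heavily on the assumed bottle mix and excludes packaging weight, but it shows the figure is the right order of magnitude.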

Meanwhile, one out of six people in the world has no dependable, safe drinking water. The global economy has contrived to deny the most fundamental element of life to 1 billion people, while delivering to us an array of water “varieties” from around the globe, not one of which we actually need. That tension is only complicated by the fact that if we suddenly decided not to purchase the lake of Poland Spring water in Hollis, Maine, none of that water would find its way to people who really are thirsty.

Please read the entire article here.

Image courtesy of Wikipedia.


Brokering the Cloud

Computer hardware reached (or plummeted to, depending upon your viewpoint) the level of commodity a while ago. And of course, some operating-system platforms, software and applications have followed suit recently — think Platform as a Service (PaaS) and Software as a Service (SaaS). So it should come as no surprise to see new services arise that try to match supply and demand, and profit in the process. Welcome to the “cloud brokerage”.

From MIT Technology Review:

Cloud computing has already made accessing computer power more efficient. Instead of buying computers, companies can now run websites or software by leasing time at data centers run by vendors like Amazon or Microsoft. The idea behind cloud brokerages is to take the efficiency of cloud computing a step further by creating a global marketplace where computing capacity can be bought and sold at auction.

Such markets offer steeply discounted rates, and they may also offer financial benefits to companies running cloud data centers, some of which are flush with excess capacity. “The more utilized you are as a [cloud services] provider … the faster return on investment you’ll realize on your hardware,” says Reuven Cohen, founder of Enomaly, a Toronto-based firm that last February launched SpotCloud, cloud computing’s first online spot market.

On SpotCloud, computing power can be bought and sold like coffee, soybeans, or any other commodity. But it’s caveat emptor for buyers, since unlike purchasing computer time with Microsoft, buying on SpotCloud doesn’t offer many contractual guarantees. There is no assurance computers won’t suffer an outage, and sellers can even opt to conceal their identity in a blind auction, so buyers don’t always know whether they’re purchasing capacity from an established vendor or a fly-by-night startup.
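The article doesn’t spell out SpotCloud’s actual matching rules, but the generic idea of a compute spot market can be sketched: the cheapest capacity offers fill the highest-paying bids, and sellers may list anonymously (the “blind auction” mentioned above). Everything below — the `Offer` class, prices, and seller names — is illustrative, not SpotCloud’s real API:

```python
from dataclasses import dataclass

@dataclass
class Offer:
    seller: str
    price_per_hour: float   # USD per instance-hour
    hours: int
    blind: bool = False     # seller identity hidden from buyers

def match(bid_price, hours_wanted, offers):
    """Fill a buy request from the cheapest acceptable offers."""
    fills = []
    for o in sorted(offers, key=lambda o: o.price_per_hour):
        if o.price_per_hour > bid_price or hours_wanted == 0:
            break
        take = min(o.hours, hours_wanted)
        seller = "anonymous" if o.blind else o.seller
        fills.append((seller, o.price_per_hour, take))
        hours_wanted -= take
    return fills

offers = [Offer("BigVendor", 0.12, 500), Offer("?", 0.05, 100, blind=True)]
print(match(bid_price=0.10, hours_wanted=300, offers=offers))
# Fills 100 hours at $0.05 from the anonymous seller; the $0.12 offer exceeds the bid.
```

Note the caveat-emptor point in the excerpt: in a blind fill like this, the buyer never learns whether “anonymous” is an established vendor or a fly-by-night startup.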

Read more here.

Image courtesy of MIT Technology Review.


In Praise of the Bad Bookstore

Tens of thousands of independent bookstores have disappeared from the United States and Europe over the last decade. Even mega-chains like Borders have fallen prey to monumental shifts in the distribution of ideas and content. The very notion of the physical book is under increasing threat from the accelerating momentum of digitalization.

For bibliophiles, particularly those who crave the feel of physical paper, there is a peculiar attractiveness even to the “bad” bookstore (or, in the UK, bookshop): the airport bookshop of last resort, the pulp-fiction bookstore in a suburban mall. Mark O’Connell over at The Millions tells us there is no such thing as a bad bookstore.

From The Millions:

Cultural anxieties are currently running high about the future of the book as a physical object, and about the immediate prospects for survival of actual brick and mortar booksellers. When most people think about the (by now very real) possibility of the retail side of the book business disappearing entirely into the online ether, they mostly tend to focus on the idea of their favorite bookshops shutting their doors for the last time. Sub-Borgesian bibliomaniac that I am (or, if you prefer, pathetic nerd), I have a mental image of the perfect bookshop that I hold in my mind. It’s a sort of Platonic ideal of the retail environment, a perfect confluence of impeccable curation and expansive selection, artfully cluttered and with the kind of quietly hospitable ambiance that makes the passage of time seem irrelevant once you start in on browsing the shelves. For me, the actual place that comes closest to embodying this ideal is the London Review Bookshop in Bloomsbury, run by the people behind the London Review of Books. It’s a beautifully laid-out space in a beautiful building, and its selection of books makes it feel less like an actual shop than the personal library of some extremely wealthy and exceptionally well-read individual. It’s the kind of place, in other words, where you don’t so much want to buy half the books in the shop as buy the shop itself, move in straight away and start living in it. The notion that places like this might no longer exist in a decade or so is depressing beyond measure.

But I don’t live in Bloomsbury, or anywhere near it. I live in a suburb of Dublin where the only bookshop within any kind of plausible walking distance is a small and frankly feeble set-up on the second floor of a grim 1970s-era shopping center, above a large supermarket. It’s flanked by two equally moribund concerns, a small record store and a travel agent, thereby forming the centerpiece of a sad triptych of retail obsolescence. It’s one of those places that makes you wonder how it manages to survive at all.

But I have an odd fondness for it anyway, and I’ll occasionally just wander up there in order to get out of the apartment, or to see whether, through some fluke convergence of whim and circumstance, they have something I might actually want to buy. I’ve often bought books there that I would never have thought to pick up in a better bookshop, gravitating toward them purely by virtue of the fact that there’s nothing else remotely interesting to be had.

And this brings me to the point I want to make about bad bookshops, which is that they’re rarely actually as bad as they seem. In a narrow and counterintuitive sense, they’re sometimes better than good bookshops. The way I see it, there are three basic categories of retail bookseller. There’s the vast warehouse that has absolutely everything you could possibly think of (Strand Bookstore in New York’s East Village, for instance, is a fairly extreme representative of this group, or at least it was the last time I was there ten years ago). Then there’s the “boutique” bookshop, where you get a sense of a strong curatorial presence behind the scenes, and which seems to cater for some aspirational ideal of your better intellectual self. The London Review Bookshop is, for me at least, the ultimate instance of this. And then there’s the third — and by far the largest — category, which is the rubbish bookshop. There are lots of subgenii to this grouping. The suburban shopping center fiasco, as discussed above. The chain outlet crammed with celebrity biographies and supernatural teen romances. The opportunistic fly-by-night operation that takes advantage of some short-term lease opening to sell off a random selection of remaindered titles at low prices before shutting down and moving elsewhere. And, of course, the airport bookshop of last resort.

Catch more of this essay here.

Image courtesy of The Millions.


Book Review: The Big Thirst by Charles Fishman

Charles Fishman has a fascinating new book entitled The Big Thirst: The Secret Life and Turbulent Future of Water. In it Fishman examines the origins of water on our planet and postulates an all too probable future where water becomes an increasingly limited and precious resource.

A brief excerpt from a recent interview, courtesy of NPR:

For most of us, even the most basic questions about water turn out to be stumpers.

Where did the water on Earth come from?

Is water still being created or added somehow?

How old is the water coming out of the kitchen faucet?

For that matter, how did the water get to the kitchen faucet?

And when we flush, where does the water in the toilet actually go?

The things we think we know about water — things we might have learned in school — often turn out to be myths.

We think of Earth as a watery planet, indeed, we call it the Blue Planet; but for all of water’s power in shaping our world, Earth turns out to be surprisingly dry. A little water goes a long way.

We think of space as not just cold and dark and empty, but as barren of water. In fact, space is pretty wet. Cosmic water is quite common.

At the most personal level, there is a bit of bad news. Not only don’t you need to drink eight glasses of water every day, you cannot in any way make your complexion more youthful by drinking water. Your body’s water-balance mechanisms are tuned with the precision of a digital chemistry lab, and you cannot possibly “hydrate” your skin from the inside by drinking an extra bottle or two of Perrier. You just end up with pee sourced in France.

In short, we know nothing of the life of water — nothing of the life of the water inside us, around us, or beyond us. But it’s a great story — captivating and urgent, surprising and funny and haunting. And if we’re going to master our relationship to water in the next few decades — really, if we’re going to remaster our relationship to water — we need to understand the life of water itself.

Read more of this article and Charles Fishman’s interview with NPR here.


Science at its Best: The Universe is Expanding AND Accelerating

The 2011 Nobel Prize in Physics was recently awarded to three scientists: Adam Riess, Saul Perlmutter and Brian Schmidt. Their computations and observations of a very specific type of exploding star upended decades of commonly accepted beliefs about our universe by showing that its expansion is accelerating.

Prior to their observations, first publicly articulated in 1998, general scientific consensus held that the universe would either expand forever at a steady rate, or slow and eventually fold back in on itself in a cosmic Big Crunch.

The discovery by Riess, Perlmutter and Schmidt laid the groundwork for the idea that a mysterious force called “dark energy” is fueling the acceleration. This dark energy is now believed to make up roughly 74 percent of the universe. Direct evidence of dark energy is lacking, but most cosmologists now accept that universal expansion is indeed accelerating.

Re-published here are the notes and a page scan from Riess’s logbook that led to this year’s Nobel Prize, which show the value of the scientific process:

The original article is courtesy of Symmetry Breaking:

In the fall of 1997, I was leading the calibration and analysis of data gathered by the High-z Supernova Search Team, one of two teams of scientists—the other was the Supernova Cosmology Project—trying to determine the fate of our universe: Will it expand forever, or will it halt and contract, resulting in the Big Crunch?

To find the answer, we had to determine the mass of the universe. It can be calculated by measuring how much the expansion of the universe is slowing.

First, we had to find cosmic candles—distant objects of known brightness—and use them as yardsticks. On this page, I checked the reliability of the supernovae, or exploding stars, that we had collected to serve as our candles. I found that the results they yielded for the present expansion rate of the universe (known as the Hubble constant) did not appear to be affected by the age or dustiness of their host galaxies.

Next, I used the data to calculate ΩM, the relative mass of the universe.

It was significantly negative!

The result, if correct, meant that the assumption of my analysis was wrong. The expansion of the universe was not slowing. It was speeding up! How could that be?

I spent the next few days checking my calculation. I found one could explain the acceleration by introducing a vacuum energy, also called the cosmological constant, that pushes the universe apart. In March 1998, we submitted these results, which were published in September 1998.

Today, we know that 74 percent of the universe consists of this dark energy. Understanding its nature remains one of the most pressing tasks for physicists and astronomers alike.

Adam Riess, Johns Hopkins University
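For readers who want the mechanics behind Riess’s “significantly negative” result, the logic can be sketched with textbook cosmology (this derivation is ours, not taken from his logbook). In a universe containing matter with density parameter ΩM and a cosmological constant ΩΛ, the deceleration parameter is

```latex
q_0 = \frac{\Omega_M}{2} - \Omega_\Lambda
```

A supernova fit that assumes ΩΛ = 0 must then explain any observed acceleration (q0 < 0) with a negative ΩM alone, which is physically impossible; hence a “significantly negative” ΩM was the clue that a missing term, the vacuum energy, belonged in the model.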

The discovery, and many others like it both great and small, shows the true power of the scientific process. Scientific results are open to constant refinement, re-evaluation, refutation and re-interpretation. The process leads to inexorable progress towards greater knowledge and understanding, and eventually to truth that most skeptics can embrace. That is, until the next and better theory and corresponding results come along.

Image courtesy of Symmetry Breaking, Adam Riess.


MondayPoem: Water

This week, theDiagonal focuses its energies on that most precious of natural resources — water.

In his short poem “Water”, Ralph Waldo Emerson reminds us of its more fundamental qualities.

Emerson published his first book, Nature, in 1836, in which he outlined his transcendentalist philosophy. As Poetry Foundation elaborates:

His manifesto stated that the world consisted of Spirit (thought, ideas, moral laws, abstract truth, meaning itself ) and Nature (all of material reality, all that atoms comprise); it held that the former, which is timeless, is the absolute cause of the latter, which serves in turn to express Spirit, in a medium of time and space, to the senses. In other words, the objective, physical world—what Emerson called the “Not-Me”—is symbolic and exists for no other purpose than to acquaint human beings with its complement—the subjective, ideational world, identified with the conscious self and referred to in Emersonian counterpoint as the “Me.” Food, water, and air keep us alive, but the ultimate purpose for remaining alive is simply to possess the meanings of things, which by definition involves a translation of the attention from the physical fact to its spiritual value.

By Ralph Waldo Emerson

– Water

The water understands
Civilization well;
It wets my foot, but prettily,
It chills my life, but wittily,
It is not disconcerted,
It is not broken-hearted:
Well used, it decketh joy,
Adorneth, doubleth joy:
Ill used, it will destroy,
In perfect time and measure
With a face of golden pleasure
Elegantly destroy.

Image courtesy of Wikipedia / Creative Commons.


Greatest Literary Suicides

Hot on the heels of our look at literary deaths, we look specifically at the greatest suicides in literature. Although subject to personal taste and sensibility, the starter list excerpted below is a fine beginning, and leaves much to ponder.

From Flavorpill:

1. Ophelia, Hamlet, William Shakespeare

Hamlet’s jilted lover Ophelia drowns in a stream surrounded by the flowers she had held in her arms. Though Ophelia’s death can be parsed as an accident, her growing madness and the fact that she was, as Gertrude says, “incapable of her own distress,” suggest otherwise. And as far as we’re concerned, Gertrude’s monologue about Ophelia’s drowning is one of the most beautiful descriptions of death in Shakespeare.

2. Anna Karenina, Anna Karenina, Leo Tolstoy

In an extremely dramatic move only befitting the emotional mess that is Anna Karenina, the heroine throws herself under a train in her despair, mirroring the novel’s early depiction of a railway worker’s death by similar means.

3. Cecilia Lisbon, The Virgin Suicides, Jeffrey Eugenides

Eugenides’ entire novel deserves to be on this list for its dreamy horror of five sisters killing themselves in the 1970s Michigan suburbs. But the death of the youngest, Cecilia, is the most brutal and distressing. Having failed to kill herself by cutting her wrists, she leaves her own party to throw herself from her bedroom window, landing impaled on the steel fence below.

4. Emma Bovary, Madame Bovary, Gustave Flaubert

In life, Emma Bovary wished for romance, for intrigue, to escape the banalities of her provincial life as a doctor’s wife. Hoping to expire gracefully, she eats a bowl of arsenic, but is punished by hours of indelicate and public suffering before she finally dies.

5. Edna Pontellier, The Awakening, Kate Chopin

This is the first suicide that many students experience in literature, and it is a strange and calm one: Edna simply walks into the water. We imagine the reality of drowning yourself would be much messier, but Chopin’s version is a relief, a cool compress against the pains of Edna’s psyche in beautiful, fluttering prose.

Rounding out the top 10 we have:

Lily Bart, The House of Mirth, Edith Wharton
Septimus Warren Smith, Mrs. Dalloway, Virginia Woolf
James O. Incandeza, Infinite Jest, David Foster Wallace
Romeo and Juliet, Romeo and Juliet, William Shakespeare
Inspector Javert, Les Misérables, Victor Hugo

Read the entire article here.

Ophelia by John Everett Millais (1829–1896). Image courtesy of Wikipedia / Creative Commons.


How Many People Have Died?

Ever wonder how many people have gone before? This succinct infographic, courtesy of Jon Gosier, takes a good stab at answering the question. First, a few assumptions and explanations:

The numbers in this piece are speculative but are as accurate as modern research allows. It’s widely accepted that prior to 2002 there had been somewhere between 106 and 140 billion Homo sapiens born to the world. The graphic below uses the conservative number (106 bn) as the basis for a circle graph. The center dot represents how many people are currently living (red) versus the dead (white). The dashed vertical line shows how much time passed between milestones. The spectral graph immediately below this text illustrates the population ‘benchmarks’ that were used to estimate the population over time. Adding the population numbers gets you to 106 billion. The red sphere is then used to compare against other data.
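The infographic’s central comparison reduces to one division. A quick sketch, using the 106 billion figure above and assuming a 2011 world population of about 7 billion (our assumption, not a number from the graphic):

```python
# The infographic's living-vs-dead comparison as simple arithmetic.
ever_born = 106e9          # conservative estimate of humans ever born (from the text)
alive = 7e9                # assumed approximate world population in 2011
print(f"{alive / ever_born:.1%} of all humans ever born are alive today")  # -> 6.6%
```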

Check out the original here.


Greatest Literary Deaths

Tim Lott over at the Guardian Book Blog wonders which are the most dramatic literary deaths — of characters rather than novelists. Think Heathcliff in Emily Brontë’s Wuthering Heights.

From the Guardian:

What makes for a great literary death scene? This is the question I and the other four judges of the 2012 Wellcome Trust book prize for medicine in literature have been pondering in advance of an event at the Cheltenham festival.

I find many famous death scenes more ludicrous than lachrymose. As with Oscar Wilde’s comment on the death of Dickens’s Little Nell, you would have to have a heart of stone not to laugh at the passing of the awful Tullivers in Mill on the Floss, dragged down clutching one another as the river deliciously finishes them off. More consciously designed to wring laughter out of tragedy, the suicide of Ronald Nimkin in Roth’s Portnoy’s Complaint takes some beating, with Nimkin’s magnificent farewell note to his mother: “Mrs Blumenthal called. Please bring your mah-jongg rules to the game tonight.”

To write a genuinely moving death scene is a challenge for any author. The temptation to retreat into cliché is powerful. For me, the best and most affecting death is that of Harry “Rabbit” Angstrom in John Updike’s Rabbit at Rest. I remember my wife reading this to me out loud as I drove along a motorway. We were both in tears, as he says his farewell to his errant son, Nelson, and then runs out of words, and life itself – “enough. Maybe. Enough.”

But death is a matter of personal taste. The other judges were eclectic in their choices. Roger Highfield, editor of New Scientist, admired the scenes in Sebastian Junger’s The Perfect Storm. At the end of the chapter that seals the fate of the six men on board, Junger writes: “The body could be likened to a crew that resorts to increasingly desperate measures to keep their vessel afloat. Eventually the last wire has shorted out, the last bit of decking has settled under the water.” “The details of death by drowning,” Highfield says, “are so rich and dispassionately drawn that they feel chillingly true.”

Read the entire article here.

Send to Kindle

When Will I Die?

Would you like to know when you will die?

This is a deeply personal and moral question, one that many may prefer to leave unanswered. In any case, while the scientific understanding of aging is making great strides, it cannot yet answer the question, though that may only be a matter of time.

Giles Tremlett over at the Guardian gives us a personal account of the fascinating science of telomeres, the end-caps on our chromosomes, and why they potentially hold a key to that most fateful question.

From the Guardian:

As a taxi takes me across Madrid to the laboratories of Spain’s National Cancer Research Centre, I am fretting about the future. I am one of the first people in the world to provide a blood sample for a new test, which has been variously described as a predictor of how long I will live, a waste of time or a handy indicator of how well (or badly) my body is ageing. Today I get the results.

Some newspapers, to the dismay of the scientists involved, have gleefully announced that the test – which measures the telomeres (the protective caps on the ends of my chromosomes) – can predict when I will die. Am I about to find out that, at least statistically, my days are numbered? And, if so, might new telomere research suggesting we can turn back the hands of the body’s clock and make ourselves “biologically younger” come to my rescue?

The test is based on the idea that biological ageing grinds at your telomeres. And, although time ticks by uniformly, our bodies age at different rates. Genes, environment and our own personal habits all play a part in that process. A peek at your telomeres is an indicator of how you are doing. Essentially, they tell you whether you have become biologically younger or older than other people born at around the same time.

The key measure, explains María Blasco, a 45-year-old molecular biologist, head of Spain’s cancer research centre and one of the world’s leading telomere researchers, is the number of short telomeres. Blasco, who is also one of the co-founders of the Life Length company which is offering the tests, says that short telomeres do not just provide evidence of ageing. They also cause it. Often compared to the plastic caps on a shoelace, there is a critical level at which the fraying becomes irreversible and triggers cell death. “Short telomeres are causal of disease because when they are below a [certain] length they are damaging for the cells. The stem cells of our tissues do not regenerate and then we have ageing of the tissues,” she explains. That, in a cellular nutshell, is how ageing works. Eventually, so many of our telomeres are short that some key part of our body may stop working.

The research is still in its early days but extreme stress, for example, has been linked to telomere shortening. I think back to a recent working day that took in three countries, three news stories, two international flights, a public lecture and very little sleep. Reasonable behaviour, perhaps, for someone in their 30s – but I am closer to my 50s. Do days like that shorten my expected, or real, life-span?

Read more of this article here.

Image: chromosomes capped by telomeres (white), courtesy of Wikipedia.

Send to Kindle

The Climate Spin Cycle

There’s something to be said for a visual aid that puts a complex conversation about simple ideas into perspective. So here we have a high-level flow chart that characterizes one of the most important debates of our time — climate change. Whether you are for or against the notion or the science, or merely perplexed by the hyperbole inside the “echo chamber,” there is no denying that this debate will remain with us for quite some time.

Chart courtesy of Riley E. Dunlap and Aaron M. McCright, “Organized Climate-Change Denial,” in J. S. Dryzek, R. B. Norgaard and D. Schlosberg (eds.), Oxford Handbook of Climate Change and Society. New York: Oxford University Press, 2011.

Send to Kindle

C is for Dennis Ritchie

Last week, on October 12, 2011, Dennis Ritchie passed away. Most of the mainstream media failed to report his death — after all, he was never quite as flamboyant as that other technology darling, Steve Jobs. However, his contributions to technology and computer science should certainly place him in the same club.

After all, Dennis Ritchie created the C programming language, and he significantly influenced the design of many languages that followed. With Ken Thompson, he also pioneered the Unix operating system. Both C and Unix now underpin much of the world’s computing.

Dennis Ritchie and co-developer Ken Thompson were awarded the National Medal of Technology by President Bill Clinton in 1999.

Image courtesy of Wikipedia.

Send to Kindle

Mapping the Murder Rate

A sad but nonetheless interesting infographic of murder rates throughout the world. The rates are per 100,000 of the population. The United States, with a rate of 5 per 100,000, ranks close to Belarus, Peru and Thailand. Interestingly, it has a higher murder rate than Turkmenistan (4.4), Uzbekistan (3.1), Afghanistan (2.4), Syria (3) and Iran (3).

The top 5 countries with the highest murder rates are:

Send to Kindle

Selflessness versus Selfishness: Either Extreme Can Be Bad

From the New York Times:

Some years ago, Dr. Robert A. Burton was the neurologist on call at a San Francisco hospital when a high-profile colleague from the oncology department asked him to perform a spinal tap on an elderly patient with advanced metastatic cancer. The patient had seemed a little fuzzy-headed that morning, and the oncologist wanted to check for meningitis or another infection that might be treatable with antibiotics.

Dr. Burton hesitated. Spinal taps are painful. The patient’s overall prognosis was beyond dire. Why go after an ancillary infection? But the oncologist, known for his uncompromising and aggressive approach to treatment, insisted.

“For him, there was no such thing as excessive,” Dr. Burton said in a telephone interview. “For him, there was always hope.”

On entering the patient’s room with spinal tap tray portentously agleam, Dr. Burton encountered the patient’s family members. They begged him not to proceed. The frail, bedridden patient begged him not to proceed. Dr. Burton conveyed their pleas to the oncologist, but the oncologist continued to lobby for a spinal tap, and the exhausted family finally gave in.

As Dr. Burton had feared, the procedure proved painful and difficult to administer. It revealed nothing of diagnostic importance. And it left the patient with a grinding spinal-tap headache that lasted for days, until the man fell into a coma and died of his malignancy.

Dr. Burton had admired his oncology colleague (now deceased), yet he also saw how the doctor’s zeal to heal could border on fanaticism, and how his determination to help his patients at all costs could perversely end up hurting them.

The author of “On Being Certain” and the coming “A Skeptic’s Guide to the Mind,” Dr. Burton is a contributor to a scholarly yet surprisingly sprightly volume called “Pathological Altruism,” to be published this fall by Oxford University Press. And he says his colleague’s behavior is a good example of that catchily contradictory term, just beginning to make the rounds through the psychological sciences.

As the new book makes clear, pathological altruism is not limited to showcase acts of self-sacrifice, like donating a kidney or a part of one’s liver to a total stranger. The book is the first comprehensive treatment of the idea that when ostensibly generous “how can I help you?” behavior is taken to extremes, misapplied or stridently rhapsodized, it can become unhelpful, unproductive and even destructive.

Selflessness gone awry may play a role in a broad variety of disorders, including anorexia and animal hoarding, women who put up with abusive partners and men who abide alcoholic ones.

Read more here.

Image courtesy of Serge Bloch, New York Times.

Send to Kindle

MondayPoem: And Death Shall Have No Dominion

Ushering in our week of articles focused mostly on death and loss is a classic piece by Welshman, Dylan Thomas. Although Thomas’ literary legacy is colored by his legendary drinking and philandering, many critics now seem to agree that his poetry belongs in the same class as that of W.H. Auden.

By Dylan Thomas:

– And Death Shall Have No Dominion

And death shall have no dominion.
Dead men naked they shall be one
With the man in the wind and the west moon;
When their bones are picked clean and the clean bones gone,
They shall have stars at elbow and foot;
Though they go mad they shall be sane,
Though they sink through the sea they shall rise again;
Though lovers be lost love shall not;
And death shall have no dominion.

And death shall have no dominion.
Under the windings of the sea
They lying long shall not die windily;
Twisting on racks when sinews give way,
Strapped to a wheel, yet they shall not break;
Faith in their hands shall snap in two,
And the unicorn evils run them through;
Split all ends up they shan’t crack;
And death shall have no dominion.

And death shall have no dominion.
No more may gulls cry at their ears
Or waves break loud on the seashores;
Where blew a flower may a flower no more
Lift its head to the blows of the rain;
Though they be mad and dead as nails,
Heads of the characters hammer through daisies;
Break in the sun till the sun breaks down,
And death shall have no dominion.

Send to Kindle

Remembering Another Great Inventor: Edwin Land

From the New York Times:

IN the memorials to Steven P. Jobs this week, Apple’s co-founder was compared with the world’s great inventor-entrepreneurs: Thomas Edison, Henry Ford, Alexander Graham Bell. Yet virtually none of the obituaries mentioned the man Jobs himself considered his hero, the person on whose career he explicitly modeled his own: Edwin H. Land, the genius domus of Polaroid Corporation and inventor of instant photography.

Land, in his time, was nearly as visible as Jobs was in his. In 1972, he made the covers of both Time and Life magazines, probably the only chemist ever to do so. (Instant photography was a genuine phenomenon back then, and Land had created the entire medium, once joking that he’d worked out the whole idea in a few hours, then spent nearly 30 years getting those last few details down.) And the more you learn about Land, the more you realize how closely Jobs echoed him.

Both built multibillion-dollar corporations on inventions that were guarded by relentless patent enforcement. (That also kept the competition at bay, and the profit margins up.) Both were autodidacts, college dropouts (Land from Harvard, Jobs from Reed) who more than made up for their lapsed educations by cultivating extremely refined taste. At Polaroid, Land used to hire Smith College’s smartest art-history majors and send them off for a few science classes, in order to create chemists who could keep up when his conversation turned from Maxwell’s equations to Renoir’s brush strokes.

Most of all, Land believed in the power of the scientific demonstration. Starting in the 60s, he began to turn Polaroid’s shareholders’ meetings into dramatic showcases for whatever line the company was about to introduce. In a perfectly art-directed setting, sometimes with live music between segments, he would take the stage, slides projected behind him, the new product in hand, and instead of deploying snake-oil salesmanship would draw you into Land’s World. By the end of the afternoon, you probably wanted to stay there.

Three decades later, Jobs would do exactly the same thing, except in a black turtleneck and jeans. His admiration for Land was open and unabashed. In 1985, he told an interviewer, “The man is a national treasure. I don’t understand why people like that can’t be held up as models: This is the most incredible thing to be — not an astronaut, not a football player — but this.”

Read the full article here.

Edwin Herbert Land. Photograph by J. J. Scarpetti, The National Academies Press.

Send to Kindle

A Medical Metaphor for Climate Risk

While scientific evidence of climate change continues to mount, and an increasing number of studies point causal fingers at ourselves, there is perhaps another way to visualize the risk of inaction or over-reaction. Since most people can leave ideology aside when it comes to their own health, a medical metaphor, courtesy of Andrew Revkin over at Dot Earth, may help broaden acceptance of the message.

From the New York Times:

Paul C. Stern, the director of the National Research Council committee on the human dimensions of global change, has been involved in a decades-long string of studies of behavior, climate change and energy choices.

This is an arena that is often attacked by foes of cuts in greenhouse gases, who see signs of mind control and propaganda. Stern says that has nothing to do with his approach, as he made clear in “Contributions of Psychology to Limiting Climate Change,” a paper that was part of a special issue of the journal American Psychologist on climate change and behavior:

Psychological contributions to limiting climate change will come not from trying to change people’s attitudes, but by helping to make low-carbon technologies more attractive and user-friendly, economic incentives more transparent and easier to use, and information more actionable and relevant to the people who need it.

The special issue of the journal builds on a 2009 report on climate and behavior from the American Psychological Association that was covered here. Stern has now offered a reaction to the discussion last week of Princeton researcher Robert Socolow’s call for a fresh approach to climate policy that acknowledges “the news about climate change is unwelcome, that today’s climate science is incomplete, and that every ’solution’ carries risk.” Stern’s response, centered on a medical metaphor (not the first) is worth posting as a “Your Dot” contribution. You can find my reaction to his idea below. Here’s Stern’s piece:

I agree with Robert Socolow that scientists could do better at encouraging a high quality of discussion about climate change.

But providing better technical descriptions will not help most people because they do not follow that level of detail.  Psychological research shows that people often use simple, familiar mental models as analogies for complex phenomena.  It will help people think through climate choices to have a mental model that is familiar and evocative and that also neatly encapsulates Socolow’s points that the news is unwelcome, that science is incomplete, and that some solutions are dangerous. There is such a model.

Too many people think of climate science as an exact science like astronomy that can make highly confident predictions, such as about lunar eclipses.  That model misrepresents the science, does poorly at making Socolow’s points, and has provided an opening for commentators and bloggers seeking to use any scientific disagreement to discredit the whole body of knowledge.

A mental model from medical science might work better.  In the analogy, the planet is a patient suspected of having a serious, progressive disease (anthropogenic climate change).  The symptoms are not obvious, just as they are not with diabetes or hypertension, but the disease may nevertheless be serious.  Humans, as guardians of the planet, must decide what to do.  Scientists are in the role of physician.  The guardians have been asking the physicians about the diagnosis (is this disease present?), the nature of the disease, its prognosis if untreated, and the treatment options, including possible side effects.  The medical analogy helps clarify the kinds of errors that are possible and can help people better appreciate how science can help and think through policy choices.

Diagnosis. A physician must be careful to avoid two errors:  misdiagnosing the patient with a dread disease that is not present, and misdiagnosing a seriously ill patient as healthy.  To avoid these types of error, physicians often run diagnostic tests or observe the patient over a period of time before recommending a course of treatment.  Scientists have been doing this with Earth’s climate at least since 1959, when strong signs of illness were reported from observations in Hawaii.

Scientists now have high confidence that the patient has the disease.  We know the causes:  fossil fuel consumption, certain land cover changes, and a few other physical processes. We know that the disease produces a complex syndrome of symptoms involving change in many planetary systems (temperature, precipitation, sea level and acidity balance, ecological regimes, etc.).  The patient is showing more and more of the syndrome, and although we cannot be sure that each particular symptom is due to climate change rather than some other cause, the combined evidence justifies strong confidence that the syndrome is present.

Prognosis. Fundamental scientific principles tell us that the disease is progressive and very hard to reverse.  Observations tell us that the processes that cause it have been increasing, as have the symptoms.  Without treatment, they will get worse.  However, because this is an extremely rare disease (in fact, the first known case), there is uncertainty about how fast it will progress.  The prognosis could be catastrophic, but we cannot assign a firm probability to the worst outcomes, and we are not even sure what the most likely outcome is.  We want to avoid either seriously underestimating or overestimating the seriousness of the prognosis.

Treatment. We want treatments that improve the patient’s chances at low cost and with limited adverse side effects and we want to avoid “cures” that might be worse than the disease.  We want to consider the chances of improvement for each treatment, and its side effects, in addition to the untreated prognosis.  We want to avoid the dangers both of under-treatment and of side effects.  We know that some treatments (the ones limiting climate change) get at the causes and could alleviate all the symptoms if taken soon enough.  But reducing the use of fossil fuels quickly could be painful.  Other treatments, called adaptations, offer only symptomatic relief.  These make sense because even with strong medicine for limiting climate change, the disease will get worse before it gets better.

Choices. There are no risk-free choices.  We know that the longer treatment is postponed, the more painful it will be, and the worse the prognosis.  We can also use an iterative treatment approach (as Socolow proposed), starting some treatments and monitoring their effects and side effects before raising the dose.  People will disagree about the right course of treatment, but thinking about the choices in this way might give the disagreements the appropriate focus.

Read more here.

Image courtesy of Stephen Wilkes for The New York Times.

Send to Kindle

A Commencement Address for Each of Us: Stay Hungry. Stay Foolish.

Much has been written to honor the life of Steve Jobs, who passed away October 5, 2011 at the young age of 56. Much more will be written. To honor his vision and passion, we reprint below a rare public speech given by Steve Jobs at the Stanford University Commencement on June 12, 2005. The address is a very personal and thoughtful story of innovation, love and loss, and death.

Courtesy of Stanford University:

I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I’ve ever gotten to a college graduation. Today I want to tell you three stories from my life. That’s it. No big deal. Just three stories.

The first story is about connecting the dots.

I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out?

It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking: “We have an unexpected baby boy; do you want him?” They said: “Of course.” My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college.

And 17 years later I did go to college. But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents’ savings were being spent on my college tuition. After six months, I couldn’t see the value in it. I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out. And here I was spending all of the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back it was one of the best decisions I ever made. The minute I dropped out I could stop taking the required classes that didn’t interest me, and begin dropping in on the ones that looked interesting.

It wasn’t all romantic. I didn’t have a dorm room, so I slept on the floor in friends’ rooms, I returned coke bottles for the 5¢ deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example:

Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn’t have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and san serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating.

None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it’s likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later.

Again, you can’t connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

My second story is about love and loss.

I was lucky — I found what I loved to do early in life. Woz and I started Apple in my parents garage when I was 20. We worked hard, and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees. We had just released our finest creation — the Macintosh — a year earlier, and I had just turned 30. And then I got fired. How can you get fired from a company you started? Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well. But then our visions of the future began to diverge and eventually we had a falling out. When we did, our Board of Directors sided with him. So at 30 I was out. And very publicly out. What had been the focus of my entire adult life was gone, and it was devastating.

I really didn’t know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down – that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me — I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over.

I didn’t see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.

During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife. Pixar went on to create the worlds first computer animated feature film, Toy Story, and is now the most successful animation studio in the world. In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple’s current renaissance. And Laurene and I have a wonderful family together.

I’m pretty sure none of this would have happened if I hadn’t been fired from Apple. It was awful tasting medicine, but I guess the patient needed it. Sometimes life hits you in the head with a brick. Don’t lose faith. I’m convinced that the only thing that kept me going was that I loved what I did. You’ve got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven’t found it yet, keep looking. Don’t settle. As with all matters of the heart, you’ll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don’t settle.

My third story is about death.

When I was 17, I read a quote that went something like: “If you live each day as if it was your last, someday you’ll most certainly be right.” It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself: “If today were the last day of my life, would I want to do what I am about to do today?” And whenever the answer has been “No” for too many days in a row, I know I need to change something.

Remembering that I’ll be dead soon is the most important tool I’ve ever encountered to help me make the big choices in life. Because almost everything — all external expectations, all pride, all fear of embarrassment or failure – these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.

About a year ago I was diagnosed with cancer. I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas. I didn’t even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctor’s code for prepare to die. It means to try to tell your kids everything you thought you’d have the next 10 years to tell them in just a few months. It means to make sure everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.

I lived with that diagnosis all day. Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and I’m fine now.

This was the closest I’ve been to facing death, and I hope it’s the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:

No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.

Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma — which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960’s, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.

Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: “Stay Hungry. Stay Foolish.” It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.

Stay Hungry. Stay Foolish.

Thank you all very much.

Send to Kindle

Global Interconnectedness: Submarine Cables

Apparently only 1 percent of global internet traffic is transmitted via satellite or terrestrial radio. The remaining 99 percent is still carried over cable — fiber optic and copper. Much of this cable is strewn for many thousands of miles across the seabeds of our deepest oceans.

For a fascinating view of these intricate systems, and to learn why and how Brazil is connected to Angola, or Auckland, New Zealand, to Redondo Beach, California, via the 12,750 km Pacific Fiber, check the interactive Submarine Cable Map from TeleGeography.


Steve Jobs: The Secular Prophet

The world will miss Steve Jobs.

In early 2010 the U.S. Supreme Court overturned years of legal precedent by extending First Amendment (free speech) protections to corporations. We could argue the merits and demerits of this staggering ruling until the cows come home. However, one thing is clear if corporations are to be judged as people: the world would in all likelihood benefit more from a corporation with a human, optimistic and passionate face (Apple) than from a faceless one (Exxon), an ideological one (News Corp) or an opaque one (Koch Industries).

That said, we excerpt a fascinating essay on Steve Jobs by Andy Crouch below. We would encourage Mr. Crouch to take this worthy idea further by examining the Fortune 1000 list of corporations. Could he deliver a similar analysis for each of those corporations’ leaders? We believe not.

The world will miss Steve Jobs.

By Andy Crouch for the Wall Street Journal:

Steve Jobs was extraordinary in countless ways—as a designer, an innovator, a (demanding and occasionally ruthless) leader. But his most singular quality was his ability to articulate a perfectly secular form of hope. Nothing exemplifies that ability more than Apple’s early logo, which slapped a rainbow on the very archetype of human fallenness and failure—the bitten fruit—and turned it into a sign of promise and progress.

That bitten apple was just one of Steve Jobs’s many touches of genius, capturing the promise of technology in a single glance. The philosopher Albert Borgmann has observed that technology promises to relieve us of the burden of being merely human, of being finite creatures in a harsh and unyielding world. The biblical story of the Fall pronounced a curse upon human work—”cursed is the ground for thy sake; in sorrow shalt thou eat of it all the days of thy life.” All technology implicitly promises to reverse the curse, easing the burden of creaturely existence. And technology is most celebrated when it is most invisible—when the machinery is completely hidden, combining godlike effortlessness with blissful ignorance about the mechanisms that deliver our disburdened lives.

Steve Jobs was the evangelist of this particular kind of progress—and he was the perfect evangelist because he had no competing source of hope. He believed so sincerely in the “magical, revolutionary” promise of Apple precisely because he believed in no higher power. In his celebrated Stanford commencement address (which is itself an elegant, excellent model of the genre), he spoke frankly about his initial cancer diagnosis in 2003. It’s worth pondering what Jobs did, and didn’t, say:

“No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because death is very likely the single best invention of life. It’s life’s change agent; it clears out the old to make way for the new. Right now, the new is you. But someday, not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it’s quite true. Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma, which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice, heart and intuition. They somehow already know what you truly want to become.”

This is the gospel of a secular age.

Steve Jobs by Tim O’Brien, image courtesy of Wall Street Journal.


Googlization of the Globe: For Good (or Evil)

Google’s oft-quoted corporate mantra — don’t be evil — reminds us to remain vigilant even if the company believes it does good and can do no wrong.

Google serves up countless search results to ease our never-ending thirst for knowledge, deals, news, quotes, jokes, user manuals, contacts, products and so on. This is clearly of tremendous benefit to us, to Google and to Google’s advertisers. Of course, in fulfilling our searches Google collects equally staggering amounts of information — about us. Increasingly the company will know where we are, what we like and dislike, what we prefer, what we do, where we travel, with whom and why, who our friends are, what we read, what we buy.

As Jaron Lanier remarked in a recent post, there is a fine line between being a global index to the world’s free and open library of information and being the paid gatekeeper to our collective knowledge and the hoarder of our collective online (and increasingly offline) behaviors, tracks and memories. We have already seen how Google, and others, can personalize search results based on our previous tracks, thus filtering and biasing what we see and read and limiting our exposure to alternate views and opinions.

It’s quite easy to imagine a rather more dystopian view of a society gone awry manipulated by a not-so-benevolent Google when, eventually, founders Brin and Page retire to their vacation bases on the moon.

With this in mind, Daniel Soar over at the London Review of Books reviews several recent books about Google and offers some interesting insights.

From the London Review of Books:

This spring, the billionaire Eric Schmidt announced that there were only four really significant technology companies: Apple, Amazon, Facebook and Google, the company he had until recently been running. People believed him. What distinguished his new ‘gang of four’ from the generation it had superseded – companies like Intel, Microsoft, Dell and Cisco, which mostly exist to sell gizmos and gadgets and innumerable hours of expensive support services to corporate clients – was that the newcomers sold their products and services to ordinary people. Since there are more ordinary people in the world than there are businesses, and since there’s nothing that ordinary people don’t want or need, or can’t be persuaded they want or need when it flashes up alluringly on their screens, the money to be made from them is virtually limitless. Together, Schmidt’s four companies are worth more than half a trillion dollars. The technology sector isn’t as big as, say, oil, but it’s growing, as more and more traditional industries – advertising, travel, real estate, used cars, new cars, porn, television, film, music, publishing, news – are subsumed into the digital economy. Schmidt, who as the ex-CEO of a multibillion-dollar corporation had learned to take the long view, warned that not all four of his disruptive gang could survive. So – as they all converge from their various beginnings to compete in the same area, the place usually referred to as ‘the cloud’, a place where everything that matters is online – the question is: who will be the first to blink?

If the company that falters is Google, it won’t be because it didn’t see the future coming. Of Schmidt’s four technology juggernauts, Google has always been the most ambitious, and the most committed to getting everything possible onto the internet, its mission being ‘to organise the world’s information and make it universally accessible and useful’. Its ubiquitous search box has changed the way information can be got at to such an extent that ten years after most people first learned of its existence you wouldn’t think of trying to find out anything without typing it into Google first. Searching on Google is automatic, a reflex, just part of what we do. But an insufficiently thought-about fact is that in order to organise the world’s information Google first has to get hold of the stuff. And in the long run ‘the world’s information’ means much more than anyone would ever have imagined it could. It means, of course, the totality of the information contained on the World Wide Web, or the contents of more than a trillion webpages (it was a trillion at the last count, in 2008; now, such a number would be meaningless). But that much goes without saying, since indexing and ranking webpages is where Google began when it got going as a research project at Stanford in 1996, just five years after the web itself was invented. It means – or would mean, if lawyers let Google have its way – the complete contents of every one of the more than 33 million books in the Library of Congress or, if you include slightly varying editions and pamphlets and other ephemera, the contents of the approximately 129,864,880 books published in every recorded language since printing was invented. It means every video uploaded to the public internet, a quantity – if you take the Google-owned YouTube alone – that is increasing at the rate of nearly an hour of video every second.

Read more here.


MondayPoem: Further In

Tomas Tranströmer is one of Sweden’s leading poets. He studied poetry and psychology at the University of Stockholm. Tranströmer was awarded the 2011 Nobel Prize for Literature “because, through his condensed, translucent images, he gives us fresh access to reality”.

By Tomas Tranströmer:

– Further In
On the main road into the city
when the sun is low.
The traffic thickens, crawls.
It is a sluggish dragon glittering.
I am one of the dragon’s scales.
Suddenly the red sun is
right in the middle of the windscreen
streaming in.
I am transparent
and writing becomes visible
inside me
words in invisible ink
which appear
when the paper is held to the fire!
I know I must get far away
straight through the city and then
further until it is time to go out
and walk far in the forest.
Walk in the footprints of the badger.
It gets dark, difficult to see.
In there on the moss lie stones.
One of the stones is precious.
It can change everything
it can make the darkness shine.
It is a switch for the whole country.
Everything depends on it.
Look at it, touch it…


Human Evolution Marches On

From Wired:

Though ongoing human evolution is difficult to see, researchers believe they’ve found signs of rapid genetic changes among the recent residents of a small Canadian town.

Between 1800 and 1940, mothers in Ile aux Coudres, Quebec gave birth at steadily younger ages, with the average age of first maternity dropping from 26 to 22. Increased fertility, and thus larger families, could have been especially useful in the rural settlement’s early history.

According to University of Quebec geneticist Emmanuel Milot and colleagues, other possible explanations, such as changing cultural or environmental influences, don’t fit. The changes appear to reflect biological evolution.

“It is often claimed that modern humans have stopped evolving because cultural and technological advancements have annihilated natural selection,” wrote Milot’s team in their Oct. 3 Proceedings of the National Academy of Sciences paper. “Our study supports the idea that humans are still evolving. It also demonstrates that microevolution is detectable over just a few generations.”

Milot’s team based their study on detailed birth, marriage and death records kept by the Catholic church in Ile aux Coudres, a small and historically isolated French-Canadian island town in the Gulf of St. Lawrence. It wasn’t just the fact that average first birth age — a proxy for fertility — dropped from 26 to 22 in 140 years that suggested genetic changes. After all, culture or environment might have been wholly responsible, as nutrition and healthcare are for recent, rapid changes in human height. Rather, it was how ages dropped that caught their eye.

The patterns fit with models of gene-influenced natural selection. Moreover, thanks to the detailed record-keeping, it was possible to look at other possible explanations. Were better nutrition responsible, for example, improved rates of infant and juvenile mortality should have followed; they didn’t. Neither did the late-19th century transition from farming to more diversified professions.

Read more here.


Misconceptions of Violence

We live in violent times. Or do we?

Despite the seemingly constant flow of human-engineered destruction visited upon our fellow humans, other species and our precious environment, some thoughtful analysis — beyond the headlines of cable news — shows that all may not be lost to our violent nature. An insightful interview with psychologist Steven Pinker, author of “How the Mind Works,” shows us that contemporary humans are not as bad as we may have thought. His latest book, “The Better Angels of Our Nature: Why Violence Has Declined,” analyzes the basis and history of human violence. Perhaps surprisingly, Pinker suggests that we live in remarkably peaceful times, comparatively speaking. Characteristically he backs up his claims with clear historical evidence.

From Gareth Cook for Mind Matters:

COOK: What would you say is the biggest misconception people have about violence?
PINKER: That we are living in a violent age. The statistics suggest that this may be the most peaceable time in our species’s existence.

COOK: Can you give a sense for how violent life was 500 or 1000 years ago?
PINKER: Statistics aside, accounts of daily life in medieval and early modern Europe reveal a society soaked in blood and gore. Medieval knights—whom today we would call warlords—fought their numerous private wars with a single strategy: kill as many of the opposing knight’s peasants as possible. Religious instruction included prurient descriptions of how the saints of both sexes were tortured and mutilated in ingenious ways. Corpses broken on the wheel, hanging from gibbets, or rotting in iron cages where the sinner had been left to die of exposure and starvation were a common part of the landscape. For entertainment, one could nail a cat to a post and try to head-butt it to death, or watch a political prisoner get drawn and quartered, which is to say partly strangled, disemboweled, and castrated before being decapitated. So many people had their noses cut off in private disputes that medical textbooks had procedures that were alleged to grow them back.

COOK: How has neuroscience contributed to our understanding of violence and its origins?
PINKER: Neuroscientists have long known that aggression in animals is not a unitary phenomenon driven by a single hormone or center. When they stimulate one part of the brain of a cat, it will lunge for the experimenter in a hissing, fangs-out rage; when they stimulate another, it will silently stalk a hallucinatory mouse. Still another circuit primes a male cat for a hostile confrontation with another male. Similar systems for rage, predatory seeking, and male-male aggression may be found in Homo sapiens, together with uniquely human, cognitively-driven systems of aggression such as political and religious ideologies and moralistic punishment. Today, even the uniquely human systems can be investigated using functional neuroimaging. So neuroscience has given us the crucial starting point in understanding violence, namely that it is not a single thing. And it has helped us to discover biologically realistic taxonomies of the major motives for violence.

COOK: Is the general trend toward less violence going to continue in the future?
PINKER: It depends. In the arena of custom and institutional practices, it’s a good bet. I suspect that violence against women, the criminalization of homosexuality, the use of capital punishment, the callous treatment of animals on farms, corporal punishment of children, and other violent social practices will continue to decline, based on the fact that worldwide moralistic shaming movements in the past (such as those against slavery, whaling, piracy, and punitive torture) have been effective over long stretches of time. I also don’t expect war between developed countries to make a comeback any time soon. But civil wars, terrorist acts, government repression, and genocides in backward parts of the world are simply too capricious to allow predictions. With six billion people in the world, there’s no predicting what some cunning fanatic or narcissistic despot might do.

Read more of the interview here.

Image courtesy of Scientific American.
