MondayPoem: When the World Ended as We Knew It

Joy Harjo is an acclaimed poet, musician and noted teacher. Her poetry is grounded in the United States’ Southwest and often encompasses Native American stories and values.

As Poetry Foundation remarks:

Consistently praised for the depth and thematic concerns in her writings, Harjo has emerged as a major figure in contemporary American poetry.

She once commented, “I feel strongly that I have a responsibility to all the sources that I am: to all past and future ancestors, to my home country, to all places that I touch down on and that are myself, to all voices, all women, all of my tribe, all people, all earth, and beyond that to all beginnings and endings. In a strange kind of sense [writing] frees me to believe in myself, to be able to speak, to have voice, because I have to; it is my survival.” Harjo’s work is largely autobiographical, informed by her love of the natural world and her preoccupation with survival and the limitations of language.

By Joy Harjo

– When the World Ended as We Knew It

We were dreaming on an occupied island at the farthest edge
of a trembling nation when it went down.

Two towers rose up from the east island of commerce and touched
the sky. Men walked on the moon. Oil was sucked dry
by two brothers. Then it went down. Swallowed
by a fire dragon, by oil and fear.
Eaten whole.

It was coming.

We had been watching since the eve of the missionaries in their
long and solemn clothes, to see what would happen.

We saw it
from the kitchen window over the sink
as we made coffee, cooked rice and
potatoes, enough for an army.

We saw it all, as we changed diapers and fed
the babies. We saw it,
through the branches
of the knowledgeable tree
through the snags of stars, through
the sun and storms from our knees
as we bathed and washed
the floors.

The conference of the birds warned us, as they flew over
destroyers in the harbor, parked there since the first takeover.
It was by their song and talk we knew when to rise
when to look out the window
to the commotion going on—
the magnetic field thrown off by grief.

We heard it.
The racket in every corner of the world. As
the hunger for war rose up in those who would steal to be president
to be king or emperor, to own the trees, stones, and everything
else that moved about the earth, inside the earth
and above it.

We knew it was coming, tasted the winds who gathered intelligence
from each leaf and flower, from every mountain, sea
and desert, from every prayer and song all over this tiny universe
floating in the skies of infinite
being.

And then it was over, this world we had grown to love
for its sweet grasses, for the many-colored horses
and fishes, for the shimmering possibilities
while dreaming.

But then there were the seeds to plant and the babies
who needed milk and comforting, and someone
picked up a guitar or ukulele from the rubble
and began to sing about the light flutter
the kick beneath the skin of the earth
we felt there, beneath us

a warm animal
a song being born between the legs of her;
a poem.

[div class=attrib]Image courtesy of PBS.[end-div]

The Hideous Sound of Chalk on a Blackboard

We promise. There is no screeching embedded audio of someone slowly dragging a piece of chalk, or worse, fingernails, across a blackboard! Though even the thought of this sound causes many to shudder. Why? A plausible explanation comes via Wired UK.

[div class=attrib]From Wired:[end-div]

Much time has been spent, over the past century, on working out exactly what it is about the sound of fingernails on a blackboard that’s so unpleasant. A new study pins the blame on psychology and the design of our ear canals.

Previous research on the subject suggested that the sound is acoustically similar to the warning call of a primate, but that theory was debunked after monkeys responded to amplitude-matched white noise and other high-pitched sounds, whereas humans did not. Another study, in 1986, manipulated a recording of blackboard scraping and found that the medium-pitched frequencies are the source of the adverse reaction, rather than the higher pitches (as previously thought). The work won author Randolph Blake an Ig Nobel Prize in 2006.

The latest study, conducted by musicologists Michael Oehler of the Macromedia University for Media and Communication in Cologne, Germany, and Christoph Reuter of the University of Vienna, looked at other sounds that generate a similar reaction — including chalk on slate, styrofoam squeaks, a plate being scraped by a fork, and the ol’ fingernails on blackboard.

Some participants were told the genuine source of the sound, and others were told that the sounds were part of a contemporary music composition. Researchers asked the participants to rank which were the worst, and also monitored physical indicators of distress — heart rate, blood pressure and the electrical conductivity of skin.

They found that disturbing sounds do cause a measurable physical reaction, with skin conductivity changing significantly, and that the frequencies involved with unpleasant sounds also lie firmly within the range of human speech — between 2,000 and 4,000 Hz. Removing those frequencies from the sound made them much easier to listen to. But, interestingly, removing the noisy, scraping part of the sound made little difference.

A powerful psychological component was identified. If the listeners knew that the sound was fingernails on the chalkboard, they rated it as more unpleasant than if they were told it was from a musical composition. Even when they thought it was from music, however, their skin conductivity still changed consistently, suggesting that the physical part of the response remained.
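For the technically curious: “removing those frequencies” is, in signal-processing terms, a band-stop (notch) filter. The sketch below, in C, is purely illustrative and not the researchers’ method; it uses the standard Audio EQ Cookbook biquad coefficients, and the sample rate, center frequency and Q are placeholder values chosen to blanket the 2,000-4,000 Hz band.

    #include <math.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    /* Second-order (biquad) notch filter, Direct Form I, with
       coefficients from the Audio EQ Cookbook formulas. */
    typedef struct { double b0, b1, b2, a1, a2, x1, x2, y1, y2; } Biquad;

    void notch_init(Biquad *f, double fs, double f0, double q) {
        double w0 = 2.0 * M_PI * f0 / fs;   /* center frequency in radians */
        double alpha = sin(w0) / (2.0 * q); /* sets the notch bandwidth    */
        double a0 = 1.0 + alpha;            /* normalization term          */
        f->b0 = 1.0 / a0;
        f->b1 = -2.0 * cos(w0) / a0;
        f->b2 = 1.0 / a0;
        f->a1 = -2.0 * cos(w0) / a0;
        f->a2 = (1.0 - alpha) / a0;
        f->x1 = f->x2 = f->y1 = f->y2 = 0.0;
    }

    /* Push one audio sample through the filter. */
    double notch_process(Biquad *f, double x) {
        double y = f->b0 * x + f->b1 * f->x1 + f->b2 * f->x2
                 - f->a1 * f->y1 - f->a2 * f->y2;
        f->x2 = f->x1; f->x1 = x;
        f->y2 = f->y1; f->y1 = y;
        return y;
    }

    int main(void) {
        Biquad f;
        /* 44.1 kHz audio; a wide notch near 2.8 kHz (the geometric mean
           of 2 and 4 kHz) attenuates the band the study implicates.    */
        notch_init(&f, 44100.0, 2800.0, 0.7);
        for (int i = 0; i < 8; i++)
            printf("%f\n", notch_process(&f, (i % 2) ? 1.0 : -1.0));
        return 0;
    }

A single biquad gives a fairly gentle notch; real audio software would cascade several, but the principle is the same.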

[div class=attrib]Read the full article here.[end-div]

[div class=attrib]Images courtesy of Wired / Flickr.[end-div]

Lights That You Can Print

The lowly incandescent light bulb continues to come under increasing threat. First came the fluorescent tube, then the compact fluorescent. More recently the LED (light emitting diode) seems to be gaining ground. Now LED technology takes another leap forward with printed LED “light sheets”.

[div class=attrib]From Technology Review:[end-div]

A company called Nth Degree Technologies hopes to replace light bulbs with what look like glowing sheets of paper (as shown in this video). The company’s first commercial product is a two-by-four-foot light, which it plans to start shipping to select customers for evaluation by the end of the year.

The technology could allow for novel lighting designs at costs comparable to the fluorescent light bulbs and fixtures used now, says Neil Shotton, Nth Degree’s president and CEO. Light could be emitted over large areas from curved surfaces of unusual shapes. The printing processes used to make the lights also make it easy to vary the color and brightness of the light emitted by a fixture. “It’s a new kind of lighting,” Shotton says.

Nth Degree makes its light sheets by first carving up a wafer of gallium nitride to produce millions of tiny LEDs—one four-inch wafer yields about eight million of them. The LEDs are then mixed with resin and binders, and a standard screen printer is used to deposit the resulting “ink” over a large surface.

In addition to the LED ink, there’s a layer of silver ink for the back electrical contact, a layer of phosphors to change the color of light emitted by the LEDs (from blue to various shades of white), and an insulating layer to prevent short circuits between the front and back. The front electrical contact, which needs to be transparent to let the light out, is made using an ink that contains invisibly small metal wires.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Technology Review.[end-div]

The Battle of Evidence and Science versus Belief and Magic

An insightful article over at the Smithsonian ponders the national (U.S.) decline in the trust of science. Regardless of the topic in question — climate change, health supplements, vaccinations, air pollution, “fracking”, evolution — and regardless of the specific position on a particular topic, scientific evidence continues to be questioned, ignored, revised, and politicized. And perhaps it is in this last issue, that of politics, that we may see a possible cause for a growing national epidemic of denialism. The increasingly fractured, fractious and rancorous nature of the U.S. political system threatens to undermine all debate and true skepticism, whether based on personal opinion or scientific fact.

[div class=attrib]From the Smithsonian:[end-div]

A group of scientists and statisticians led by the University of California at Berkeley set out recently to conduct an independent assessment of climate data and determine once and for all whether the planet has warmed in the last century and by how much. The study was designed to address concerns brought up by prominent climate change skeptics, and it was funded by several groups known for climate skepticism. Last week, the group released its conclusions: Average land temperatures have risen by about 1.8 degrees Fahrenheit since the middle of the 20th century. The result matched the previous research.

The skeptics were not happy and immediately claimed that the study was flawed.

Also in the news last week were the results of yet another study that found no link between cell phones and brain cancer. Researchers at the Institute of Cancer Epidemiology in Denmark looked at data from 350,000 cell phone users over an 18-year period and found they were no more likely to develop brain cancer than people who didn’t use the technology.

But those results still haven’t killed the calls for more monitoring of any potential link.

Study after study finds no link between autism and vaccines (and plenty of reason to worry about non-vaccinated children dying from preventable diseases such as measles). But a quarter of parents in a poll released last year said that they believed that “some vaccines cause autism in healthy children” and 11.5 percent had refused at least one vaccination for their child.

Polls say that Americans trust scientists more than, say, politicians, but that trust is on the decline. If we’re losing faith in science, we’ve gone down the wrong path. Science is no more than a process (as recent contributors to our “Why I Like Science” series have noted), and skepticism can be a good thing. But for many people that skepticism has grown to the point that they can no longer accept good evidence when they get it, with the result that “we’re now in an epidemic of fear like one I’ve never seen and hope never to see again,” says Michael Specter, author of Denialism, in his TEDTalk below.

If you’re reading this, there’s a good chance that you think I’m not talking about you. But here’s a quick question: Do you take vitamins? There’s a growing body of evidence that vitamins and dietary supplements are no more than a placebo at best and, in some cases, can actually increase the risk of disease or death. For example, a study earlier this month in the Archives of Internal Medicine found that consumption of supplements, such as iron and copper, was associated with an increased risk of death among older women. In a related commentary, several doctors note that the concept of dietary supplementation has shifted from preventing deficiency (there’s a good deal of evidence for harm if you’re low in, say, folic acid) to one of trying to promote wellness and prevent disease, and many studies are showing that more supplements do not equal better health.

But I bet you’ll still take your pills tomorrow morning. Just in case.

[div class=attrib]Read the entire article here.[end-div]

Texi as the Plural for Texas?

Imagine more than one state of Texas. Or, imagine the division of Texas into a handful of sub-states smaller in size and perhaps more manageable. Frank Jacobs over at Strange Maps ponders a United States where there could be more than one Texas.

[div class=attrib]From Strange Maps:[end-div]

The plural of Texas? My money’s on Texases, even though that sounds almost as wrong as Texae, Texi or whatever alternative you might try to think up. Texas is defiantly singular. It is the Lone Star State, priding itself on its brief independence and distinct culture. Discounting Alaska, it is also the largest state in the Union.

Texas is both a maverick and a behemoth, and as much a claimant to exceptionalism within the US as America itself is on the world stage. Texans are superlative Americans. When other countries reach for an American archetype to caricature (or to demonise), it’s often one they imagine having a Texan drawl: the greedy oil baron, the fundamentalist preacher, the trigger-happy cowboy.

Texans will rightly object to being pigeonholed, but they probably won’t mind the implied reference to their tough-guy image. Nobody minds being provided with some room to swagger. See also the popularity of the slogan Don’t Mess With Texas, the state’s unofficial motto. It is less historical than it sounds, beginning life only in 1986 as the tagline of an anti-littering campaign.

You’d have to be crazy to mess with a state that’s this big and fierce. In fact, you’d have to be Texas to mess with Texas. Really. That’s not just a clever put-down. It’s the law. When Texas joined the Union in 1845, voluntarily giving up its independence, it was granted the right by Congress to form “new States of convenient size, not exceeding four in number and in addition to the said State of Texas.”

This would increase the total number of Texases to five, and enhance their political weight – at least in the US Senate, which would have to make room for 10 Senators from all five states combined, as opposed to just the twosome that represents the single state of Texas now.

In 2009, the political blog FiveThirtyEight overlaid their plan on a county-level map of the Obama-McCain presidential election results (showing Texas to be overwhelmingly red, except for a band of blue along the Rio Grande). The five Texases are:

  • (New) Texas, comprising the Austin-San Antonio metropolitan area in central Texas;
  • Trinity, uniting Dallas, Fort Worth and Arlington;
  • Gulfland, along the coast and including Houston;
  • Plainland, from Lubbock all the way up the panhandle (with 40% of Texas’s territory, the largest successor state);
  • El Norte, south of the other states but north of Mexico, where most of the new state’s Hispanic majority (85 percent) would have its roots.

[div class=attrib]Read the entire article here.[end-div]

A Better Way to Study and Learn

Our current educational process in one sentence: assume student is empty vessel; provide student with content; reward student for remembering and regurgitating content; repeat.

Yet, we have known for a while, and an increasing body of research corroborates our belief, that this method of teaching and learning is not very effective, or stimulating for that matter. It’s simply an efficient mechanism for the mass production of an adequate resource for the job market. Of course, for most it then takes many more decades following high school or college to unlearn the rote trivia and re-learn what is really important.

Mind Hacks reviews some recent studies that highlight better approaches to studying.

[div class=attrib]From Mind Hacks:[end-div]

Decades-old research into how memory works should have revolutionised University teaching. It didn’t.

If you’re a student, what I’m about to tell you will let you change how you study so that it is more effective, more enjoyable and easier. If you work at a University, you – like me – should hang your head in shame that we’ve known this for decades but still teach the way we do.

There’s a dangerous idea in education that students are receptacles, and teachers are responsible for providing content that fills them up. This model encourages us to test students by the amount of content they can regurgitate, to focus overly on statements rather than skills in assessment and on syllabuses rather than values in teaching. It also encourages us to believe that we should try and learn things by trying to remember them. Sounds plausible, perhaps, but there’s a problem. Research into the psychology of memory shows that intention to remember is a very minor factor in whether you remember something or not. Far more important than whether you want to remember something is how you think about the material when you encounter it.

A classic experiment by Hyde and Jenkins (1973) illustrates this. These researchers gave participants lists of words, which they later tested recall of, as their memory items. To affect their thinking about the words, half the participants were told to rate the pleasantness of each word, and half were told to check if the word contained the letters ‘e’ or ‘g’. This manipulation was designed to affect ‘depth of processing’. The participants in the rating-pleasantness condition had to think about what the word meant, and relate it to themselves (how they felt about it) – “deep processing”. Participants in the letter-checking condition just had to look at the shape of the letters, they didn’t even have to read the word if they didn’t want to – “shallow processing”. The second, independent, manipulation concerned whether participants knew that they would be tested later on the words. Half of each group were told this – the “intentional learning” condition – and half weren’t told, the test would come as a surprise – the “incidental learning” condition.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of the Telegraph / AP.[end-div]

The Middleman is Dead; Long Live the Middleman

In another sign of Amazon’s unquenchable thirst for all things commerce, the company is now moving more aggressively into publishing.

[div class=attrib]From the New York Times:[end-div]

Amazon.com has taught readers that they do not need bookstores. Now it is encouraging writers to cast aside their publishers.

Amazon will publish 122 books this fall in an array of genres, in both physical and e-book form. It is a striking acceleration of the retailer’s fledgling publishing program that will place Amazon squarely in competition with the New York houses that are also its most prominent suppliers.

It has set up a flagship line run by a publishing veteran, Laurence Kirshbaum, to bring out brand-name fiction and nonfiction. It signed its first deal with the self-help author Tim Ferriss. Last week it announced a memoir by the actress and director Penny Marshall, for which it paid $800,000, a person with direct knowledge of the deal said.

Publishers say Amazon is aggressively wooing some of their top authors. And the company is gnawing away at the services that publishers, critics and agents used to provide.

Several large publishers declined to speak on the record about Amazon’s efforts. “Publishers are terrified and don’t know what to do,” said Dennis Loy Johnson of Melville House, who is known for speaking his mind.

“Everyone’s afraid of Amazon,” said Richard Curtis, a longtime agent who is also an e-book publisher. “If you’re a bookstore, Amazon has been in competition with you for some time. If you’re a publisher, one day you wake up and Amazon is competing with you too. And if you’re an agent, Amazon may be stealing your lunch because it is offering authors the opportunity to publish directly and cut you out.”

[div class=attrib]Read more here.[end-div]

The World Wide Web of Terrorism

[div class=attrib]From Eurozine:[end-div]

There are clear signs that Internet radicalization was behind the terrorism of Anders Behring Breivik. Though most research on the subject focuses on jihadism, it can teach us a lot about how Internet radicalization of all kinds can be fought.

On 21 September 2010, Interpol released a press statement on their homepage warning against extremist websites. They pointed out that this is a global threat and that ever more terrorist groups use the Internet to radicalize young people.

“Terrorist recruiters exploit the web to their full advantage as they target young, middle class vulnerable individuals who are usually not on the radar of law enforcement”, said Secretary General Ronald K. Noble. He continued: “The threat is global; it is virtual; and it is on our doorsteps. It is a global threat that only international police networks can fully address.”

Noble pointed out that the Internet has made the radicalization process easier and the war on terror more difficult. Part of the reason, he claimed, is that much of what takes place is not really criminal.

Much research has been done on Internet radicalization over the last few years but the emphasis has been on Islamist terror. The phenomenon can be summarized thus: young boys and men of Muslim background have, via the Internet, been exposed to propaganda, films from war zones, horrifying images of war in Afghanistan, Iraq and Chechnya, and also extreme interpretations of Islam. They are, so to speak, caught in the web, and some have resorted to terrorism, or at least planned it. The BBC documentary Generation Jihad gives an interesting and frightening insight into the phenomenon.

Researchers Tim Stevens and Peter Neumann write in a report focused on Islamist Internet radicalization that Islamist groups are hardly unique in putting the Internet in the service of political extremism:
Although Al Qaeda-inspired Islamist militants represented the most significant terrorist threat to the United Kingdom at the time of writing, Islamist militants are not the only – or even the predominant – group of political extremists engaged in radicalization and recruitment on the internet. Visitor numbers are notoriously difficult to verify, but some of the most popular Islamist militant web forums (for example, Al Ekhlaas, Al Hesbah, or Al Boraq) are easily rivalled in popularity by white supremacist websites such as Stormfront.

Strikingly, Stormfront – an international Internet forum advocating “white nationalism” and dominated by neo-Nazis – is one of the websites visited by the terrorist Anders Behring Breivik, and a forum where he also left comments. In one place he writes of his hope that “the various fractured rightwing movements in Europe and the US” can “reach a common consensus regarding the ‘Islamification of Europe/US’”. He continues: “After all, we all want the best for our people, and we owe it to them to try to create the most potent alliance which will have the strength to overthrow the governments which support multiculturalism.”

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Image courtesy of Eurozine.[end-div]

Corporations as People and the Threat to Truth

In 2010 the U.S. Supreme Court ruled that corporations can be treated as people, assigning companies First Amendment rights under the Constitution. So, it’s probably only a matter of time before a real person legally marries (and divorces) a corporation. And, we’re probably not too far from a future where an American corporate CEO can take the life of a competing company’s boss and “rightfully” declare that it was in competitive self-defense.

In the meantime, the growing, and much needed, debate over corporate power, corporate responsibility and corporate consciousness rolls on. A timely opinion by Gary Gutting over at the New York Times, gives us more on which to chew.

[div class=attrib]From the New York Times:[end-div]

The Occupy Wall Street protest movement has raised serious questions about the role of capitalist institutions, particularly corporations, in our society. Well before the first protester set foot in Zuccotti Park, a heckler urged Mitt Romney to tax corporations rather than people. Romney’s response — “Corporations are people” — stirred a brief but intense controversy. Now thousands of demonstrators have in effect joined the heckler, denouncing corporations as “enemies of the people.”

Who’s right? Thinking pedantically, we can see ways in which Romney was literally correct; for example, corporations are nothing other than the people who own, run and work for them, and they are recognized as “persons” in some technical legal sense.  But it is also obvious that corporations are not people in a full moral sense: they cannot, for example, fall in love, write poetry or be depressed.

Far more important than questions about what corporations are (ontological questions, as philosophers say) is the question of what attitude we should have toward them.  Should we, as corporate public relations statements often suggest, think of them as friends (if we buy and are satisfied with their products) or as family (if we work for them)?  Does it make sense to be loyal to a corporation as either a customer or as an employee?  More generally, even granted that corporations are not fully persons in the way that individuals are, do they have some important moral standing in our society?

My answer to all these questions is no, because corporations have no core dedication to fundamental human values.  (To be clear, I am speaking primarily of large, for-profit, publicly owned corporations.)  Such corporations exist as instruments of profit for their shareholders.  This does not mean that they are inevitably evil or that they do not make essential economic contributions to society.  But it does mean that their moral and social value is entirely instrumental.   There are ways we can use corporations as means to achieve fundamental human values, but corporations do not of themselves work for these values. In fact, left to themselves, they can be serious threats to human values that conflict with the goal of corporate profit.

Corporations are a particular threat to truth, a value essential in a democracy, which places a premium on the informed decisions of individual citizens.  The corporate threat is most apparent in advertising, which explicitly aims at convincing us to prefer a product regardless of its actual merit.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Time Saving Truth from Falsehood and Envy by François Lemoyne. Image courtesy of Wikipedia / Wallace Collection, London.[end-div]

The Myth of Bottled Water

In 2010 the world spent around $50 billion on bottled water, with over a third accounted for by the United States alone. During this period the United States House of Representatives spent $860,000 on bottled water for its 435 members. This is close to $2,000 per person per year. (Figures according to Corporate Accountability International).
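For the record, the per-member arithmetic behind that figure:

$$ \frac{\$860{,}000}{435\ \text{members}} \approx \$1{,}977\ \text{per member per year} $$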

This is despite the fact that on average bottled water costs around 1,900 times more than its cheaper, less glamorous sibling — tap water. Bottled water has become a truly big business even though science shows no discernible benefit of bottled water over that from the faucet. In fact, around 40 percent of bottled water comes from municipal water supplies anyway.

In 2007 Charles Fishman wrote a ground-breaking cover story on the bottled water industry for Fast Company. We excerpt part of the article, Message in a Bottle, below.

[div class=attrib]By Charles Fishman:[end-div]

The largest bottled-water factory in North America is located on the outskirts of Hollis, Maine. In the back of the plant stretches the staging area for finished product: 24 million bottles of Poland Spring water. As far as the eye can see, there are double-stacked pallets packed with half-pint bottles, half-liters, liters, “Aquapods” for school lunches, and 2.5-gallon jugs for the refrigerator.

Really, it is a lake of Poland Spring water, conveniently celled off in plastic, extending across 6 acres, 8 feet high. A week ago, the lake was still underground; within five days, it will all be gone, to supermarkets and convenience stores across the Northeast, replaced by another lake’s worth of bottles.

Looking at the piles of water, you can have only one thought: Americans sure are thirsty.

Bottled water has become the indispensable prop in our lives and our culture. It starts the day in lunch boxes; it goes to every meeting, lecture hall, and soccer match; it’s in our cubicles at work; in the cup holder of the treadmill at the gym; and it’s rattling around half-finished on the floor of every minivan in America. Fiji Water shows up on the ABC show Brothers & Sisters; Poland Spring cameos routinely on NBC’s The Office. Every hotel room offers bottled water for sale, alongside the increasingly ignored ice bucket and drinking glasses. At Whole Foods, the upscale emporium of the organic and exotic, bottled water is the number-one item by units sold.

Thirty years ago, bottled water barely existed as a business in the United States. Last year, we spent more on Poland Spring, Fiji Water, Evian, Aquafina, and Dasani than we spent on iPods or movie tickets–$15 billion. It will be $16 billion this year.

Bottled water is the food phenomenon of our times. We–a generation raised on tap water and water fountains–drink a billion bottles of water a week, and we’re raising a generation that views tap water with disdain and water fountains with suspicion. We’ve come to pay good money–two or three or four times the cost of gasoline–for a product we have always gotten, and can still get, for free, from taps in our homes.

When we buy a bottle of water, what we’re often buying is the bottle itself, as much as the water. We’re buying the convenience–a bottle at the 7-Eleven isn’t the same product as tap water, any more than a cup of coffee at Starbucks is the same as a cup of coffee from the Krups machine on your kitchen counter. And we’re buying the artful story the water companies tell us about the water: where it comes from, how healthy it is, what it says about us. Surely among the choices we can make, bottled water isn’t just good, it’s positively virtuous.

Except for this: Bottled water is often simply an indulgence, and despite the stories we tell ourselves, it is not a benign indulgence. We’re moving 1 billion bottles of water around a week in ships, trains, and trucks in the United States alone. That’s a weekly convoy equivalent to 37,800 18-wheelers delivering water. (Water weighs 8 1/3 pounds a gallon. It’s so heavy you can’t fill an 18-wheeler with bottled water–you have to leave empty space.)

Meanwhile, one out of six people in the world has no dependable, safe drinking water. The global economy has contrived to deny the most fundamental element of life to 1 billion people, while delivering to us an array of water “varieties” from around the globe, not one of which we actually need. That tension is only complicated by the fact that if we suddenly decided not to purchase the lake of Poland Spring water in Hollis, Maine, none of that water would find its way to people who really are thirsty.

[div class=attrib]Please read the entire article here.[end-div]

[div class=attrib]Image courtesy of Wikipedia.[end-div]

Brokering the Cloud

Computer hardware reached (or plummeted to, depending upon your viewpoint) the level of a commodity a while ago. And of course, some operating system platforms, software and applications have followed suit more recently — think Platform as a Service (PaaS) and Software as a Service (SaaS). So, it should come as no surprise to see new services arise that try to match supply and demand, and profit in the process. Welcome to the “cloud brokerage”.

[div class=attrib]From MIT Technology Review:[end-div]

Cloud computing has already made accessing computer power more efficient. Instead of buying computers, companies can now run websites or software by leasing time at data centers run by vendors like Amazon or Microsoft. The idea behind cloud brokerages is to take the efficiency of cloud computing a step further by creating a global marketplace where computing capacity can be bought and sold at auction.

Such markets offer steeply discounted rates, and they may also offer financial benefits to companies running cloud data centers, some of which are flush with excess capacity. “The more utilized you are as a [cloud services] provider … the faster return on investment you’ll realize on your hardware,” says Reuven Cohen, founder of Enomaly, a Toronto-based firm that last February launched SpotCloud, cloud computing’s first online spot market.

On SpotCloud, computing power can be bought and sold like coffee, soybeans, or any other commodity. But it’s caveat emptor for buyers, since unlike purchasing computer time with Microsoft, buying on SpotCloud doesn’t offer many contractual guarantees. There is no assurance computers won’t suffer an outage, and sellers can even opt to conceal their identity in a blind auction, so buyers don’t always know whether they’re purchasing capacity from an established vendor or a fly-by-night startup.
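To make the auction mechanics concrete, here is a toy sketch in C of the matching step at the heart of any spot market: the buyer takes the cheapest offer that can cover its demand. This is a hypothetical illustration, not SpotCloud’s actual interface; every name and number is invented.

    #include <stdio.h>
    #include <stddef.h>

    /* One seller's listing on a toy spot market. In a blind auction
       the seller's identity may simply read "anonymous". */
    typedef struct {
        const char *seller;
        double price;          /* dollars per instance-hour */
        int hours_available;
    } Offer;

    /* Return the cheapest offer that can cover the demand, or NULL. */
    const Offer *cheapest_covering(const Offer *offers, size_t n,
                                   int hours_needed) {
        const Offer *best = NULL;
        for (size_t i = 0; i < n; i++) {
            if (offers[i].hours_available < hours_needed)
                continue;                      /* cannot fill the order */
            if (best == NULL || offers[i].price < best->price)
                best = &offers[i];
        }
        return best;
    }

    int main(void) {
        Offer book[] = {
            { "established vendor", 0.12, 500 },
            { "anonymous",          0.04, 200 },  /* cheap, but too small */
            { "anonymous",          0.07, 800 },
        };
        const Offer *win = cheapest_covering(book,
                                             sizeof book / sizeof book[0],
                                             300);
        if (win)
            printf("matched: %s at $%.2f per hour\n", win->seller, win->price);
        return 0;
    }

Note that the winning bid here comes from an anonymous seller, which is exactly the caveat-emptor wrinkle described above.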

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image courtesy of MIT Technology Review.[end-div]

In Praise of the Bad Bookstore

Tens of thousands of independent bookstores have disappeared from the United States and Europe over the last decade. Even mega-chains like Borders have fallen prey to monumental shifts in the distribution of ideas and content. The very notion of the physical book is under increasing threat from the accelerating momentum of digitalization.

For bibliophiles, particularly those who crave the feel of physical paper, there is a peculiar attractiveness even to the “bad” bookstore or bookshop (in the UK): the airport bookshop of last resort, the pulp fiction bookstore in a suburban mall. Mark O’Connell over at The Millions tells us there is no such thing as a bad bookstore.

[div class=attrib]From The Millions:[end-div]

Cultural anxieties are currently running high about the future of the book as a physical object, and about the immediate prospects for survival of actual brick and mortar booksellers. When most people think about the (by now very real) possibility of the retail side of the book business disappearing entirely into the online ether, they mostly tend to focus on the idea of their favorite bookshops shutting their doors for the last time. Sub-Borgesian bibliomaniac that I am (or, if you prefer, pathetic nerd), I have a mental image of the perfect bookshop that I hold in my mind. It’s a sort of Platonic ideal of the retail environment, a perfect confluence of impeccable curation and expansive selection, artfully cluttered and with the kind of quietly hospitable ambiance that makes the passage of time seem irrelevant once you start in on browsing the shelves. For me, the actual place that comes closest to embodying this ideal is the London Review Bookshop in Bloomsbury, run by the people behind the London Review of Books. It’s a beautifully laid-out space in a beautiful building, and its selection of books makes it feel less like an actual shop than the personal library of some extremely wealthy and exceptionally well-read individual. It’s the kind of place, in other words, where you don’t so much want to buy half the books in the shop as buy the shop itself, move in straight away and start living in it. The notion that places like this might no longer exist in a decade or so is depressing beyond measure.

But I don’t live in Bloomsbury, or anywhere near it. I live in a suburb of Dublin where the only bookshop within any kind of plausible walking distance is a small and frankly feeble set-up on the second floor of a grim 1970s-era shopping center, above a large supermarket. It’s flanked by two equally moribund concerns, a small record store and a travel agent, thereby forming the centerpiece of a sad triptych of retail obsolescence. It’s one of those places that makes you wonder how it manages to survive at all.

But I have an odd fondness for it anyway, and I’ll occasionally just wander up there in order to get out of the apartment, or to see whether, through some fluke convergence of whim and circumstance, they have something I might actually want to buy. I’ve often bought books there that I would never have thought to pick up in a better bookshop, gravitating toward them purely by virtue of the fact that there’s nothing else remotely interesting to be had.

And this brings me to the point I want to make about bad bookshops, which is that they’re rarely actually as bad as they seem. In a narrow and counterintuitive sense, they’re sometimes better than good bookshops. The way I see it, there are three basic categories of retail bookseller. There’s the vast warehouse that has absolutely everything you could possibly think of (Strand Bookstore in New York’s East Village, for instance, is a fairly extreme representative of this group, or at least it was the last time I was there ten years ago). Then there’s the “boutique” bookshop, where you get a sense of a strong curatorial presence behind the scenes, and which seems to cater for some aspirational ideal of your better intellectual self. The London Review Bookshop is, for me at least, the ultimate instance of this. And then there’s the third — and by far the largest — category, which is the rubbish bookshop. There are lots of subgenii to this grouping. The suburban shopping center fiasco, as discussed above. The chain outlet crammed with celebrity biographies and supernatural teen romances. The opportunistic fly-by-night operation that takes advantage of some short-term lease opening to sell off a random selection of remaindered titles at low prices before shutting down and moving elsewhere. And, of course, the airport bookshop of last resort.

[div class=attrib]Catch more of this essay here.[end-div]

[div class=attrib]Image courtesy of The Millions.[end-div]

Book Review: The Big Thirst, by Charles Fishman

Charles Fishman has a fascinating new book entitled The Big Thirst: The Secret Life and Turbulent Future of Water. In it Fishman examines the origins of water on our planet and postulates an all too probable future in which water becomes an increasingly limited and precious resource.

[div class=attrib]A brief excerpt from a recent interview, courtesy of NPR:[end-div]

For most of us, even the most basic questions about water turn out to be stumpers.

Where did the water on Earth come from?

Is water still being created or added somehow?

How old is the water coming out of the kitchen faucet?

For that matter, how did the water get to the kitchen faucet?

And when we flush, where does the water in the toilet actually go?

The things we think we know about water — things we might have learned in school — often turn out to be myths.

We think of Earth as a watery planet, indeed, we call it the Blue Planet; but for all of water’s power in shaping our world, Earth turns out to be surprisingly dry. A little water goes a long way.

We think of space as not just cold and dark and empty, but as barren of water. In fact, space is pretty wet. Cosmic water is quite common.

At the most personal level, there is a bit of bad news. Not only don’t you need to drink eight glasses of water every day, you cannot in any way make your complexion more youthful by drinking water. Your body’s water-balance mechanisms are tuned with the precision of a digital chemistry lab, and you cannot possibly “hydrate” your skin from the inside by drinking an extra bottle or two of Perrier. You just end up with pee sourced in France.

In short, we know nothing of the life of water — nothing of the life of the water inside us, around us, or beyond us. But it’s a great story — captivating and urgent, surprising and funny and haunting. And if we’re going to master our relationship to water in the next few decades — really, if we’re going to remaster our relationship to water — we need to understand the life of water itself.

[div class=attrib]Read more of this article and Charles Fishman’s interview with NPR here.[end-div]

Science at its Best: The Universe is Expanding AND Accelerating

The 2011 Nobel Prize in Physics was recently awarded to three scientists: Adam Riess, Saul Perlmutter and Brian Schmidt. Their computations and observations of a very specific type of exploding star upended decades of commonly accepted beliefs about our universe by showing that its expansion is accelerating.

Prior to their observations, first publicly articulated in 1998, general scientific consensus held that the expansion of the universe was slowing: the cosmos would either coast outward forever or eventually halt and fold back in on itself in a cosmic Big Crunch.

The discovery by Riess, Perlmutter and Schmidt laid the groundwork for the idea that a mysterious force called “dark energy” is fueling the acceleration. This dark energy is now believed to make up 75 percent of the universe. Direct evidence of dark energy is lacking, but most cosmologists now accept that universal expansion is indeed accelerating.

Re-published here are the notes and a page scan from Riess’s logbook that led to this year’s Nobel Prize, which show the value of the scientific process:

[div class=attrib]The original article is courtesy of Symmetry Breaking:[end-div]

In the fall of 1997, I was leading the calibration and analysis of data gathered by the High-z Supernova Search Team, one of two teams of scientists—the other was the Supernova Cosmology Project—trying to determine the fate of our universe: Will it expand forever, or will it halt and contract, resulting in the Big Crunch?

To find the answer, we had to determine the mass of the universe. It can be calculated by measuring how much the expansion of the universe is slowing.

First, we had to find cosmic candles—distant objects of known brightness—and use them as yardsticks. On this page, I checked the reliability of the supernovae, or exploding stars, that we had collected to serve as our candles. I found that the results they yielded for the present expansion rate of the universe (known as the Hubble constant) did not appear to be affected by the age or dustiness of their host galaxies.

Next, I used the data to calculate ΩM, the relative mass of the universe.

It was significantly negative!

The result, if correct, meant that the assumption of my analysis was wrong. The expansion of the universe was not slowing. It was speeding up! How could that be?

I spent the next few days checking my calculation. I found one could explain the acceleration by introducing a vacuum energy, also called the cosmological constant, that pushes the universe apart. In March 1998, we submitted these results, which were published in September 1998.

Today, we know that 74 percent of the universe consists of this dark energy. Understanding its nature remains one of the most pressing tasks for physicists and astronomers alike.

Adam Riess, Johns Hopkins University
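A gloss on the logic above (ours, not Riess’s logbook): for a universe containing matter and a cosmological constant, the present deceleration parameter is

$$ q_0 = \frac{\Omega_M}{2} - \Omega_\Lambda $$

A matter-only analysis assumes $\Omega_\Lambda = 0$, so the only way a fit can accommodate an accelerating expansion ($q_0 < 0$) is to drive $\Omega_M$ negative, which is unphysical. Allowing $\Omega_\Lambda > 0$ restores a sensible, positive $\Omega_M$ while keeping $q_0$ negative.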

The discovery, and many others like it both great and small, show the true power of the scientific process. Scientific results are open to constant refinement, re-evaluation, refutation and re-interpretation. The process leads to inexorable progress towards greater and greater knowledge and understanding, and eventually to truth that most skeptics can embrace. That is, until the next and better theory and corresponding results come along.

[div class=attrib]Image courtesy of Symmetry Breaking, Adam Riess.[end-div]

MondayPoem: Water

This week, theDiagonal focuses its energies on that most precious of natural resources — water.

In his short poem “Water”, Ralph Waldo Emerson reminds us of its more fundamental qualities.

Emerson published his first book, Nature, in 1836, in which he outlined his transcendentalist philosophy. As Poetry Foundation elaborates:

His manifesto stated that the world consisted of Spirit (thought, ideas, moral laws, abstract truth, meaning itself ) and Nature (all of material reality, all that atoms comprise); it held that the former, which is timeless, is the absolute cause of the latter, which serves in turn to express Spirit, in a medium of time and space, to the senses. In other words, the objective, physical world—what Emerson called the “Not-Me”—is symbolic and exists for no other purpose than to acquaint human beings with its complement—the subjective, ideational world, identified with the conscious self and referred to in Emersonian counterpoint as the “Me.” Food, water, and air keep us alive, but the ultimate purpose for remaining alive is simply to possess the meanings of things, which by definition involves a translation of the attention from the physical fact to its spiritual value.

By Ralph Waldo Emerson

– Water

The water understands
Civilization well;
It wets my foot, but prettily,
It chills my life, but wittily,
It is not disconcerted,
It is not broken-hearted:
Well used, it decketh joy,
Adorneth, doubleth joy:
Ill used, it will destroy,
In perfect time and measure
With a face of golden pleasure
Elegantly destroy.

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

Greatest Literary Suicides

Hot on the heels of our look at literary deaths, we turn to the greatest suicides in literature. Although subject to personal taste and sensibility, the starter list excerpted below is a fine beginning, and leaves much to ponder.

[div class=attrib]From Flavorpill:[end-div]

1. Ophelia, Hamlet, William Shakespeare

Hamlet’s jilted lover Ophelia drowns in a stream surrounded by the flowers she had held in her arms. Though Ophelia’s death can be parsed as an accident, her growing madness and the fact that she was, as Gertrude says, “incapable of her own distress,” suggest otherwise. And as far as we’re concerned, Gertrude’s monologue about Ophelia’s drowning is one of the most beautiful descriptions of death in Shakespeare.

2. Anna Karenina, Anna Karenina, Leo Tolstoy

In an extremely dramatic move only befitting the emotional mess that is Anna Karenina, the heroine throws herself under a train in her despair, mirroring the novel’s early depiction of a railway worker’s death by similar means.

3. Cecilia Lisbon, The Virgin Suicides, Jeffrey Eugenides

Eugenides’ entire novel deserves to be on this list for its dreamy horror of five sisters killing themselves in the 1970s Michigan suburbs. But the death of the youngest, Cecilia, is the most brutal and distressing. Having failed to kill herself by cutting her wrists, she leaves her own party to throw herself from her bedroom window, landing impaled on the steel fence below.

4. Emma Bovary, Madame Bovary, Gustave Flaubert

In life, Emma Bovary wished for romance, for intrigue, to escape the banalities of her provincial life as a doctor’s wife. Hoping to expire gracefully, she eats a bowl of arsenic, but is punished by hours of indelicate and public suffering before she finally dies.

5. Edna Pontellier, The Awakening, Kate Chopin

This is the first suicide that many students experience in literature, and it is a strange and calm one: Edna simply walks into the water. We imagine the reality of drowning yourself would be much messier, but Chopin’s version is a relief, a cool compress against the pains of Edna’s psyche in beautiful, fluttering prose.

Rounding out the top ten, we have:

Lily Bart, The House of Mirth, Edith Wharton
Septimus Warren Smith, Mrs. Dalloway, Virginia Woolf
James O. Incandenza, Infinite Jest, David Foster Wallace
Romeo and Juliet, Romeo and Juliet, William Shakespeare
Inspector Javert, Les Misérables, Victor Hugo

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Ophelia by John Everett Millais (1829–1896). Image courtesy of Wikipedia / Creative Commons.[end-div]

How Many People Have Died?

Ever wonder how many people have gone before? The succinct infographic courtesy of Jon Gosier takes a good stab at answering the question. First, a few assumptions and explanations:

The numbers in this piece are speculative but are as accurate as modern research allows. It’s widely accepted that prior to 2002 there had been somewhere between 106 and 140 billion Homo sapiens born to the world. The graphic below uses the conservative number (106 bn) as the basis for a circle graph. The center dot represents how many people are currently living (red) versus the dead (white). The dashed vertical line shows how much time passed between milestones. The spectral graph immediately below this text illustrates the population ‘benchmarks’ that were used to estimate the population over time. Adding the population numbers gets you to 106 billion. The red sphere is then used to compare against other data.
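Taking the world population as roughly 7 billion (our assumption, at the time of writing), the red-to-white proportion works out to

$$ \frac{7 \times 10^{9}\ \text{alive}}{106 \times 10^{9}\ \text{ever born}} \approx 6.6\% $$

In other words, roughly one of every fifteen humans ever born is alive today.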

[div class=attrib]Check out the original here.[end-div]

Greatest Literary Deaths

Tim Lott over at the Guardian Book Blog wonders which are the most dramatic literary deaths — characters rather than novelists. Think Heathcliff in Emily Brontë’s Wuthering Heights.

[div class=attrib]From the Guardian:[end-div]

What makes for a great literary death scene? This is the question I and the other four judges of the 2012 Wellcome Trust book prize for medicine in literature have been pondering in advance of an event at the Cheltenham festival.

I find many famous death scenes more ludicrous than lachrymose. As with Oscar Wilde’s comment on the death of Dickens’s Little Nell, you would have to have a heart of stone not to laugh at the passing of the awful Tullivers in The Mill on the Floss, dragged down clutching one another as the river deliciously finishes them off. More consciously designed to wring laughter out of tragedy, the suicide of Ronald Nimkin in Roth’s Portnoy’s Complaint takes some beating, with Nimkin’s magnificent farewell note to his mother: “Mrs Blumenthal called. Please bring your mah-jongg rules to the game tonight.”

To write a genuinely moving death scene is a challenge for any author. The temptation to retreat into cliché is powerful. For me, the best and most affecting death is that of Harry “Rabbit” Angstrom in John Updike’s Rabbit at Rest. I remember my wife reading this to me out loud as I drove along a motorway. We were both in tears, as he says his farewell to his errant son, Nelson, and then runs out of words, and life itself – “enough. Maybe. Enough.”

But death is a matter of personal taste. The other judges were eclectic in their choices. Roger Highfield, editor of New Scientist, admired the scenes in Sebastian Junger’s The Perfect Storm. At the end of the chapter that seals the fate of the six men on board, Junger writes: “The body could be likened to a crew that resorts to increasingly desperate measures to keep their vessel afloat. Eventually the last wire has shorted out, the last bit of decking has settled under the water.” “The details of death by drowning,” Highfield says, “are so rich and dispassionately drawn that they feel chillingly true.”

[div class=attrib]Read the entire article here.[end-div]

When Will I Die?

Would you like to know when you will die?

This is a fundamentally personal and moral question that many may prefer to leave unanswered. That said, while scientific understanding of aging is making great strides, it cannot yet provide an answer. Though it may be only a matter of time.

Giles Tremlett over at the Guardian gives us a personal account of the fascinating science of telomeres, the end-caps on our chromosomes, and why they potentially hold a key to that most fateful question.

[div class=attrib]From the Guardian:[end-div]

As a taxi takes me across Madrid to the laboratories of Spain’s National Cancer Research Centre, I am fretting about the future. I am one of the first people in the world to provide a blood sample for a new test, which has been variously described as a predictor of how long I will live, a waste of time or a handy indicator of how well (or badly) my body is ageing. Today I get the results.

Some newspapers, to the dismay of the scientists involved, have gleefully announced that the test – which measures the telomeres (the protective caps on the ends of my chromosomes) – can predict when I will die. Am I about to find out that, at least statistically, my days are numbered? And, if so, might new telomere research suggesting we can turn back the hands of the body’s clock and make ourselves “biologically younger” come to my rescue?

The test is based on the idea that biological ageing grinds at your telomeres. And, although time ticks by uniformly, our bodies age at different rates. Genes, environment and our own personal habits all play a part in that process. A peek at your telomeres is an indicator of how you are doing. Essentially, they tell you whether you have become biologically younger or older than other people born at around the same time.

The key measure, explains María Blasco, a 45-year-old molecular biologist, head of Spain’s cancer research centre and one of the world’s leading telomere researchers, is the number of short telomeres. Blasco, who is also one of the co-founders of the Life Length company which is offering the tests, says that short telomeres do not just provide evidence of ageing. They also cause it. Often compared to the plastic caps on a shoelace, there is a critical level at which the fraying becomes irreversible and triggers cell death. “Short telomeres are causal of disease because when they are below a [certain] length they are damaging for the cells. The stem cells of our tissues do not regenerate and then we have ageing of the tissues,” she explains. That, in a cellular nutshell, is how ageing works. Eventually, so many of our telomeres are short that some key part of our body may stop working.

The research is still in its early days but extreme stress, for example, has been linked to telomere shortening. I think back to a recent working day that took in three countries, three news stories, two international flights, a public lecture and very little sleep. Reasonable behaviour, perhaps, for someone in their 30s – but I am closer to my 50s. Do days like that shorten my expected, or real, life-span?

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Image: chromosomes capped by telomeres (white), courtesy of Wikipedia.[end-div]

The Climate Spin Cycle

There’s something to be said for a visual aid that puts a complex conversation about simple ideas into perspective. So, here we have a high-level flow chart that characterizes one of the most important debates of our time — climate change. Whether you are for or against the notion or the science, or merely perplexed by the hyperbole inside the “echo chamber”, there is no denying that this debate will remain with us for quite some time.

[div class=attrib]Chart courtesy of Riley E. Dunlap and Aaron M. McCright, “Organized Climate-Change Denial,” in J. S. Dryzek, R. B. Norgaard and D. Schlosberg (eds.), Oxford Handbook of Climate Change and Society. New York: Oxford University Press, 2011.[end-div]

Berlin’s Festival of Lights

Since 2005 Berlin’s Festival of Lights has brought annual color and drama to the city. This year the event runs from October 12-23, bathing around 20 of Berlin’s most famous landmarks and iconic buildings in light. Here’s a sampling from the 2010 event:

[div class=attrib]For more information on the Festival of Lights visit the official site here.[end-div]

C is for Dennis Ritchie

Last week on October 8, 2011, Dennis Ritchie passed away. Most of the mainstream media failed to report his death — after all he was never quite as flamboyant as another technology darling, Steve Jobs. However, his contributions to the worlds of technology and computer science should certainly place him in the same club.

After all, Dennis Ritchie created the C programming language, and he significantly influenced the development of many other languages. With Ken Thompson, he also pioneered the Unix operating system. Both C and Unix now run much of the world’s computer systems.
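Fittingly, the best-known program in computing literature comes from Ritchie’s own book with Brian Kernighan, The C Programming Language: the original “hello, world”, shown here lightly modernized for current compilers.

    #include <stdio.h>

    /* The canonical first C program, after Kernighan and Ritchie. */
    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }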

Dennis Ritchie and co-developer Ken Thompson were awarded the National Medal of Technology in 1999 by President Bill Clinton.

[div class=attrib]Image courtesy of Wikipedia.[end-div]

Mapping the Murder Rate

A sad but nonetheless interesting infographic of murder rates throughout the world. The rates are per 100,000 of the population. The United States, with a rate of 5 per 100,000, ranks close to Belarus, Peru and Thailand. Interestingly, it has a higher murder rate than Turkmenistan (4.4), Uzbekistan (3.1), Afghanistan (2.4), Syria (3) and Iran (3).

The top 5 countries with the highest murder rates are:

Selflessness versus Selfishness: Either Extreme Can Be Bad

[div class=attrib]From the New York Times:[end-div]

Some years ago, Dr. Robert A. Burton was the neurologist on call at a San Francisco hospital when a high-profile colleague from the oncology department asked him to perform a spinal tap on an elderly patient with advanced metastatic cancer. The patient had seemed a little fuzzy-headed that morning, and the oncologist wanted to check for meningitis or another infection that might be treatable with antibiotics.

Dr. Burton hesitated. Spinal taps are painful. The patient’s overall prognosis was beyond dire. Why go after an ancillary infection? But the oncologist, known for his uncompromising and aggressive approach to treatment, insisted.

“For him, there was no such thing as excessive,” Dr. Burton said in a telephone interview. “For him, there was always hope.”

On entering the patient’s room with spinal tap tray portentously agleam, Dr. Burton encountered the patient’s family members. They begged him not to proceed. The frail, bedridden patient begged him not to proceed. Dr. Burton conveyed their pleas to the oncologist, but the oncologist continued to lobby for a spinal tap, and the exhausted family finally gave in.

As Dr. Burton had feared, the procedure proved painful and difficult to administer. It revealed nothing of diagnostic importance. And it left the patient with a grinding spinal-tap headache that lasted for days, until the man fell into a coma and died of his malignancy.

Dr. Burton had admired his oncology colleague (now deceased), yet he also saw how the doctor’s zeal to heal could border on fanaticism, and how his determination to help his patients at all costs could perversely end up hurting them.

The author of “On Being Certain” and the coming “A Skeptic’s Guide to the Mind,” Dr. Burton is a contributor to a scholarly yet surprisingly sprightly volume called “Pathological Altruism,” to be published this fall by Oxford University Press. And he says his colleague’s behavior is a good example of that catchily contradictory term, just beginning to make the rounds through the psychological sciences.

As the new book makes clear, pathological altruism is not limited to showcase acts of self-sacrifice, like donating a kidney or a part of one’s liver to a total stranger. The book is the first comprehensive treatment of the idea that when ostensibly generous “how can I help you?” behavior is taken to extremes, misapplied or stridently rhapsodized, it can become unhelpful, unproductive and even destructive.

Selflessness gone awry may play a role in a broad variety of disorders and situations: anorexia, animal hoarding, women who put up with abusive partners, men who abide alcoholic ones.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image courtesy of Serge Bloch, New York Times.[end-div]

MondayPoem: And Death Shall Have No Dominion

Ushering in our week of articles focused mostly on death and loss is a classic piece by the Welshman Dylan Thomas. Although Thomas’s literary legacy is colored by his legendary drinking and philandering, many critics now agree that his poetry belongs in the same class as that of W.H. Auden.

By Dylan Thomas:

– And Death Shall Have No Dominion

And death shall have no dominion.
Dead men naked they shall be one
With the man in the wind and the west moon;
When their bones are picked clean and the clean bones gone,
They shall have stars at elbow and foot;
Though they go mad they shall be sane,
Though they sink through the sea they shall rise again;
Though lovers be lost love shall not;
And death shall have no dominion.

And death shall have no dominion.
Under the windings of the sea
They lying long shall not die windily;
Twisting on racks when sinews give way,
Strapped to a wheel, yet they shall not break;
Faith in their hands shall snap in two,
And the unicorn evils run them through;
Split all ends up they shan’t crack;
And death shall have no dominion.

And death shall have no dominion.
No more may gulls cry at their ears
Or waves break loud on the seashores;
Where blew a flower may a flower no more
Lift its head to the blows of the rain;
Though they be mad and dead as nails,
Heads of the characters hammer through daisies;
Break in the sun till the sun breaks down,
And death shall have no dominion.

Remembering Another Great Inventor: Edwin Land

[div class=attrib]From the New York Times:[end-div]

In the memorials to Steven P. Jobs this week, Apple’s co-founder was compared with the world’s great inventor-entrepreneurs: Thomas Edison, Henry Ford, Alexander Graham Bell. Yet virtually none of the obituaries mentioned the man Jobs himself considered his hero, the person on whose career he explicitly modeled his own: Edwin H. Land, the genius domus of Polaroid Corporation and inventor of instant photography.

Land, in his time, was nearly as visible as Jobs was in his. In 1972, he made the covers of both Time and Life magazines, probably the only chemist ever to do so. (Instant photography was a genuine phenomenon back then, and Land had created the entire medium, once joking that he’d worked out the whole idea in a few hours, then spent nearly 30 years getting those last few details down.) And the more you learn about Land, the more you realize how closely Jobs echoed him.

Both built multibillion-dollar corporations on inventions that were guarded by relentless patent enforcement. (That also kept the competition at bay, and the profit margins up.) Both were autodidacts, college dropouts (Land from Harvard, Jobs from Reed) who more than made up for their lapsed educations by cultivating extremely refined taste. At Polaroid, Land used to hire Smith College’s smartest art-history majors and send them off for a few science classes, in order to create chemists who could keep up when his conversation turned from Maxwell’s equations to Renoir’s brush strokes.

Most of all, Land believed in the power of the scientific demonstration. Starting in the 60s, he began to turn Polaroid’s shareholders’ meetings into dramatic showcases for whatever line the company was about to introduce. In a perfectly art-directed setting, sometimes with live music between segments, he would take the stage, slides projected behind him, the new product in hand, and instead of deploying snake-oil salesmanship would draw you into Land’s World. By the end of the afternoon, you probably wanted to stay there.

Three decades later, Jobs would do exactly the same thing, except in a black turtleneck and jeans. His admiration for Land was open and unabashed. In 1985, he told an interviewer, “The man is a national treasure. I don’t understand why people like that can’t be held up as models: This is the most incredible thing to be — not an astronaut, not a football player — but this.”

[div class=attrib]Read the full article here.[end-div]

[div class=attrib]Edwin Herbert Land. Photograph by J. J. Scarpetti, The National Academies Press.[end-div]

A Medical Metaphor for Climate Risk

While scientific evidence of climate change continues to mount, and an increasing number of studies point causal fingers at ourselves, there is perhaps another way to visualize the risk of inaction or over-reaction. Since most people can leave ideology aside when it comes to their own health, a medical metaphor, courtesy of Andrew Revkin over at Dot Earth, may help broaden acceptance of the message.

[div class=attrib]From the New York Times:[end-div]

Paul C. Stern, the director of the National Research Council committee on the human dimensions of global change, has been involved in a decades-long string of studies of behavior, climate change and energy choices.

This is an arena that is often attacked by foes of cuts in greenhouse gases, who see signs of mind control and propaganda. Stern says that has nothing to do with his approach, as he made clear in “Contributions of Psychology to Limiting Climate Change,” a paper that was part of a special issue of the journal American Psychologist on climate change and behavior:

Psychological contributions to limiting climate change will come not from trying to change people’s attitudes, but by helping to make low-carbon technologies more attractive and user-friendly, economic incentives more transparent and easier to use, and information more actionable and relevant to the people who need it.

The special issue of the journal builds on a 2009 report on climate and behavior from the American Psychological Association that was covered here. Stern has now offered a reaction to the discussion last week of Princeton researcher Robert Socolow’s call for a fresh approach to climate policy that acknowledges “the news about climate change is unwelcome, that today’s climate science is incomplete, and that every ‘solution’ carries risk.” Stern’s response, centered on a medical metaphor (not the first), is worth posting as a “Your Dot” contribution. You can find my reaction to his idea below. Here’s Stern’s piece:

I agree with Robert Socolow that scientists could do better at encouraging a high quality of discussion about climate change.

But providing better technical descriptions will not help most people because they do not follow that level of detail.  Psychological research shows that people often use simple, familiar mental models as analogies for complex phenomena.  It will help people think through climate choices to have a mental model that is familiar and evocative and that also neatly encapsulates Socolow’s points that the news is unwelcome, that science is incomplete, and that some solutions are dangerous. There is such a model.

Too many people think of climate science as an exact science like astronomy that can make highly confident predictions, such as about lunar eclipses.  That model misrepresents the science, does poorly at making Socolow’s points, and has provided an opening for commentators and bloggers seeking to use any scientific disagreement to discredit the whole body of knowledge.

A mental model from medical science might work better.  In the analogy, the planet is a patient suspected of having a serious, progressive disease (anthropogenic climate change).  The symptoms are not obvious, just as they are not with diabetes or hypertension, but the disease may nevertheless be serious.  Humans, as guardians of the planet, must decide what to do.  Scientists are in the role of physician.  The guardians have been asking the physicians about the diagnosis (is this disease present?), the nature of the disease, its prognosis if untreated, and the treatment options, including possible side effects.  The medical analogy helps clarify the kinds of errors that are possible and can help people better appreciate how science can help and think through policy choices.

Diagnosis. A physician must be careful to avoid two errors:  misdiagnosing the patient with a dread disease that is not present, and misdiagnosing a seriously ill patient as healthy.  To avoid these types of error, physicians often run diagnostic tests or observe the patient over a period of time before recommending a course of treatment.  Scientists have been doing this with Earth’s climate at least since 1959, when strong signs of illness were reported from observations in Hawaii.

Scientists now have high confidence that the patient has the disease.  We know the causes:  fossil fuel consumption, certain land cover changes, and a few other physical processes. We know that the disease produces a complex syndrome of symptoms involving change in many planetary systems (temperature, precipitation, sea level and acidity balance, ecological regimes, etc.).  The patient is showing more and more of the syndrome, and although we cannot be sure that each particular symptom is due to climate change rather than some other cause, the combined evidence justifies strong confidence that the syndrome is present.

Prognosis. Fundamental scientific principles tell us that the disease is progressive and very hard to reverse.  Observations tell us that the processes that cause it have been increasing, as have the symptoms.  Without treatment, they will get worse.  However, because this is an extremely rare disease (in fact, the first known case), there is uncertainty about how fast it will progress.  The prognosis could be catastrophic, but we cannot assign a firm probability to the worst outcomes, and we are not even sure what the most likely outcome is.  We want to avoid either seriously underestimating or overestimating the seriousness of the prognosis.

Treatment. We want treatments that improve the patient’s chances at low cost and with limited adverse side effects and we want to avoid “cures” that might be worse than the disease.  We want to consider the chances of improvement for each treatment, and its side effects, in addition to the untreated prognosis.  We want to avoid the dangers both of under-treatment and of side effects.  We know that some treatments (the ones limiting climate change) get at the causes and could alleviate all the symptoms if taken soon enough.  But reducing the use of fossil fuels quickly could be painful.  Other treatments, called adaptations, offer only symptomatic relief.  These make sense because even with strong medicine for limiting climate change, the disease will get worse before it gets better.

Choices. There are no risk-free choices.  We know that the longer treatment is postponed, the more painful it will be, and the worse the prognosis.  We can also use an iterative treatment approach (as Socolow proposed), starting some treatments and monitoring their effects and side effects before raising the dose.  People will disagree about the right course of treatment, but thinking about the choices in this way might give the disagreements the appropriate focus.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image courtesy of Stephen Wilkes for The New York Times.[end-div]

A Commencement Address for Each of Us: Stay Hungry. Stay Foolish.

Much has been written to honor the life of Steve Jobs, who passed away on October 5, 2011 at the young age of 56. Much more will be written. To honor his vision and passion, we reprint below a rare public speech given by Steve Jobs at the Stanford University Commencement on June 12, 2005. The address is a very personal and thoughtful story of innovation, love and loss, and death.

[div class=attrib]Courtesy of Stanford University:[end-div]

I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I’ve ever gotten to a college graduation. Today I want to tell you three stories from my life. That’s it. No big deal. Just three stories.

The first story is about connecting the dots.

I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out?

It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking: “We have an unexpected baby boy; do you want him?” They said: “Of course.” My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college.

And 17 years later I did go to college. But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents’ savings were being spent on my college tuition. After six months, I couldn’t see the value in it. I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out. And here I was spending all of the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back it was one of the best decisions I ever made. The minute I dropped out I could stop taking the required classes that didn’t interest me, and begin dropping in on the ones that looked interesting.

It wasn’t all romantic. I didn’t have a dorm room, so I slept on the floor in friends’ rooms, I returned Coke bottles for the 5¢ deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example:

Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn’t have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating.

None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it’s likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later.

Again, you can’t connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

My second story is about love and loss.

I was lucky — I found what I loved to do early in life. Woz and I started Apple in my parents’ garage when I was 20. We worked hard, and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees. We had just released our finest creation — the Macintosh — a year earlier, and I had just turned 30. And then I got fired. How can you get fired from a company you started? Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well. But then our visions of the future began to diverge and eventually we had a falling out. When we did, our Board of Directors sided with him. So at 30 I was out. And very publicly out. What had been the focus of my entire adult life was gone, and it was devastating.

I really didn’t know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down – that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me — I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over.

I didn’t see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.

During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife. Pixar went on to create the world’s first computer animated feature film, Toy Story, and is now the most successful animation studio in the world. In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple’s current renaissance. And Laurene and I have a wonderful family together.

I’m pretty sure none of this would have happened if I hadn’t been fired from Apple. It was awful tasting medicine, but I guess the patient needed it. Sometimes life hits you in the head with a brick. Don’t lose faith. I’m convinced that the only thing that kept me going was that I loved what I did. You’ve got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven’t found it yet, keep looking. Don’t settle. As with all matters of the heart, you’ll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don’t settle.

My third story is about death.

When I was 17, I read a quote that went something like: “If you live each day as if it was your last, someday you’ll most certainly be right.” It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself: “If today were the last day of my life, would I want to do what I am about to do today?” And whenever the answer has been “No” for too many days in a row, I know I need to change something.

Remembering that I’ll be dead soon is the most important tool I’ve ever encountered to help me make the big choices in life. Because almost everything — all external expectations, all pride, all fear of embarrassment or failure – these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.

About a year ago I was diagnosed with cancer. I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas. I didn’t even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctor’s code for prepare to die. It means to try to tell your kids everything you thought you’d have the next 10 years to tell them in just a few months. It means to make sure everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.

I lived with that diagnosis all day. Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and I’m fine now.

This was the closest I’ve been to facing death, and I hope it’s the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:

No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.

Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma — which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960’s, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and Polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.

Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: “Stay Hungry. Stay Foolish.” It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.

Stay Hungry. Stay Foolish.

Thank you all very much.