All posts by Mike

When Will I Die?

Would you like to know when you will die?

This is a fundamentally personal and moral question, one that many may prefer to keep unanswered. And while the scientific understanding of aging is making great strides, it cannot yet answer the question, though that may only be a matter of time.

Giles Tremlett over at the Guardian gives us a personal account of the fascinating science of telomeres, the end-caps on our chromosomes, and why they potentially hold a key to that most fateful question.

[div class=attrib]From the Guardian:[end-div]

As a taxi takes me across Madrid to the laboratories of Spain’s National Cancer Research Centre, I am fretting about the future. I am one of the first people in the world to provide a blood sample for a new test, which has been variously described as a predictor of how long I will live, a waste of time or a handy indicator of how well (or badly) my body is ageing. Today I get the results.

Some newspapers, to the dismay of the scientists involved, have gleefully announced that the test – which measures the telomeres (the protective caps on the ends of my chromosomes) – can predict when I will die. Am I about to find out that, at least statistically, my days are numbered? And, if so, might new telomere research suggesting we can turn back the hands of the body’s clock and make ourselves “biologically younger” come to my rescue?

The test is based on the idea that biological ageing grinds at your telomeres. And, although time ticks by uniformly, our bodies age at different rates. Genes, environment and our own personal habits all play a part in that process. A peek at your telomeres is an indicator of how you are doing. Essentially, they tell you whether you have become biologically younger or older than other people born at around the same time.

The key measure, explains María Blasco, a 45-year-old molecular biologist, head of Spain’s cancer research centre and one of the world’s leading telomere researchers, is the number of short telomeres. Blasco, who is also one of the co-founders of the Life Length company, which is offering the tests, says that short telomeres do not just provide evidence of ageing. They also cause it. Telomeres are often compared to the plastic caps on a shoelace: there is a critical level at which the fraying becomes irreversible and triggers cell death. “Short telomeres are causal of disease because when they are below a [certain] length they are damaging for the cells. The stem cells of our tissues do not regenerate and then we have ageing of the tissues,” she explains. That, in a cellular nutshell, is how ageing works. Eventually, so many of our telomeres are short that some key part of our body may stop working.

The research is still in its early days but extreme stress, for example, has been linked to telomere shortening. I think back to a recent working day that took in three countries, three news stories, two international flights, a public lecture and very little sleep. Reasonable behaviour, perhaps, for someone in their 30s – but I am closer to my 50s. Do days like that shorten my expected, or real, life-span?

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Image: chromosomes capped by telomeres (white), courtesy of Wikipedia.[end-div]

The Climate Spin Cycle

There’s something to be said for a visual aid that puts a complex conversation about simple ideas into perspective. So, here we have a high-level flow chart that characterizes one of the most important debates of our time: climate change. Whether you are for or against the notion or the science, or merely perplexed by the hyperbole inside the “echo chamber,” there is no denying that this debate will remain with us for quite some time.

[div class=attrib]Chart courtesy of Riley E. Dunlap and Aaron M. McCright, “Organized Climate-Change Denial,” in J. S. Dryzek, R. B. Norgaard and D. Schlosberg (eds.), Oxford Handbook of Climate Change and Society. New York: Oxford University Press, 2011.[end-div]

Berlin’s Festival of Lights

Since 2005, Berlin’s Festival of Lights has brought annual color and drama to the city. This year the event runs from October 12 to 23, bathing around 20 of Berlin’s most famous landmarks and iconic buildings in light. Here’s a sampling from the 2010 event:

[div class=attrib]For more information on the Festival of Lights visit the official site here.[end-div]

C is for Dennis Ritchie

Last week, on October 8, 2011, Dennis Ritchie passed away. Most of the mainstream media failed to report his death; after all, he was never quite as flamboyant as that other technology darling, Steve Jobs. However, his contributions to the worlds of technology and computer science should certainly place him in the same club.

Ritchie created the C programming language and significantly influenced the development of the languages that followed it. He was also a co-creator of the Unix operating system. Both C and Unix now run much of the world’s computer systems.
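As a small illustration of the language’s economy, here is the famous first program from Kernighan and Ritchie’s The C Programming Language, lightly modernized (the int main(void) signature and the explicit return are the only departures from the original):

#include <stdio.h>

/* The classic first C program, after Kernighan and Ritchie,
   updated to the modern main() signature. */
int main(void)
{
    printf("hello, world\n");
    return 0;
}

Countless programming tutorials written since have opened with a variation on these few lines.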

Dennis Ritchie and his co-developer Ken Thompson were awarded the National Medal of Technology in 1999 by President Bill Clinton.

[div class=attrib]Image courtesy of Wikipedia.[end-div]

Mapping the Murder Rate

A sad but nonetheless interesting infographic of murder rates throughout the world. The rates are per 100,000 of the population. The United States, with a rate of 5 per 100,000, ranks close to Belarus, Peru and Thailand. Interestingly, it has a higher murder rate than Turkmenistan (4.4), Uzbekistan (3.1), Afghanistan (2.4), Syria (3) and Iran (3).
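For readers unfamiliar with the convention, a rate per 100,000 is simply the number of homicides recorded in a year divided by the population, scaled by 100,000. A minimal sketch in C, using hypothetical round figures purely for illustration:

#include <stdio.h>

int main(void)
{
    /* Hypothetical round figures, for illustration only. */
    double homicides  = 15000.0;      /* homicides recorded in one year */
    double population = 300000000.0;  /* total population               */

    double rate = homicides / population * 100000.0;
    printf("Murder rate: %.1f per 100,000\n", rate);  /* prints 5.0 */
    return 0;
}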

The top 5 countries with the highest murder rates are:

Selflessness versus Selfishness: Either Extreme Can Be Bad

[div class=attrib]From the New York Times:[end-div]

Some years ago, Dr. Robert A. Burton was the neurologist on call at a San Francisco hospital when a high-profile colleague from the oncology department asked him to perform a spinal tap on an elderly patient with advanced metastatic cancer. The patient had seemed a little fuzzy-headed that morning, and the oncologist wanted to check for meningitis or another infection that might be treatable with antibiotics.

Dr. Burton hesitated. Spinal taps are painful. The patient’s overall prognosis was beyond dire. Why go after an ancillary infection? But the oncologist, known for his uncompromising and aggressive approach to treatment, insisted.

“For him, there was no such thing as excessive,” Dr. Burton said in a telephone interview. “For him, there was always hope.”

On entering the patient’s room with spinal tap tray portentously agleam, Dr. Burton encountered the patient’s family members. They begged him not to proceed. The frail, bedridden patient begged him not to proceed. Dr. Burton conveyed their pleas to the oncologist, but the oncologist continued to lobby for a spinal tap, and the exhausted family finally gave in.

As Dr. Burton had feared, the procedure proved painful and difficult to administer. It revealed nothing of diagnostic importance. And it left the patient with a grinding spinal-tap headache that lasted for days, until the man fell into a coma and died of his malignancy.

Dr. Burton had admired his oncology colleague (now deceased), yet he also saw how the doctor’s zeal to heal could border on fanaticism, and how his determination to help his patients at all costs could perversely end up hurting them.

The author of “On Being Certain” and the coming “A Skeptic’s Guide to the Mind,” Dr. Burton is a contributor to a scholarly yet surprisingly sprightly volume called “Pathological Altruism,” to be published this fall by Oxford University Press. And he says his colleague’s behavior is a good example of that catchily contradictory term, just beginning to make the rounds through the psychological sciences.

As the new book makes clear, pathological altruism is not limited to showcase acts of self-sacrifice, like donating a kidney or a part of one’s liver to a total stranger. The book is the first comprehensive treatment of the idea that when ostensibly generous “how can I help you?” behavior is taken to extremes, misapplied or stridently rhapsodized, it can become unhelpful, unproductive and even destructive.

Selflessness gone awry may play a role in a broad variety of disorders, including anorexia and animal hoarding, women who put up with abusive partners and men who abide alcoholic ones.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image courtesy of Serge Bloch, New York Times.[end-div]

MondayPoem: And Death Shall Have No Dominion

Ushering in our week of articles focused mostly on death and loss is a classic piece by the Welshman Dylan Thomas. Although Thomas’ literary legacy is colored by his legendary drinking and philandering, many critics now seem to agree that his poetry belongs in the same class as that of W.H. Auden.

By Dylan Thomas:

– And Death Shall Have No Dominion

And death shall have no dominion.
Dead men naked they shall be one
With the man in the wind and the west moon;
When their bones are picked clean and the clean bones gone,
They shall have stars at elbow and foot;
Though they go mad they shall be sane,
Though they sink through the sea they shall rise again;
Though lovers be lost love shall not;
And death shall have no dominion.

And death shall have no dominion.
Under the windings of the sea
They lying long shall not die windily;
Twisting on racks when sinews give way,
Strapped to a wheel, yet they shall not break;
Faith in their hands shall snap in two,
And the unicorn evils run them through;
Split all ends up they shan’t crack;
And death shall have no dominion.

And death shall have no dominion.
No more may gulls cry at their ears
Or waves break loud on the seashores;
Where blew a flower may a flower no more
Lift its head to the blows of the rain;
Though they be mad and dead as nails,
Heads of the characters hammer through daisies;
Break in the sun till the sun breaks down,
And death shall have no dominion.

Remembering Another Great Inventor: Edwin Land

[div class=attrib]From the New York Times:[end-div]

IN the memorials to Steven P. Jobs this week, Apple’s co-founder was compared with the world’s great inventor-entrepreneurs: Thomas Edison, Henry Ford, Alexander Graham Bell. Yet virtually none of the obituaries mentioned the man Jobs himself considered his hero, the person on whose career he explicitly modeled his own: Edwin H. Land, the genius domus of Polaroid Corporation and inventor of instant photography.

Land, in his time, was nearly as visible as Jobs was in his. In 1972, he made the covers of both Time and Life magazines, probably the only chemist ever to do so. (Instant photography was a genuine phenomenon back then, and Land had created the entire medium, once joking that he’d worked out the whole idea in a few hours, then spent nearly 30 years getting those last few details down.) And the more you learn about Land, the more you realize how closely Jobs echoed him.

Both built multibillion-dollar corporations on inventions that were guarded by relentless patent enforcement. (That also kept the competition at bay, and the profit margins up.) Both were autodidacts, college dropouts (Land from Harvard, Jobs from Reed) who more than made up for their lapsed educations by cultivating extremely refined taste. At Polaroid, Land used to hire Smith College’s smartest art-history majors and send them off for a few science classes, in order to create chemists who could keep up when his conversation turned from Maxwell’s equations to Renoir’s brush strokes.

Most of all, Land believed in the power of the scientific demonstration. Starting in the 60s, he began to turn Polaroid’s shareholders’ meetings into dramatic showcases for whatever line the company was about to introduce. In a perfectly art-directed setting, sometimes with live music between segments, he would take the stage, slides projected behind him, the new product in hand, and instead of deploying snake-oil salesmanship would draw you into Land’s World. By the end of the afternoon, you probably wanted to stay there.

Three decades later, Jobs would do exactly the same thing, except in a black turtleneck and jeans. His admiration for Land was open and unabashed. In 1985, he told an interviewer, “The man is a national treasure. I don’t understand why people like that can’t be held up as models: This is the most incredible thing to be — not an astronaut, not a football player — but this.”

[div class=attrib]Read the full article here.[end-div]

[div class=attrib]Edwin Herbert Land. Photograph by J. J. Scarpetti, The National Academies Press.[end-div]

A Medical Metaphor for Climate Risk

While scientific evidence of climate change continues to mount, and an increasing number of studies point the causal finger at us, there is perhaps another way to visualize the risk of inaction or over-reaction. Since most people can leave ideology aside when it comes to their own health, a medical metaphor, courtesy of Andrew Revkin over at Dot Earth, may help broaden acceptance of the message.

[div class=attrib]From the New York Times:[end-div]

Paul C. Stern, the director of the National Research Council committee on the human dimensions of global change, has been involved in a decades-long string of studies of behavior, climate change and energy choices.

This is an arena that is often attacked by foes of cuts in greenhouse gases, who see signs of mind control and propaganda. Stern says that has nothing to do with his approach, as he made clear in “Contributions of Psychology to Limiting Climate Change,” a paper that was part of a special issue of the journal American Psychologist on climate change and behavior:

Psychological contributions to limiting climate change will come not from trying to change people’s attitudes, but by helping to make low-carbon technologies more attractive and user-friendly, economic incentives more transparent and easier to use, and information more actionable and relevant to the people who need it.

The special issue of the journal builds on a 2009 report on climate and behavior from the American Psychological Association that was covered here. Stern has now offered a reaction to the discussion last week of Princeton researcher Robert Socolow’s call for a fresh approach to climate policy that acknowledges “the news about climate change is unwelcome, that today’s climate science is incomplete, and that every ‘solution’ carries risk.” Stern’s response, centered on a medical metaphor (not the first), is worth posting as a “Your Dot” contribution. You can find my reaction to his idea below. Here’s Stern’s piece:

I agree with Robert Socolow that scientists could do better at encouraging a high quality of discussion about climate change.

But providing better technical descriptions will not help most people because they do not follow that level of detail.  Psychological research shows that people often use simple, familiar mental models as analogies for complex phenomena.  It will help people think through climate choices to have a mental model that is familiar and evocative and that also neatly encapsulates Socolow’s points that the news is unwelcome, that science is incomplete, and that some solutions are dangerous. There is such a model.

Too many people think of climate science as an exact science like astronomy that can make highly confident predictions, such as about lunar eclipses.  That model misrepresents the science, does poorly at making Socolow’s points, and has provided an opening for commentators and bloggers seeking to use any scientific disagreement to discredit the whole body of knowledge.

A mental model from medical science might work better.  In the analogy, the planet is a patient suspected of having a serious, progressive disease (anthropogenic climate change).  The symptoms are not obvious, just as they are not with diabetes or hypertension, but the disease may nevertheless be serious.  Humans, as guardians of the planet, must decide what to do.  Scientists are in the role of physician.  The guardians have been asking the physicians about the diagnosis (is this disease present?), the nature of the disease, its prognosis if untreated, and the treatment options, including possible side effects.  The medical analogy helps clarify the kinds of errors that are possible and can help people better appreciate how science can help and think through policy choices.

Diagnosis. A physician must be careful to avoid two errors:  misdiagnosing the patient with a dread disease that is not present, and misdiagnosing a seriously ill patient as healthy.  To avoid these types of error, physicians often run diagnostic tests or observe the patient over a period of time before recommending a course of treatment.  Scientists have been doing this with Earth’s climate at least since 1959, when strong signs of illness were reported from observations in Hawaii.

Scientists now have high confidence that the patient has the disease.  We know the causes:  fossil fuel consumption, certain land cover changes, and a few other physical processes. We know that the disease produces a complex syndrome of symptoms involving change in many planetary systems (temperature, precipitation, sea level and acidity balance, ecological regimes, etc.).  The patient is showing more and more of the syndrome, and although we cannot be sure that each particular symptom is due to climate change rather than some other cause, the combined evidence justifies strong confidence that the syndrome is present.

Prognosis. Fundamental scientific principles tell us that the disease is progressive and very hard to reverse.  Observations tell us that the processes that cause it have been increasing, as have the symptoms.  Without treatment, they will get worse.  However, because this is an extremely rare disease (in fact, the first known case), there is uncertainty about how fast it will progress.  The prognosis could be catastrophic, but we cannot assign a firm probability to the worst outcomes, and we are not even sure what the most likely outcome is.  We want to avoid either seriously underestimating or overestimating the seriousness of the prognosis.

Treatment. We want treatments that improve the patient’s chances at low cost and with limited adverse side effects and we want to avoid “cures” that might be worse than the disease.  We want to consider the chances of improvement for each treatment, and its side effects, in addition to the untreated prognosis.  We want to avoid the dangers both of under-treatment and of side effects.  We know that some treatments (the ones limiting climate change) get at the causes and could alleviate all the symptoms if taken soon enough.  But reducing the use of fossil fuels quickly could be painful.  Other treatments, called adaptations, offer only symptomatic relief.  These make sense because even with strong medicine for limiting climate change, the disease will get worse before it gets better.

Choices. There are no risk-free choices.  We know that the longer treatment is postponed, the more painful it will be, and the worse the prognosis.  We can also use an iterative treatment approach (as Socolow proposed), starting some treatments and monitoring their effects and side effects before raising the dose.  People will disagree about the right course of treatment, but thinking about the choices in this way might give the disagreements the appropriate focus.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image courtesy of Stephen Wilkes for The New York Times.[end-div]

A Commencement Address for Each of Us: Stay Hungry. Stay Foolish.

Much has been written to honor the life of Steve Jobs, who passed away on October 5, 2011, at the young age of 56. Much more will be written. To honor his vision and passion, we reprint below a rare public speech given by Steve Jobs at the Stanford University Commencement on June 12, 2005. The address is a very personal and thoughtful story of innovation, love and loss, and death.

[div class=attrib]Courtesy of Stanford University:[end-div]

I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I’ve ever gotten to a college graduation. Today I want to tell you three stories from my life. That’s it. No big deal. Just three stories.

The first story is about connecting the dots.

I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out?

It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking: “We have an unexpected baby boy; do you want him?” They said: “Of course.” My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college.

And 17 years later I did go to college. But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents’ savings were being spent on my college tuition. After six months, I couldn’t see the value in it. I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out. And here I was spending all of the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back it was one of the best decisions I ever made. The minute I dropped out I could stop taking the required classes that didn’t interest me, and begin dropping in on the ones that looked interesting.

It wasn’t all romantic. I didn’t have a dorm room, so I slept on the floor in friends’ rooms, I returned coke bottles for the 5¢ deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example:

Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn’t have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and san serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating.

None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it’s likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later.

Again, you can’t connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

My second story is about love and loss.

I was lucky — I found what I loved to do early in life. Woz and I started Apple in my parents’ garage when I was 20. We worked hard, and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees. We had just released our finest creation — the Macintosh — a year earlier, and I had just turned 30. And then I got fired. How can you get fired from a company you started? Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well. But then our visions of the future began to diverge and eventually we had a falling out. When we did, our Board of Directors sided with him. So at 30 I was out. And very publicly out. What had been the focus of my entire adult life was gone, and it was devastating.

I really didn’t know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down – that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me — I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over.

I didn’t see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.

During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife. Pixar went on to create the world’s first computer animated feature film, Toy Story, and is now the most successful animation studio in the world. In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple’s current renaissance. And Laurene and I have a wonderful family together.

I’m pretty sure none of this would have happened if I hadn’t been fired from Apple. It was awful tasting medicine, but I guess the patient needed it. Sometimes life hits you in the head with a brick. Don’t lose faith. I’m convinced that the only thing that kept me going was that I loved what I did. You’ve got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven’t found it yet, keep looking. Don’t settle. As with all matters of the heart, you’ll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don’t settle.

My third story is about death.

When I was 17, I read a quote that went something like: “If you live each day as if it was your last, someday you’ll most certainly be right.” It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself: “If today were the last day of my life, would I want to do what I am about to do today?” And whenever the answer has been “No” for too many days in a row, I know I need to change something.

Remembering that I’ll be dead soon is the most important tool I’ve ever encountered to help me make the big choices in life. Because almost everything — all external expectations, all pride, all fear of embarrassment or failure – these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.

About a year ago I was diagnosed with cancer. I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas. I didn’t even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctor’s code for prepare to die. It means to try to tell your kids everything you thought you’d have the next 10 years to tell them in just a few months. It means to make sure everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.

I lived with that diagnosis all day. Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and I’m fine now.

This was the closest I’ve been to facing death, and I hope it’s the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:

No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.

Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma — which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960’s, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.

Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: “Stay Hungry. Stay Foolish.” It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.

Stay Hungry. Stay Foolish.

Thank you all very much.

Global Interconnectedness: Submarine Cables

Apparently only 1 percent of global internet traffic is transmitted via satellite or terrestrial radio. The remaining 99 percent is still carried by cable, both fiber optic and copper. Much of this cable is strewn for many thousands of miles across the seabeds of our deepest oceans.

For a fascinating view of these intricate systems, and to learn why and how Brazil is connected to Angola, or Auckland, New Zealand to Redondo Beach, California, via the 12,750 km Pacific Fiber, check out the interactive Submarine Cable Map from TeleGeography.

Steve Jobs: The Secular Prophet

The world will miss Steve Jobs.

In early 2010 the U.S. Supreme Court overturned years of legal precedent by assigning First Amendment (free speech) protections to corporations. We could argue the merits and demerits of this staggering ruling until the cows come home. However, if corporations are to be judged as people, one thing is clear: the world would in all likelihood benefit more from a corporation with a human, optimistic and passionate face (Apple) than from a faceless one (Exxon), an ideological one (News Corp) or an opaque one (Koch Industries).

That said, we excerpt a fascinating essay on Steve Jobs by Andy Crouch below. We would encourage Mr. Crouch to take this worthy idea further by examining the Fortune 1000 list of corporations. Could he deliver a similar analysis for each of these corporations’ leaders? We believe not.

The world will miss Steve Jobs.

[div class=attrib]By Andy Crouch for the Wall Street Journal:[end-div]

Steve Jobs was extraordinary in countless ways—as a designer, an innovator, a (demanding and occasionally ruthless) leader. But his most singular quality was his ability to articulate a perfectly secular form of hope. Nothing exemplifies that ability more than Apple’s early logo, which slapped a rainbow on the very archetype of human fallenness and failure—the bitten fruit—and turned it into a sign of promise and progress.

That bitten apple was just one of Steve Jobs’s many touches of genius, capturing the promise of technology in a single glance. The philosopher Albert Borgmann has observed that technology promises to relieve us of the burden of being merely human, of being finite creatures in a harsh and unyielding world. The biblical story of the Fall pronounced a curse upon human work—”cursed is the ground for thy sake; in sorrow shalt thou eat of it all the days of thy life.” All technology implicitly promises to reverse the curse, easing the burden of creaturely existence. And technology is most celebrated when it is most invisible—when the machinery is completely hidden, combining godlike effortlessness with blissful ignorance about the mechanisms that deliver our disburdened lives.

Steve Jobs was the evangelist of this particular kind of progress—and he was the perfect evangelist because he had no competing source of hope. He believed so sincerely in the “magical, revolutionary” promise of Apple precisely because he believed in no higher power. In his celebrated Stanford commencement address (which is itself an elegant, excellent model of the genre), he spoke frankly about his initial cancer diagnosis in 2003. It’s worth pondering what Jobs did, and didn’t, say:

“No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because death is very likely the single best invention of life. It’s life’s change agent; it clears out the old to make way for the new. Right now, the new is you. But someday, not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it’s quite true. Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma, which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice, heart and intuition. They somehow already know what you truly want to become.”

This is the gospel of a secular age.

[div class=attrib]Steve Jobs by Tim O’Brien, image courtesy of Wall Street Journal.[end-div]

Googlization of the Globe: For Good (or Evil)

Google’s oft-quoted corporate mantra, “don’t be evil,” reminds us to remain vigilant even if the company believes it does good and can do no wrong.

Google serves up countless search results to satisfy our never-ending thirst for knowledge, deals, news, quotes, jokes, user manuals, contacts, products and so on. This is clearly of tremendous benefit to us, to Google and to Google’s advertisers. Of course, in fulfilling our searches Google collects equally staggering amounts of information — about us. Increasingly the company will know where we are, what we like and dislike, what we prefer, what we do, where we travel, with whom and why, who our friends are, what we read, what we buy.

As Jaron Lanier remarked in a recent post, there is a fine line between being a global index to the world’s free and open library of information and being the paid gatekeeper to our collective knowledge and the hoarder of our collective online (and increasingly offline) behaviors, tracks and memories. We have already seen how Google, and others, can personalize search results based on our previous tracks, thus filtering and biasing what we see and read and limiting our exposure to alternative views and opinions.

It’s quite easy to imagine a rather more dystopian view of a society gone awry, manipulated by a not-so-benevolent Google when, eventually, founders Brin and Page retire to their vacation bases on the moon.

With this in mind, Daniel Soar over at the London Review of Books reviews several recent books about Google and offers some interesting insights.

[div class=attrib]From the London Review of Books:[end-div]

This spring, the billionaire Eric Schmidt announced that there were only four really significant technology companies: Apple, Amazon, Facebook and Google, the company he had until recently been running. People believed him. What distinguished his new ‘gang of four’ from the generation it had superseded – companies like Intel, Microsoft, Dell and Cisco, which mostly exist to sell gizmos and gadgets and innumerable hours of expensive support services to corporate clients – was that the newcomers sold their products and services to ordinary people. Since there are more ordinary people in the world than there are businesses, and since there’s nothing that ordinary people don’t want or need, or can’t be persuaded they want or need when it flashes up alluringly on their screens, the money to be made from them is virtually limitless. Together, Schmidt’s four companies are worth more than half a trillion dollars. The technology sector isn’t as big as, say, oil, but it’s growing, as more and more traditional industries – advertising, travel, real estate, used cars, new cars, porn, television, film, music, publishing, news – are subsumed into the digital economy. Schmidt, who as the ex-CEO of a multibillion-dollar corporation had learned to take the long view, warned that not all four of his disruptive gang could survive. So – as they all converge from their various beginnings to compete in the same area, the place usually referred to as ‘the cloud’, a place where everything that matters is online – the question is: who will be the first to blink?

If the company that falters is Google, it won’t be because it didn’t see the future coming. Of Schmidt’s four technology juggernauts, Google has always been the most ambitious, and the most committed to getting everything possible onto the internet, its mission being ‘to organise the world’s information and make it universally accessible and useful’. Its ubiquitous search box has changed the way information can be got at to such an extent that ten years after most people first learned of its existence you wouldn’t think of trying to find out anything without typing it into Google first. Searching on Google is automatic, a reflex, just part of what we do. But an insufficiently thought-about fact is that in order to organise the world’s information Google first has to get hold of the stuff. And in the long run ‘the world’s information’ means much more than anyone would ever have imagined it could. It means, of course, the totality of the information contained on the World Wide Web, or the contents of more than a trillion webpages (it was a trillion at the last count, in 2008; now, such a number would be meaningless). But that much goes without saying, since indexing and ranking webpages is where Google began when it got going as a research project at Stanford in 1996, just five years after the web itself was invented. It means – or would mean, if lawyers let Google have its way – the complete contents of every one of the more than 33 million books in the Library of Congress or, if you include slightly varying editions and pamphlets and other ephemera, the contents of the approximately 129,864,880 books published in every recorded language since printing was invented. It means every video uploaded to the public internet, a quantity – if you take the Google-owned YouTube alone – that is increasing at the rate of nearly an hour of video every second.

[div class=attrib]Read more here.[end-div]

MondayPoem: Further In

Tomas Tranströmer is one of Sweden’s leading poets. He studied poetry and psychology at the University of Stockholm. Tranströmer was awarded the 2011 Nobel Prize for Literature “because, through his condensed, translucent images, he gives us fresh access to reality”.

By Tomas Tranströmer:

– Further In
On the main road into the city
when the sun is low.
The traffic thickens, crawls.
It is a sluggish dragon glittering.
I am one of the dragon’s scales.
Suddenly the red sun is
right in the middle of the windscreen
streaming in.
I am transparent
and writing becomes visible
inside me
words in invisible ink
which appear
when the paper is held to the fire!
I know I must get far away
straight through the city and then
further until it is time to go out
and walk far in the forest.
Walk in the footprints of the badger.
It gets dark, difficult to see.
In there on the moss lie stones.
One of the stones is precious.
It can change everything
it can make the darkness shine.
It is a switch for the whole country.
Everything depends on it.
Look at it, touch it…

Human Evolution Marches On

[div class=attrib]From Wired:[end-div]

Though ongoing human evolution is difficult to see, researchers believe they’ve found signs of rapid genetic changes among the recent residents of a small Canadian town.

Between 1800 and 1940, mothers in Ile aux Coudres, Quebec gave birth at steadily younger ages, with the average age of first maternity dropping from 26 to 22. Increased fertility, and thus larger families, could have been especially useful in the rural settlement’s early history.

According to University of Quebec geneticist Emmanuel Milot and colleagues, other possible explanations, such as changing cultural or environmental influences, don’t fit. The changes appear to reflect biological evolution.

“It is often claimed that modern humans have stopped evolving because cultural and technological advancements have annihilated natural selection,” wrote Milot’s team in their Oct. 3 Proceedings of the National Academy of Sciences paper. “Our study supports the idea that humans are still evolving. It also demonstrates that microevolution is detectable over just a few generations.”

Milot’s team based their study on detailed birth, marriage and death records kept by the Catholic church in Ile aux Coudres, a small and historically isolated French-Canadian island town in the Gulf of St. Lawrence. It wasn’t just the fact that average first birth age — a proxy for fertility — dropped from 26 to 22 in 140 years that suggested genetic changes. After all, culture or environment might have been wholly responsible, as nutrition and healthcare are for recent, rapid changes in human height. Rather, it was how ages dropped that caught their eye.

The patterns fit with models of gene-influenced natural selection. Moreover, thanks to the detailed record-keeping, it was possible to look at other possible explanations. Were better nutrition responsible, for example, improved rates of infant and juvenile mortality should have followed; they didn’t. Neither did the late-19th century transition from farming to more diversified professions.

[div class=attrib]Read more here.[end-div]

Misconceptions of Violence

We live in violent times. Or do we?

Despite the seemingly constant flow of human-engineered destruction visited upon our fellow humans, other species and our precious environment, some thoughtful analysis — beyond the headlines of cable news — shows that all may not be lost to our violent nature. An insightful interview with psychologist Steven Pinker, author of “How the Mind Works,” shows us that contemporary humans are not as bad as we may have thought. His latest book, “The Better Angels of Our Nature: Why Violence Has Declined,” analyzes the basis and history of human violence. Perhaps surprisingly, Pinker suggests that we live in remarkably peaceful times, comparatively speaking. Characteristically, he backs up his claims with clear historical evidence.

[div class=attrib]From Gareth Cook for Mind Matters:[end-div]

COOK: What would you say is the biggest misconception people have about violence?
PINKER: That we are living in a violent age. The statistics suggest that this may be the most peaceable time in our species’s existence.

COOK: Can you give a sense for how violent life was 500 or 1000 years ago?
PINKER: Statistics aside, accounts of daily life in medieval and early modern Europe reveal a society soaked in blood and gore. Medieval knights—whom today we would call warlords—fought their numerous private wars with a single strategy: kill as many of the opposing knight’s peasants as possible. Religious instruction included prurient descriptions of how the saints of both sexes were tortured and mutilated in ingenious ways. Corpses broken on the wheel, hanging from gibbets, or rotting in iron cages where the sinner had been left to die of exposure and starvation were a common part of the landscape. For entertainment, one could nail a cat to a post and try to head-butt it to death, or watch a political prisoner get drawn and quartered, which is to say partly strangled, disemboweled, and castrated before being decapitated. So many people had their noses cut off in private disputes that medical textbooks had procedures that were alleged to grow them back.

COOK: How has neuroscience contributed to our understanding of violence and its origins?
PINKER: Neuroscientists have long known that aggression in animals is not a unitary phenomenon driven by a single hormone or center. When they stimulate one part of the brain of a cat, it will lunge for the experimenter in a hissing, fangs-out rage; when they stimulate another, it will silently stalk a hallucinatory mouse. Still another circuit primes a male cat for a hostile confrontation with another male. Similar systems for rage, predatory seeking, and male-male aggression may be found in Homo sapiens, together with uniquely human, cognitively-driven  systems of aggression such as political and religious ideologies and moralistic punishment. Today, even the uniquely human systems can be investigated using functional neuroimaging. So neuroscience has given us the crucial starting point in understanding violence, namely that it is not a single thing. And it has helped us to discover biologically realistic taxonomies of the major motives for violence.

COOK: Is the general trend toward less violence going to continue in the future?
PINKER: It depends. In the arena of custom and institutional practices, it’s a good bet. I suspect that violence against women, the criminalization of homosexuality, the use of capital punishment, the callous treatment of animals on farms, corporal punishment of children, and other violent social practices will continue to decline, based on the fact that worldwide moralistic shaming movements in the past (such as those against slavery, whaling, piracy, and punitive torture) have been effective over long stretches of time. I also don’t expect war between developed countries to make a comeback any time soon. But civil wars, terrorist acts, government repression, and genocides in backward parts of the world are simply too capricious to allow predictions. With six billion people in the world, there’s no predicting what some cunning fanatic or narcissistic despot might do.

[div class=attrib]Read more of the interview here.[end-div]

[div class=attrib]Image courtesy of Scientific American.[end-div]

All Power Corrupts

[div class=attrib]From the Economist:[end-div]

DURING the second world war a new term of abuse entered the English language. To call someone “a little Hitler” meant he was a menial functionary who employed what power he had in order to annoy and frustrate others for his own gratification. From nightclub bouncers to the squaddies at Abu Ghraib prison who tormented their prisoners for fun, little Hitlers plague the world. The phenomenon has not, though, hitherto been subject to scientific investigation.

Nathanael Fast of the University of Southern California has changed that. He observed that lots of psychological experiments have been done on the effects of status and lots on the effects of power. But few, if any, have been done on both combined. He and his colleagues Nir Halevy of Stanford University and Adam Galinsky of Northwestern University, in Chicago, set out to correct this. In particular they wanted to see if it is circumstances that create little Hitlers or, rather, whether people of that type simply gravitate into jobs which allow them to behave badly. Their results have just been published in the Journal of Experimental Social Psychology.

Dr Fast’s experiment randomly assigned each of 213 participants to one of four situations that manipulated their status and power. All participants were informed that they were taking part in a study on virtual organisations and would be interacting with, but not meeting, a fellow student who worked in the same fictional consulting firm. Participants were then assigned either the role of “idea producer”, a job that entailed generating and working with important ideas, or of “worker”, a job that involved menial tasks like checking for typos. A post-experiment questionnaire demonstrated that participants did, as might be expected, look upon the role of idea producer with respect and admiration. Equally unsurprisingly, they looked down on the role of worker.

Participants who had both status and power did not greatly demean their partners. They chose an average of 0.67 demeaning activities for those partners to perform. Low-power/low-status and low-power/high-status participants behaved similarly. They chose, on average, 0.67 and 0.85 demeaning activities. However, participants who were low in status but high in power—the classic “little Hitler” combination—chose an average of 1.12 deeply demeaning tasks for their partners to engage in. That was a highly statistically significant distinction.

Of course, not everybody in the high-power/low-status quadrant of the experiment behaved badly. Underlying personality may still have a role. But as with previous experiments in which random members of the public have been asked to play prison guard or interrogator, Dr Fast’s result suggests that many quite ordinary people will succumb to bad behaviour if the circumstances are right.

[div class=attrib]Read more here.[end-div]

[div class=attrib]Image courtesy of the Economist / Getty Images.[end-div]

The Cult of the Super Person

It is undeniable that there is ever-increasing societal pressure on children to perform, compete, achieve and succeed, and to do so at ever younger ages. However, while average college admission test scores have improved, it is also arguable that admission standards have dropped. So the picture painted by James Atlas in the article below is far from clear. Nonetheless, it’s disturbing that our children get less and less time to dream, play, explore and get dirty.

[div class=attrib]From the New York Times:[end-div]

A BROCHURE arrives in the mail announcing this year’s winners of a prestigious fellowship to study abroad. The recipients are allotted a full page each, with a photo and a thick paragraph chronicling their achievements. It’s a select group to begin with, but even so, there doesn’t seem to be anyone on this list who hasn’t mastered at least one musical instrument; helped build a school or hospital in some foreign land; excelled at a sport; attained fluency in two or more languages; had both a major and a minor, sometimes two, usually in unrelated fields (philosophy and molecular science, mathematics and medieval literature); and yet found time — how do they have any? — to enjoy such arduous hobbies as mountain biking and white-water kayaking.

Let’s call this species Super Person.

Do we have some anomalous cohort here? Achievement freaks on a scale we haven’t seen before? Has our hysterically competitive, education-obsessed society finally outdone itself in its tireless efforts to produce winners whose abilities are literally off the charts? And if so, what convergence of historical, social and economic forces has been responsible for the emergence of this new type? Why does Super Person appear among us now?

Perhaps there’s an evolutionary cause, and these robust intellects reflect the leap in the physical development of humans that we ascribe to better diets, exercise and other forms of health-consciousness. (Stephen Jay Gould called this mechanism “extended scope.”) All you have to do is watch a long rally between Novak Djokovic and Rafael Nadal to recognize — if you’re old enough — how much faster the sport has become over the last half century.

The Super Person training for the college application wars is the academic version of the Super Person slugging it out on the tennis court. For wonks, Harvard Yard is Arthur Ashe Stadium.

Preparing for Super Personhood begins early. “We see kids who’ve been training from an early age,” says Charles Bardes, chairman of admissions at Weill Cornell Medical College. “The bar has been set higher. You have to be at the top of the pile.”

And to clamber up there you need a head start. Thus the well-documented phenomenon of helicopter parents. In her influential book “Perfect Madness: Motherhood in the Age of Anxiety,” Judith Warner quotes a mom who gave up her career to be a full-time parent: “The children are the center of the household and everything goes around them. You want to do everything and be everything for them because this is your job now.” Bursting with pent-up energy, the mothers transfer their shelved career ambitions to their children. Since that book was published in 2005, the situation has only intensified. “One of my daughter’s classmates has a pilot’s license; 12-year-olds are taking calculus,” Ms. Warner said last week.

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Image courtesy of Mark Todd. New York Times.[end-div]

Art Criticism at its Best

[div class=attrib]From Jonathan Jones over at the Guardian:[end-div]

Works of art are not objects. They are … Oh lord, what are they? Take, for convenience, a painting. It is a physical object, obviously, in that it consists of a wooden panel or a stretched canvas covered in daubs of colour. Depending on the light you may be more or less aware of cracks, brush marks, different layers of paint. Turn it around and it is even more obviously a physical object. But as such it is not art. Only when it is experienced as art can it be called art, and the intensity and value of that experience varies according to the way it is made and the way it is seen, that is, the receptiveness of the beholder to that particular work of art.

And this is why critics are the only real art writers. We are the only ones who acknowledge, as a basic principle, that art is an unstable category – it lives or dies according to rules that cannot ever be systematised. If you treat art in a pseudo-scientific way, as some kinds of art history do, you miss everything that makes it matter. Only on the hoof can it be caught, or rather followed on its elusive meanderings in and out of meaning, significance, and beauty.

Equally, an uncritical, purely literary approach to art also risks missing the whole point about it. You have to be critical, not just belle-lettriste, to get to the pulse of art. To respond to a work is to compare it with other works, and that comparison only has meaning if you judge their relative merits.

No such judgment is final. No critic is right, necessarily. It’s just that criticism offers a more honest and realistic understanding of the deep strangeness of our encounters with these mysterious human creations called works of art.

That is why the really great art historians were critics, who never fought shy of judgment. Kenneth Clark and EH Gombrich were extremely opinionated about what is and is not good art. Were they right or wrong? That is irrelevant. The response of one passionate and critical writer is worth a hundred, or a thousand, uncritical surveys that, by refusing to come off the fence, never get anywhere near the life of art.

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Photograph of John Ruskin, circa 1870. Image courtesy of W. & D. Downey / Wikipedia.[end-div]

MondayPoem: Immortal Autumn

The Autumnal Equinox finally ushers in some cooler temperatures for the northern hemisphere, and with that we reflect on this most human of seasons, courtesy of a poem by Archibald MacLeish.

By Archibald MacLeish:

– Immortal Autumn

I speak this poem now with grave and level voice
In praise of autumn, of the far-horn-winding fall.

I praise the flower-barren fields, the clouds, the tall
Unanswering branches where the wind makes sullen noise.

I praise the fall: it is the human season.
Now

No more the foreign sun does meddle at our earth,
Enforce the green and bring the fallow land to birth,
Nor winter yet weigh all with silence the pine bough,

But now in autumn with the black and outcast crows
Share we the spacious world: the whispering year is gone:
There is more room to live now: the once secret dawn
Comes late by daylight and the dark unguarded goes.

Between the mutinous brave burning of the leaves
And winter’s covering of our hearts with his deep snow
We are alone: there are no evening birds: we know
The naked moon: the tame stars circle at our eaves.

It is the human season. On this sterile air
Do words outcarry breath: the sound goes on and on.
I hear a dead man’s cry from autumn long since gone.

I cry to you beyond upon this bitter air.

Is Our Children Learning: Testing the Standardized Tests

Test grades once measured student performance. Nowadays test grades are used to measure the performance of teachers and parents, educational institutions, and even entire nations. Gary Gutting over at The Stone forum has some instructive commentary.

[div class=attrib]From the New York Times:[end-div]

So what exactly do test scores tell us?

Poor test scores are the initial premises in most current arguments for educational reform.  At the end of last year, reading scores that showed American 15-year-olds in the middle of an international pack, led by Asian countries, prompted calls from researchers and educators for immediate action.  This year two sociologists, Richard Arum and Josipa Roksa, showed that 45 percent of students, after two years of college, have made no significant gains on a test of critical thinking.  Last week’s report of falling SAT scores is the latest example.

Given poor test results, many critics conclude that our schools are failing and propose plans for immediate action. For example, when Arum and Roksa published their results, many concluded that college teachers need to raise standards in their courses, requiring more hours of study and assigning longer papers.

It is, however, not immediately obvious what follows from poor test scores.  Without taking any position about the state of our schools or how, if at all, they need reform, I want to reflect on what we need to add to the fact of poor scores to construct an argument for changing the way we educate.

The first question is whether a test actually tests for things that we want students to know.   We very seldom simply want students to do well on a test for its own sake.

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Image courtesy of U.S. College Search.[end-div]

Map Your Favorite Red (Wine)

This season’s Beaujolais Nouveau is just over a month away, so what better way to pave the road to French wines than a viticultural map? The wine map is based on Harry Beck’s iconic 1930s design for the London Underground (Tube) map.

[div class=attrib]From Frank Jacobs at Strange Maps:[end-div]

The coloured lines on this wine map denote the main wine-producing regions in France, the dots are significant cities or towns in those regions. Names that branch off from the main line via little streaks are the so-called appellations.

This schematic approach is illuminating for non-aficionados. In the first place, it clarifies the relation between region and appellation. For example: Médoc, Margaux and St-Emilion are three wines from the same region. So they are all Bordeaux wines, but each with its own appellation.

Secondly, it provides a good indication of the geographic relation between appellations within regions. Chablis and Nuits-St-Georges are northern Burgundy wines, while Beaujolais is a southern one. It also permits some comparison between regions: Beaujolais, although a Burgundy, neighbours Côte Rôtie, a northern Rhône Valley wine.

And lastly, it provides the names of the main grape varieties used in each region (the white ones italicised), like merlot or chardonnay.
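For the programmatically inclined, the relationships the map encodes can be captured in a few lines of code. The sketch below is a toy illustration using only the examples named above (the layout and names are ours, not the map’s underlying data), showing how each region acts as a “line” from which its appellations branch off.

```python
# Toy illustration of the region -> appellation structure the wine map encodes.
# Only examples mentioned in the excerpt above are included.
wine_map = {
    "Bordeaux": {
        "appellations": ["Médoc", "Margaux", "St-Emilion"],
        "grapes": ["merlot"],        # among others
    },
    "Burgundy": {
        "appellations": ["Chablis", "Nuits-St-Georges", "Beaujolais"],
        "grapes": ["chardonnay"],    # the italicised whites; among others
    },
}

def region_of(appellation):
    """Return the region ('line') that an appellation branches off from."""
    for region, info in wine_map.items():
        if appellation in info["appellations"]:
            return region
    return None

print(region_of("Margaux"))  # -> 'Bordeaux': same line as Médoc and St-Emilion
```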

Which Couch, the Blue or White? Stubbornness and Social Pressure

Counterintuitive results show that we are more likely to resist changing our minds when more people tell us that we are wrong. A team of researchers from HP’s Social Computing Research Group found that humans are more likely to change their minds when fewer, rather than more, people disagree with them.

[div class=attrib]From HP:[end-div]

The research has practical applications for businesses, especially in marketing, suggests co-author Bernardo Huberman,  Senior HP Fellow and director of HP’s Social Computing Research Group.

“What this implies,” he says, “is that rather than overwhelming consumers with strident messages about an alternative product or service, in social media, gentle reporting of a few people having chosen that product or service can be more persuasive.”

The experiment – devised by Huberman along with Haiyi Zhu, an HP Labs summer intern from Carnegie Mellon University, and Yarun Luon of HP Labs – reveals several other factors that determine whether choices can be reversed through social influence, too. It’s the latest product of HP Labs’ pioneering program in social computing, which is dedicated to creating software and algorithms that provide meaningful context to huge sets of unstructured data.

Study results: the power of opinion
Opinions and product ratings are everywhere online. But when do they actually influence our own choices?

To find out, the HP team asked several hundred people to make a series of choices between two different pieces of furniture.  After varying amounts of time, they were asked to choose again between the same items, but this time they were told that a certain number of other people had preferred the opposite item.  (Separately, the experiment also asked subjects to choose between two different baby pictures, to control for variance in subject matter).

Analysis of the resulting choices showed that receiving a small amount of social pressure to reverse one’s opinion (by being told that just a few people had chosen differently) was more likely to produce a reversed vote than when the pressure felt was much greater (i.e., where an overwhelming number of people were shown as having made a different choice).

The team also discovered:

– People were more likely to be influenced if they weren’t prompted to change their mind immediately after they had expressed their original preference.
– The more time that people spent on their choice, the more likely they were to reverse that choice and conform to the opinion of others later on.
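For the curious, here is a minimal sketch of the kind of analysis described above: group trials by how many dissenting opinions the subject was shown, then compute the share of reversed choices in each group. The records and field names below are invented placeholders for illustration, not the HP team’s data.

```python
from collections import defaultdict

# Hypothetical trial records: how many people were shown disagreeing,
# and whether the subject reversed their original choice.
trials = [
    {"dissenters": 2, "reversed": True},
    {"dissenters": 2, "reversed": False},
    {"dissenters": 2, "reversed": True},
    {"dissenters": 20, "reversed": False},
    {"dissenters": 20, "reversed": False},
    {"dissenters": 20, "reversed": True},
]

def reversal_rate_by_pressure(trials):
    """Group trials by social-pressure level and compute the share reversed."""
    counts = defaultdict(lambda: [0, 0])  # dissenters -> [reversed, total]
    for t in trials:
        counts[t["dissenters"]][0] += t["reversed"]
        counts[t["dissenters"]][1] += 1
    return {k: rev / total for k, (rev, total) in counts.items()}

print(reversal_rate_by_pressure(trials))
# -> a higher reversal rate in the low-pressure group, the pattern the
#    study reports (the numbers here are placeholders, not results).
```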

[div class=attrib]More of this fascinating article here.[end-div]

Complex Decision To Make? Go With the Gut

Over the last couple of years a number of researchers have upended conventional wisdom by finding that complex decisions, for instance those involving lots of variables, are better “made” through our emotional system. This flies in the face of the commonly held belief that complexity is best handled by our rational side.

[div class=attrib]Jonah Lehrer over at the Frontal Cortex brings us up to date on current thinking.[end-div]

We live in a world filled with difficult decisions. In fact, we’ve managed to turn even trivial choices – say, picking a toothpaste – into a tortured mental task, as the typical supermarket has more than 200 different dental cleaning options. Should I choose a toothpaste based on fluoride content? Do I need a whitener in my toothpaste? Is Crest different than Colgate? The end result is that the banal selection becomes cognitively demanding, as I have to assess dozens of alternatives and take an array of variables into account. And it’s not just toothpaste: The same thing has happened to nearly every consumption decision, from bottled water to blue jeans to stocks. There are no simple choices left – capitalism makes everything complicated.

How should we make all these hard choices? How does one navigate a world of seemingly infinite alternatives? For thousands of years, the answer has seemed obvious: when faced with a difficult dilemma, we should carefully assess our options and spend a few moments consciously deliberating the information. Then, we should choose the toothpaste that best fits our preferences. This is how we maximize utility and get the most bang for the buck. We are rational agents – we should make decisions in a rational manner.

But what if rationality backfires? What if we make better decisions when we trust our gut instincts? While there is an extensive literature on the potential wisdom of human emotion, it’s only in the last few years that researchers have demonstrated that the emotional system (aka Type 1 thinking) might excel at complex decisions, or those involving lots of variables. If true, this would suggest that the unconscious is better suited for difficult cognitive tasks than the conscious brain, that the very thought process we’ve long disregarded as irrational and impulsive might actually be “smarter” than reasoned deliberation. This is largely because the unconscious is able to handle a surfeit of information, digesting the facts without getting overwhelmed. (Human reason, in contrast, has a very strict bottleneck and can only process about four bits of data at any given moment.) When confused in the toothpaste aisle, bewildered by all the different options, we should go with the product that feels the best.

The most widely cited demonstration of this theory is a 2006 Science paper led by Ap Dijksterhuis. (I wrote about the research in How We Decide.) The experiment went like this: Dijksterhuis got together a group of Dutch car shoppers and gave them descriptions of four different used cars. Each of the cars was rated in four different categories, for a total of sixteen pieces of information. Car number 1, for example, was described as getting good mileage, but had a shoddy transmission and poor sound system. Car number 2 handled poorly, but had lots of legroom. Dijksterhuis designed the experiment so that one car was objectively ideal, with “predominantly positive aspects”. After showing people these car ratings, Dijksterhuis then gave them a few minutes to consciously contemplate their decision. In this “easy” situation, more than fifty percent of the subjects ended up choosing the best car.
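To make the setup concrete, here is a minimal sketch of the “easy” condition as described: four cars, four attributes apiece (sixteen pieces of information), with one car given predominantly positive aspects. The particular attribute values are placeholders for illustration, not Dijksterhuis’s actual materials.

```python
# Each car gets four binary ratings (True = positive aspect), sixteen pieces
# of information in total, as in the described design. The specific values,
# and which car comes out on top, are illustrative placeholders only.
cars = {
    "Car 1": {"mileage": True,  "transmission": False, "sound system": False, "legroom": True},
    "Car 2": {"mileage": False, "transmission": True,  "sound system": False, "legroom": True},
    "Car 3": {"mileage": True,  "transmission": True,  "sound system": True,  "legroom": True},
    "Car 4": {"mileage": False, "transmission": False, "sound system": True,  "legroom": False},
}

def objectively_best(cars):
    """Return the car with the most positive aspects -- the 'objectively ideal' option."""
    return max(cars, key=lambda name: sum(cars[name].values()))

print(objectively_best(cars))  # -> 'Car 3', the placeholder car with predominantly positive aspects
```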

[div class=attrib]Read more of the article and Ap Dijksterhuis’ classic experiment here.[end-div]

[div class=attrib]Image courtesy of CustomerSpeak.[end-div]

Movies in the Mind: A Great Leap in Brain Imaging

A familiar premise of the “mad scientist” in science fiction movies: a computer reconstructs video images from someone’s thoughts via a brain-scanning device. Yet this is no longer the realm of fantasy. Researchers from the University of California at Berkeley have successfully decoded and reconstructed people’s dynamic visual experiences – in this case, watching Hollywood movie trailers – using functional Magnetic Resonance Imaging (fMRI) and computer simulation models.

Watch the stunning video clip below showing side-by-side movies of what a volunteer was actually watching and a computer reconstruction of fMRI data from the same volunteer.

[youtube]nsjDnYxJ0bo[/youtube]

The results are a rudimentary first step, and the technology will require decades of refinement before the fiction of movies such as Brainstorm comes close to reality. However, this groundbreaking research nonetheless paves the way to a future of tremendous promise in brain science. Imagine the ability to reproduce and share images of our dreams and memories, or to peer into the brain of a comatose patient.

[div class=attrib]More from the UC-Berkeley article here.[end-div]

How Will You Die?

Bad news and good news. First, the bad news. If you’re between 45 and 54 years of age, your cause of death will most likely be heart disease, that is, if you’re a male. If you are a female, on the other hand, you’re more likely to fall prey to cancer. And, interestingly, you are about five times more likely to die falling down stairs than from (accidental) electrocution. Now the good news. While the data may give us a probabilistic notion of how we may perish, no one (yet) knows when.

More vital statistics courtesy of this macabre infographic, derived from data from the National Center for Health Statistics and the National Safety Council.

Chance as a Subjective or Objective Measure

[div class=attrib]From Rationally Speaking:[end-div]

Stop me if you’ve heard this before: suppose I flip a coin, right now. I am not giving you any other information. What odds (or probability, if you prefer) do you assign that it will come up heads?

If you would happily say “Even” or “1 to 1” or “Fifty-fifty” or “probability 50%” — and you’re clear on WHY you would say this — then this post is not aimed at you, although it may pleasantly confirm your preexisting opinions as a Bayesian on probability. Bayesians, broadly, consider probability to be a measure of their state of knowledge about some proposition, so that different people with different knowledge may correctly quote different probabilities for the same proposition.

If you would say something along the lines of “The question is meaningless; probability only has meaning as the many-trials limit of frequency in a random experiment,” or perhaps “50%, but only given that a fair coin and fair flipping procedure is being used,” this post is aimed at you. I intend to try to talk you out of your Frequentist view; the view that probability exists out there and is an objective property of certain physical systems, which we humans, merely fallibly, measure.

My broader aim is therefore to argue that “chance” is always and everywhere subjective — a result of the limitations of minds — rather than objective in the sense of actually existing in the outside world.
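A toy sketch of the two readings, for concreteness (ours, not the article’s): the frequentist identifies the probability with the long-run frequency of heads over many repeated flips, which is why the simulation below has to assume a physical bias before it can run at all, while the Bayesian simply assigns one half to express a symmetric state of knowledge about a single flip.

```python
import random

# Frequentist reading: probability as the long-run frequency of heads over
# many repeated flips. Note the simulation must presuppose a physical bias
# (p_heads) -- the "objective property" the frequentist view relies on.
def long_run_frequency(n_flips=100_000, p_heads=0.5, seed=0):
    rng = random.Random(seed)
    heads = sum(rng.random() < p_heads for _ in range(n_flips))
    return heads / n_flips

# Bayesian reading: with no information distinguishing heads from tails,
# a symmetric state of knowledge assigns 1/2 to each outcome of one flip.
def bayesian_assignment(outcomes=("heads", "tails")):
    return {o: 1 / len(outcomes) for o in outcomes}

print(long_run_frequency())   # ~0.5 after many trials
print(bayesian_assignment())  # {'heads': 0.5, 'tails': 0.5} from symmetry alone
```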

[div class=attrib]Much more of this article here.[end-div]

[div class=attrib]Image courtesy of Wikipedia.[end-div]