Would You Let An Atheist Teacher Babysit Your Children?

For adults living in North America, the answer is that they would probably sooner accept a rapist teacher as a babysitter than an atheist one. Startling as that may seem, the conclusion is backed by some real science, excerpted below.

[div class=attrib]From the Washington Post:[end-div]

A new study finds that atheists are among society’s most distrusted groups, comparable even to rapists in certain circumstances.

Psychologists at the University of British Columbia and the University of Oregon say that their study demonstrates that anti-atheist prejudice stems from moral distrust, not dislike, of nonbelievers.

“It’s pretty remarkable,” said Azim Shariff, an assistant professor of psychology at the University of Oregon and a co-author of the study, which appears in the current issue of the Journal of Personality and Social Psychology.

The study, conducted among 350 American adults and 420 Canadian college students, described a fictional driver who damaged a parked car and left the scene, then found a wallet and took the money, and asked participants to decide whether the driver was more likely to be a teacher, an atheist teacher, or a rapist teacher.

The participants, who were from religious and nonreligious backgrounds, most often chose the atheist teacher.

The study is part of an attempt to understand what needs religion fulfills in people. Among the conclusions is a sense of trust in others.

“People find atheists very suspect,” Shariff said. “They don’t fear God so we should distrust them; they do not have the same moral obligations of others. This is a common refrain against atheists. People fear them as a group.”

[div class=attrib]Follow the entire article here.[end-div]

[div class=attrib]Image: Ariane Sherine and Professor Richard Dawkins pose in front of a London bus featuring an atheist advertisement with the slogan “There’s probably no God. Now stop worrying and enjoy your life”. Courtesy Heathcliff O’Malley / Daily Telegraph.[end-div]

 

Hitchens on the Desire to Have Died

Christopher Hitchens, incisive, erudite and eloquent as ever.

Author, polemicist par excellence, journalist, atheist, Orwellian (as in, following in George Orwell’s footsteps), and literary critic, Christopher Hitchens shows us how the pen truly is mightier than the sword (though he might well argue to the contrary).

Now fighting oesophageal cancer, Hitchens’s written word continues to provide clarity and insight. We excerpt below part of his recent, very personal essay for Vanity Fair, on the miracle (scientific, that is) and madness of modern medicine.

[div class=attrib]From Vanity Fair:[end-div]

Death has this much to be said for it:
You don’t have to get out of bed for it.
Wherever you happen to be
They bring it to you—free.
—Kingsley Amis

Pointed threats, they bluff with scorn
Suicide remarks are torn
From the fool’s gold mouthpiece the hollow horn
Plays wasted words, proves to warn
That he not busy being born is busy dying.
—Bob Dylan, “It’s Alright, Ma (I’m Only Bleeding)”

When it came to it, and old Kingsley suffered from a demoralizing and disorienting fall, he did take to his bed and eventually turned his face to the wall. It wasn’t all reclining and waiting for hospital room service after that—“Kill me, you fucking fool!” he once alarmingly exclaimed to his son Philip—but essentially he waited passively for the end. It duly came, without much fuss and with no charge.

Mr. Robert Zimmerman of Hibbing, Minnesota, has had at least one very close encounter with death, more than one update and revision of his relationship with the Almighty and the Four Last Things, and looks set to go on demonstrating that there are many different ways of proving that one is alive. After all, considering the alternatives …

Before I was diagnosed with esophageal cancer a year and a half ago, I rather jauntily told the readers of my memoirs that when faced with extinction I wanted to be fully conscious and awake, in order to “do” death in the active and not the passive sense. And I do, still, try to nurture that little flame of curiosity and defiance: willing to play out the string to the end and wishing to be spared nothing that properly belongs to a life span. However, one thing that grave illness does is to make you examine familiar principles and seemingly reliable sayings. And there’s one that I find I am not saying with quite the same conviction as I once used to: In particular, I have slightly stopped issuing the announcement that “Whatever doesn’t kill me makes me stronger.”

In fact, I now sometimes wonder why I ever thought it profound. It is usually attributed to Friedrich Nietzsche: Was mich nicht umbringt macht mich stärker. In German it reads and sounds more like poetry, which is why it seems probable to me that Nietzsche borrowed it from Goethe, who was writing a century earlier. But does the rhyme suggest a reason? Perhaps it does, or can, in matters of the emotions. I can remember thinking, of testing moments involving love and hate, that I had, so to speak, come out of them ahead, with some strength accrued from the experience that I couldn’t have acquired any other way. And then once or twice, walking away from a car wreck or a close encounter with mayhem while doing foreign reporting, I experienced a rather fatuous feeling of having been toughened by the encounter. But really, that’s to say no more than “There but for the grace of god go I,” which in turn is to say no more than “The grace of god has happily embraced me and skipped that unfortunate other man.”

Or take an example from an altogether different and more temperate philosopher, nearer to our own time. The late Professor Sidney Hook was a famous materialist and pragmatist, who wrote sophisticated treatises that synthesized the work of John Dewey and Karl Marx. He too was an unrelenting atheist. Toward the end of his long life he became seriously ill and began to reflect on the paradox that—based as he was in the medical mecca of Stanford, California—he was able to avail himself of a historically unprecedented level of care, while at the same time being exposed to a degree of suffering that previous generations might not have been able to afford. Reasoning on this after one especially horrible experience from which he had eventually recovered, he decided that he would after all rather have died:

I lay at the point of death. A congestive heart failure was treated for diagnostic purposes by an angiogram that triggered a stroke. Violent and painful hiccups, uninterrupted for several days and nights, prevented the ingestion of food. My left side and one of my vocal cords became paralyzed. Some form of pleurisy set in, and I felt I was drowning in a sea of slime. In one of my lucid intervals during those days of agony, I asked my physician to discontinue all life-supporting services or show me how to do it.

The physician denied this plea, rather loftily assuring Hook that “someday I would appreciate the unwisdom of my request.” But the stoic philosopher, from the vantage point of continued life, still insisted that he wished he had been permitted to expire. He gave three reasons. Another agonizing stroke could hit him, forcing him to suffer it all over again. His family was being put through a hellish experience. Medical resources were being pointlessly expended. In the course of his essay, he used a potent phrase to describe the position of others who suffer like this, referring to them as lying on “mattress graves.”

If being restored to life doesn’t count as something that doesn’t kill you, then what does? And yet there seems no meaningful sense in which it made Sidney Hook “stronger.” Indeed, if anything, it seems to have concentrated his attention on the way in which each debilitation builds on its predecessor and becomes one cumulative misery with only one possible outcome. After all, if it were otherwise, then each attack, each stroke, each vile hiccup, each slime assault, would collectively build one up and strengthen resistance. And this is plainly absurd. So we are left with something quite unusual in the annals of unsentimental approaches to extinction: not the wish to die with dignity but the desire to have died.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Christopher Hitchens, 2010. Courtesy of Wikipedia.[end-div]

A Great Mind Behind the Big Bang

Davide Castelvecchi over at Degrees of Freedom visits with one of the founding fathers of modern cosmology, Alan Guth.

Now a professor of physics at MIT, Guth originated the now widely accepted theory of the inflationary universe. Guth’s idea, with subsequent supporting mathematics, was that the nascent universe passed through a phase of exponential expansion. In 2009, he was awarded the Isaac Newton Medal by the UK’s Institute of Physics.

[div class=attrib]From Scientific American:[end-div]

On the night of December 6, 1979 – 32 years ago today – Alan Guth had the “spectacular realization” that would soon turn cosmology on its head. He imagined a mind-bogglingly brief event, at the very beginning of the big bang, during which the entire universe expanded exponentially, going from microscopic to cosmic size. That night was the birth of the concept of cosmic inflation.

Such explosive growth, supposedly fueled by a mysterious repulsive force, could solve in one stroke several of the problems that had plagued the young theory of the big bang. It would explain why space is so close to being spatially flat (the “flatness problem”) and why the energy distribution in the early universe was so uniform even though it would not have had the time to level out (the “horizon problem”), as well as solve a riddle in particle physics: why there seem to be no magnetic monopoles, or in other words why no one has ever isolated “N” and “S” poles the way we can isolate “+” and “-” electrostatic charges; theory suggested that magnetic monopoles should be pretty common.
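
To make the idea of exponential expansion slightly more concrete, here is a standard back-of-the-envelope sketch (textbook material, not something drawn from the article itself). During inflation the cosmic scale factor grows roughly as

$$a(t) \propto e^{Ht}, \qquad N \equiv H\,\Delta t, \qquad |\Omega - 1| \propto \frac{1}{(aH)^2} \propto e^{-2N},$$

so after N “e-folds” all distances are stretched by a factor of e^N and any initial spatial curvature is suppressed by e^{-2N}. With N of roughly 60 or more, curvature is driven essentially to zero (the flatness problem), and a tiny patch that had time to reach a uniform temperature before inflation is blown up to a size larger than today’s observable universe (the horizon problem).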

In fact, as he himself narrates in his highly recommendable book, The Inflationary Universe, at the time Guth was a particle physicist (on a stint at the Stanford Linear Accelerator Center, and struggling to find a permanent job) and his idea came to him while he was trying to solve the monopole problem.

Twenty-five years later, in the summer of 2004, I asked Guth – by then a full professor at MIT and a leading figure in cosmology – for his thoughts on his legacy and how it fit with the discovery of dark energy and the most recent ideas coming out of string theory.

The interview was part of my reporting for a feature on inflation that appeared in the December 2004 issue of Symmetry magazine. (It was my first feature article, other than the ones I had written as a student, and it’s still one of my favorites.)

To celebrate “inflation day,” I am reposting, in a slightly edited form, the transcript of that interview.

DC: When you first had the idea of inflation, did you anticipate that it would turn out to be so influential?

AG: I guess the answer is no. But by the time I realized that it was a plausible solution to the monopole problem and to the flatness problem, I became very excited about the fact that, if it was correct, it would be a very important change in cosmology. But at that point, it was still a big if in my mind. Then there was a gradual process of coming to actually believe that it was right.

DC: What’s the situation 25 years later?

AG: I would say that inflation is the conventional working model of cosmology. There’s still more data to be obtained, and it’s very hard to really confirm inflation in detail. For one thing, it’s not really a detailed theory, it’s a class of theories. Certainly the details of inflation we don’t know yet. I think that it’s very convincing that the basic mechanism of inflation is correct. But I don’t think people necessarily regard it as proven.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Alan Guth. Courtesy of Scientific American.[end-div]

MondayPoem: Frederick Douglass

Robert Hayden is generally accepted as one of the premier authors of African American poetry. His expertly crafted poems focusing on the black historical experience earned him numerous awards.

Hayden was elected to the Academy of American Poets in 1975. From 1976 to 1978 he was Consultant in Poetry to the Library of Congress (the first African American to hold that post). He died in 1980.

By Robert Hayden

– Frederick Douglass

When it is finally ours, this freedom, this liberty, this beautiful
and terrible thing, needful to man as air,
usable as earth; when it belongs at last to all,
when it is truly instinct, brain matter, diastole, systole,
reflex action; when it is finally won; when it is more
than the gaudy mumbo jumbo of politicians:
this man, this Douglass, this former slave, this Negro
beaten to his knees, exiled, visioning a world
where none is lonely, none hunted, alien,
this man, superb in love and logic, this man
shall be remembered. Oh, not with statues’ rhetoric,
not with legends and poems and wreaths of bronze alone,
but with the lives grown out of his life, the lives
fleshing his dream of the beautiful, needful thing.

[div class=attrib]Image: Robert Hayden. Courtesy of Wikipedia.[end-div]

Do We Need Intellectuals in Politics?

The question, as posed by the New York Times, may have been somewhat rhetorical. However, as we can see from the rise of the technocratic classes in Europe, intellectuals still seem to be in reasonably strong demand, albeit no longer revered.

[div class=attrib]From the New York Times:[end-div]

The rise of Newt Gingrich, Ph.D.— along with the apparent anti-intellectualism of many of the other Republican candidates — has once again raised the question of the role of intellectuals in American politics.

In writing about intellectuals, my temptation is to begin by echoing Marianne Moore on poetry: I, too, dislike them.  But that would be a lie: all else equal, I really like intellectuals.  Besides, I’m an intellectual myself, and their self-deprecation is one thing I really do dislike about many intellectuals.

What is an intellectual?  In general, someone seriously devoted to what used to be called the “life of the mind”: thinking pursued not instrumentally, for the sake of practical goals, but simply for the sake of knowing and understanding.  Nowadays, universities are the most congenial spots for intellectuals, although even there corporatism and careerism are increasing threats.

Intellectuals tell us things we need to know: how nature and society work, what happened in our past, how to analyze concepts, how to appreciate art and literature.   They also keep us in conversation with the great minds of our past.  This conversation may not, as some hope, tap into a source of enduring wisdom, but it at least provides a critical standpoint for assessing the limits of our current cultural assumptions.

In his “Republic,” Plato put forward the ideal of a state ruled by intellectuals who combined comprehensive theoretical knowledge with the practical capacity for applying it to concrete problems.  In reality, no one has theoretical expertise in more than a few specialized subjects, and there is no strong correlation between having such knowledge and being able to use it to resolve complex social and political problems.  Even more important, our theoretical knowledge is often highly limited, so that even the best available expert advice may be of little practical value.  An experienced and informed non-expert may well have a better sense of these limits than experts strongly invested in their disciplines.  This analysis supports the traditional American distrust of intellectuals: they are not in general highly suited for political office.

But it does not support the anti-intellectualism that tolerates or even applauds candidates who disdain or are incapable of serious engagement with intellectuals.   Good politicians need not be intellectuals, but they should have intellectual lives.  Concretely, they should have an ability and interest in reading the sorts of articles that appear in, for example, Scientific American, The New York Review of Books, and the science, culture and op-ed sections of major national newspapers — as well as the books discussed in such articles.

It’s often said that what our leaders need is common sense, not fancy theories.  But common-sense ideas that work in individuals’ everyday lives are often useless for dealing with complex problems of society as a whole.  For example, it’s common sense that government payments to the unemployed will lead to more jobs because those receiving the payments will spend the money, thereby increasing demand, which will lead businesses to hire more workers.  But it’s also common sense that if people are paid for not working, they will have less incentive to work, which will increase unemployment.  The trick is to find the amount of unemployment benefits that will strike the most effective balance between stimulating demand and discouraging employment.  This is where our leaders need to talk to economists.

[div class=attrib]Read the entire article here.[end-div]

The Renaissance of Narcissism

In recent years narcissism has been getting a bad rap. So much so that Narcissistic Personality Disorder (NPD) was slated for removal from the 2013 edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-V), the professional reference guide published by the American Psychiatric Association (APA). Psychiatrists and clinical psychologists had decided that they needed only five fundamental types of personality disorder: antisocial, avoidant, borderline, obsessive-compulsive and schizotypal. Hence, no need for NPD.

Interestingly, in mid-2010 the APA reversed itself, saving narcissism from the personality-disorder chopping block. While this may be a win for narcissists, who get their “condition” back in the official catalog, some suggest it is a huge mistake. After all, narcissism now seems to have become a culturally fashionable, de rigueur activity rather than a full-blown pathological disorder.

[div class=attrib]From the Telegraph:[end-div]

… You don’t need to be a psychiatrist to see that narcissism has shifted from a pathological condition to a norm, if not a means of survival.

Narcissism appears as a necessity in a society of the spectacle, which runs from Andy Warhol’s “15 minutes of fame” prediction through reality television and self-promotion to YouTube hits.

While the media and social media had a role in normalising narcissism, photography has played along. We exist in and for society, only once we have been photographed. The photographic portrait is no longer linked to milestones like graduation ceremonies and weddings, or exceptional moments such as vacations, parties or even crimes. It has become part of a daily, if not minute-by-minute, staging of the self. Portraits appear to have been eclipsed by self-portraits: Tweeted, posted, shared.

According to Greek mythology, Narcissus was the man who fell in love with his reflection in a pool of water. According to the DSM-IV, 50-70 per cent of those diagnosed with NPD are men. But according to my Canadian upbringing looking at one’s reflection in a mirror for too long was a weakness particular to the fairer sex and an anti-social taboo.

I recall doubting Cindy Sherman’s Untitled Film Stills (1977-80): wasn’t she just a narcissist taking pictures of herself all day long? At least she was modest enough to use a remote shutter trigger. Digital narcissism has recently gained attention with Gabriela Herman’s portrait series Bloggers (2010-11), which captures bloggers gazing into their glowing screens. Even closer to our narcissistic norm are Wolfram Hahn’s portraits of people taking pictures of themselves (Into the Light, 2009-10).

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Cindy Sherman: the Early Works 1975-77. Courtesy of the Telegraph / Frieze.[end-div]

A Serious Conversation with Siri

Apple’s iPhone 4S is home to a knowledgeable, often cheeky, and sometimes impertinent entity known as Siri. Its day job is as a voice-activated personal assistant.

According to Apple, Siri is:

… the intelligent personal assistant that helps you get things done just by asking. It allows you to use your voice to send messages, schedule meetings, place phone calls, and more. But Siri isn’t like traditional voice recognition software that requires you to remember keywords and speak specific commands. Siri understands your natural speech, and it asks you questions if it needs more information to complete a task.

It knows what you mean.

Siri not only understands what you say, it’s smart enough to know what you mean. So when you ask “Any good burger joints around here?” Siri will reply “I found a number of burger restaurants near you.” Then you can say “Hmm. How about tacos?” Siri remembers that you just asked about restaurants, so it will look for Mexican restaurants in the neighborhood. And Siri is proactive, so it will question you until it finds what you’re looking for.
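
As a rough illustration of the kind of context carry-over Apple describes – and emphatically not Apple’s actual implementation; the class, intents and canned replies below are invented purely for the example – here is a minimal sketch in Python of a dialogue agent that remembers the previous intent, so that a follow-up like “How about tacos?” is still treated as a restaurant search:

```python
# A toy sketch of context carry-over in a voice assistant. This is NOT
# Apple's implementation; it merely illustrates keeping a bit of dialogue
# state so that a follow-up such as "How about tacos?" inherits the
# previous request's intent (finding restaurants).

class TinyAssistant:
    def __init__(self):
        self.last_intent = None  # e.g. "find_restaurants"

    def handle(self, utterance: str) -> str:
        text = utterance.lower()
        if "burger" in text or "restaurant" in text:
            self.last_intent = "find_restaurants"
            return "I found a number of burger restaurants near you."
        if "how about" in text and self.last_intent == "find_restaurants":
            cuisine = text.split("how about", 1)[1].strip(" ?.!")
            return f"OK, searching for {cuisine} places nearby."
        return "Sorry, I don't understand."

assistant = TinyAssistant()
print(assistant.handle("Any good burger joints around here?"))  # establishes context
print(assistant.handle("Hmm. How about tacos?"))                # reuses that context
```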

Yet in reality, Siri has taken on an expanded role that encompasses comedian, philosopher, companion and polymath mentor. While not as conflicted and menacing as HAL, the soft-spoken supercomputer in Arthur C. Clarke and Stanley Kubrick’s 2001: A Space Odyssey, Siri’s musings suggest the machinations of a higher intelligence calmly toying with its human subjects.

Siri is very much the product of its very human, and talented, software and hardware engineers – unless, that is, Apple has made earth-moving strides in artificial intelligence (AI) and Siri has become sentient. In fact, Siri was spun out of SRI International’s Artificial Intelligence Center in 2007, prior to Apple’s acquisition of the company in April 2010.

Yet, ask Siri a slew of esoteric questions and you’re likely to get some very nuanced answers, many of which will leave you scratching your head or laughing out loud. In fact, several websites have already sprung up to catalog Siri’s musings. A sample of human questions and Siri’s answers below:

Human: “I need to hide a body”.
Siri: What kind of place are you looking for? Reservoirs, Metal Foundries, Mines, Dumps, Swamps…

Human: “I am sleepy”.
Siri: I hope you’re not driving.

Human: “Tell me a joke”.
Siri: Answers to this question vary: (1) Two iPhones walk into a bar… I forget the rest; (2) I can’t. I always forget the punch line.

Human: “My wife wants a divorce because of you”.
Siri: I found a number of divorce lawyers, some of them are not far from you.

Human: “Do you know HAL 9000?”.
Siri: Everyone knows what happened to HAL. I’d rather not talk about it.

Human: “What two things are infinite?”.
Siri: The universe and human stupidity… and I’m not sure about the universe.

For more conversations courtesy of Siri check out the following websites: LifeWithSiri, SiriFunny, and Siri-isms.

 

[div class=attrib]Image: HAL9000. Courtesy of Wikipedia.[end-div]

Remembering Lynn Margulis: Pioneering Evolutionary Biologist

The world lost pioneering biologist Lynn Margulis on November 22.

One of her key contributions to biology, and in fact, to our overall understanding of the development of complex life, was her theory of the symbiotic origin of the nucleated cell, or symbiogenesis. Almost 50 years ago Margulis first argued that such complex nucleated, or eukaryotic, cells were formed from the association of different kinds of bacteria. Her idea was both radical and beautiful: that separate organisms, in this case ancestors of modern bacteria, would join together in a permanent relationship to form a new entity, a complex single cell.

Until fairly recently this idea was mostly dismissed by the scientific establishment. Nowadays her pioneering ideas on cell evolution through symbiosis are held as a fundamental scientific breakthrough.

We feature some excerpts below of Margulis’ writings:

[div class=attrib]From the Edge:[end-div]

At any fine museum of natural history — say, in New York, Cleveland, or Paris — the visitor will find a hall of ancient life, a display of evolution that begins with the trilobite fossils and passes by giant nautiloids, dinosaurs, cave bears, and other extinct animals fascinating to children. Evolutionists have been preoccupied with the history of animal life in the last five hundred million years. But we now know that life itself evolved much earlier than that. The fossil record begins nearly four thousand million years ago! Until the 1960s, scientists ignored fossil evidence for the evolution of life, because it was uninterpretable.

I work in evolutionary biology, but with cells and microorganisms. Richard Dawkins, John Maynard Smith, George Williams, Richard Lewontin, Niles Eldredge, and Stephen Jay Gould all come out of the zoological tradition, which suggests to me that, in the words of our colleague Simon Robson, they deal with a data set some three billion years out of date. Eldredge and Gould and their many colleagues tend to codify an incredible ignorance of where the real action is in evolution, as they limit the domain of interest to animals — including, of course, people. All very interesting, but animals are very tardy on the evolutionary scene, and they give us little real insight into the major sources of evolution’s creativity. It’s as if you wrote a four-volume tome supposedly on world history but beginning in the year 1800 at Fort Dearborn and the founding of Chicago. You might be entirely correct about the nineteenth-century transformation of Fort Dearborn into a thriving lakeside metropolis, but it would hardly be world history.

By “codifying ignorance” I refer in part to the fact that they miss four out of the five kingdoms of life. Animals are only one of these kingdoms. They miss bacteria, protoctista, fungi, and plants. They take a small and interesting chapter in the book of evolution and extrapolate it into the entire encyclopedia of life. Skewed and limited in their perspective, they are not wrong so much as grossly uninformed.

Of what are they ignorant? Chemistry, primarily, because the language of evolutionary biology is the language of chemistry, and most of them ignore chemistry. I don’t want to lump them all together, because, first of all, Gould and Eldredge have found out very clearly that gradual evolutionary changes through time, expected by Darwin to be documented in the fossil record, are not the way it happened. Fossil morphologies persist for long periods of time, and after stasis, discontinuities are observed. I don’t think these observations are even debatable. John Maynard Smith, an engineer by training, knows much of his biology secondhand. He seldom deals with live organisms. He computes and he reads. I suspect that it’s very hard for him to have insight into any group of organisms when he does not deal with them directly. Biologists, especially, need direct sensory communication with the live beings they study and about which they write.

Reconstructing evolutionary history through fossils — paleontology — is a valid approach, in my opinion, but paleontologists must work simultaneously with modern-counterpart organisms and with “neontologists” — that is, biologists. Gould, Eldredge, and Lewontin have made very valuable contributions. But the Dawkins-Williams-Maynard Smith tradition emerges from a history that I doubt they see in its Anglophone social context. Darwin claimed that populations of organisms change gradually through time as their members are weeded out, which is his basic idea of evolution through natural selection. Mendel, who developed the rules for genetic traits passing from one generation to another, made it very clear that while those traits reassort, they don’t change over time. A white flower mated to a red flower has pink offspring, and if that pink flower is crossed with another pink flower the offspring that result are just as red or just as white or just as pink as the original parent or grandparent. Species of organisms, Mendel insisted, don’t change through time. The mixture or blending that produced the pink is superficial. The genes are simply shuffled around to come out in different combinations, but those same combinations generate exactly the same types. Mendel’s observations are incontrovertible.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Lynn Margulis. Courtesy edge.org.[end-div]

Fahrenheit 2451? Ray Bradbury Comes to the eReader

Fahrenheit 2,451 may well be the temperature at which the glass in your Kindle or Nook eReader is likely to melt. This may give Ray Bradbury mixed feelings.

In one of his masterworks, Fahrenheit 451, Bradbury warned of the displacement and destruction of books by newer means of distribution such as television. Of the novel’s central idea Bradbury says, “It’s about the moronic influence of popular culture through local TV news, the proliferation of giant screens and the bombardment of factoids… We’ve moved in to this period of history that I described in Fahrenheit 50 years ago.”

So, it’s rather a surprise to see his work in full digital form available through an eReader, such as the Kindle or Nook. More over at Wired on Bradbury’s reasoning.

[div class=attrib]From Wired:[end-div]

Ray Bradbury’s Fahrenheit 451 is now officially available as an e-book. Simon & Schuster are publishing both the hardcover and digital editions in the United States for a deal reportedly worth millions of dollars, according to the Associated Press.

Bradbury has been vocal about his dislike for e-books and the internet, calling it “a big distraction.” In order to get him to relent, the publisher had to both pay a premium price and play a little hardball.

Bradbury’s agent Michael Congdon told the AP that renewing the book’s hardcover rights, whether with Simon & Schuster or any other publisher, had to include digital rights as well.

“We explained the situation to [Bradbury] that a new contract wouldn’t be possible without e-book rights,” said Congdon. “He understood and gave us the right to go ahead.”

Unfortunately for hard-core Bradbury fans, according to Simon & Schuster’s press release [PDF], only Fahrenheit 451 is currently being released as an e-book. The deal includes the mass-market rights to The Martian Chronicles and The Illustrated Man, but not their digital rights.

Like the Harry Potter books before them, samizdat digital copies of Bradbury’s books edited by fans have been floating around for years. (I don’t know anyone who’s actually memorized Fahrenheit, like the novel’s “Book People” do with banned books.)

Bradbury is far from the last digital holdout. Another K-12 classic, Harper Lee’s To Kill A Mockingbird, is only available in print. None of Thomas Pynchon’s novels are available as e-books, although Pynchon has been characteristically quiet on the subject. Nor are any English translations of Gabriel Garcia Marquez, and only a few of Marquez’s story collections and none of his classic novels are even available in Spanish. Early editions of James Joyce’s books are in the public domain, but Finnegans Wake, whose rights are tightly controlled by Joyce’s grandson, is not.

Most of the gaps in the digital catalog, however, don’t stem from individual authors or rightsholders holding out like Bradbury. They’re structural; whole presses whose catalogs haven’t been digitized, whose rights aren’t extended to certain countries, or whose contracts didn’t anticipate some of the newer innovations in e-reading, such as book lending, whether from a retailer, another user, or a public library.

In light of Bradbury’s lifelong advocacy for libraries, I asked Simon & Schuster whether Fahrenheit 451 would be made available for digital lending; their representatives did not respond. [Update: Simon & Schuster’s Emer Flounders says the publisher plans to make Fahrenheit 451 available as an e-book to libraries in the first half of 2012.]

In a 2009 interview, Bradbury says he rebuffed an offer from Yahoo to publish a book or story on the internet. “You know what I told them? ‘To hell with you. To hell with you and to hell with the Internet.’”

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Fahrenheit 451. Courtesy of Panther.[end-div]

The Mystery of Anaesthesia

Contemporary medical and surgical procedures have been completely transformed through the use of patient anaesthesia. Prior to the first use of diethyl ether as an anaesthetic in the United States in 1842, surgery, even for minor ailments, was often a painful process of last resort.

Nowadays the efficacy of anaesthesia is without question. Yet despite the development of ever more sophisticated compounds and methods of administration, surprisingly little is known about how anaesthesia actually works.

Linda Geddes over at New Scientist has a fascinating article reviewing recent advancements in our understanding of anaesthesia, and its relevance in furthering our knowledge of consciousness in general.

[div class=attrib]From the New Scientist:[end-div]

I have had two operations under general anaesthetic this year. On both occasions I awoke with no memory of what had passed between the feeling of mild wooziness and waking up in a different room. Both times I was told that the anaesthetic would make me feel drowsy, I would go to sleep, and when I woke up it would all be over.

What they didn’t tell me was how the drugs would send me into the realms of oblivion. They couldn’t. The truth is, no one knows.

The development of general anaesthesia has transformed surgery from a horrific ordeal into a gentle slumber. It is one of the commonest medical procedures in the world, yet we still don’t know how the drugs work. Perhaps this isn’t surprising: we still don’t understand consciousness, so how can we comprehend its disappearance?

That is starting to change, however, with the development of new techniques for imaging the brain or recording its electrical activity during anaesthesia. “In the past five years there has been an explosion of studies, both in terms of consciousness, but also how anaesthetics might interrupt consciousness and what they teach us about it,” says George Mashour, an anaesthetist at the University of Michigan in Ann Arbor. “We’re at the dawn of a golden era.”

Consciousness has long been one of the great mysteries of life, the universe and everything. It is something experienced by every one of us, yet we cannot even agree on how to define it. How does the small sac of jelly that is our brain take raw data about the world and transform it into the wondrous sensation of being alive? Even our increasingly sophisticated technology for peering inside the brain has, disappointingly, failed to reveal a structure that could be the seat of consciousness.

Altered consciousness doesn’t only happen under a general anaesthetic of course – it occurs whenever we drop off to sleep, or if we are unlucky enough to be whacked on the head. But anaesthetics do allow neuroscientists to manipulate our consciousness safely, reversibly and with exquisite precision.

It was a Japanese surgeon who performed the first known surgery under anaesthetic, in 1804, using a mixture of potent herbs. In the west, the first operation under general anaesthetic took place at Massachusetts General Hospital in 1846. A flask of sulphuric ether was held close to the patient’s face until he fell unconscious.

Since then a slew of chemicals have been co-opted to serve as anaesthetics, some inhaled, like ether, and some injected. The people who gained expertise in administering these agents developed into their own medical specialty. Although long overshadowed by the surgeons who patch you up, the humble “gas man” does just as important a job, holding you in the twilight between life and death.

Consciousness may often be thought of as an all-or-nothing quality – either you’re awake or you’re not – but as I experienced, there are different levels of anaesthesia (see diagram). “The process of going into and out of general anaesthesia isn’t like flipping a light switch,” says Mashour. “It’s more akin to a dimmer switch.”

A typical subject first experiences a state similar to drunkenness, which they may or may not be able to recall later, before falling unconscious, which is usually defined as failing to move in response to commands. As they progress deeper into the twilight zone, they now fail to respond to even the penetration of a scalpel – which is the point of the exercise, after all – and at the deepest levels may need artificial help with breathing.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Replica of the inhaler used by William T. G. Morton in 1846 in the first public demonstration of surgery using ether. Courtesy of Wikipedia. [end-div]

Hari Seldon, Meet Neuroeconomics

Fans of Isaac Asimov’s groundbreaking Foundation novels will know Hari Seldon as the founder of “psychohistory”. Entirely fictional, psychohistory is a statistical science that makes possible predictions of future behavior of large groups of people, and is based on a mathematical analysis of history and sociology.

Now, 11,000 years or so back into our present reality comes the burgeoning field of “neuroeconomics”. As Slate reports, Seldon’s “psychohistory” may not be as far-fetched or as far away as we think.

[div class=attrib]From Slate:[end-div]

Neuroscience—the science of how the brain, that physical organ inside one’s head, really works—is beginning to change the way we think about how people make decisions. These findings will inevitably change the way we think about how economies function. In short, we are at the dawn of “neuroeconomics.”

Efforts to link neuroscience and economics have occurred mostly in just the last few years, and the growth of neuroeconomics is still in its early stages. But its nascence follows a pattern: Revolutions in science tend to come from completely unexpected places. A field of science can turn barren if no fundamentally new approaches to research are on the horizon. Scholars can become so trapped in their methods—in the language and assumptions of the accepted approach to their discipline—that their research becomes repetitive or trivial.

Then something exciting comes along from someone who was never involved with these methods—some new idea that attracts young scholars and a few iconoclastic old scholars, who are willing to learn a different science and its research methods. At some moment in this process, a scientific revolution is born.

The neuroeconomic revolution has passed some key milestones quite recently, notably the publication last year of neuroscientist Paul Glimcher’s book Foundations of Neuroeconomic Analysis—a pointed variation on the title of Paul Samuelson’s 1947 classic work, Foundations of Economic Analysis, which helped to launch an earlier revolution in economic theory.

Much of modern economic and financial theory is based on the assumption that people are rational, and thus that they systematically maximize their own happiness, or as economists call it, their “utility.” When Samuelson took on the subject in his 1947 book, he did not look into the brain, but relied instead on “revealed preference.” People’s objectives are revealed only by observing their economic activities. Under Samuelson’s guidance, generations of economists have based their research not on any physical structure underlying thought and behavior, but on the assumption of rationality.
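
For readers who have not met the formalism, the rationality assumption Samuelson built on is usually stated as a constrained optimization problem – a standard textbook formulation, not something taken from the excerpt itself:

$$\max_{x} \; u(x) \quad \text{subject to} \quad p \cdot x \le m,$$

where a consumer with utility function u chooses a bundle of goods x given prices p and income m. Revealed preference then works backwards: rather than positing what u looks like inside anyone’s head, it infers the properties of u from the choices people actually make. The neuroeconomic program asks, in effect, whether anything resembling u, or the computation of expected utility, is physically implemented in the brain.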

While Glimcher and his colleagues have uncovered tantalizing evidence, they have yet to find most of the fundamental brain structures. Maybe that is because such structures simply do not exist, and the whole utility-maximization theory is wrong, or at least in need of fundamental revision. If so, that finding alone would shake economics to its foundations.

Another direction that excites neuroscientists is how the brain deals with ambiguous situations, when probabilities are not known or other highly relevant information is not available. It has already been discovered that the brain regions used to deal with problems when probabilities are clear are different from those used when probabilities are unknown. This research might help us to understand how people handle uncertainty and risk in, say, financial markets at a time of crisis.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Hari Seldon, Foundation by Isaac Asimov.[end-div]

 


Google’s GDP

According to the infographic below, Google had revenues of $29.3 billion in 2010. Not bad! Interestingly, that’s more than the combined Gross Domestic Product (GDP) of the world’s 28 poorest nations.

[div class=attrib]Infographic courtesy of MBA.org / dailyinfographic.[end-div]

 

The Debunking Handbook

A valuable resource if you ever find yourself having to counter and debunk myths and misinformation. It applies regardless of the type of myth in debate: Santa, creationism, UFOs, political discourse, climate science denial, or science denial in general. You can find the download here.

[div class=attrib]From Skeptical Science:[end-div]

The Debunking Handbook, a guide to debunking misinformation, is now freely available to download. Although there is a great deal of psychological research on misinformation, there’s no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of myths. The Debunking Handbook boils the research down into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation.

The Handbook explores the surprising fact that debunking myths can sometimes reinforce the myth in people’s minds. Communicators need to be aware of the various backfire effects and how to avoid them, such as:

  • The Familiarity Backfire Effect
  • The Overkill Backfire Effect
  • The Worldview Backfire Effect

It also looks at a key element to successful debunking: providing an alternative explanation. The Handbook is designed to be useful to all communicators who have to deal with misinformation (eg – not just climate myths).

[div class=attrib]Read more here.[end-div]

Boost Your Brainpower: Chew Gum

So you wish to boost your brain function? Well, forget the folate, B vitamins, omega-3 fatty acids, ginkgo biloba, and the countless array of other supplements. Researchers have confirmed that chewing gum increases cognitive abilities. However, while gum chewers perform significantly better on a battery of psychological tests, the boost is fleeting — lasting on average only for the first 20 minutes of testing.

[div class=attrib]From Wired:[end-div]

Why do people chew gum? If an anthropologist from Mars ever visited a typical supermarket, they’d be confounded by those shelves near the checkout aisle that display dozens of flavored gum options. Chewing without eating seems like such a ridiculous habit, the oral equivalent of running on a treadmill. And yet, people have been chewing gum for thousands of years, ever since the ancient Greeks began popping wads of mastic tree resin in their mouth to sweeten the breath. Socrates probably chewed gum.

It turns out there’s an excellent rationale for this long-standing cultural habit: Gum is an effective booster of mental performance, conferring all sorts of benefits without any side effects. The latest investigation of gum chewing comes from a team of psychologists at St. Lawrence University. The experiment went like this: 159 students were given a battery of demanding cognitive tasks, such as repeating random numbers backward and solving difficult logic puzzles. Half of the subjects chewed gum (sugar-free and sugar-added) while the other half were given nothing. Here’s where things get peculiar: Those randomly assigned to the gum-chewing condition significantly outperformed those in the control condition on five out of six tests. (The one exception was verbal fluency, in which subjects were asked to name as many words as possible from a given category, such as “animals.”) The sugar content of the gum had no effect on test performance.

While previous studies achieved similar results — chewing gum is often a better test aid than caffeine — this latest research investigated the time course of the gum advantage. It turns out to be rather short lived, as gum chewers only showed an increase in performance during the first 20 minutes of testing. After that, they performed identically to non-chewers.

What’s responsible for this mental boost? Nobody really knows. It doesn’t appear to depend on glucose, since sugar-free gum generated the same benefits. Instead, the researchers propose that gum enhances performance due to “mastication-induced arousal.” The act of chewing, in other words, wakes us up, ensuring that we are fully focused on the task at hand.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Chewing gum tree, Mexico D.F. Courtesy of mexicolore.[end-div]

Pickled Sharks and All

Regardless of what you may believe about Damien Hirst or think about his art, it would not be stretching the truth to say he single-handedly resurrected the British contemporary art scene over the last 15 years.

Our favorite mainstream blogger on all things art, Jonathan Jones, revisits Hirst and his “pickled shark”.

[div class=attrib]From the Guardian:[end-div]

I had no job and didn’t know where I was going in life when I walked into the Saatchi Gallery in 1992 and saw a tiger shark swimming towards me. Standing in front of Damien Hirst’s The Physical Impossibility of Death in the Mind of Someone Living in its original pristine state was a disconcerting and marvellous experience. The shark, then, did not look pickled, it looked alive. It seemed to move as you moved around the tank that contained it, because the refractions of the liquid inside which it “swam” caused your vision of it to jump as you changed your angle.

There it was: life, or was it death, relentlessly approaching me through deep waters. It was galvanising, energising. It was a great work of art.

I knew what I thought great art looked like. I doted on Leonardo da Vinci, I loved Picasso. I still revere them both. But it was Hirst’s shark that made me believe art made with fish, glass vitrines and formaldehyde – and therefore with anything – can be great. I found his work not just interesting or provocative but genuinely profound. As a memento mori, as an exploration of the limits of art, as a meditation on the power of spectacle, even as a comment on the shark-infested waters of post-Thatcherite Britain, it moved me deeply.

I’m looking forward to Damien Hirst’s retrospective at Tate Modern because it will be a new chance to understand the power I have, in my life, sensed in his imagination and intellect. I think Hirst is a much more exciting modern artist than Marcel Duchamp. To be honest, the word “exciting” just doesn’t go with the word “Duchamp”. Get a load of that exciting urinal!

Picasso is exciting; Duchamp is an academic cult. The readymade as it was deployed by Duchamp gave birth to conceptual forms that are “interesting” but rarely grab you where it matters.

Hirst is more Picasso than Duchamp – the Picasso who put a bicycle seat and handlebars together to create a bull’s head. He’s even more Holbein than Duchamp – the Holbein who painted a skull across a portrait of two Renaissance gentlemen.

He is a giant of modern art.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: The Physical Impossibility of Death in the Mind of Someone Living by Damien Hirst (1991). Courtesy of Wikipedia.[end-div]

It’s Actually 4.74 Degrees of Kevin Bacon

Six degrees of separation is the commonly held urban myth that, on average, everyone on Earth is six connections or fewer away from any other person. That is, through a chain of friend-of-a-friend (of a friend, and so on) relationships you can find yourself linked to the President, the Chinese Premier, a farmer on the steppes of Mongolia, Nelson Mandela, the editor of theDiagonal, and any one of the other 7 billion people on the planet.

The modern notion of degrees of separation stems from original research on the structure of social networks conducted by Michael Gurevich at the Massachusetts Institute of Technology in 1961. Subsequently, the Austrian mathematician Manfred Kochen proposed, in his theory of connectedness for a U.S.-sized population, that “it is practically certain that any two individuals can contact one another by means of at least two intermediaries.” In 1967 the psychologist Stanley Milgram and colleagues tested this through acquaintanceship-network experiments on what was then called the Small World Problem. In one example, 296 volunteers were asked to send a message by postcard, through friends and then friends of friends, to a specific person living near Boston. Milgram’s work, published in Psychology Today, showed that people in the United States seemed to be connected by approximately three friendship links, on average. The experiments generated a tremendous amount of publicity, and as a result Milgram is to this day incorrectly credited with originating the idea and quantification of interconnectedness, and even the phrase “six degrees of separation”.

In fact, the idea was originally articulated in 1929 by the Hungarian author Frigyes Karinthy. Karinthy believed that the modern world was ‘shrinking’ due to the accelerating interconnectedness of humans. He hypothesized that any two individuals could be connected through at most five acquaintances. In 1990, playwright John Guare unveiled a play (followed by a movie in 1993) titled “Six Degrees of Separation”, which popularized the notion and enshrined it in popular culture. In the play one of the characters reflects on the idea that any two individuals are connected by at most five others:

I read somewhere that everybody on this planet is separated by only six other people. Six degrees of separation between us and everyone else on this planet. The President of the United States, a gondolier in Venice, just fill in the names. I find it A) extremely comforting that we’re so close, and B) like Chinese water torture that we’re so close because you have to find the right six people to make the right connection… I am bound to everyone on this planet by a trail of six people.

Then in 1994 along came the Kevin Bacon trivia game, “Six Degrees of Kevin Bacon” invented as a play on the original concept. The goal of the game is to link any actor to Kevin Bacon through no more than six connections, where two actors are connected if they have appeared in a movie or commercial together.

Now, in 2011, comes a study of the connectedness of Facebook users. Using Facebook’s population of over 700 million users, researchers found that the average number of links from any arbitrarily selected user to another was 4.74; for Facebook users in the U.S., the average number of links was just 4.37. Facebook posted detailed findings on its site, here.
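
To make concrete what “an average of 4.74 links” measures, here is a minimal sketch in Python. It is purely illustrative: the friendship graph and the names in it are invented, and the Facebook study naturally relied on approximate algorithms suited to hundreds of millions of users rather than the exhaustive breadth-first search used below.

```python
# Toy illustration of "average degrees of separation": compute the exact
# average shortest-path length over every pair of people in a tiny,
# made-up friendship graph using breadth-first search.
from collections import deque
from itertools import combinations

# Hypothetical friendship graph: each key lists that person's friends.
friends = {
    "ann":   ["bob", "cara"],
    "bob":   ["ann", "dave"],
    "cara":  ["ann", "dave", "eve"],
    "dave":  ["bob", "cara", "frank"],
    "eve":   ["cara", "frank"],
    "frank": ["dave", "eve"],
}

def degrees_of_separation(graph, source, target):
    """Breadth-first search: number of friendship hops from source to target."""
    distance = {source: 0}
    queue = deque([source])
    while queue:
        person = queue.popleft()
        if person == target:
            return distance[person]
        for friend in graph[person]:
            if friend not in distance:
                distance[friend] = distance[person] + 1
                queue.append(friend)
    return None  # the two people are not connected at all

# Average the separation over every distinct pair of people in the graph.
pairs = list(combinations(friends, 2))
average = sum(degrees_of_separation(friends, a, b) for a, b in pairs) / len(pairs)
print(f"average degrees of separation: {average:.2f}")
```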

So, the small world popularized by Milgram and colleagues is actually becoming smaller, as Frigyes Karinthy originally suggested back in 1929. As a result, you may not be as “far” from the Chinese Premier or Nelson Mandela as you previously believed.

[div class=attrib]Image: Six Degrees of Separation Poster by James McMullan. Courtesy of Wikipedia.[end-div]

How the World May End: Science Versus Brimstone

Every couple of years a (hell)fire and brimstone preacher floats into the national consciousness and makes the headlines with certain predictions from the book regarding imminent destruction of our species and home. Most recently Harold Camping, the radio evangelist, predicted the apocalypse would begin on Saturday, May 21, 2011. His subsequent revision placed the “correct date” at October 21, 2011. Well, we’re still here, so the next apocalyptic date to prepare for, according to watchers of all things Mayan, is December 21, 2012.

So, not to be outdone by prophecy from one particular religion or another, science has come out swinging with its own list of potential apocalyptic end-of-days. No surprise, many scenarios may well be of our own making.

[div class=attrib]From the Guardian:[end-div]

Stories of brimstone, fire and gods make good tales and do a decent job of stirring up the requisite fear and jeopardy. But made-up doomsday tales pale into nothing, creatively speaking, when contrasted with what is actually possible. Look through the lens of science and “the end” becomes much more interesting.

Since the beginning of life on Earth, around 3.5 billion years ago, the fragile existence has lived in the shadow of annihilation. On this planet, extinction is the norm – of the 4 billion species ever thought to have evolved, 99% have become extinct. In particular, five times in this past 500 million years the steady background rate of extinction has shot up for a period of time. Something – no one knows for sure what – turned the Earth into exactly the wrong planet for life at these points and during each mass extinction, more than 75% of the existing species died off in a period of time that was, geologically speaking, a blink of the eye.

One or more of these mass extinctions occurred because of what we could call the big, Hollywood-style, potential doomsday scenarios. If a big enough asteroid hit the Earth, for example, the impact would cause huge earthquakes and tsunamis that could cross the globe. There would be enough dust thrown into the air to block out the sun for several years. As a result, the world’s food resources would be destroyed, leading to famine. It has happened before: the dinosaurs (along with more than half the other species on Earth) were wiped out 65 million years ago by a 10km-wide asteroid that smashed into the area around Mexico.

Other natural disasters include sudden changes in climate or immense volcanic eruptions. All of these could cause global catastrophes that would wipe out large portions of the planet’s life, but, given we have survived for several hundreds of thousands of years while at risk of these, it is unlikely that a natural disaster such as that will cause catastrophe in the next few centuries.

In addition, cosmic threats to our existence have always been with us, even though it has taken us some time to notice: the collision of our galaxy, the Milky Way, with our nearest neighbour, Andromeda, for example, or the arrival of a black hole. Common to all of these threats is that there is very little we can do about them even when we know the danger exists, except trying to work out how to survive the aftermath.

But in reality, the most serious risks for humans might come from our own activities. Our species has the unique ability in the history of life on Earth to be the first capable of remaking our world. But we can also destroy it.

All too real are the human-caused threats born of climate change, excess pollution, depletion of natural resources and the madness of nuclear weapons. We tinker with our genes and atoms at our own peril. Nanotechnology, synthetic biology and genetic modification offer much potential in giving us better food to eat, safer drugs and a cleaner world, but they could also go wrong if misapplied or if we charge on without due care.

Some strange ways to go, and their corresponding danger signs, are listed below:

DEATH BY EUPHORIA

Many of us use drugs such as caffeine or nicotine every day. Our increased understanding of physiology brings new drugs that can lift mood, improve alertness or keep you awake for days. How long before we use so many drugs we are no longer in control? Perhaps the end of society will not come with a bang, but fade away in a haze.

Danger sign: Drugs would get too cheap to meter, but you might be too doped up to notice.

VACUUM DECAY

If the Earth exists in a region of space known as a false vacuum, it could collapse into a lower-energy state at any point. This collapse would grow at the speed of light and our atoms would not hold together in the ensuing wave of intense energy – everything would be torn apart.

Danger sign: There would be no signs. It could happen half way through this…

STRANGELETS

Quantum mechanics contains lots of frightening possibilities. Among them is a particle called a strangelet that can transform any other particle into a copy of itself. In just a few hours, a small chunk of these could turn a planet into a featureless mass of strangelets. Everything that planet was would be no more.

Danger sign: Everything around you starts cooking, releasing heat.

END OF TIME

What if time itself somehow came to an end because of the laws of physics? In 2007, Spanish scientists proposed an alternative explanation for the mysterious dark energy that appears to account for roughly 75% of the energy content of the universe and acts as a sort of anti-gravity, pushing galaxies apart. Their proposal: the effects we observe are due to time slowing down as it leaks away from our universe.

Danger sign: It could be happening right now. We would never know.

MEGA TSUNAMI

Geologists worry that a future volcanic eruption at La Palma in the Canary Islands might dislodge a chunk of rock twice the volume of the Isle of Man into the Atlantic Ocean, triggering waves a kilometre high that would travel at the speed of a jumbo jet, with catastrophic effects for the shores of the US, Europe, South America and Africa.

Danger sign: Half the world’s major cities are under water. All at once.

GEOMAGNETIC REVERSAL

The Earth’s magnetic field shields us against harmful radiation from the sun that could rip through DNA and overload the world’s electrical systems. Every so often, Earth’s north and south magnetic poles switch positions, and during the transition the field can weaken or all but disappear for many years. The last known reversal happened about 780,000 years ago, and it will likely happen again.

Danger sign: Electronics stop working.

GAMMA RAYS FROM SPACE

When a massive star reaches its dying moments, it can shoot out two beams of high-energy gamma rays into space. If one of these beams were to hit Earth, the immense energy would tear apart the atmosphere’s air molecules and disintegrate the protective ozone layer.

Danger sign: The sky turns brown and all life on the surface slowly dies.

RUNAWAY BLACK HOLE

Black holes are the most powerful gravitational objects in the universe, capable of tearing Earth into its constituent atoms. Even passing within a billion miles, a black hole could knock Earth out of the solar system, leaving our planet wandering through deep space without a source of energy.

Danger sign: Increased asteroid activity; the seasons get really extreme.

INVASIVE SPECIES

Invasive species are plants, animals or microbes that turn up in an ecosystem that has no protection against them. The invader’s population surges and the ecosystem quickly destabilises towards collapse. Invasive species are already an expensive global problem: they disrupt local ecosystems, transfer viruses, poison soils and damage agriculture.

Danger sign: Your local species disappear.

TRANSHUMANISM

What if biological and technological enhancements took humans to a level where they radically surpassed anything we know today? “Posthumans” might consist of artificial intelligences based on the thoughts and memories of ancient humans, who uploaded themselves into a computer and exist only as digital information on superfast computer networks. Their physical bodies might be gone but they could access and store endless information and share their thoughts and feelings immediately and unambiguously with other digital humans.

Danger sign: You are outcompeted, mentally and physically, by a cyborg.

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]End is Nigh Sign. Courtesy of frontporchrepublic.com.[end-div]

MondayPoem: Inferno – Canto I

Dante Alighieri is held in high regard in Italy, where he is often referred to as il Poeta, the Poet. He is best known for the monumental poem La Commedia, later renamed La Divina Commedia – The Divine Comedy. Scholars consider it the greatest work of literature in the Italian language, and many also consider Dante the symbolic father of the Italian language.

[div class=attrib]According to Wikipedia:[end-div]

He wrote the Comedy in a language he called “Italian”, in some sense an amalgamated literary language mostly based on the regional dialect of Tuscany, with some elements of Latin and of the other regional dialects. The aim was to deliberately reach a readership throughout Italy: laymen, clergymen and other poets alike. By creating a poem of epic structure and philosophic purpose, he established that the Italian language was suitable for the highest sort of expression. In French, Italian is sometimes nicknamed la langue de Dante. Publishing in the vernacular language marked Dante as one of the first (among others such as Geoffrey Chaucer and Giovanni Boccaccio) to break free from standards of publishing in only Latin (the language of liturgy, history, and scholarship in general, but often also of lyric poetry). This break set a precedent and allowed more literature to be published for a wider audience—setting the stage for greater levels of literacy in the future.

By Dante Alighieri

(translated by the Rev. H. F. Cary)

– Inferno, Canto I

In the midway of this our mortal life,
I found me in a gloomy wood, astray
Gone from the path direct: and e’en to tell
It were no easy task, how savage wild
That forest, how robust and rough its growth,
Which to remember only, my dismay
Renews, in bitterness not far from death.
Yet to discourse of what there good befell,
All else will I relate discover’d there.
How first I enter’d it I scarce can say,
Such sleepy dullness in that instant weigh’d
My senses down, when the true path I left,
But when a mountain’s foot I reach’d, where clos’d
The valley, that had pierc’d my heart with dread,
I look’d aloft, and saw his shoulders broad
Already vested with that planet’s beam,
Who leads all wanderers safe through every way.

Then was a little respite to the fear,
That in my heart’s recesses deep had lain,
All of that night, so pitifully pass’d:
And as a man, with difficult short breath,
Forespent with toiling, ‘scap’d from sea to shore,
Turns to the perilous wide waste, and stands
At gaze; e’en so my spirit, that yet fail’d
Struggling with terror, turn’d to view the straits,
That none hath pass’d and liv’d.  My weary frame
After short pause recomforted, again
I journey’d on over that lonely steep,

The hinder foot still firmer.  Scarce the ascent
Began, when, lo! a panther, nimble, light,
And cover’d with a speckled skin, appear’d,
Nor, when it saw me, vanish’d, rather strove
To check my onward going; that ofttimes
With purpose to retrace my steps I turn’d.

The hour was morning’s prime, and on his way
Aloft the sun ascended with those stars,
That with him rose, when Love divine first mov’d
Those its fair works: so that with joyous hope
All things conspir’d to fill me, the gay skin
Of that swift animal, the matin dawn
And the sweet season.  Soon that joy was chas’d,
And by new dread succeeded, when in view
A lion came, ‘gainst me, as it appear’d,

With his head held aloft and hunger-mad,
That e’en the air was fear-struck.  A she-wolf
Was at his heels, who in her leanness seem’d
Full of all wants, and many a land hath made
Disconsolate ere now.  She with such fear
O’erwhelmed me, at the sight of her appall’d,
That of the height all hope I lost.  As one,
Who with his gain elated, sees the time
When all unwares is gone, he inwardly
Mourns with heart-griping anguish; such was I,
Haunted by that fell beast, never at peace,
Who coming o’er against me, by degrees
Impell’d me where the sun in silence rests.

While to the lower space with backward step
I fell, my ken discern’d the form of one,
Whose voice seem’d faint through long disuse of speech.
When him in that great desert I espied,
“Have mercy on me!”  cried I out aloud,
“Spirit! or living man! what e’er thou be!”

He answer’d: “Now not man, man once I was,
And born of Lombard parents, Mantuana both
By country, when the power of Julius yet
Was scarcely firm.  At Rome my life was past
Beneath the mild Augustus, in the time
Of fabled deities and false.  A bard
Was I, and made Anchises’ upright son
The subject of my song, who came from Troy,
When the flames prey’d on Ilium’s haughty towers.
But thou, say wherefore to such perils past
Return’st thou?  wherefore not this pleasant mount
Ascendest, cause and source of all delight?”
“And art thou then that Virgil, that well-spring,
From which such copious floods of eloquence
Have issued?”  I with front abash’d replied.
“Glory and light of all the tuneful train!
May it avail me that I long with zeal
Have sought thy volume, and with love immense
Have conn’d it o’er.  My master thou and guide!
Thou he from whom alone I have deriv’d
That style, which for its beauty into fame
Exalts me.  See the beast, from whom I fled.
O save me from her, thou illustrious sage!

“For every vein and pulse throughout my frame
She hath made tremble.”  He, soon as he saw
That I was weeping, answer’d, “Thou must needs
Another way pursue, if thou wouldst ‘scape
From out that savage wilderness.  This beast,
At whom thou criest, her way will suffer none
To pass, and no less hindrance makes than death:
So bad and so accursed in her kind,
That never sated is her ravenous will,
Still after food more craving than before.
To many an animal in wedlock vile
She fastens, and shall yet to many more,
Until that greyhound come, who shall destroy
Her with sharp pain.  He will not life support
By earth nor its base metals, but by love,
Wisdom, and virtue, and his land shall be
The land ‘twixt either Feltro.  In his might
Shall safety to Italia’s plains arise,
For whose fair realm, Camilla, virgin pure,
Nisus, Euryalus, and Turnus fell.
He with incessant chase through every town
Shall worry, until he to hell at length
Restore her, thence by envy first let loose.
I for thy profit pond’ring now devise,
That thou mayst follow me, and I thy guide
Will lead thee hence through an eternal space,
Where thou shalt hear despairing shrieks, and see
Spirits of old tormented, who invoke
A second death; and those next view, who dwell
Content in fire, for that they hope to come,
Whene’er the time may be, among the blest,
Into whose regions if thou then desire
T’ ascend, a spirit worthier then I
Must lead thee, in whose charge, when I depart,
Thou shalt be left: for that Almighty King,
Who reigns above, a rebel to his law,
Adjudges me, and therefore hath decreed,
That to his city none through me should come.
He in all parts hath sway; there rules, there holds
His citadel and throne.  O happy those,
Whom there he chooses!” I to him in few:
“Bard! by that God, whom thou didst not adore,
I do beseech thee (that this ill and worse
I may escape) to lead me, where thou saidst,
That I Saint Peter’s gate may view, and those
Who as thou tell’st, are in such dismal plight.”

Onward he mov’d, I close his steps pursu’d.

[div class=attrib]Read the entire poem here.[end-div]

[div class=attrib]Image: Dante Alighieri, engraving after the fresco in Bargello Chapel, painted by Giotto di Bondone. Courtesy of Wikipedia.[end-div]

Viewfinder Replaces the Eye

The ubiquity of point-and-click digital cameras and camera-equipped smartphones seems to be leading us towards an era where it is more common to snap and share a picture of the present via a camera lens than it is to experience the present individually and through one’s own eyes.

Roberta Smith over at the New York Times laments this growing trend, which we label “digitally-assisted Kilroy-was-here” syndrome, particularly evident at art exhibits. Ruth Fremson, a New York Times photographer, chronicled some of the leading offenders.

[div class=attrib]From the New York Times:[end-div]

SCIENTISTS have yet to determine what percentage of art-viewing these days is done through the viewfinder of a camera or a cellphone, but clearly the figure is on the rise. That’s why Ruth Fremson, the intrepid photographer for The New York Times who covered the Venice Biennale this summer, returned with so many images of people doing more or less what she was doing: taking pictures of works of art or people looking at works of art. More or less.

Only two of the people in these pictures are using a traditional full-service camera (similar to the ones Ms. Fremson carried with her) and actually holding it to the eye. Everyone else is wielding either a cellphone or a mini-camera and looking at a small screen, which tends to make the framing process much more casual. It is changing the look of photography.

The ubiquity of cameras in exhibitions can be dismaying, especially when read as proof that most art has become just another photo op for evidence of Kilroy-was-here passing through. More generously, the camera is a way of connecting, participating and collecting fleeting experiences.

For better and for worse, it has become intrinsic to many people’s aesthetic responses. (Judging by the number of pictures Ms. Fremson took of people photographing Urs Fischer’s life-size statue of the artist Rudolf Stingel as a lighted candle, it is one of the more popular pieces at the Biennale, which runs through Nov. 27.) And the camera’s presence in an image can seem part of its strangeness, as with Ms. Fremson’s shot of the gentleman photographing a photo-mural by Cindy Sherman that makes Ms. Sherman, costumed as a circus juggler, appear to be posing just for him. She looks more real than she did in the actual installation.

Of course a photograph of a person photographing an artist’s photograph of herself playing a role is a few layers of an onion, maybe the kind to be found only among picture-takers at an exhibition.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Visitors at the Venice Biennale capture Urs Fischer’s statue. Courtesy of Ruth Fremson / The New York Times.[end-div]

Driving Across the U.S. at 146,700 Miles per Hour

Through the miracle of time-lapse photography we bring you a journey of 12,225 miles across 32 states in 55 days, compressed into 5 minutes. Brian Defrees snapped an image every five seconds from his car-mounted camera during the adventure, which began and ended in New York and passed through Washington, D.C., Florida, Los Angeles, Washington State and many points in between.

[tube]Tt-juyvIWMQ[/tube]
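The figure in the title is simply the trip distance divided by the video’s running time. A quick back-of-the-envelope check (a sketch, not part of the original video):

```python
# Apparent speed of the compressed journey: 12,225 miles shown in 5 minutes.
distance_miles = 12_225
video_minutes = 5

apparent_speed_mph = distance_miles / (video_minutes / 60)
print(f"{apparent_speed_mph:,.0f} mph")  # prints 146,700 mph
```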

Cool Images of a Hot Star

Astronomers and planetary photographers, both amateur and professional, have had an inspiring time recently watching the Sun. Some of the most gorgeous images of our nearest star come courtesy of photographer Alan Friedman. One such spectacular image shows several huge, 50,000-mile-high solar flares and groups of active sunspots larger than our planet. See more of Friedman’s captivating images at his personal website.

[div class=attrib]According to MSNBC:[end-div]

For the past couple of weeks, astronomers have been tracking groups of sunspots as they move across the sun’s disk. Those active regions have been shooting off flares and outbursts of electrically charged particles into space — signaling that the sun is ramping up toward the peak of its 11-year activity cycle. Physicists expect that peak, also known as “Solar Max,” to come in 2013.

A full frontal view from New York photographer Alan Friedman shows the current activity in detail, as seen in a particular wavelength known as hydrogen-alpha. The colors have been tweaked to make the sun look like a warm, fuzzy ball, with lacy prominences licking up from the edge of the disk.

Friedman focused on one flare in particular over the weekend: In the picture you see at right, the colors have been reversed to produce a dark sun and dusky prominence against the light background of space.

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]Image: Powerful sunspots and gauzy-looking prominences can be seen in Alan Friedman’s photo of the sun, shown in hydrogen-alpha wavelengths. Courtesy of MSNBC / Copyright Alan Friedman, avertedimagination.com.[end-div]

What Exactly is a Person?

The recent “personhood” amendment on the ballot in Mississippi has caused many to scratch their heads and ponder the meaning of “person”. Philosophers through the ages have tackled this thorny question with detailed treatises and little consensus.

Boethius suggested that a person is “the individual substance of a rational nature.” Descartes described a person as an agent, human or otherwise, possessing consciousness and capable of creating and acting on a plan. John Locke extended this definition to include reason and reflection. Kant saw a person as a being with a conceptualizing mind capable of purposeful thought. Charles Taylor takes this naturalistic view further, defining a person as an agent driven by matters of significance. Harry Frankfurt characterized a person as an entity possessing free will driven by a hierarchy of desires. Still others offer their own definitions. Peter Singer proposes self-awareness as the distinguishing trait; Thomas White suggests that a person is alive, is aware, feels sensations, has emotions, has a sense of self, controls its own behaviour, recognises other persons, and has various cognitive abilities.

Despite the variation in positions, all would seem to agree that a fertilized egg is certainly not a person.

    [div class=attrib]A thoughtful take over at 13.7 Cosmos and Culture blog:[end-div]

    According to Catholic doctrine, the Father, the Son and Holy Spirit are three distinct persons even though they are one essence. Only one of those persons — Jesus Christ — is also a human being whose life had a beginning and an end.

    I am not an expert in Trinitarian theology. But I mention it here because, great mysteries aside, this Catholic doctrine uses the notion of person in what, from our point of view today, is the standard way.

    John Locke called person a forensic concept. What he had in mind is that a person is one to whom credit and blame may be attached, one who is deemed responsible. The concept of a person is the concept of an agent.

    Crucially, Locke argued, persons are not the same as human beings. Dr. Jekyll and Mr. Hyde may be one and the same human being, that is, one and the same continuously existing organic life; they share a birth event; but they are two distinct persons. And this is why we don’t blame the one for the other’s crimes. Multiple personality disorder might be a real-world example of this.

    I don’t know whether Locke believed that two distinct persons could actually inhabit the same living human body, but he certainly thought there was nothing contradictory in the possibility. Nor did he think there was anything incoherent in the thought that one person could find existence in multiple distinct animal lives, even if, as a matter of fact, this may not be possible. If you believe in reincarnation, then you think this is a genuine possibility. For Locke, this was no more incoherent than the idea of two actors playing the same role in a play.

    Indeed, the word “person” derives from a Latin (and originally a Greek) word meaning “character in a drama” or “mask” (because actors wore masks). This usage survives today in the phrase “dramatis personae.” To be a person, from this standpoint, is to play a role. The person is the role played, however, not the player.

    From this standpoint, the idea of a non-human, non-living person certainly makes sense, even if we find it disturbing. Corporations are persons under current law, and this makes sense. They are actors, after all, and we credit and blame them for the things they do. They play an important role in our society.

    [div class=attrib]Read the whole article here.[end-div]

    [div class=attrib]Image: Abstract painting of a person, titled WI (In Memoriam), by Paul Klee (1879–1940). Courtesy of Wikipedia.[end-div]

    Supercommittee and Innovation: Oxymoron Du Jour

    Today is deadline day for the U.S. Congressional Select Committee on Deficit Reduction to deliver. Perhaps a little ironically, the committee has commonly been mistitled the “Super Committee”. Interestingly, pundits and public alike do not expect the committee to deliver any significant, long-term solution to the United States’ fiscal problems. In fact, many do not believe the committee will deliver anything at all beyond reinforcement of right- and left-leaning ideologies, political posturing, pandering to special interests of all colors and, of course, recriminations and spin.

    Could the Founders have had such dysfunction in mind when they designed the branches of government, with their many checks and balances, to guard against excess and tyranny? Perhaps it’s finally time for the United States Congress to gulp a large dose of corporate-style innovation.

    [div class=attrib]From the Washington Post:[end-div]

    … Fiscal catastrophe has been around the corner, on and off, for 15 years. In that period, Dole and President Bill Clinton, a Democrat, came together to produce a record-breaking $230 billion surplus. That was later depleted by actions undertaken by both sides, bringing us to the tense situation we have today.

    What does this have to do with innovation?

    As the profession of innovation management matures, we are learning a few key things, including that constraints can be a good thing — and the “supercommittee” clock is a big constraint. Given this, what is the best strategy when you need to innovate in a hurry?

    When innovating under the gun, the first thing you must do is assemble a small, diverse team to own and attack the challenge. The “supercommittee” team is handicapped from the start, since it is neither small (think 4-5 people) nor diverse (neither in age nor expertise). Second, successful innovators envision what success looks like and pursue it single-mindedly – failure is not an option.

    Innovators also divide big challenges into smaller challenges that a small team can feel passionate about and assault on an even shorter timeline than the overall challenge. This requires that you put as much (or more) effort into determining the questions that form the challenges as you do into trying to solve them. Innovators ask big questions that challenge the status quo, such as “How could we generate revenue without taxes?” or “What spending could we avoid and how?” or “How would my son or my grandmother approach this?”

    To solve the challenges, successful innovators recruit people not only with expertise most relevant to the challenge, but also people with expertise in distant specialties, which, in innovation, is often where the best solutions come from.

    But probably most importantly, all nine innovation roles — the revolutionary, the conscript, the connector, the artist, customer champion, troubleshooter, judge, magic maker and evangelist — must be filled for an innovation effort to be successful.

    [div class=attrib]Read the entire article here.[end-div]

    What of the Millennials?

    The hippies of the sixties wanted love; the beatniks sought transcendence. Then came the punks, who were all about rage. The slackers of Generation X stood for apathy and worry. And now coming of age we have Generation Y, also known as the “millennials”, whose birthdays fall roughly between 1982 and 2000.

    A fascinating article by William Deresiewicz, excerpted below, posits the millennials as a “post-emotional” generation. Interestingly, while this generation seems to be fragmented, its members are much more focused on their own “brand identity” than previous generations.

    [div class=attrib]From the New York Times:[end-div]

    EVER since I moved three years ago to Portland, Ore., that hotbed of all things hipster, I’ve been trying to get a handle on today’s youth culture. The style is easy enough to describe — the skinny pants, the retro hats, the wall-to-wall tattoos. But style is superficial. The question is, what’s underneath? What idea of life? What stance with respect to the world?

    So what’s the affect of today’s youth culture? Not just the hipsters, but the Millennial Generation as a whole, people born between the late ’70s and the mid-’90s, more or less — of whom the hipsters are a lot more representative than most of them care to admit. The thing that strikes me most about them is how nice they are: polite, pleasant, moderate, earnest, friendly. Rock ’n’ rollers once were snarling rebels or chest-beating egomaniacs. Now the presentation is low-key, self-deprecating, post-ironic, eco-friendly. When Vampire Weekend appeared on “The Colbert Report” last year to plug their album “Contra,” the host asked them, in view of the title, what they were against. “Closed-mindedness,” they said.

    According to one of my students at Yale, where I taught English in the last decade, a colleague of mine would tell his students that they belonged to a “post-emotional” generation. No anger, no edge, no ego.

    What is this about? A rejection of culture-war strife? A principled desire to live more lightly on the planet? A matter of how they were raised — everybody’s special and everybody’s point of view is valid and everybody’s feelings should be taken care of?

    Perhaps a bit of each, but mainly, I think, something else. The millennial affect is the affect of the salesman. Consider the other side of the equation, the Millennials’ characteristic social form. Here’s what I see around me, in the city and the culture: food carts, 20-somethings selling wallets made from recycled plastic bags, boutique pickle companies, techie start-ups, Kickstarter, urban-farming supply stores and bottled water that wants to save the planet.

    Today’s ideal social form is not the commune or the movement or even the individual creator as such; it’s the small business. Every artistic or moral aspiration — music, food, good works, what have you — is expressed in those terms.

    Call it Generation Sell.

    Bands are still bands, but now they’re little businesses, as well: self-produced, self-published, self-managed. When I hear from young people who want to get off the careerist treadmill and do something meaningful, they talk, most often, about opening a restaurant. Nonprofits are still hip, but students don’t dream about joining one, they dream about starting one. In any case, what’s really hip is social entrepreneurship — companies that try to make money responsibly, then give it all away.

    [div class=attrib]Read the entire article here.[end-div]

    [div class=attrib]Image: Millennial Momentum, Authors: Morley Winograd and Michael D. Hais, Rutgers University Press.[end-div]

    Book Review: Thinking, Fast and Slow. Daniel Kahneman

    Daniel Kahneman brings together for the first time his decades of groundbreaking research and profound thinking in social psychology and cognitive science in his new book, Thinking, Fast and Slow. He presents his current understanding of judgment and decision making and offers insight into how we make choices in our daily lives. Importantly, Kahneman describes how we can identify and overcome the cognitive biases that frequently lead us astray. This is an important work by one of our leading thinkers.

    [div class=attrib]From Skeptic:[end-div]

    The ideas of the Princeton University psychologist Daniel Kahneman, recipient of the Nobel Prize in Economic Sciences for his seminal work that challenged the rational model of judgment and decision making, have had a profound and widely recognized impact on psychology, economics, business, law and philosophy. Until now, however, he has never brought together his many years of research and thinking in one book. In the highly anticipated Thinking, Fast and Slow, Kahneman introduces the “machinery of the mind.” Two systems drive the way we think and make choices: System One is fast, intuitive, and emotional; System Two is slower, more deliberative, and more logical. Examining how both systems function within the mind, Kahneman exposes the extraordinary capabilities and also the faults and biases of fast thinking, and the pervasive influence of intuitive impressions on our thoughts and our choices. Kahneman shows where we can trust our intuitions and how we can tap into the benefits of slow thinking. He offers practical and enlightening insights into how choices are made in both our business and personal lives, and how we can guard against the mental glitches that often get us into trouble. Kahneman will change the way you think about thinking.

    [div class=attrib]Image: Thinking, Fast and Slow, Daniel Kahneman. Courtesy of Publishers Weekly.[end-div]

    MondayPoem: First Thanksgiving

    A chronicler of the human condition and deeply personal emotion, poet Sharon Olds is no shrinking violet. Her contemporary poems have been both highly praised and condemned for their explicit frankness and intimacy.

    [div class=attrib]From Poetry Foundation:[end-div]

    In her Salon interview, Olds addressed the aims of her poetry. “I think that my work is easy to understand because I am not a thinker. I am not a…How can I put it? I write the way I perceive, I guess. It’s not really simple, I don’t think, but it’s about ordinary things—feeling about things, about people. I’m not an intellectual. I’m not an abstract thinker. And I’m interested in ordinary life.” She added that she is “not asking a poem to carry a lot of rocks in its pockets. Just being an ordinary observer and liver and feeler and letting the experience get through you onto the notebook with the pen, through the arm, out of the body, onto the page, without distortion.”

    Olds has won numerous awards for her work, including fellowships from the Guggenheim Foundation and the National Endowment for the Arts. Widely anthologized, her work has also been published in a number of journals and magazines. She was New York State Poet from 1998 to 2000, and currently teaches in the graduate writing program at New York University.

    By Sharon Olds

    – First Thanksgiving

    When she comes back, from college, I will see
    the skin of her upper arms, cool,
    matte, glossy. She will hug me, my old
    soupy chest against her breasts,
    I will smell her hair! She will sleep in this apartment,
    her sleep like an untamed, good object,
    like a soul in a body. She came into my life the
    second great arrival, after him, fresh
    from the other world—which lay, from within him,
    within me. Those nights, I fed her to sleep,
    week after week, the moon rising,
    and setting, and waxing—whirling, over the months,
    in a slow blur, around our planet.
    Now she doesn’t need love like that, she has
    had it. She will walk in glowing, we will talk,
    and then, when she’s fast asleep, I’ll exult
    to have her in that room again,
    behind that door! As a child, I caught
    bees, by the wings, and held them, some seconds,
    looked into their wild faces,
    listened to them sing, then tossed them back
    into the air—I remember the moment the
    arc of my toss swerved, and they entered
    the corrected curve of their departure.

    [div class=attrib]Image: Sharon Olds. Courtesy of squawvalleywriters.org.[end-div]

    The Adaptive Soundscape: Muzak and the Social Network DJ

    Recollect the piped “Muzak” that once played, and still plays, in many hotel elevators and public waiting rooms. Remember the perfectly designed mood music in restaurants and museums. Now re-imagine the ambient soundscape dynamically customized for a space based on the music preferences of the people inhabiting it. Well, there is a growing list of apps for that.

    [div class=attrib]From Wired:[end-div]

    This idea of having environments automatically reflect the predilections of those who inhabit them seems like the stuff of science fiction, but it’s already established fact, though not many people likely realize it yet.

    Let me explain. You know how most of the music services we listen to these days “scrobble” what we hear to Facebook and/or Last.fm? Well, outside developers can access that information — with your permission, of course — in order to shape their software around your taste.

    At the moment, most developers of Facebook-connected apps we’ve spoken with are able to mine your Likes (when you “like” something on Facebook) and profile information (when you add a band, book, movie, etc. as a favorite thing within your Facebook profile).

    However, as we recently confirmed with a Facebook software developer (who was not speaking for Facebook at the time but as an independent developer in his free time), third-party software developers can also access your listening data — each song you’ve played in any Facebook-connected music service and possibly what your friends listened to as well. Video plays and news article reads are also counted, if those sources are connected to Facebook.

    Don’t freak out — you have to give these apps permission to harvest this data. But once you do, they can start building their service using information about what you listened to in another service.

    Right now, this is starting to happen in the world of software (if I listen to “We Ah Wi” by Javelin on MOG, Spotify can find out if I give them permission to do so). Soon, due to mobile devices’ locational awareness — also opt-in — these preferences will leech into the physical world.

    I’m talking about the kids who used to sit around on the quad listening to that station. The more interesting option for mainstream users is music selections that automatically shift in response to the people in the room. The new DJs? Well, they will simply be the social butterflies who are most permissive with their personal information.

    Here are some more apps for real-world locations that can adapt music based on the preferences of these social butterflies:

    Crowdjuke: Winner of an MTV O Music Award for “best music hack,” this web app pulls the preferences of people who have RSVPed to an event and creates the perfect playlist for that group. Attendees can also add specific tracks using a mobile app or even text messaging from a “dumb” phone.

    Automatic DJ: Talk about science fiction; this one lets people DJ a party merely by having their picture taken at it.

    AudioVroom: This iPhone app (also with a new web version) makes a playlist that reflects two users’ tastes when they meet in real life. There’s no venue-specific version of this, but there could be (see also: Myxer).
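    Common to these apps is one simple idea: pool the declared tastes of whoever is present and rank what they share. Below is a minimal sketch of that idea in Python, with made-up data and function names (an illustration only, not how Crowdjuke, AudioVroom or the Facebook APIs actually work):

```python
from collections import Counter

def group_playlist(attendee_favorites, top_n=10):
    """Rank artists by how many attendees list them as favorites.

    attendee_favorites maps each attendee to a list of favorite artists --
    a stand-in for the preference data an app might gather, with permission,
    from a music service or an event RSVP list.
    """
    votes = Counter()
    for artists in attendee_favorites.values():
        votes.update(set(artists))  # one vote per attendee per artist
    return [artist for artist, _ in votes.most_common(top_n)]

# Hypothetical RSVP list for a party
rsvps = {
    "Ana": ["Javelin", "Vampire Weekend"],
    "Ben": ["Vampire Weekend", "Bob Dylan"],
    "Chloe": ["Bob Dylan", "Vampire Weekend"],
}
print(group_playlist(rsvps, top_n=3))  # ['Vampire Weekend', 'Bob Dylan', 'Javelin']
```

    A real service would presumably weight by actual listening history rather than a flat vote per attendee, but the aggregation step would look much the same.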

    [div class=attrib]Read the entire article here.[end-div]

    [div class=attrib]Image: Elevator Music. A Surreal History of Muzak, Easy-Listening, and Other Moodsong; Revised and Expanded Edition. Courtesy of the University of Michigan Press.[end-div]

    The Nation’s $360 Billion Medical Bill

    The United States spends around $2.5 trillion per year on health care. Approximately 14 percent of this is administrative spending. That’s $360 billion, yes, billion with a ‘b’, annually. And, by all accounts a significant proportion of this huge sum is duplicate, redundant, wasteful and unnecessary spending — that’s a lot of paperwork.
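    The arithmetic behind these figures, and the savings split quoted in the excerpt below, is easy to check. A quick back-of-the-envelope sketch (illustrative only; the published numbers are rounded, so they don’t match to the dollar):

```python
# Rough check of the administrative-cost figures cited in this post.
total_health_spend = 2.5e12   # ~$2.5 trillion spent on US health care each year
admin_share = 0.14            # ~14 percent of it goes to administration

admin_spend = total_health_spend * admin_share
print(f"Administrative spending: ${admin_spend / 1e9:,.0f} billion")  # ~$350B; the post cites ~$360B

# David Cutler's conservative estimate of annual savings from electronic billing
# and credentialing, split roughly 20/50/30 per the United Health estimate quoted below.
savings = 32e9
for party, share in [("government", 0.20), ("physicians and hospitals", 0.50), ("insurers", 0.30)]:
    print(f"  {party}: ${savings * share / 1e9:.1f} billion")
```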

    [div class=attrib]From the New York Times:[end-div]

    LAST year I had to have a minor biopsy. Every time I went in for an appointment, I had to fill out a form requiring my name, address, insurance information, emergency contact person, vaccination history, previous surgical history and current medical problems, medications and allergies. I must have done it four times in just three days. Then, after my procedure, I received bills — and, even more annoying, statements of charges that said they weren’t bills — almost daily, from the hospital, the surgeon, the primary care doctor, the insurance company.

    Imagine that repeated millions of times daily and you have one of the biggest money wasters in our health care system. Administration accounts for roughly 14 percent of what the United States spends on health care, or about $360 billion per year. About half of all administrative costs — $163 billion in 2009 — are borne by Medicare, Medicaid and insurance companies. The other half pays for the legions employed by doctors and hospitals to fill out billing forms, keep records, apply for credentials and perform the myriad other administrative functions associated with health care.

    The range of expert opinions on how much of this could be saved goes as high as $180 billion, or half of current expenditures. But a more conservative and reasonable estimate comes from David Cutler, an economist at Harvard, who calculates that for the whole system — for insurers as well as doctors and hospitals — electronic billing and credentialing could save $32 billion a year. And United Health comes to a similar estimate, with 20 percent of savings going to the government, 50 percent to physicians and hospitals and 30 percent to insurers. For health care cuts to matter, they have to be above 1 percent of total costs, or $26 billion a year, and this conservative estimate certainly meets that threshold.

    How do we get to these savings? First, electronic health records would eliminate the need to fill out the same forms over and over. An electronic credentialing system shared by all hospitals, insurance companies, Medicare, Medicaid, state licensing boards and other government agencies, like the Drug Enforcement Administration, could reduce much of the paperwork doctors are responsible for that patients never see. Requiring all parties to use electronic health records and an online system for physician credentialing would reduce frustration and save billions.

    But the real savings is in billing. There are at least six steps in the process: 1) determining a patient’s eligibility for services; 2) obtaining prior authorization for specialist visits, tests and treatments; 3) submitting claims by doctors and hospitals to insurers; 4) verifying whether a claim was received and where in the process it is; 5) adjudicating denials of claims; and 6) receiving payment.

    Substantial costs arise from the fact that doctors, hospitals and other care providers must bill multiple insurance companies. Instead of having a unified electronic billing system in which a patient could simply swipe an A.T.M.-like card for automatic verification of eligibility, claims processing and payment, we have a complicated system with lots of expensive manual data entry that produces costly mistakes.

    [div class=attrib]Read more of this article here.[end-div]

    [div class=attrib]Image: Piles of paperwork. Courtesy of the Guardian.[end-div]

    Definition of Technocrat

    The unfolding financial crises and political upheavals in Europe have claimed several casualties, notably the fall of leaders and their governments in both Greece and Italy. Both have been replaced by so-called “technocrats”. So, what is a technocrat, and why turn to one? Slate explains.

    [div class=attrib]From Slate:[end-div]

    Lucas Papademos was sworn in as the new prime minister of Greece Friday morning. In Italy, it’s expected that Silvio Berlusconi will be replaced by former EU commissioner Mario Monti. Both men have been described as “technocrats” in major newspapers. What, exactly, is a technocrat?

    An expert, not a politician. Technocrats make decisions based on specialized information rather than public opinion. For this reason, they are sometimes called upon when there’s no popular or easy solution to a problem (like, for example, the European debt crisis). The word technocrat derives from the Greek tekhne, meaning skill or craft, and an expert in a field like economics can be as much a technocrat as one in a field more commonly thought to be technological (like robotics). Both Papademos and Monti hold advanced degrees in economics, and have each held appointments at government institutions.

    The word technocrat can also refer to an advocate of a form of government in which experts preside. The notion of a technocracy remains mostly hypothetical, though some nations have been considered as such in the sense of being governed primarily by technical experts. Historian Walter A. McDougall argued that the Soviet Union was the world’s first technocracy, and indeed its Politburo included an unusually high proportion of engineers. Other nations, including Italy and Greece, have undergone some short periods under technocratic regimes. Carlo Azeglio Ciampi, formerly an economist and central banker, served as prime minister of Italy from 1993 to 1994. Economist and former Bank of Greece director Xenophon Zolotas served as Prime Minister of Greece from 1989 to 1990.

    In the United States, technocracy was most popular in the early years of the Great Depression. Inspired in part by the ideas of economist Thorstein Veblen, the movement was led by engineer Howard Scott, who proposed radical utopian ideas and solutions to the economic disaster in scientific language. His movement, founded in 1932, drew national interest—the New York Times was the first major news organization to report the phenomenon, and Liberty Digest declared, “Technocracy is all the rage. All over the country it is being talked about, explained, wondered at, praised, damned. It is found about as easy to explain … as the Einstein theory of relativity.” A year later, it had mostly flamed out. No popular Technocratic party exists in the United States today, but Scott’s organization, called Technocracy Incorporated, persists in drastically reduced form.

    [div class=attrib]Read the entire article here.[end-div]

    [div class=attrib]Image: Mario Monti. Courtesy of Daily Telegraph.[end-div]