All posts by Mike

The Pervasive Threat of Conformity: Peer Pressure Is Here to Stay

[div class=attrib]From BigThink:[end-div]

Today, I’d like to revisit one of the most well-known experiments in social psychology: Solomon Asch’s lines study. Let’s look once more at his striking findings on the power of group conformity and consider what they mean now, more than 50 years later, in a world that is much changed from Asch’s 1950s America.

How long are these lines? I don’t know until you tell me.

In the 1950s, Solomon Asch conducted a series of studies to examine the effects of peer pressure, in as clear-cut a setting as possible: visual perception. The idea was to see if, when presented with lines of differing lengths and asked questions about the lines (Which was the longest? Which corresponded to a reference line of a certain length?), participants would answer with the choice that was obviously correct – or would fall prey to the pressure of a group that gave an incorrect response. Here is a sample stimulus from one of the studies:

Which line matches the reference line? It seems obvious, no? Now, imagine that you were in a group with six other people – and they all said that it was, in fact, Line B. You would have no idea that you were the only actual participant and that the group had been carefully arranged with confederates, who were instructed to give that answer and were seated so that they would answer before you. You’d think that they, like you, were participants in the study – and that they all gave what appeared to you to be a patently wrong answer. Would you call their bluff and say, no, the answer is clearly Line A? Are you all blind? Or would you start to question your own judgment? Maybe it really is Line B. Maybe I’m just not seeing things correctly. How could everyone else be wrong and I be the only person who is right?

We don’t like to be the lone voice of dissent

While we’d all like to imagine that we’d hold our ground as the lone dissenter, statistically speaking we are three times more likely to cave: over 75% of Asch’s subjects (and far more in the specific condition described above) went along with the group opinion and gave the wrong answer at least once.

[div class=attrib]More from theSource here.[end-div]

Richard Feynman on the Ascendant

Genius: The Life and Science of Richard Feynman, by James Gleick, was a good first course for those fascinated by Richard Feynman’s significant contributions to physics and cosmology (and percussion).

Now, nearly two decades later, come two more biographies that view Richard Feynman from very different perspectives, both reviewed in the New York Review of Books. The first, Lawrence Krauss’s Quantum Man, is the weighty main course; the second, by writer Jim Ottaviani and artist Leland Myrick, is a graphic (as in comic-book) biography, and a delicious dessert.

In his review — The ‘Dramatic Picture’ of Richard Feynman — Freeman Dyson rightly posits that Richard Feynman’s star may now, or soon, be in the same exalted sphere as Einstein’s and Hawking’s. Yet type “Richard” into a Google search and wait for the predictive text to fill in the rest, and you’ll find that Richard Nixon, Richard Dawkins and Richard Branson rank higher than this giant of physics.

[div class=attrib]Freeman Dyson for the New York Review of Books:[end-div]

In the last hundred years, since radio and television created the modern worldwide mass-market entertainment industry, there have been two scientific superstars, Albert Einstein and Stephen Hawking. Lesser lights such as Carl Sagan and Neil Tyson and Richard Dawkins have a big public following, but they are not in the same class as Einstein and Hawking. Sagan, Tyson, and Dawkins have fans who understand their message and are excited by their science. Einstein and Hawking have fans who understand almost nothing about science and are excited by their personalities.

On the whole, the public shows good taste in its choice of idols. Einstein and Hawking earned their status as superstars, not only by their scientific discoveries but by their outstanding human qualities. Both of them fit easily into the role of icon, responding to public adoration with modesty and good humor and with provocative statements calculated to command attention. Both of them devoted their lives to an uncompromising struggle to penetrate the deepest mysteries of nature, and both still had time left over to care about the practical worries of ordinary people. The public rightly judged them to be genuine heroes, friends of humanity as well as scientific wizards.

Two new books now raise the question of whether Richard Feynman is rising to the status of superstar. The two books are very different in style and in substance. Lawrence Krauss’s book, Quantum Man, is a narrative of Feynman’s life as a scientist, skipping lightly over the personal adventures that have been emphasized in earlier biographies. Krauss succeeds in explaining in nontechnical language the essential core of Feynman’s thinking.

… The other book, by writer Jim Ottaviani and artist Leland Myrick, is very different. It is a comic-book biography of Feynman, containing 266 pages of pictures of Feynman and his legendary adventures. In every picture, bubbles of text record Feynman’s comments, mostly taken from stories that he and others had told and published in earlier books.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Shelley Gazin/Corbis.[end-div]

The Worst of States, the Best of States

Following on from our recent article showing the best of these United States, it’s time to look at the worst.

[div class=attrib]From Frank Jacobs / BigThink:[end-div]

The United States of Shame again gets most of its data from health stats, detailing the deplorable firsts of 14 states (9). Eight states get worst marks for crime, from white-collar to violent (10), while four lead in road accidents (11). Six can be classed as economic worst cases (12), five as moral nadirs (13), two as environmental basket cases (14). In a category of one are states like Ohio (‘Nerdiest’), Maine (‘Dumbest’) and North Dakota (‘Ugliest’).

All claims are neatly backed up by references, some of them to reliable statistics, others to less scientific straw polls. In at least one case, to paraphrase Dickens, the best of stats really is the worst of stats. Ohio’s ‘shameful’ status as nerdiest state is based on its top ranking in library visits. Yet on the ‘awesome’ map, Ohio is listed as the state with… most library visits.

Juxtaposing each state’s best and worst leads to interesting statistical pairings. But with data as haphazardly corralled together as this, causal linkage should be avoided. Otherwise it could be concluded that:

A higher degree of equality leads to an increase in suicides (Alaska);
Sunny weather induces alcoholism (Arizona);
Breastfeeding raises the risk of homelessness (Oregon).

Yet in some cases, some kind of link can be inferred. New Yorkers use more public transit than other Americans, but are also stuck with the longest commutes.

[div class=attrib]More from theSource here.[end-div]

MondayPoem: The Enigma of the Infinitesimal

Monday’s poem comes from Mark Strand over at Slate. Strand was United States Poet Laureate during 1990–91. He won the 1999 Pulitzer Prize for Poetry for “Blizzard of One”.

The poem is austere and spare, yet evocative. Is Strand conjuring the spirits and ghosts of our imagination? Perhaps not. These “[l]overs of the in-between” are more likely the creative misfits who shy away from attention and who don’t conform to societal norms. Bloggers, perhaps?

[div class=attrib]By Mark Strand for Slate:[end-div]

You’ve seen them at dusk, walking along the shore, seen them standing in doorways, leaning from windows, or straddling the slow moving edge of a shadow. Lovers of the in-between, they are neither here nor there, neither in nor out. Poor souls, they are driven to experience the impossible. Even at night, they lie in bed with one eye closed and the other open, hoping to catch the last second of consciousness and the first of sleep, to inhabit that no man’s land, that beautiful place, to behold as only a god might, the luminous conjunction of nothing and all.

[div class=attrib]Listen to the author read his poem at theSource here.[end-div]

Book Review: The First Detective

A new book by James Morton examines the life and times of the cross-dressing burglar, prison escapee and snitch turned super-detective Eugène-François Vidocq.

[div class=attrib]From The Barnes & Noble Review:[end-div]

The daring costumed escapes and bedsheet-rope prison breaks of the old romances weren’t merely creaky plot devices; they were also the objective correlatives of the lost politics of early modern Europe. Not yet susceptible to legislative amelioration, rules and customs that seemed both indefensible and unassailable had to be vaulted over like collapsing bridges or tunneled under like manor walls. Not only fictional musketeers but such illustrious figures as the young Casanova and the philosopher Jean-Jacques Rousseau spent their early years making narrow escapes from overlapping orthodoxies, swimming moats to marriages of convenience and digging their way out of prisons of privilege by dressing in drag or posing as noblemen’s sons. If one ran afoul of the local clergy or some aristocratic cuckold, there were always new bishops and magistrates to charm in the next diocese or département.

In 1775–roughly a generation after the exploits of Rousseau and Casanova–a prosperous baker’s son named Eugène-François Vidocq was born in Arras, in northern France. Indolent and adventuresome, he embarked upon a career that in its early phase looked even more hapless and disastrous than those of his illustrious forebears. An indifferent soldier in the chaotic, bloody interregnum of revolutionary France, Vidocq quickly fell into petty crime (at one point, he assumed the name Rousseau for a time as an alias and nom de guerre). A hapless housebreaker and a credulous co-conspirator, his criminal misadventures were equaled only by his skill escaping from the dungeons and bagnes that passed for a penal system in the pre-Napoleonic era.

By 1809, his canniness as an informer landed him a job with the police; with his old criminal comrades as willing foot soldiers, Vidocq organized a brigade de sûreté, a unit of plainclothes police, which in 1813 Napoleon made an official organ of state security. Throughout his subsequent career he would lay much of the foundation of modern policing, and he may be considered a forebear not only to the Dupins and Holmeses of modern detective literature but also to swashbuckling, above-the-law policemen like Eliot Ness and J. Edgar Hoover.

[div class=attrib]More from theSource here.[end-div]

When the multiverse and many-worlds collide

[div class=attrib]From the New Scientist:[end-div]

TWO of the strangest ideas in modern physics – that the cosmos constantly splits into parallel universes in which every conceivable outcome of every event happens, and the notion that our universe is part of a larger multiverse – have been unified into a single theory. This solves a bizarre but fundamental problem in cosmology and has set physics circles buzzing with excitement, as well as some bewilderment.

The problem is the observability of our universe. While most of us simply take it for granted that we should be able to observe our universe, it is a different story for cosmologists. When they apply quantum mechanics – which successfully describes the behaviour of very small objects like atoms – to the entire cosmos, the equations imply that it must exist in many different states simultaneously, a phenomenon called a superposition. Yet that is clearly not what we observe.

Cosmologists reconcile this seeming contradiction by assuming that the superposition eventually “collapses” to a single state. But they tend to ignore the problem of how or why such a collapse might occur, says cosmologist Raphael Bousso at the University of California, Berkeley. “We’ve no right to assume that it collapses. We’ve been lying to ourselves about this,” he says.

In an attempt to find a more satisfying way to explain the universe’s observability, Bousso, together with Leonard Susskind at Stanford University in California, turned to the work of physicists who have puzzled over the same problem but on a much smaller scale: why tiny objects such as electrons and photons exist in a superposition of states but larger objects like footballs and planets apparently do not.

This problem is captured in the famous thought experiment of Schrödinger’s cat. This unhappy feline is inside a sealed box containing a vial of poison that will break open when a radioactive atom decays. Being a quantum object, the atom exists in a superposition of states – so it has both decayed and not decayed at the same time. This implies that the vial must be in a superposition of states too – both broken and unbroken. And if that’s the case, then the cat must be both dead and alive as well.

[div class=attrib]More from theSource here.[end-div]

Dark energy spotted in the cosmic microwave background

[div class=attrib]From Institute of Physics:[end-div]

Astronomers studying the cosmic microwave background (CMB) have uncovered new direct evidence for dark energy – the mysterious substance that appears to be accelerating the expansion of the universe. Their findings could also help map the structure of dark matter on the universe’s largest length scales.

The CMB is the faint afterglow of the universe’s birth in the Big Bang. Around 400,000 years after its creation, the universe had cooled sufficiently to allow electrons to bind to atomic nuclei. This “recombination” set the CMB radiation free from the dense fog of plasma that was containing it. Space telescopes such as WMAP and Planck have charted the CMB and found its presence in all parts of the sky, with a temperature of 2.7 K. However, measurements also show tiny fluctuations in this temperature on the scale of one part in a million. These fluctuations follow a Gaussian distribution.

In the first of two papers, a team of astronomers including Sudeep Das at the University of California, Berkeley, has uncovered fluctuations in the CMB that deviate from this Gaussian distribution. The deviations, observed with the Atacama Cosmology Telescope in Chile, are caused by interactions with large-scale structures in the universe, such as galaxy clusters. “On average, a CMB photon will have encountered around 50 large-scale structures before it reaches our telescope,” Das told physicsworld.com. “The gravitational influence of these structures, which are dominated by massive clumps of dark matter, will each deflect the path of the photon,” he adds. This process, called “lensing”, eventually adds up to a total deflection of around 3 arc minutes – one-20th of a degree.

Dark energy versus structure

In the second paper Das, along with Blake Sherwin of Princeton University and Joanna Dunkley of Oxford University, looks at how lensing could reveal dark energy. Dark energy acts to counter the emergence of structures within the universe. A universe with no dark energy would have a lot of structure. As a result, the CMB photons would undergo greater lensing and the fluctuations would deviate more from the original Gaussian distribution.

[div class=attrib]More from theSource here.[end-div]

Green Bootleggers and Baptists

[div class=attrib]Bjørn Lomborg for Project Syndicate:[end-div]

In May, the United Nations’ Intergovernmental Panel on Climate Change made media waves with a new report on renewable energy. As in the past, the IPCC first issued a short summary; only later would it reveal all of the data. So it was left up to the IPCC’s spin-doctors to present the take-home message for journalists.

The first line of the IPCC’s press release declared, “Close to 80% of the world‘s energy supply could be met by renewables by mid-century if backed by the right enabling public policies.” That story was repeated by media organizations worldwide.

Last month, the IPCC released the full report, together with the data behind this startlingly optimistic claim. Only then did it emerge that it was based solely on the most optimistic of 164 modeling scenarios that researchers investigated. And this single scenario stemmed from a single study that was traced back to a report by the environmental organization Greenpeace. The author of that report – a Greenpeace staff member – was one of the IPCC’s lead authors.

The claim rested on the assumption of a large reduction in global energy use. Given the number of people climbing out of poverty in China and India, that is a deeply implausible scenario.

When the IPCC first made the claim, global-warming activists and renewable-energy companies cheered. “The report clearly demonstrates that renewable technologies could supply the world with more energy than it would ever need,” boasted Steve Sawyer, Secretary-General of the Global Wind Energy Council.

This sort of behavior – with activists and big energy companies uniting to applaud anything that suggests a need for increased subsidies to alternative energy – was famously captured by the so-called “bootleggers and Baptists” theory of politics.

The theory grew out of the experience of the southern United States, where many jurisdictions required stores to close on Sunday, thus preventing the sale of alcohol. The regulation was supported by religious groups for moral reasons, but also by bootleggers, because they had the market to themselves on Sundays. Politicians would adopt the Baptists’ pious rhetoric, while quietly taking campaign contributions from the criminals.

Of course, today’s climate-change “bootleggers” are not engaged in any illegal behavior. But the self-interest of energy companies, biofuel producers, insurance firms, lobbyists, and others in supporting “green” policies is a point that is often missed.

Indeed, the “bootleggers and Baptists” theory helps to account for other developments in global warming policy over the past decade or so. For example, the Kyoto Protocol would have cost trillions of dollars, but would have achieved a practically indiscernible difference in stemming the rise in global temperature. Yet activists claimed that there was a moral obligation to cut carbon-dioxide emissions, and were cheered on by businesses that stood to gain.

[div class=attrib]More from theSource here[end-div]

Hello Internet; Goodbye Memory

Imagine a world without books; you’d have to commit useful experiences, narratives and data to handwritten form and memory. Imagine a world without the internet and real-time search; you’d have to rely on a trusted expert or a printed dictionary to find answers to your questions. Imagine a world without the written word; you’d have to revert to memory and oral tradition to pass on meaningful life lessons and stories.

Technology is a wonderfully double-edged mechanism. It brings convenience. It helps in most aspects of our lives. Yet it also brings fundamental cognitive change that brain scientists have only recently begun to fathom. Recent studies, including the one cited below from Columbia University, explore this in detail.

[div class=attrib]From Technology Review:[end-div]

A study says that we rely on external tools, including the Internet, to augment our memory.

The flood of information available online with just a few clicks and finger-taps may be subtly changing the way we retain information, according to a new study. But this doesn’t mean we’re becoming less mentally agile or thoughtful, say the researchers involved. Instead, the change can be seen as a natural extension of the way we already rely upon social memory aids—like a friend who knows a particular subject inside out.

Researchers and writers have debated over how our growing reliance on Internet-connected computers may be changing our mental faculties. The constant assault of tweets and YouTube videos, the argument goes, might be making us more distracted and less thoughtful—in short, dumber. However, there is little empirical evidence of the Internet’s effects, particularly on memory.

Betsy Sparrow, assistant professor of psychology at Columbia University and lead author of the new study, put college students through a series of four experiments to explore this question.

One experiment involved participants reading and then typing out a series of statements, like “Rubber bands last longer when refrigerated,” on a computer. Half of the participants were told that their statements would be saved, and the other half were told they would be erased. Additionally, half of the people in each group were explicitly told to remember the statements they typed, while the other half were not. Participants who believed the statements would be erased were better at recalling them, regardless of whether they were told to remember them.

[div class=attrib]More from theSource here.[end-div]

The Good, the Bad and the Ugly – 40 years on

One of the most fascinating and (in)famous experiments in social psychology began in the bowels of Stanford University 40 years ago next month. The experiment was intended to evaluate how people react to being powerless, though in the end it offered a broader look at role assignment and reaction to authority.

The Stanford Prison Experiment incarcerated male college-student volunteers in a mock prison for six fateful days. Some of the students were selected to be prison guards; the remainder would be prisoners. The researchers, led by psychology professor Philip Zimbardo, encouraged the guards to think of themselves as actual guards in a real prison. What happened during those six days in “prison” is the stuff of social-science legend. The results continue to shock psychologists to this day; many were not prepared for the outcome, which saw guards take their roles to the extreme, becoming authoritarian and mentally abusive, and prisoners become rebellious and, eventually, downtrodden. A whistle-blower finally brought the experiment to an abrupt end (it was to have continued for two weeks).

Forty years on, researchers went back to interview Professor Zimbardo and some of the participating guards and prisoners to probe how they feel now. Recollections from one of the guards appear below.

[div class=attrib]From Stanford Magazine:[end-div]

I was just looking for some summer work. I had a choice of doing this or working at a pizza parlor. I thought this would be an interesting and different way of finding summer employment.

The only person I knew going in was John Mark. He was another guard and wasn’t even on my shift. That was critical. If there were prisoners in there who knew me before they encountered me, then I never would have been able to pull off anything I did. The act that I put on—they would have seen through it immediately.

What came over me was not an accident. It was planned. I set out with a definite plan in mind, to try to force the action, force something to happen, so that the researchers would have something to work with. After all, what could they possibly learn from guys sitting around like it was a country club? So I consciously created this persona. I was in all kinds of drama productions in high school and college. It was something I was very familiar with: to take on another personality before you step out on the stage. I was kind of running my own experiment in there, by saying, “How far can I push these things and how much abuse will these people take before they say, ‘knock it off?'” But the other guards didn’t stop me. They seemed to join in. They were taking my lead. Not a single guard said, “I don’t think we should do this.”

The fact that I ramped up the intimidation and the mental abuse without any real sense as to whether I was hurting anybody— I definitely regret that. But in the long run, no one suffered any lasting damage. When the Abu Ghraib scandal broke, my first reaction was, this is so familiar to me. I knew exactly what was going on. I could picture myself in the middle of that and watching it spin out of control. When you have little or no supervision as to what you’re doing, and no one steps in and says, “Hey, you can’t do this”—things just keep escalating. You think, how can we top what we did yesterday? How do we do something even more outrageous? I felt a deep sense of familiarity with that whole situation.

Sometimes when people know about the experiment and then meet me, it’s like, My God, this guy’s a psycho! But everyone who knows me would just laugh at that.

[div class=attrib]More from theSource here.[end-div]

3D Printing – A demonstration

Three-dimensional “printing” has been around for a few years now, but the technology continues to advance by leaps and bounds. It has already progressed to such an extent that some 3D printers can now “print” objects with moving parts, and in color as well. And we all thought those cool replicator machines in Star Trek were the stuff of science fiction.

[tube]LQfYm4ZVcVI[/tube]

Book Review: “Millennium People”: J.G. Ballard’s last hurrah

[div class=attrib]From Salon:[end-div]

In this, his last novel, the darkly comic “Millennium People,” J.G. Ballard returns to many of the themes that have established him as one of the 20th century’s principal chroniclers of modernity as dystopia. Throughout his career Ballard, who died in 2009, wrote many different variations on the same theme: A random act of violence propels a somewhat affectless protagonist into a violent pathology lurking just under the tissue-thin layer of postmodern civilization. As in “Crash” (1973) and “Concrete Island” (1974), the car parks, housing estates, motorways and suburban sprawl of London in “Millennium People” form a psychological geography. At its center, Heathrow Airport — a recurrent setting for Ballard — exerts its subtly malevolent pull on the bored lives and violent dreams of the alienated middle class.

“Millennium People” begins with the explosion of a bomb at Heathrow, which kills the ex-wife of David Markham, an industrial psychologist. The normally passive Markham sets out to investigate the anonymous bombing and the gated community of Chelsea Marina, a middle-class neighborhood that has become ground zero for a terrorist group and a burgeoning rebellion of London’s seemingly docile middle class. Exploited not so much for their labor as for their deeply ingrained and self-policing sense of social responsibility and good manners, the educated and professional residents of Chelsea Marina regard themselves as the “new proletariat,” with their exorbitant maintenance and parking fees as the new form of oppression, their careers, cultured tastes and education the new gulag.

In the company of a down-and-out priest and a film professor turned Che Guevara of the Volvo set, Markham quickly discovers that the line between amateur detective and amateur terrorist is not so clear, as he is drawn deeper into acts of sabotage and violence against the symbols and institutions of his own safe and sensible life. Targets include travel agencies, video stores, the Tate Modern, the BBC and National Film Theater — all “soporifics” designed to con people into believing their lives are interesting or going somewhere.

[div class=attrib]More from theSource here.[end-div]

Happy Birthday Neptune

One hundred and sixty-four years ago, or one Neptunian year, Neptune was first observed through a telescope and identified as a planet. Significantly, it was the first planet to be discovered deliberately: the existence and location of the gas giant were calculated mathematically, and it was then located by telescope, on 24 September 1846, within one degree of the predicted position. Astronomers had hypothesized Neptune’s existence from perturbations in the orbit of its planetary neighbor, Uranus, which could only be explained by the presence of another object in a nearby orbit. A triumph for the scientific method, and besides, it’s beautiful too.

[div class=attrib]Image courtesy of NASA.[end-div]

Culturally Specific Mental Disorders: A Bad Case of the Brain Fags

Is this man buff enough? Image courtesy of Slate

If you happen to have just read The Psychopath Test by Jon Ronson, this article in Slate is appropriately timely, and presents new fodder for continuing research (and a sequel). It would therefore come as no surprise to find Mr. Ronson trekking through Newfoundland in search of “Old Hag Syndrome”, a type of sleep paralysis, visiting art museums in Italy for “Stendhal Syndrome,” a delusional disorder experienced by Italians after studying artistic masterpieces, and checking on Nigerian college students afflicted by “Brain Fag Syndrome”. Then there are: “Wild Man Syndrome,” from New Guinea (a syndrome combining hyperactivity, clumsiness and forgetfulness), “Koro Syndrome” (a delusion of disappearing protruding body parts), first described in China over 2,000 years ago, “Jiko-shisen-kyofu” from Japan (a fear of offending others by glancing at them), and, here in the west, “Muscle Dysmorphia Syndrome” (a delusion common in weight-lifters that one’s body is insufficiently ripped).

All of these and more can be found in the current version of the manual, the DSM-IV (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition).

[div class=attrib]From Slate:[end-div]

In 1951, Hong Kong psychiatrist Pow-Meng Yap authored an influential paper in the Journal of Mental Sciences on the subject of “peculiar psychiatric disorders”—those that did not fit neatly into the dominant disease-model classification scheme of the time and yet appeared to be prominent, even commonplace, in certain parts of the world. Curiously these same conditions—which include “amok” in Southeast Asia and bouffée délirante in French-speaking countries—were almost unheard of outside particular cultural contexts. The American Psychiatric Association has conceded that certain mysterious mental afflictions are so common, in some places, that they do in fact warrant inclusion as “culture-bound syndromes” in the official Diagnostic and Statistical Manual of Mental Disorders.

The working version of this manual, the DSM-IV, specifies 25 such syndromes. Take “Old Hag Syndrome,” a type of sleep paralysis in Newfoundland in which one is visited by what appears to be a rather unpleasant old hag sitting on one’s chest at night. (If I were a bitter, divorced straight man, I’d probably say something diabolical about my ex-wife here.) Then there’s gururumba, or “Wild Man Syndrome,” in which New Guinean males become hyperactive, clumsy, kleptomaniacal, and conveniently amnesic, “Brain Fag Syndrome” (more on that in a moment), and “Stendhal Syndrome,” a delusional disorder experienced mostly by Italians after gazing upon artistic masterpieces. The DSM-IV defines culture-bound syndromes as “recurrent, locality-specific patterns of aberrant behavior and troubling experience that may or may not be linked to a particular diagnostic category.”

And therein lies the nosological pickle: The symptoms of culture-bound syndromes often overlap with more general, known psychiatric conditions that are universal in nature, such as schizophrenia, body dysmorphia, and social anxiety. What varies across cultures, and is presumably moulded by them, is the unique constellation of symptoms, or “idioms of distress.”

Some scholars believe that many additional distinct culture-bound syndromes exist. One that’s not in the manual but could be, argue psychiatrists Gen Kanayama and Harrison Pope in a short paper published earlier this year in the Harvard Review of Psychiatry, is “muscle dysmorphia.” The condition is limited to Western males, who suffer the delusion that they are insufficiently ripped. “As a result,” write the authors, “they may lift weights compulsively in the gym, often gain large amounts of muscle mass, yet still perceive themselves as too small.” Within body-building circles, in fact, muscle dysmorphia has long been recognized as a sort of reverse anorexia nervosa. But it’s almost entirely unheard of among Asian men. Unlike hypermasculine Western heroes such as Hercules, Thor, and the chiseled Arnold of yesteryear, the Japanese and Chinese have tended to prefer their heroes fully clothed, mentally acute, and lithe, argue Kanayama and Pope. In fact, they say anabolic steroid use is virtually nonexistent in Asian countries, even though the drugs are considerably easier to obtain, being available without a prescription at most neighborhood drugstores.

[div class=attrib]More from theSource here.[end-div]

Disconnected?

[div class=attrib]From Slate:[end-div]

Have you heard that divorce is contagious? A lot of people have. Last summer a study claiming to show that break-ups can propagate from friend to friend to friend like a marriage-eating bacillus spread across the news agar from CNN to CBS to ABC with predictable speed. “Think of this ‘idea’ of getting divorced, this ‘option’ of getting divorced like a virus, because it spreads more or less the same way,” explained University of California-San Diego professor James Fowler to the folks at Good Morning America.

It’s a surprising, quirky, and seemingly plausible finding, which explains why so many news outlets caught the bug. But one weird thing about the media outbreak was that the study on which it was based had never been published in a scientific journal. The paper had been posted to the Social Science Research Network web site, a sort of academic way station for working papers whose tagline is “Tomorrow’s Research Today.” But tomorrow had not yet come for the contagious divorce study: It had never actually passed peer review, and still hasn’t. “It is under review,” Fowler explained last week in an email. He co-authored the paper with his long-time collaborator, Harvard’s Nicholas Christakis, and lead author Rose McDermott.

A few months before the contagious divorce story broke, Slate ran an article I’d written based on a related, but also unpublished, scientific paper. The mathematician Russell Lyons had posted a dense treatise on his website suggesting that the methods employed by Christakis and Fowler in their social network studies were riddled with statistical errors at many levels. The authors were claiming—in the New England Journal of Medicine, in a popular book, in TED talks, in snappy PR videos—that everything from obesity to loneliness to poor sleep could spread from person to person to person like a case of the galloping crud. But according to Lyons and several other experts, their arguments were shaky at best. “It’s not clear that the social contagionists have enough evidence to be telling people that they owe it to their social network to lose weight,” I wrote last April. As for the theory that obesity and divorce and happiness contagions radiate from human beings through three degrees of friendship, I concluded “perhaps it’s best to flock away for now.”

The case against Christakis and Fowler has grown since then. The Lyons paper passed peer review and was published in the May issue of the journal Statistics, Politics, and Policy. Two other recent papers raise serious doubts about their conclusions. And now something of a consensus is forming within the statistics and social-networking communities that Christakis and Fowler’s headline-grabbing contagion papers are fatally flawed. Andrew Gelman, a professor of statistics at Columbia, wrote a delicately worded blog post in June noting that he’d “have to go with Lyons” and say that the claims of contagious obesity, divorce and the like “have not been convincingly demonstrated.” Another highly respected social-networking expert, Tom Snijders of Oxford, called the mathematical model used by Christakis and Fowler “not coherent.” And just a few days ago, Cosma Shalizi, a statistician at Carnegie Mellon, declared, “I agree with pretty much everything Snijders says.”

[div class=attrib]More from theSource here.[end-div]

MondayPoem: If You Forget Me

Pablo Neruda (1904–1973)

[div class=attrib]If You Forget Me, Pablo Neruda[end-div]

I want you to know
one thing.

You know how this is:
if I look
at the crystal moon, at the red branch
of the slow autumn at my window,
if I touch
near the fire
the impalpable ash
or the wrinkled body of the log,
everything carries me to you,
as if everything that exists,
aromas, light, metals,
were little boats
that sail
toward those isles of yours that wait for me.

Well, now,
if little by little you stop loving me
I shall stop loving you little by little.

If suddenly
you forget me
do not look for me,
for I shall already have forgotten you.

If you think it long and mad,
the wind of banners
that passes through my life,
and you decide
to leave me at the shore
of the heart where I have roots,
remember
that on that day,
at that hour,
I shall lift my arms
and my roots will set off
to seek another land.

But
if each day,
each hour,
you feel that you are destined for me
with implacable sweetness,
if each day a flower
climbs up to your lips to seek me,
ah my love, ah my own,
in me all that fire is repeated,
in me nothing is extinguished or forgotten,
my love feeds on your love, beloved,
and as long as you live it will be in your arms
without leaving mine.

The Best of States, the Worst of States

[div class=attrib]From Frank Jacobs / BigThink:[end-div]

Are these maps cartograms or mere infographics?

An ‘information graphic’ is defined as any graphic representation of data. It follows from that definition that infographics are less determined by type than by purpose, which is to represent complex information in a readily graspable graphic format. Those formats often include, but are not limited to, diagrams, flow charts, and maps.

Although one definition of maps – the graphic representation of spatial data – is very similar to that of infographics, the two are easily distinguished by, among other things, the context of the latter, which are usually confined to and embedded in technical and journalistic writing.

Cartograms are a subset of infographics, limited to one type of graphic representation: maps. On these maps, one set of quantitative information (usually surface or distance) is replaced by another (often demographic data or electoral results). The result is an informative distortion of the map (1).

The distortion on these maps is not of the distance-bending or surface-stretching kind. It merely substitutes the names of US states with statistical information relevant to each of them (2). This substitution is non-quantitative, affecting the toponymy rather than the topography of the map. So is this a mere infographic? As the information presented is statistical (each label describes each state as first or last in a Top 50), I’d say this is – if you’ll excuse the pun – a borderline case.

What’s more relevant, from this blog’s perspective, is that it is an atypical, curious and entertaining use of cartography.

The first set of maps labels each and every one of the states as best and worst at something. All of those distinctions, both the favourable and the unfavourable kind, are backed up by some sort of evidence.

The first map, the United States of Awesome, charts fifty things that each state of the Union is best at. Most of those indicators, 12 in all, are related to health and well-being (3). Ten are economic (4), six environmental (5), five educational (6). Three can be classified as ‘moral’, even if these particular distinctions make for strange bedfellows (7).

The best thing that can be said about Missouri and Illinois, apparently, is that they’re extremely average (8). While that may excite few people, it will greatly interest political pollsters and anyone in need of a focus group. Virginia and Indiana are the states with the most birthplaces of presidents and vice-presidents, respectively. South Carolinians prefer to spend their time golfing, Pennsylvanians hunting. Violent crime is lowest in Maine, public corruption in Nebraska. The most bizarre distinctions, finally, are reserved for New Mexico (Spaceport Home), Oklahoma (Best Licence Plate) and Missouri (Bromine Production). If that’s the best thing about those states, what might be the worst?

[div class=attrib]More from theSource here.[end-div]

Cy Twombly, Idiosyncratic Painter, Dies at 83

Cy Twombly. Image courtesy of Sundance Channel

[div class=attrib]From the New York Times:[end-div]

Cy Twombly, whose spare childlike scribbles and poetic engagement with antiquity left him stubbornly out of step with the movements of postwar American art even as he became one of the era’s most important painters, died in Rome Tuesday. He was 83.

The cause was not immediately known, although Mr. Twombly had suffered from cancer. His death was announced by the Gagosian Gallery, which represents his work.

In a career that slyly subverted Abstract Expressionism, toyed briefly with Minimalism, seemed barely to acknowledge Pop Art and anticipated some of the concerns of Conceptualism, Mr. Twombly was a divisive artist almost from the start. The curator Kirk Varnedoe, on the occasion of a 1994 retrospective at the Museum of Modern Art, wrote that his work was “influential among artists, discomfiting to many critics and truculently difficult not just for a broad public, but for sophisticated initiates of postwar art as well.” The critic Robert Hughes called him “the Third Man, a shadowy figure, beside that vivid duumvirate of his friends Jasper Johns and Robert Rauschenberg.”

Mr. Twombly’s decision to settle permanently in southern Italy in 1957 as the art world shifted decisively in the other direction, from Europe to New York, was only the most symbolic of his idiosyncrasies. He avoided publicity throughout his life and mostly ignored his critics, who questioned constantly whether his work deserved a place at the forefront of 20th-century abstraction, though he lived long enough to see it arrive there. It didn’t help that his paintings, because of their surface complexity and whirlwinds of tiny detail – scratches, erasures, drips, penciled fragments of Italian and classical verse amid scrawled phalluses and buttocks – lost much of their power in reproduction.

But Mr. Twombly, a tall, rangy Virginian who once practiced drawing in the dark to make his lines less purposeful, steadfastly followed his own program and looked to his own muses: often literary ones like Catullus, Rumi, Pound and Rilke. He seemed to welcome the privacy that came with unpopularity.

“I had my freedom and that was nice,” he said in a rare interview, with Nicholas Serota, the director of the Tate, before a 2008 survey of his career at the Tate Modern.

The critical low point probably came after a 1964 exhibition at the Leo Castelli Gallery in New York that was widely panned. The artist and writer Donald Judd, who was hostile toward painting in general, was especially damning even so, calling the show a fiasco. “There are a few drips and splatters and an occasional pencil line,” he wrote in a review. “There isn’t anything to these paintings.”

[div class=attrib]More from theSource here.[end-div]

MondayPoem: Let America Be America Again

[div class=attrib]Let America Be America Again, Langston Hughes[end-div]

Let America be America again.
Let it be the dream it used to be.
Let it be the pioneer on the plain
Seeking a home where he himself is free.

(America never was America to me.)

Let America be the dream the dreamers dreamed–
Let it be that great strong land of love
Where never kings connive nor tyrants scheme
That any man be crushed by one above.

(It never was America to me.)

O, let my land be a land where Liberty
Is crowned with no false patriotic wreath,
But opportunity is real, and life is free,
Equality is in the air we breathe.

(There’s never been equality for me,
Nor freedom in this “homeland of the free.”)

Say, who are you that mumbles in the dark?
And who are you that draws your veil across the stars?

I am the poor white, fooled and pushed apart,
I am the Negro bearing slavery’s scars.
I am the red man driven from the land,
I am the immigrant clutching the hope I seek–
And finding only the same old stupid plan
Of dog eat dog, of mighty crush the weak.

I am the young man, full of strength and hope,
Tangled in that ancient endless chain
Of profit, power, gain, of grab the land!
Of grab the gold! Of grab the ways of satisfying need!
Of work the men! Of take the pay!
Of owning everything for one’s own greed!

I am the farmer, bondsman to the soil.
I am the worker sold to the machine.
I am the Negro, servant to you all.
I am the people, humble, hungry, mean–
Hungry yet today despite the dream.
Beaten yet today–O, Pioneers!
I am the man who never got ahead,
The poorest worker bartered through the years.

Yet I’m the one who dreamt our basic dream
In the Old World while still a serf of kings,
Who dreamt a dream so strong, so brave, so true,
That even yet its mighty daring sings
In every brick and stone, in every furrow turned
That’s made America the land it has become.
O, I’m the man who sailed those early seas
In search of what I meant to be my home–
For I’m the one who left dark Ireland’s shore,
And Poland’s plain, and England’s grassy lea,
And torn from Black Africa’s strand I came
To build a “homeland of the free.”

The free?

Who said the free? Not me?
Surely not me? The millions on relief today?
The millions shot down when we strike?
The millions who have nothing for our pay?
For all the dreams we’ve dreamed
And all the songs we’ve sung
And all the hopes we’ve held
And all the flags we’ve hung,
The millions who have nothing for our pay–
Except the dream that’s almost dead today.

O, let America be America again–
The land that never has been yet–
And yet must be–the land where every man is free.
The land that’s mine–the poor man’s, Indian’s, Negro’s, ME–
Who made America,
Whose sweat and blood, whose faith and pain,
Whose hand at the foundry, whose plow in the rain,
Must bring back our mighty dream again.

Sure, call me any ugly name you choose–
The steel of freedom does not stain.
From those who live like leeches on the people’s lives,
We must take back our land again,
America!

O, yes,
I say it plain,
America never was America to me,
And yet I swear this oath–
America will be!

Out of the rack and ruin of our gangster death,
The rape and rot of graft, and stealth, and lies,
We, the people, must redeem
The land, the mines, the plants, the rivers.
The mountains and the endless plain–
All, all the stretch of these great green states–
And make America again!

MondayPoem: Morning In The Burned House

[div class=attrib]Morning In The Burned House, Margaret Atwood[end-div]

In the burned house I am eating breakfast.
You understand: there is no house, there is no breakfast,
yet here I am.

The spoon which was melted scrapes against
the bowl which was melted also.
No one else is around.

Where have they gone to, brother and sister,
mother and father? Off along the shore,
perhaps. Their clothes are still on the hangers,

their dishes piled beside the sink,
which is beside the woodstove
with its grate and sooty kettle,

every detail clear,
tin cup and rippled mirror.
The day is bright and songless,

the lake is blue, the forest watchful.
In the east a bank of cloud
rises up silently like dark bread.

I can see the swirls in the oilcloth,
I can see the flaws in the glass,
those flares where the sun hits them.

I can’t see my own arms and legs
or know if this is a trap or blessing,
finding myself back here, where everything

in this house has long been over,
kettle and mirror, spoon and bowl,
including my own body,

including the body I had then,
including the body I have now
as I sit at this morning table, alone and happy,

bare child’s feet on the scorched floorboards
(I can almost see)
in my burning clothes, the thin green shorts

and grubby yellow T-shirt
holding my cindery, non-existent,
radiant flesh. Incandescent.

Cosmic Smoothness

Simulations based on the standard cosmological model, as shown here, indicate that on very large distance scales, galaxies should be uniformly distributed. But observations show a clumpier distribution than expected. (The length bar represents about 2.3 billion light years.)

[div class=attrib]From American Physical Society, Michael J. Hudson:[end-div]

The universe is expected to be very nearly homogeneous in density on large scales. In Physical Review Letters, Shaun Thomas and colleagues from University College London analyze measurements of the density of galaxies on the largest spatial scales so far—billions of light years—and find that the universe is less smooth than expected. If it holds up, this result will have important implications for our understanding of dark matter, dark energy, and perhaps gravity itself.

In the current standard cosmological model, the average mass-energy density of the observable universe consists of 5% normal matter (most of which is hydrogen and helium), 23% dark matter, and 72% dark energy. The dark energy is assumed to be uniform, but the normal and dark matter are not. The balance between matter and dark energy determines both how the universe expands and how regions of unusually high or low matter density evolve with time.

The same cosmological model predicts the statistics of the nonuniform structure and their dependence on spatial scale. On scales that are small by cosmological standards, fluctuations in the matter density are comparable to its mean, in agreement with what is seen: matter is clumped into galaxies, clusters of galaxies, and filaments of the “cosmic web.” On larger scales, however, the contrast of the structures compared to the mean density decreases. On the largest cosmological scales, these density fluctuations are small in amplitude compared to the average density of the universe and so are well described by linear perturbation theory (see simulation results in Fig. 1). Moreover, these perturbations can be calibrated at early times directly from the cosmic microwave background (CMB), a snapshot of the universe from when it was only 380,000 years old. Despite the fact that only 5% of the Universe is well understood, this model is an excellent fit to data spanning a wide range of spatial scales as the fluctuations evolved from the time of the CMB to the present age of the universe, some 13.8 billion years. On the largest scales, dark energy drives accelerated expansion of the universe. Because this aspect of the standard model is least understood, it is important to test it on these scales.

Thomas et al. use publicly-released catalogs from the Sloan Digital Sky Survey to select more than 700,000 galaxies whose observed colors indicate a significant redshift and are therefore presumed to be at large cosmological distances. They use the redshift of the galaxies, combined with their observed positions on the sky, to create a rough three-dimensional map of the galaxies in space and to assess the homogeneity on scales of a couple of billion light years. One complication is that Thomas et al. measure the density of galaxies, not the density of all matter, but we expect fluctuations of these two densities about their means to be proportional; the constant of proportionality can be calibrated by observations on smaller scales. Indeed, on small scales the galaxy data are in good agreement with the standard model. On the largest scales, the fluctuations in galaxy density are expected to be of order a percent of the mean density, but Thomas et al. find fluctuations double this prediction. This result then suggests that the universe is less homogeneous than expected.

This result is not entirely new: previous studies based on subsets of the data studied by Thomas et al. showed the same effect, albeit with a lower statistical significance. In addition, there are other ways of probing the large-scale mass distribution. For example, inhomogeneities in the mass distribution lead to inhomogeneities in the local rate of expansion. Some studies have suggested that, on very large scales, this expansion too is less homogeneous than the model predictions.

Future large-scale surveys will produce an avalanche of data. These surveys will allow the methods employed by Thomas et al. and others to be extended to still larger scales. Of course, the challenge for these future surveys will be to correct for the systematic effects to even greater accuracy.

[div class=attrib]More from theSource here.[end-div]
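As an aside, the core of the measurement described above is easy to sketch: estimate the galaxy density contrast in cells of a given size and compare the amplitude of its fluctuations with the model prediction. The Python snippet below is a minimal counts-in-cells illustration only, not the authors’ actual pipeline; the uniform toy catalog, the cell size and the bias value are all made-up assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy catalog: uniformly random galaxy positions in a unit box (arbitrary units).
# In the real analysis, positions come from SDSS sky coordinates plus redshifts.
n_galaxies = 700_000
box_size = 1.0
positions = rng.random((n_galaxies, 3)) * box_size

# Counts-in-cells: divide the box into cells and count galaxies in each one.
n_cells = 20  # cells per side; an arbitrary choice for this sketch
counts, _ = np.histogramdd(positions, bins=(n_cells,) * 3,
                           range=[(0.0, box_size)] * 3)

# Density contrast in each cell: delta = (n - <n>) / <n>.
mean_count = counts.mean()
delta_gal = (counts - mean_count) / mean_count

# Galaxies trace matter only up to a bias factor b, so delta_matter ~ delta_gal / b.
bias = 2.0  # assumed linear bias; in practice it is calibrated on smaller scales
delta_matter = delta_gal / bias

# The headline statistic is the amplitude (rms) of these fluctuations,
# which is what gets compared with the standard-model prediction.
print(f"rms galaxy density contrast:  {delta_gal.std():.4f}")
print(f"rms inferred matter contrast: {delta_matter.std():.4f}")
```

On a uniform toy catalog like this one the measured scatter is just shot noise; the point of the real analysis is that the observed scatter on billion-light-year scales comes out roughly double the standard-model expectation.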

Self-Published Author Sells a Million E-Books on Amazon

[div class=attrib]From ReadWriteWeb:[end-div]

Since the Kindle’s launch, Amazon has heralded each new arrival into what it calls the “Kindle Million Club,” the group of authors who have sold over 1 million Kindle e-books. There have been seven authors in this club up ’til now – some of the big names in publishing: Stieg Larsson, James Patterson, and Nora Roberts for example.

But the admission today of the eighth member of this club is really quite extraordinary. Not because John Locke is a 60-year-old former insurance salesman from Kentucky with no writing or publishing background, but because John Locke has accomplished the feat of selling one million e-books as a completely self-published author.

Rather than being published by a major publishing house – with all the perks that have long been associated with that (marketing, book tours, prime shelf space in retail stores) – Locke has sold 1,010,370 Kindle books (as of yesterday), having used Kindle Direct Publishing to get his e-books into the Amazon store. No major publisher. No major marketing.

Locke writes primarily crime and adventure stories, including Vegas Moon, Wish List, and the New York Times E-Book Bestseller, Saving Rachel. Most of the e-books sell for $.99, and he says he makes 35 cents on every sale. That sort of per book profit is something that authors would never get from a traditional book deal.

[div class=attrib]More from theSource here.[end-div]

Book Review: Solar. Ian McEwan

Solar is a timely, hilarious novel from the author of Atonement that examines the self-absorption and (self-)deceptions of Nobel Prize-winning physicist Michael Beard. With his best work many decades behind him, Beard trades on his professional reputation to earn continuing financial favor and to maintain influence and respect amongst his peers. And with his personal life in an ever-decreasing spiral and his fifth marriage coming to an end, Beard manages to entangle himself in an impossible accident that has the power to re-shape his own world, and the planet in the process.

Ian McEwan’s depiction of Michael Beard is engaging and thoroughly entertaining. Beard hops from relationship to relationship in his singular quest for “love”, but very much on his own terms. This self-centered outlook extends to his science, where his personal contributions don’t seem to be all they appear. Satire and climate science make a stylish and witty combination in McEwan’s hands.

Book Review: The Social Animal. David Brooks

David Brooks brings us a detailed journey through the building blocks of the self in his new book, The Social Animal: The Hidden Sources of Love, Character, and Achievement. With his insight and gift for narrative, Brooks weaves an engaging and compelling story of Erica and Harold, two characters he uses as platforms on which to visualize the results of numerous psychological, social and cultural studies. Set in contemporary times, the two characters give us a practical, holistic picture of the unconscious effects of physical and social context on behavioral and character traits. The narrative takes us through typical life events and stages: infancy, childhood, school, parenting, work life, attachment, aging. At each stage, Brooks illustrates his views of the human condition by selecting a flurry of facts and anecdotal studies.

The psychologist in me would say that this is a rather shallow attempt at synthesizing profoundly complex issues. Brooks certainly draws on many studies from the brain and social sciences, but he never dwells long enough to give us a detailed sense of the major underlying implications or of competing scientific positions. So too, the character development of Erica and Harold lacks the depth and breadth one would expect; Brooks fails to explore much of what typically motivates human behavior: greed, ambition, lust, violence, empathy. Despite these flaws in execution, Brooks’ attempt is praiseworthy; perhaps in the hands of a more skilled social scientist, or of Rousseau, who used this device to far greater effect, this type of approach would earn a better grade.

Book Review: The Drunkard’s Walk: How Randomness Rules Our Lives. Leonard Mlodinow

Leonard Mlodinow charts a compelling path through the world of statistical probability, showing us how the laws of chance affect our lives on personal and grand scales. Mlodinow skillfully illustrates randomness and its profound implications by presenting complex mathematical constructs in language for the rest of us (non-mathematicians), without dumbing down this important subject.

The book defines many of the important mathematical concepts behind randomness and exposes the key fallacies that often blind us as we wander through life on our “drunkard’s walk”. The law of large numbers, the prosecutor’s fallacy, conditional probability, the availability bias and bell curves were never so approachable.

Whether it’s a deluded gambler, a baseball star on a “winning streak” or a fortunate CEO wallowing in the good times, Mlodinow debunks the common conception that skill, planning and foresight produce outcomes significantly better than pure chance would allow. With the skill of a storyteller, Mlodinow shows us how polls, grades, ratings and even measures of corporate success are far less objective and reliable than we would like to believe. Lords of Wall Street, take notice: the secrets of your successes are not all that they seem.
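
To make the point about streaks concrete, here is a minimal Python sketch – my own illustration, not an example from the book – in which 1,000 hypothetical fund managers each make ten pure coin-flip calls, and we count how many end up looking like stars by luck alone:

    import random

    random.seed(42)

    N_MANAGERS = 1000   # hypothetical fund managers
    N_YEARS = 10        # one market call per year
    P_SUCCESS = 0.5     # every call is a pure coin flip, no skill involved

    # Count managers whose record looks like skill: 8 or more good years out of 10.
    lucky = sum(
        sum(random.random() < P_SUCCESS for _ in range(N_YEARS)) >= 8
        for _ in range(N_MANAGERS)
    )

    # With p = 0.5, P(8 or more successes in 10 trials) is about 5.5%, so
    # roughly 55 of the 1,000 managers will look like stars by chance alone.
    print(f"{lucky} of {N_MANAGERS} managers 'beat the market' in 8+ of 10 years")

Run it a few times and the count hovers in the mid-fifties – exactly the kind of “winning streak” that gets mistaken for foresight.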

Art. Does it have to be BOLD to be good?

The lengthy corridors of art history over the last five hundred years are decorated with numerous bold and monumental works. Name a handful of memorable favorites and you’ll see a pattern emerge: Guernica (Pablo Picasso), The Persistence of Memory (Salvador Dali), The Dance (Henri Matisse), The Garden of Earthly Delights (Hieronymus Bosch). Yes, these works are bold. They’re bold in the sense that they represented a fundamental shift from the artistic sensibilities and ideas of their times. These works stirred the salons and caused commotion among the “cognoscenti” and the chattering classes. They implored (or decried) the establishment to take notice of new forms, new messages, new perspectives.

And now here we are in the 21st century, floating in a bottomless bowl of bold media soup: 24-hour opinion and hyperbole, oversized interactive billboards, explosive 3D movies, voyeuristic reality TV, garish commercials, sexually charged headlines and suggestive mainstream magazines. The provocative images, the loudness, the vividness, the anger – it’s all bold and it’s all vying for your increasingly fragmented and desensitized attention. But this contemporary boldness seems more aligned with surface brightness and bigness than with depth of meaning. The boldness of works by earlier artists such as Picasso, Dali and Bosch came from depth of meaning rather than from neon paints or other bold visual noise.

So, what of contemporary art over the last couple of decades? Well, a pseudo-scientific tour of half a dozen galleries featuring of-the-moment works may well tell you the same story – it’s mostly bold as well. What’s been selling at the top art auction houses? Bold. What’s been making headlines in the art world? Bold.

The trend has been set for a while: it has to be brighter, louder, bigger. Indeed, a recent feature article in the New York Times on the 25th Paris Biennale seems to confirm this trend in Western art. (Background: the Biennale is home to around a hundred of the world’s most exclusive art galleries – those that purport to set the art world’s trends, make or break emerging artists and, most importantly for them, set “market” prices.) The article’s author, Souren Melikian, states:

Perception is changing. Interest in subtle nuances is receding as our attention span shortens. Awareness of this trend probably accounts for the recent art trade emphasis on clarity and monumentality and the striking progression of 20th-century modernity.

Well, I certainly take no issue with the observation that “commercial” art has become much more monumental and less subtle, especially over the last 40 years. By its very nature, for most art to succeed in today’s market – overflowing as it is with noise, distraction and mediocrity – it must draw someone’s fragmented and limited attention, and sadly it does this by being bold, bright or big. However, I strongly disagree that “clarity” is a direct result of this new boldness. I could recite a list as long as my arm of paintings and other works that show remarkable clarity even while remaining subtle.

Perhaps paradoxically, brokers and buyers of the bold seem to associate boldness exclusively with modernity, compositional complexity and layered meaning, while the galleries at the Biennale seem to confuse subtlety with dullness, simplicity and shallowness. Yet the world is full of works that exhibit just as much richness, depth and emotion as their bolder counterparts despite their surface subtlety. There is room for reflection and nuanced mood; there is room for complexity and depth of meaning within simple composition; there is room for pastels in this over-saturated, bold neon world.

As Bob Duggan eloquently states, at BigThink:

The meek, such as 2009 Turner Prize winner Richard Wright (reviewed recently by me here) may yet inherit the earth, but only in a characteristically quiet way. Hirst’s jewel-encrusted skulls will always grab headlines, but Wright’s simpler, pensive work can engage hearts and minds in a more fulfilling way. And why is it important that the right thing happens and the Wrights win out over the Hirsts? Because art remains one of the few havens for thought in our noise- and light-polluted world.

So, I’m encouraged to see that I am not yet a lost and lone voice in this noisy wilderness of bold brashness. Oh, and in case you’re wondering what a meaningfully complex yet subtle painting looks like, gaze at Half Light by Dana Blanchard above.

The Evolution of the Physicist’s Picture of Nature

[div class=attrib]From Scientific American:[end-div]

Editor’s Note: We are republishing this article by Paul Dirac from the May 1963 issue of Scientific American, as it might be of interest to listeners of the June 24 and June 25, 2010, Science Talk podcasts, featuring award-winning writer and physicist Graham Farmelo discussing The Strangest Man, his biography of the Nobel Prize-winning British theoretical physicist Paul Dirac.

In this article I should like to discuss the development of general physical theory: how it developed in the past and how one may expect it to develop in the future. One can look on this continual development as a process of evolution, a process that has been going on for several centuries.

The first main step in this process of evolution was brought about by Newton. Before Newton, people looked on the world as being essentially two-dimensional – the two dimensions in which one can walk about – and the up-and-down dimension seemed to be something essentially different. Newton showed how one can look on the up-and-down direction as being symmetrical with the other two directions, by bringing in gravitational forces and showing how they take their place in physical theory. One can say that Newton enabled us to pass from a picture with two-dimensional symmetry to a picture with three-dimensional symmetry.

Einstein made another step in the same direction, showing how one can pass from a picture with three-dimensional symmetry to a picture with four-dimensional symmetry. Einstein brought in time and showed how it plays a role that is in many ways symmetrical with the three space dimensions. However, this symmetry is not quite perfect. With Einstein’s picture one is led to think of the world from a four-dimensional point of view, but the four dimensions are not completely symmetrical. There are some directions in the four-dimensional picture that are different from others: directions that are called null directions, along which a ray of light can move; hence the four-dimensional picture is not completely symmetrical. Still, there is a great deal of symmetry among the four dimensions. The only lack of symmetry, so far as concerns the equations of physics, is in the appearance of a minus sign in the equations with respect to the time dimension as compared with the three space dimensions [see top equation in diagram].
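
The minus sign Dirac refers to is the one in the special-relativistic interval. The original diagram is not reproduced here, but the standard expression (my reconstruction, not the article’s figure) is:

    \[
      s^{2} = x^{2} + y^{2} + z^{2} - c^{2}t^{2}
    \]

The three space coordinates enter symmetrically while the time coordinate carries the opposite sign, and the null directions Dirac mentions are exactly those along which $s^{2} = 0$, the paths available to a ray of light.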

[Figure: four-dimensional symmetry equation and Schrödinger’s equation]

We have, then, the development from the three-dimensional picture of the world to the four-dimensional picture. The reader will probably not be happy with this situation, because the world still appears three-dimensional to his consciousness. How can one bring this appearance into the four-dimensional picture that Einstein requires the physicist to have?

What appears to our consciousness is really a three-dimensional section of the four-dimensional picture. We must take a three-dimensional section to give us what appears to our consciousness at one time; at a later time we shall have a different three-dimensional section. The task of the physicist consists largely of relating events in one of these sections to events in another section referring to a later time. Thus the picture with four-dimensional symmetry does not give us the whole situation. This becomes particularly important when one takes into account the developments that have been brought about by quantum theory. Quantum theory has taught us that we have to take the process of observation into account, and observations usually require us to bring in the three-dimensional sections of the four-dimensional picture of the universe.

The special theory of relativity, which Einstein introduced, requires us to put all the laws of physics into a form that displays four-dimensional symmetry. But when we use these laws to get results about observations, we have to bring in something additional to the four-dimensional symmetry, namely the three-dimensional sections that describe our consciousness of the universe at a certain time.

Einstein made another most important contribution to the development of our physical picture: he put forward the general theory of relativity, which requires us to suppose that the space of physics is curved. Before this physicists had always worked with a flat space, the three-dimensional flat space of Newton which was then extended to the four-dimensional flat space of special relativity. General relativity made a really important contribution to the evolution of our physical picture by requiring us to go over to curved space. The general requirements of this theory mean that all the laws of physics can be formulated in curved four-dimensional space, and that they show symmetry among the four dimensions. But again, when we want to bring in observations, as we must if we look at things from the point of view of quantum theory, we have to refer to a section of this four-dimensional space. With the four-dimensional space curved, any section that we make in it also has to be curved, because in general we cannot give a meaning to a flat section in a curved space. This leads us to a picture in which we have to take curved three-dimensional sections in the curved four-dimensional space and discuss observations in these sections.

During the past few years people have been trying to apply quantum ideas to gravitation as well as to the other phenomena of physics, and this has led to a rather unexpected development, namely that when one looks at gravitational theory from the point of view of the sections, one finds that there are some degrees of freedom that drop out of the theory. The gravitational field is a tensor field with 10 components. One finds that six of the components are adequate for describing everything of physical importance and the other four can be dropped out of the equations. One cannot, however, pick out the six important components from the complete set of 10 in any way that does not destroy the four-dimensional symmetry. Thus if one insists on preserving four-dimensional symmetry in the equations, one cannot adapt the theory of gravitation to a discussion of measurements in the way quantum theory requires without being forced to a more complicated description than is needed by the physical situation. This result has led me to doubt how fundamental the four-dimensional requirement in physics is. A few decades ago it seemed quite certain that one had to express the whole of physics in four-dimensional form. But now it seems that four-dimensional symmetry is not of such overriding importance, since the description of nature sometimes gets simplified when one departs from it.
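
The component count Dirac quotes follows from simple bookkeeping (my gloss, not part of the original article): the gravitational field is described by a symmetric tensor $g_{\mu\nu}$ in four dimensions, and a symmetric $n \times n$ tensor has $n(n+1)/2$ independent components:

    \[
      \frac{n(n+1)}{2}\bigg|_{n=4} = 10, \qquad 10 - 4 = 6,
    \]

which is the split Dirac describes – ten components in all, of which six suffice to describe everything of physical importance.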

Now I should like to proceed to the developments that have been brought about by quantum theory. Quantum theory is the discussion of very small things, and it has formed the main subject of physics for the past 60 years. During this period physicists have been amassing quite a lot of experimental information and developing a theory to correspond to it, and this combination of theory and experiment has led to important developments in the physicist’s picture of the world.

[div class=attrib]More from theSource here.[end-div]

Immaculate creation: birth of the first synthetic cell

[div class=attrib]From the New Scientist:[end-div]

For the first time, scientists have created life from scratch – well, sort of. Craig Venter’s team at the J. Craig Venter Institute in Rockville, Maryland, and San Diego, California, has made a bacterial genome from smaller DNA subunits and then transplanted the whole thing into another cell. So what exactly is the science behind the first synthetic cell, and what is its broader significance?

What did Venter’s team do?

The cell was created by stitching together the genome of a goat pathogen called Mycoplasma mycoides from smaller stretches of DNA synthesised in the lab, and inserting the genome into the empty cytoplasm of a related bacterium. The transplanted genome booted up in its host cell, and then divided over and over to make billions of M. mycoides cells.

Venter and his team have previously accomplished both feats – creating a synthetic genome and transplanting a genome from one bacterium into another – but this time they have combined the two.

“It’s the first self-replicating cell on the planet whose parent is a computer,” says Venter, referring to the fact that his team converted a cell’s genome that existed as data on a computer into a living organism.

How can they be sure that the new bacteria are what they intended?

Venter and his team introduced several distinctive markers into their synthesised genome. All of them were found in the synthetic cell when it was sequenced.

These markers do not make any proteins, but they contain the names of 46 scientists on the project and several quotations written out in a secret code. The markers also contain the key to the code.

Crack the code and you can read the messages, but as a hint, Venter revealed the quotations: “To live, to err, to fall, to triumph, to recreate life out of life,” from James Joyce’s A Portrait of the Artist as a Young Man; “See things not as they are but as they might be,” which comes from American Prometheus, a biography of nuclear physicist Robert Oppenheimer; and Richard Feynman’s famous words: “What I cannot build I cannot understand.”
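
The article does not spell out Venter’s cipher, but as a toy illustration of how text can be written into DNA at all, here is a simple two-bits-per-base scheme in Python. It is entirely hypothetical and bears no relation to the real watermark code:

    # Toy scheme: map each pair of bits to one DNA base (2 bits per base).
    # Purely illustrative -- NOT the cipher used in the Venter watermarks.
    BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
    BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

    def encode(text: str) -> str:
        """Turn a text message into a DNA-like string of A/C/G/T."""
        bits = "".join(f"{byte:08b}" for byte in text.encode("ascii"))
        return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

    def decode(dna: str) -> str:
        """Recover the original text from the DNA-like string."""
        bits = "".join(BASE_TO_BITS[base] for base in dna)
        data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
        return data.decode("ascii")

    message = "What I cannot build I cannot understand."
    dna = encode(message)
    print(dna[:40], "...")          # first 40 bases of the encoded message
    assert decode(dna) == message   # round-trip check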

Does this mean they created life?

It depends on how you define “created” and “life”. Venter’s team made the new genome out of DNA sequences that had initially been made by a machine, but bacteria and yeast cells were used to stitch together and duplicate the million base pairs that it contains. The cell into which the synthetic genome was then transplanted contained its own proteins, lipids and other molecules.

Venter himself maintains that he has not created life. “We’ve created the first synthetic cell,” he says. “We definitely have not created life from scratch because we used a recipient cell to boot up the synthetic chromosome.”

Whether you agree or not is a philosophical question, not a scientific one, as there is no biological difference between synthetic bacteria and the real thing, says Andy Ellington, a synthetic biologist at the University of Texas at Austin. “The bacteria didn’t have a soul, and there wasn’t some animistic property of the bacteria that changed,” he says.

What can you do with a synthetic cell?

Venter’s work was a proof of principle, but future synthetic cells could be used to create drugs, biofuels and other useful products. He is collaborating with Exxon Mobil to produce biofuels from algae and with Novartis to create vaccines.

“As soon as next year, the flu vaccine you get could be made synthetically,” Venter says.

Ellington also sees synthetic bacteria as having potential as a scientific tool. It would be interesting, he says, to create bacteria that produce a new amino acid – the chemical units that make up proteins – and see how these bacteria evolve, compared with bacteria that produce the usual suite of amino acids. “We can ask these questions about cyborg cells in ways we never could before.”

[div class=attrib]More from theSource here.[end-div]

The Search for Genes Leads to Unexpected Places

[div class=attrib]From The New York Times:[end-div]

Edward M. Marcotte is looking for drugs that can kill tumors by stopping blood vessel growth, and he and his colleagues at the University of Texas at Austin recently found some good targets — five human genes that are essential for that growth. Now they’re hunting for drugs that can stop those genes from working. Strangely, though, Dr. Marcotte did not discover the new genes in the human genome, nor in lab mice or even fruit flies. He and his colleagues found the genes in yeast.

“On the face of it, it’s just crazy,” Dr. Marcotte said. After all, these single-cell fungi don’t make blood vessels. They don’t even make blood. In yeast, it turns out, these five genes work together on a completely unrelated task: fixing cell walls.

Crazier still, Dr. Marcotte and his colleagues have discovered hundreds of other genes involved in human disorders by looking at distantly related species. They have found genes associated with deafness in plants, for example, and genes associated with breast cancer in nematode worms. The researchers reported their results recently in The Proceedings of the National Academy of Sciences.

The scientists took advantage of a peculiar feature of our evolutionary history. In our distant, amoeba-like ancestors, clusters of genes were already forming to work together on building cell walls and on other very basic tasks essential to life. Many of those genes still work together in those same clusters, over a billion years later, but on different tasks in different organisms.

[div class=attrib]More from theSource here.[end-div]

Why Athletes Are Geniuses

[div class=attrib]From Discover:[end-div]

The qualities that set a great athlete apart from the rest of us lie not just in the muscles and the lungs but also between the ears. That’s because athletes need to make complicated decisions in a flash. One of the most spectacular examples of the athletic brain operating at top speed came in 2001, when the Yankees were in an American League playoff game with the Oakland Athletics. Shortstop Derek Jeter managed to grab an errant throw coming in from right field and then gently tossed the ball to catcher Jorge Posada, who tagged the base runner at home plate. Jeter’s quick decision saved the game—and the series—for the Yankees. To make the play, Jeter had to master both conscious decisions, such as whether to intercept the throw, and unconscious ones. These are the kinds of unconscious calculations he must make every second of every game: how much weight to put on a foot, how fast to rotate his wrist as he releases a ball, and so on.

In recent years neuroscientists have begun to catalog some fascinating differences between average brains and the brains of great athletes. By understanding what goes on in athletic heads, researchers hope to understand more about the workings of all brains—those of sports legends and couch potatoes alike.

As Jeter’s example shows, an athlete’s actions are much more than a set of automatic responses; they are part of a dynamic strategy to deal with an ever-changing mix of intricate challenges. Even a sport as seemingly straightforward as pistol shooting is surprisingly complex. A marksman just points his weapon and fires, and yet each shot calls for many rapid decisions, such as how much to bend the elbow and how tightly to contract the shoulder muscles. Since the shooter doesn’t have perfect control over his body, a slight wobble in one part of the arm may require many quick adjustments in other parts. Each time he raises his gun, he has to make a new calculation of what movements are required for an accurate shot, combining previous experience with whatever variations he is experiencing at the moment.

To explain how brains make these on-the-fly decisions, Reza Shadmehr of Johns Hopkins University and John Krakauer of Columbia University two years ago reviewed studies in which the brains of healthy people and of brain-damaged patients who have trouble controlling their movements were scanned. They found that several regions of the brain collaborate to make the computations needed for detailed motor actions. The brain begins by setting a goal—pick up the fork, say, or deliver the tennis serve—and calculates the best course of action to reach it. As the brain starts issuing commands, it also begins to make predictions about what sort of sensations should come back from the body if it achieves the goal. If those predictions don’t match the actual sensations, the brain then revises its plan to reduce error. Shadmehr and Krakauer’s work demonstrates that the brain does not merely issue rigid commands; it also continually updates its solution to the problem of how to move the body. Athletes may perform better than the rest of us because their brains can find better solutions than ours do.
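
The goal-prediction-correction loop described above can be caricatured in a few lines of code. The sketch below is only a schematic of the idea (a noisy proportional controller with a forward-model prediction), not a model taken from Shadmehr and Krakauer’s work:

    # A toy "plan, predict, compare, correct" loop: move a hand toward a target,
    # where each command lands with some noise and the controller replans from
    # the sensed position rather than trusting its own prediction.
    import random

    random.seed(0)

    target = 10.0        # goal position (e.g. where the fork is)
    position = 0.0       # current hand position
    gain = 0.5           # how aggressively errors are corrected

    for step in range(12):
        command = gain * (target - position)        # plan toward the goal
        predicted = position + command              # forward-model prediction
        position += command + random.gauss(0, 0.2)  # body executes imperfectly
        error = position - predicted                # sensed vs. predicted outcome
        # The next iteration plans from the *sensed* position, so the noise
        # (the "wobble") is folded back into the plan instead of accumulating.
        print(f"step {step:2d}  position {position:5.2f}  prediction error {error:+.2f}")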

[div class=attrib]More from theSource here.[end-div]