The IBM Songbook


It would be fascinating to see a Broadway or West End show based on lyrics penned in honor of IBM and Thomas Watson, Sr., its first president. Makes you wonder if faithful employees of, say, Facebook or Apple would ever write a songbook — not in jest — for their corporate alma mater. I think not.

From ars technica:

“For thirty-seven years,” reads the opening passage in the book, “the gatherings and conventions of our IBM workers have expressed in happy songs the fine spirit of loyal cooperation and good fellowship which has promoted the signal success of our great IBM Corporation in its truly International Service for the betterment of business and benefit to mankind.”

That’s a hell of a mouthful, but it’s only the opening volley in the war on self-respect and decency that is the 1937 edition of Songs of the IBM, a booklet of corporate ditties first published in 1927 on the order of IBM company founder Thomas Watson, Sr.

The 1937 edition of the songbook is a 54-page monument to glassy-eyed corporate inhumanity, with every page overflowing with trite praise to The Company and Its Men. The booklet reads like a terrible parody of a hymnal—one that praises not the traditional Christian trinity but the new corporate triumvirate of IBM the father, Watson the son, and American entrepreneurship as the holy spirit:

Thomas Watson is our inspiration,
Head and soul of our splendid I.B.M.
We are pledged to him in every nation,
Our President and most beloved man.
His wisdom has guided each division
In service to all humanity
We have grown and broadened with his vision,
None can match him or our great company.
T. J. Watson, we all honor you,
You’re so big and so square and so true,
We will follow and serve with you forever,
All the world must know what I. B. M. can do.

—from “To Thos. J. Watson, President, I.B.M. Our Inspiration”

The wording transcends sense and sanity—these aren’t songs that normal human beings would choose to line up and sing, are they? Have people changed so much in the last 70-80 years that these songs—which seem expressly designed to debase their singers and deify their subjects—would be joyfully sung in harmony without complaint at company meetings? Were workers in the 1920s and 1930s so dehumanized by the rampaging robber barons of high industry that the only way to keep a desirable corporate job at a place like IBM was to toe the line and sing for your paycheck?

Surely no one would stand for this kind of thing in the modern world—to us, company songs seem like relics of a less-enlightened age. If anything, the mindless overflowing trite words sound like the kind of praises one would find directed at a cult of personality dictator in a decaying wreck of a country like North Korea.

Indeed, some of the songs in the book wouldn’t be out of place venerating the Juche ideal instead of IBM:

We don’t pretend we’re gay.
We always feel that way,
Because we’re filling the world with sunshine.
With I.B.M. machines,
We’ve got the finest means,
For brightly painting the clouds with sunshine.

—from “Painting the Clouds with Sunshine”


Tie an onion to your belt

All right, time to come clean: it’s incredibly easy to cherry-pick terrible examples out of a 77-year-old corporate songbook (though this songbook makes it easy because of how crazy it is to modern eyes). Moreover, to answer one of the rhetorical questions above, no—people have not changed so much over the past 80-ish years that they could sing mawkishly pro-IBM songs with an irony-free straight face. At least, not without some additional context.

There’s a decade-old writeup on NetworkWorld about the IBM corporate song phenomenon that provides a lot of the glue necessary to build a complete mental picture of what was going on in both employees’ and leadership’s heads. The key takeaway to deflate a lot of the looniness is that the majority of the songs came out of the Great Depression era, and employees lucky enough to be steadfastly employed by a company like IBM often were really that grateful.

The formal integration of singing as an aspect of IBM’s culture at the time was heavily encouraged by Thomas J. Watson Sr. Watson and his employees co-opted the era’s showtunes and popular melodies for their proto-filking, ensuring that everyone would know the way the song went, if not the exact wording. Employees belting out “To the International Ticketograph Division” to the tune of “My Bonnie Lies Over the Ocean” (“In I.B.M. There’s a division. / That’s known as the Ticketograph; / It’s peopled by men who have vision, / Progressive and hard-working staff”) really isn’t all that different from any other team-building exercise that modern companies do—in fact, in a lot of ways, it’s far less humiliating than a company picnic with Mandatory Interdepartmental Three-Legged Races.

Many of the songs mirror the kinds of things that university students of the same time period might sing in honor of their alma mater. When viewed from the perspective of the Depression and post-Depression era, the singing is still silly—but it also makes a lot more sense. Watson reportedly wanted to inspire loyalty and cohesion among employees—and, remember, this was also an era where “normal” employee behavior was to work at a single company for most of one’s professional life, and then retire with a pension. It’s certainly a lot easier to sing a company’s praises if there’s paid retirement at the end of the last verse.

Read the entire article and see more songs here.

Image: Pages 99-100 of the IBM Songbook, 1937. Courtesy of IBM / ars technica.

Syndrome X


The quest for immortality, or even great longevity, has probably driven humans since they first became self-aware. Entire cultural movements and industries are founded on the desire to enhance and extend our lives. Genetic research, of course, may eventually unlock some or all of life and death’s mysteries. In the meantime, groups of dedicated scientists continue to look for the foundation of aging with a view to understanding the process and eventually slowing (and perhaps stopping) it. Richard Walker is one of these singularly focused researchers.

From the BBC:

Richard Walker has been trying to conquer ageing since he was a 26-year-old free-loving hippie. It was the 1960s, an era marked by youth: Vietnam War protests, psychedelic drugs, sexual revolutions. The young Walker relished the culture of exultation, of joie de vivre, and yet was also acutely aware of its passing. He was haunted by the knowledge that ageing would eventually steal away his vitality – that with each passing day his body was slightly less robust, slightly more decayed. One evening he went for a drive in his convertible and vowed that by his 40th birthday, he would find a cure for ageing.

Walker became a scientist to understand why he was mortal. “Certainly it wasn’t due to original sin and punishment by God, as I was taught by nuns in catechism,” he says. “No, it was the result of a biological process, and therefore is controlled by a mechanism that we can understand.”

Scientists have published several hundred theories of ageing, and have tied it to a wide variety of biological processes. But no one yet understands how to integrate all of this disparate information.

Walker, now 74, believes that the key to ending ageing may lie in a rare disease that doesn’t even have a real name, “Syndrome X”. He has identified four girls with this condition, marked by what seems to be a permanent state of infancy, a dramatic developmental arrest. He suspects that the disease is caused by a glitch somewhere in the girls’ DNA. His quest for immortality depends on finding it.

It’s the end of another busy week and MaryMargret Williams is shuttling her brood home from school. She drives an enormous SUV, but her six children and their coats and bags and snacks manage to fill every inch. The three big kids are bouncing in the very back. Sophia, 10, with a mouth of new braces, is complaining about a boy-crazy friend. She sits next to Anthony, seven, and Aleena, five, who are glued to something on their mother’s iPhone. The three little kids squirm in three car seats across the middle row. Myah, two, is mining a cherry slushy, and Luke, one, is pawing a bag of fresh crickets bought for the family gecko.

Finally there’s Gabrielle, who’s the smallest child, and the second oldest, at nine years old. She has long, skinny legs and a long, skinny ponytail, both of which spill out over the edges of her car seat. While her siblings giggle and squeal, Gabby’s dusty-blue eyes roll up towards the ceiling. By the calendar, she’s almost an adolescent. But she has the buttery skin, tightly clenched fingers and hazy awareness of a newborn.

Back in 2004, when MaryMargret and her husband, John, went to the hospital to deliver Gabby, they had no idea anything was wrong. They knew from an ultrasound that she would have clubbed feet, but so had their other daughter, Sophia, who was otherwise healthy. And because MaryMargret was a week early, they knew Gabby would be small, but not abnormally so. “So it was such a shock to us when she was born,” MaryMargret says.

Gabby came out purple and limp. Doctors stabilised her in the neonatal intensive care unit and then began a battery of tests. Within days the Williamses knew their new baby had lost the genetic lottery. Her brain’s frontal lobe was smooth, lacking the folds and grooves that allow neurons to pack in tightly. Her optic nerve, which runs between the eyes and the brain, was atrophied, which would probably leave her blind. She had two heart defects. Her tiny fists couldn’t be pried open. She had a cleft palate and an abnormal swallowing reflex, which meant she had to be fed through a tube in her nose. “They started trying to prepare us that she probably wouldn’t come home with us,” John says. Their family priest came by to baptise her.

Day after day, MaryMargret and John shuttled between Gabby in the hospital and 13-month-old Sophia at home. The doctors tested for a few known genetic syndromes, but they all came back negative. Nobody had a clue what was in store for her. Her strong Catholic family put their faith in God. “MaryMargret just kept saying, ‘She’s coming home, she’s coming home’,” recalls her sister, Jennie Hansen. And after 40 days, she did.

Gabby cried a lot, loved to be held, and ate every three hours, just like any other newborn. But of course she wasn’t. Her arms would stiffen and fly up to her ears, in a pose that the family nicknamed her “Harley-Davidson”. At four months old she started having seizures. Most puzzling and problematic, she still wasn’t growing. John and MaryMargret took her to specialist after specialist: a cardiologist, a gastroenterologist, a geneticist, a neurologist, an ophthalmologist and an orthopaedist. “You almost get your hopes up a little – ’This is exciting! We’re going to the gastro doctor, and maybe he’ll have some answers’,” MaryMargret says. But the experts always said the same thing: nothing could be done.

The first few years with Gabby were stressful. When she was one and Sophia two, the Williamses drove from their home in Billings, Montana, to MaryMargret’s brother’s home outside of St Paul, Minnesota. For nearly all of those 850 miles, Gabby cried and screamed. This continued for months until doctors realised she had a run-of-the-mill bladder infection. Around the same period, she acquired a severe respiratory infection that left her struggling to breathe. John and MaryMargret tried to prepare Sophia for the worst, and even planned which readings and songs to use at Gabby’s funeral. But the tiny toddler toughed it out.

While Gabby’s hair and nails grew, her body wasn’t getting bigger. She was developing in subtle ways, but at her own pace. MaryMargret vividly remembers a day at work when she was pushing Gabby’s stroller down a hallway with skylights in the ceiling. She looked down at Gabby and was shocked to see her eyes reacting to the sunlight. “I thought, ‘Well, you’re seeing that light!’” MaryMargret says. Gabby wasn’t blind, after all.

Despite the hardships, the couple decided they wanted more children. In 2007 MaryMargret had Anthony, and the following year she had Aleena. By this time, the Williamses had stopped trudging to specialists, accepting that Gabby was never going to be fixed. “At some point we just decided,” John recalls, “it’s time to make our peace.”

Mortal questions

When Walker began his scientific career, he focused on the female reproductive system as a model of “pure ageing”: a woman’s ovaries, even in the absence of any disease, slowly but inevitably slide into the throes of menopause. His studies investigated how food, light, hormones and brain chemicals influence fertility in rats. But academic science is slow. He hadn’t cured ageing by his 40th birthday, nor by his 50th or 60th. His life’s work was tangential, at best, to answering the question of why we’re mortal, and he wasn’t happy about it. He was running out of time.

So he went back to the drawing board. As he describes in his book, Why We Age, Walker began a series of thought experiments to reflect on what was known and not known about ageing.

Ageing is usually defined as the slow accumulation of damage in our cells, organs and tissues, ultimately causing the physical transformations that we all recognise in elderly people. Jaws shrink and gums recede. Skin slacks. Bones brittle, cartilage thins and joints swell. Arteries stiffen and clog. Hair greys. Vision dims. Memory fades. The notion that ageing is a natural, inevitable part of life is so fixed in our culture that we rarely question it. But biologists have been questioning it for a long time.

It’s a harsh world out there, and even young cells are vulnerable. It’s like buying a new car: the engine runs perfectly but is still at risk of getting smashed on the highway. Our young cells survive only because they have a slew of trusty mechanics on call. Take DNA, which provides the all-important instructions for making proteins. Every time a cell divides, it makes a near-perfect copy of its three-billion-letter code. Copying mistakes happen frequently along the way, but we have specialised repair enzymes to fix them, like an automatic spellcheck. Proteins, too, are ever vulnerable. If it gets too hot, they twist into deviant shapes that keep them from working. But here again, we have a fixer: so-called ‘heat shock proteins’ that rush to the aid of their misfolded brethren. Our bodies are also regularly exposed to environmental poisons, such as the reactive and unstable ‘free radical’ molecules that come from the oxidisation of the air we breathe. Happily, our tissues are stocked with antioxidants and vitamins that neutralise this chemical damage. Time and time again, our cellular mechanics come to the rescue.

Which leads to the biologists’ longstanding conundrum: if our bodies are so well tuned, why, then, does everything eventually go to hell?

One theory is that it all boils down to the pressures of evolution. Humans reproduce early in life, well before ageing rears its ugly head. All of the repair mechanisms that are important in youth – the DNA editors, the heat shock proteins, the antioxidants – help the young survive until reproduction, and are therefore passed down to future generations. But problems that show up after we’re done reproducing cannot be weeded out by evolution. Hence, ageing.

Most scientists say that ageing is not caused by any one culprit but by the breakdown of many systems at once. Our sturdy DNA mechanics become less effective with age, meaning that our genetic code sees a gradual increase in mutations. Telomeres, the sequences of DNA that act as protective caps on the ends of our chromosomes, get shorter every year. Epigenetic messages, which help turn genes on and off, get corrupted with time. Heat shock proteins run down, leading to tangled protein clumps that muck up the smooth workings of a cell. Faced with all of this damage, our cells try to adjust by changing the way they metabolise nutrients and store energy. To ward off cancer, they even know how to shut themselves down. But eventually cells stop dividing and stop communicating with each other, triggering the decline we see from the outside.

Scientists trying to slow the ageing process tend to focus on one of these interconnected pathways at a time. Some researchers have shown, for example, that mice on restricted-calorie diets live longer than normal. Other labs have reported that giving mice rapamycin, a drug that targets an important cell-growth pathway, boosts their lifespan. Still other groups are investigating substances that restore telomeres, DNA repair enzymes and heat shock proteins.

During his thought experiments, Walker wondered whether all of these scientists were fixating on the wrong thing. What if all of these various types of cellular damages were the consequences of ageing, but not the root cause of it? He came up with an alternative theory: that ageing is the unavoidable fallout of our development.

The idea sat on the back burner of Walker’s mind until the evening of 23 October 2005. He was working in his home office when his wife called out to him to join her in the family room. She knew he would want to see what was on TV: an episode of Dateline about a young girl who seemed to be “frozen in time”. Walker watched the show and couldn’t believe what he was seeing. Brooke Greenberg was 12 years old, but just 13 pounds (6kg) and 27 inches (69cm) long. Her doctors had never seen anything like her condition, and suspected the cause was a random genetic mutation. “She literally is the Fountain of Youth,” her father, Howard Greenberg, said.

Walker was immediately intrigued. He had heard of other genetic diseases, such as progeria and Werner syndrome, which cause premature ageing in children and adults respectively. But this girl seemed to be different. She had a genetic disease that stopped her development and with it, Walker suspected, the ageing process. Brooke Greenberg, in other words, could help him test his theory.

Uneven growth

Brooke was born a few weeks premature, with many birth defects. Her paediatrician labeled her with Syndrome X, not knowing what else to call it.

After watching the show, Walker tracked down Howard Greenberg’s address. Two weeks went by before Walker heard back, and after much discussion he was allowed to test Brooke. He was sent Brooke’s medical records as well as blood samples for genetic testing. In 2009, his team published a brief report describing her case.

Walker’s analysis found that Brooke’s organs and tissues were developing at different rates. Her mental age, according to standardised tests, was between one and eight months. Her teeth appeared to be eight years old; her bones, 10 years. She had lost all of her baby fat, and her hair and nails grew normally, but she had not reached puberty. Her telomeres were considerably shorter than those of healthy teenagers, suggesting that her cells were ageing at an accelerated rate.

All of this was evidence of what Walker dubbed “developmental disorganisation”. Brooke’s body seemed to be developing not as a coordinated unit, he wrote, but rather as a collection of individual, out-of-sync parts. “She is not simply ‘frozen in time’,” Walker wrote. “Her development is continuing, albeit in a disorganised fashion.”

The big question remained: why was Brooke developmentally disorganised? It wasn’t nutritional and it wasn’t hormonal. The answer had to be in her genes. Walker suspected that she carried a glitch in a gene (or a set of genes, or some kind of complex genetic programme) that directed healthy development. There must be some mechanism, after all, that allows us to develop from a single cell to a system of trillions of cells. This genetic programme, Walker reasoned, would have two main functions: it would initiate and drive dramatic changes throughout the organism, and it would also coordinate these changes into a cohesive unit.

Ageing, he thought, comes about because this developmental programme, this constant change, never turns off. From birth until puberty, change is crucial: we need it to grow and mature. After we’ve matured, however, our adult bodies don’t need change, but rather maintenance. “If you’ve built the perfect house, you would want to stop adding bricks at a certain point,” Walker says. “When you’ve built a perfect body, you’d want to stop screwing around with it. But that’s not how evolution works.” Because natural selection cannot influence traits that show up after we have passed on our genes, we never evolved a “stop switch” for development, Walker says. So we keep adding bricks to the house. At first this doesn’t cause much damage – a sagging roof here, a broken window there. But eventually the foundation can’t sustain the additions, and the house topples. This, Walker says, is ageing.

Brooke was special because she seemed to have been born with a stop switch. But finding the genetic culprit turned out to be difficult. Walker would need to sequence Brooke’s entire genome, letter by letter.

That never happened. Much to Walker’s chagrin, Howard Greenberg abruptly severed their relationship. The Greenbergs have not publicly explained why they ended their collaboration with Walker, and declined to comment for this article.

Second chance

In August 2009, MaryMargret Williams saw a photo of Brooke on the cover of People magazine, just below the headline “Heartbreaking mystery: The 16-year-old baby”. She thought Brooke sounded a lot like Gabby, so she contacted Walker.

After reviewing Gabby’s details, Walker filled her in on his theory. Testing Gabby’s genes, he said, could help him in his mission to end age-related disease – and maybe even ageing itself.

This didn’t sit well with the Williamses. John, who works for the Montana Department of Corrections, often interacts with people facing the reality of our finite time on Earth. “If you’re spending the rest of your life in prison, you know, it makes you think about the mortality of life,” he says. What’s important is not how long you live, but rather what you do with the life you’re given. MaryMargret feels the same way. For years she has worked in a local dermatology office. She knows all too well the cultural pressures to stay young, and wishes more people would embrace the inevitability of getting older. “You get wrinkles, you get old, that’s part of the process,” she says.

But Walker’s research also had its upside. First and foremost, it could reveal whether the other Williams children were at risk of passing on Gabby’s condition.

For several months, John and MaryMargret hashed out the pros and cons. They were under no illusion that the fruits of Walker’s research would change Gabby’s condition, nor would they want it to. But they did want to know why. “What happened, genetically, to make her who she is?” John says. And more importantly: “Is there a bigger meaning for it?”

John and MaryMargret firmly believe that God gave them Gabby for a reason. Walker’s research offered them a comforting one: to help treat Alzheimer’s and other age-related diseases. “Is there a small piece that Gabby could present to help people solve these awful diseases?” John asks. “Thinking about it, it’s like, no, that’s for other people, that’s not for us.” But then he thinks back to the day Gabby was born. “I was in that delivery room, thinking the same thing – this happens to other people, not us.”

Still not entirely certain, the Williamses went ahead with the research.

Amassing evidence

Walker published his theory in 2011, but he’s only the latest of many researchers to think along the same lines. “Theories relating developmental processes to ageing have been around for a very long time, but have been somewhat under the radar for most researchers,” says Joao Pedro de Magalhaes, a biologist at the University of Liverpool. In 1932, for example, English zoologist George Parker Bidder suggested that mammals have some kind of biological “regulator” that stops growth after the animal reaches a specific size. Ageing, Bidder thought, was the continued action of this regulator after growth was done.

Subsequent studies showed that Bidder wasn’t quite right; there are lots of marine organisms, for example, that never stop growing but age anyway. Still, his fundamental idea of a developmental programme leading to ageing has persisted.

For several years, Stuart Kim’s group at Stanford University has been comparing which genes are expressed in young and old nematode worms. It turns out that some genes involved in ageing also help drive development in youth.

Kim suggested that the root cause of ageing is the “drift”, or mistiming, of developmental pathways during the ageing process, rather than an accumulation of cellular damage.

Other groups have since found similar patterns in mice and primates. One study, for example, found that many genes turned on in the brains of old monkeys and humans are the same as those expressed in young brains, suggesting that ageing and development are controlled by some of the same gene networks.

Perhaps most provocative of all, some studies of worms have shown that shutting down essential development genes in adults significantly prolongs life. “We’ve found quite a lot of genes in which this happened – several dozen,” de Magalhaes says.

Nobody knows whether the same sort of developmental-programme genes exist in people. But say that they do exist. If someone was born with a mutation that completely destroyed this programme, Walker reasoned, that person would undoubtedly die. But if a mutation only partially destroyed it, it might lead to a condition like what he saw in Brooke Greenberg or Gabby Williams. So if Walker could identify the genetic cause of Syndrome X, then he might also have a driver of the ageing process in the rest of us.

And if he found that, then could it lead to treatments that slow – or even end – ageing? “There’s no doubt about it,” he says.

Public stage

After agreeing to participate in Walker’s research, the Williamses, just like the Greenbergs before them, became famous. In January 2011, when Gabby was six, the television channel TLC featured her on a one-hour documentary. The Williams family also appeared on Japanese television and in dozens of newspaper and magazine articles.

Other than becoming a local celebrity, though, Gabby’s everyday life hasn’t changed much since getting involved in Walker’s research. She spends her days surrounded by her large family. She’ll usually lie on the floor, or in one of several cushions designed to keep her spine from twisting into a C shape. She makes noises that would make an outsider worry: grunting, gasping for air, grinding her teeth. Her siblings think nothing of it. They play boisterously in the same room, somehow always careful not to crash into her. Once a week, a teacher comes to the house to work with Gabby. She uses sounds and shapes on an iPad to try to teach cause and effect. When Gabby turned nine, last October, the family made her a birthday cake and had a party, just as they always do. Most of her gifts were blankets, stuffed animals and clothes, just as they are every year. Her aunt Jennie gave her make-up.

Walker teamed up with geneticists at Duke University and screened the genomes of Gabby, John and MaryMargret. This test looked at the exome, the 2% of the genome that codes for proteins. From this comparison, the researchers could tell that Gabby did not inherit any exome mutations from her parents – meaning that it wasn’t likely that her siblings would be able to pass on the condition to their kids. “It was a huge relief – huge,” MaryMargret says.
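The comparison described above is, at its core, a set-difference question: which variants appear in the child’s exome but in neither parent’s? The toy Python sketch below is only meant to illustrate that logic; the variant coordinates are entirely made up, and the actual Duke analysis works on full exome variant calls rather than a handful of hand-typed tuples.

    # Toy illustration of a trio ("child vs parents") comparison, not the Duke pipeline.
    # A variant present in the child but absent from both parents is a candidate
    # de novo change; a variant shared with a parent was inherited.
    # All coordinates below are hypothetical placeholders.
    child  = {("chr7", 117559593, "G", "A"), ("chrX", 153764217, "C", "T")}
    mother = {("chr7", 117559593, "G", "A")}
    father = set()

    de_novo   = child - (mother | father)   # in the child only
    inherited = child & (mother | father)   # seen in at least one parent
    print("candidate de novo:", de_novo)
    print("inherited:", inherited)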

Still, the exome screening didn’t give any clues as to what was behind Gabby’s disease. Gabby carries several mutations in her exome, but none in a gene that would make sense of her condition. All of us have mutations littering our genomes. So it’s impossible to know, in any single individual, whether a particular mutation is harmful or benign – unless you can compare two people with the same condition.

All girls

Luckily for him, Walker’s continued presence in the media has led him to two other young girls who he believes have the same syndrome. One of them, Mackenzee Wittke, of Alberta, Canada, is now five years old, with long and skinny limbs, just like Gabby. “We have basically been stuck in a time warp,” says her mother, Kim Wittke. The fact that all of these possible Syndrome X cases are girls is intriguing – it could mean that the crucial mutation is on their X chromosome. Or it could just be a coincidence.

Walker is working with a commercial outfit in California to compare all three girls’ entire genome sequences – the exome plus the other 98% of DNA code, which is thought to be responsible for regulating the expression of protein-coding genes.

For his theory, Walker says, “this is do or die – we’re going to do every single bit of DNA in these girls. If we find a mutation that’s common to them all, that would be very exciting.”

But that seems like a very big if.

Most researchers agree that finding out the genes behind Syndrome X is a worthwhile scientific endeavour, as these genes will no doubt be relevant to our understanding of development. They’re far less convinced, though, that the girls’ condition has anything to do with ageing. “It’s a tenuous interpretation to think that this is going to be relevant to ageing,” says David Gems, a geneticist at University College London. It’s not likely that these girls will even make it to adulthood, he says, let alone old age.

It’s also not at all clear that these girls have the same condition. Even if they do, and even if Walker and his collaborators discover the genetic cause, there would still be a steep hill to climb. The researchers would need to silence the same gene or genes in laboratory mice, which typically have a lifespan of two or three years. “If that animal lives to be 10, then we’ll know we’re on the right track,” Walker says. Then they’d have to find a way to achieve the same genetic silencing in people, whether with a drug or some kind of gene therapy. And then they’d have to begin long and expensive clinical trials to make sure that the treatment was safe and effective. Science is often too slow, and life too fast.

End of life

On 24 October 2013, Brooke passed away. She was 20 years old. MaryMargret heard about it when a friend called after reading it in a magazine. The news hit her hard. “Even though we’ve never met the family, they’ve just been such a part of our world,” she says.

MaryMargret doesn’t see Brooke as a template for Gabby – it’s not as if she now believes that she only has 11 years left with her daughter. But she can empathise with the pain the Greenbergs must be feeling. “It just makes me feel so sad for them, knowing that there’s a lot that goes into a child like that,” she says. “You’re prepared for them to die, but when it finally happens, you can just imagine the hurt.”

Today Gabby is doing well. MaryMargret and John are no longer planning her funeral. Instead, they’re beginning to think about what would happen if Gabby outlives them. (Sophia has offered to take care of her sister.) John turned 50 this year, and MaryMargret will be 41. If there were a pill to end ageing, they say they’d have no interest in it. Quite the contrary: they look forward to getting older, because it means experiencing the new joys, new pains and new ways to grow that come along with that stage of life.

Richard Walker, of course, has a fundamentally different view of growing old. When asked why he’s so tormented by it, he says it stems from childhood, when he watched his grandparents physically and psychologically deteriorate. “There was nothing charming to me about sedentary old people, rocking chairs, hot houses with Victorian trappings,” he says. At his grandparents’ funerals, he couldn’t help but notice that they didn’t look much different in death than they did at the end of life. And that was heartbreaking. “To say I love life is an understatement,” he says. “Life is the most beautiful and magic of all things.”

If his hypothesis is correct – who knows? – it might one day help prevent disease and modestly extend life for millions of people. Walker is all too aware, though, that it would come too late for him. As he writes in his book: “I feel a bit like Moses who, after wandering in the desert for most years of his life, was allowed to gaze upon the Promised Land but not granted entrance into it.”

 Read the entire story here.

Story courtesy of BBC and Mosaic under Creative Commons License.

Image: DNA structure. Courtesy of Wikipedia.

The Idea Shower and The Strategic Staircase

Every now and then we visit the world of corporatespeak to see how business jargon is faring: which words are in, which phrases are out. Unfortunately, many of the most used and over-used still find their way into common office parlance. With apologies to our state-side readers, some of the most popular British phrases follow, and, no surprise, many of these cringeworthy euphemisms seem to emanate from the U.S. Ugh!

From the Guardian:

I don’t know about you, but I’m a sucker for a bit of joined up, blue sky thinking. I love nothing more than the opportunity to touch base with my boss first thing on a Monday morning. It gives me that 24 carat feeling.

I apologise for the sarcasm, but management speak makes most people want to staple the boss’s tongue to the desk. A straw poll around my office found jargon is seen by staff as a tool for making something seem more impressive than it actually is.

The Plain English Campaign says that many staff working for big corporate organisations find themselves using management speak as a way of disguising the fact that they haven’t done their job properly. Some people think that it is easy to bluff their way through by using long, impressive-sounding words and phrases, even if they don’t know what they mean, which is telling in itself.

Furthermore, a recent survey by the Institute of Leadership & Management revealed that management speak is used in almost two thirds (64%) of offices, with nearly a quarter (23%) considering it to be a pointless irritation. “Thinking outside the box” (57%), “going forward” (55%) and “let’s touch base” (39%) were identified as the top three most overused pieces of jargon.

Walk through any office and you’ll hear this kind of thing going on every day. Here are some of the most irritating euphemisms doing the rounds:

Helicopter view – need a phrase that means broad overview of the business? Then why not say “a broad view of the business”?

Idea shower – brainstorm might be out of fashion, but surely we can thought cascade something better than this drivel.

Touch base offline – meaning let’s meet and talk. Because, contrary to popular belief, it is possible to communicate without a Wi-Fi signal. No, really, it is. Fancy a coffee?

Low hanging fruit – easy win business. This would be perfect for hungry children in orchards, but what is really happening is an admission that you don’t want to take the complicated route.

Look under the bonnet – analyse a situation. Most people wouldn’t have a clue about a car engine. When I look under a car bonnet I scratch my head, try not to look like I haven’t got a clue, jiggle a few pipes and kick the tyres before handing the job over to a qualified professional.

Get all your ducks in a row – be organised. Bert and Ernie from Sesame Street had an obsession with rubber ducks. You may think I’m disorganised, but there’s no need to talk to me like a five-year-old.

Don’t let the grass grow too long on this one – work fast. I’m looking for a polite way of suggesting that you get off your backside and get on with it.

Not enough bandwidth – too busy. Really? Try upgrading to fibre optics. I reckon I know a few people who haven’t been blessed with enough “bandwidth” and it’s got nothing to do with being busy.

Cascading relevant information – speaking to your colleagues. If anything, this is worse than touching base offline. From the flourish of cascading through to relevant, and onto information – this is complete nonsense.

The strategic staircase – business plan. Thanks, but I’ll take the lift.

Run it up the flagpole – try it out. Could you attach yourself while you’re at it?

Read the entire story here.

Sugar Is Bad For You, Really? Really!

 

In case you may not have heard, sugar is bad for you. In fact, an increasing number of food scientists will tell you that sugar is a poison, and that it’s time to fight the sugar oligarchs in much the same way that health advocates resolved to take on big tobacco many decades ago.

From the Guardian:

If you have any interest at all in diet, obesity, public health, diabetes, epidemiology, your own health or that of other people, you will probably be aware that sugar, not fat, is now considered the devil’s food. Dr Robert Lustig’s book, Fat Chance: The Hidden Truth About Sugar, Obesity and Disease, for all that it sounds like a Dan Brown novel, is the difference between vaguely knowing something is probably true, and being told it as a fact. Lustig has spent the past 16 years treating childhood obesity. His meta-analysis of the cutting-edge research on large-cohort studies of what sugar does to populations across the world, alongside his own clinical observations, has him credited with starting the war on sugar. When it reaches the enemy status of tobacco, it will be because of Lustig.

“Politicians have to come in and reset the playing field, as they have with any substance that is toxic and abused, ubiquitous and with negative consequence for society,” he says. “Alcohol, cigarettes, cocaine. We don’t have to ban any of them. We don’t have to ban sugar. But the food industry cannot be given carte blanche. They’re allowed to make money, but they’re not allowed to make money by making people sick.”

Lustig argues that sugar creates an appetite for itself by a determinable hormonal mechanism – a cycle, he says, that you could no more break with willpower than you could stop feeling thirsty through sheer strength of character. He argues that the hormone related to stress, cortisol, is partly to blame. “When cortisol floods the bloodstream, it raises blood pressure; increases the blood glucose level, which can precipitate diabetes. Human research shows that cortisol specifically increases caloric intake of ‘comfort foods’.” High cortisol levels during sleep, for instance, interfere with restfulness, and increase the hunger hormone ghrelin the next day. This differs from person to person, but I was jolted by recognition of the outrageous deliciousness of doughnuts when I haven’t slept well.

“The problem in obesity is not excess weight,” Lustig says, in the central London hotel that he has made his anti-metabolic illness HQ. “The problem with obesity is that the brain is not seeing the excess weight.” The brain can’t see it because appetite is determined by a binary system. You’re either in anorexigenesis – “I’m not hungry and I can burn energy” – or you’re in orexigenesis – “I’m hungry and I want to store energy.” The flip switch is your leptin level (the hormone that regulates your body fat) but too much insulin in your system blocks the leptin signal.

It helps here if you have ever been pregnant or remember much of puberty and that savage hunger; the way it can trick you out of your best intentions, the lure of ridiculous foods: six-month-old Christmas cake, sweets from a bin. If you’re leptin resistant – that is, if your insulin is too high as a result of your sugar intake – you’ll feel like that all the time.

Telling people to simply lose weight, he tells me, “is physiologically impossible and it’s clinically dangerous. It’s a goal that’s not achievable.” He explains further in the book: “Biochemistry drives behaviour. You see a patient who drinks 10 gallons of water a day and urinates 10 gallons of water a day. What is wrong with him? Could he have a behavioural disorder and be a psychogenic water drinker? Could be. Much more likely he has diabetes.” To extend that, you could tell people with diabetes not to drink water, and 3% of them might succeed – the outliers. But that wouldn’t help the other 97% just as losing the weight doesn’t, long-term, solve the metabolic syndrome – the addiction to sugar – of which obesity is symptomatic.

Many studies have suggested that diets tend to work for two months, some for as long as six. “That’s what the data show. And then everybody’s weight comes roaring back.” During his own time working night shifts, Lustig gained 3st, which he never lost and now uses exuberantly to make two points. The first is that weight is extremely hard to lose, and the second – more important, I think – is that he’s no diet and fitness guru himself. He doesn’t want everybody to be perfect: he’s just a guy who doesn’t want to surrender civilisation to diseases caused by industry. “I’m not a fitness guru,” he says, puckishly. “I’m 45lb overweight!”

“Sugar causes diseases: unrelated to their calories and unrelated to the attendant weight gain. It’s an independent primary-risk factor. Now, there will be food-industry people who deny it until the day they die, because their livelihood depends on it.” And here we have the reason why he sees this is a crusade and not a diet book, the reason that Lustig is in London and not Washington. This is an industry problem; the obesity epidemic began in 1980. Back then, nobody knew about leptin. And nobody knew about insulin resistance until 1984.

“What they knew was, when they took the fat out they had to put the sugar in, and when they did that, people bought more. And when they added more, people bought more, and so they kept on doing it. And that’s how we got up to current levels of consumption.” Approximately 80% of the 600,000 packaged foods you can buy in the US have added calorific sweeteners (this includes bread, burgers, things you wouldn’t add sugar to if you were making them from scratch). Daily fructose consumption has doubled in the past 30 years in the US, a pattern also observable (though not identical) here, in Canada, Malaysia, India, right across the developed and developing world. World sugar consumption has tripled in the past 50 years, while the population has only doubled; it makes sense of the obesity pandemic.

“It would have happened decades earlier; the reason it didn’t was that sugar wasn’t cheap. The thing that made it cheap was high-fructose corn syrup. They didn’t necessarily know the physiology of it, but they knew the economics of it.” Adding sugar to everyday food has become as much about the industry prolonging the shelf life as it has about palatability; if you’re shopping from corner shops, you’re likely to be eating unnecessary sugar in pretty well everything. It is difficult to remain healthy in these conditions. “You here in Britain are light years ahead of us in terms of understanding the problem. We don’t get it in the US, we have this libertarian streak. You don’t have that. You’re going to solve it first. So it’s in my best interests to help you, because that will help me solve it back there.”

The problem has mushroomed all over the world in 30 years and is driven by the profits of the food and diet industries combined. We’re not looking at a global pandemic of individual greed and fecklessness: it would be impossible for the citizens of the world to coordinate their human weaknesses with that level of accuracy. Once you stop seeing it as a problem of personal responsibility it’s easier to accept how profound and serious the war on sugar is. Life doesn’t have to become wholemeal and joyless, but traffic-light systems and five-a-day messaging are under-ambitious.

“The problem isn’t a knowledge deficit,” an obesity counsellor once told me. “There isn’t a fat person on Earth who doesn’t know vegetables are good for you.” Lustig agrees. “I, personally, don’t have a lot of hope that those things will turn things around. Education has not solved any substance of abuse. This is a substance of abuse. So you need two things, you need personal intervention and you need societal intervention. Rehab and laws, rehab and laws. Education would come in with rehab. But we need laws.”

Read the entire article here.

Image: Molecular diagrams of sucrose (left) and fructose (right). Courtesy of Wikipedia.

 

National Extinction Coming Soon

Based on declining fertility rates in some Asian nations, a new study predicts complete national extinctions in the not-too-distant future.

From the Telegraph:

South Koreans will be ‘extinct’ by 2750 if nothing is done to halt the nation’s falling fertility rate, according to a study by The National Assembly Research Service in Seoul.

The fertility rate declined to a new low of 1.19 children per woman in 2013, the study showed, well below the fertility rate required to sustain South Korea’s current population of 50 million people, the Chosun Ilbo reported.

In a simulation, the NARS study suggests that the population will shrink to 40 million in 2056 and 10 million in 2136. The last South Korean, the report indicates, will die in 2750, making it the first national group in the world to become extinct.

The simulation is a worst-case scenario and does not consider possible changes in immigration policy, for example.
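As a rough editorial illustration of how quickly a sub-replacement fertility rate compounds, here is a deliberately crude back-of-the-envelope sketch in Python. It is not the NARS model, which is an age-structured cohort simulation; it simply shrinks each roughly 30-year generation by the ratio of the reported fertility rate (1.19 children per woman) to a replacement rate of about 2.1, starting from the quoted 50 million people. The generation length, start year and replacement rate are assumptions, and this toy version only bottoms out somewhere in the 2900s, in the same multi-century ballpark as the study’s 2750 figure rather than matching it.

    # Crude generational-decay sketch, NOT the NARS age-structured simulation.
    # Each ~30-year generation is roughly fertility/2.1 the size of the previous one,
    # since about 2.1 children per woman are needed just to replace both parents.
    fertility = 1.19          # children per woman (South Korea, 2013, as quoted above)
    replacement = 2.1         # assumed replacement-level fertility
    population = 50_000_000   # current population, as quoted above
    year = 2014               # assumed starting year

    while population >= 1:
        population *= fertility / replacement
        year += 30
        print(year, int(population))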

The study, carried out at the request of Yang Seung-jo, a member of the opposition New Politics Alliance for Democracy, underlines the challenges facing a number of nations in the Asia-Pacific region.

Japan, Taiwan, Singapore and increasingly China are all experiencing growing financial pressures caused by rising healthcare costs and pension payments for an elderly population.

The problem is particularly serious in South Korea, where more than 38 per cent of the population is predicted to be of retirement age by 2050, according to the National Statistics Office. The equivalent figure in Japan is an estimated 39.6 per cent by 2050.

According to a 2012 study conducted by Tohoku University, Japan will go extinct in about one thousand years, with the last Japanese child born in 3011.

David Coleman, a population expert at Oxford University, has previously warned that South Korea’s fertility rate is so low that it threatens the existence of the nation.

The NARS study suggests that the southern Korean port city of Busan is most at risk, largely because of a sharp decline in the number of young and middle-aged residents, and that the last person will be born in the city in 2413.

Read the entire article here.

Those 25,000 Unread Emails

It may not be you. You may not be the person who has tens of thousands of unread emails scattered across various email accounts. However, you know someone just like this — buried in a virtual avalanche of unopened text, unable to extricate herself (or himself) and with no pragmatic plan to tackle the digital morass.

Washington Post writer Brigid Schulte has some ideas to help your friend (or you, of course — your secret is safe with us).

From the Washington Post:

I was drowning in e-mail. Overwhelmed. Overloaded. Spending hours a day, it seemed, roiling in an unending onslaught of info turds and falling further and further behind. The day I returned from a two-week break, I had 23,768 messages in my inbox. And 14,460 of them were unread.

I had to do something. I kept missing stuff. Forgetting stuff. Apologizing. And getting miffed and increasingly angry e-mails from friends and others who wondered why I was ignoring them. It wasn’t just vacation that put me so far behind. I’d been behind for more than a year. Vacation only made it worse. Every time I thought of my inbox, I’d start to hyperventilate.

I’d tried tackling it before: One night a few months ago, I was determined to stay at my desk until I’d powered through all the unread e-mails. At dawn, I was still powering through and nowhere near the end. And before long, the inbox was just as crammed as it had been before I lost that entire night’s sleep.

On the advice of a friend, I’d even hired a Virtual Assistant to help me with the backlog. But I had no idea how to use one. And though I’d read about people declaring e-mail bankruptcy when their inbox was overflowing — deleting everything and starting over from scratch — I was positive there were gems somewhere in that junk, and I couldn’t bear to lose them.

I knew I wasn’t alone. I’d get automatic response messages saying someone was on vacation and the only way they could relax was by telling me they’d never, ever look at my e-mail, so please send it again when they returned. My friend, Georgetown law professor Rosa Brooks, often sends out this auto response: “My inbox looks like Pompeii, post-volcano. Will respond as soon as I have time to excavate.” And another friend, whenever an e-mail is longer than one or two lines, sends a short note, “This sounds like a conversation,” and she won’t respond unless you call her.

E-mail made the late writer Nora Ephron’s list of the 22 things she won’t miss in life. Twice. In 2013, more than 182 billion e-mails were sent every day, no doubt clogging up millions of inboxes around the globe.

Bordering on despair, I sought help from four productivity gurus. And, following their advice, in two weeks of obsession-bordering-on-compulsion, my inbox was down to zero.

Here’s how.

*CREATE A SYSTEM. Julie Gray, a time coach who helps people dig out of e-mail overload all the time, said the first thing I had to change was my mind.

“This is such a pervasive problem. People think, ‘What am I doing wrong?’ They think they don’t have discipline or focus or that there’s some huge character flaw and they’re beating themselves up all the time. Which only makes it worse,” she said.

“So I first start changing their e-mail mindset from ‘This is an example of my failure,’ to ‘This just means I haven’t found the right system for me yet.’ It’s really all about finding your own path through the craziness.”

Do not spend another minute on e-mail, she admonished me, until you’ve begun to figure out a system. Otherwise, she said, I’d never dig out.

So we talked systems. It soon became clear that I’d created a really great e-mail system for when I was writing my book — ironically enough, on being overwhelmed — spending most of my time not at all overwhelmed in yoga pants in my home office working on my iMac. I was a follower of Randy Pausch who wrote, in “The Last Lecture,” to keep your e-mail inbox down to one page and religiously file everything once you’ve handled it. And I had for a couple years.

But now that I was traveling around the country to talk about the book, and back at work at The Washington Post, using my laptop, iPhone and iPad, that system was completely broken. I had six different e-mail accounts. And my main Verizon e-mail that I’d used for years and the Mac Mail inbox with meticulous file folders that I loved on my iMac didn’t sync across any of them.

Gray asked: “If everything just blew up today, and you had to start over, how would you set up your system?”

I wanted one inbox. One e-mail account. And I wanted the same inbox on all my devices. If I deleted an e-mail on my laptop, I wanted it deleted on my iMac. If I put an e-mail into a folder on my iMac, I wanted that same folder on my laptop.

So I decided to use Gmail, which does sync, as my main account. I set up an auto responder on my Verizon e-mail saying I was no longer using it and directing people to my Gmail account. I updated all my accounts to send to Gmail. And I spent hours on the phone with Apple one Sunday (thank you, Chazz,) to get my Gmail account set up in my beloved Mac mail inbox that would sync. Then I transferred old files and created new ones on Gmail. I had to keep my Washington Post account separate, but that wasn’t the real problem.

All systems go.
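None of the gurus quoted here suggest scripting any of this, but for readers who like to measure the damage before digging out, a short Python sketch using the standard-library imaplib module can tally unread messages across accounts. The host names, addresses and passwords below are placeholders, and Gmail in particular expects an app-specific password or OAuth rather than your normal login.

    import imaplib

    # Hypothetical accounts; replace hosts, addresses and passwords with your own.
    ACCOUNTS = [
        ("imap.gmail.com", "me@gmail.com", "app-password"),
        ("imap.example.com", "me@example.com", "password"),
    ]

    for host, user, password in ACCOUNTS:
        with imaplib.IMAP4_SSL(host) as conn:
            conn.login(user, password)
            conn.select("INBOX", readonly=True)        # read-only: we are just counting
            _, data = conn.search(None, "UNSEEN")      # IDs of unread messages
            unread = len(data[0].split()) if data[0] else 0
            print(f"{user}: {unread} unread")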

Read the entire article here.

Image courtesy of Google Search.

 

Robin Williams, You Will Be Missed


Mork returned to Ork this weekend; sadly, his creator Robin Williams passed away on August 11, 2014. He was 63. His unique comic genius will be sorely missed.

From NYT:

Some years ago, at a party at the Cannes Film Festival, I was leaning against a rail watching a fireworks display when I heard a familiar voice behind me. Or rather, at least a dozen voices, punctuating the offshore explosions with jokes, non sequiturs and off-the-wall pop-cultural, sexual and political references.

There was no need to turn around: The voices were not talking directly to me and they could not have belonged to anyone other than Robin Williams, who was extemporizing a monologue at least as pyrotechnically amazing as what was unfolding against the Mediterranean sky. I’m unable to recall the details now, but you can probably imagine the rapid-fire succession of accents and pitches — macho basso, squeaky girly, French, Spanish, African-American, human, animal and alien — entangling with curlicues of self-conscious commentary about the sheer ridiculousness of anyone trying to narrate explosions of colored gunpowder in real time.

Part of the shock of his death on Monday came from the fact that he had been on — ubiquitous, self-reinventing, insistently present — for so long. On Twitter, mourners dated themselves with memories of the first time they had noticed him. For some it was the movie “Aladdin.” For others “Dead Poets Society” or “Mrs. Doubtfire.” I go back even further, to the “Mork and Mindy” television show and an album called “Reality — What a Concept” that blew my eighth-grade mind.

Back then, it was clear that Mr. Williams was one of the most explosively, exhaustingly, prodigiously verbal comedians who ever lived. The only thing faster than his mouth was his mind, which was capable of breathtaking leaps of free-associative absurdity. Janet Maslin, reviewing his standup act in 1979, cataloged a tumble of riffs that ranged from an impression of Jacques Cousteau to “an evangelist at the Disco Temple of Comedy,” to Truman Capote Jr. at “the Kindergarten of the Stars” (whatever that was). “He acts out the Reader’s Digest condensed version of ‘Roots,’ ” Ms. Maslin wrote, “which lasts 15 seconds in its entirety. He improvises a Shakespearean-sounding epic about the Three Mile Island nuclear disaster, playing all the parts himself, including Einstein’s ghost.” (That, or something like it, was a role he would reprise more than 20 years later in Steven Spielberg’s “A.I.”)

Read the entire article here.

Image courtesy of Google Search.

Kissing for the Sake of Art

The process for many artists is often long and arduous. Despite the creative and, usually, fulfilling end result, the path is frequently punctuated with disrespect, self-deprivation, suffering and pain. Indeed, many artists have paid a heavier price for their expression: censorship, imprisonment, torture, death.

So, it’s refreshing to see an artist taking a more pleasure-filled route. Kissing. Someone has to do it!

From the Guardian:

From the naked women that Yves Klein covered in blue paint to Terry Richardson’s bevy of porny subjects, the art world is full of work that for one person seems liberated and for another exploitative. Continuing to skirt that line is Jedediah Johnson, an American photographer whose ongoing series the Makeout Project involves him putting on lipstick then kissing people, before documenting the resulting smears in portraits.

Johnson’s shots are really striking, with his LaChapellian palette of bright colours making the lipstick jump out from its surprisingly circuitous path across each person’s face. The subjects look variously flirtatious, amused and ashamed; some have strange narratives, like the woman who is holding a baby just out of shot, her partner hovering off to one side.

It’s sensational enough to have been covered in the Daily Mail with their characteristically BIZARRE use of capitalisation, perhaps chiefly because it seems cheeky – or indeed sleazy. “People say ‘oh, it’s disgusting and he’s just doing it to get cheap thrills’, and I guess that is kind of not totally untrue,” Johnson tells me, explaining the germ of his project. “I just got this thought of this lipstick mark on your face when someone kisses you as being a powerful, loaded gesture that could communicate a lot. And also, y’know, there were a lot of people I knew who I wanted to kiss.” It was a way of addressing his “romantic anxiety”, which was holding him back from kissing those he desired.

So he started asking to kiss people at parties, generally picking someone he knew first of all, so the other partygoers could see it was an art project rather than a novel way of getting his end away. After a while, he graduated to studio portraits, and not just of attractive young women. He says he didn’t want to be “the guy using art as an excuse to kiss people he wants to – and I don’t think there’s necessarily anything wrong with that, but that’s just not who I wanted to be. So I’m going to have to kiss some people I don’t want to.” This includes a series of (still pretty attractive) men, who ended up teaching Jedediah a lot. “I didn’t realise people lead a kiss – I would always just kiss people, and I was leading, and I had no idea. There have been a couple of times when I kissed guys and they led; I tried to move into different real estate on their face, and they wouldn’t let me.”

His work is understandably misinterpreted though, with some people seeing the hand that cradles the face in each shot as a controlling, violent image. “I understand that when you are just pointing the viewer in a direction, they come up with stuff you’re not into.” But the only thing that really grates him is when people accuse him of not making art. “I have two degrees in art, and I don’t feel I have the ability to declare whether something is art or not. It’s an awful thing to say.”

The intrigue of his images comes from trying to assess the dynamic between the pair, from the woman biting her lip faux-seductively to those trying to hide their feelings about what’s just happened. Is there ever an erotic charge? “A few times it’s got really real for me; there’s some where I was probably like oh that was nice, and they’re thinking oh that was incredible, I don’t know what to do now. The different levels are very interesting.” He has had one unfortunate bad breath incident, though: “I was like hey, let’s make out, and she was like, great, just let me finish my garlic string beans. She still had garlic in her mouth.”

Read the entire story and see more images here.

Visit Jedediah Johnson’s website to see the entire Makeout Project here.

Image: The Makeout Project by Jedediah Johnson. Courtesy of Jedediah Johnson / Guardian.

Privacy and Potato Chips

Privacy, and the lack thereof, is much in the news and on our minds. New revelations of data breaches, phone taps, corporate hackers and governmental overreach surface on a daily basis. So it is no surprise to learn that researchers have found a cheap way to eavesdrop on our conversations via a potato chip packet (a crisp packet, to our British-English readers). No news yet on which flavor of chip makes for the best spying!

From ars technica:

Watch enough spy thrillers, and you’ll undoubtedly see someone setting up a bit of equipment that points a laser at a distant window, letting the snoop listen to conversations on the other side of the glass. This isn’t something Hollywood made up; high-tech snooping devices of this sort do exist, and they take advantage of the extremely high-precision measurements made possible with lasers in order to measure the subtle vibrations caused by sound waves.

A team of researchers has now shown, however, that you can skip the lasers. All you really need is a consumer-level digital camera and a conveniently located bag of Doritos. A glass of water or a plant would also do.

Good vibrations

Despite the differences in the technology involved, both approaches rely on the same principle: sound travels on waves of higher and lower pressure in the air. When these waves reach a flexible object, they set off small vibrations in the object. If you can detect these vibrations, it’s possible to reconstruct the sound. Laser-based systems detect the vibrations by watching for changes in the reflections of the laser light, but researchers wondered whether you could simply observe the object directly, using the ambient light it reflects. (The team involved researchers at MIT, Adobe Research, and Microsoft Research.)

The research team started with a simple test system made from a loudspeaker playing a rising tone, a high-speed camera, and a variety of objects: water, cardboard, a candy wrapper, some metallic foil, and (as a control) a brick. Each of these (even the brick) showed some response at the lowest end of the tonal range, but the other objects, particularly the cardboard and foil, had a response into much higher tonal regions. To observe the changes in ambient light, the camera didn’t have to capture the object at high resolution—it was used at 700 x 700 pixels or less—but it did have to be high-speed, capturing as many as 20,000 frames a second.

Processing the images wasn’t simple, however. A computer had to perform a weighted average over all the pixels captured, and even a twin 3.5GHz machine with 32GB of RAM took more than two hours to process one capture. Nevertheless, the results were impressive, as the algorithm was able to detect motion on the order of a thousandth of a pixel. This enabled the system to recreate the audio waves emitted by the loudspeaker.
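To make the flavor of that processing concrete, here is a deliberately naive Python sketch (numpy only): collapse each high-speed frame to a single weighted-average intensity value and treat the resulting sequence, sampled at the camera's frame rate, as a candidate audio signal. The real pipeline extracts local motion at many scales and orientations before averaging, so this toy version only illustrates the shape of the computation; the function name and the synthetic frames below are made up for illustration.

```python
import numpy as np

def frames_to_signal(frames, fps=20000, weights=None):
    """Toy visual-microphone sketch: reduce each frame to one number.

    frames  : array of shape (n_frames, height, width), grayscale intensities
    fps     : camera frame rate in Hz, which becomes the audio sample rate
    weights : optional per-pixel weights emphasising the most responsive pixels
    """
    frames = np.asarray(frames, dtype=float)
    if weights is None:
        weights = np.ones(frames.shape[1:])
    weights = weights / weights.sum()

    # Weighted-average intensity per frame -> a 1-D time series.
    signal = (frames * weights).sum(axis=(1, 2))

    # Remove the constant (DC) level and normalise; what remains tracks the
    # tiny frame-to-frame brightness changes caused by the object vibrating.
    signal = signal - signal.mean()
    peak = np.abs(signal).max()
    if peak > 0:
        signal = signal / peak
    return signal, fps

# Usage: fake frames whose brightness wobbles at 440 Hz are "heard" again
# as a 440 Hz tone sampled at the 20,000 fps frame rate quoted above.
t = np.arange(2000) / 20000.0
wobble = 0.5 * np.sin(2 * np.pi * 440 * t)
fake_frames = 128 + wobble[:, None, None] * np.ones((2000, 8, 8))
audio, rate = frames_to_signal(fake_frames, fps=20000)
```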

Most of the rest of the paper describing the results involved making things harder on the system, as the researchers shifted to using human voices and moving the camera outside the room. They also showed that pre-testing the vibrating object’s response to a tone scale could help them improve their processing.

But perhaps the biggest surprise came when they showed that they didn’t actually need a specialized, high-speed camera. It turns out that most consumer-grade equipment doesn’t expose its entire sensor at once and instead scans an image across the sensor grid in a line-by-line fashion. Using a consumer video camera, the researchers were able to determine that there’s a 16 microsecond delay between each line, with a five millisecond delay between frames. Using this information, they treated each line as a separate exposure and were able to reproduce sound that way.
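The rolling-shutter trick is mostly timestamp bookkeeping: if each scan line is read out 16 microseconds after the previous one, every line can be stamped with its own capture time, and the per-line intensities can then be resequenced into a far denser stream of samples than the frame rate alone would provide. A rough numpy sketch of that bookkeeping follows; it assumes, purely for illustration, that the five-millisecond figure is the gap between the last line of one frame and the first line of the next, and the helper names are invented.

```python
import numpy as np

def rolling_shutter_timestamps(n_frames, n_lines,
                               line_delay=16e-6, frame_gap=5e-3):
    """Capture time of every scan line in every frame (seconds).

    Assumes frame_gap is the dead time between the end of one frame's
    readout and the start of the next frame's readout.
    """
    frame_period = n_lines * line_delay + frame_gap
    frame_starts = np.arange(n_frames) * frame_period      # shape (n_frames,)
    line_offsets = np.arange(n_lines) * line_delay          # shape (n_lines,)
    return frame_starts[:, None] + line_offsets[None, :]    # (n_frames, n_lines)

def lines_to_signal(frames):
    """Average each scan line to one value: one sample per line, not per frame."""
    frames = np.asarray(frames, dtype=float)
    per_line = frames.mean(axis=2)          # (n_frames, n_lines)
    return per_line.ravel()                 # ordered to match the timestamps

# 60 frames of a 480-line sensor yield 28,800 unevenly spaced samples
# instead of just 60 -- which is what makes speech recovery plausible.
times = rolling_shutter_timestamps(60, 480).ravel()
```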

Read the entire article here.

Image courtesy of Google Search.


The Enigma of Privacy

Privacy is still a valued and valuable right; it should not be treated as a mere perk of living in a democratic society. But in our current age privacy is becoming an increasingly endangered species. We are surrounded by social networks that share and mine our behaviors, and we are assaulted by the snoopers and spooks of local and national governments.

From the Observer:

We have come to the end of privacy; our private lives, as our grandparents would have recognised them, have been winnowed away to the realm of the shameful and secret. To quote ex-tabloid hack Paul McMullan, “privacy is for paedos”. Insidiously, through small concessions that only mounted up over time, we have signed away rights and privileges that other generations fought for, undermining the very cornerstones of our personalities in the process. While outposts of civilisation fight pyrrhic battles, unplugging themselves from the web – “going dark” – the rest of us have come to accept that the majority of our social, financial and even sexual interactions take place over the internet and that someone, somewhere, whether state, press or corporation, is watching.

The past few years have brought an avalanche of news about the extent to which our communications are being monitored: WikiLeaks, the phone-hacking scandal, the Snowden files. Uproar greeted revelations about Facebook’s “emotional contagion” experiment (where it tweaked mathematical formulae driving the news feeds of 700,000 of its members in order to prompt different emotional responses). Cesar A Hidalgo of the Massachusetts Institute of Technology described the Facebook news feed as “like a sausage… Everyone eats it, even though nobody knows how it is made”.

Sitting behind the outrage was a particularly modern form of disquiet – the knowledge that we are being manipulated, surveyed, rendered and that the intelligence behind this is artificial as well as human. Everything we do on the web, from our social media interactions to our shopping on Amazon, to our Netflix selections, is driven by complex mathematical formulae that are invisible and arcane.

Most recently, campaigners’ anger has turned upon the so-called Drip (Data Retention and Investigatory Powers) bill in the UK, which will see internet and telephone companies forced to retain and store their customers’ communications (and provide access to this data to police, government and up to 600 public bodies). Every week, it seems, brings a new furore over corporations – Apple, Google, Facebook – sidling into the private sphere. Often, it’s unclear whether the companies act brazenly because our governments play so fast and loose with their citizens’ privacy (“If you have nothing to hide, you’ve nothing to fear,” William Hague famously intoned); or if governments see corporations feasting upon the private lives of their users and have taken this as a licence to snoop, pry, survey.

We, the public, have looked on, at first horrified, then cynical, then bored by the revelations, by the well-meaning but seemingly useless protests. But what is the personal and psychological impact of this loss of privacy? What legal protection is afforded to those wishing to defend themselves against intrusion? Is it too late to stem the tide now that scenes from science fiction have become part of the fabric of our everyday world?

Novels have long been the province of the great What If?, allowing us to see the ramifications from present events extending into the murky future. As long ago as 1921, Yevgeny Zamyatin imagined One State, the transparent society of his dystopian novel, We. For Orwell, Huxley, Bradbury, Atwood and many others, the loss of privacy was one of the establishing nightmares of the totalitarian future. Dave Eggers’s 2013 novel The Circle paints a portrait of an America without privacy, where a vast, internet-based, multimedia empire surveys and controls the lives of its people, relying on strict adherence to its motto: “Secrets are lies, sharing is caring, and privacy is theft.” We watch as the heroine, Mae, disintegrates under the pressure of scrutiny, finally becoming one of the faceless, obedient hordes. A contemporary (and because of this, even more chilling) account of life lived in the glare of the privacy-free internet is Nikesh Shukla’s Meatspace, which charts the existence of a lonely writer whose only escape is into the shallows of the web. “The first and last thing I do every day,” the book begins, “is see what strangers are saying about me.”

Our age has seen an almost complete conflation of the previously separate spheres of the private and the secret. A taint of shame has crept over from the secret into the private so that anything that is kept from the public gaze is perceived as suspect. This, I think, is why defecation is so often used as an example of the private sphere. Sex and shitting were the only actions that the authorities in Zamyatin’s One State permitted to take place in private, and these remain the battlegrounds of the privacy debate almost a century later. A rather prim leaked memo from a GCHQ operative monitoring Yahoo webcams notes that “a surprising number of people use webcam conversations to show intimate parts of their body to the other person”.

It is to the bathroom that Max Mosley turns when we speak about his own campaign for privacy. “The need for a private life is something that is completely subjective,” he tells me. “You either would mind somebody publishing a film of you doing your ablutions in the morning or you wouldn’t. Personally I would and I think most people would.” In 2008, Mosley’s “sick Nazi orgy”, as the News of the World glossed it, featured in photographs published first in the pages of the tabloid and then across the internet. Mosley’s defence argued, successfully, that the romp involved nothing more than a “standard S&M prison scenario” and the former president of the FIA won £60,000 damages under Article 8 of the European Convention on Human Rights. Now he has rounded on Google and the continued presence of both photographs and allegations on websites accessed via the company’s search engine. If you type “Max Mosley” into Google, the eager autocomplete presents you with “video,” “case”, “scandal” and “with prostitutes”. Half-way down the first page of the search we find a link to a professional-looking YouTube video montage of the NotW story, with no acknowledgment that the claims were later disproved. I watch it several times. I feel a bit grubby.

“The moment the Nazi element of the case fell apart,” Mosley tells me, “which it did immediately, because it was a lie, any claim for public interest also fell apart.”

Here we have a clear example of the blurred lines between secrecy and privacy. Mosley believed that what he chose to do in his private life, even if it included whips and nipple-clamps, should remain just that – private. The News of the World, on the other hand, thought it had uncovered a shameful secret that, given Mosley’s professional position, justified publication. There is a momentary tremor in Mosley’s otherwise fluid delivery as he speaks about the sense of invasion. “Your privacy or your private life belongs to you. Some of it you may choose to make available, some of it should be made available, because it’s in the public interest to make it known. The rest should be yours alone. And if anyone takes it from you, that’s theft and it’s the same as the theft of property.”

Mosley has scored some recent successes, notably in continental Europe, where he has found a culture more suspicious of Google’s sweeping powers than in Britain or, particularly, the US. Courts in France and then, interestingly, Germany, ordered Google to remove pictures of the orgy permanently, with far-reaching consequences for the company. Google is appealing against the rulings, seeing it as absurd that “providers are required to monitor even the smallest components of content they transmit or store for their users”. But Mosley last week extended his action to the UK, filing a claim in the high court in London.

Mosley’s willingness to continue fighting, even when he knows that it means keeping alive the image of his white, septuagenarian buttocks in the minds (if not on the computers) of the public, seems impressively principled. He has fallen victim to what is known as the Streisand Effect, where his very attempt to hide information about himself has led to its proliferation (in 2003 Barbra Streisand tried to stop people taking pictures of her Malibu home, ensuring photos were posted far and wide). Despite this, he continues to battle – both in court, in the media and by directly confronting the websites that continue to display the pictures. It is as if he is using that initial stab of shame, turning it against those who sought to humiliate him. It is noticeable that, having been accused of fetishising one dark period of German history, he uses another to attack Google. “I think, because of the Stasi,” he says, “the Germans can understand that there isn’t a huge difference between the state watching everything you do and Google watching everything you do. Except that, in most European countries, the state tends to be an elected body, whereas Google isn’t. There’s not a lot of difference between the actions of the government of East Germany and the actions of Google.”

All this brings us to some fundamental questions about the role of search engines. Is Google the de facto librarian of the internet, given that it is estimated to handle 40% of all traffic? Is it something more than a librarian, since its algorithms carefully (and with increasing use of your personal data) select the sites it wants you to view? To what extent can Google be held responsible for the content it puts before us?

Read the entire article here.

Frozen Moving Pictures

Recent works by the artist duo Floto+Warner could be mistaken for a family of bizarrely fluid alien life-forms rather than the 3D sculptures of colorful chemicals they actually are. While these still images of fluorescent airborne liquids certainly pay homage to Jackson Pollock, they have a unique and playful character all their own. And, in this case, the creative process is just as fascinating as the end result.

From Jonathan Jones over at the Guardian:

Luridly chemical colours hang in the air in the vast wastelands of Nevada in an eye-catching set of pictures by the New York art duo Floto+Warner. To make these images of bright liquids arrested in space, Cassandra and Jeremy Floto threw up cocktails of colour until their camera caught just the splashy, fluid, stilled moments they wanted to record. Apparently, Photoshop is not involved.

These images echo the great modern tradition that pictures motion, energy and flux. “Energy and motion made visible – memories arrested in space,” as Jackson Pollock said of his paintings that he made by dripping, flicking and throwing paint on to canvases laid on the floor. Pollock’s “action paintings” are the obvious source of Floto and Warner’s hurled colours: their photographs are playful riffs on Pollock. And they bring out one of the most startling things about his art: the sense it is still in motion even when it has stopped; the feel of paint being liquid long after it has dried.

Floto and Warner prove that Pollock is still the Great American Artist, 58 years after his death. American art still can’t help echoing him. Works from Robert Smithson’s Spiral Jetty to Andy Warhol’s piss paintings echo his free-ranging exploration of space and his dynamic expansion of the act of drawing.

Yet these images of arrested veils and clouds of colour also echo other attempts to capture living motion. In 1830 to 1831 Hokusai depicted The Great Wave off Kanagawa as a tower of blueness cresting into white foam and about to fall onto the boats helplessly caught in its path. Hokusai’s woodblock print is a decisive moment in the story of art. It takes motion as a topic, and distills its essence in an image at once dynamic and suspended.

Photographers would soon take up Hokusai’s challenge to understand the nature of motion. Famously, Eadweard Muybridge in the late 19th century took strange serial studies of human and animal bodies in motion. Yet the photographer whom Floto+Warner echo most vividly is Harold E Edgerton, who brought the scientific photography of movement into modern times in striking pictures of a foot kicking a ball or a bullet piercing an apple.

Read the entire story and see more of Floto+Warner’s images here.

Image: Green Salt, Floto+Warner. Courtesy of the Guardian.

The Cosmological Axis of Evil

The cosmos seems remarkably uniform — look in any direction, with the naked eye or the most powerful telescopes, and you’ll see much the same as in any other direction. Yet on a grand scale our universe shows some peculiar fluctuations that have cosmologists scratching their heads. The temperature of the universe, as mapped by the cosmic microwave background (CMB), varies slightly across vast regions of the sky, and the distribution of these temperature variations shows what seem to be non-random patterns. Cosmologists have dubbed the pattern the “axis of evil”.

From ars technica:

The Universe is incredibly regular. The variation of the cosmos’ temperature across the entire sky is tiny: a few millionths of a degree, no matter which direction you look. Yet the same light from the very early cosmos that reveals the Universe’s evenness also tells astronomers a great deal about the conditions that gave rise to irregularities like stars, galaxies, and (incidentally) us.

That light is the cosmic microwave background, and it provides some of the best knowledge we have about the structure, content, and history of the Universe. But it also contains a few mysteries: on very large scales, the cosmos seems to have a certain lopsidedness. That slight asymmetry is reflected in temperature fluctuations much larger than any galaxy, aligned on the sky in a pattern facetiously dubbed “the axis of evil.”

The lopsidedness is real, but cosmologists are divided over whether it reveals anything meaningful about the fundamental laws of physics. The fluctuations are sufficiently small that they could arise from random chance. We have just one observable Universe, but nobody sensible believes we can see all of it. With a sufficiently large cosmos beyond the reach of our telescopes, the rest of the Universe may balance the oddity that we can see, making it a minor, local variation.

However, if the asymmetry can’t be explained away so simply, it could indicate that some new physical mechanisms were at work in the early history of the Universe. As Amanda Yoho, a graduate student in cosmology at Case Western Reserve University, told Ars, “I think the alignments, in conjunction with all of the other large angle anomalies, must point to something we don’t know, whether that be new fundamental physics, unknown astrophysical or cosmological sources, or something else.”

Over the centuries, astronomers have provided increasing evidence that Earth, the Solar System, and the Milky Way don’t occupy a special position in the cosmos. Not only are we not at the center of existence—much less the corrupt sinkhole surrounded by the pure crystal heavens, as in early geocentric Christian theology—the Universe has no center and no edge.

In cosmology, that’s elevated to a principle. The Universe is isotropic, meaning it’s (roughly) the same in every direction. The cosmic microwave background (CMB) is the strongest evidence for the isotropic principle: the spectrum of the light reaching Earth from every direction indicates that it was emitted by matter at almost exactly the same temperature.

The Big Bang model explains why. In the early years of the Universe’s history, matter was very dense and hot, forming an opaque plasma of electrons, protons, and helium nuclei. The expansion of space-time thinned the plasma out until it cooled enough that stable atoms could form. That event, which ended roughly 380,000 years after the Big Bang, is known as recombination. The immediate side effect was to make the Universe transparent and liberate vast numbers of photons, most of which have traveled through space unmolested ever since.

We observe the relics of recombination in the form of the CMB. The temperature of the Universe today is about 2.73 degrees above absolute zero in every part of the sky. The lack of variation makes the cosmos nearly as close to a perfect thermal body as possible. However, measurements show anisotropies—tiny fluctuations in temperature, roughly 10 millionths of a degree or less. These irregularities later gave rise to areas where mass gathered. A perfectly featureless, isotropic cosmos would have no stars, galaxies, or planets full of humans.

To measure the physical size of these anisotropies, researchers turn the whole-sky map of temperature fluctuations into something called a power spectrum. That’s akin to the process of taking light from a galaxy and finding the component wavelengths (colors) that make it up. The power spectrum encompasses fluctuations over the whole sky down to very small variations in temperature. (For those with some higher mathematics knowledge, this process involves decomposing the temperature fluctuations in spherical harmonics.)
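For readers curious what that decomposition looks like in code, here is a small sketch in Python assuming the healpy package (a common HEALPix wrapper for whole-sky maps). It builds a stand-in map of random temperature fluctuations rather than real WMAP or Planck data, decomposes it into spherical harmonics, and reads off the power at the lowest multipoles, which correspond roughly to the quarter-, eighth- and sixteenth-of-the-sky scales mentioned in the next paragraph. The resolution and map values are placeholders, not measurements.

```python
import numpy as np
import healpy as hp  # assumes healpy is installed

nside = 64                       # HEALPix resolution parameter
npix = hp.nside2npix(nside)      # number of equal-area pixels on the sphere

# Stand-in whole-sky map of temperature fluctuations (arbitrary units);
# a real analysis would load a WMAP or Planck map here instead.
delta_T = 1e-5 * np.random.default_rng(0).standard_normal(npix)

# Decompose into spherical harmonics and collect the power per multipole l.
# Low multipoles (l = 2, 3, 4) capture the largest-scale fluctuations.
cl = hp.anafast(delta_T, lmax=3 * nside - 1)
for l in (2, 3, 4):
    print(f"l = {l}: power = {cl[l]:.3e}")
```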

Smaller details in the fluctuations tell cosmologists the relative amounts of ordinary matter, dark matter, and dark energy. However, some of the largest fluctuations—covering one-fourth, one-eighth, and one-sixteenth of the sky—are bigger than any structure in the Universe, therefore representing temperature variations across the whole sky.

Those large-scale fluctuations in the power spectrum are where something weird happens. The temperature variations are both larger than expected and aligned with each other to a high degree. That’s at odds with theoretical expectations: the CMB anisotropies should be randomly oriented, not aligned. In fact, the smaller-scale variations are random, which makes the deviation at larger scales that much stranger.

Kate Land and Joao Magueijo jokingly dubbed the strange alignment “the axis of evil” in a 2005 paper (freely available on the ArXiv), riffing on an infamous statement by then-US President George W. Bush. Their findings were based on data from an earlier observatory, the Wilkinson Microwave Anisotropy Probe (WMAP), but the follow-up Planck mission found similar results. There’s no question that the “axis of evil” is there; cosmologists just have to figure out what to think about it.

The task of interpretation is complicated by what’s called “cosmic variance,” or the fact that our observable Universe is just one region in a larger Universe. Random chance dictates that some pockets of the whole Universe will have larger or smaller fluctuations than others, and those fluctuations might even be aligned entirely by coincidence.

In other words, the “axis of evil” could very well be an illusion, a pattern that wouldn’t seem amiss if we could see more of the Universe. However, cosmic variance also predicts how big those local, random deviations should be—and the fluctuations in the CMB data are larger. They’re not so large as to rule out the possibility of a local variation entirely—they’re above-average height—but cosmologists can’t easily dismiss the possibility that something else is going on.

Read the entire article here.

Image courtesy of Hinshaw et al WMAP paper.

Don’t Hitchhike, Unless You’re a Robot

A Canadian is trying valiantly to hitchhike across the nation, coast to coast — Nova Scotia to British Columbia. While others have made this trek before, this journey is peculiar in one respect: the intrepid hitchhiker is a child-sized robot. She or he — we don’t really know — is named hitchBOT.

hitchBOT is currently still in eastern Canada; New Brunswick, to be more precise. So one has to wonder whether (s)he would have made better progress by commandeering one of Google’s driverless cars for the 3,781-mile journey.

Read the entire story and follow hitchBOT’s progress across Canada here.

Image courtesy of hitchBOT / Independent.

Ugliness Behind the Beautiful Game

Qatar hosts the World Cup in 2022. This gives the emirate another eight years to finish construction of the various football venues, hotels and infrastructure required to support the world’s biggest single sporting event.

Perhaps it will also give the emirate some time to clean up its appalling record of worker abuse and human rights violations. Numerous laborers have died during the construction process, while others are paid minimal wages or not at all. And to top it off, most employees live in atrocious conditions, cannot move freely, and cannot change jobs or even repatriate — many come from the Indian subcontinent or East Asia. You could be forgiven for labeling these people indentured servants rather than workers.

From the Guardian:

Migrant workers who built luxury offices used by Qatar’s 2022 football World Cup organisers have told the Guardian they have not been paid for more than a year and are now working illegally from cockroach-infested lodgings.

Officials in Qatar’s Supreme Committee for Delivery and Legacy have been using offices on the 38th and 39th floors of Doha’s landmark al-Bidda skyscraper – known as the Tower of Football – which were fitted out by men from Nepal, Sri Lanka and India who say they have not been paid for up to 13 months’ work.

The project, a Guardian investigation shows, was directly commissioned by the Qatar government and the workers’ plight is set to raise fresh doubts over the autocratic emirate’s commitment to labour rights as construction starts this year on five new stadiums for the World Cup.

The offices, which cost £2.5m to fit, feature expensive etched glass, handmade Italian furniture, and even a heated executive toilet, project sources said. Yet some of the workers have not been paid, despite complaining to the Qatari authorities months ago and being owed wages as modest as £6 a day.

By the end of this year, several hundred thousand extra migrant workers from some of the world’s poorest countries are scheduled to have travelled to Qatar to build World Cup facilities and infrastructure. The acceleration in the building programme comes amid international concern over a rising death toll among migrant workers and the use of forced labour.

“We don’t know how much they are spending on the World Cup, but we just need our salary,” said one worker who had lost a year’s pay on the project. “We were working, but not getting the salary. The government, the company: just provide the money.”

The migrants are squeezed seven to a room, sleeping on thin, dirty mattresses on the floor and on bunk beds, in breach of Qatar’s own labour standards. They live in constant fear of imprisonment because they have been left without paperwork after the contractor on the project, Lee Trading and Contracting, collapsed. They say they are now being exploited on wages as low as 50p an hour.

Their case was raised with Qatar’s prime minister by Amnesty International last November, but the workers have said 13 of them remain stranded in Qatar. Despite having done nothing wrong, five have even been arrested and imprisoned by Qatari police because they did not have ID papers. Legal claims lodged against the former employer at the labour court in November have proved fruitless. They are so poor they can no longer afford the taxi to court to pursue their cases, they say.

A 35-year-old Nepalese worker and father of three said he too had lost a year’s pay: “If I had money to buy a ticket, I would go home.”

Qatar’s World Cup organising committee confirmed that it had been granted use of temporary offices on the floors fitted out by the unpaid workers. It said it was “heavily dismayed to learn of the behaviour of Lee Trading with regard to the timely payment of its workers”. The committee stressed it did not commission the firm. “We strongly disapprove and will continue to press for a speedy and fair conclusion to all cases,” it said.

Jim Murphy, the shadow international development secretary, said the revelation added to the pressure on the World Cup organising committee. “They work out of this building, but so far they can’t even deliver justice for the men who toiled at their own HQ,” he said.

Sharan Burrow, secretary general of the International Trade Union Confederation, said the workers’ treatment was criminal. “It is an appalling abuse of fundamental rights, yet there is no concern from the Qatar government unless they are found out,” she said. “In any other country you could prosecute this behaviour.”

Read the entire article here.

Image: Qatar. Courtesy of Google Maps.

MondayMap: Drought Mapping

The NYT has a fascinating and detailed article, bursting with charts and statistics, that shows the pervasive grip of the drought in the United States. The desert Southwest and West continue to be parched and scorched. This is not a pretty picture for farmers, and increasingly for those (sub-)urban dwellers who rely upon a fragile and dwindling water supply.

From the NYT:

Droughts appear to be intensifying over much of the West and Southwest as a result of global warming. Over the past decade, droughts in some regions have rivaled the epic dry spells of the 1930s and 1950s. About 34 percent of the contiguous United States was in at least a moderate drought as of July 22.

Things have been particularly bad in California, where state officials have approved drastic measures to reduce water consumption. California farmers, without water from reservoirs in the Central Valley, are left to choose which of their crops to water. Parts of Texas, Oklahoma and surrounding states are also suffering from drought conditions.

The relationship between the climate and droughts is complicated. Parts of the country are becoming wetter: East of the Mississippi, rainfall has been rising. But global warming also appears to be causing moisture to evaporate faster in places that were already dry. Researchers believe drought conditions in these places are likely to intensify in coming years.

There has been little relief for some places since the summer of 2012. At the recent peak this May, about 40 percent of the country was abnormally dry or in at least a moderate drought.

Read the entire story and see the statistics for yourself here.

Image courtesy of Drought Monitor / NYT.

Computer Generated Reality

[tube]nLtmEjqzg7M[/tube]

Computer games have come a very long way since the pioneering days of Pong and Pac-Man. Games are now so realistic that many are nearly indistinguishable from the real-world characters and scenarios they emulate. That is a testament to the skill and ingenuity of hardware and software engineers and to the creativity of the developers who bring all the diverse underlying elements of a game together. Now, however, they have a match in the form of a computer system that is able to generate richly imagined and rendered worlds for use in the games themselves. It’s all done through algorithms.

From Technology Review:

Read the entire story here.

Video: No Man’s Sky. Courtesy of Hello Games.

Gun Love

Gun Violence in America

The second amendment remains as strong as ever in the U.S. And, of course, so does the toll of homicides and child deaths at the hands of guns. Sigh!

From the Guardian:

In February, a nine-year-old Arkansas boy called Hank asked his uncle if he could head off on his own from their remote camp to hunt a rabbit with his .22 calibre rifle. “I said all right,” recalled his uncle Brent later. “It wasn’t a concern. Some people are like, ‘a nine year old shouldn’t be off by himself,’ but he wasn’t an average nine year old.”

Hank was steeped in hunting: when he was two, his father, Brad, would put him in a rucksack on his back when he went turkey hunting. Brad regularly took Hank hunting and said that his son often went off hunting by himself. On this particular day, Hank and his uncle Brent had gone squirrel hunting together as his father was too sick to go.

When Hank didn’t return from hunting the rabbit, his uncle raised the alarm. His mother, Kelli, didn’t learn about his disappearance for seven hours. “They didn’t want to bother me unduly,” she says.

The following morning, though, after police, family and hundreds of locals searched around the camp, Hank’s body was found by a creek with a single bullet wound to the forehead. The cause of death was, according to the police, most likely a hunting accident.

“He slipped and the butt of the gun hit the ground and the gun fired,” says Kelli.

Kelli had recently bought the gun for Hank. “It was the first gun I had purchased for my son, just a youth .22 rifle. I never thought it would be a gun that would take his life.”

Both Kelli and Brad, from whom she is separated, believe that the gun was faulty – it shouldn’t have gone off unless the trigger was pulled, they claim. Since Hank’s death, she’s been posting warnings on her Facebook page about the gun her son used: “I wish someone else had posted warnings about it before what happened,” she says.

Had Kelli not bought the gun and had Brad not trained his son to use it, Hank would have celebrated his 10th birthday on 6 June, which his mother commemorated by posting Hank’s picture on her Facebook page with the message: “Happy Birthday Hank! Mommy loves you!”

Little Hank thus became one in a tally of what the makers of a Channel 4 documentary called Kids and Guns claim to be 3,000 American children who die each year from gun-related accidents. A recent Yale University study found that more than 7,000 US children and adolescents are hospitalised or killed by guns each year and estimates that about 20 children a day are treated in US emergency rooms following incidents involving guns.

Hank’s story is striking, certainly for British readers, for two reasons. One, it dramatises how hunting is for many Americans not the privileged pursuit it is overwhelmingly here, but a traditional family activity as much to do with foraging for food as it is a sport.

Francine Shaw, who directed Kids and Guns, says: “In rural America … people hunt to eat.”

Kelli has a fond memory of her son coming home with what he’d shot. “He’d come in and say: “Momma – I’ve got some squirrel to cook.” And I’d say ‘Gee, thanks.’ That child was happy to bring home meat. He was the happiest child when he came in from shooting.”

But Hank’s story is also striking because it shows how raising kids to hunt and shoot is seen as good parenting, perhaps even as an essential part of bringing up children in America – a society rife with guns and temperamentally incapable of overturning the second amendment that confers the right to bear arms, no matter how many innocent Americans die or get maimed as a result.

“People know I was a good mother and loved him dearly,” says Kelli. “We were both really good parents and no one has said anything hateful to us. The only thing that has been said is in a news report about a nine year old being allowed to hunt alone.”

Does Kelli regret that Hank was allowed to hunt alone at that young age? “Obviously I do, because I’ve lost my son,” she tells me. But she doesn’t blame Brent for letting him go off from camp unsupervised with a gun.

“We’re sure not anti-gun here, but do I wish I could go back in time and not buy that gun? Yes I do. I know you in England don’t have guns. I wish I could go back and have my son back. I would live in England, away from the guns.”

Read the entire article here.

Infographic courtesy of Care2 via visua.ly

The Best

The United States is home to many firsts and superlatives: first in democracy, wealth, openness, industry, innovation. The nation also takes great pride in its personal and cultural freedoms. Yet it is also home to another superlative: first in rates of incarceration. In fact, the US leads other nations by such a wide margin that questions continue to be asked. In the land of the free, something must be wrong.

From the Atlantic:

On Friday, the U.S. Sentencing Commission voted unanimously to allow nearly 50,000 nonviolent federal drug offenders to seek lower sentences. The commission’s decision retroactively applied an earlier change in sentencing guidelines to now cover roughly half of those serving federal drug sentences. Endorsed by both the Department of Justice and prison-reform advocates, the move is a significant (though, in a global context, still modest) step forward in reversing decades of mass incarceration.

How large is America’s prison problem? More than 2.4 million people are behind bars in the United States today, either awaiting trial or serving a sentence. That’s more than the combined population of 15 states, all but three U.S. cities, and the U.S. armed forces. They’re scattered throughout a constellation of 102 federal prisons, 1,719 state prisons, 2,259 juvenile facilities, 3,283 local jails, and many more military, immigration, territorial, and Indian Country facilities.

Compared to the rest of the world, these numbers are staggering. Here’s how the United States’ incarceration rate compares with those of other modern liberal democracies like Britain and Canada:

That graph is from a recent report by Prison Policy Initiative, an invaluable resource on mass incarceration. (PPI also has a disturbing graph comparing state incarceration rates with those of other countries around the world, which I highly recommend looking at here.) “Although our level of crime is comparable to those of other stable, internally secure, industrialized nations,” the report says, “the United States has an incarceration rate far higher than any other country.”

Some individual states like Louisiana contribute disproportionately, but no state is free from mass incarceration. Disturbingly, many states’ prison populations outrank even those of dictatorships and illiberal democracies around the world. New York jails more people per capita than Rwanda, where tens of thousands await trial for their roles in the 1994 genocide. California, Illinois, and Ohio each have a higher incarceration rate than Cuba and Russia. Even Maine and Vermont imprison a greater share of people than Saudi Arabia, Venezuela, or Egypt.

But mass incarceration is more than just an international anomaly; it’s also a relatively recent phenomenon in American criminal justice. Starting in the 1970s with the rise of tough-on-crime politicians and the War on Drugs, America’s prison population jumped eightfold between 1970 and 2010.

These two metrics—the international and the historical—have to be seen together to understand how aberrant mass incarceration is. In time or in space, the warehousing of millions of Americans knows no parallels. In keeping with American history, however, it also disproportionately harms the non-white and the non-wealthy. “For a great many poor people in America, particularly poor black men, prison is a destination that braids through an ordinary life, much as high school and college do for rich white ones,” wrote Adam Gopnik in his seminal 2012 article.

Mass incarceration on a scale almost unexampled in human history is a fundamental fact of our country today—perhaps the fundamental fact, as slavery was the fundamental fact of 1850. In truth, there are more black men in the grip of the criminal-justice system—in prison, on probation, or on parole—than were in slavery then. Over all, there are now more people under “correctional supervision” in America—more than six million—than were in the Gulag Archipelago under Stalin at its height.

Mass incarceration’s effects are not confined to the cell block. Through the inescapable stigma it imposes, a brush with the criminal-justice system can hamstring a former inmate’s employment and financial opportunities for life. The effect is magnified for those who already come from disadvantaged backgrounds. Black men, for example, made substantial economic progress between 1940 and 1980 thanks to the post-war economic boom and the dismantling of de jure racial segregation. But mass incarceration has all but ground that progress to a halt: A new University of Chicago study found that black men are no better off in 2014 than they were when Congress passed the Civil Rights Act 50 years earlier.

Read the entire article here.

Climate Change Denial: English Only

It’s official: native English speakers are more likely to be in denial over climate change than non-English speakers. In fact, many of those who do not see a human hand in our planet’s environmental and climatic troubles are located in the United States, Britain, Australia and Canada. Enough said, in English.

Sacre bleu!

Now, the Guardian would have you believe that the media monopolist Rupert Murdoch is behind the climate change skeptics and deniers. After all, he is well known for his views on climate, and his empire controls large swathes of the media that most English-speaking people consume. However, it’s probably a little more complicated.

From the Guardian:

Here in the United States, we fret a lot about global warming denial. Not only is it a dangerous delusion, it’s an incredibly prevalent one. Depending on your survey instrument of choice, we regularly learn that substantial minorities of Americans deny, or are sceptical of, the science of climate change.

The global picture, however, is quite different. For instance, recently the UK-based market research firm Ipsos MORI released its “Global Trends 2014” report, which included a number of survey questions on the environment asked across 20 countries. (h/t Leo Hickman). And when it came to climate change, the result was very telling.

Note that these results are not perfectly comparable across countries, because the data were gathered online, and Ipsos MORI cautions that for developing countries like India and China, “the results should be viewed as representative of a more affluent and ‘connected’ population.”

Nonetheless, some pretty significant patterns are apparent. Perhaps most notably: Not only is the United States clearly the worst in its climate denial, but Great Britain and Australia are second and third worst, respectively. Canada, meanwhile, is the seventh worst.

What do these four nations have in common? They all speak the language of Shakespeare.

Why would that be? After all, presumably there is nothing about English, in and of itself, that predisposes you to climate change denial. Words and phrases like “doubt,” “natural causes,” “climate models,” and other sceptic mots are readily available in other languages. So what’s the real cause?

One possible answer is that it’s all about the political ideologies prevalent in these four countries.

The US climate change counter movement is comprised of 91 separate organizations, with annual funding, collectively, of “just over $900 million.” And they all speak English.

“I do not find these results surprising,” says Riley Dunlap, a sociologist at Oklahoma State University who has extensively studied the climate denial movement. “It’s the countries where neo-liberalism is most hegemonic and with strong neo-liberal regimes (both in power and lurking on the sidelines to retake power) that have bred the most active denial campaigns—US, UK, Australia and now Canada. And the messages employed by these campaigns filter via the media and political elites to the public, especially the ideologically receptive portions.” (Neoliberalism is an economic philosophy centered on the importance of free markets and broadly opposed to big government interventions.)

Indeed, the English language media in three of these four countries are linked together by a single individual: Rupert Murdoch. An apparent climate sceptic or lukewarmer, Murdoch is the chairman of News Corp and 21st Century Fox. (You can watch him express his climate views here.) Some of the media outlets subsumed by the two conglomerates that he heads are responsible for quite a lot of English language climate scepticism and denial.

In the US, Fox News and the Wall Street Journal lead the way; research shows that Fox watching increases distrust of climate scientists. (You can also catch Fox News in Canada.) In Australia, a recent study found that slightly under a third of climate-related articles in 10 top Australian newspapers “did not accept” the scientific consensus on climate change, and that News Corp papers — the Australian, the Herald Sun, and the Daily Telegraph — were particular hotbeds of scepticism. “The Australian represents climate science as a matter of opinion or debate rather than as a field for inquiry and investigation like all scientific fields,” noted the study.

And then there’s the UK. A 2010 academic study found that while News Corp outlets in this country from 1997 to 2007 did not produce as much strident climate scepticism as did their counterparts in the US and Australia, “the Sun newspaper offered a place for scornful sceptics on its opinion pages as did The Times and Sunday Times to a lesser extent.” (There are also other outlets in the UK, such as the Daily Mail, that feature plenty of scepticism but aren’t owned by News Corp.)

Thus, while there may not be anything inherent to the English language that impels climate denial, the fact that English language media are such a major source of that denial may in effect create a language barrier.

And media aren’t the only reason that denialist arguments are more readily available in the English language. There’s also the Anglophone nations’ concentration of climate “sceptic” think tanks, which provide the arguments and rationalisations necessary to feed this anti-science position.

According to a study in the journal Climatic Change earlier this year, the US is home to 91 different organisations (think tanks, advocacy groups, and trade associations) that collectively comprise a “climate change counter-movement.” The annual funding of these organisations, collectively, is “just over $900 million.” That is a truly massive amount of English-speaking climate “sceptic” activity, and while the study was limited to the US, it is hard to imagine that anything comparable exists in non-English speaking countries.

Read the entire article here.

A Godless Universe: Mind or Mathematics

In his science column for the NYT, George Johnson reviews several recent books by noted thinkers who, for different reasons, believe science needs to expand its borders. The philosopher Thomas Nagel and the physicist Max Tegmark both agree that our current understanding of the universe is rather limited and that science needs to turn to new or alternative explanations. Nagel, an atheist, suggests in his book Mind and Cosmos that the mind somehow needs to be considered a fundamental structure of the universe, while Tegmark, in his book Our Mathematical Universe: My Quest for the Ultimate Nature of Reality, suggests that mathematics is the core, irreducible framework of the cosmos. Two radically different ideas — yet both are correct in one respect: we still know so very little about ourselves and our surroundings.

From the NYT:

Though he probably didn’t intend anything so jarring, Nicolaus Copernicus, in a 16th-century treatise, gave rise to the idea that human beings do not occupy a special place in the heavens. Nearly 500 years after replacing the Earth with the sun as the center of the cosmic swirl, we’ve come to see ourselves as just another species on a planet orbiting a star in the boondocks of a galaxy in the universe we call home. And this may be just one of many universes — what cosmologists, some more skeptically than others, have named the multiverse.

Despite the long string of demotions, we remain confident, out here on the edge of nowhere, that our band of primates has what it takes to figure out the cosmos — what the writer Timothy Ferris called “the whole shebang.” New particles may yet be discovered, and even new laws. But it is almost taken for granted that everything from physics to biology, including the mind, ultimately comes down to four fundamental concepts: matter and energy interacting in an arena of space and time.

There are skeptics who suspect we may be missing a crucial piece of the puzzle. Recently, I’ve been struck by two books exploring that possibility in very different ways. There is no reason why, in this particular century, Homo sapiens should have gathered all the pieces needed for a theory of everything. In displacing humanity from a privileged position, the Copernican principle applies not just to where we are in space but to when we are in time.

Since it was published in 2012, “Mind and Cosmos,” by the philosopher Thomas Nagel, is the book that has caused the most consternation. With his taunting subtitle — “Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False” — Dr. Nagel was rejecting the idea that there was nothing more to the universe than matter and physical forces. He also doubted that the laws of evolution, as currently conceived, could have produced something as remarkable as sentient life. That idea borders on anathema, and the book quickly met with a blistering counterattack. Steven Pinker, a Harvard psychologist, denounced it as “the shoddy reasoning of a once-great thinker.”

What makes “Mind and Cosmos” worth reading is that Dr. Nagel is an atheist, who rejects the creationist idea of an intelligent designer. The answers, he believes, may still be found through science, but only by expanding it further than it may be willing to go.

“Humans are addicted to the hope for a final reckoning,” he wrote, “but intellectual humility requires that we resist the temptation to assume that the tools of the kind we now have are in principle sufficient to understand the universe as a whole.”

Dr. Nagel finds it astonishing that the human brain — this biological organ that evolved on the third rock from the sun — has developed a science and a mathematics so in tune with the cosmos that it can predict and explain so many things.

Neuroscientists assume that these mental powers somehow emerge from the electrical signaling of neurons — the circuitry of the brain. But no one has come close to explaining how that occurs.

That, Dr. Nagel proposes, might require another revolution: showing that mind, along with matter and energy, is “a fundamental principle of nature” — and that we live in a universe primed “to generate beings capable of comprehending it.” Rather than being a blind series of random mutations and adaptations, evolution would have a direction, maybe even a purpose.

“Above all,” he wrote, “I would like to extend the boundaries of what is not regarded as unthinkable, in light of how little we really understand about the world.”

Dr. Nagel is not alone in entertaining such ideas. While rejecting anything mystical, the biologist Stuart Kauffman has suggested that Darwinian theory must somehow be expanded to explain the emergence of complex, intelligent creatures. And David J. Chalmers, a philosopher, has called on scientists to seriously consider “panpsychism” — the idea that some kind of consciousness, however rudimentary, pervades the stuff of the universe.

Some of this is a matter of scientific taste. It can be just as exhilarating, as Stephen Jay Gould proposed in “Wonderful Life,” to consider the conscious mind as simply a fluke, no more inevitable than the human appendix or a starfish’s five legs. But it doesn’t seem so crazy to consider alternate explanations.

Heading off in another direction, a new book by the physicist Max Tegmark suggests that a different ingredient — mathematics — needs to be admitted into science as one of nature’s irreducible parts. In fact, he believes, it may be the most fundamental of all.

In a well-known 1960 essay, the physicist Eugene Wigner marveled at “the unreasonable effectiveness of mathematics” in explaining the world. It is “something bordering on the mysterious,” he wrote, for which “there is no rational explanation.”

The best he could offer was that mathematics is “a wonderful gift which we neither understand nor deserve.”

Dr. Tegmark, in his new book, “Our Mathematical Universe: My Quest for the Ultimate Nature of Reality,” turns the idea on its head: The reason mathematics serves as such a forceful tool is that the universe is a mathematical structure. Going beyond Pythagoras and Plato, he sets out to show how matter, energy, space and time might emerge from numbers.

Read the entire article here.

Non-Spooky Action at a Distance

Albert Einstein famously called quantum entanglement “spooky action at a distance”. It refers to the notion that measuring the state of one of two entangled particles makes the state of the second particle known instantaneously, regardless of the distance separating the two particles. Entanglement seems to link these particles and make them behave as one system. This peculiar characteristic has been a core element of the counterintuitive world of quantum theory. Yet while experiments have verified this spookiness, other theorists maintain that both theory and experiment are flawed, and that a different interpretation is required. However, one such competing theory — the many worlds interpretation — makes equally spooky predictions.

From ars technica:

Quantum nonlocality, perhaps one of the most mysterious features of quantum mechanics, may not be a real phenomenon. Or at least that’s what a new paper in the journal PNAS asserts. Its author claims that nonlocality is nothing more than an artifact of the Copenhagen interpretation, the most widely accepted interpretation of quantum mechanics.

Nonlocality is a feature of quantum mechanics where particles are able to influence each other instantaneously regardless of the distance between them, an impossibility in classical physics. Counterintuitive as it may be, nonlocality is currently an accepted feature of the quantum world, apparently verified by many experiments. It’s achieved such wide acceptance that even if our understandings of quantum physics turn out to be completely wrong, physicists think some form of nonlocality would be a feature of whatever replaced it.

The term “nonlocality” comes from the fact that this “spooky action at a distance,” as Einstein famously called it, seems to put an end to our intuitive ideas about location. Nothing can travel faster than the speed of light, so if two quantum particles can influence each other faster than light could travel between the two, then on some level, they act as a single system—there must be no real distance between them.

The concept of location is a bit strange in quantum mechanics anyway. Each particle is described by a mathematical quantity known as the “wave function.” The wave function describes a probability distribution for the particle’s location, but not a definite location. These probable locations are not just scientists’ guesses at the particle’s whereabouts; they’re actual, physical presences. That is to say, the particles exist in a swarm of locations at the same time, with some locations more probable than others.
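
(A brief aside from your editor, not part of the quoted article: in the standard formalism this “swarm of locations” is quantified by the Born rule. For a single particle in one dimension, the wave function assigns an amplitude to each position x, and the probability of finding the particle between x and x + dx upon measurement is

P(x)\,dx = |\psi(x)|^{2}\,dx, \qquad \int_{-\infty}^{\infty} |\psi(x)|^{2}\,dx = 1.

Nothing in this rule says where the particle “really” is before the measurement; that is exactly the interpretive question at issue below.)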

A measurement collapses the wave function so that the particle is no longer spread out over a variety of locations. It begins to act just like objects we’re familiar with—existing in one specific location.

The experiments that would measure nonlocality, however, usually involve two particles that are entangled, which means that both are described by a shared wave function. The wave function doesn’t just deal with the particle’s location, but with other aspects of its state as well, such as the direction of the particle’s spin. So if scientists can measure the spin of one of the two entangled particles, the shared wave function collapses and the spins of both particles become certain. This happens regardless of the distance between the particles.
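
(Another editorial aside, not from the article: the textbook example of such a shared wave function is the two-particle spin “singlet” state,

|\psi\rangle = \frac{1}{\sqrt{2}}\Big( |\uparrow\rangle_{1}|\downarrow\rangle_{2} - |\downarrow\rangle_{1}|\uparrow\rangle_{2} \Big),

where the arrows denote spin up or down along some chosen axis for particle 1 and particle 2. If a measurement of particle 1 along that axis gives “up,” a measurement of particle 2 along the same axis will give “down,” and vice versa, however far apart the two particles are. That built-in, distance-independent correlation is what the nonlocality debate is about.)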

The new paper calls all this into question.

The paper’s sole author, Frank Tipler, argues that the reason previous studies apparently confirmed quantum nonlocality is that they were relying on an oversimplified understanding of quantum physics in which the quantum world and the macroscopic world we’re familiar with are treated as distinct from one another. Even large structures obey the laws of quantum physics, Tipler points out, so the scientists making the measurements must be considered part of the system being studied.

It is intuitively easy to separate the quantum world from our everyday world, as they appear to behave so differently. However, the equations of quantum mechanics can be applied to large objects like human beings, and they essentially predict that you’ll behave just as classical physics—and as observation—says you will. (Physics students who have tried calculating their own wave functions can attest to this). The laws of quantum physics do govern the entire Universe, even if distinctly quantum effects are hard to notice at a macroscopic level.

When this is taken into account, according to Tipler, the results of familiar nonlocality experiments are altered. Typically, such experiments are thought to involve only two measurements: one on each of two entangled particles. But Tipler argues that in such experiments, there’s really a third measurement taking place when the scientists compare the results of the two.

This third measurement is crucial, Tipler argues, as without it, the first two measurements are essentially meaningless. Without comparing the first two, there’s no way to know that one particle’s behavior is actually linked to the other’s. And crucially, in order for the first two measurements to be compared, information must be exchanged between the particles, via the scientists, at a speed less than that of light. In other words, when the third measurement is taken into account, the two particles are not communicating faster than light. There is no “spooky action at a distance.”

Tipler has harsh criticism for the reasoning that led to nonlocality. “The standard argument that quantum phenomena are nonlocal goes like this,” he says in the paper. “(i) Let us add an unmotivated, inconsistent, unobservable, nonlocal process (collapse) to local quantum mechanics; (ii) note that the resulting theory is nonlocal; and (iii) conclude that quantum mechanics is [nonlocal].”

He’s essentially saying that scientists are arbitrarily adding nonlocality, which they can’t observe, and then claiming they have discovered nonlocality. Quite an accusation, especially for the science world. (The “collapse” he mentions is the collapse of the particle’s wave function, which he asserts is not a real phenomenon.) Instead, he claims that the experiments thought to confirm nonlocality are in fact confirming an alternative to the Copenhagen interpretation called the many-worlds interpretation (MWI). As its name implies, the MWI predicts the existence of other universes.

The Copenhagen interpretation has been summarized as “shut up and calculate.” Even though the consequences of a wave function-based world don’t make much intuitive sense, it works. The MWI tries to keep particles concrete at the cost of making our world a bit fuzzy. It posits that rather than becoming a wave function, particles remain distinct objects but enter one of a number of alternative universes, which recombine to a single one when the particle is measured.

Scientists who thought they were measuring nonlocality, Tipler claims, were in fact observing the effects of alternate universe versions of themselves, also measuring the same particles.

Part of the significance of Tipler’s claim is that he’s able to mathematically derive the same experimental results from the MWI without use of nonlocality. But this does not necessarily make for evidence that the MWI is correct; either interpretation remains consistent with the data. Until the two can be distinguished experimentally, it all comes down to whether you personally like or dislike nonlocality.

Read the entire article here.

We Are Back

Image: Old Kiln Trail, Boulder, Colorado, July 2014.

After a month-long respite, marred by sporadic writing, theDiagonal is finally back. Your friendly editor has relocated to Boulder, CO, where the air is fresh, the streams are cold, and natural beauty is all-enveloping. Writing continues apace.

 

Isolation Fractures the Mind

Through the lens of extreme isolation, Michael Bond shows us in this fascinating article how we really are social animals. Remove a person from all meaningful social contact — even for a short while — and her mind will begin to play tricks and eventually break. Michael Bond is the author of The Power of Others.

From the BBC:

When people are isolated from human contact, their mind can do some truly bizarre things, says Michael Bond. Why does this happen?

Sarah Shourd’s mind began to slip about two months into her incarceration. She heard phantom footsteps, saw flashing lights, and spent most of her day crouched on all fours, listening through a gap in the door.

That summer, the 32-year-old had been hiking with two friends in the mountains of Iraqi Kurdistan when they were arrested by Iranian troops after straying onto the border with Iran. Accused of spying, they were kept in solitary confinement in Evin prison in Tehran, each in their own tiny cell. She endured almost 10,000 hours with little human contact before she was freed. One of the most disturbing effects was the hallucinations.

“In the periphery of my vision, I began to see flashing lights, only to jerk my head around to find that nothing was there,” she wrote in the New York Times in 2011. “At one point, I heard someone screaming, and it wasn’t until I felt the hands of one of the friendlier guards on my face, trying to revive me, that I realised the screams were my own.”

We all want to be alone from time to time, to escape the demands of our colleagues or the hassle of crowds. But not alone alone. For most people, prolonged social isolation is all bad, particularly mentally. We know this not only from reports by people like Shourd who have experienced it first-hand, but also from psychological experiments on the effects of isolation and sensory deprivation, some of which had to be called off due to the extreme and bizarre reactions of those involved. Why does the mind unravel so spectacularly when we’re truly on our own, and is there any way to stop it?

We’ve known for a while that isolation is physically bad for us. Chronically lonely people have higher blood pressure, are more vulnerable to infection, and are also more likely to develop Alzheimer’s disease and dementia. Loneliness also interferes with a whole range of everyday functioning, such as sleep patterns, attention and logical and verbal reasoning. The mechanisms behind these effects are still unclear, though what is known is that social isolation unleashes an extreme immune response – a cascade of stress hormones and inflammation. This may have been appropriate in our early ancestors, when being isolated from the group carried big physical risks, but for us the outcome is mostly harmful.

Yet some of the most profound effects of loneliness are on the mind. For starters, isolation messes with our sense of time. One of the strangest effects is the ‘time-shifting’ reported by those who have spent long periods living underground without daylight. In 1961, French geologist Michel Siffre led a two-week expedition to study an underground glacier beneath the French Alps and ended up staying two months, fascinated by how the darkness affected human biology. He decided to abandon his watch and “live like an animal”. When his team on the surface ran counting tests with him, they discovered it took him five minutes to count out what he thought was 120 seconds.

A similar pattern of ‘slowing time’ was reported by Maurizio Montalbini, a sociologist and caving enthusiast. In 1993, Montalbini spent 366 days in an underground cavern near Pesaro in Italy that had been designed with Nasa to simulate space missions, breaking his own world record for time spent underground. When he emerged, he was convinced only 219 days had passed. His sleep-wake cycles had almost doubled in length. Since then, researchers have found that in darkness most people eventually adjust to a 48-hour cycle: 36 hours of activity followed by 12 hours of sleep. The reasons are still unclear.
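
(A quick back-of-the-envelope check on those numbers, from your editor rather than the BBC: 366 days underground amounts to 366 × 24 = 8,784 hours. If Montalbini’s sleep-wake cycle had stretched to exactly 48 hours, he would have lived through roughly 8,784 ÷ 48 ≈ 183 subjective “days.” His estimate of 219 days falls between that figure and the true 366, which fits a cycle that almost, but not quite, doubled in length.)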

As well as their time-shifts, Siffre and Montalbini reported periods of mental instability too. But these experiences were nothing compared with the extreme reactions seen in notorious sensory deprivation experiments in the mid-20th Century.

In the 1950s and 1960s, China was rumoured to be using solitary confinement to “brainwash” American prisoners captured during the Korean War, and the US and Canadian governments were all too keen to try it out. Their defence departments funded a series of research programmes that might be considered ethically dubious today.

The most extensive took place at McGill University Medical Center in Montreal, led by the psychologist Donald Hebb. The McGill researchers invited paid volunteers – mainly college students – to spend days or weeks by themselves in sound-proof cubicles, deprived of meaningful human contact. Their aim was to reduce perceptual stimulation to a minimum, to see how their subjects would behave when almost nothing was happening. They minimised what they could feel, see, hear and touch, fitting them with translucent visors, cotton gloves and cardboard cuffs extending beyond the fingertips. As Scientific American magazine reported at the time, they had them lie on U-shaped foam pillows to restrict noise, and set up a continuous hum of air-conditioning units to mask small sounds.

After only a few hours, the students became acutely restless. They started to crave stimulation, talking, singing or reciting poetry to themselves to break the monotony. Later, many of them became anxious or highly emotional. Their mental performance suffered too; they struggled with arithmetic and word association tests.

But the most alarming effects were the hallucinations. They would start with points of light, lines or shapes, eventually evolving into bizarre scenes, such as squirrels marching with sacks over their shoulders or processions of eyeglasses filing down a street. They had no control over what they saw: one man saw only dogs; another, babies.

Some of them experienced sound hallucinations as well: a music box or a choir, for instance. Others imagined sensations of touch: one man had the sense he had been hit in the arm by pellets fired from guns. Another, reaching out to touch a doorknob, felt an electric shock.

When they emerged from the experiment they found it hard to shake this altered sense of reality, convinced that the whole room was in motion, or that objects were constantly changing shape and size.

Read the entire article here.

 

The Art of Annoyance


Our favorite voyeurs and provocateurs of contemporary British culture are at it again. Artists Gilbert & George have resurfaced with a new and thoroughly annoying collection — Scapegoating Pictures. You can catch their latest treatise on the state of their city and nation at White Cube, London, from July 18 to September 28.

From the Guardian:

The world of art is overwhelmingly liberal and forward looking. Unless you start following the money into Charles Saatchi’s bank account, the mood, content and operating assumptions of contemporary art are strikingly leftwing, from Bob and Roberta Smith’s cute posters to Jeremy Deller’s people’s art. The consensus is so progressive it does not need saying.

Gilbert & George have never signed up to that consensus. I am not saying they are rightwing. I am definitely not saying they are “racist”. But throughout their long careers, from a nostalgia for Edwardian music-hall songs to a more unsettling affinity for skinheads, they have delighted in provoking … us, dear Guardian reader.

Their new exhibition of grand, relentless photomontages restates their defiant desire to offend on a colossal scale. I could almost hear them at my shoulder asking: “Are you annoyed yet?”

Then suddenly they were at my shoulder, as I wrote down choice quotes from Scapegoating Pictures, the scabrous triptych of slogan-spattered pictures that climaxes this exhibition. When I confessed I was wondering which ones I could quote in a newspaper they insisted it’s all quotable: “We have a free press.” So here goes: “Fuck the Vicar.” “Get Frotting.” “Be candid with christians.” “Jerk off a judge.” “Crucify a curator.” “Molest a mullah.”

This wall of insults, mostly directed at religion, is the manifesto of Gilbert & George’s new pictures – and yet you discover it only at the end of the show. Before revealing where they are really coming from in this dirty-mouthed atheist onslaught, they have teased you with all kinds of dubious paranoias. What are these old men – Gilbert & George are 70 and 72, and the self-portraits that warp and gyrate through this kaleidoscopic digital-age profusion of images make no attempt to conceal their ageing process – so scared of?

At times this exhibition is like going on a tour of east London with one of Ukip’s less presentable candidates. Just look at that woman veiling her face. And here is a poster calling for an Islamic state in Britain.

Far from being scared, these artists are bold as brass. No one is asking Gilbert & George to go over the top one more time and plumb the psychic depths of Britain. They’re respectable now; they could just sit back in their suits. But, in these turbulent and estranging works, they give voice to the divided reality of a country at one and the same time gloriously plural and savagely bigoted.

In reality, nothing could be further from the mentality of racists and little Englanders than the polymorphically playful world of Gilbert & George. Their images merge with the faces of young men of all races who have caught their eye. Bullet-like metal canisters pulse through the pictures like threats of violence. Yet these menacing forms are actually empty containers for the drug nitrous oxide found by the artists outside their home, things that look evil but are residues of ecstatic nights.

No other artists today portray their own time and place with the curiosity that Gilbert & George display here. Their own lives are starkly visible, as they walk around their local streets in Spitalfields, collecting the evidence of drug-fuelled mayhem and looking at the latest graffiti.

Read the entire story and see more of G & G’s works here.

Image: Clad, Gilbert & George, 2013. Courtesy of Gilbert & George / Guardian.

You Are a Neural Computation

Since the days of Aristotle, and later Descartes, thinkers have sought to explain consciousness and free will. More than two thousand years on, we are still pondering the notion; science has made great strides, and yet fundamentally we still have little idea.

Many neuroscientists, now armed with new and very precise research tools, are aiming to change this. Yet, increasingly, it seems that free will may indeed be a cognitive illusion. Evidence suggests that our subconscious decides and initiates action for us long before we are aware of making a conscious decision. There seems to be no god or ghost in the machine.

From Technology Review:

It was an expedition seeking something never caught before: a single human neuron lighting up to create an urge, albeit for the minor task of moving an index finger, before the subject was even aware of feeling anything. Four years ago, Itzhak Fried, a neurosurgeon at the University of California, Los Angeles, slipped several probes, each with eight hairlike electrodes able to record from single neurons, into the brains of epilepsy patients. (The patients were undergoing surgery to diagnose the source of severe seizures and had agreed to participate in experiments during the process.) Probes in place, the patients—who were conscious—were given instructions to press a button at any time of their choosing, but also to report when they’d first felt the urge to do so.

Later, Gabriel Kreiman, a neuroscientist at Harvard Medical School and Children’s Hospital in Boston, captured the quarry. Poring over data after surgeries in 12 patients, he found telltale flashes of individual neurons in the pre-­supplementary motor area (associated with movement) and the anterior cingulate (associated with motivation and attention), preceding the reported urges by anywhere from hundreds of milliseconds to several seconds. It was a direct neural measurement of the unconscious brain at work—caught in the act of formulating a volitional, or freely willed, decision. Now Kreiman and his colleagues are planning to repeat the feat, but this time they aim to detect pre-urge signatures in real time and stop the subject from performing the action—or see if that’s even possible.

A variety of imaging studies in humans have revealed that brain activity related to decision-making tends to precede conscious action. Implants in macaques and other animals have examined brain circuits involved in perception and action. But Kreiman broke ground by directly measuring a preconscious decision in humans at the level of single neurons. To be sure, the readouts came from an average of just 20 neurons in each patient. (The human brain has about 86 billion of them, each with thousands of connections.) And ultimately, those neurons fired only in response to a chain of even earlier events. But as more such experiments peer deeper into the labyrinth of neural activity behind decisions—whether they involve moving a finger or opting to buy, eat, or kill something—science could eventually tease out the full circuitry of decision-making and perhaps point to behavioral therapies or treatments. “We need to understand the neuronal basis of voluntary decision-making—or ‘freely willed’ decision-­making—and its pathological counterparts if we want to help people such as drug, sex, food, and gambling addicts, or patients with obsessive-compulsive disorder,” says Christof Koch, chief scientist at the Allen Institute of Brain Science in Seattle (see “Cracking the Brain’s Codes”). “Many of these people perfectly well know that what they are doing is dysfunctional but feel powerless to prevent themselves from engaging in these behaviors.”

Kreiman, 42, believes his work challenges important Western philosophical ideas about free will. The Argentine-born neuroscientist, an associate professor at Harvard Medical School, specializes in visual object recognition and memory formation, which draw partly on unconscious processes. He has a thick mop of black hair and a tendency to pause and think a long moment before reframing a question and replying to it expansively. At the wheel of his Jeep as we drove down Broadway in Cambridge, Massachusetts, Kreiman leaned over to adjust the MP3 player—toggling between Vivaldi, Lady Gaga, and Bach. As he did so, his left hand, the one on the steering wheel, slipped to let the Jeep drift a bit over the double yellow lines. Kreiman’s view is that his neurons made him do it, and they also made him correct his small error an instant later; in short, all actions are the result of neural computations and nothing more. “I am interested in a basic age-old question,” he says. “Are decisions really free? I have a somewhat extreme view of this—that there is nothing really free about free will. Ultimately, there are neurons that obey the laws of physics and mathematics. It’s fine if you say ‘I decided’—that’s the language we use. But there is no god in the machine—only neurons that are firing.”

Our philosophical ideas about free will date back to Aristotle and were systematized by René Descartes, who argued that humans possess a God-given “mind,” separate from our material bodies, that endows us with the capacity to freely choose one thing rather than another. Kreiman takes this as his departure point. But he’s not arguing that we lack any control over ourselves. He doesn’t say that our decisions aren’t influenced by evolution, experiences, societal norms, sensations, and perceived consequences. “All of these external influences are fundamental to the way we decide what we do,” he says. “We do have experiences, we do learn, we can change our behavior.”

But the firing of a neuron that guides us one way or another is ultimately like the toss of a coin, Kreiman insists. “The rules that govern our decisions are similar to the rules that govern whether a coin will land one way or the other. Ultimately there is physics; it is chaotic in both cases, but at the end of the day, nobody will argue the coin ‘wanted’ to land heads or tails. There is no real volition to the coin.”

Testing Free Will

It’s only in the past three to four decades that imaging tools and probes have been able to measure what actually happens in the brain. A key research milestone was reached in the early 1980s when Benjamin Libet, a researcher in the physiology department at the University of California, San Francisco, conducted a remarkable study that tested the idea of conscious free will with actual data.

Libet fitted subjects with EEGs—gadgets that measure aggregate electrical brain activity through the scalp—and had them look at a clock dial that spun around every 2.8 seconds. The subjects were asked to press a button whenever they chose to do so—but told they should also take note of where the time hand was when they first felt the “wish or urge.” It turns out that the actual brain activity involved in the action began 300 milliseconds, on average, before the subject was conscious of wanting to press the button. While some scientists criticized the methods—questioning, among other things, the accuracy of the subjects’ self-reporting—the study set others thinking about how to investigate the same questions. Since then, functional magnetic resonance imaging (fMRI) has been used to map brain activity by measuring blood flow, and other studies have also measured brain activity processes that take place before decisions are made. But while fMRI transformed brain science, it was still only an indirect tool, providing very low spatial resolution and averaging data from millions of neurons. Kreiman’s own study design was the same as Libet’s, with the important addition of the direct single-neuron measurement.

When Libet was in his prime, ­Kreiman was a boy. As a student of physical chemistry at the University of Buenos Aires, he was interested in neurons and brains. When he went for his PhD at Caltech, his passion solidified under his advisor, Koch. Koch was deep in collaboration with Francis Crick, co-discoverer of DNA’s structure, to look for evidence of how consciousness was represented by neurons. For the star-struck kid from Argentina, “it was really life-changing,” he recalls. “Several decades ago, people said this was not a question serious scientists should be thinking about; they either had to be smoking something or have a Nobel Prize”—and Crick, of course, was a Nobelist. Crick hypothesized that studying how the brain processed visual information was one way to study consciousness (we tap unconscious processes to quickly decipher scenes and objects), and he collaborated with Koch on a number of important studies. Kreiman was inspired by the work. “I was very excited about the possibility of asking what seems to be the most fundamental aspect of cognition, consciousness, and free will in a reductionist way—in terms of neurons and circuits of neurons,” he says.

One thing was in short supply: humans willing to have scientists cut open their skulls and poke at their brains. One day in the late 1990s, Kreiman attended a journal club—a kind of book club for scientists reviewing the latest literature—and came across a paper by Fried on how to do brain science in people getting electrodes implanted in their brains to identify the source of severe epileptic seizures. Before he’d heard of Fried, “I thought examining the activity of neurons was the domain of monkeys and rats and cats, not humans,” Kreiman says. Crick introduced Koch to Fried, and soon Koch, Fried, and Kreiman were collaborating on studies that investigated human neural activity, including the experiment that made the direct neural measurement of the urge to move a finger. “This was the opening shot in a new phase of the investigation of questions of voluntary action and free will,” Koch says.

Read the entire article here.

Go Forth And Declutter


Having only just recently relocated to Colorado’s wondrous Front Range of the Rocky Mountains, your friendly editor now finds himself surrounded by figurative, less inspiring mountains: moving boxes, bins, bags, more boxes. It’s floor-to-ceiling clutter as far as the eye can see.

Some of these boxes contain essentials, yet probably around 80 percent hold stuff. Yes, just stuff — aging items that hold some kind of sentimental meaning or future promise: old CDs, baby clothes, used ticket stubs, toys from an attic three moves ago, too many socks, ill-fitting clothing, 13 Allen wrenches and screwdrivers, first-grade school projects, photo negatives, fading National Geographic magazines, gummed-up fountain pens, European postcards…

So, here’s a very timely story on the psychology of clutter and hoarding.

From the WSJ:

Jennifer James and her husband don’t have a lot of clutter—but they do find it hard to part with their children’s things. The guest cottage behind their home in Oklahoma City is half-filled with old toys, outgrown clothing, artwork, school papers, two baby beds, a bassinet and a rocking horse.

“Every time I think about getting rid of it, I want to cry,” says Ms. James, a 46-year-old public-relations consultant. She fears her children, ages 6, 8 and 16, will grow up and think she didn’t love them if she doesn’t save it all. “In keeping all this stuff, I think someday I’ll be able to say to my children, ‘See—I treasured your innocence. I treasured you!’ “

Many powerful emotions are lurking amid stuff we keep. Whether it’s piles of unread newspapers, clothes that don’t fit, outdated electronics, even empty margarine tubs, the things we accumulate reflect some of our deepest thoughts and feelings.

Now there’s growing recognition among professional organizers that to come to grips with their clutter, clients need to understand why they save what they save, or things will inevitably pile up again. In some cases, therapists are working along with organizers to help clients confront their psychological demons.

“The work we do with clients goes so much beyond making their closets look pretty,” says Collette Shine, president of the New York chapter of the National Association of Professional Organizers. “It involves getting into their hearts and their heads.”

For some people—especially those with big basements—hanging onto old and unused things doesn’t present a problem. But many others say they’re drowning in clutter.

“I have clients who say they are distressed at all the clutter they have, and distressed at the thought of getting rid of things,” says Simon Rego, director of psychology training at Montefiore Medical Center in Bronx, N.Y., who makes house calls, in extreme cases, to help hoarders.

In some cases, chronic disorganization can be a symptom of Attention Deficit Hyperactivity Disorder, Obsessive-Compulsive Disorder and dementia—all of which involve difficulty with planning, focusing and making decisions.

The extreme form, hoarding, is now a distinct psychiatric disorder, defined in the new Diagnostic and Statistical Manual-5 as “persistent difficulty discarding possessions, regardless of their value” such that living areas cannot be used. Despite all the media attention, only 2% to 5% of people fit the criteria—although many more joke, or fear, they are headed that way.

Difficulty letting go of your stuff can also go hand in hand with separation anxiety, compulsive shopping, perfectionism, procrastination and body-image issues. And the reluctance to cope can create a vicious cycle of avoidance, anxiety and guilt.

In most cases, however, psychologists say that clutter can be traced to what they call cognitive errors—flawed thinking that drives dysfunctional behaviors that can get out of hand.

Among the most common clutter-generating bits of logic: “I might need these someday.” “These might be valuable.” “These might fit again if I lose (or gain) weight.”

“We all have these dysfunctional thoughts. It’s perfectly normal,” Dr. Rego says. The trick, he says, is to recognize the irrational thought that makes you cling to an item and substitute one that helps you let go, such as, “Somebody else could use this, so I’ll give it away.”

He concedes he has saved “maybe 600” disposable Allen wrenches that came with IKEA furniture over the years.

The biggest sources of clutter and the hardest to discard are things that hold sentimental meaning. Dr. Rego says it’s natural to want to hang onto objects that trigger memories, but some people confuse letting go of the object with letting go of the person.

Linda Samuels, president of the Institute for Challenging Disorganization, an education and research group, says there’s no reason to get rid of things just for the sake of doing it.

“Figure out what’s important to you and create an environment that supports that,” she says.

Robert McCollum, a state tax auditor and Ms. James’s husband, says he treasures items like the broken fairy wand one daughter carried around for months.

“I don’t want to lose my memories, and I don’t need a professional organizer,” he says. “I’ve already organized it all in bins.” The only problem would be if they ever move to a place that doesn’t have 1,000 square feet of storage, he adds.

Sometimes the memories people cling to are images of themselves in different roles or happier times. “Our closets are windows into our internal selves,” says Jennifer Baumgartner, a Baltimore psychologist and author of “You Are What You Wear.”

“Say you’re holding on to your team uniforms from college,” she says. “Ask yourself, what about that experience did you like? What can you do in your life now to recapture that?”

Somebody-might-need-this thinking is often what drives people to save stacks of newspapers, magazines, outdated electronic equipment, decades of financial records and craft supplies. With a little imagination, anything could be fodder for scrapbooks or Halloween costumes.

For people afraid to toss things they might want in the future, Dr. Baumgartner says it helps to have a worst-case scenario plan. “What if you do need that tutu you’ve given away for a Halloween costume? What would you do? You can find almost anything on eBay.”

Read the entire story here.

Image courtesy of Google search.

Questioning Quantum Orthodoxy

Physics works very well in explaining our world, yet it is also broken — it cannot, at the moment, reconcile our views of the very small (quantum theory) with those of the very large (relativity theory).

So although the probabilistic underpinnings of quantum theory have done wonders in allowing physicists to construct the Standard Model, gaps remain.

Back in the mid-1920s, the probabilistic worldview proposed by Niels Bohr and others gained favor and took hold. A competing theory, known as pilot-wave theory and proposed by a young Louis de Broglie, was given short shrift. Yet some theorists have maintained that it may do a better job of closing this core gap in our understanding — so it is time to revisit pilot-wave theory and breathe fresh life into it.

From Wired / Quanta:

For nearly a century, “reality” has been a murky concept. The laws of quantum physics seem to suggest that particles spend much of their time in a ghostly state, lacking even basic properties such as a definite location and instead existing everywhere and nowhere at once. Only when a particle is measured does it suddenly materialize, appearing to pick its position as if by a roll of the dice.

This idea that nature is inherently probabilistic — that particles have no hard properties, only likelihoods, until they are observed — is directly implied by the standard equations of quantum mechanics. But now a set of surprising experiments with fluids has revived old skepticism about that worldview. The bizarre results are fueling interest in an almost forgotten version of quantum mechanics, one that never gave up the idea of a single, concrete reality.

The experiments involve an oil droplet that bounces along the surface of a liquid. The droplet gently sloshes the liquid with every bounce. At the same time, ripples from past bounces affect its course. The droplet’s interaction with its own ripples, which form what’s known as a pilot wave, causes it to exhibit behaviors previously thought to be peculiar to elementary particles — including behaviors seen as evidence that these particles are spread through space like waves, without any specific location, until they are measured.

Particles at the quantum scale seem to do things that human-scale objects do not do. They can tunnel through barriers, spontaneously arise or annihilate, and occupy discrete energy levels. This new body of research reveals that oil droplets, when guided by pilot waves, also exhibit these quantum-like features.

To some researchers, the experiments suggest that quantum objects are as definite as droplets, and that they too are guided by pilot waves — in this case, fluid-like undulations in space and time. These arguments have injected new life into a deterministic (as opposed to probabilistic) theory of the microscopic world first proposed, and rejected, at the birth of quantum mechanics.

“This is a classical system that exhibits behavior that people previously thought was exclusive to the quantum realm, and we can say why,” said John Bush, a professor of applied mathematics at the Massachusetts Institute of Technology who has led several recent bouncing-droplet experiments. “The more things we understand and can provide a physical rationale for, the more difficult it will be to defend the ‘quantum mechanics is magic’ perspective.”

Magical Measurements

The orthodox view of quantum mechanics, known as the “Copenhagen interpretation” after the home city of Danish physicist Niels Bohr, one of its architects, holds that particles play out all possible realities simultaneously. Each particle is represented by a “probability wave” weighting these various possibilities, and the wave collapses to a definite state only when the particle is measured. The equations of quantum mechanics do not address how a particle’s properties solidify at the moment of measurement, or how, at such moments, reality picks which form to take. But the calculations work. As Seth Lloyd, a quantum physicist at MIT, put it, “Quantum mechanics is just counterintuitive and we just have to suck it up.”

A classic experiment in quantum mechanics that seems to demonstrate the probabilistic nature of reality involves a beam of particles (such as electrons) propelled one by one toward a pair of slits in a screen. When no one keeps track of each electron’s trajectory, it seems to pass through both slits simultaneously. In time, the electron beam creates a wavelike interference pattern of bright and dark stripes on the other side of the screen. But when a detector is placed in front of one of the slits, its measurement causes the particles to lose their wavelike omnipresence, collapse into definite states, and travel through one slit or the other. The interference pattern vanishes. The great 20th-century physicist Richard Feynman said that this double-slit experiment “has in it the heart of quantum mechanics,” and “is impossible, absolutely impossible, to explain in any classical way.”
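
(An editorial aside, not part of the quoted article: the “bright and dark stripes” follow a simple formula. For two slits separated by a distance d, waves of wavelength λ, and a screen a distance L away, the intensity at position x on the screen is, in the small-angle approximation and ignoring the single-slit envelope,

I(x) \propto \cos^{2}\!\left(\frac{\pi d x}{\lambda L}\right),

with bright fringes wherever the path difference dx/L equals a whole number of wavelengths. The competing interpretations discussed below disagree about what is waving, not about this pattern.)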

Some physicists now disagree. “Quantum mechanics is very successful; nobody’s claiming that it’s wrong,” said Paul Milewski, a professor of mathematics at the University of Bath in England who has devised computer models of bouncing-droplet dynamics. “What we believe is that there may be, in fact, some more fundamental reason why [quantum mechanics] looks the way it does.”

Riding Waves

The idea that pilot waves might explain the peculiarities of particles dates back to the early days of quantum mechanics. The French physicist Louis de Broglie presented the earliest version of pilot-wave theory at the 1927 Solvay Conference in Brussels, a famous gathering of the founders of the field. As de Broglie explained that day to Bohr, Albert Einstein, Erwin Schrödinger, Werner Heisenberg and two dozen other celebrated physicists, pilot-wave theory made all the same predictions as the probabilistic formulation of quantum mechanics (which wouldn’t be referred to as the “Copenhagen” interpretation until the 1950s), but without the ghostliness or mysterious collapse.

The probabilistic version, championed by Bohr, involves a single equation that represents likely and unlikely locations of particles as peaks and troughs of a wave. Bohr interpreted this probability-wave equation as a complete definition of the particle. But de Broglie urged his colleagues to use two equations: one describing a real, physical wave, and another tying the trajectory of an actual, concrete particle to the variables in that wave equation, as if the particle interacts with and is propelled by the wave rather than being defined by it.
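
(For the mathematically inclined, here is a sketch of what “two equations” means in the simplest case of a single, slow-moving particle; this is the standard modern statement of de Broglie-Bohm theory, not a quotation from the article. The wave evolves according to the usual Schrödinger equation,

i\hbar\,\frac{\partial\psi}{\partial t} = -\frac{\hbar^{2}}{2m}\nabla^{2}\psi + V\psi,

while the particle has a definite position Q(t) that is carried along by the wave via the guidance equation

\frac{dQ}{dt} = \frac{\hbar}{m}\,\operatorname{Im}\!\left(\frac{\nabla\psi}{\psi}\right)\Bigg|_{x = Q(t)}.

Given the wave and the particle’s initial position, the entire trajectory follows deterministically.)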

For example, consider the double-slit experiment. In de Broglie’s pilot-wave picture, each electron passes through just one of the two slits, but is influenced by a pilot wave that splits and travels through both slits. Like flotsam in a current, the particle is drawn to the places where the two wavefronts cooperate, and does not go where they cancel out.

De Broglie could not predict the exact place where an individual particle would end up — just like Bohr’s version of events, pilot-wave theory predicts only the statistical distribution of outcomes, or the bright and dark stripes — but the two men interpreted this shortcoming differently. Bohr claimed that particles don’t have definite trajectories; de Broglie argued that they do, but that we can’t measure each particle’s initial position well enough to deduce its exact path.

In principle, however, the pilot-wave theory is deterministic: The future evolves dynamically from the past, so that, if the exact state of all the particles in the universe were known at a given instant, their states at all future times could be calculated.

At the Solvay conference, Einstein objected to a probabilistic universe, quipping, “God does not play dice,” but he seemed ambivalent about de Broglie’s alternative. Bohr told Einstein to “stop telling God what to do,” and (for reasons that remain in dispute) he won the day. By 1932, when the Hungarian-American mathematician John von Neumann claimed to have proven that the probabilistic wave equation in quantum mechanics could have no “hidden variables” (that is, missing components, such as de Broglie’s particle with its well-defined trajectory), pilot-wave theory was so poorly regarded that most physicists believed von Neumann’s proof without even reading a translation.

More than 30 years would pass before von Neumann’s proof was shown to be false, but by then the damage was done. The physicist David Bohm resurrected pilot-wave theory in a modified form in 1952, with Einstein’s encouragement, and made clear that it did work, but it never caught on. (The theory is also known as de Broglie-Bohm theory, or Bohmian mechanics.)

Later, the Northern Irish physicist John Stewart Bell went on to prove a seminal theorem that many physicists today misinterpret as rendering hidden variables impossible. But Bell supported pilot-wave theory. He was the one who pointed out the flaws in von Neumann’s original proof. And in 1986 he wrote that pilot-wave theory “seems to me so natural and simple, to resolve the wave-particle dilemma in such a clear and ordinary way, that it is a great mystery to me that it was so generally ignored.”

The neglect continues. A century down the line, the standard, probabilistic formulation of quantum mechanics has been combined with Einstein’s theory of special relativity and developed into the Standard Model, an elaborate and precise description of most of the particles and forces in the universe. Acclimating to the weirdness of quantum mechanics has become a physicists’ rite of passage. The old, deterministic alternative is not mentioned in most textbooks; most people in the field haven’t heard of it. Sheldon Goldstein, a professor of mathematics, physics and philosophy at Rutgers University and a supporter of pilot-wave theory, blames the “preposterous” neglect of the theory on “decades of indoctrination.” At this stage, Goldstein and several others noted, researchers risk their careers by questioning quantum orthodoxy.

A Quantum Drop

Now at last, pilot-wave theory may be experiencing a minor comeback — at least, among fluid dynamicists. “I wish that the people who were developing quantum mechanics at the beginning of last century had access to these experiments,” Milewski said. “Because then the whole history of quantum mechanics might be different.”

The experiments began a decade ago, when Yves Couder and colleagues at Paris Diderot University discovered that vibrating a silicone oil bath up and down at a particular frequency can induce a droplet to bounce along the surface. The droplet’s path, they found, was guided by the slanted contours of the liquid’s surface generated from the droplet’s own bounces — a mutual particle-wave interaction analogous to de Broglie’s pilot-wave concept.

Read the entire article here.

Image: Louis de Broglie. Courtesy of Wikipedia.

Defying Enemy Number One

Enemy number one in this case is not your favorite team’s arch-rival or your political nemesis or your neighbor’s nocturnal barking dog. It is not sugar, nor is it trans-fat. Enemy number one is not North Korea (close), nor is it the latest group of murderous terrorists (closer).

The real enemy is gravity. Not the movie, that is, but the natural phenomenon.

Gravity is constricting: it anchors us to our measly home planet, making extra-terrestrial exploration rather difficult. Gravity is painful: it drags us down, it makes us fall — and when we’re down, it helps other things fall on top of us. Gravity is an enigma.

But help may not be too distant; enter The Gravity Research Foundation. While the foundation’s mission may no longer be to counteract gravity, it still aims to help us better understand it.

From the NYT:

Not long after the bombings of Hiroshima and Nagasaki, while the world was reckoning with the specter of nuclear energy, a businessman named Roger Babson was worrying about another of nature’s forces: gravity.

It had been 55 years since his sister Edith drowned in the Annisquam River, in Gloucester, Mass., when gravity, as Babson later described it, “came up and seized her like a dragon and brought her to the bottom.” Later on, the dragon took his grandson, too, as he tried to save a friend during a boating mishap.

Something had to be done.

“It seems as if there must be discovered some partial insulator of gravity which could be used to save millions of lives and prevent accidents,” Babson wrote in a manifesto, “Gravity — Our Enemy Number One.” In 1949, drawing on his considerable wealth, he started the Gravity Research Foundation and began awarding annual cash prizes for the best new ideas for furthering his cause.

It turned out to be a hopeless one. By the time the 2014 awards were announced last month, the foundation was no longer hoping to counteract gravity — it forms the very architecture of space-time — but to better understand it. What began as a crank endeavor has become mainstream. Over the years, winners of the prizes have included the likes of Stephen Hawking, Freeman Dyson, Roger Penrose and Martin Rees.

With his theory of general relativity, Einstein described gravity with an elegance that has not been surpassed. A mass like the sun makes the universe bend, causing smaller masses like planets to move toward it.

The problem is that nature’s other three forces are described in an entirely different way, by quantum mechanics. In this system forces are conveyed by particles. Photons, the most familiar example, are the carriers of light. For many scientists, the ultimate prize would be proof that gravity is carried by gravitons, allowing it to mesh neatly with the rest of the machine.

So far that has been as insurmountable as Babson’s old dream. After nearly a century of trying, the best physicists have come up with is superstring theory, a self-consistent but possibly hollow body of mathematics that depends on the existence of extra dimensions and implies that our universe is one of a multitude, each unknowable to the rest.

With all the accomplishments our species has achieved, we could be forgiven for concluding that we have reached a dead end. But human nature compels us to go on.

This year’s top gravity prize of $4,000 went to Lawrence Krauss and Frank Wilczek. Dr. Wilczek shared a Nobel Prize in 2004 for his part in developing the theory of the strong nuclear force, the one that holds quarks together and forms the cores of atoms.

So far gravitons have eluded science’s best detectors, like LIGO, the Laser Interferometer Gravitational-Wave Observatory. Mr. Dyson suggested at a recent talk that the search might be futile, requiring an instrument with mirrors so massive that they would collapse to form a black hole — gravity defeating its own understanding. But in their paper Dr. Krauss and Dr. Wilczek suggest how gravitons might leave their mark on cosmic background radiation, the afterglow of the Big Bang.


There are other mysteries to contend with. Despite the toll it took on Babson’s family, theorists remain puzzled over why gravity is so much weaker than electromagnetism. Hold a refrigerator magnet over a paper clip, and it will fly upward and away from Earth’s pull.
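
(To put a number on “so much weaker,” an aside from your editor rather than the Times: take two protons, with charge e and mass m_p. The ratio of their electrostatic repulsion to their gravitational attraction is

\frac{F_{\text{electric}}}{F_{\text{gravity}}} = \frac{e^{2}}{4\pi\varepsilon_{0}\,G\,m_{p}^{2}} \approx 1.2 \times 10^{36},

and the separation cancels out because both forces fall off as the inverse square of distance. A gap of roughly 36 orders of magnitude is why a palm-sized magnet can out-pull an entire planet.)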

Reaching for an explanation, the physicists Lisa Randall and Raman Sundrum once proposed that gravity is diluted because it leaks into a parallel universe. Striking off in a different direction, Dr. Randall and another colleague, Matthew Reece, recently speculated that the pull of a disk of dark matter might be responsible for jostling the solar system and unleashing periodic comet storms like one that might have killed off the dinosaurs.

It was a young theorist named Bryce DeWitt who helped disabuse Babson of his dream of stopping such a mighty force. In “The Perfect Theory,” a new book about general relativity, the Oxford astrophysicist Pedro G. Ferreira tells how DeWitt, in need of a down payment for a house, entered the Gravity Research Foundation’s competition in 1953 with a paper showing why the attempt to make any kind of antigravity device was “a waste of time.”

He won the prize, the foundation became more respectable, and DeWitt went on to become one of the most prominent theorists of general relativity. Babson, however, was not entirely deterred. In 1962, after more than 100 prominent Atlantans were killed in a plane crash in Paris, he donated $5,000 to Emory University along with a marble monument “to remind students of the blessings forthcoming” once gravity is counteracted.

He paid for similar antigravity monuments at more than a dozen campuses, including one at Tufts University, where newly minted doctoral students in cosmology kneel before it in a ceremony in which an apple is dropped on their heads.

I thought of Babson recently during a poignant scene in the movie “Gravity,” in which two astronauts are floating high above Earth, stranded from home. During a moment of calm, one of them, Lt. Matt Kowalski (played by George Clooney), asks the other, Dr. Ryan Stone (Sandra Bullock), “What do you miss down there?”

She tells him about her daughter:

“She was 4. She was at school playing tag, slipped and hit her head, and that was it. The stupidest thing.” It was gravity that did her in.

Read the entire article here.

Image: Portrait of Isaac Newton (1642–1727) by Sir Godfrey Kneller (1646–1723). Courtesy of Wikipedia.

Iran, Women, Clothes

A fascinating essay by Haleh Anvari, an Iranian writer and artist, provides an insightful view of the role fashion plays in shaping many of our perceptions — some right, many wrong — of women.

Quite rightly, she argues that the measures our cultures place on women, whether through the lens of Western fashion or Muslim tradition, are misleading. In both cases, there remains a fundamental and ongoing need to address women’s rights relative to those of men. Fashion stereotypes may be vastly different across continents, but the underlying issues remain very much the same whether a woman wears a hijab on the street or lingerie on a catwalk.

From the NYT:

I took a series of photographs of myself in 2007 that show me sitting on the toilet, weighing myself, and shaving my legs in the bath. I shot them as an angry response to an encounter with a gallery owner in London’s artsy Brick Lane. I had offered him photos of colorful chadors — an attempt to question the black chador as the icon of Iran by showing the world that Iranian women were more than this piece of black cloth. The gallery owner wasn’t impressed. “Do you have any photos of Iranian women in their private moments?” he asked.

As an Iranian with a reinforced sense of the private-public divide we navigate daily in our country, I found his curiosity offensive. So I shot my “Private Moments” in a sardonic spirit, to show that Iranian women are like all women around the world if you get past the visual hurdle of the hijab. But I never shared those, not just because I would never get a permit to show them publicly in Iran, but also because I am prepared to go only so far to prove a point. Call me old-fashioned.

Ever since the hijab, a generic term for every Islamic modesty covering, became mandatory after the 1979 revolution, Iranian women have been used to represent the country visually. For the new Islamic republic, the all-covering cloak called a chador became a badge of honor, a trademark of fundamental change. To Western visitors, it dropped a pin on their travel maps, where the bodies of Iranian women became a stand-in for the character of Iranian society. When I worked with foreign journalists for six years, I helped produce reports that were illustrated invariably with a woman in a black chador. I once asked a photojournalist why. He said, “How else can we show where we are?”

How wonderful. We had become Iran’s Eiffel Tower or Big Ben.

Next came the manteau-and-head scarf combo — less traditional, and more relaxed, but keeping the lens on the women. Serious reports about elections used a “hair poking out of scarf” standard as an exit poll, or images of scarf-clad women lounging in coffee shops, to register change. One London newspaper illustrated a report on the rise of gasoline prices with a woman in a head scarf, photographed in a gas station, holding a pump nozzle with gasoline suggestively dripping from its tip. A visitor from Mars or a senior editor from New York might have been forgiven for imagining Iran as a strange land devoid of men, where fundamentalist chador-clad harridans vie for space with heathen babes guzzling cappuccinos. (Incidentally, women hardly ever step out of the car to pump gas here; attendants do it for us.)

The disputed 2009 elections, followed by demonstrations and a violent backlash, brought a brief respite. The foreign press was ejected, leaving the reporting to citizen journalists not bound by the West’s conventions. They depicted a politically mature citizenry, male and female, demanding civic acknowledgment together.

We are now witnessing another shift in Iran’s image. It shows Iran “unveiled” — a tired euphemism now being used to literally undress Iranian women or show them off as clotheshorses. An Iranian fashion designer in Paris receives more plaudits in the Western media for his blog’s street snapshots of stylish, affluent young women in North Tehran than he gets for his own designs. In this very publication, a male Iranian photographer depicted Iranian women through flimsy fabrics under the title “Veiled Truths”; one is shown in a one-piece pink swimsuit so minimal it could pass for underwear; others are made more sensual behind sheer “veils,” reinforcing a sense of peeking at them. Search the Internet and you can get an eyeful of nubile limbs in opposition to the country’s official image, shot by Iranian photographers of both sexes, keen to show the hidden, supposedly true, other side of Iran.

Young Iranians rightly desire to show the world the unseen sides of their lives. But their need to show themselves as like their peers in the West takes them into dangerous territory. Professional photographers and artists, encouraged by Western curators and seeking fast-track careers, are creating a new wave of homegrown neo-Orientalism. A favorite reworking of an old cliché is the thin, beautiful young woman reclining while smoking a hookah, dancing, or otherwise at leisure in her private spaces. Ingres could sue for plagiarism.

In a country where the word feminism is pejorative, there is no inkling that the values of both fundamentalism and Western consumerism are two sides of the same coin — the female body as an icon defining Iranian culture.

It is true that we Iranians live dual lives, and so it is true that to see us in focus, you must enter our inner sanctum. But the inner sanctum includes women who believe in the hijab, fat women, old women and, most important, women in professions from doctor to shopkeeper. It also includes men, not all of whom are below 30 years of age. If you wish to see Iran as it is, you need go no further than Facebook and Instagram. Here, Iran is neither fully veiled nor longing to undress itself. Its complex variety is shown through the lens of its own people, in both private and public spaces.

Read the entire essay here.

Image: Young woman from Naplouse in a hijab, c. 1867–1885. Courtesy of Wikipedia.