Non-Spooky Action at a Distance

Albert Einstein famously called quantum entanglement “spooky action at a distance”. The phrase refers to the notion that measuring the state of one of two entangled particles makes the state of the second particle known instantaneously, regardless of the distance separating the two particles. Entanglement seems to link these particles and make them behave as one system. This peculiar characteristic has been a core element of the counterintuitive world of quantum theory. Yet while experiments have verified this spookiness, some theorists maintain that both theory and experiment are flawed, and that a different interpretation is required. However, one such competing theory — the many-worlds interpretation — makes equally spooky predictions.

From ars technica:

Quantum nonlocality, perhaps one of the most mysterious features of quantum mechanics, may not be a real phenomenon. Or at least that’s what a new paper in the journal PNAS asserts. Its author claims that nonlocality is nothing more than an artifact of the Copenhagen interpretation, the most widely accepted interpretation of quantum mechanics.

Nonlocality is a feature of quantum mechanics where particles are able to influence each other instantaneously regardless of the distance between them, an impossibility in classical physics. Counterintuitive as it may be, nonlocality is currently an accepted feature of the quantum world, apparently verified by many experiments. It’s achieved such wide acceptance that even if our understandings of quantum physics turn out to be completely wrong, physicists think some form of nonlocality would be a feature of whatever replaced it.

The term “nonlocality” comes from the fact that this “spooky action at a distance,” as Einstein famously called it, seems to put an end to our intuitive ideas about location. Nothing can travel faster than the speed of light, so if two quantum particles can influence each other faster than light could travel between the two, then on some level, they act as a single system—there must be no real distance between them.

The concept of location is a bit strange in quantum mechanics anyway. Each particle is described by a mathematical quantity known as the “wave function.” The wave function describes a probability distribution for the particle’s location, but not a definite location. These probable locations are not just scientists’ guesses at the particle’s whereabouts; they’re actual, physical presences. That is to say, the particles exist in a swarm of locations at the same time, with some locations more probable than others.
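To put that claim in symbols (standard textbook notation, not something spelled out in the quoted article), the wave function assigns probabilities to locations via the Born rule:

```latex
% Born rule: the probability of finding the particle between a and b
\[
  P(a \le x \le b) = \int_a^b \lvert \psi(x,t) \rvert^2 \, dx ,
  \qquad
  \int_{-\infty}^{\infty} \lvert \psi(x,t) \rvert^2 \, dx = 1
\]
```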

A measurement collapses the wave function so that the particle is no longer spread out over a variety of locations. It begins to act just like objects we’re familiar with—existing in one specific location.

The experiments that would measure nonlocality, however, usually involve two particles that are entangled, which means that both are described by a shared wave function. The wave function doesn’t just deal with the particle’s location, but with other aspects of its state as well, such as the direction of the particle’s spin. So if scientists can measure the spin of one of the two entangled particles, the shared wave function collapses and the spins of both particles become certain. This happens regardless of the distance between the particles.
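For concreteness, here is a minimal sketch of the kind of shared wave function such experiments involve: the standard two-particle spin-singlet state. The notation is textbook quantum mechanics, not a formula taken from the quoted article.

```latex
% Two entangled spin-1/2 particles, A and B, described by one shared wave function:
\[
  \lvert \psi \rangle
  = \frac{1}{\sqrt{2}} \Bigl( \lvert \uparrow \rangle_{A} \lvert \downarrow \rangle_{B}
  - \lvert \downarrow \rangle_{A} \lvert \uparrow \rangle_{B} \Bigr)
\]
% In the Copenhagen picture, measuring A's spin as "up" collapses the state onto the
% first term, so B's spin is fixed as "down" (and vice versa), however far apart
% A and B may be.
```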

The new paper calls all this into question.

The paper’s sole author, Frank Tipler, argues that the reason previous studies apparently confirmed quantum nonlocality is that they were relying on an oversimplified understanding of quantum physics in which the quantum world and the macroscopic world we’re familiar with are treated as distinct from one another. Even large structures obey the laws of quantum physics, Tipler points out, so the scientists making the measurements must be considered part of the system being studied.

It is intuitively easy to separate the quantum world from our everyday world, as they appear to behave so differently. However, the equations of quantum mechanics can be applied to large objects like human beings, and they essentially predict that you’ll behave just as classical physics—and as observation—says you will. (Physics students who have tried calculating their own wave functions can attest to this). The laws of quantum physics do govern the entire Universe, even if distinctly quantum effects are hard to notice at a macroscopic level.

When this is taken into account, according to Tipler, the results of familiar nonlocality experiments are altered. Typically, such experiments are thought to involve only two measurements: one on each of two entangled particles. But Tipler argues that in such experiments, there’s really a third measurement taking place when the scientists compare the results of the two.

This third measurement is crucial, Tipler argues, as without it, the first two measurements are essentially meaningless. Without comparing the first two, there’s no way to know that one particle’s behavior is actually linked to the other’s. And crucially, in order for the first two measurements to be compared, information must be exchanged between the particles, via the scientists, at a speed less than that of light. In other words, when the third measurement is taken into account, the two particles are not communicating faster than light. There is no “spooky action at a distance.”

Tipler has harsh criticism for the reasoning that led to nonlocality. “The standard argument that quantum phenomena are nonlocal goes like this,” he says in the paper. “(i) Let us add an unmotivated, inconsistent, unobservable, nonlocal process (collapse) to local quantum mechanics; (ii) note that the resulting theory is nonlocal; and (iii) conclude that quantum mechanics is [nonlocal].”

He’s essentially saying that scientists are arbitrarily adding nonlocality, which they can’t observe, and then claiming they have discovered nonlocality. Quite an accusation, especially for the science world. (The “collapse” he mentions is the collapse of the particle’s wave function, which he asserts is not a real phenomenon.) Instead, he claims that the experiments thought to confirm nonlocality are in fact confirming an alternative to the Copenhagen interpretation called the many-worlds interpretation (MWI). As its name implies, the MWI predicts the existence of other universes.

The Copenhagen interpretation has been summarized as “shut up and calculate.” Even though the consequences of a wave function-based world don’t make much intuitive sense, it works. The MWI tries to keep particles concrete at the cost of making our world a bit fuzzy. It posits that rather than becoming a wave function, particles remain distinct objects but enter one of a number of alternative universes, which recombine to a single one when the particle is measured.

Scientists who thought they were measuring nonlocality, Tipler claims, were in fact observing the effects of alternate universe versions of themselves, also measuring the same particles.

Part of the significance of Tipler’s claim is that he’s able to mathematically derive the same experimental results from the MWI without use of nonlocality. But this does not necessarily make for evidence that the MWI is correct; either interpretation remains consistent with the data. Until the two can be distinguished experimentally, it all comes down to whether you personally like or dislike nonlocality.

Read the entire article here.

We Are Back

Image: Old Kiln Trail, Boulder, Colorado, July 2014.

After a month-long respite, marred by sporadic writing, theDiagonal is finally back. Your friendly editor has relocated to Boulder, CO, where the air is fresh, the streams are cold, and natural beauty is all-enveloping. Writing continues apace.

Isolation Fractures the Mind

Through the lens of extreme isolation, Michael Bond shows us in this fascinating article that we really are social animals. Remove a person from all meaningful social contact — even for a short while — and her mind will begin to play tricks and eventually break. Bond is the author of The Power of Others.

From the BBC:

When people are isolated from human contact, their mind can do some truly bizarre things, says Michael Bond. Why does this happen?

Sarah Shourd’s mind began to slip about two months into her incarceration. She heard phantom footsteps, saw flashing lights, and spent most of her day crouched on all fours, listening through a gap in the door.

That summer, the 32-year-old had been hiking with two friends in the mountains of Iraqi Kurdistan when they were arrested by Iranian troops after straying onto the border with Iran. Accused of spying, they were kept in solitary confinement in Evin prison in Tehran, each in their own tiny cell. She endured almost 10,000 hours with little human contact before she was freed. One of the most disturbing effects was the hallucinations.

“In the periphery of my vision, I began to see flashing lights, only to jerk my head around to find that nothing was there,” she wrote in the New York Times in 2011. “At one point, I heard someone screaming, and it wasn’t until I felt the hands of one of the friendlier guards on my face, trying to revive me, that I realised the screams were my own.”

We all want to be alone from time to time, to escape the demands of our colleagues or the hassle of crowds. But not alone alone. For most people, prolonged social isolation is all bad, particularly mentally. We know this not only from reports by people like Shourd who have experienced it first-hand, but also from psychological experiments on the effects of isolation and sensory deprivation, some of which had to be called off due to the extreme and bizarre reactions of those involved. Why does the mind unravel so spectacularly when we’re truly on our own, and is there any way to stop it?

We’ve known for a while that isolation is physically bad for us. Chronically lonely people have higher blood pressure, are more vulnerable to infection, and are also more likely to develop Alzheimer’s disease and dementia. Loneliness also interferes with a whole range of everyday functioning, such as sleep patterns, attention and logical and verbal reasoning. The mechanisms behind these effects are still unclear, though what is known is that social isolation unleashes an extreme immune response – a cascade of stress hormones and inflammation. This may have been appropriate in our early ancestors, when being isolated from the group carried big physical risks, but for us the outcome is mostly harmful.

Yet some of the most profound effects of loneliness are on the mind. For starters, isolation messes with our sense of time. One of the strangest effects is the ‘time-shifting’ reported by those who have spent long periods living underground without daylight. In 1961, French geologist Michel Siffre led a two-week expedition to study an underground glacier beneath the French Alps and ended up staying two months, fascinated by how the darkness affected human biology. He decided to abandon his watch and “live like an animal”. While conducting tests with his team on the surface, they discovered it took him five minutes to count to what he thought was 120 seconds.

A similar pattern of ‘slowing time’ was reported by Maurizio Montalbini, a sociologist and caving enthusiast. In 1993, Montalbini spent 366 days in an underground cavern near Pesaro in Italy that had been designed with Nasa to simulate space missions, breaking his own world record for time spent underground. When he emerged, he was convinced only 219 days had passed. His sleep-wake cycles had almost doubled in length. Since then, researchers have found that in darkness most people eventually adjust to a 48-hour cycle: 36 hours of activity followed by 12 hours of sleep. The reasons are still unclear.

As well as their time-shifts, Siffre and Montalbini reported periods of mental instability too. But these experiences were nothing compared with the extreme reactions seen in notorious sensory deprivation experiments in the mid-20th Century.

In the 1950s and 1960s, China was rumoured to be using solitary confinement to “brainwash” American prisoners captured during the Korean War, and the US and Canadian governments were all too keen to try it out. Their defence departments funded a series of research programmes that might be considered ethically dubious today.

The most extensive took place at McGill University Medical Center in Montreal, led by the psychologist Donald Hebb. The McGill researchers invited paid volunteers – mainly college students – to spend days or weeks by themselves in sound-proof cubicles, deprived of meaningful human contact. Their aim was to reduce perceptual stimulation to a minimum, to see how their subjects would behave when almost nothing was happening. They minimised what they could feel, see, hear and touch, fitting them with translucent visors, cotton gloves and cardboard cuffs extending beyond the fingertips. As Scientific American magazine reported at the time, they had them lie on U-shaped foam pillows to restrict noise, and set up a continuous hum of air-conditioning units to mask small sounds.

After only a few hours, the students became acutely restless. They started to crave stimulation, talking, singing or reciting poetry to themselves to break the monotony. Later, many of them became anxious or highly emotional. Their mental performance suffered too; they struggled with arithmetic and word-association tests.

But the most alarming effects were the hallucinations. They would start with points of light, lines or shapes, eventually evolving into bizarre scenes, such as squirrels marching with sacks over their shoulders or processions of eyeglasses filing down a street. They had no control over what they saw: one man saw only dogs; another, babies.

Some of them experienced sound hallucinations as well: a music box or a choir, for instance. Others imagined sensations of touch: one man had the sense he had been hit in the arm by pellets fired from guns. Another, reaching out to touch a doorknob, felt an electric shock.

When they emerged from the experiment they found it hard to shake this altered sense of reality, convinced that the whole room was in motion, or that objects were constantly changing shape and size.

Read the entire article here.

The Art of Annoyance

Our favorite voyeurs and provocateurs of contemporary British culture are at it again. Artists Gilbert & George have resurfaced with a new and thoroughly annoying collection — Scapegoating Pictures. You can catch their latest treatise on the state of their city (London) and nation at White Cube in London from July 18 – September 28.

From the Guardian:

The world of art is overwhelmingly liberal and forward looking. Unless you start following the money into Charles Saatchi’s bank account, the mood, content and operating assumptions of contemporary art are strikingly leftwing, from Bob and Roberta Smith’s cute posters to Jeremy Deller’s people’s art. The consensus is so progressive it does not need saying.

Gilbert & George have never signed up to that consensus. I am not saying they are rightwing. I am definitely not saying they are “racist”. But throughout their long careers, from a nostalgia for Edwardian music-hall songs to a more unsettling affinity for skinheads, they have delighted in provoking … us, dear Guardian reader.

Their new exhibition of grand, relentless photomontages restates their defiant desire to offend on a colossal scale. I could almost hear them at my shoulder asking: “Are you annoyed yet?”

Then suddenly they were at my shoulder, as I wrote down choice quotes from Scapegoating Pictures, the scabrous triptych of slogan-spattered pictures that climaxes this exhibition. When I confessed I was wondering which ones I could quote in a newspaper they insisted it’s all quotable: “We have a free press.” So here goes: “Fuck the Vicar.” “Get Frotting.” “Be candid with christians.” “Jerk off a judge.” “Crucify a curator.” “Molest a mullah.”

This wall of insults, mostly directed at religion, is the manifesto of Gilbert & George’s new pictures – and yet you discover it only at the end of the show. Before revealing where they are really coming from in this dirty-mouthed atheist onslaught, they have teased you with all kinds of dubious paranoias. What are these old men – Gilbert & George are 70 and 72, and the self-portraits that warp and gyrate through this kaleidoscopic digital-age profusion of images make no attempt to conceal their ageing process – so scared of?

At times this exhibition is like going on a tour of east London with one of Ukip’s less presentable candidates. Just look at that woman veiling her face. And here is a poster calling for an Islamic state in Britain.

Far from being scared, these artists are bold as brass. No one is asking Gilbert & George to go over the top one more time and plumb the psychic depths of Britain. They’re respectable now; they could just sit back in their suits. But, in these turbulent and estranging works, they give voice to the divided reality of a country at one and the same time gloriously plural and savagely bigoted.

In reality, nothing could be further from the mentality of racists and little Englanders than the polymorphically playful world of Gilbert & George. Their images merge with the faces of young men of all races who have caught their eye. Bullet-like metal canisters pulse through the pictures like threats of violence. Yet these menacing forms are actually empty containers for the drug nitrous oxide found by the artists outside their home, things that look evil but are residues of ecstatic nights.

No other artists today portray their own time and place with the curiosity that Gilbert & George display here. Their own lives are starkly visible, as they walk around their local streets in Spitalfields, collecting the evidence of drug-fuelled mayhem and looking at the latest graffiti.

Read the entire story and see more of G & G’s works here.

Image: Clad, Gilbert & George, 2013. Courtesy of Gilbert & George / Guardian.

You Are a Neural Computation

Since the days of Aristotle, and later Descartes, thinkers have sought to explain consciousness and free will. More than two thousand years on, we are still pondering these notions; science has made great strides, and yet fundamentally we still have little idea.

Many neuroscientists, now armed with new and very precise research tools, are aiming to change this. Yet, increasingly, it seems that free will may indeed be a cognitive illusion. Evidence suggests that our subconscious decides and initiates action for us long before we are aware of making a conscious decision. There seems to be no god or ghost in the machine.

From Technology Review:

It was an expedition seeking something never caught before: a single human neuron lighting up to create an urge, albeit for the minor task of moving an index finger, before the subject was even aware of feeling anything. Four years ago, Itzhak Fried, a neurosurgeon at the University of California, Los Angeles, slipped several probes, each with eight hairlike electrodes able to record from single neurons, into the brains of epilepsy patients. (The patients were undergoing surgery to diagnose the source of severe seizures and had agreed to participate in experiments during the process.) Probes in place, the patients—who were conscious—were given instructions to press a button at any time of their choosing, but also to report when they’d first felt the urge to do so.

Later, Gabriel Kreiman, a neuroscientist at Harvard Medical School and Children’s Hospital in Boston, captured the quarry. Poring over data after surgeries in 12 patients, he found telltale flashes of individual neurons in the pre-supplementary motor area (associated with movement) and the anterior cingulate (associated with motivation and attention), preceding the reported urges by anywhere from hundreds of milliseconds to several seconds. It was a direct neural measurement of the unconscious brain at work—caught in the act of formulating a volitional, or freely willed, decision. Now Kreiman and his colleagues are planning to repeat the feat, but this time they aim to detect pre-urge signatures in real time and stop the subject from performing the action—or see if that’s even possible.

A variety of imaging studies in humans have revealed that brain activity related to decision-making tends to precede conscious action. Implants in macaques and other animals have examined brain circuits involved in perception and action. But Kreiman broke ground by directly measuring a preconscious decision in humans at the level of single neurons. To be sure, the readouts came from an average of just 20 neurons in each patient. (The human brain has about 86 billion of them, each with thousands of connections.) And ultimately, those neurons fired only in response to a chain of even earlier events. But as more such experiments peer deeper into the labyrinth of neural activity behind decisions—whether they involve moving a finger or opting to buy, eat, or kill something—science could eventually tease out the full circuitry of decision-making and perhaps point to behavioral therapies or treatments. “We need to understand the neuronal basis of voluntary decision-making—or ‘freely willed’ decision-making—and its pathological counterparts if we want to help people such as drug, sex, food, and gambling addicts, or patients with obsessive-compulsive disorder,” says Christof Koch, chief scientist at the Allen Institute for Brain Science in Seattle (see “Cracking the Brain’s Codes”). “Many of these people perfectly well know that what they are doing is dysfunctional but feel powerless to prevent themselves from engaging in these behaviors.”

Kreiman, 42, believes his work challenges important Western philosophical ideas about free will. The Argentine-born neuroscientist, an associate professor at Harvard Medical School, specializes in visual object recognition and memory formation, which draw partly on unconscious processes. He has a thick mop of black hair and a tendency to pause and think a long moment before reframing a question and replying to it expansively. At the wheel of his Jeep as we drove down Broadway in Cambridge, Massachusetts, Kreiman leaned over to adjust the MP3 player—toggling between Vivaldi, Lady Gaga, and Bach. As he did so, his left hand, the one on the steering wheel, slipped to let the Jeep drift a bit over the double yellow lines. Kreiman’s view is that his neurons made him do it, and they also made him correct his small error an instant later; in short, all actions are the result of neural computations and nothing more. “I am interested in a basic age-old question,” he says. “Are decisions really free? I have a somewhat extreme view of this—that there is nothing really free about free will. Ultimately, there are neurons that obey the laws of physics and mathematics. It’s fine if you say ‘I decided’—that’s the language we use. But there is no god in the machine—only neurons that are firing.”

Our philosophical ideas about free will date back to Aristotle and were systematized by René Descartes, who argued that humans possess a God-given “mind,” separate from our material bodies, that endows us with the capacity to freely choose one thing rather than another. Kreiman takes this as his departure point. But he’s not arguing that we lack any control over ourselves. He doesn’t say that our decisions aren’t influenced by evolution, experiences, societal norms, sensations, and perceived consequences. “All of these external influences are fundamental to the way we decide what we do,” he says. “We do have experiences, we do learn, we can change our behavior.”

But the firing of a neuron that guides us one way or another is ultimately like the toss of a coin, Kreiman insists. “The rules that govern our decisions are similar to the rules that govern whether a coin will land one way or the other. Ultimately there is physics; it is chaotic in both cases, but at the end of the day, nobody will argue the coin ‘wanted’ to land heads or tails. There is no real volition to the coin.”

Testing Free Will

It’s only in the past three to four decades that imaging tools and probes have been able to measure what actually happens in the brain. A key research milestone was reached in the early 1980s when Benjamin Libet, a researcher in the physiology department at the University of California, San Francisco, made a remarkable study that tested the idea of conscious free will with actual data.

Libet fitted subjects with EEGs—gadgets that measure aggregate electrical brain activity through the scalp—and had them look at a clock dial that spun around every 2.8 seconds. The subjects were asked to press a button whenever they chose to do so—but told they should also take note of where the time hand was when they first felt the “wish or urge.” It turns out that the actual brain activity involved in the action began 300 milliseconds, on average, before the subject was conscious of wanting to press the button. While some scientists criticized the methods—questioning, among other things, the accuracy of the subjects’ self-reporting—the study set others thinking about how to investigate the same questions. Since then, functional magnetic resonance imaging (fMRI) has been used to map brain activity by measuring blood flow, and other studies have also measured brain activity processes that take place before decisions are made. But while fMRI transformed brain science, it was still only an indirect tool, providing very low spatial resolution and averaging data from millions of neurons. Kreiman’s own study design was the same as Libet’s, with the important addition of the direct single-neuron measurement.

When Libet was in his prime, Kreiman was a boy. As a student of physical chemistry at the University of Buenos Aires, he was interested in neurons and brains. When he went for his PhD at Caltech, his passion solidified under his advisor, Koch. Koch was deep in collaboration with Francis Crick, co-discoverer of DNA’s structure, to look for evidence of how consciousness was represented by neurons. For the star-struck kid from Argentina, “it was really life-changing,” he recalls. “Several decades ago, people said this was not a question serious scientists should be thinking about; they either had to be smoking something or have a Nobel Prize”—and Crick, of course, was a Nobelist. Crick hypothesized that studying how the brain processed visual information was one way to study consciousness (we tap unconscious processes to quickly decipher scenes and objects), and he collaborated with Koch on a number of important studies. Kreiman was inspired by the work. “I was very excited about the possibility of asking what seems to be the most fundamental aspect of cognition, consciousness, and free will in a reductionist way—in terms of neurons and circuits of neurons,” he says.

One thing was in short supply: humans willing to have scientists cut open their skulls and poke at their brains. One day in the late 1990s, Kreiman attended a journal club—a kind of book club for scientists reviewing the latest literature—and came across a paper by Fried on how to do brain science in people getting electrodes implanted in their brains to identify the source of severe epileptic seizures. Before he’d heard of Fried, “I thought examining the activity of neurons was the domain of monkeys and rats and cats, not humans,” Kreiman says. Crick introduced Koch to Fried, and soon Koch, Fried, and Kreiman were collaborating on studies that investigated human neural activity, including the experiment that made the direct neural measurement of the urge to move a finger. “This was the opening shot in a new phase of the investigation of questions of voluntary action and free will,” Koch says.

Read the entire article here.

Go Forth And Declutter

Having only recently relocated to Colorado’s wondrous Front Range of the Rocky Mountains, your friendly editor now finds himself surrounded by figurative, less-inspiring mountains: moving boxes, bins, bags, more boxes. It’s floor-to-ceiling clutter as far as the eye can see.

Some of these boxes contain essentials, yet probably around 80 percent hold stuff. Yes, just stuff — aging items that hold some kind of sentimental meaning or future promise: old CDs, baby clothes, used ticket stubs, toys from an attic three moves ago, too many socks, ill-fitting clothing, 13 allen wrenches and screwdrivers, first-grade school projects, photo negatives, fading National Geographic magazines, gummed-up fountain pens, European postcards…

So, here’s a very timely story on the psychology of clutter and hoarding.

From the WSJ:

Jennifer James and her husband don’t have a lot of clutter—but they do find it hard to part with their children’s things. The guest cottage behind their home in Oklahoma City is half-filled with old toys, outgrown clothing, artwork, school papers, two baby beds, a bassinet and a rocking horse.

“Every time I think about getting rid of it, I want to cry,” says Ms. James, a 46-year-old public-relations consultant. She fears her children, ages 6, 8 and 16, will grow up and think she didn’t love them if she doesn’t save it all. “In keeping all this stuff, I think someday I’ll be able to say to my children, ‘See—I treasured your innocence. I treasured you!’ “

Many powerful emotions are lurking amid stuff we keep. Whether it’s piles of unread newspapers, clothes that don’t fit, outdated electronics, even empty margarine tubs, the things we accumulate reflect some of our deepest thoughts and feelings.

Now there’s growing recognition among professional organizers that to come to grips with their clutter, clients need to understand why they save what they save, or things will inevitably pile up again. In some cases, therapists are working along with organizers to help clients confront their psychological demons.

“The work we do with clients goes so much beyond making their closets look pretty,” says Collette Shine, president of the New York chapter of the National Association of Professional Organizers. “It involves getting into their hearts and their heads.”

For some people—especially those with big basements—hanging onto old and unused things doesn’t present a problem. But many others say they’re drowning in clutter.

“I have clients who say they are distressed at all the clutter they have, and distressed at the thought of getting rid of things,” says Simon Rego, director of psychology training at Montefiore Medical Center in Bronx, N.Y., who makes house calls, in extreme cases, to help hoarders.

In some cases, chronic disorganization can be a symptom of Attention Deficit Hyperactivity Disorder, Obsessive-Compulsive Disorder and dementia—all of which involve difficulty with planning, focusing and making decisions.

The extreme form, hoarding, is now a distinct psychiatric disorder, defined in the new Diagnostic and Statistical Manual-5 as “persistent difficulty discarding possessions, regardless of their value” such that living areas cannot be used. Despite all the media attention, only 2% to 5% of people fit the criteria—although many more joke, or fear, they are headed that way.

Difficulty letting go of your stuff can also go hand in hand with separation anxiety, compulsive shopping, perfectionism, procrastination and body-image issues. And the reluctance to cope can create a vicious cycle of avoidance, anxiety and guilt.

In most cases, however, psychologists say that clutter can be traced to what they call cognitive errors—flawed thinking that drives dysfunctional behaviors that can get out of hand.

Among the most common clutter-generating bits of logic: “I might need these someday.” “These might be valuable.” “These might fit again if I lose (or gain) weight.”

“We all have these dysfunctional thoughts. It’s perfectly normal,” Dr. Rego says. The trick, he says, is to recognize the irrational thought that makes you cling to an item and substitute one that helps you let go, such as, “Somebody else could use this, so I’ll give it away.”

He concedes he has saved “maybe 600” disposable Allen wrenches that came with IKEA furniture over the years.

The biggest sources of clutter and the hardest to discard are things that hold sentimental meaning. Dr. Rego says it’s natural to want to hang onto objects that trigger memories, but some people confuse letting go of the object with letting go of the person.

Linda Samuels, president of the Institute for Challenging Disorganization, an education and research group, says there’s no reason to get rid of things just for the sake of doing it.

“Figure out what’s important to you and create an environment that supports that,” she says.

Robert McCollum, a state tax auditor and Ms. James’s husband, says he treasures items like the broken fairy wand one daughter carried around for months.

“I don’t want to lose my memories, and I don’t need a professional organizer,” he says. “I’ve already organized it all in bins.” The only problem would be if they ever move to a place that doesn’t have 1,000 square feet of storage, he adds.

Sometimes the memories people cling to are images of themselves in different roles or happier times. “Our closets are windows into our internal selves,” says Jennifer Baumgartner, a Baltimore psychologist and author of “You Are What You Wear.”

“Say you’re holding on to your team uniforms from college,” she says. “Ask yourself, what about that experience did you like? What can you do in your life now to recapture that?”

Somebody-might-need-this thinking is often what drives people to save stacks of newspapers, magazines, outdated electronic equipment, decades of financial records and craft supplies. With a little imagination, anything could be fodder for scrapbooks or Halloween costumes.

For people afraid to toss things they might want in the future, Dr. Baumgartner says it helps to have a worst-case scenario plan. “What if you do need that tutu you’ve given away for a Halloween costume? What would you do? You can find almost anything on eBay.”

Read the entire story here.

Image courtesy of Google search.

Questioning Quantum Orthodoxy

Physics works very well in explaining our world, yet it is also broken — it cannot, at the moment, reconcile our views of the very small (quantum theory) with those of the very large (relativity theory).

So although the probabilistic underpinnings of quantum theory have done wonders in allowing physicists to construct the Standard Model, gaps remain.

Back in the mid-1920s, the probabilistic worldview proposed by Niels Bohr and others gained favor and took hold. A competing theory known as pilot-wave theory, proposed by a young Louis de Broglie, was given short shrift. Yet some theorists have maintained that it may do a better job of reconciling this core gap in our understanding — so it is time to revisit and breathe fresh life into pilot-wave theory.

From Wired / Quanta:

For nearly a century, “reality” has been a murky concept. The laws of quantum physics seem to suggest that particles spend much of their time in a ghostly state, lacking even basic properties such as a definite location and instead existing everywhere and nowhere at once. Only when a particle is measured does it suddenly materialize, appearing to pick its position as if by a roll of the dice.

This idea that nature is inherently probabilistic — that particles have no hard properties, only likelihoods, until they are observed — is directly implied by the standard equations of quantum mechanics. But now a set of surprising experiments with fluids has revived old skepticism about that worldview. The bizarre results are fueling interest in an almost forgotten version of quantum mechanics, one that never gave up the idea of a single, concrete reality.

The experiments involve an oil droplet that bounces along the surface of a liquid. The droplet gently sloshes the liquid with every bounce. At the same time, ripples from past bounces affect its course. The droplet’s interaction with its own ripples, which form what’s known as a pilot wave, causes it to exhibit behaviors previously thought to be peculiar to elementary particles — including behaviors seen as evidence that these particles are spread through space like waves, without any specific location, until they are measured.

Particles at the quantum scale seem to do things that human-scale objects do not do. They can tunnel through barriers, spontaneously arise or annihilate, and occupy discrete energy levels. This new body of research reveals that oil droplets, when guided by pilot waves, also exhibit these quantum-like features.

To some researchers, the experiments suggest that quantum objects are as definite as droplets, and that they too are guided by pilot waves — in this case, fluid-like undulations in space and time. These arguments have injected new life into a deterministic (as opposed to probabilistic) theory of the microscopic world first proposed, and rejected, at the birth of quantum mechanics.

“This is a classical system that exhibits behavior that people previously thought was exclusive to the quantum realm, and we can say why,” said John Bush, a professor of applied mathematics at the Massachusetts Institute of Technology who has led several recent bouncing-droplet experiments. “The more things we understand and can provide a physical rationale for, the more difficult it will be to defend the ‘quantum mechanics is magic’ perspective.”

Magical Measurements

The orthodox view of quantum mechanics, known as the “Copenhagen interpretation” after the home city of Danish physicist Niels Bohr, one of its architects, holds that particles play out all possible realities simultaneously. Each particle is represented by a “probability wave” weighting these various possibilities, and the wave collapses to a definite state only when the particle is measured. The equations of quantum mechanics do not address how a particle’s properties solidify at the moment of measurement, or how, at such moments, reality picks which form to take. But the calculations work. As Seth Lloyd, a quantum physicist at MIT, put it, “Quantum mechanics is just counterintuitive and we just have to suck it up.”

A classic experiment in quantum mechanics that seems to demonstrate the probabilistic nature of reality involves a beam of particles (such as electrons) propelled one by one toward a pair of slits in a screen. When no one keeps track of each electron’s trajectory, it seems to pass through both slits simultaneously. In time, the electron beam creates a wavelike interference pattern of bright and dark stripes on the other side of the screen. But when a detector is placed in front of one of the slits, its measurement causes the particles to lose their wavelike omnipresence, collapse into definite states, and travel through one slit or the other. The interference pattern vanishes. The great 20th-century physicist Richard Feynman said that this double-slit experiment “has in it the heart of quantum mechanics,” and “is impossible, absolutely impossible, to explain in any classical way.”

Some physicists now disagree. “Quantum mechanics is very successful; nobody’s claiming that it’s wrong,” said Paul Milewski, a professor of mathematics at the University of Bath in England who has devised computer models of bouncing-droplet dynamics. “What we believe is that there may be, in fact, some more fundamental reason why [quantum mechanics] looks the way it does.”

Riding Waves

The idea that pilot waves might explain the peculiarities of particles dates back to the early days of quantum mechanics. The French physicist Louis de Broglie presented the earliest version of pilot-wave theory at the 1927 Solvay Conference in Brussels, a famous gathering of the founders of the field. As de Broglie explained that day to Bohr, Albert Einstein, Erwin Schrödinger, Werner Heisenberg and two dozen other celebrated physicists, pilot-wave theory made all the same predictions as the probabilistic formulation of quantum mechanics (which wouldn’t be referred to as the “Copenhagen” interpretation until the 1950s), but without the ghostliness or mysterious collapse.

The probabilistic version, championed by Bohr, involves a single equation that represents likely and unlikely locations of particles as peaks and troughs of a wave. Bohr interpreted this probability-wave equation as a complete definition of the particle. But de Broglie urged his colleagues to use two equations: one describing a real, physical wave, and another tying the trajectory of an actual, concrete particle to the variables in that wave equation, as if the particle interacts with and is propelled by the wave rather than being defined by it.
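For readers who want the two equations spelled out, this is the standard modern single-particle form of de Broglie’s proposal (the de Broglie-Bohm formulation); it is textbook material rather than something derived in the quoted article.

```latex
% 1) The pilot wave psi evolves according to the ordinary Schrodinger equation:
\[
  i\hbar \, \frac{\partial \psi}{\partial t}
  = -\frac{\hbar^{2}}{2m} \nabla^{2} \psi + V(\mathbf{x}) \, \psi
\]
% 2) The guidance equation: a real particle follows a definite trajectory x(t)
%    set by the local structure of the wave.
\[
  \frac{d\mathbf{x}}{dt}
  = \frac{\hbar}{m} \, \operatorname{Im}\!\left( \frac{\nabla \psi}{\psi} \right)
    \Bigg|_{\mathbf{x} = \mathbf{x}(t)}
\]
```

The wave obeys the first equation whether or not anyone looks; the second simply tells the particle where to go, which is why no collapse is needed in this picture.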

For example, consider the double-slit experiment. In de Broglie’s pilot-wave picture, each electron passes through just one of the two slits, but is influenced by a pilot wave that splits and travels through both slits. Like flotsam in a current, the particle is drawn to the places where the two wavefronts cooperate, and does not go where they cancel out.

De Broglie could not predict the exact place where an individual particle would end up — just like Bohr’s version of events, pilot-wave theory predicts only the statistical distribution of outcomes, or the bright and dark stripes — but the two men interpreted this shortcoming differently. Bohr claimed that particles don’t have definite trajectories; de Broglie argued that they do, but that we can’t measure each particle’s initial position well enough to deduce its exact path.

In principle, however, the pilot-wave theory is deterministic: The future evolves dynamically from the past, so that, if the exact state of all the particles in the universe were known at a given instant, their states at all future times could be calculated.

At the Solvay conference, Einstein objected to a probabilistic universe, quipping, “God does not play dice,” but he seemed ambivalent about de Broglie’s alternative. Bohr told Einstein to “stop telling God what to do,” and (for reasons that remain in dispute) he won the day. By 1932, when the Hungarian-American mathematician John von Neumann claimed to have proven that the probabilistic wave equation in quantum mechanics could have no “hidden variables” (that is, missing components, such as de Broglie’s particle with its well-defined trajectory), pilot-wave theory was so poorly regarded that most physicists believed von Neumann’s proof without even reading a translation.

More than 30 years would pass before von Neumann’s proof was shown to be false, but by then the damage was done. The physicist David Bohm resurrected pilot-wave theory in a modified form in 1952, with Einstein’s encouragement, and made clear that it did work, but it never caught on. (The theory is also known as de Broglie-Bohm theory, or Bohmian mechanics.)

Later, the Northern Irish physicist John Stewart Bell went on to prove a seminal theorem that many physicists today misinterpret as rendering hidden variables impossible. But Bell supported pilot-wave theory. He was the one who pointed out the flaws in von Neumann’s original proof. And in 1986 he wrote that pilot-wave theory “seems to me so natural and simple, to resolve the wave-particle dilemma in such a clear and ordinary way, that it is a great mystery to me that it was so generally ignored.”

The neglect continues. A century down the line, the standard, probabilistic formulation of quantum mechanics has been combined with Einstein’s theory of special relativity and developed into the Standard Model, an elaborate and precise description of most of the particles and forces in the universe. Acclimating to the weirdness of quantum mechanics has become a physicists’ rite of passage. The old, deterministic alternative is not mentioned in most textbooks; most people in the field haven’t heard of it. Sheldon Goldstein, a professor of mathematics, physics and philosophy at Rutgers University and a supporter of pilot-wave theory, blames the “preposterous” neglect of the theory on “decades of indoctrination.” At this stage, Goldstein and several others noted, researchers risk their careers by questioning quantum orthodoxy.

A Quantum Drop

Now at last, pilot-wave theory may be experiencing a minor comeback — at least, among fluid dynamicists. “I wish that the people who were developing quantum mechanics at the beginning of last century had access to these experiments,” Milewski said. “Because then the whole history of quantum mechanics might be different.”

The experiments began a decade ago, when Yves Couder and colleagues at Paris Diderot University discovered that vibrating a silicone oil bath up and down at a particular frequency can induce a droplet to bounce along the surface. The droplet’s path, they found, was guided by the slanted contours of the liquid’s surface generated from the droplet’s own bounces — a mutual particle-wave interaction analogous to de Broglie’s pilot-wave concept.

Read the entire article here.

Image: Louis de Broglie. Courtesy of Wikipedia.

Defying Enemy Number One

Enemy number one in this case is not your favorite team’s arch-rival or your political nemesis or your neighbor’s nocturnal barking dog. It is not sugar, nor is it trans-fat. Enemy number one is not North Korea (close), nor is it the latest group of murderous terrorists (closer).

The real enemy is gravity. Not the movie, that is, but the natural phenomenon.

Gravity is constricting: it anchors us to our measly home planet, making extra-terrestrial exploration rather difficult. Gravity is painful: it drags us down, it makes us fall — and when we’re down, it helps other things fall on top of us. Gravity is an enigma.

But help may not be too distant; enter the Gravity Research Foundation. While the foundation’s mission may no longer be to counteract gravity, it still aims to help us understand it better.

From the NYT:

Not long after the bombings of Hiroshima and Nagasaki, while the world was reckoning with the specter of nuclear energy, a businessman named Roger Babson was worrying about another of nature’s forces: gravity.

It had been 55 years since his sister Edith drowned in the Annisquam River, in Gloucester, Mass., when gravity, as Babson later described it, “came up and seized her like a dragon and brought her to the bottom.” Later on, the dragon took his grandson, too, as he tried to save a friend during a boating mishap.

Something had to be done.

“It seems as if there must be discovered some partial insulator of gravity which could be used to save millions of lives and prevent accidents,” Babson wrote in a manifesto, “Gravity — Our Enemy Number One.” In 1949, drawing on his considerable wealth, he started the Gravity Research Foundation and began awarding annual cash prizes for the best new ideas for furthering his cause.

It turned out to be a hopeless one. By the time the 2014 awards were announced last month, the foundation was no longer hoping to counteract gravity — it forms the very architecture of space-time — but to better understand it. What began as a crank endeavor has become mainstream. Over the years, winners of the prizes have included the likes of Stephen Hawking, Freeman Dyson, Roger Penrose and Martin Rees.

With his theory of general relativity, Einstein described gravity with an elegance that has not been surpassed. A mass like the sun makes the universe bend, causing smaller masses like planets to move toward it.

The problem is that nature’s other three forces are described in an entirely different way, by quantum mechanics. In this system forces are conveyed by particles. Photons, the most familiar example, are the carriers of light. For many scientists, the ultimate prize would be proof that gravity is carried by gravitons, allowing it to mesh neatly with the rest of the machine.

So far that has been as insurmountable as Babson’s old dream. After nearly a century of trying, the best physicists have come up with is superstring theory, a self-consistent but possibly hollow body of mathematics that depends on the existence of extra dimensions and implies that our universe is one of a multitude, each unknowable to the rest.

With all the accomplishments our species has achieved, we could be forgiven for concluding that we have reached a dead end. But human nature compels us to go on.

This year’s top gravity prize of $4,000 went to Lawrence Krauss and Frank Wilczek. Dr. Wilczek shared a Nobel Prize in 2004 for his part in developing the theory of the strong nuclear force, the one that holds quarks together and forms the cores of atoms.

So far gravitons have eluded science’s best detectors, like LIGO, the Laser Interferometer Gravitational-Wave Observatory. Mr. Dyson suggested at a recent talk that the search might be futile, requiring an instrument with mirrors so massive that they would collapse to form a black hole — gravity defeating its own understanding. But in their paper Dr. Krauss and Dr. Wilczek suggest how gravitons might leave their mark on cosmic background radiation, the afterglow of the Big Bang.

There are other mysteries to contend with. Despite the toll it took on Babson’s family, theorists remain puzzled over why gravity is so much weaker than electromagnetism. Hold a refrigerator magnet over a paper clip, and it will fly upward and away from Earth’s pull.
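The refrigerator-magnet point can be made quantitative with a quick back-of-the-envelope script (mine, not the article’s): compare the electrostatic and gravitational forces between two electrons, using standard values for the constants. Both forces fall off as the square of the distance, so the separation cancels out of the ratio.

```python
# Back-of-the-envelope: electric vs. gravitational force between two electrons.
# Both forces scale as 1/r^2, so the separation r cancels in the ratio.

K_E = 8.9875517923e9    # Coulomb constant, N m^2 / C^2
G   = 6.67430e-11       # gravitational constant, N m^2 / kg^2
Q_E = 1.602176634e-19   # elementary charge, C
M_E = 9.1093837015e-31  # electron mass, kg

ratio = (K_E * Q_E**2) / (G * M_E**2)
print(f"Electrostatic force is ~{ratio:.1e} times stronger than gravity")
# Prints roughly 4.2e+42: electromagnetism dwarfs gravity at the particle scale,
# which is why a small magnet can out-pull an entire planet.
```

The exact digits matter far less than the roughly forty-two orders of magnitude separating the two forces.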

Reaching for an explanation, the physicists Lisa Randall and Raman Sundrum once proposed that gravity is diluted because it leaks into a parallel universe. Striking off in a different direction, Dr. Randall and another colleague, Matthew Reece, recently speculated that the pull of a disk of dark matter might be responsible for jostling the solar system and unleashing periodic comet storms like one that might have killed off the dinosaurs.

It was a young theorist named Bryce DeWitt who helped disabuse Babson of his dream of stopping such a mighty force. In “The Perfect Theory,” a new book about general relativity, the Oxford astrophysicist Pedro G. Ferreira tells how DeWitt, in need of a down payment for a house, entered the Gravity Research Foundation’s competition in 1953 with a paper showing why the attempt to make any kind of antigravity device was “a waste of time.”

He won the prize, the foundation became more respectable, and DeWitt went on to become one of the most prominent theorists of general relativity. Babson, however, was not entirely deterred. In 1962, after more than 100 prominent Atlantans were killed in a plane crash in Paris, he donated $5,000 to Emory University along with a marble monument “to remind students of the blessings forthcoming” once gravity is counteracted.

He paid for similar antigravity monuments at more than a dozen campuses, including one at Tufts University, where newly minted doctoral students in cosmology kneel before it in a ceremony in which an apple is dropped on their heads.

I thought of Babson recently during a poignant scene in the movie “Gravity,” in which two astronauts are floating high above Earth, stranded from home. During a moment of calm, one of them, Lt. Matt Kowalski (played by George Clooney), asks the other, Dr. Ryan Stone (Sandra Bullock), “What do you miss down there?”

She tells him about her daughter:

“She was 4. She was at school playing tag, slipped and hit her head, and that was it. The stupidest thing.” It was gravity that did her in.

Read the entire article here.

Image: Portrait of Isaac Newton (1642–1727) by Sir Godfrey Kneller (1646–1723). Courtesy of Wikipedia.

Iran, Women, Clothes

A fascinating essay by Haleh Anvari, an Iranian writer and artist, provides an insightful view of the role that fashion plays in shaping many of our perceptions — some right, many wrong — of women.

Quite rightly, she argues that the standards our cultures impose on women, whether through the lens of Western fashion or of Muslim tradition, are misleading. In both cases there remains a fundamental, continuing need to address women’s rights relative to those of men. Fashion stereotypes may be vastly different across continents, but the underlying issues remain very much the same whether a woman wears a hijab on the street or lingerie on a catwalk.

From the NYT:

I took a series of photographs of myself in 2007 that show me sitting on the toilet, weighing myself, and shaving my legs in the bath. I shot them as an angry response to an encounter with a gallery owner in London’s artsy Brick Lane. I had offered him photos of colorful chadors — an attempt to question the black chador as the icon of Iran by showing the world that Iranian women were more than this piece of black cloth. The gallery owner wasn’t impressed. “Do you have any photos of Iranian women in their private moments?” he asked.

As an Iranian with a reinforced sense of the private-public divide we navigate daily in our country, I found his curiosity offensive. So I shot my “Private Moments” in a sardonic spirit, to show that Iranian women are like all women around the world if you get past the visual hurdle of the hijab. But I never shared those, not just because I would never get a permit to show them publicly in Iran, but also because I am prepared to go only so far to prove a point. Call me old-fashioned.

Ever since the hijab, a generic term for every Islamic modesty covering, became mandatory after the 1979 revolution, Iranian women have been used to represent the country visually. For the new Islamic republic, the all-covering cloak called a chador became a badge of honor, a trademark of fundamental change. To Western visitors, it dropped a pin on their travel maps, where the bodies of Iranian women became a stand-in for the character of Iranian society. When I worked with foreign journalists for six years, I helped produce reports that were illustrated invariably with a woman in a black chador. I once asked a photojournalist why. He said, “How else can we show where we are?”

How wonderful. We had become Iran’s Eiffel Tower or Big Ben.

Next came the manteau-and-head scarf combo — less traditional, and more relaxed, but keeping the lens on the women. Serious reports about elections used a “hair poking out of scarf” standard as an exit poll, or images of scarf-clad women lounging in coffee shops, to register change. One London newspaper illustrated a report on the rise of gasoline prices with a woman in a head scarf, photographed in a gas station, holding a pump nozzle with gasoline suggestively dripping from its tip. A visitor from Mars or a senior editor from New York might have been forgiven for imagining Iran as a strange land devoid of men, where fundamentalist chador-clad harridans vie for space with heathen babes guzzling cappuccinos. (Incidentally, women hardly ever step out of the car to pump gas here; attendants do it for us.)

The disputed 2009 elections, followed by demonstrations and a violent backlash, brought a brief respite. The foreign press was ejected, leaving the reporting to citizen journalists not bound by the West’s conventions. They depicted a politically mature citizenry, male and female, demanding civic acknowledgment together.

We are now witnessing another shift in Iran’s image. It shows Iran “unveiled” — a tired euphemism now being used to literally undress Iranian women or show them off as clotheshorses. An Iranian fashion designer in Paris receives more plaudits in the Western media for his blog’s street snapshots of stylish, affluent young women in North Tehran than he gets for his own designs. In this very publication, a male Iranian photographer depicted Iranian women through flimsy fabrics under the title “Veiled Truths”; one is shown in a one-piece pink swimsuit so minimal it could pass for underwear; others are made more sensual behind sheer “veils,” reinforcing a sense of peeking at them. Search the Internet and you can get an eyeful of nubile limbs in opposition to the country’s official image, shot by Iranian photographers of both sexes, keen to show the hidden, supposedly true, other side of Iran.

Young Iranians rightly desire to show the world the unseen sides of their lives. But their need to show themselves as like their peers in the West takes them into dangerous territory. Professional photographers and artists, encouraged by Western curators and seeking fast-track careers, are creating a new wave of homegrown neo-Orientalism. A favorite reworking of an old cliché is the thin, beautiful young woman reclining while smoking a hookah, dancing, or otherwise at leisure in her private spaces. Ingres could sue for plagiarism.

In a country where the word feminism is pejorative, there is no inkling that the values of both fundamentalism and Western consumerism are two sides of the same coin — the female body as an icon defining Iranian culture.

It is true that we Iranians live dual lives, and so it is true that to see us in focus, you must enter our inner sanctum. But the inner sanctum includes women who believe in the hijab, fat women, old women and, most important, women in professions from doctor to shopkeeper. It also includes men, not all of whom are below 30 years of age. If you wish to see Iran as it is, you need go no further than Facebook and Instagram. Here, Iran is neither fully veiled nor longing to undress itself. Its complex variety is shown through the lens of its own people, in both private and public spaces.

Read the entire essay here.

Image: Young woman from Naplouse in a hijab, c1867-1885. Courtesy of Wikipedia.

Dinosaurs of Retail

Shopping malls in the United States were in their prime in the 1970s and ’80s. Many had positioned themselves as a bright, clean, utopian alternative to inner-city blight and decay. A quarter of a century on, while the mega-malls may be thriving, their numerous smaller suburban brethren are seeing lower sales. As internet shopping and retailing pervade all reaches of our society, many midsize malls are decaying or shutting down completely. Documentary photographer Seph Lawless captures this fascinating transition in a new book: Black Friday: The Collapse of the American Shopping Mall.

From the Guardian:

It is hard to believe there has ever been any life in this place. Shattered glass crunches under Seph Lawless’s feet as he strides through its dreary corridors. Overhead lights attached to ripped-out electrical wires hang suspended in the stale air and fading wallpaper peels off the walls like dead skin.

Lawless sidesteps debris as he passes from plot to plot in this retail graveyard called Rolling Acres Mall in Akron, Ohio. The shopping centre closed in 2008, and its largest retailers, which had tried to make it as standalone stores, emptied out by the end of last year. When Lawless stops to overlook a two-storey opening near the mall’s once-bustling core, only an occasional drop of water, dribbling through missing ceiling tiles, breaks the silence.

“You came, you shopped, you dressed nice – you went to the mall. That’s what people did,” says Lawless, a pseudonymous photographer who grew up in a suburb of nearby Cleveland. “It was very consumer-driven and kind of had an ugly side, but there was something beautiful about it. There was something there.”

Gazing down at the motionless escalators, dead plants and empty benches below, he adds: “It’s still beautiful, though. It’s almost like ancient ruins.”

Dying shopping malls are speckled across the United States, often in middle-class suburbs wrestling with socioeconomic shifts. Some, like Rolling Acres, have already succumbed. Estimates on the share that might close or be repurposed in coming decades range from 15 to 50%. Americans are returning downtown; online shopping is taking a 6% bite out of brick-and-mortar sales; and to many iPhone-clutching, city-dwelling and frequently jobless young people, the culture that spawned satire like Mallrats seems increasingly dated, even cartoonish.

According to longtime retail consultant Howard Davidowitz, numerous midmarket malls, many of them born during the country’s suburban explosion after the second world war, could very well share Rolling Acres’ fate. “They’re going, going, gone,” Davidowitz says. “They’re trying to change; they’re trying to get different kinds of anchors, discount stores … [But] what’s going on is the customers don’t have the fucking money. That’s it. This isn’t rocket science.”

Shopping culture follows housing culture. Sprawling malls were therefore a natural product of the postwar era, as Americans with cars and fat wallets sprawled to the suburbs. They were thrown up at a furious pace as shoppers fled cities, peaking at a few hundred per year at one point in the 1980s, according to Paco Underhill, an environmental psychologist and author of Call of the Mall: The Geography of Shopping. Though construction has since tapered off, developers left a mall overstock in their wake.

Currently, the US contains around 1,500 of the expansive “malls” of suburban consumer lore. Most share a handful of bland features. Brick exoskeletons usually contain two storeys of inward-facing stores separated by tile walkways. Food courts serve mediocre pizza. Parking lots are big enough to easily misplace a car. And to anchor them economically, malls typically depend on department stores: huge vendors offering a variety of products across interconnected sections.

For mid-century Americans, these gleaming marketplaces provided an almost utopian alternative to the urban commercial district, an artificial downtown with less crime and fewer vermin. As Joan Didion wrote in 1979, malls became “cities in which no one lives but everyone consumes”. Peppered throughout disconnected suburbs, they were a place to see and be seen, something shoppers have craved since the days of the Greek agora. And they quickly matured into a self-contained ecosystem, with their own species – mall rats, mall cops, mall walkers – and an annual feeding frenzy known as Black Friday.

“Local governments had never dealt with this sort of development and were basically bamboozled [by developers],” Underhill says of the mall planning process. “In contrast to Europe, where shopping malls are much more a product of public-private negotiation and funding, here in the US most were built under what I call ‘cowboy conditions’.”

Shopping centres in Europe might contain grocery stores or childcare centres, while those in Japan are often built around mass transit. But the suburban American variety is hard to get to and sells “apparel and gifts and damn little else”, Underhill says.

Nearly 700 shopping centres are “super-regional” megamalls, retail leviathans usually of at least 1 million square feet and upward of 80 stores. Megamalls typically outperform their 800 slightly smaller, “regional” counterparts, though size and financial health don’t overlap entirely. It’s clearer, however, that luxury malls in affluent areas are increasingly forcing the others to fight for scraps. Strip malls – up to a few dozen tenants conveniently lined along a major traffic artery – are retail’s bottom feeders and so well-suited to the new environment. But midmarket shopping centres have begun dying off alongside the middle class that once supported them. Regional malls have suffered at least three straight years of declining profit per square foot, according to the International Council of Shopping Centres (ICSC).

Read the entire story here.

Image: Mall of America. Courtesy of Wikipedia.

Your Tax Dollars At Work — Leetspeak

It’s fascinating to see what our government agencies are doing with some of our hard-earned tax dollars.

In this head-scratching example, the FBI — its Intelligence Research Support Unit, no less — has just completed an 83-page glossary of Internet slang, or “leetspeak”. LOL and Ugh! (the latter is not an acronym).

Check out the document via MuckRock here — the group obtained the “secret” document through the Freedom of Information Act.

From the Washington Post:

The Internet is full of strange and bewildering neologisms, which anyone but a text-addled teen would struggle to understand. So the fine, taxpayer-funded people of the FBI — apparently not content to trawl Urban Dictionary, like the rest of us — compiled a glossary of Internet slang.

An 83-page glossary. Containing nearly 3,000 terms.

The glossary was recently made public through a Freedom of Information request by the group MuckRock, which posted the PDF, called “Twitter shorthand,” online. Despite its name, this isn’t just Twitter slang: As the FBI’s Intelligence Research Support Unit explains in the introduction, it’s a primer on shorthand used across the Internet, including in “instant messages, Facebook and Myspace.” As if that Myspace reference wasn’t proof enough that the FBI’s a tad out of touch, the IRSU then promises the list will prove useful both professionally and “for keeping up with your children and/or grandchildren.” (Your tax dollars at work!)

All of these minor gaffes could be forgiven, however, if the glossary itself was actually good. Obviously, FBI operatives and researchers need to understand Internet slang — the Internet is, increasingly, where crime goes down these days. But then we get things like ALOTBSOL (“always look on the bright side of life”) and AMOG (“alpha male of group”) … within the first 10 entries.

ALOTBSOL has, for the record, been tweeted fewer than 500 times in the entire eight-year history of Twitter. AMOG has been tweeted far more often, but usually in Spanish … as a misspelling, it would appear, of “amor” and “amigo.”

Among the other head-scratching terms the FBI considers can’t-miss Internet slang:

  1. AYFKMWTS (“are you f—— kidding me with this s—?”) — 990 tweets
  2. BFFLTDDUP (“best friends for life until death do us part”) — 414 tweets
  3. BOGSAT (“bunch of guys sitting around talking”) — 144 tweets
  4. BTDTGTTSAWIO (“been there, done that, got the T-shirt and wore it out”) — 47 tweets
  5. BTWITIAILWY (“by the way, I think I am in love with you”) — 535 tweets
  6. DILLIGAD (“does it look like I give a damn?”) — 289 tweets
  7. DITYID (“did I tell you I’m depressed?”) — 69 tweets
  8. E2EG (“ear-to-ear grin”) — 125 tweets
  9. GIWIST (“gee, I wish I said that”) — 56 tweets
  10. HCDAJFU (“he could do a job for us”) — 25 tweets
  11. IAWTCSM (“I agree with this comment so much”) — 20 tweets
  12. IITYWIMWYBMAD (“if I tell you what it means will you buy me a drink?”) — 250 tweets
  13. LLTA (“lots and lots of thunderous applause”) — 855 tweets
  14. NIFOC (“naked in front of computer”) — 1,065 tweets, most of them referring to acronym guides like this one.
  15. PMYMHMMFSWGAD (“pardon me, you must have mistaken me for someone who gives a damn”) — 128 tweets
  16. SOMSW (“someone over my shoulder watching”) — 170 tweets
  17. WAPCE (“women are pure concentrated evil”) — 233 tweets, few relating to women
  18. YKWRGMG (“you know what really grinds my gears?”) — 1,204 tweets

In all fairness to the FBI, they do get some things right: “crunk” is helpfully defined as “crazy and drunk,” FF is “a recommendation to follow someone referenced in the tweet,” and a whole range of online patois is translated to its proper English equivalent: hafta is “have to,” ima is “I’m going to,” kewt is “cute.”

Read the entire article here.

Image: FBI Seal. Courtesy of U.S. Government.

Goostman Versus Turing

Some computer scientists believe that “Eugene Goostman” may have overcome the famous hurdle proposed by Alan Turing by cracking the eponymous Turing Test. Eugene is a 13-year-old Ukrainian “boy” constructed from computer algorithms designed to feign intelligence and mirror human thought processes. During a text-based exchange, Eugene managed to convince his human interrogators that he was a real boy — and thus his creators claim to have broken the previously impenetrable Turing barrier.

Other researchers and philosophers disagree: they claim that it’s easier to construct an artificial intelligence that converses in good but limited English — Eugene is Ukrainian after all — than it would be to build one that passes as a native anglophone adult. So the Turing Test barrier may yet stand.

From the Guardian:

From 2001: a Space Odyssey to Her, the idea of an intelligent computer that can hold conversations with humans has long been a dream of science-fiction writers, but that fantasy may just have taken a step closer to becoming reality. On Saturday, a computer algorithm claiming to be a 13-year-old boy called Eugene Goostman passed the Turing test, it has been claimed. However, some commentators have suggested that pretending to be a 13-year-old Ukrainian child is not as difficult as pretending to be an Anglophone adult, and that the result should not count.

The program convinced 33% of the human judges at an event at the Royal Society that it was actually a human. The Turing test, which was devised in 1950 by the pioneering computer scientist and second world war codebreaker Alan Turing, is a classic benchmark in artificial intelligence – it determines a machine’s ability to show intelligent behaviour equivalent to, or indistinguishable from, that of a human.

Transcripts of the conversations from the latest Royal Society test are not yet available but below are transcripts from a previous event at Bletchley Park on 23 June 2012, with the same program.

Can you tell Eugene isn’t human?

Test 1

[12:43:37] Judge: Hi, I’m Cathy.

[12:43:43] Eugene: Cathy! Nice to meet you! You’re a woman, if I’m not mistaken.

[12:44:20] Judge: Right. What is your name?

[12:44:29] Eugene: People call me Eugene. Or “Zhenya”. It is a short name for Eugene in Russia and Ukraine.

[12:45:17] Judge: Nice name. Are you from either area?

[12:45:41] Eugene: Agree 🙂 Oh, what a fruitful conversation;-)

[12:45:52] Judge: Do you really think so?

[12:46:04] Eugene: No, your guess was wrong. I don’t think so. By the way, I still don’t know your specialty – or, possibly, I’ve missed it?

[12:46:56] Judge: What speciality? What’s yours?

[12:47:01] Eugene: Huh? And I forgot to ask you where you are from.

[12:47:51] Judge: The universe and you?

[12:48:01] Eugene: Is it your hobby to ask little poor Jewish boys such tricky questions they don’t have an answer for?

Test 2

[16:11:25] Judge: what music do you like?

[16:11:32] Eugene: I like Eminem – because he doesn’t have a thing to do with “polite correctness”.

[16:11:46] Judge: OMG i love eminem, he’s my favourite artist!

[16:12:00] Eugene: I wonder what Eminem’s nick mean. I associate it only with a word “enema”. But he is a cool guy, I like his The Real Slim Shady.

[16:12:55] Judge: i know what his nickname means, his real name is Marshall Mathers. From the initials M&M phynetically sounding like Eminem.

[16:13:06] Eugene: Eminem is a famous crapper. No, raper… ripper… Yeah, I mean – rapper. I like his Stan and The Real Slim Shady.

[16:13:32] Judge: I like Hailies Song.

[16:13:46] Eugene: Interesting tastes. I like science-fiction books, by the way. Where do you came from, by the way? Could you tell me about the place where you live?

[16:14:22] Judge: Epsom, home of the derby. yourself?

[16:14:27] Eugene: My guinea pig says that name Derby sounds very nice.

[16:14:41] Judge: i shopuld [sic] talk with him sometime.

[16:14:48] Eugene: I think you should take some typing lessons. Or just type more accurately.

[16:15:04] Judge: That’s very rude.

[16:15:09] Eugene: Ummm … Frankly, I didn’t get your question 🙁

[16:15:49] Judge: “Ummm” isnt a word. i think you need typing lessons

[16:15:58] Eugene: My thoughts are same. By the way, I still don’t know your specialty – or, possibly, I’ve missed it?

Test 3

[16:06:31] Judge: Hi there, do you prefer to wear a dress or trousers to work?

[16:06:37] Eugene: I am a scholar. I’m too young to make money.

[16:07:32] Judge: What was the last film you saw at the cinema?

[16:07:38] Eugene: Huh? Could you tell me what are you? I mean your profession.

[16:08:11] Judge: I am an IT manager, and you?

[16:08:16] Eugene: Manager? So how is it to work as manager?

[16:09:13] Judge: Hard work but rewarding, what do you do?

[16:09:18] Eugene: I am only 13, so I’m attending school so far.

[16:09:51] Judge: Have you seen teletubbies?

[16:10:04] Eugene: No, I haven’t seen it, unfortunately. But I hope it is better than these crappy Star Wars. And I forgot to ask you where you are from …

[16:10:15] Judge: Brighton, and you?

Read the entire story and Eugene’s conversation with real humans here.

Image: A conversation with Eugene Goostman. Courtesy of BBC.

Nuts About Nutella

Connoisseurs of Nutella — that wonderful concoction of hazelnuts and chocolate — are celebrating 50 years of the iconic Italian spread. Here’s to the next 50 bites, sorry, years! Say no more.

From the Guardian:

In Piedmont they have been making gianduiotto, a confectionery combining hazelnuts and cocoa sold in a pretty tinfoil wrapper, since the mid-18th century. They realised long ago that the nuts, which are plentiful in the surrounding hills, are a perfect match for chocolate. But no one had any idea that their union would prove so harmonious, lasting and fruitful. Only after the second world war was this historic marriage finally sealed.

Cocoa beans are harder to come by and, consequently, more expensive. Pietro Ferrero, an Alba-based pastry cook, decided to turn the problem upside down. Chocolate should not be allowed to dictate its terms. By using more nuts and less cocoa, one could obtain a product that was just as good and not as costly. What is more, it would be spread.

Nutella, one of the world’s best-known brands, celebrated its 50th anniversary in Alba last month. In telling the story of this chocolate spread, it’s difficult to avoid cliches: a success story emblematic of Italy’s postwar recovery, the tale of a visionary entrepreneur and his perseverance, a business model driven by a single product.

The early years were spectacular. In 1946 the Ferrero brothers produced and sold 300kg of their speciality; nine months later output had reached 10 tonnes. Pietro stayed at home making the spread. Giovanni went to market across Italy in his little Fiat. In 1948 Ferrero, now a limited company, moved into a 5,000 sq metre factory equipped to produce 50 tonnes of gianduiotto a month.

By 1949 the process was nearing perfection, with the launch of the “supercrema” version, which was smoother and stuck more to the bread than the knife. It was also the year Pietro died. He did not live long enough to savour his triumph.

His son Michele was driven by the same obsession with greater spreadability. Under his leadership Ferrero became an empire. But it would take another 15 years of hard work and endless experiments before finally, in 1964, Nutella was born.

The firm now sells 365,000 tonnes of Nutella a year worldwide, the biggest consumers being the Germans, French, Italians and Americans. The anniversary was, of course, the occasion for a big promotional operation. At a gathering in Rome last month, attended by two government ministers, journalists received a 1kg jar marked with the date and a commemorative Italian postage stamp. It is an ideal opportunity for Ferrero – which also owns the Tic Tac, Ferrero Rocher, Kinder and Estathé brands, among others – to affirm its values and rehearse its well-established narrative.

There are no recent pictures of the patriarch Michele, who divides his time between Belgium and Monaco. According to Forbes magazine he was worth $9.5bn in 2009, making him the richest person in Italy. He avoids the media and making public appearances, even eschewing the boards of leading Italian firms.

His son Giovanni, who has managed the company on his own after the early death of his brother Pietro in 2011, only agreed to a short interview on Italy’s main public TV channel. He abides by the same rule as his father: “Only on two occasions should the papers mention one’s name – birth and death.”

In contrast, Ferrero executives have plenty to say about both products and the company, with its 30,000-strong workforce at 14 locations, its €8bn ($10bn) revenue, 72% share of the chocolate-spreads market, 5 million friends on Facebook, 40m Google references, its hazelnut plantations in both hemispheres securing it a round-the-year supply of fresh ingredients and, of course, its knowhow.

“The recipe for Nutella is not a secret like Coca-Cola,” says marketing manager Laurent Cremona. “Everyone can find out the ingredients. We simply know how to combine them better than other people.”

Be that as it may, the factory in Alba is as closely guarded as Fort Knox and visits are not allowed. “It’s not a company, it’s an oasis of happiness,” says Francesco Paolo Fulci, a former ambassador and president of the Ferrero foundation. “In 70 years, we haven’t had a single day of industrial action.”

Read the entire article here.

Image: Never enough Nutella. Courtesy of secret Nutella fans the world over / Ferrero S.p.A.

theDiagonal is Dislocating to The Diagonal

Dear readers, theDiagonal is in the midst of a major dislocation in May-June 2014. Thus, your friendly editor would like to apologize for the recent intermittent service. While theDiagonal lives online, its human-powered (currently) editor is physically relocating with family to Boulder, CO. Normal daily service from theDiagonal will resume in July.

Colorado State Highway 119 intersects the city of Boulder as it sweeps on a SW-to-NE track from the Front Range towards the Central Plains. Coincidentally, or not, Highway 119 is more affectionately known as The Diagonal.

Image: The Flatirons, mountain formations, in Boulder, Colorado. Courtesy of Jesse Varner / AzaToth / Wikipedia.

Images: Go Directly To Jail or…

If you live online and write or share images, it’s likely that you’ve been, or will soon be, sued by the predatory Getty Images. Your kindly editor at theDiagonal uses images found to be in the public domain or references them as fair use in this blog, and yet has fallen prey to this extortionate nuisance of a company.

Getty, with its army of fee-extortion collectors — many of whom are not even legally trained or accredited — will find reason to send you numerous legalistic and threatening letters demanding hundreds of dollars in compensation and damages. It will do this without sound proof, relying on threats to cajole unwary citizens into parting with significant sums. This is such a big market for Getty that numerous services, such as this one, have sprung up over the years to help writers and bloggers combat the Getty extortion.

With that in mind, it’s refreshing to see the Metropolitan Museum of Art in New York taking a rather different stance: the venerable institution is doing us all a wonderful service by making many hundreds of thousands of classic images available online for free. Getty, take that!

From WSJ:

This month, the Metropolitan Museum of Art released for download about 400,000 digital images of works that are in the public domain. The images, which are free to use for non-commercial use without permission or fees, may now be downloaded from the museum’s website. The museum will continue to add images to the collection as they digitize files as part of the initiative Open Access for Scholarly Content (OASC). 

When asked about the impact of the initiative, Sree Sreenivasan, Chief Digital Officer, said the new program would provide increased access and streamline the process of obtaining these images. “In keeping with the Museum’s mission, we hope the new image policy will stimulate new scholarship in a variety of media, provide greater access to our vast collection, and broaden the reach of the Museum to researchers world-wide. By providing open access, museums and scholars will no longer have to request permission to use our public domain images, they can download the images directly from our website.”

Thomas P. Campbell, director and chief executive of the Metropolitan Museum of Art, said the Met joins a growing number of museums using an open-access policy to make available digital images of public domain works. “I am delighted that digital technology can open the doors to this trove of images from our encyclopedic collection,” Mr. Campbell said in his May 16 announcement. Other New York institutions that have initiated similar programs include the New York Public Library (map collection),  the Brooklyn Academy of Music and the New York Philharmonic. 

See more images here.

Image: “The Open Door,” earlier than May 1844. Courtesy of William Henry Fox Talbot/The Metropolitan Museum of Art, New York.

I Think, Therefore I am, Not Robot

A sentient robot is the long-held dream of both the artificial intelligence researcher and the science fiction author. Yet some leading mathematicians theorize it may never happen, despite our accelerating technological prowess.

From New Scientist:

So long, robot pals – and robot overlords. Sentient machines may never exist, according to a variation on a leading mathematical model of how our brains create consciousness.

Over the past decade, Giulio Tononi at the University of Wisconsin-Madison and his colleagues have developed a mathematical framework for consciousness that has become one of the most influential theories in the field. According to their model, the ability to integrate information is a key property of consciousness. They argue that in conscious minds, integrated information cannot be reduced into smaller components. For instance, when a human perceives a red triangle, the brain cannot register the object as a colourless triangle plus a shapeless patch of red.

But there is a catch, argues Phil Maguire at the National University of Ireland in Maynooth. He points to a computational device called the XOR logic gate, which involves two inputs, A and B. The output of the gate is “0” if A and B are the same and “1” if A and B are different. In this scenario, it is impossible to predict the output based on A or B alone – you need both.

Memory edit

Crucially, this type of integration requires loss of information, says Maguire: “You have put in two bits, and you get one out. If the brain integrated information in this fashion, it would have to be continuously haemorrhaging information.”
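To make the information loss concrete, here is a minimal Python sketch — our illustration, not anything from Maguire’s paper — tabulating the XOR gate. Each output value could have come from two different input pairs, so the output alone can never be inverted back into its inputs.

```python
# Minimal sketch of the information loss in an XOR gate: two bits go in,
# one bit comes out, and distinct input pairs collide on the same output.

def xor_gate(a: int, b: int) -> int:
    """Return 0 if A and B are the same, 1 if they differ."""
    return a ^ b

if __name__ == "__main__":
    preimages = {}  # output value -> list of input pairs that produce it
    for a in (0, 1):
        for b in (0, 1):
            out = xor_gate(a, b)
            preimages.setdefault(out, []).append((a, b))
            print(f"A={a}, B={b} -> output={out}")

    # Each output maps back to two possible input pairs, so the output alone
    # does not determine the inputs: one bit of information has been lost.
    for out, pairs in preimages.items():
        print(f"output {out} could have come from inputs {pairs}")
```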

Maguire and his colleagues say the brain is unlikely to do this, because repeated retrieval of memories would eventually destroy them. Instead, they define integration in terms of how difficult information is to edit.

Consider an album of digital photographs. The pictures are compiled but not integrated, so deleting or modifying individual images is easy. But when we create memories, we integrate those snapshots of information into our bank of earlier memories. This makes it extremely difficult to selectively edit out one scene from the “album” in our brain.

Based on this definition, Maguire and his team have shown mathematically that computers can’t handle any process that integrates information completely. If you accept that consciousness is based on total integration, then computers can’t be conscious.

Open minds

“It means that you would not be able to achieve the same results in finite time, using finite memory, using a physical machine,” says Maguire. “It doesn’t necessarily mean that there is some magic going on in the brain that involves some forces that can’t be explained physically. It is just so complex that it’s beyond our abilities to reverse it and decompose it.”

Disappointed? Take comfort – we may not get Rosie the robot maid, but equally we won’t have to worry about the world-conquering Agents of The Matrix.

Neuroscientist Anil Seth at the University of Sussex, UK, applauds the team for exploring consciousness mathematically. But he is not convinced that brains do not lose information. “Brains are open systems with a continual turnover of physical and informational components,” he says. “Not many neuroscientists would claim that conscious contents require lossless memory.”

Read the entire story here.

Image: Robbie the Robot, Forbidden Planet. Courtesy of San Diego Comic Con, 2006 / Wikipedia.

c2=e/m

Particle physicists will soon attempt to reverse the direction of Einstein’s famous equation delineating energy-matter equivalence, e=mc2. Next year, they plan to crash quanta of light into each other to create matter. Cool or what!

From the Guardian:

Researchers have worked out how to make matter from pure light and are drawing up plans to demonstrate the feat within the next 12 months.

The theory underpinning the idea was first described 80 years ago by two physicists who later worked on the first atomic bomb. At the time they considered the conversion of light into matter impossible in a laboratory.

But in a report published on Sunday, physicists at Imperial College London claim to have cracked the problem using high-powered lasers and other equipment now available to scientists.

“We have shown in principle how you can make matter from light,” said Steven Rose at Imperial. “If you do this experiment, you will be taking light and turning it into matter.”

The scientists are not on the verge of a machine that can create everyday objects from a sudden blast of laser energy. The kind of matter they aim to make comes in the form of subatomic particles invisible to the naked eye.

The original idea was written down by two US physicists, Gregory Breit and John Wheeler, in 1934. They worked out that – very rarely – two particles of light, or photons, could combine to produce an electron and its antimatter equivalent, a positron. Electrons are particles of matter that form the outer shells of atoms in the everyday objects around us.

But Breit and Wheeler had no expectations that their theory would be proved any time soon. In their study, the physicists noted that the process was so rare and hard to produce that it would be “hopeless to try to observe the pair formation in laboratory experiments”.

Oliver Pike, the lead researcher on the study, said the process was one of the most elegant demonstrations of Einstein’s famous relationship that shows matter and energy are interchangeable currencies. “The Breit-Wheeler process is the simplest way matter can be made from light and one of the purest demonstrations of E=mc2,” he said.
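For readers who want the number behind that claim — this gloss is ours, not the Guardian’s — the Breit-Wheeler reaction and the minimum combined photon energy it needs can be written as:

$$\gamma + \gamma \;\rightarrow\; e^- + e^+, \qquad \sqrt{s} \;=\; \sqrt{2 E_1 E_2\,(1-\cos\theta)} \;\geq\; 2 m_e c^2 \approx 1.022\ \mathrm{MeV}$$

where E_1 and E_2 are the photon energies and θ is the angle between them. In plain terms, the two colliding photons must together supply at least the rest energy of the electron-positron pair they create.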

Writing in the journal Nature Photonics, the scientists describe how they could turn light into matter through a number of separate steps. The first step fires electrons at a slab of gold to produce a beam of high-energy photons. Next, they fire a high-energy laser into a tiny gold capsule called a hohlraum, from the German for “empty room”. This produces light as bright as that emitted from stars. In the final stage, they send the first beam of photons into the hohlraum where the two streams of photons collide.

The scientists’ calculations show that the setup squeezes enough particles of light with high enough energies into a small enough volume to create around 100,000 electron-positron pairs.

The process is one of the most spectacular predictions of a theory called quantum electrodynamics (QED) that was developed in the run up to the second world war. “You might call it the most dramatic consequence of QED and it clearly shows that light and matter are interchangeable,” Rose told the Guardian.

The scientists hope to demonstrate the process in the next 12 months. There are a number of sites around the world that have the technology. One is the huge Omega laser in Rochester, New York. But another is the Orion laser at Aldermaston, the atomic weapons facility in Berkshire.

A successful demonstration will encourage physicists who have been eyeing the prospect of a photon-photon collider as a tool to study how subatomic particles behave. “Such a collider could be used to study fundamental physics with a very clean experimental setup: pure light goes in, matter comes out. The experiment would be the first demonstration of this,” Pike said.

Read the entire story here.

Image: Feynman diagram for gluon radiation. Courtesy of Wikipedia.

95.5 Percent is Made Up and It’s Dark

Physicists and astronomers observe the very small and the very big. Although they are focused on very different areas of scientific endeavor and discovery, they tend to agree on one key observation: 95.5 percent of the cosmos is currently invisible to us. That is, only around 4.5 percent of our physical universe is made up of matter or energy that we can see or sense directly through experimental interaction. The rest, well, it’s all dark — so-called dark matter and dark energy. But nobody really knows what or how or why. Effectively, despite tremendous progress in our understanding of our world, we are still in a global “Dark Age”.

From the New Scientist:

To our eyes, stars define the universe. To cosmologists they are just a dusting of glitter, an insignificant decoration on the true face of space. Far outweighing ordinary stars and gas are two elusive entities: dark matter and dark energy. We don’t know what they are… except that they appear to be almost everything.

These twin apparitions might be enough to give us pause, and make us wonder whether all is right with the model universe we have spent the past century so carefully constructing. And they are not the only thing. Our standard cosmology also says that space was stretched into shape just a split second after the big bang by a third dark and unknown entity called the inflaton field. That might imply the existence of a multiverse of countless other universes hidden from our view, most of them unimaginably alien – just to make models of our own universe work.

Are these weighty phantoms too great a burden for our observations to bear – a wholesale return of conjecture out of a trifling investment of fact, as Mark Twain put it?

The physical foundation of our standard cosmology is Einstein’s general theory of relativity. Einstein began with a simple observation: that any object’s gravitational mass is exactly equal to its resistance to acceleration, or inertial mass. From that he deduced equations that showed how space is warped by mass and motion, and how we see that bending as gravity. Apples fall to Earth because Earth’s mass bends space-time.

In a relatively low-gravity environment such as Earth, general relativity’s effects look very like those predicted by Newton’s earlier theory, which treats gravity as a force that travels instantaneously between objects. With stronger gravitational fields, however, the predictions diverge considerably. One extra prediction of general relativity is that large accelerating masses send out tiny ripples in the weave of space-time called gravitational waves. While these waves have never yet been observed directly, a pair of dense stars called pulsars, discovered in 1974, are spiralling in towards each other just as they should if they are losing energy by emitting gravitational waves.

Gravity is the dominant force of nature on cosmic scales, so general relativity is our best tool for modelling how the universe as a whole moves and behaves. But its equations are fiendishly complicated, with a frightening array of levers to pull. If you then give them a complex input, such as the details of the real universe’s messy distribution of mass and energy, they become effectively impossible to solve. To make a working cosmological model, we make simplifying assumptions.

The main assumption, called the Copernican principle, is that we are not in a special place. The cosmos should look pretty much the same everywhere – as indeed it seems to, with stuff distributed pretty evenly when we look at large enough scales. This means there’s just one number to put into Einstein’s equations: the universal density of matter.

Einstein’s own first pared-down model universe, which he filled with an inert dust of uniform density, turned up a cosmos that contracted under its own gravity. He saw that as a problem, and circumvented it by adding a new term into the equations by which empty space itself gains a constant energy density. Its gravity turns out to be repulsive, so adding the right amount of this “cosmological constant” ensured the universe neither expanded nor contracted. When observations in the 1920s showed it was actually expanding, Einstein described this move as his greatest blunder.

It was left to others to apply the equations of relativity to an expanding universe. They arrived at a model cosmos that grows from an initial point of unimaginable density, and whose expansion is gradually slowed down by matter’s gravity.

This was the birth of big bang cosmology. Back then, the main question was whether the expansion would ever come to a halt. The answer seemed to be no; there was just too little matter for gravity to rein in the fleeing galaxies. The universe would coast outwards forever.

Then the cosmic spectres began to materialise. The first emissary of darkness put a foot in the door as long ago as the 1930s, but was only fully seen in the late 1970s when astronomers found that galaxies are spinning too fast. The gravity of the visible matter would be too weak to hold these galaxies together according to general relativity, or indeed plain old Newtonian physics. Astronomers concluded that there must be a lot of invisible matter to provide extra gravitational glue.

The existence of dark matter is backed up by other lines of evidence, such as how groups of galaxies move, and the way they bend light on its way to us. It is also needed to pull things together to begin galaxy-building in the first place. Overall, there seems to be about five times as much dark matter as visible gas and stars.

Dark matter’s identity is unknown. It seems to be something beyond the standard model of particle physics, and despite our best efforts we have yet to see or create a dark matter particle on Earth (see “Trouble with physics: Smashing into a dead end”). But it changed cosmology’s standard model only slightly: its gravitational effect in general relativity is identical to that of ordinary matter, and even such an abundance of gravitating stuff is too little to halt the universe’s expansion.

The second form of darkness required a more profound change. In the 1990s, astronomers traced the expansion of the universe more precisely than ever before, using measurements of explosions called type 1a supernovae. They showed that the cosmic expansion is accelerating. It seems some repulsive force, acting throughout the universe, is now comprehensively trouncing matter’s attractive gravity.

This could be Einstein’s cosmological constant resurrected, an energy in the vacuum that generates a repulsive force, although particle physics struggles to explain why space should have the rather small implied energy density. So imaginative theorists have devised other ideas, including energy fields created by as-yet-unseen particles, and forces from beyond the visible universe or emanating from other dimensions.

Whatever it might be, dark energy seems real enough. The cosmic microwave background radiation, released when the first atoms formed just 370,000 years after the big bang, bears a faint pattern of hotter and cooler spots that reveals where the young cosmos was a little more or less dense. The typical spot sizes can be used to work out to what extent space as a whole is warped by the matter and motions within it. It appears to be almost exactly flat, meaning all these bending influences must cancel out. This, again, requires some extra, repulsive energy to balance the bending due to expansion and the gravity of matter. A similar story is told by the pattern of galaxies in space.

All of this leaves us with a precise recipe for the universe. The average density of ordinary matter in space is 0.426 yoctograms per cubic metre (a yoctogram is 10^-24 grams, so 0.426 of one amounts to roughly a quarter of a proton’s mass), making up 4.5 per cent of the total energy density of the universe. Dark matter makes up 22.5 per cent, and dark energy 73 per cent. Our model of a big-bang universe based on general relativity fits our observations very nicely – as long as we are happy to make 95.5 per cent of it up.

Arguably, we must invent even more than that. To explain why the universe looks so extraordinarily uniform in all directions, today’s consensus cosmology contains a third exotic element. When the universe was just 10^-36 seconds old, an overwhelming force took over. Called the inflaton field, it was repulsive like dark energy, but far more powerful, causing the universe to expand explosively by a factor of more than 10^25, flattening space and smoothing out any gross irregularities.

When this period of inflation ended, the inflaton field transformed into matter and radiation. Quantum fluctuations in the field became slight variations in density, which eventually became the spots in the cosmic microwave background, and today’s galaxies. Again, this fantastic story seems to fit the observational facts. And again it comes with conceptual baggage. Inflation is no trouble for general relativity – mathematically it just requires an add-on term identical to the cosmological constant. But at one time this inflaton field must have made up 100 per cent of the contents of the universe, and its origin poses as much of a puzzle as either dark matter or dark energy. What’s more, once inflation has started it proves tricky to stop: it goes on to create a further legion of universes divorced from our own. For some cosmologists, the apparent prediction of this multiverse is an urgent reason to revisit the underlying assumptions of our standard cosmology (see “Trouble with physics: Time to rethink cosmic inflation?”).

The model faces a few observational niggles, too. The big bang makes much more lithium-7 in theory than the universe contains in practice. The model does not explain the possible alignment in some features in the cosmic background radiation, or why galaxies along certain lines of sight seem biased to spin left-handedly. A newly discovered supergalactic structure 4 billion light years long calls into question the assumption that the universe is smooth on large scales.

Read the entire story here.

Image: Petrarch, who first conceived the idea of a European “Dark Age”, by Andrea di Bartolo di Bargilla, c1450. Courtesy of Galleria degli Uffizi, Florence, Italy / Wikipedia.

Building a Memory Palace

Feats of memory have long been a staple of human endeavor — for instance, memorizing and recalling pi to hundreds of decimal places. Nowadays, however, memorization is a competitive sport replete with grand prizes, worthy of a place in an X Games tournament.

From the NYT:

The last match of the tournament had all the elements of a classic showdown, pitting style versus stealth, quickness versus deliberation, and the world’s foremost card virtuoso against its premier numbers wizard.

If not quite Ali-Frazier or Williams-Sharapova, the duel was all the audience of about 100 could ask for. They had come to the first Extreme Memory Tournament, or XMT, to see a fast-paced, digitally enhanced memory contest, and that’s what they got.

The contest, an unusual collaboration between industry and academic scientists, featured one-minute matches between 16 world-class “memory athletes” from all over the world as they met in a World Cup-like elimination format. The grand prize was $20,000; the potential scientific payoff was large, too.

One of the tournament’s sponsors, the company Dart NeuroScience, is working to develop drugs for improved cognition. The other, Washington University in St. Louis, sent a research team with a battery of cognitive tests to determine what, if anything, sets memory athletes apart. Previous research was sparse and inconclusive.

Yet as the two finalists, both Germans, prepared to face off — Simon Reinhard, 35, a lawyer who holds the world record in card memorization (a deck in 21.19 seconds), and Johannes Mallow, 32, a teacher with the record for memorizing digits (501 in five minutes) — the Washington group had one preliminary finding that wasn’t obvious.

“We found that one of the biggest differences between memory athletes and the rest of us,” said Henry L. Roediger III, the psychologist who led the research team, “is in a cognitive ability that’s not a direct measure of memory at all but of attention.”

The Memory Palace

The technique the competitors use is no mystery.

People have been performing feats of memory for ages, scrolling out pi to hundreds of digits, or phenomenally long verses, or word pairs. Most store the studied material in a so-called memory palace, associating the numbers, words or cards with specific images they have already memorized; then they mentally place the associated pairs in a familiar location, like the rooms of a childhood home or the stops on a subway line.
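As a toy illustration of the bookkeeping involved — the loci and associations below are made up for the example, not any competitor’s real system — a memory palace can be sketched as a mapping from stops on a familiar route to the images that encode the studied items:

```python
# Toy illustration of the memory-palace (method of loci) idea described above:
# bind each studied item to a vivid, pre-rehearsed image, place the pairs at
# successive stops (loci) on a familiar route, then recall by walking the route.
PALACE_LOCI = ["front door", "hallway mirror", "kitchen table", "back garden"]

def memorize(items, image_for):
    """Bind each item (via its rehearsed image) to the next locus on the route."""
    return {locus: (item, image_for.get(item, item))
            for locus, item in zip(PALACE_LOCI, items)}

def recall(palace):
    """Walk the route in order and read the stored items back out."""
    return [palace[locus][0] for locus in PALACE_LOCI if locus in palace]

if __name__ == "__main__":
    # Hypothetical item-to-image associations a competitor might have rehearsed.
    images = {"742": "a swan towing a sledge", "king of clubs": "a gladiator"}
    palace = memorize(["742", "king of clubs"], images)
    for locus, (item, image) in palace.items():
        print(f"At the {locus}: picture {image} (encodes '{item}')")
    print("Recalled in order:", recall(palace))
```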

The Greek poet Simonides of Ceos is credited with first describing the method, in the fifth century B.C., and it has been vividly described in popular books, most recently “Moonwalking With Einstein,” by Joshua Foer.

Each competitor has his or her own variation. “When I see the eight of diamonds and the queen of spades, I picture a toilet, and my friend Guy Plowman,” said Ben Pridmore, 37, an accountant in Derby, England, and a former champion. “Then I put those pictures on High Street in Cambridge, which is a street I know very well.”

As these images accumulate during memorization, they tell an increasingly bizarre but memorable story. “I often use movie scenes as locations,” said James Paterson, 32, a high school psychology teacher in Ascot, near London, who competes in world events. “In the movie ‘Gladiator,’ which I use, there’s a scene where Russell Crowe is in a field, passing soldiers, inspecting weapons.”

Mr. Paterson uses superheroes to represent combinations of letters or numbers: “I might have Batman — one of my images — playing Russell Crowe, and something else playing the horse, and so on.”

The material that competitors attempt to memorize falls into several standard categories. Shuffled decks of cards. Random words. Names matched with faces. And numbers, either binary (ones and zeros) or integers. They are given a set amount of time to study — up to one minute in this tournament, an hour or more in others — before trying to reproduce as many cards, words or digits in the order presented.

Now and then, a challenger boasts online of having discovered an entirely new method, and shows up at competitions to demonstrate it.

“Those people are easy to find, because they come in last, or close to it,” said another world-class competitor, Boris Konrad, 29, a German postdoctoral student in neuroscience. “Everyone here uses this same type of technique.”

Anyone can learn to construct a memory palace, researchers say, and with practice remember far more detail of a particular subject than before. The technique is accessible enough that preteens pick it up quickly, and Mr. Paterson has integrated it into his teaching.

“I’ve got one boy, for instance, he has no interest in academics really, but he knows the Premier League, every team, every player,” he said. “I’m working with him, and he’s using that knowledge as scaffolding to help remember what he’s learning in class.”

Experts in Forgetting

The competitors gathered here for the XMT are not just anyone, however. This is the all-world team, an elite club of laser-smart types who take a nerdy interest in stockpiling facts and pushing themselves hard.

In his doctoral study of 30 world-class performers (most from Germany, which has by far the highest concentration because there are more competitions), Mr. Konrad has found as much. The average I.Q.: 130. Average study time: 1,000 to 2,000 hours and counting. The top competitors all use some variation of the memory-palace system and test, retest and tweak it.

“I started with my own system, but now I use his,” said Annalena Fischer, 20, pointing to her boyfriend, Christian Schäfer, 22, whom she met at a 2010 memory competition in Germany. “Except I don’t use the distance runners he uses; I don’t know anything about the distance runners.” Both are advanced science students and participants in Mr. Konrad’s study.

One of the Washington University findings is predictable, if still preliminary: Memory athletes score very highly on tests of working memory, the mental sketchpad that serves as a shopping list of information we can hold in mind despite distractions.

One way to measure working memory is to have subjects solve a list of equations (5 + 4 = x; 8 + 9 = y; 7 + 2 = z; and so on) while keeping the middle numbers in mind (4, 9 and 2 in the above example). Elite memory athletes can usually store seven items, the top score on the test the researchers used; the average for college students is around two.
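A simplified, hypothetical sketch of how such an operation-span measure might be generated and scored — not the Washington University team’s actual protocol — looks something like this:

```python
# Simplified, hypothetical sketch of an operation-span style measure: verify a
# run of small equations while holding the "middle numbers" in mind, then
# score how many of those numbers are recalled in the correct order.
import random

def make_trial(n_items: int):
    """Generate n_items sums; the to-be-remembered items are the second operands."""
    equations, targets = [], []
    for _ in range(n_items):
        a, b = random.randint(1, 9), random.randint(1, 9)
        equations.append(f"{a} + {b} = {a + b}")
        targets.append(b)
    return equations, targets

def score(targets, recalled):
    """Count items recalled in the correct serial position."""
    return sum(1 for t, r in zip(targets, recalled) if t == r)

if __name__ == "__main__":
    equations, targets = make_trial(7)   # 7 is the ceiling of the test described above
    for eq in equations:
        print("verify:", eq)             # the subject checks each sum...
    recalled = targets[:5] + [0, 0]      # ...then reports the middle numbers in order;
                                         # here we fake a subject who holds five of them
    print("span score:", score(targets, recalled), "out of", len(targets))
```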

“And college students tend to be good at this task,” said Dr. Roediger, a co-author of the new book “Make It Stick: The Science of Successful Learning.” “What I’d like to do is extend the scoring up to, say, 21, just to see how far the memory athletes can go.”

Yet this finding raises another question: Why don’t the competitors’ memory palaces ever fill up? Players usually have many favored locations to store studied facts, but they practice and compete repeatedly. They use and reuse the same blueprints hundreds of times, and the new images seem to overwrite the old ones — virtually without error.

“Once you’ve remembered the words or cards or whatever it is, and reported them, they’re just gone,” Mr. Paterson said.

Many competitors say the same: Once any given competition is over, the numbers or words or facts are gone. But this is one area in which they have less than precise insight.

In its testing, which began last year, the Washington University team has given memory athletes surprise tests on “old” material — lists of words they’d been tested on the day before. On Day 2, they recalled an average of about three-quarters of the words they memorized on Day 1 (college students remembered fewer than 5 percent). That is, despite what competitors say, the material is not gone; far from it.

Yet to install a fresh image-laden “story” in any given memory palace, a memory athlete must clear away the old one in its entirety. The same process occurs when we change a password: The old one must be suppressed, so it doesn’t interfere with the new one.

One term for that skill is “attentional control,” and psychologists have been measuring it for years with standardized tests. In the best known, the Stroop test, people see words flash by on a computer screen and name the color in which a word is presented. Answering is nearly instantaneous when the color and the word match — “red” displayed in red — but slower when there’s a mismatch, like “red” displayed in blue.
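For illustration only, here is a bare-bones, console-based sketch of a Stroop-style trial loop with assumed details (the real standardized test is administered rather differently); it simply times responses on matching versus mismatching trials:

```python
# Bare-bones sketch of a Stroop-style task: name the ink colour of a colour
# word; responses are typically slower when the word and the ink disagree.
import random
import time

COLOURS = ["red", "blue", "green"]

def make_trials(n: int):
    """Alternate congruent (word == ink) and incongruent (word != ink) trials."""
    trials = []
    for i in range(n):
        word = random.choice(COLOURS)
        ink = word if i % 2 == 0 else random.choice([c for c in COLOURS if c != word])
        trials.append((word, ink))
    random.shuffle(trials)
    return trials

if __name__ == "__main__":
    times = {"congruent": [], "incongruent": []}
    for word, ink in make_trials(6):
        start = time.monotonic()
        answer = input(f'The word "{word.upper()}" is printed in {ink} ink. Ink colour? ')
        elapsed = time.monotonic() - start
        kind = "congruent" if word == ink else "incongruent"
        if answer.strip().lower() == ink:
            times[kind].append(elapsed)
    for kind, samples in times.items():
        if samples:
            print(f"mean {kind} response time: {sum(samples) / len(samples):.2f}s")
```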

Read the entire article here.

Life and Death: Sharing Startups

The great cycle of re-invention spawned by the Internet and mobile technologies continues apace. This time it’s the entrepreneurial businesses laying the foundations of the sharing economy — whether that be beds, rooms, clothes, tuition, bicycles or cars. A few succeed and become great new businesses; most fail.

From the WSJ:

A few high-profile “sharing-economy” startups are gaining quick traction with users, including those that let consumers rent apartments and homes like Airbnb Inc., or get car rides, such as Uber Technologies Inc.

Both Airbnb and Uber are valued in the billions of dollars, a sign that investors believe the segment is hot—and a big reason why more entrepreneurs are embracing the business model.

At MassChallenge, a Boston-based program to help early-stage entrepreneurs, about 9% of participants in 2013 were starting companies to connect consumers or businesses with products and services that would otherwise go unused. That compares with about 5% in 2010, for instance.

“We’re bullish on the sharing economy, and we’ll definitely make more investments in it,” said Sam Altman, president of Y Combinator, a startup accelerator in Mountain View, Calif., and one of Airbnb’s first investors.

Yet at least a few dozen sharing-economy startups have failed since 2012, including BlackJet, a Florida-based service that touted itself as the “Uber for jet travel,” and Tutorspree, a New York service dubbed the “Airbnb for tutors.” Most ran out of money, following struggles that ranged from difficulties building a critical mass of supply and demand, to higher-than-expected operating costs.

“We ended up being unable to consistently produce a level of demand on par with what we needed to scale rapidly,” said Aaron Harris, co-founder of Tutorspree, which launched in January 2011 and shuttered in August 2013.

“If you have to reacquire the customer every six months, they’ll forget you,” said Howard Morgan, co-founder of First Round Capital, which was an investor in BlackJet. “A private jet ride isn’t something you do every day. If you’re very wealthy, you have your own plane.” By comparison, he added that he recently used Uber’s ride-sharing service three times in one day.

Consider carpooling startup Ridejoy, for example. During its first year in 2011, its user base was growing by about 30% a month, with more than 25,000 riders and drivers signed up, and an estimated 10,000 rides completed, said Kalvin Wang, one of its three founders. But by the spring of 2013, Ridejoy, which had raised $1.3 million from early-stage investors like Freestyle Capital, was facing ferocious competition from free alternatives, such as carpooling forums on college websites.

Also, some riders could—and did—begin to sidestep the middleman. Many skipped paying its 10% transaction fee by handing their drivers cash instead of paying by credit card on Ridejoy’s website or mobile app. Others just didn’t get it, and even 25,000 users wasn’t sufficient to sustain the business. “You never really have enough inventory,” said Mr. Wang.

After it folded in the summer of 2013, Ridejoy returned about half of its funding to investors, according to Mr. Wang. Alexis Ohanian, an entrepreneur in Brooklyn, N.Y., who was an investor in Ridejoy, said it “could just be the timing or execution that was off.” He cited the success so far of Lyft Inc., the two-year-old San Francisco company that is valued at more than $700 million and offers a short-distance ride-sharing service. “It turned out the short rides are what the market really wanted,” Mr. Ohanian said.

One drawback is that because much of the revenue a sharing business generates goes directly back to the suppliers—of bedrooms, parking spots, vehicles or other “shared” assets—the underlying business may be continuously strapped for cash.

Read the entire article here.

The (Space) Explorers Club

Thirteen private companies recently met in New York City to present their plans and ideas for commercial space operations. With ambitions ranging from space tourism to private exploration of the Moon and asteroid mining, the companies gathered at the Explorers Club to herald a new phase of human exploration.

From Technology Review:

It was a rare meeting of minds. Representatives from 13 commercial space companies gathered on May 1 at a place dedicated to going where few have gone before: the Explorers Club in New York.

Amid the mansions and high-end apartment buildings just off Central Park, executives from space-tourism companies, rocket-making startups, and even a business that hopes to make money by mining asteroids for useful materials showed off displays and gave presentations.

The Explorers Club event provided a snapshot of what may be a new industry in the making. In an era when NASA no longer operates manned space missions and government funding for unmanned missions is tight, a host of startups—most funded by space enthusiasts with very deep pockets—have stepped up in hope of filling the gap. In the past few years, several have proved themselves. Elon Musk’s SpaceX, for example, delivers cargo to the International Space Station for NASA. Both Richard Branson’s Virgin Galactic and rocket-plane builder XCOR Aerospace plan to perform demonstrations this year that will help catapult commercial spaceflight from the fringe into the mainstream.

The advancements being made by space companies could matter to more than the few who can afford tickets to space. SpaceX has already shaken incumbents in the $190 billion satellite launch industry by offering cheaper rides into space for communications, mapping, and research satellites.

However, space tourism also looks set to become significantly cheaper. “People don’t have to actually go up for it to impact them,” says David Mindell, an MIT professor of aeronautics and astronautics and a specialist in the history of engineering. “At $200,000 you’ll have a lot more ‘space people’ running around, and over time that could have a big impact.” One direct result, says Mindell, may be increased public support for human spaceflight, especially “when everyone knows someone who’s been into space.”

Along with reporters, Explorer Club members, and members of the public who had paid the $75 to $150 entry fee, several former NASA astronauts were in attendance to lend their endorsements—including the MC for the evening, Michael López-Alegría, veteran of the space shuttle and the ISS. Also on hand, highlighting the changing times with his very presence, was the world’s first second-generation astronaut, Richard Garriott. Garriott’s father flew missions on Skylab and the space shuttle in the 1970s and 1980s, respectively. However, Garriott paid his own way to the International Space Station in 2008 as a private citizen.

The evening was a whirlwind of activity, with customer testimonials and rapid-fire displays of rocket launches, spacecraft in orbit, and space ships under construction and being tested. It all painted a picture of an industry on the move, with multiple companies offering services from suborbital experiences and research opportunities to flights to Earth orbit and beyond.

The event also offered a glimpse at the plans of several key players.

Lauren De Niro Pipher, head of astronaut relations at Virgin Galactic, revealed that the company’s founder plans to fly with his family aboard the Virgin Galactic SpaceShipTwo rocket plane in November or December of this year. The flight will launch the company’s suborbital spaceflight business, for which De Niro Pipher said more than 700 customers have so far put down deposits on tickets costing $200,000 to $250,000.

The director of business development for Blue Origin, Bretton Alexander, announced his company’s intention to begin test flights of its first full-scale vehicle within the next year. “We have not publicly started selling rides in space as others have,” said Alexander during his question-and-answer session. “But that is our plan to do that, and we look forward to doing that, hopefully soon.”

Blue Origin is perhaps the most secretive of the commercial spaceflight companies, typically revealing little of its progress toward the services it plans to offer: suborbital manned spaceflight and, later, orbital flight. Like Virgin, it was founded by a wealthy entrepreneur, in this case Amazon founder Jeff Bezos. The company, which is headquartered in Kent, Washington, has so far conducted at least one supersonic test flight and a test of its escape rocket system, both at its West Texas test center.

Also on hand was the head of Planetary Resources, Chris Lewicki, a former spacecraft engineer and manager for Mars programs at NASA. He showed off a prototype of his company’s Arkyd 100, an asteroid-hunting space telescope the size of a toaster oven. If all goes according to plan, a fleet of Arkyd 100s will first scan the skies from Earth orbit in search of nearby asteroids that might be rich in mineral wealth and water, to be visited by the next generation of Arkyd probes. Water is potentially valuable for future space-based enterprises as rocket fuel (split into its constituent elements of hydrogen and oxygen) and for use in life support systems. Planetary Resources plans to “launch early, launch often,” Lewicki told me after his presentation. To that end, the company is building a series of CubeSat-size spacecraft dubbed Arkyd 3s, to be launched from the International Space Station by the end of this year.

Andrew Antonio, experience manager at a relatively new company, World View Enterprises, showed a computer-generated video of his company’s planned balloon flights to the edge of space. A manned capsule will ascend to 100,000 feet, or about 20 miles up, from which the curvature of Earth and the black sky of space are visible. At $75,000 per ticket (reduced to $65,000 for Explorers Club members), the flight will be more affordable than competing rocket-powered suborbital experiences but won’t go as high. Antonio said his company plans to launch a small test vehicle “in about a month.”

XCOR’s director of payload sales and operations, Khaki Rodway, showed video clips of the company’s Lynx suborbital rocket plane coming together in Mojave, California, as well as a profile of an XCOR spaceflight customer. Hangared just down the flight line at the same air and space port where Virgin Galactic’s SpaceShipTwo is undergoing flight testing, the Lynx offers seating for one paying customer per flight at $95,000. XCOR hopes the Lynx will begin flying by the end of this year.

Read the entire article here.

Image: Still from the Clangers TV show. Courtesy of BBC / Smallfilms.

Intimate Anonymity

A new mobile app lets you share all your intimate details with a stranger for 20 days. The fascinating part of this social experiment is that the stranger remains anonymous throughout. The app known as 20 Day Stranger is brought to us by the venerable MIT Media Lab. It may never catch on, but you can be sure that psychologists are gleefully awaiting some data.

From Slate:

Social media is all about connecting with people you know, people you sort of know, or people you want to know. But what about all those people you didn’t know you wanted to know? They’re out there, too, and the new iPhone app 20 Day Stranger wants to put you in touch with them. Created by the MIT Media Lab’s Playful Systems research group, the app connects strangers and allows them to update each other about any and every detail of their lives for 20 days. But the people are totally anonymous and can interact directly only at the end of their 20 days together, when they can exchange one message each.

20 Day Stranger uses information from the iPhone’s sensors to alert your stranger-friend when you wake up (and start moving the phone), when you’re in a car or bus (from GPS tracking), and where you are. But it isn’t totally privacy-invading: The app also takes steps to keep both people anonymous. When it shows your stranger-friend that you’re walking around somewhere, it accompanies the notification with images from a half-mile radius of where you actually are on Google Maps. Your stranger-friend might be able to figure out what area you’re in, or they might not.
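
The location blurring described above is easy to picture in code. Below is a minimal Python sketch of the general idea: sample a random point within a half-mile of the true position before anything is shared. It is purely illustrative and is not the app’s actual code; the function name, radius handling and the uniform-disc sampling are my own assumptions.

import math
import random

def fuzz_location(lat, lon, radius_miles=0.5):
    # Return a random point within radius_miles of the true location,
    # so only the rough neighbourhood is revealed, not the exact spot.
    # Distance is drawn uniformly over the disc's area; bearing is random.
    distance = radius_miles * math.sqrt(random.random())
    bearing = random.uniform(0, 2 * math.pi)

    # Convert the offset to degrees; one degree of latitude is ~69 miles,
    # and a degree of longitude shrinks with the cosine of the latitude.
    miles_per_deg_lat = 69.0
    miles_per_deg_lon = 69.0 * math.cos(math.radians(lat))

    d_lat = (distance * math.cos(bearing)) / miles_per_deg_lat
    d_lon = (distance * math.sin(bearing)) / miles_per_deg_lon
    return lat + d_lat, lon + d_lon

# Example: blur a position in Cambridge, MA before sharing it.
print(fuzz_location(42.3601, -71.0942))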

Kevin Slavin, the director of Playful Systems, explained to Fast Company that the app’s goal is to introduce people online in a positive and empathetic way, rather than one that’s filled with suspicion or doubt. Though 20 Day Stranger is currently being beta tested, Playful Systems’ goal is to generally release it in the App Store. But the group is worried about getting people to adopt it all over instead of building up user bases in certain geographic areas. “There’s no one type of person that will make it useful,” Slavin said. “It’s the heterogeneous quality of everyone in aggregate. Which is a bad [promotional] strategy if you’re making commercial software.”

At this point it’s not that rare to interact frequently on social media with someone you’ve never met in person. What’s unusual is not to know their name or anything about who they are. But an honest window into another person’s life without the pressure of identity could expand your worldview and maybe even stimulate introspection. It sounds like a step up from Secret, that’s for sure.

Read the entire article here.

Measuring a Life

“I don’t see the point in measuring life in time any more… I would rather measure it in terms of what I actually achieve. I’d rather measure it in terms of making a difference, which I think is a much more valid and pragmatic measure.”

These are the inspiring and insightful words of 19-year-old Stephen Sutton, from Birmingham in Britain, spoken about a week before he died of bowel cancer. His upbeat attitude and selflessness during his final days captured the hearts and minds of the nation, and he raised around $5.5 million for cancer charities in the process.

From the Guardian:

Few scenarios can seem as cruel or as bleak as a 19-year-old boy dying of cancer. And yet, in the case of Stephen Sutton, who died peacefully in his sleep in the early hours of Wednesday morning, it became an inspiring, uplifting tale for millions of people.

Sutton was already something of a local hero in Birmingham, where he was being treated, but it was an extraordinary Facebook update in April that catapulted him into the national spotlight.

“It’s a final thumbs up from me,” he wrote, accompanied by a selfie of him lying in a sickbed, covered in drips, smiling cheerfully with his thumbs in the air. “I’ve done well to blag things as well as I have up till now, but unfortunately I think this is just one hurdle too far.”

It was an extraordinary moment: many would have forgiven him being full of rage and misery. And yet here was a simple, understated display of cheerful defiance.

Sutton had originally set a fundraising target of £10,000 for the Teenage Cancer Trust. But the emotional impact of that selfie was so profound that, in a matter of days, more than £3m was donated.

He made a temporary recovery that baffled doctors; he explained that he had “coughed up” a tumour. And so began an extraordinary dialogue with his well-wishers.

To his astonishment, nearly a million people liked his Facebook page and tens of thousands followed him on Twitter. It is fashionable to be downbeat about social media: to dismiss it as being riddled with the banal and the narcissistic, or for stripping human interaction of warmth as conversations shift away from the “real world” to the online sphere.

But it was difficult not to be moved by the online response to Stephen’s story: a national wave of emotion that is not normally forthcoming for those outside the world of celebrity.

His social-media updates were relentlessly upbeat, putting those of us who have tweeted moaning about a cold to shame. “Just another update to let everyone know I am still doing and feeling very well,” he reassured followers less than a week before his death. “My disease is very advanced and will get me eventually, but I will try my damn hardest to be here as long as possible.”

Sutton was diagnosed with bowel cancer in September 2010 when he was 15; tragically, he had been misdiagnosed and treated for constipation months earlier.

But his response was unabashed positivity from the very beginning, even describing his diagnosis as a “good thing” and a “kick up the backside”.

The day he began chemotherapy, he attended a party dressed as a granny – he was so thin and pale, he said, that he was “quite convincing”. He refused to take time off school, where he excelled.

When he was diagnosed as terminally ill two years later, he set up a Facebook page with a bucket list of things he wanted to achieve, including sky-diving, crowd-surfing in a rubber dinghy, and hugging an animal bigger than him (an elephant, it turned out).

But it was his fundraising for cancer research that became his passion, and his efforts will undoubtedly transform the lives of some of the 2,200 teenagers and young adults diagnosed with cancer each year.

The Teenage Cancer Trust on Wednesday said it was humbled and hugely grateful for his efforts, with donations still ticking up and reaching £3.34m by mid-afternoon.

His dream had been to become a doctor. With that ambition taken from him, he sought and found new ways to help people. “Spreading positivity” was another key aim. Four days ago, he organised a National Good Gestures Day, in Birmingham, giving out “free high-fives, hugs, handshakes and fist bumps”.

Indeed, it was not just money for cancer research that Sutton was after. He became an evangelist for a new approach to life.

“I don’t see the point in measuring life in time any more,” he told one crowd. “I would rather measure it in terms of what I actually achieve. I’d rather measure it in terms of making a difference, which I think is a much more valid and pragmatic measure.”

By such a measure, Sutton could scarcely have lived a longer, richer and more fulfilling life.

Read the entire story here.

Image: Stephen Sutton. Courtesy of Google Search.

Thwaites

Over the coming years the words “Thwaites Glacier” will become familiar to many people, especially those who make their home near the world’s oceans. The thawing of Antarctic ice and the accelerating melting of its glaciers, of which Thwaites is a prime example, threaten coastal communities first, but ultimately imperil us all.

Thwaites is one of six mega-glaciers that drain into West Antarctica’s Amundsen Sea. If they were to melt completely, as they now appear set to do, global sea level would rise by roughly 4½ feet. Astonishingly, this catastrophe in the making has already passed a tipping point: climatologists and glaciologists now broadly agree that the melting is irreversible and accelerating.

From ars technica:

Today, researchers at UC Irvine and the Jet Propulsion Laboratory have announced results indicating that glaciers across a large area of West Antarctica have been destabilized and that there is little that will stop their continuing retreat. These glaciers are all that stand between the ocean and a massive basin of ice that sits below sea level. Should the sea invade this basin, we’d be committed to several meters of sea level rise.

Even in the short term, the new findings should increase our estimates for sea level rise by the end of the century, the scientists suggest. But the ongoing process of retreat and destabilization will mean that the area will contribute to rising oceans for centuries.

The press conference announcing these results is ongoing. We will have a significant update on this story later today.

UPDATE (2:05pm CDT):

The glaciers in question are in West Antarctica, and drain into the Amundsen Sea. On the coastal side, the ends of the glacier are actually floating on ocean water. Closer to the coast, there’s what’s called a “grounding line,” where the weight of the ice above sea level pushes the bottom of the glacier down against the sea bed. From there on, back to the interior of Antarctica, all of the ice is directly in contact with the Earth.

That’s a rather significant fact, given that, just behind a range of coastal hills, all of the ice is sitting in a huge basin that’s significantly below sea level. In total, the basin contains enough ice to raise sea levels approximately four meters, largely because the ice piled in there rises significantly above sea level.
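
As a rough check on that four-metre figure, the standard conversion between melted ice mass and global mean sea level is worth keeping in mind. The arithmetic below is my own back-of-envelope estimate, not the article’s; it takes the ocean’s surface area as roughly 3.6 × 10^14 m² and ignores second-order effects such as the density difference between fresh water and seawater.

\[
\Delta h \;\approx\; \frac{M_{\text{ice}}}{\rho_{\text{water}}\, A_{\text{ocean}}},
\qquad
\Delta h = 4\ \text{m} \;\Rightarrow\;
M_{\text{ice}} \;\approx\; 1000\ \tfrac{\text{kg}}{\text{m}^3} \times 3.6\times10^{14}\ \text{m}^2 \times 4\ \text{m}
\;\approx\; 1.4\times10^{18}\ \text{kg} \;\approx\; 1.4\times10^{6}\ \text{Gt}.
\]

Equivalently, about 360 gigatonnes of melted ice raises global sea level by roughly one millimetre.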

Because of this configuration, the grounding lines of the glaciers that drain this basin act as a protective barrier, keeping the sea back from the base of the deeper basin. Once ocean waters start infiltrating the base of a glacier, the glacier melts, flows faster, and thins. This lessens the weight holding the glacier down, ultimately causing it to float, which hastens its break up. Since the entire basin is below sea level (in some areas by over a kilometer), water entering the basin via any of the glaciers could destabilize the entire thing.

Thus, understanding the dynamics of the grounding lines is critical. Today’s announcements have been driven by two publications. One of them models the behavior of one of these glaciers, and shows that it has likely reached a point where it will be prone to a sudden retreat sometime in the next few centuries. The second examines every glacier draining this basin, and shows that all but one of them are currently losing contact with their grounding lines.

Ungrounded

The data come from two decades of observations by the ESA’s Earth Remote Sensing satellites. These include radar measurements that perform two key functions: they peer through the ice to map the terrain buried beneath it near the grounding line, and, through interferometry, they track the dynamics of the ice sheet’s flow in the area, as well as its thinning and the location of the grounding line itself. The study tracks a number of glaciers that all drain into the region: Pine Island, Thwaites, Haynes, and Smith/Kohler.

As we’ve covered previously, the Pine Island Glacier came ungrounded in the second half of the past decade, retreating up to 31km in the process. Although this was the one that made headlines, all the glaciers in the area are in retreat. Thwaites saw areas retreat up to 14km over the course of the study, Haynes retreated by 10km, and the Smith/Kohler glaciers retreated by 35km.

The retreat was accompanied by thinning of the glaciers, as ice that had been held back above sea level in the interior spread forward and thinned out. This contributed to sea level rise, and the speakers at the press conference agreed that the new data shows that the recently released IPCC estimates for sea level rise are out of date; even by the end of this century, the continuation of this process will significantly increase the rate of sea level rise we can expect.

The real problem, however, comes later. Glaciers can establish new grounding lines if there’s a feature in the terrain, such as a hill that rises above sea level, that provides a new anchoring point. The authors see none: “Upstream of the 2011 grounding line positions, we find no major bed obstacle that would prevent the glaciers from further retreat and draw down the entire basin.” In fact, several of the existing grounding lines are close to points where the terrain begins to slope downward into the basin.

For some of the glaciers, the problems are already starting. At Pine Island, the bottom of the glacier is now sitting on terrain that’s 400 meters deeper than where the end rested in 1992, and there are no major hills between there and the basin. As for the Smith/Kohler glaciers, the grounding line is 800 meters deeper and “its ice shelf pinning points are vanishing.”

What’s next?

As a result, the authors concluded that these glaciers are essentially destabilized—unless something changes radically, they’re destined for retreat into the indefinite future. But what will the trajectory of that retreat look like? In this case, the data doesn’t directly help. It needs to be fed into a model that projects the current melting into the future. Conveniently, a different set of scientists has already done this modeling.

The work focuses on the Thwaites glacier, which appears to be the most stable: there are 60-80km between the existing terminus and the deep basin, and two or three ridges within that distance that will allow the formation of new grounding lines.

The authors simulated the behavior of Thwaites using a number of different melting rates. These ranged from a low that approximated the behavior typical in the early 90s, to a high rate of melt that is similar to what was observed in recent years. Every single one of these situations saw the Thwaites retreat into the deep basin within the next 1,000 years. In the higher melt scenarios—the ones most reflective of current conditions—this typically took only a few centuries.

The other worrisome behavior is that there appeared to be a tipping point. In every simulation that saw an extensive retreat, rates of melting shifted from under 80 gigatonnes of ice per year to 150 gigatonnes or more, all within the span of a couple of decades. In the later conditions, this glacier alone contributed half a centimeter to sea level rise—every year.

Read the entire article here.

Image: Thwaites Glacier, Antarctica, 2012. Courtesy of NASA Earth Observatory.

DarwinTunes

Researchers at Imperial College London recently posed an intriguing question and have since developed a clever experiment to test it. Does artistic endeavor, such as music, follow the same principles of evolutionary selection that Darwin described in biology? That is, does the funkiest survive? One has to wonder, though, what the eminent scientist would have made of some recent fusions of rap, dubstep and classical.

From the Guardian:

There were some funky beats at Imperial College London on Saturday at its annual science festival. As well as opportunities to create bogeys, see robots dance and try to get physics PhD students to explain their wacky world, this fascinating event included the chance to participate in a public game-like experiment called DarwinTunes.

Participants select tunes and “mate” them with other tunes to create musical offspring: if the offspring are in turn selected by other players, they “survive” and get the chance to reproduce their musical DNA. The experiment is online – you too can try to immortalise your selfish musical genes.
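
The mechanics described above are essentially those of a genetic algorithm, with human listeners supplying the fitness function. The Python sketch below is a toy illustration of that selection, crossover and mutation loop, not the DarwinTunes implementation; the genome encoding, the stand-in rating function and all parameters are assumptions made purely for illustration.

import random

GENOME_LEN = 16   # abstract "loop parameters" per tune
POP_SIZE = 20
GENERATIONS = 50

def random_genome():
    return [random.random() for _ in range(GENOME_LEN)]

def listener_rating(genome):
    # Stand-in for human ratings: here "funkier" just means parameter
    # values near 0.5. In DarwinTunes the fitness comes from players' choices.
    return -sum(abs(g - 0.5) for g in genome)

def crossover(a, b):
    # One-point crossover: the child takes a prefix from one parent
    # and the remainder from the other.
    cut = random.randint(1, GENOME_LEN - 1)
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.05):
    return [random.random() if random.random() < rate else g for g in genome]

population = [random_genome() for _ in range(POP_SIZE)]
for generation in range(GENERATIONS):
    # "Survival of the funkiest": keep the better-rated half ...
    population.sort(key=listener_rating, reverse=True)
    survivors = population[:POP_SIZE // 2]
    # ... and let surviving tunes "mate" to refill the population.
    children = [mutate(crossover(*random.sample(survivors, 2)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children

population.sort(key=listener_rating, reverse=True)
print("best rating after evolution:", round(listener_rating(population[0]), 3))

In the real experiment, of course, the rating step is replaced by thousands of listeners choosing which loops to keep, which is precisely what makes the result a test of cultural, rather than simulated, selection.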

It is a model of evolution in practice that raises fascinating questions about culture and nature. These questions apply to all the arts, not just to dance beats. How does “cultural evolution” work? How close is the analogy between Darwin’s well-proven theory of evolution in nature and the evolution of art, literature and music?

The idea of cultural evolution was boldly defined by Jacob Bronowski as our fundamental human ability “not to accept the environment but to change it”. The moment the first stone tools appeared in Africa, about 2.5m years ago, a new, faster evolution, that of human culture, became visible on Earth: from cave paintings to the Renaissance, from Galileo to the 3D printer, this cultural evolution has advanced at breathtaking speed compared with the massive periods of time it takes nature to evolve new forms.

In DarwinTunes, cultural evolution is modelled as what the experimenters call “the survival of the funkiest”. Pulsing dance beats evolve through selections made by participants, and the music (it is claimed) becomes richer through this process of selection. Yet how does the model really correspond to the story of culture?

One way Darwin’s laws of nature apply to visual art is in the need for every successful form to adapt to its environment. In the forests of west and central Africa, wood carving was until recent times a flourishing art form. In the islands of Greece, where marble could be quarried easily, stone sculpture was more popular. In the modern technological world, the things that easily come to hand are not wood or stone but manufactured products and media images – so artists are inclined to work with the readymade.

At first sight, the thesis of DarwinTunes is a bit crude. Surely it is obvious that artists don’t just obey the selections made by their audience – that is, their consumers. To think they do is to apply the economic laws of our own consumer society across all history. Culture is a lot funkier than that.

Yet just because the laws of evolution need some adjustment to encompass art, that does not mean art is a mysterious spiritual realm impervious to scientific study. In fact, the evolution of evolution – the adjustments made by researchers to Darwin’s theory since it was unveiled in the Victorian age – offers interesting ways to understand culture.

One useful analogy between art and nature is the idea of punctuated equilibrium, introduced by some evolutionary scientists in the 1970s. Just as species may evolve not through a constant smooth process but by spectacular occasional leaps, so the history of art is punctuated by massively innovative eras followed by slower, more conventional periods.

Read the entire story here.

Image: Charles Darwin, 1868, photographed by Julia Margaret Cameron. Courtesy of Wikipedia.

Plastic, Heal Thyself!

Blood is a remarkable substance: it transports vital oxygen to nourish our cells; it carries signalling chemicals that control our actions; and it delivers armies of substances, at a moment’s notice, to ward off infection and injury. Now imagine a similar, biomimetic process in plastic, one that allows the material to heal itself.

From New Scientist:

If you prick it, does it not bleed? Puncture this plastic and it will heal itself with oozing fluids, in a process that mimics the way blood clots form to repair wounds. The plastic could one day be used to automatically patch holes in distant spacecraft or repair fighter jets on the fly.

So far, efforts to develop materials that fix themselves the way biological tissue mends itself have been limited. Scott White at the University of Illinois at Urbana-Champaign and his colleagues developed one of the first versions in 2001, but that material could only heal microscopic cracks.

Now his team have created a plastic lined with a type of artificial vascular system that can heal damage large enough to be visible to the naked eye.

The key is a pair of liquids that react when they are mixed. One fluid contains long, thin molecules and the other contains three-sided molecules. When the fluids mix, the molecules join together to create a scaffold, similar to the way blood platelets and fibrin proteins join to form a clot.

After a few minutes of contact, the liquids turn into a thick gel that fills the damaged area. Over a few hours, other ingredients within the fluids cause the gel to harden.

Strength from weakness

To test the concept, the team ran separate channels of each liquid through a plastic square and punctured it, creating a 4-millimetre hole with 35 millimetres of surrounding cracks. This also tore open the fluid channels.

Pumps on the edge of the plastic square squirted the fluids into the channels, where they oozed out and mixed, filling the hole and the radiating cracks within 20 minutes. The material hardened in about 3 hours, and the resulting patch was around 60 per cent as strong as the original plastic.

Holes larger than 8 millimetres proved more difficult to fill, as gravity caused the gel to sag before it could harden. The team thinks using foams in place of fluids would fill larger gaps, but they haven’t tested that idea yet.

Eventually, White and his team envision plastics with multiple criss-crossing channels, to ensure that the fluids always overlap with a damaged area. Embedding this synthetic vascular network would weaken the original material, but not by much, they say.

“You pay the price for being able to repair this damage, but it is certainly one that nature has figured out how to tolerate,” says team member Jeff Moore, also at the University of Illinois. “If you just look to things like bone or trees, they are all vascularised.”

Read the entire article here.

Image: Self-healing materials fix large-scale damage. Courtesy of University of Illinois at Urbana-Champaign.

The Rise of McLiterature

A sad symptom of our expanding media binge culture and the fragmentation of our shortening attention spans is the demise of literary fiction. Author Will Self believes the novel, and narrative prose in general, is on a slow but accelerating death spiral. His eloquent views, presented in a May 6, 2014 lecture, are excerpted below.

From the Guardian:

If you happen to be a writer, one of the great benisons of having children is that your personal culture-mine is equipped with its own canaries. As you tunnel on relentlessly into the future, these little harbingers either choke on the noxious gases released by the extraction of decadence, or they thrive in the clean air of what we might call progress. A few months ago, one of my canaries, who’s in his mid-teens and harbours a laudable ambition to be the world’s greatest ever rock musician, was messing about on his electric guitar. Breaking off from a particularly jagged and angry riff, he launched into an equally jagged diatribe, the gist of which was already familiar to me: everything in popular music had been done before, and usually those who’d done it first had done it best. Besides, the instant availability of almost everything that had ever been done stifled his creativity, and made him feel it was all hopeless.

A miner, if he has any sense, treats his canary well, so I began gently remonstrating with him. Yes, I said, it’s true that the web and the internet have created a permanent Now, eliminating our sense of musical eras; it’s also the case that the queered demographics of our longer-living, lower-birthing population means that the middle-aged squat on top of the pyramid of endeavour, crushing the young with our nostalgic tastes. What’s more, the decimation of the revenue streams once generated by analogues of recorded music has put paid to many a musician’s income. But my canary had to appreciate this: if you took the long view, the advent of the 78rpm shellac disc had also been a disaster for musicians who in the teens and 20s of the last century made their daily bread by live performance. I repeated one of my favourite anecdotes: when the first wax cylinder recording of Feodor Chaliapin singing “The Song of the Volga Boatmen” was played, its listeners, despite a lowness of fidelity that would seem laughable to us (imagine a man holding forth from a giant bowl of snapping, crackling and popping Rice Krispies), were nonetheless convinced the portly Russian must be in the room, and searched behind drapes and underneath chaise longues for him.

So recorded sound blew away the nimbus of authenticity surrounding live performers – but it did worse things. My canaries have often heard me tell how back in the 1970s heyday of the pop charts, all you needed was a writing credit on some loathsome chirpy-chirpy-cheep-cheeping ditty in order to spend the rest of your born days lying by a guitar-shaped pool in the Hollywood Hills hoovering up cocaine. Surely if there’s one thing we have to be grateful for it’s that the web has put paid to such an egregious financial multiplier being applied to raw talentlessness. Put paid to it, and also returned musicians to the domain of live performance and, arguably, reinvigorated musicianship in the process. Anyway, I was saying all of this to my canary when I was suddenly overtaken by a great wave of noxiousness only I could smell. I faltered, I fell silent, then I said: sod you and your creative anxieties, what about me? How do you think it feels to have dedicated your entire adult life to an art form only to see the bloody thing dying before your eyes?

My canary is a perceptive songbird – he immediately ceased his own cheeping, except to chirrup: I see what you mean. The literary novel as an art work and a narrative art form central to our culture is indeed dying before our eyes. Let me refine my terms: I do not mean narrative prose fiction tout court is dying – the kidult boywizardsroman and the soft sadomasochistic porn fantasy are clearly in rude good health. And nor do I mean that serious novels will either cease to be written or read. But what is already no longer the case is the situation that obtained when I was a young man. In the early 1980s, and I would argue throughout the second half of the last century, the literary novel was perceived to be the prince of art forms, the cultural capstone and the apogee of creative endeavour. The capability words have when arranged sequentially to both mimic the free flow of human thought and investigate the physical expressions and interactions of thinking subjects; the way they may be shaped into a believable simulacrum of either the commonsensical world, or any number of invented ones; and the capability of the extended prose form itself, which, unlike any other art form, is able to enact self-analysis, to describe other aesthetic modes and even mimic them. All this led to a general acknowledgment: the novel was the true Wagnerian Gesamtkunstwerk.

This is not to say that everyone walked the streets with their head buried in Ulysses or To the Lighthouse, or that popular culture in all its forms didn’t hold sway over the psyches and imaginations of the great majority. Nor do I mean to suggest that in our culture perennial John Bull-headed philistinism wasn’t alive and snorting: “I don’t know much about art, but I know what I like.” However, what didn’t obtain is the current dispensation, wherein those who reject the high arts feel not merely entitled to their opinion, but wholly justified in it. It goes further: the hallmark of our contemporary culture is an active resistance to difficulty in all its aesthetic manifestations, accompanied by a sense of grievance that conflates it with political elitism. Indeed, it’s arguable that tilting at this papery windmill of artistic superiority actively prevents a great many people from confronting the very real economic inequality and political disenfranchisement they’re subject to, exactly as being compelled to chant the mantra “choice” drowns out the harsh background Muzak telling them they have none.

Just because you’re paranoid it doesn’t mean they aren’t out to get you. Simply because you’ve remarked a number of times on the concealed fox gnawing its way into your vitals, it doesn’t mean it hasn’t at this moment swallowed your gall bladder. Ours is an age in which omnipresent threats of imminent extinction are also part of the background noise – nuclear annihilation, terrorism, climate change. So we can be blinkered when it comes to tectonic cultural shifts. The omnipresent and deadly threat to the novel has been imminent now for a long time – getting on, I would say, for a century – and so it’s become part of culture. During that century, more books of all kinds have been printed and read by far than in the entire preceding half millennium since the invention of movable-type printing. If this was death it had a weird, pullulating way of expressing itself. The saying is that there are no second acts in American lives; the novel, I think, has led a very American sort of life: swaggering, confident, brash even – and ever aware of its world-conquering manifest destiny. But unlike Ernest Hemingway or F Scott Fitzgerald, the novel has also had a second life. The form should have been laid to rest at about the time of Finnegans Wake, but in fact it has continued to stalk the corridors of our minds for a further three-quarters of a century. Many fine novels have been written during this period, but I would contend that these were, taking the long view, zombie novels, instances of an undead art form that yet wouldn’t lie down.

Literary critics – themselves a dying breed, a cause for considerable schadenfreude on the part of novelists – make all sorts of mistakes, but some of the most egregious ones result from an inability to think outside of the papery prison within which they conduct their lives’ work. They consider the codex. They are – in Marshall McLuhan’s memorable phrase – the possessors of Gutenberg minds.

There is now an almost ceaseless murmuring about the future of narrative prose. Most of it is at once Panglossian and melioristic: yes, experts assert, there’s no disputing the impact of digitised text on the whole culture of the codex; fewer paper books are being sold, newspapers fold, bookshops continue to close, libraries as well. But … but, well, there’s still no substitute for the experience of close reading as we’ve come to understand and appreciate it – the capacity to imagine entire worlds from parsing a few lines of text; the ability to achieve deep and meditative levels of absorption in others’ psyches. This circling of the wagons comes with a number of public-spirited campaigns: children are given free books; book bags are distributed with slogans on them urging readers to put books in them; books are hymned for their physical attributes – their heft, their appearance, their smell – as if they were the bodily correlates of all those Gutenberg minds, which, of  course, they are.

The seeming realists among the Gutenbergers say such things as: well, clearly, books are going to become a minority technology, but the beau livre will survive. The populist Gutenbergers prate on about how digital texts linked to social media will allow readers to take part in a public conversation. What none of the Gutenbergers are able to countenance, because it is quite literally – for once the intensifier is justified – out of their minds, is that the advent of digital media is not simply destructive of the codex, but of the Gutenberg mind itself. There is one question alone that you must ask yourself in order to establish whether the serious novel will still retain cultural primacy and centrality in another 20 years. This is the question: if you accept that by then the vast majority of text will be read in digital form on devices linked to the web, do you also believe that those readers will voluntarily choose to disable that connectivity? If your answer to this is no, then the death of the novel is sealed out of your own mouth.

Read the entire excerpt here.

Image: Will Self, 2007. Courtesy of Wikipedia / Creative Commons.

Expanding Binge Culture

The framers of the U.S. Declaration of Independence could not have known. They could not have foreseen how commoditization, consumerism, globalization and always-on media would come to transform our society. They did well to insert “Life, Liberty and the pursuit of Happiness”.

But they failed to consider our collective evolution, if one can call it that, towards a sophisticated culture of binge. Significant numbers of us have long binged on physical goods, money, natural resources, food and drink. Media, however, had lagged somewhat. No longer: we now have at our instantaneous whim entire libraries of all-you-can-eat infotainment. Time will tell whether this signals the demise of quality, replaced by overwhelming quantity. One area already shows where we may be heading: witness the “fastfoodification” of our news.

From NYT:

When Beyoncé released, without warning, 17 videos around midnight on Dec. 13, millions of fans rejoiced. As a more casual listener of Ms. Knowles, I balked at the onslaught of new material and watched a few videos before throwing in the towel.

Likewise, when Netflix, in one fell swoop, made complete seasons of “House of Cards” and “Orange Is the New Black” available for streaming, I quailed at the challenge, though countless others happily immersed themselves in their worlds of Washington intrigue and incarcerated women.

Then there is the news, to which floodgates are now fully open thanks to the Internet and cable TV: Flight 370, Putin, Chris Christie, Edward Snowden, Rob Ford, Obamacare, “Duck Dynasty,” “bossy,” #CancelColbert, conscious uncoupling. When presented with 24/7 coverage of these ongoing narratives from an assortment of channels — traditional journalism sites, my Facebook feed, the log-out screen of my email — I followed some closely and very consciously uncoupled from others.

Had these content providers released their offerings in the old-media landscape, à la carte rather than in an all-you-can-eat buffet, the prospect of a seven-course meal might not have seemed so daunting. I could handle a steady drip of one article a day about Mr. Ford in a newspaper. But after two dozen, updated every 10 minutes, plus scores of tweets, videos and GIFs that keep on giving, I wanted to forget altogether about Toronto’s embattled mayor.

While media technology is now catching up to Americans’ penchant for overdoing it and finding plenty of willing indulgers, there are also those like me who recoil from the abundance of binge culture.

In the last decade, media entertainment has given far more freedom to consumers: watch, listen to and read anything at anytime. But Barry Schwartz’s 2004 book, “The Paradox of Choice,” argues that our surfeit of consumer choices engenders anxiety, not satisfaction, and sometimes even a kind of paralysis.

His thesis (which has its dissenters) applies mostly to the profusion of options within a single set: for instance, the challenge of picking out salad dressing from 175 varieties in a supermarket. Nevertheless, it is also germane to the concept of bingeing, when 62 episodes of “Breaking Bad” wait overwhelmingly in a row like bottles of Newman’s Own on a shelf.

Alex Quinlan, 31, a first-year Ph.D. student in poetry at Florida State University, said he used to spend at least an hour every morning reading the news and “putting off my responsibilities,” as well as binge-watching shows. He is busier now, and last fall had trouble installing an Internet connection in his home, which effectively “rewired my media-consumption habits,” he said. “I’m a lot more disciplined. Last night I watched one episode of ‘House of Cards’ and went to bed. A year ago, I probably would’ve watched one, gotten another beer, then watched two more.”

Even shorter-term bingeing can seem like a major commitment, because there is a distorting effect of receiving a large chunk of content at once rather than getting it piecemeal. To watch one Beyoncé video a week would eat as much time as watching all in one day, but their unified dissemination makes them seem intimidatingly movie-length (which they are, approximately) rather than like a series of four-minute clips.

I also experienced some first-world anxiety last year with the release of the fourth season of “Arrested Development.” I had devoured the show’s first three seasons, parceled out in 22-minute weekly installments on Fox as well as on DVD, where I would watch episodes I had already seen (in pre-streaming days, binge-watching required renting or owning a copy, which was more like a contained feast). But when Netflix uploaded 15 new episodes totaling 8.5 hours on May 26, I was not among those queuing up for it. It took me some time to get around to the show, and once I had started, the knowledge of how many episodes stretched in front of me, at my disposal whenever I wanted, proved off-putting.

This despite the keeping-up-with-the-Joneses quality to binge-viewing. If everyone is quickly exhausting every new episode of a show, and writing and talking about it the next day, it’s easy to feel left out of the conversation if you haven’t kept pace. And sometimes when you’re late to the party, you decide to stay home instead.

Because we frequently gorge when left to our own Wi-Fi-enabled devices, the antiquated methods of “scheduling our information consumption” may have been healthier, if less convenient, said Clay Johnson, 36, the author of “The Information Diet.” He recalled rushing home after choir practice when he was younger to catch “Northern Exposure” on TV.

“That idea is now preposterous,” he said. “We don’t have appointment television anymore. Just because we can watch something all the time doesn’t mean we should. Maybe we should schedule it in a way that makes sense around our daily lives.”

“It’s a lot like food,” he added. “You see some people become info-anorexic, who say the answer is to unplug and not consume anything. Much like an eating disorder, it’s just as unhealthy a decision as binge-watching the news and media. There’s a middle ground of people who are saying, ‘I need to start treating this form of input in my life like a conscious decision and to be informed in the right way.’ ”

Read the entire story here.