MondayMap: Our New Address — Laniakea


Once upon a time we humans sat smugly at the center of the universe. Now, many of us (though not yet all) know better. Over the last several centuries we learned and accepted that the Earth spun around the nearest star, and not the converse. We then learned that the Sun formed part of an immense galaxy, the Milky Way, itself spinning in a vast cosmological dance. More recently, we learned that the Milky Way formed part of a larger grouping of galaxies, known as the Local Group.

Now we find that our Local Group is a mere speck within an immense supercluster containing around 100,000 galaxies spanning half a billion light years. Researchers have dubbed this galactic supercluster, rather aptly, Laniakea, Hawaiian for “immense heaven”. Laniakea is your new address. And, fascinatingly, Laniakea is moving towards an even larger grouping of galaxies named the Shapley supercluster.

From the Guardian:

In what amounts to a back-to-school gift for pupils with nerdier leanings, researchers have added a fresh line to the cosmic address of humanity. No longer will a standard home address followed by “the Earth, the solar system, the Milky Way, the universe” suffice for aficionados of the extended astronomical location system.

The extra line places the Milky Way in a vast network of neighbouring galaxies or “supercluster” that forms a spectacular web of stars and planets stretching across 520m light years of our local patch of universe. Named Laniakea, meaning “immeasurable heaven” in Hawaiian, the supercluster contains 100,000 large galaxies that together have the mass of 100 million billion suns.

Our home galaxy, the Milky Way, lies on the far outskirts of Laniakea near the border with another supercluster of galaxies named Perseus-Pisces. “When you look at it in three dimensions, it looks like a sphere that’s been badly beaten up and we are over near the edge, being pulled towards the centre,” said Brent Tully, an astronomer at the University of Hawaii in Honolulu.

Astronomers have long known that just as the solar system is part of the Milky Way, so the Milky Way belongs to a cosmic structure that is much larger still. But their attempts to define the larger structure had been thwarted because it was impossible to work out where one cluster of galaxies ended and another began.

Tully’s team gathered measurements on the positions and movement of more than 8,000 galaxies and, after discounting the expansion of the universe, worked out which were being pulled towards us and which were being pulled away. This allowed the scientists to define superclusters of galaxies that all moved in the same direction.
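The heart of that method is a simple subtraction: take each galaxy’s observed velocity and remove the part due to uniform cosmic expansion (the Hubble flow, the Hubble constant times distance); whatever is left is the “peculiar” velocity caused by gravitational pulls. Here is a minimal Python sketch of that calculation, with invented numbers and an approximate Hubble constant, not the survey’s actual data:

```python
# Toy version of the subtraction behind the Laniakea analysis: remove the
# uniform Hubble-flow velocity (H0 * distance) from each galaxy's observed
# velocity; the residual is its "peculiar" motion driven by gravity.
# All numbers below are invented for illustration.

H0 = 70.0  # Hubble constant in km/s per megaparsec (approximate)

# (name, distance in Mpc, observed recession velocity in km/s)
galaxies = [
    ("A", 10.0, 650.0),
    ("B", 50.0, 3600.0),
    ("C", 120.0, 8300.0),
]

for name, dist_mpc, v_observed in galaxies:
    v_hubble = H0 * dist_mpc            # expansion of the universe alone
    v_peculiar = v_observed - v_hubble  # leftover gravitational motion
    pulled = "towards us" if v_peculiar < 0 else "away from us"
    print(f"galaxy {name}: {v_peculiar:+.0f} km/s ({pulled})")
```

Grouping thousands of galaxies by where these residual velocities converge is what defines the basin of attraction the team named Laniakea.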

The work published in Nature gives astronomers their first look at the vast group of galaxies to which the Milky Way belongs. A narrow arch of galaxies connects Laniakea to the neighbouring Perseus-Pisces supercluster, while two other superclusters called Shapley and Coma lie on the far side of our own.

Tully said the research will help scientists understand why the Milky Way is hurtling through space at 600km a second towards the constellation of Centaurus. Part of the reason is the gravitational pull of other galaxies in our supercluster.

“But our whole supercluster is being pulled in the direction of this other supercluster, Shapley, though it remains to be seen if that’s all that’s going on,” said Tully.

Read the entire article here or the nerdier paper here.

Image: Laniakea: Our Home Supercluster of Galaxies. The blue dot represents the location of the Milky Way. Courtesy: R. Brent Tully (U. Hawaii) et al., SDvision, DP, CEA/Saclay.

Theism Versus Spirituality

Prominent neo-atheist Sam Harris continues to reject theism, and does so thoughtfully and eloquently. In his latest book, Waking Up, he continues to argue the case against religion, but makes a powerful case for spirituality. Harris defines spirituality as an inner sense of a good and powerful reality, based on sound self-awareness and insightful questioning of one’s own consciousness. This type of spirituality, quite rightly, is devoid of theistic angels and demons. Harris reveals more in his interview with Gary Gutting, professor of philosophy at the University of Notre Dame.

From the NYT:

Sam Harris is a neuroscientist and prominent “new atheist,” who along with others like Richard Dawkins, Daniel Dennett and Christopher Hitchens helped put criticism of religion at the forefront of public debate in recent years. In two previous books, “The End of Faith” and “Letter to a Christian Nation,” Harris argued that theistic religion has no place in a world of science. In his latest book, “Waking Up,” his thought takes a new direction. While still rejecting theism, Harris nonetheless makes a case for the value of “spirituality,” which he bases on his experiences in meditation. I interviewed him recently about the book and some of the arguments he makes in it.

Gary Gutting: A common basis for atheism is naturalism — the view that only science can give a reliable account of what’s in the world. But in “Waking Up” you say that consciousness resists scientific description, which seems to imply that it’s a reality beyond the grasp of science. Have you moved away from an atheistic view?

Sam Harris: I don’t actually argue that consciousness is “a reality” beyond the grasp of science. I just think that it is conceptually irreducible — that is, I don’t think we can fully understand it in terms of unconscious information processing. Consciousness is “subjective”— not in the pejorative sense of being unscientific, biased or merely personal, but in the sense that it is intrinsically first-person, experiential and qualitative.

The only thing in this universe that suggests the reality of consciousness is consciousness itself. Many philosophers have made this argument in one way or another — Thomas Nagel, John Searle, David Chalmers. And while I don’t agree with everything they say about consciousness, I agree with them on this point.

The primary approach to understanding consciousness in neuroscience entails correlating changes in its contents with changes in the brain. But no matter how reliable these correlations become, they won’t allow us to drop the first-person side of the equation. The experiential character of consciousness is part of the very reality we are studying. Consequently, I think science needs to be extended to include a disciplined approach to introspection.

G.G.: But science aims at objective truth, which has to be verifiable: open to confirmation by other people. In what sense do you think first-person descriptions of subjective experience can be scientific?

S.H.: In a very strong sense. The only difference between claims about first-person experience and claims about the physical world is that the latter are easier for others to verify. That is an important distinction in practical terms — it’s easier to study rocks than to study moods — but it isn’t a difference that marks a boundary between science and non-science. Nothing, in principle, prevents a solitary genius on a desert island from doing groundbreaking science. Confirmation by others is not what puts the “truth” in a truth claim. And nothing prevents us from making objective claims about subjective experience.

Are you thinking about Margaret Thatcher right now? Well, now you are. Were you thinking about her exactly six minutes ago? Probably not. There are answers to questions of this kind, whether or not anyone is in a position to verify them.

And certain truths about the nature of our minds are well worth knowing. For instance, the anger you felt yesterday, or a year ago, isn’t here anymore, and if it arises in the next moment, based on your thinking about the past, it will quickly pass away when you are no longer thinking about it. This is a profoundly important truth about the mind — and it can be absolutely liberating to understand it deeply. If you do understand it deeply — that is, if you are able to pay clear attention to the arising and passing away of anger, rather than merely think about why you have every right to be angry — it becomes impossible to stay angry for more than a few moments at a time. Again, this is an objective claim about the character of subjective experience. And I invite our readers to test it in the laboratory of their own minds.

G.G.: Of course, we all have some access to what other people are thinking or feeling. But that access is through probable inference and so lacks the special authority of first-person descriptions. Suppose I told you that in fact I didn’t think of Margaret Thatcher when I read your comment, because I misread your text as referring to Becky Thatcher in “The Adventures of Tom Sawyer”? If that’s true, I have evidence for it that you can’t have. There are some features of consciousness that we will agree on. But when our first-person accounts differ, then there’s no way to resolve the disagreement by looking at one another’s evidence. That’s very different from the way things are in science.

S.H.: This difference doesn’t run very deep. People can be mistaken about the world and about the experiences of others — and they can even be mistaken about the character of their own experience. But these forms of confusion aren’t fundamentally different. Whatever we study, we are obliged to take subjective reports seriously, all the while knowing that they are sometimes false or incomplete.

For instance, consider an emotion like fear. We now have many physiological markers for fear that we consider quite reliable, from increased activity in the amygdala and spikes in blood cortisol to peripheral physiological changes like sweating palms. However, just imagine what would happen if people started showing up in the lab complaining of feeling intense fear without showing any of these signs — and they claimed to feel suddenly quite calm when their amygdalae lit up on fMRI, their cortisol spiked, and their skin conductance increased. We would no longer consider these objective measures of fear to be valid. So everything still depends on people telling us how they feel and our (usually) believing them.

However, it is true that people can be very poor judges of their inner experience. That is why I think disciplined training in a technique like “mindfulness,” apart from its personal benefits, can be scientifically important.

Read the entire story here.

An Ode to the Monopolist

Peter Thiel on why entrepreneurs should strive for monopoly and avoid competition. If only it were that simple for esoteric restaurants, innovative technology companies and all startup businesses in between.

From WSJ:

What valuable company is nobody building? This question is harder than it looks, because your company could create a lot of value without becoming very valuable itself. Creating value isn’t enough—you also need to capture some of the value you create.

This means that even very big businesses can be bad businesses. For example, U.S. airline companies serve millions of passengers and create hundreds of billions of dollars of value each year. But in 2012, when the average airfare each way was $178, the airlines made only 37 cents per passenger trip. Compare them to Google, which creates less value but captures far more. Google brought in $50 billion in 2012 (versus $160 billion for the airlines), but it kept 21% of those revenues as profits—more than 100 times the airline industry’s profit margin that year. Google makes so much money that it is now worth three times more than every U.S. airline combined.
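The arithmetic behind that comparison is worth making explicit. A quick back-of-the-envelope check in Python, using only the figures quoted above:

```python
# Back-of-the-envelope check of the margin comparison (2012 figures, USD,
# exactly as quoted in the excerpt above).

airfare = 178.00          # average one-way fare
profit_per_trip = 0.37    # airline profit per passenger trip

google_margin = 0.21      # Google kept 21% of revenue as profit

airline_margin = profit_per_trip / airfare
print(f"airline margin: {airline_margin:.2%}")             # ~0.21%
print(f"google margin:  {google_margin:.0%}")              # 21%
print(f"ratio: {google_margin / airline_margin:.0f}x")     # ~100x
```

A 0.21% margin against a 21% margin is indeed a factor of roughly 100, just as the excerpt claims.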

The airlines compete with each other, but Google stands alone. Economists use two simplified models to explain the difference: perfect competition and monopoly.

“Perfect competition” is considered both the ideal and the default state in Economics 101. So-called perfectly competitive markets achieve equilibrium when producer supply meets consumer demand. Every firm in a competitive market is undifferentiated and sells the same homogeneous products. Since no firm has any market power, they must all sell at whatever price the market determines. If there is money to be made, new firms will enter the market, increase supply, drive prices down and thereby eliminate the profits that attracted them in the first place. If too many firms enter the market, they’ll suffer losses, some will fold, and prices will rise back to sustainable levels. Under perfect competition, in the long run no company makes an economic profit.
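That zero-profit endpoint is easy to see in a toy model. In the sketch below every parameter is invented; the only point is the dynamic the paragraph describes: firms enter while profit is positive, supply rises, and price is driven down to cost.

```python
# Toy model of "profits get competed away" under perfect competition.
# Demand curve and costs are made up; only the dynamic matters.

UNIT_COST = 10.0        # identical cost per unit for every firm
UNITS_PER_FIRM = 100    # each undifferentiated firm sells the same quantity

def market_price(total_supply):
    """Simple downward-sloping demand curve (illustrative only)."""
    return 30.0 - 0.001 * total_supply

firms = 2
while True:
    price = market_price(firms * UNITS_PER_FIRM)
    profit = (price - UNIT_COST) * UNITS_PER_FIRM
    if profit <= 0:
        break           # no money left on the table; entry stops
    firms += 1          # positive profit attracts another entrant

print(f"{firms} firms, price {price:.2f}, economic profit per firm {profit:.2f}")
# Entry continues until price equals unit cost and profit is zero.
```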

The opposite of perfect competition is monopoly. Whereas a competitive firm must sell at the market price, a monopoly owns its market, so it can set its own prices. Since it has no competition, it produces at the quantity and price combination that maximizes its profits.

To an economist, every monopoly looks the same, whether it deviously eliminates rivals, secures a license from the state or innovates its way to the top. I’m not interested in illegal bullies or government favorites: By “monopoly,” I mean the kind of company that is so good at what it does that no other firm can offer a close substitute. Google is a good example of a company that went from 0 to 1: It hasn’t competed in search since the early 2000s, when it definitively distanced itself from Microsoft and Yahoo!

Americans mythologize competition and credit it with saving us from socialist bread lines. Actually, capitalism and competition are opposites. Capitalism is premised on the accumulation of capital, but under perfect competition, all profits get competed away. The lesson for entrepreneurs is clear: If you want to create and capture lasting value, don’t build an undifferentiated commodity business.

How much of the world is actually monopolistic? How much is truly competitive? It is hard to say because our common conversation about these matters is so confused. To the outside observer, all businesses can seem reasonably alike, so it is easy to perceive only small differences between them. But the reality is much more binary than that. There is an enormous difference between perfect competition and monopoly, and most businesses are much closer to one extreme than we commonly realize.

The confusion comes from a universal bias for describing market conditions in self-serving ways: Both monopolists and competitors are incentivized to bend the truth.

Monopolists lie to protect themselves. They know that bragging about their great monopoly invites being audited, scrutinized and attacked. Since they very much want their monopoly profits to continue unmolested, they tend to do whatever they can to conceal their monopoly—usually by exaggerating the power of their (nonexistent) competition.

Think about how Google talks about its business. It certainly doesn’t claim to be a monopoly. But is it one? Well, it depends: a monopoly in what? Let’s say that Google is primarily a search engine. As of May 2014, it owns about 68% of the search market. (Its closest competitors, Microsoft and Yahoo!, have about 19% and 10%, respectively.) If that doesn’t seem dominant enough, consider the fact that the word “google” is now an official entry in the Oxford English Dictionary—as a verb. Don’t hold your breath waiting for that to happen to Bing.

But suppose we say that Google is primarily an advertising company. That changes things. The U.S. search-engine advertising market is $17 billion annually. Online advertising is $37 billion annually. The entire U.S. advertising market is $150 billion. And global advertising is a $495 billion market. So even if Google completely monopolized U.S. search-engine advertising, it would own just 3.4% of the global advertising market. From this angle, Google looks like a small player in a competitive world.

What if we frame Google as a multifaceted technology company instead? This seems reasonable enough; in addition to its search engine, Google makes dozens of other software products, not to mention robotic cars, Android phones and wearable computers. But 95% of Google’s revenue comes from search advertising; its other products generated just $2.35 billion in 2012 and its consumer-tech products a mere fraction of that. Since consumer tech is a $964 billion market globally, Google owns less than 0.24% of it—a far cry from relevance, let alone monopoly. Framing itself as just another tech company allows Google to escape all sorts of unwanted attention.
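Thiel’s framing game is just a choice of denominator, and the quoted figures make that easy to see. A quick restatement of the shares, using the numbers exactly as given above:

```python
# The same company, different denominators: Google's apparent share under
# each market framing quoted above (figures in USD as given).

search_ad_revenue = 17e9   # assume Google captures all US search ads

framings = {
    "US search-engine advertising": 17e9,
    "US online advertising":        37e9,
    "US advertising (all media)":  150e9,
    "global advertising":          495e9,
}

for market, size in framings.items():
    print(f"{market:30s} {search_ad_revenue / size:6.1%}")
# 100% -> 45.9% -> 11.3% -> 3.4%: the "monopoly" shrinks as the market grows.

# And the consumer-tech framing: $2.35bn of non-search revenue against a
# $964bn global market.
print(f"{'global consumer tech':30s} {2.35e9 / 964e9:6.2%}")   # ~0.24%
```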

Non-monopolists tell the opposite lie: “We’re in a league of our own.” Entrepreneurs are always biased to understate the scale of competition, but that is the biggest mistake a startup can make. The fatal temptation is to describe your market extremely narrowly so that you dominate it by definition.

Read the entire article here.

The Next (and Final) Doomsday Scenario

Personally, I love dystopian visions and apocalyptic nightmares. So, news that the famed Higgs boson may ultimately cause our demise, and incidentally the end of the entire cosmos, caught my attention.

Apparently theoreticians have calculated that the Higgs potential, of which the Higgs boson is a manifestation, has characteristics that make the universe unstable. (The Higgs was discovered in 2012 by teams at CERN’s Large Hadron Collider.) Luckily for those wishing to avoid the final catastrophe, this instability may leave the universe intact for several billion more years; and if the Higgs were ever to trigger the final apocalypse, it would spread at the speed of light, so we would never see it coming.

From Popular Mechanics:

In July 2012, when scientists at CERN’s Large Hadron Collider culminated decades of work with their discovery of the Higgs boson, most physicists celebrated. Stephen Hawking did not. The famed theorist expressed his disappointment that nothing more unusual was found, calling the discovery “a pity in a way.” But did he ever say the Higgs could destroy the universe?

That’s what many reports in the media said earlier this week, quoting a preface Hawking wrote to a book called Starmus. According to The Australian, the preface reads in part: “The Higgs potential has the worrisome feature that it might become metastable at energies above 100 [billion] gigaelectronvolts (GeV). This could mean that the universe could undergo catastrophic vacuum decay, with a bubble of the true vacuum expanding at the speed of light. This could happen at any time and we wouldn’t see it coming.”

What Hawking is talking about here is not the Higgs boson but what’s called the Higgs potential, which are “totally different concepts,” says Katie Mack, a theoretical astrophysicist at Melbourne University. The Higgs field permeates the entire universe, and the Higgs boson is an excitation of that field, just like an electron is an excitation of an electric field. In this analogy, the Higgs potential is like the voltage, determining the value of the field.

Once physicists began to close in on the mass of the Higgs boson, they were able to work out the Higgs potential. That value seemed to reveal that the universe exists in what’s known as a meta-stable vacuum state, or false vacuum, a state that’s stable for now but could slip into the “true” vacuum at any time. This is the catastrophic vacuum decay in Hawking’s warning, though he is not the first to posit the idea.
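A textbook cartoon of such a metastable (false) vacuum, not the actual Standard Model potential but the standard illustration, is a potential with two unequal minima:

$$V(\phi) = \frac{\lambda}{4}\left(\phi^{2} - v^{2}\right)^{2} - \epsilon\,\phi, \qquad \epsilon > 0 \text{ small}$$

The shallow minimum near $\phi \approx -v$ is the false vacuum, where the field can linger for eons; the deeper minimum near $\phi \approx +v$ is the true vacuum. Vacuum decay is the transition from the first to the second, either classically over the barrier between them or by quantum tunnelling through it.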

Is he right?

“There are a couple of really good reasons to think that’s not the end of the story,” Mack says. There are two ways for a meta-stable state to fall off into the true vacuum—one classical way, and one quantum way. The first would occur via a huge energy boost, the 100 billion GeVs Hawking mentions. But, Mack says, the universe already experienced such high energies during the period of inflation just after the big bang. Particles in cosmic rays from space also regularly collide with these kinds of high energies, and yet the vacuum hasn’t collapsed (otherwise, we wouldn’t be here).

“Imagine that somebody hands you a piece of paper and says, ‘This piece of paper has the potential to spontaneously combust,’ and so you might be worried,” Mack says. “But then they tell you 20 years ago it was in a furnace.” If it didn’t combust in the furnace, it’s not likely to combust sitting in your hand.

Of course, there’s always the quantum world to consider, and that’s where things always get weirder. In the quantum world, where the smallest of particles interact, it’s possible for a particle on one side of a barrier to suddenly appear on the other side of the barrier without actually going through it, a phenomenon known as quantum tunneling. If our universe was in fact in a meta-stable state, it could quantum tunnel through the barrier to the vacuum on the other side with no warning, destroying everything in an instant. And while that is theoretically possible, predictions show that if it were to happen, it’s not likely for billions of billions of years. By then, the sun and Earth and you and I and Stephen Hawking will be a distant memory, so it’s probably not worth losing sleep over it.

What’s more likely, Mack says, is that there is some new physics not yet understood that makes our vacuum stable. Physicists know there are parts of the model missing; mysteries like quantum gravity and dark matter that still defy explanation. When two physicists published a paper documenting the Higgs potential conundrum in March, their conclusion was that an explanation lies beyond the Standard Model, not that the universe may collapse at any time.

Read the article here.

The Original Rolling Stones

rocks-at-racetrack_arno_gourdol

Who or what has been moving these Death Valley boulders? Theories have persisted for quite some time: unknown inhabitants of the desert straddling California and Nevada; mischievous troglodytes from Middle Earth; aliens sending us cryptic, geologic messages; invisible demons; telepathic teenagers.

But now we know, and the mysterious forces at work are, unfortunately, rather mundane — the rocks are moved through a combination of rain, ice and wind. Oh well — time to focus on crop circles again!

From ars technica:

Mario is just a video game, and rocks don’t have legs. Both of these things are true. Yet, like the Mario ghosts that advance only when your back is turned, there are rocks that we know have been moving—even though no one has ever seen them do it.

The rocks in question occupy a spot called Racetrack Playa in Death Valley. Playas are desert mudflats that sometimes host shallow lakes when enough water is around. Racetrack Playa gets its name from long furrows extending from large rocks sitting on the playa bed—tracks that make it look as if the rocks had been dragged through the mud. The tracks of the various rocks run parallel to each other, sometimes suggesting that the rocks had made sharp turns in unison, like dehydrated synchronized swimmers.

Many potential explanations have been offered up (some going back to the 1940s) for this bizarre situation, as the rocks seem to move only occasionally and had never been caught in the act. One thing everyone could agree on was that it must occur when the playa is wet and the muddy bottom is slick. At first, suggestions revolved around especially strong winds. One geologist went as far as to bring out a propeller airplane to see how much wind it would take.

The other idea was that ice, which does occasionally form there, could be responsible. If the rocks were frozen into a sheet of ice, a little buoyancy might reduce the friction beneath them. And again, strong winds over the surface of the ice could drag the whole mess around, accounting for the synchronized nature of the tracks.

Over the years, a number of clever studies have attempted to test these possibilities. But to truly put the question to rest, the rocks were going to have to be observed while moving. A team led by Richard Norris and his engineer cousin James Norris set out to do just that. They set out 15 rocks with GPS loggers, a weather station, and some time-lapse cameras in 2011. Magnetic triggers were buried beneath the rocks so that the loggers would start recording when they began to move. And the Norrises waited.

They got what they were after last winter. A little rain and snow provided enough water to fill the lake to a depth of a few centimeters. At night, temperatures were low enough for ice to form. On a few sunny days, the rocks stirred.

By noon, the thin sheet of ice—just a few millimeters thick—would start breaking up. Light wind pushed the ice, and the water in the lake, to the northeast. The rocks, which weren’t frozen into the thin ice, went along for the ride. On one occasion, two rocks were recorded traveling 65 meters over 16 minutes, with a peak rate of 5 to 6 meters per minute.
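Those numbers are easy to sanity-check; the implied average speed sits comfortably below the quoted peak:

```python
# Sanity check on the quoted rock movement: 65 meters in 16 minutes.
distance_m = 65.0
duration_min = 16.0

avg_speed = distance_m / duration_min
print(f"average: {avg_speed:.1f} m/min")            # ~4.1 m/min, below the 5-6 peak
print(f"       = {avg_speed / 60 * 100:.0f} cm/s")  # ~7 cm/s, a very slow glide
```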

These movements were detectable in the time-lapse images, but you might not actually notice them if you were standing there. The researchers note that the tracks carved in the mud aren’t immediately apparent due to the muddy water.

The total distances traveled by the instrumented rocks between November and February ranged from 15 to 225 meters. While all moving rocks travel in the direction of the prevailing wind, they didn’t all move together—motion depended on the way the ice broke up and the depth of the water around each rock.

While the proposed explanations weren’t far off, the thinness of the ice and the minimal wind speed that were needed were both surprises. There was no ice buoyancy lifting the rocks. They were just being pushed by loose sheets of thin ice that were themselves being pushed by wind and water.

In the end, there’s nothing extraordinary about the motion of these rocks, but the necessary conditions are rare enough that the results still shock us. Similar tracks have been found in a few playas elsewhere around the world, though, and ice-pushed rocks also leave marks in the shallows of Canada’s Great Slave Lake. There’s no need to worry about the rocks at Racetrack Playa coming to life and opening secretly ferocious jaws when you look away.

Read the entire story here.

Image: Rocks at Racetrack Playa, Death Valley. Courtesy of Arno Gourdol. Some Rights Reserved.

The Future of History

[tube]f3nJOCfkerI[/tube]

Take an impassioned history professor, a mediocre U.S. high school history curriculum and add Bill Gates, and you get an opportunity to inject fresh perspectives and new ideas into young minds.

Not too long ago Professor David Christian’s collection of Big History DVDs caught Gates’ attention, leading to a broad mission to overhaul the boring history lesson — one school at a time. Professor Christian takes a thoroughly holistic approach to the subject, spanning broad and interconnected topics such as culture, biochemistry, astronomy, agriculture and physics. The sweeping narrative fundamental to Christian’s delivery reminds me somewhat of Kenneth Clark’s Civilisation and Jacob Bronowski’s The Ascent of Man, two landmark U.K. television series.

From the New York Times:

In 2008, shortly after Bill Gates stepped down from his executive role at Microsoft, he often awoke in his 66,000-square-foot home on the eastern bank of Lake Washington and walked downstairs to his private gym in a baggy T-shirt, shorts, sneakers and black socks yanked up to the midcalf. Then, during an hour on the treadmill, Gates, a self-described nerd, would pass the time by watching DVDs from the Teaching Company’s “Great Courses” series. On some mornings, he would learn about geology or meteorology; on others, it would be oceanography or U.S. history.

As Gates was working his way through the series, he stumbled upon a set of DVDs titled “Big History” — an unusual college course taught by a jovial, gesticulating professor from Australia named David Christian. Unlike the previous DVDs, “Big History” did not confine itself to any particular topic, or even to a single academic discipline. Instead, it put forward a synthesis of history, biology, chemistry, astronomy and other disparate fields, which Christian wove together into nothing less than a unifying narrative of life on earth. Standing inside a small “Mr. Rogers”-style set, flanked by an imitation ivy-covered brick wall, Christian explained to the camera that he was influenced by the Annales School, a group of early-20th-century French historians who insisted that history be explored on multiple scales of time and space. Christian had subsequently divided the history of the world into eight separate “thresholds,” beginning with the Big Bang, 13 billion years ago (Threshold 1), moving through to the origin of Homo sapiens (Threshold 6), the appearance of agriculture (Threshold 7) and, finally, the forces that gave birth to our modern world (Threshold 8).

Christian’s aim was not to offer discrete accounts of each period so much as to integrate them all into vertiginous conceptual narratives, sweeping through billions of years in the span of a single semester. A lecture on the Big Bang, for instance, offered a complete history of cosmology, starting with the ancient God-centered view of the universe and proceeding through Ptolemy’s Earth-based model, through the heliocentric versions advanced by thinkers from Copernicus to Galileo and eventually arriving at Hubble’s idea of an expanding universe. In the worldview of “Big History,” a discussion about the formation of stars cannot help including Einstein and the hydrogen bomb; a lesson on the rise of life will find its way to Jane Goodall and Dian Fossey. “I hope by the end of this course, you will also have a much better sense of the underlying unity of modern knowledge,” Christian said at the close of the first lecture. “There is a unified account.”

As Gates sweated away on his treadmill, he found himself marveling at the class’s ability to connect complex concepts. “I just loved it,” he said. “It was very clarifying for me. I thought, God, everybody should watch this thing!” At the time, the Bill & Melinda Gates Foundation had donated hundreds of millions of dollars to educational initiatives, but many of these were high-level policy projects, like the Common Core Standards Initiative, which the foundation was instrumental in pushing through. And Gates, who had recently decided to become a full-time philanthropist, seemed to pine for a project that was a little more tangible. He had been frustrated with the state of interactive coursework and classroom technology since before he dropped out of Harvard in the mid-1970s; he yearned to experiment with entirely new approaches. “I wanted to explore how you did digital things,” he told me. “That was a big issue for me in terms of where education was going — taking my previous skills and applying them to education.” Soon after getting off the treadmill, he asked an assistant to set a meeting with Christian.

A few days later, the professor, who was lecturing at San Diego State University, found himself in the lobby of a hotel, waiting to meet with the billionaire. “I was scared,” Christian recalled. “Someone took me along the corridor, knocks on a door, Bill opens it, invites me in. All I remember is that within five minutes, he had so put me at my ease. I thought, I’m a nerd, he’s a nerd and this is fun!” After a bit of small talk, Gates got down to business. He told Christian that he wanted to introduce “Big History” as a course in high schools all across America. He was prepared to fund the project personally, outside his foundation, and he wanted to be personally involved. “He actually gave me his email address and said, ‘Just think about it,’ ” Christian continued. ” ‘Email me if you think this is a good idea.’ ”

Christian emailed to say that he thought it was a pretty good idea. The two men began tinkering, adapting Christian’s college course into a high-school curriculum, with modules flexible enough to teach to freshmen and seniors alike. Gates, who insisted that the course include a strong digital component, hired a team of engineers and designers to develop a website that would serve as an electronic textbook, brimming with interactive graphics and videos. Gates was particularly insistent on the idea of digital timelines, which may have been a vestige of an earlier passion project, Microsoft Encarta, the electronic encyclopedia that was eventually overtaken by the growth of Wikipedia. Now he wanted to offer a multifaceted historical account of any given subject through a friendly user interface. The site, which is open to the public, would also feature a password-protected forum for teachers to trade notes and update and, in some cases, rewrite lesson plans based on their experiences in the classroom.

Read the entire article here.

Video: Clip from Threshold 1, The Big Bang. Courtesy of Big History Project, David Christian.

Measuring the Quantum Jitter

Some physicists are determined to find out if we are mere holograms. Perhaps not quite the dystopian yet romanticized version fictionalized in The Matrix, but a fascinating idea nonetheless. Armed with a very precise measuring tool, known as a Holometer (more precisely, twin correlated Michelson holographic interferometers), researchers aim to find the scale at which the universe becomes jittery. In turn this will give a better picture of the fundamental units of space-time, well beyond the elementary particles themselves, and somewhat closer to the Planck length.
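For a sense of scale, the Planck length, the usual candidate for nature’s smallest meaningful distance, is built from three fundamental constants:

$$\ell_{P} = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\ \text{m}$$

That is roughly twenty orders of magnitude smaller than a proton (about $10^{-15}$ m), which squares with the “hundred billion billion times smaller” figure quoted in the article below.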

From the New Scientist:

The search for the fundamental units of space and time has officially begun. Physicists at the Fermi National Accelerator Laboratory near Chicago, Illinois, announced this week that the Holometer, a device designed to test whether we live in a giant hologram, has started taking data.

The experiment is testing the idea that the universe is actually made up of tiny “bits”, in a similar way to how a newspaper photo is actually made up of dots. These fundamental units of space and time would be unbelievably tiny: a hundred billion billion times smaller than a proton. And like the well-known quantum behaviour of matter and energy, these bits of space-time would behave more like waves than particles.

“The theory is that space is made of waves instead of points, that everything is a little jittery, and never sits still,” says Craig Hogan at the University of Chicago, who dreamed up the experiment.

The Holometer is designed to measure this “jitter”. The surprisingly simple device is operated from a shed in a field near Chicago, and consists of two powerful laser beams that are directed through tubes 40 metres long. The lasers precisely measure the positions of mirrors along their paths at two points in time.

If space-time is smooth and shows no quantum behaviour, then the mirrors should remain perfectly still. But if both lasers measure an identical, small difference in the mirrors’ position over time, that could mean the mirrors are being jiggled about by fluctuations in the fabric of space itself.
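The reason for running two identical interferometers is statistical: each instrument’s own noise is independent, but a genuine space-time jitter would be common to both, so correlating the two outputs suppresses the former while preserving the latter. A toy Python illustration of that trick (the “jitter” here is simulated, purely to show the technique):

```python
import numpy as np

# Two detectors each see their own instrument noise plus (hypothetically)
# a small jitter common to both. Correlating the outputs averages the
# independent noises toward zero and leaves the shared signal's variance.

rng = np.random.default_rng(seed=1)
n = 1_000_000

common_jitter = 0.1 * rng.standard_normal(n)        # shared "space-time" signal
detector1 = common_jitter + rng.standard_normal(n)  # plus independent noise
detector2 = common_jitter + rng.standard_normal(n)  # plus independent noise

# Zero-lag cross-correlation: expected value is the jitter variance (0.01),
# even though each detector's own output is dominated by noise (variance ~1).
print(np.mean(detector1 * detector2))
```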

So what of the idea that the universe is a hologram? This stems from the notion that information cannot be destroyed, so for example the 2D event horizon of a black hole “records” everything that falls into it. If this is the case, then the boundary of the universe could also form a 2D representation of everything contained within the universe, like a hologram storing a 3D image in 2D.

Hogan cautions that the idea that the universe is a hologram is somewhat misleading because it suggests that our experience is some kind of illusion, a projection like a television screen. If the Holometer finds a fundamental unit of space, it won’t mean that our 3D world doesn’t exist. Rather it will change the way we understand its basic makeup. And so far, the machine appears to be working.

In a presentation given in Chicago on Monday at the International Conference on Particle Physics and Cosmology, Hogan said that the initial results show the Holometer is capable of measuring quantum fluctuations in space-time, if they are there.

“This was kind of an amazing moment,” says Hogan. “It’s just noise right now – we don’t know whether it’s space-time noise – but the machine is operating at that specification.”

Hogan expects that the Holometer will have gathered enough data to put together an answer to the quantum question within a year. If the space-time jitter is there, Hogan says it could underpin entirely new explanations for why the expansion of our universe is accelerating, something traditionally attributed to the little understood phenomenon of dark energy.

Read the entire article here.

Burning Man Bucket List


As this year’s Burning Man comes to an end in the eerily beautiful Black Rock Desert in Nevada I am reminded that attending this life event should be on everyone’s bucket list, before they actually kick it.

That said, applying one or more of the Ten Principles that guide Burners should be a year-round quest — not a once-in-a-lifetime transient goal.

Read more about this year’s BM here.

See more BM visuals here.

Image: Super Pool art installation, Burning Man 2014. Courtesy of Jim Urquhart / Reuters.

How to Get Blazingly Fast Internet

It’s rather simple in theory, and only requires two steps. Step 1: Follow the lead of a city like Chattanooga, Tennessee. Step 2: Tell your monopolistic cable company what to do with its cables. Done. Now you have a 1 Gigabit Internet connection — around 50-100 times faster than your mother’s Wi-Fi.

This experiment is fueling a renaissance of sorts in the Southern U.S. city and other metropolitan areas can only look on in awe. It comes as no surprise that the cable oligarchs at Comcast, Time Warner and AT&T are looking for any way to halt the city’s progress into the 21st Century.

From the Guardian:

Loveman’s department store on Market Street in Chattanooga closed its doors in 1993 after almost a century in business, another victim of a nationwide decline in downtowns that hollowed out so many US towns. Now the opulent building is buzzing again, this time with tech entrepreneurs taking advantage of the fastest internet in the western hemisphere.

Financed by the cash raised from the sale of logistics group Access America, a group of thirty-something local entrepreneurs have set up Lamp Post, an incubator for a new generation of tech companies, in the building. A dozen startups are currently working out of the glitzy downtown office.

“We’re not Silicon Valley. No one will ever replicate that,” says Allan Davis, one of Lamp Post’s partners. “But we don’t need to be and not everyone wants that. The expense, the hassle. You don’t need to be there to create great technology. You can do it here.”

He’s not alone in thinking so. Lamp Post is one of several tech incubators in this mid-sized Tennessee city. Money is flowing in. Chattanooga has gone from close to zero venture capital in 2009 to more than five organized funds with investable capital over $50m in 2014 – not bad for a city of 171,000 people.

The city’s go-getting mayor Andy Berke, a Democrat tipped for higher office, is currently reviewing plans for a city center tech zone specifically designed to meet the needs of its new workforce.

In large part the success is being driven by The Gig. Thanks to an ambitious roll-out by the city’s municipally owned electricity company, EPB, Chattanooga is one of the only places on Earth with internet at speeds as fast as 1 gigabit per second – about 50 times faster than the US average.
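To make “50 times faster” concrete, here is the rough download arithmetic; the ~20 Mbps US average is inferred from the article’s comparison rather than stated directly, and the file size is arbitrary:

```python
# Rough download-time comparison. The ~20 Mbps US average is implied by
# the article's "about 50 times faster" claim; the 5 GB file is arbitrary.

gig_bps = 1_000_000_000   # Chattanooga's 1 Gbps service
avg_bps = 20_000_000      # approximate US average at the time

file_bits = 5 * 8 * 10**9  # a 5 GB download, in bits

print(f"at 1 Gbps:  {file_bits / gig_bps:.0f} seconds")       # ~40 s
print(f"at 20 Mbps: {file_bits / avg_bps / 60:.0f} minutes")  # ~33 min
```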

The tech buildup comes after more than a decade of reconstruction in Chattanooga that has regenerated the city with a world-class aquarium, 12 miles of river walks along the Tennessee River, an arts district built around the Hunter Museum of American Arts, high-end restaurants and outdoor activities.

But it’s the city’s tech boom that has sparked interest from other municipalities across the world. It also comes as the Federal Communications Commission (FCC) prepares to address some of the biggest questions the internet has faced when it returns from the summer break. And while the FCC discusses whether Comcast, the world’s biggest cable company, should take over Time Warner, the US’s second largest cable operator, and whether to allow those companies to set up fast lanes (and therefore slow lanes) for internet traffic, Chattanooga is proof that another path is possible.

It’s a story that is being watched very closely by Big Cable’s critics. “In DC there is often an attitude that the only way to solve our problems is to hand them over to big business. Chattanooga is a reminder that the best solutions are often local and work out better than handing over control to Comcast or AT&T to do whatever they want with us,” said Chris Mitchell, director of community broadband networks at advocacy group the Institute for Local Self-Reliance.

On Friday, the US cable industry called on the FCC to block Chattanooga’s plan to expand, as well as a similar plan for Wilson, North Carolina.

“The success of public broadband is a mixed record, with numerous examples of failures,” USTelecom said in a blog post. “With state taxpayers on the financial hook when a municipal broadband network goes under, it is entirely reasonable for state legislatures to be cautious in limiting or even prohibiting that activity.”

Mayor Berke has dealt with requests for visits from everyone from tiny rural communities to “humungous international cities”. “You don’t see many mid-sized cities that have the kind of activity that we have right now in Chattanooga,” he said. “What the Gig did was change the idea of what our city could be. Mid-sized southern cities are not generally seen as being ahead of the technological curve, the Gig changed that. We now have people coming in looking to us as a leader.”

It’s still early days but there have already been notable successes. In addition to Access America’s sale for an undisclosed sum, last year restaurant booking site OpenTable bought a local company, QuickCue, for $11.5m. “That’s a great example of a story that just doesn’t happen in other mid-sized southern cities,” said Berke.

But it’s what Chattanooga can do next that has the local tech community buzzed.

EPB’s high-speed network came about after it decided to set up a smart electric grid in order to cut power outages. EPB estimated it would take 10 years to build the system and raised $170m through a municipal bond to pay for it. In 2009 president Barack Obama launched the American Recovery and Reinvestment Act, a stimulus programme aimed at getting the US economy back on track amid the devastation of the recession. EPB was awarded $111m to get its smart grid up and running. Less than three years later the whole service territory was built.

The fibre-optic network uses IntelliRupter PulseClosers, made by S&C Electric, that can reroute power during outages. The University of California at Berkeley estimates that power outages cost the US economy $80bn a year through business disruption with manufacturers stopping their lines and restaurants closing. Chattanooga’s share of that loss was about $100m, EPB estimates. The smart grid can detect a fault in milliseconds and route power around problems. Since the system was installed the duration of power outages has been cut in half.

But it was the other uses of that fiber that fired up enthusiasm in Chattanooga. “When we first started talking about this and the uses of the smart grid we would say to customers and community groups ‘Oh and it can also offer very high-speed internet, TV and phone.’ The electric power stuff was no longer of interest. This is what people got excited about and it’s the same today,” said EPB vice president Danna Bailey.

Read the entire story here.

Image: Chattanooga, TN skyline. Courtesy of Wikipedia.

The IBM Songbook


It would be fascinating to see a Broadway or West End show based on lyrics penned in honor of IBM and Thomas Watson, Sr., its first president. Makes you wonder if faithful employees of, say, Facebook or Apple would ever write a songbook — not in jest — for their corporate alma mater. I think not.

From ars technica:

“For thirty-seven years,” reads the opening passage in the book, “the gatherings and conventions of our IBM workers have expressed in happy songs the fine spirit of loyal cooperation and good fellowship which has promoted the signal success of our great IBM Corporation in its truly International Service for the betterment of business and benefit to mankind.”

That’s a hell of a mouthful, but it’s only the opening volley in the war on self-respect and decency that is the 1937 edition of Songs of the IBM, a booklet of corporate ditties first published in 1927 on the order of IBM company founder Thomas Watson, Sr.

The 1937 edition of the songbook is a 54-page monument to glassy-eyed corporate inhumanity, with every page overflowing with trite praise to The Company and Its Men. The booklet reads like a terrible parody of a hymnal—one that praises not the traditional Christian trinity but the new corporate triumvirate of IBM the father, Watson the son, and American entrepreneurship as the holy spirit:

Thomas Watson is our inspiration,
Head and soul of our splendid I.B.M.
We are pledged to him in every nation,
Our President and most beloved man.
His wisdom has guided each division
In service to all humanity
We have grown and broadened with his vision,
None can match him or our great company.
T. J. Watson, we all honor you,
You’re so big and so square and so true,
We will follow and serve with you forever,
All the world must know what I. B. M. can do.

—from “To Thos. J. Watson, President, I.B.M. Our Inspiration”

The wording transcends sense and sanity—these aren’t songs that normal human beings would choose to line up and sing, are they? Have people changed so much in the last 70-80 years that these songs—which seem expressly designed to debase their singers and deify their subjects—would be joyfully sung in harmony without complaint at company meetings? Were workers in the 1920s and 1930s so dehumanized by the rampaging robber barons of high industry that the only way to keep a desirable corporate job at a place like IBM was to toe the line and sing for your paycheck?

Surely no one would stand for this kind of thing in the modern world—to us, company songs seem like relics of a less-enlightened age. If anything, the mindless overflowing trite words sound like the kind of praises one would find directed at a cult of personality dictator in a decaying wreck of a country like North Korea.

Indeed, some of the songs in the book wouldn’t be out of place venerating the Juche ideal instead of IBM:

We don’t pretend we’re gay.
We always feel that way,
Because we’re filling the world with sunshine.
With I.B.M. machines,
We’ve got the finest means,
For brightly painting the clouds with sunshine.

—from “Painting the Clouds with Sunshine”

Tie an onion to your belt

All right, time to come clean: it’s incredibly easy to cherry-pick terrible examples out of a 77-year-old corporate songbook (though this songbook makes it easy because of how crazy it is to modern eyes). Moreover, to answer one of the rhetorical questions above, no—people have not changed so much over the past 80-ish years that they could sing mawkishly pro-IBM songs with an irony-free straight face. At least, not without some additional context.

There’s a decade-old writeup on NetworkWorld about the IBM corporate song phenomenon that provides a lot of the glue necessary to build a complete mental picture of what was going on in both employees’ and leadership’s heads. The key takeaway to deflate a lot of the looniness is that the majority of the songs came out of the Great Depression era, and employees lucky enough to be steadfastly employed by a company like IBM often were really that grateful.

The formal integration of singing as an aspect of IBM’s culture at the time was heavily encouraged by Thomas J. Watson Sr. Watson and his employees co-opted the era’s showtunes and popular melodies for their proto-filking, ensuring that everyone would know the way the song went, if not the exact wording. Employees belting out “To the International Ticketograph Division” to the tune of “My Bonnie Lies Over the Ocean” (“In I.B.M. There’s a division. / That’s known as the Ticketograph; / It’s peopled by men who have vision, / Progressive and hard-working staff”) really isn’t all that different from any other team-building exercise that modern companies do—in fact, in a lot of ways, it’s far less humiliating than a company picnic with Mandatory Interdepartmental Three-Legged Races.

Many of the songs mirror the kinds of things that university students of the same time period might sing in honor of their alma mater. When viewed from the perspective of the Depression and post-Depression era, the singing is still silly—but it also makes a lot more sense. Watson reportedly wanted to inspire loyalty and cohesion among employees—and, remember, this was also an era where “normal” employee behavior was to work at a single company for most of one’s professional life, and then retire with a pension. It’s certainly a lot easier to sing a company’s praises if there’s paid retirement at the end of the last verse.

Read the entire article and see more songs here.

Image: Pages 99-100 of the IBM Songbook, 1937. Courtesy of IBM / ars technica.

Syndrome X


The quest for immortality, or even great longevity, has probably driven humans since they first became self-aware. Entire cultural movements and industries are founded on the desire to enhance and extend our lives. Genetic research, of course, may eventually unlock some or all of life and death’s mysteries. In the meantime, groups of dedicated scientists continue to look for the foundation of aging with a view to understanding the process and eventually slowing (and perhaps stopping) it. Richard Walker is one of these singularly focused researchers.

From the BBC:

Richard Walker has been trying to conquer ageing since he was a 26-year-old free-loving hippie. It was the 1960s, an era marked by youth: Vietnam War protests, psychedelic drugs, sexual revolutions. The young Walker relished the culture of exultation, of joie de vivre, and yet was also acutely aware of its passing. He was haunted by the knowledge that ageing would eventually steal away his vitality – that with each passing day his body was slightly less robust, slightly more decayed. One evening he went for a drive in his convertible and vowed that by his 40th birthday, he would find a cure for ageing.

Walker became a scientist to understand why he was mortal. “Certainly it wasn’t due to original sin and punishment by God, as I was taught by nuns in catechism,” he says. “No, it was the result of a biological process, and therefore is controlled by a mechanism that we can understand.”

Scientists have published several hundred theories of ageing, and have tied it to a wide variety of biological processes. But no one yet understands how to integrate all of this disparate information.

Walker, now 74, believes that the key to ending ageing may lie in a rare disease that doesn’t even have a real name, “Syndrome X”. He has identified four girls with this condition, marked by what seems to be a permanent state of infancy, a dramatic developmental arrest. He suspects that the disease is caused by a glitch somewhere in the girls’ DNA. His quest for immortality depends on finding it.

It’s the end of another busy week and MaryMargret Williams is shuttling her brood home from school. She drives an enormous SUV, but her six children and their coats and bags and snacks manage to fill every inch. The three big kids are bouncing in the very back. Sophia, 10, with a mouth of new braces, is complaining about a boy-crazy friend. She sits next to Anthony, seven, and Aleena, five, who are glued to something on their mother’s iPhone. The three little kids squirm in three car seats across the middle row. Myah, two, is mining a cherry slushy, and Luke, one, is pawing a bag of fresh crickets bought for the family gecko.

Finally there’s Gabrielle, who’s the smallest child, and the second oldest, at nine years old. She has long, skinny legs and a long, skinny ponytail, both of which spill out over the edges of her car seat. While her siblings giggle and squeal, Gabby’s dusty-blue eyes roll up towards the ceiling. By the calendar, she’s almost an adolescent. But she has the buttery skin, tightly clenched fingers and hazy awareness of a newborn.

Back in 2004, when MaryMargret and her husband, John, went to the hospital to deliver Gabby, they had no idea anything was wrong. They knew from an ultrasound that she would have clubbed feet, but so had their other daughter, Sophia, who was otherwise healthy. And because MaryMargret was a week early, they knew Gabby would be small, but not abnormally so. “So it was such a shock to us when she was born,” MaryMargret says.

Gabby came out purple and limp. Doctors stabilised her in the neonatal intensive care unit and then began a battery of tests. Within days the Williamses knew their new baby had lost the genetic lottery. Her brain’s frontal lobe was smooth, lacking the folds and grooves that allow neurons to pack in tightly. Her optic nerve, which runs between the eyes and the brain, was atrophied, which would probably leave her blind. She had two heart defects. Her tiny fists couldn’t be pried open. She had a cleft palate and an abnormal swallowing reflex, which meant she had to be fed through a tube in her nose. “They started trying to prepare us that she probably wouldn’t come home with us,” John says. Their family priest came by to baptise her.

Day after day, MaryMargret and John shuttled between Gabby in the hospital and 13-month-old Sophia at home. The doctors tested for a few known genetic syndromes, but they all came back negative. Nobody had a clue what was in store for her. Her strong Catholic family put their faith in God. “MaryMargret just kept saying, ‘She’s coming home, she’s coming home’,” recalls her sister, Jennie Hansen. And after 40 days, she did.

Gabby cried a lot, loved to be held, and ate every three hours, just like any other newborn. But of course she wasn’t. Her arms would stiffen and fly up to her ears, in a pose that the family nicknamed her “Harley-Davidson”. At four months old she started having seizures. Most puzzling and problematic, she still wasn’t growing. John and MaryMargret took her to specialist after specialist: a cardiologist, a gastroenterologist, a geneticist, a neurologist, an ophthalmologist and an orthopaedist. “You almost get your hopes up a little – ’This is exciting! We’re going to the gastro doctor, and maybe he’ll have some answers’,” MaryMargret says. But the experts always said the same thing: nothing could be done.

The first few years with Gabby were stressful. When she was one and Sophia two, the Williamses drove from their home in Billings, Montana, to MaryMargret’s brother’s home outside of St Paul, Minnesota. For nearly all of those 850 miles, Gabby cried and screamed. This continued for months until doctors realised she had a run-of-the-mill bladder infection. Around the same period, she acquired a severe respiratory infection that left her struggling to breathe. John and MaryMargret tried to prepare Sophia for the worst, and even planned which readings and songs to use at Gabby’s funeral. But the tiny toddler toughed it out.

While Gabby’s hair and nails grew, her body wasn’t getting bigger. She was developing in subtle ways, but at her own pace. MaryMargret vividly remembers a day at work when she was pushing Gabby’s stroller down a hallway with skylights in the ceiling. She looked down at Gabby and was shocked to see her eyes reacting to the sunlight. “I thought, ‘Well, you’re seeing that light!’” MaryMargret says. Gabby wasn’t blind, after all.

Despite the hardships, the couple decided they wanted more children. In 2007 MaryMargret had Anthony, and the following year she had Aleena. By this time, the Williamses had stopped trudging to specialists, accepting that Gabby was never going to be fixed. “At some point we just decided,” John recalls, “it’s time to make our peace.”

Mortal questions

When Walker began his scientific career, he focused on the female reproductive system as a model of “pure ageing”: a woman’s ovaries, even in the absence of any disease, slowly but inevitably slide into the throes of menopause. His studies investigated how food, light, hormones and brain chemicals influence fertility in rats. But academic science is slow. He hadn’t cured ageing by his 40th birthday, nor by his 50th or 60th. His life’s work was tangential, at best, to answering the question of why we’re mortal, and he wasn’t happy about it. He was running out of time.

So he went back to the drawing board. As he describes in his book, Why We Age, Walker began a series of thought experiments to reflect on what was known and not known about ageing.

Ageing is usually defined as the slow accumulation of damage in our cells, organs and tissues, ultimately causing the physical transformations that we all recognise in elderly people. Jaws shrink and gums recede. Skin slacks. Bones brittle, cartilage thins and joints swell. Arteries stiffen and clog. Hair greys. Vision dims. Memory fades. The notion that ageing is a natural, inevitable part of life is so fixed in our culture that we rarely question it. But biologists have been questioning it for a long time.

It’s a harsh world out there, and even young cells are vulnerable. It’s like buying a new car: the engine runs perfectly but is still at risk of getting smashed on the highway. Our young cells survive only because they have a slew of trusty mechanics on call. Take DNA, which provides the all-important instructions for making proteins. Every time a cell divides, it makes a near-perfect copy of its three-billion-letter code. Copying mistakes happen frequently along the way, but we have specialised repair enzymes to fix them, like an automatic spellcheck. Proteins, too, are ever vulnerable. If it gets too hot, they twist into deviant shapes that keep them from working. But here again, we have a fixer: so-called ‘heat shock proteins’ that rush to the aid of their misfolded brethren. Our bodies are also regularly exposed to environmental poisons, such as the reactive and unstable ‘free radical’ molecules that come from the oxidisation of the air we breathe. Happily, our tissues are stocked with antioxidants and vitamins that neutralise this chemical damage. Time and time again, our cellular mechanics come to the rescue.

Which leads to the biologists’ longstanding conundrum: if our bodies are so well tuned, why, then, does everything eventually go to hell?

One theory is that it all boils down to the pressures of evolution. Humans reproduce early in life, well before ageing rears its ugly head. All of the repair mechanisms that are important in youth – the DNA editors, the heat shock proteins, the antioxidants – help the young survive until reproduction, and are therefore passed down to future generations. But problems that show up after we’re done reproducing cannot be weeded out by evolution. Hence, ageing.

Most scientists say that ageing is not caused by any one culprit but by the breakdown of many systems at once. Our sturdy DNA mechanics become less effective with age, meaning that our genetic code sees a gradual increase in mutations. Telomeres, the sequences of DNA that act as protective caps on the ends of our chromosomes, get shorter every year. Epigenetic messages, which help turn genes on and off, get corrupted with time. Heat shock proteins run down, leading to tangled protein clumps that muck up the smooth workings of a cell. Faced with all of this damage, our cells try to adjust by changing the way they metabolise nutrients and store energy. To ward off cancer, they even know how to shut themselves down. But eventually cells stop dividing and stop communicating with each other, triggering the decline we see from the outside.

Scientists trying to slow the ageing process tend to focus on one of these interconnected pathways at a time. Some researchers have shown, for example, that mice on restricted-calorie diets live longer than normal. Other labs have reported that giving mice rapamycin, a drug that targets an important cell-growth pathway, boosts their lifespan. Still other groups are investigating substances that restore telomeres, DNA repair enzymes and heat shock proteins.

During his thought experiments, Walker wondered whether all of these scientists were fixating on the wrong thing. What if all of these various types of cellular damage were the consequences of ageing, but not the root cause of it? He came up with an alternative theory: that ageing is the unavoidable fallout of our development.

The idea sat on the back burner of Walker’s mind until the evening of 23 October 2005. He was working in his home office when his wife called out to him to join her in the family room. She knew he would want to see what was on TV: an episode of Dateline about a young girl who seemed to be “frozen in time”. Walker watched the show and couldn’t believe what he was seeing. Brooke Greenberg was 12 years old, but just 13 pounds (6kg) and 27 inches (69cm) long. Her doctors had never seen anything like her condition, and suspected the cause was a random genetic mutation. “She literally is the Fountain of Youth,” her father, Howard Greenberg, said.

Walker was immediately intrigued. He had heard of other genetic diseases, such as progeria and Werner syndrome, which cause premature ageing in children and adults respectively. But this girl seemed to be different. She had a genetic disease that stopped her development and with it, Walker suspected, the ageing process. Brooke Greenberg, in other words, could help him test his theory.

Uneven growth

Brooke was born a few weeks premature, with many birth defects. Her paediatrician labelled her with Syndrome X, not knowing what else to call it.

After watching the show, Walker tracked down Howard Greenberg’s address. Two weeks went by before Walker heard back, and after much discussion he was allowed to test Brooke. He was sent Brooke’s medical records as well as blood samples for genetic testing. In 2009, his team published a brief report describing her case.

Walker’s analysis found that Brooke’s organs and tissues were developing at different rates. Her mental age, according to standardised tests, was between one and eight months. Her teeth appeared to be eight years old; her bones, 10 years. She had lost all of her baby fat, and her hair and nails grew normally, but she had not reached puberty. Her telomeres were considerably shorter than those of healthy teenagers, suggesting that her cells were ageing at an accelerated rate.

All of this was evidence of what Walker dubbed “developmental disorganisation”. Brooke’s body seemed to be developing not as a coordinated unit, he wrote, but rather as a collection of individual, out-of-sync parts. “She is not simply ‘frozen in time’,” Walker wrote. “Her development is continuing, albeit in a disorganised fashion.”

The big question remained: why was Brooke developmentally disorganised? It wasn’t nutritional and it wasn’t hormonal. The answer had to be in her genes. Walker suspected that she carried a glitch in a gene (or a set of genes, or some kind of complex genetic programme) that directed healthy development. There must be some mechanism, after all, that allows us to develop from a single cell to a system of trillions of cells. This genetic programme, Walker reasoned, would have two main functions: it would initiate and drive dramatic changes throughout the organism, and it would also coordinate these changes into a cohesive unit.

Ageing, he thought, comes about because this developmental programme, this constant change, never turns off. From birth until puberty, change is crucial: we need it to grow and mature. After we’ve matured, however, our adult bodies don’t need change, but rather maintenance. “If you’ve built the perfect house, you would want to stop adding bricks at a certain point,” Walker says. “When you’ve built a perfect body, you’d want to stop screwing around with it. But that’s not how evolution works.” Because natural selection cannot influence traits that show up after we have passed on our genes, we never evolved a “stop switch” for development, Walker says. So we keep adding bricks to the house. At first this doesn’t cause much damage – a sagging roof here, a broken window there. But eventually the foundation can’t sustain the additions, and the house topples. This, Walker says, is ageing.

Brooke was special because she seemed to have been born with a stop switch. But finding the genetic culprit turned out to be difficult. Walker would need to sequence Brooke’s entire genome, letter by letter.

That never happened. Much to Walker’s chagrin, Howard Greenberg abruptly severed their relationship. The Greenbergs have not publicly explained why they ended their collaboration with Walker, and declined to comment for this article.

Second chance

In August 2009, MaryMargret Williams saw a photo of Brooke on the cover of People magazine, just below the headline “Heartbreaking mystery: The 16-year-old baby”. She thought Brooke sounded a lot like Gabby, so she contacted Walker.

After reviewing Gabby’s details, Walker filled her in on his theory. Testing Gabby’s genes, he said, could help him in his mission to end age-related disease – and maybe even ageing itself.

This didn’t sit well with the Williamses. John, who works for the Montana Department of Corrections, often interacts with people facing the reality of our finite time on Earth. “If you’re spending the rest of your life in prison, you know, it makes you think about the mortality of life,” he says. What’s important is not how long you live, but rather what you do with the life you’re given. MaryMargret feels the same way. For years she has worked in a local dermatology office. She knows all too well the cultural pressures to stay young, and wishes more people would embrace the inevitability of getting older. “You get wrinkles, you get old, that’s part of the process,” she says.

But Walker’s research also had its upside. First and foremost, it could reveal whether the other Williams children were at risk of passing on Gabby’s condition.

For several months, John and MaryMargret hashed out the pros and cons. They were under no illusion that the fruits of Walker’s research would change Gabby’s condition, nor would they want it to. But they did want to know why. “What happened, genetically, to make her who she is?” John says. And more importantly: “Is there a bigger meaning for it?”

John and MaryMargret firmly believe that God gave them Gabby for a reason. Walker’s research offered them a comforting one: to help treat Alzheimer’s and other age-related diseases. “Is there a small piece that Gabby could present to help people solve these awful diseases?” John asks. “Thinking about it, it’s like, no, that’s for other people, that’s not for us.” But then he thinks back to the day Gabby was born. “I was in that delivery room, thinking the same thing – this happens to other people, not us.”

Still not entirely certain, the Williamses went ahead with the research.

Amassing evidence

Walker published his theory in 2011, but he’s only the latest of many researchers to think along the same lines. “Theories relating developmental processes to ageing have been around for a very long time, but have been somewhat under the radar for most researchers,” says Joao Pedro de Magalhaes, a biologist at the University of Liverpool. In 1932, for example, English zoologist George Parker Bidder suggested that mammals have some kind of biological “regulator” that stops growth after the animal reaches a specific size. Ageing, Bidder thought, was the continued action of this regulator after growth was done.

Subsequent studies showed that Bidder wasn’t quite right; there are lots of marine organisms, for example, that never stop growing but age anyway. Still, his fundamental idea of a developmental programme leading to ageing has persisted.

For several years, Stuart Kim’s group at Stanford University has been comparing which genes are expressed in young and old nematode worms. It turns out that some genes involved in ageing also help drive development in youth.

Kim suggested that the root cause of ageing is the “drift”, or mistiming, of developmental pathways during the ageing process, rather than an accumulation of cellular damage.

Other groups have since found similar patterns in mice and primates. One study, for example, found that many genes turned on in the brains of old monkeys and humans are the same as those expressed in young brains, suggesting that ageing and development are controlled by some of the same gene networks.

Perhaps most provocative of all, some studies of worms have shown that shutting down essential development genes in adults significantly prolongs life. “We’ve found quite a lot of genes in which this happened – several dozen,” de Magalhaes says.

Nobody knows whether the same sort of developmental-programme genes exist in people. But say that they do exist. If someone was born with a mutation that completely destroyed this programme, Walker reasoned, that person would undoubtedly die. But if a mutation only partially destroyed it, it might lead to a condition like the one he saw in Brooke Greenberg or Gabby Williams. So if Walker could identify the genetic cause of Syndrome X, he might also have found a driver of the ageing process in the rest of us.

And if he found that, then could it lead to treatments that slow – or even end – ageing? “There’s no doubt about it,” he says.

Public stage

After agreeing to participate in Walker’s research, the Williamses, just like the Greenbergs before them, became famous. In January 2011, when Gabby was six, the television channel TLC featured her in a one-hour documentary. The Williams family also appeared on Japanese television and in dozens of newspaper and magazine articles.

Other than becoming a local celebrity, though, Gabby’s everyday life hasn’t changed much since getting involved in Walker’s research. She spends her days surrounded by her large family. She’ll usually lie on the floor, or in one of several cushions designed to keep her spine from twisting into a C shape. She makes noises that would make an outsider worry: grunting, gasping for air, grinding her teeth. Her siblings think nothing of it. They play boisterously in the same room, somehow always careful not to crash into her. Once a week, a teacher comes to the house to work with Gabby. She uses sounds and shapes on an iPad to try to teach cause and effect. When Gabby turned nine, last October, the family made her a birthday cake and had a party, just as they always do. Most of her gifts were blankets, stuffed animals and clothes, just as they are every year. Her aunt Jennie gave her make-up.

Walker teamed up with geneticists at Duke University and screened the genomes of Gabby, John and MaryMargret. This test looked at the exome, the 2% of the genome that codes for proteins. From this comparison, the researchers could tell that Gabby did not inherit any exome mutations from her parents – meaning that it wasn’t likely that her siblings would be able to pass on the condition to their kids. “It was a huge relief – huge,” MaryMargret says.
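
At its core, a trio comparison like this is simple set logic: any variant present in the child but in neither parent arose de novo, and siblings, who draw on the same parental genomes, are unlikely to carry it. Purely as an illustration (the variant names below are invented, and real pipelines work on VCF files with extensive quality filtering), a minimal sketch in Python:

```python
def de_novo_variants(child, mother, father):
    """Return variants present in the child's exome but in neither parent's.

    Each argument is a set of variant identifiers. Variants found only in
    the child arose de novo rather than being inherited.
    """
    return child - (mother | father)

# Hypothetical toy data, purely illustrative:
child = {"chr1:10042 C>T", "chr7:55210 A>G", "chrX:48933 G>A"}
mother = {"chr1:10042 C>T"}
father = {"chr7:55210 A>G"}

print(de_novo_variants(child, mother, father))  # {'chrX:48933 G>A'}
```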

Still, the exome screening didn’t give any clues as to what was behind Gabby’s disease. Gabby carries several mutations in her exome, but none in a gene that would make sense of her condition. All of us have mutations littering our genomes. So it’s impossible to know, in any single individual, whether a particular mutation is harmful or benign – unless you can compare two people with the same condition.

All girls

Luckily for him, Walker’s continued presence in the media has led him to two other young girls who he believes have the same syndrome. One of them, Mackenzee Wittke, of Alberta, Canada, is now five years old, with long, skinny limbs, just like Gabby. “We have basically been stuck in a time warp,” says her mother, Kim Wittke. The fact that all of these possible Syndrome X cases are girls is intriguing – it could mean that the crucial mutation is on their X chromosome. Or it could just be a coincidence.

Walker is working with a commercial outfit in California to compare all three girls’ entire genome sequences – the exome plus the other 98% of DNA code, which is thought to be responsible for regulating the expression of protein-coding genes.

For his theory, Walker says, “this is do or die – we’re going to do every single bit of DNA in these girls. If we find a mutation that’s common to them all, that would be very exciting.”

But that seems like a very big if.

Most researchers agree that identifying the genes behind Syndrome X is a worthwhile scientific endeavour, as these genes will no doubt be relevant to our understanding of development. They’re far less convinced, though, that the girls’ condition has anything to do with ageing. “It’s a tenuous interpretation to think that this is going to be relevant to ageing,” says David Gems, a geneticist at University College London. It’s not likely that these girls will even make it to adulthood, he says, let alone old age.

It’s also not at all clear that these girls have the same condition. Even if they do, and even if Walker and his collaborators discover the genetic cause, there would still be a steep hill to climb. The researchers would need to silence the same gene or genes in laboratory mice, which typically have a lifespan of two or three years. “If that animal lives to be 10, then we’ll know we’re on the right track,” Walker says. Then they’d have to find a way to achieve the same genetic silencing in people, whether with a drug or some kind of gene therapy. And then they’d have to begin long and expensive clinical trials to make sure that the treatment was safe and effective. Science is often too slow, and life too fast.

End of life

On 24 October 2013, Brooke passed away. She was 20 years old. MaryMargret heard about it when a friend called after reading it in a magazine. The news hit her hard. “Even though we’ve never met the family, they’ve just been such a part of our world,” she says.

MaryMargret doesn’t see Brooke as a template for Gabby – it’s not as if she now believes that she only has 11 years left with her daughter. But she can empathise with the pain the Greenbergs must be feeling. “It just makes me feel so sad for them, knowing that there’s a lot that goes into a child like that,” she says. “You’re prepared for them to die, but when it finally happens, you can just imagine the hurt.”

Today Gabby is doing well. MaryMargret and John are no longer planning her funeral. Instead, they’re beginning to think about what would happen if Gabby outlives them. (Sophia has offered to take care of her sister.) John turned 50 this year, and MaryMargret will be 41. If there were a pill to end ageing, they say they’d have no interest in it. Quite the contrary: they look forward to getting older, because it means experiencing the new joys, new pains and new ways to grow that come along with that stage of life.

Richard Walker, of course, has a fundamentally different view of growing old. When asked why he’s so tormented by it, he says it stems from childhood, when he watched his grandparents physically and psychologically deteriorate. “There was nothing charming to me about sedentary old people, rocking chairs, hot houses with Victorian trappings,” he says. At his grandparents’ funerals, he couldn’t help but notice that they didn’t look much different in death than they did at the end of life. And that was heartbreaking. “To say I love life is an understatement,” he says. “Life is the most beautiful and magic of all things.”

If his hypothesis is correct – who knows? – it might one day help prevent disease and modestly extend life for millions of people. Walker is all too aware, though, that it would come too late for him. As he writes in his book: “I feel a bit like Moses who, after wandering in the desert for most years of his life, was allowed to gaze upon the Promised Land but not granted entrance into it.”

 Read the entire story here.

Story courtesy of BBC and Mosaic under Creative Commons License.

Image: DNA structure. Courtesy of Wikipedia.

The Idea Shower and The Strategic Staircase

Every now and then we visit the world of corporatespeak to see how business jargon is faring: which words are in, which phrases are out. Unfortunately, many of the most used and over-used phrases still find their way into common office parlance. With apologies to our state-side readers, some of the most popular British examples follow; and, no surprise, many of these cringeworthy euphemisms seem to emanate from the U.S. Ugh!

From the Guardian:

I don’t know about you, but I’m a sucker for a bit of joined up, blue sky thinking. I love nothing more than the opportunity to touch base with my boss first thing on a Monday morning. It gives me that 24 carat feeling.

I apologise for the sarcasm, but management speak makes most people want to staple the boss’s tongue to the desk. A straw poll around my office found jargon is seen by staff as a tool for making something seem more impressive than it actually is.

The Plain English Campaign says that many staff working for big corporate organisations find themselves using management speak as a way of disguising the fact that they haven’t done their job properly. Some people think that it is easy to bluff their way through by using long, impressive-sounding words and phrases, even if they don’t know what they mean, which is telling in itself.

Furthermore, a recent survey by the Institute of Leadership & Management revealed that management speak is used in almost two thirds (64%) of offices, with nearly a quarter (23%) considering it to be a pointless irritation. “Thinking outside the box” (57%), “going forward” (55%) and “let’s touch base” (39%) were identified as the top three most overused pieces of jargon.

Walk through any office and you’ll hear this kind of thing going on every day. Here are some of the most irritating euphemisms doing the rounds:

Helicopter view – need a phrase that means broad overview of the business? Then why not say “a broad view of the business”?

Idea shower – brainstorm might be out of fashion, but surely we can thought cascade something better than this drivel.

Touch base offline – meaning let’s meet and talk. Because, contrary to popular belief, it is possible to communicate without a Wi-Fi signal. No, really, it is. Fancy a coffee?

Low hanging fruit – easy win business. This would be perfect for hungry children in orchards, but what is really happening is an admission that you don’t want to take the complicated route.

Look under the bonnet – analyse a situation. Most people wouldn’t have a clue about a car engine. When I look under a car bonnet I scratch my head, try not to look like I haven’t got a clue, jiggle a few pipes and kick the tyres before handing the job over to a qualified professional.

Get all your ducks in a row – be organised. Bert and Ernie from Sesame Street had an obsession with rubber ducks. You may think I’m disorganised, but there’s no need to talk to me like a five-year-old.

Don’t let the grass grow too long on this one – work fast. I’m looking for a polite way of suggesting that you get off your backside and get on with it.

Not enough bandwidth – too busy. Really? Try upgrading to fibre optics. I reckon I know a few people who haven’t been blessed with enough “bandwidth” and it’s got nothing to do with being busy.

Cascading relevant information – speaking to your colleagues. If anything, this is worse than touching base offline. From the flourish of cascading through to relevant, and onto information – this is complete nonsense.

The strategic staircase – business plan. Thanks, but I’ll take the lift.

Run it up the flagpole – try it out. Could you attach yourself while you’re at it?

Read the entire story here.

Sugar Is Bad For You, Really? Really!

sugar molecules

In case you haven’t heard, sugar is bad for you. In fact, an increasing number of food scientists will tell you that sugar is a poison, and that it’s time to fight the sugar oligarchs in much the same way that health advocates resolved to take on big tobacco many decades ago.

From the Guardian:

If you have any interest at all in diet, obesity, public health, diabetes, epidemiology, your own health or that of other people, you will probably be aware that sugar, not fat, is now considered the devil’s food. Dr Robert Lustig’s book, Fat Chance: The Hidden Truth About Sugar, Obesity and Disease, for all that it sounds like a Dan Brown novel, is the difference between vaguely knowing something is probably true, and being told it as a fact. Lustig has spent the past 16 years treating childhood obesity. His meta-analysis of the cutting-edge research on large-cohort studies of what sugar does to populations across the world, alongside his own clinical observations, has him credited with starting the war on sugar. When it reaches the enemy status of tobacco, it will be because of Lustig.

“Politicians have to come in and reset the playing field, as they have with any substance that is toxic and abused, ubiquitous and with negative consequence for society,” he says. “Alcohol, cigarettes, cocaine. We don’t have to ban any of them. We don’t have to ban sugar. But the food industry cannot be given carte blanche. They’re allowed to make money, but they’re not allowed to make money by making people sick.”

Lustig argues that sugar creates an appetite for itself by a determinable hormonal mechanism – a cycle, he says, that you could no more break with willpower than you could stop feeling thirsty through sheer strength of character. He argues that the hormone related to stress, cortisol, is partly to blame. “When cortisol floods the bloodstream, it raises blood pressure; increases the blood glucose level, which can precipitate diabetes. Human research shows that cortisol specifically increases caloric intake of ‘comfort foods’.” High cortisol levels during sleep, for instance, interfere with restfulness, and increase the hunger hormone ghrelin the next day. This differs from person to person, but I was jolted by recognition of the outrageous deliciousness of doughnuts when I haven’t slept well.

“The problem in obesity is not excess weight,” Lustig says, in the central London hotel that he has made his anti-metabolic illness HQ. “The problem with obesity is that the brain is not seeing the excess weight.” The brain can’t see it because appetite is determined by a binary system. You’re either in anorexigenesis – “I’m not hungry and I can burn energy” – or you’re in orexigenesis – “I’m hungry and I want to store energy.” The flip switch is your leptin level (the hormone that regulates your body fat) but too much insulin in your system blocks the leptin signal.

It helps here if you have ever been pregnant or remember much of puberty and that savage hunger; the way it can trick you out of your best intentions, the lure of ridiculous foods: six-month-old Christmas cake, sweets from a bin. If you’re leptin resistant – that is, if your insulin is too high as a result of your sugar intake – you’ll feel like that all the time.

Telling people to simply lose weight, he tells me, “is physiologically impossible and it’s clinically dangerous. It’s a goal that’s not achievable.” He explains further in the book: “Biochemistry drives behaviour. You see a patient who drinks 10 gallons of water a day and urinates 10 gallons of water a day. What is wrong with him? Could he have a behavioural disorder and be a psychogenic water drinker? Could be. Much more likely he has diabetes.” To extend that, you could tell people with diabetes not to drink water, and 3% of them might succeed – the outliers. But that wouldn’t help the other 97% just as losing the weight doesn’t, long-term, solve the metabolic syndrome – the addiction to sugar – of which obesity is symptomatic.

Many studies have suggested that diets tend to work for two months, some for as long as six. “That’s what the data show. And then everybody’s weight comes roaring back.” During his own time working night shifts, Lustig gained 3st, which he never lost and now uses exuberantly to make two points. The first is that weight is extremely hard to lose, and the second – more important, I think – is that he’s no diet and fitness guru himself. He doesn’t want everybody to be perfect: he’s just a guy who doesn’t want to surrender civilisation to diseases caused by industry. “I’m not a fitness guru,” he says, puckishly. “I’m 45lb overweight!”

“Sugar causes diseases: unrelated to their calories and unrelated to the attendant weight gain. It’s an independent primary-risk factor. Now, there will be food-industry people who deny it until the day they die, because their livelihood depends on it.” And here we have the reason why he sees this as a crusade and not a diet book, the reason that Lustig is in London and not Washington. This is an industry problem; the obesity epidemic began in 1980. Back then, nobody knew about leptin. And nobody knew about insulin resistance until 1984.

“What they knew was, when they took the fat out they had to put the sugar in, and when they did that, people bought more. And when they added more, people bought more, and so they kept on doing it. And that’s how we got up to current levels of consumption.” Approximately 80% of the 600,000 packaged foods you can buy in the US have added calorific sweeteners (this includes bread, burgers, things you wouldn’t add sugar to if you were making them from scratch). Daily fructose consumption has doubled in the past 30 years in the US, a pattern also observable (though not identical) here, in Canada, Malaysia, India, right across the developed and developing world. World sugar consumption has tripled in the past 50 years, while the population has only doubled; it makes sense of the obesity pandemic.

“It would have happened decades earlier; the reason it didn’t was that sugar wasn’t cheap. The thing that made it cheap was high-fructose corn syrup. They didn’t necessarily know the physiology of it, but they knew the economics of it.” Adding sugar to everyday food has become as much about the industry prolonging the shelf life as it has about palatability; if you’re shopping from corner shops, you’re likely to be eating unnecessary sugar in pretty well everything. It is difficult to remain healthy in these conditions. “You here in Britain are light years ahead of us in terms of understanding the problem. We don’t get it in the US, we have this libertarian streak. You don’t have that. You’re going to solve it first. So it’s in my best interests to help you, because that will help me solve it back there.”

The problem has mushroomed all over the world in 30 years and is driven by the profits of the food and diet industries combined. We’re not looking at a global pandemic of individual greed and fecklessness: it would be impossible for the citizens of the world to coordinate their human weaknesses with that level of accuracy. Once you stop seeing it as a problem of personal responsibility it’s easier to accept how profound and serious the war on sugar is. Life doesn’t have to become wholemeal and joyless, but traffic-light systems and five-a-day messaging are under-ambitious.

“The problem isn’t a knowledge deficit,” an obesity counsellor once told me. “There isn’t a fat person on Earth who doesn’t know vegetables are good for you.” Lustig agrees. “I, personally, don’t have a lot of hope that those things will turn things around. Education has not solved any substance of abuse. This is a substance of abuse. So you need two things, you need personal intervention and you need societal intervention. Rehab and laws, rehab and laws. Education would come in with rehab. But we need laws.”

Read the entire article here.

Image: Molecular diagrams of sucrose (left) and fructose (right). Courtesy of Wikipedia.

National Extinction Coming Soon

Based on declining fertility rates in some Asian nations, a new study predicts complete national extinctions in the not-too-distant future.

From the Telegraph:

South Koreans will be ‘extinct’ by 2750 if nothing is done to halt the nation’s falling fertility rate, according to a study by The National Assembly Research Service in Seoul.

The fertility rate declined to a new low of 1.19 children per woman in 2013, the study showed, well below the fertility rate required to sustain South Korea’s current population of 50 million people, the Chosun Ilbo reported.

In a simulation, the NARS study suggests that the population will shrink to 40 million in 2056 and 10 million in 2136. The last South Korean, the report indicates, will die in 2750, making it the first national group in the world to become extinct.

The simulation is a worst-case scenario and does not consider possible changes in immigration policy, for example.
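
For a feel of the arithmetic behind such a projection, here is a minimal back-of-the-envelope sketch in Python. It is emphatically not the NARS model: the 30-year generation length and the 2.1 replacement rate are our own assumptions, and a real simulation would track age cohorts, mortality and migration. Still, scaling each generation by the ratio of actual to replacement fertility reproduces the broad shape of the decline:

```python
# Toy projection of fertility-driven population decline.
# NOT the NARS simulation: generation length and replacement-level
# fertility are illustrative assumptions.
FERTILITY = 1.19         # children per woman (2013 figure cited above)
REPLACEMENT = 2.1        # approximate replacement-level fertility
GENERATION_YEARS = 30    # assumed generation length

population = 50_000_000  # current population cited above
year = 2013

while population >= 1:
    # Each generation shrinks by the ratio of actual to replacement fertility.
    population *= FERTILITY / REPLACEMENT
    year += GENERATION_YEARS
    print(f"{year}: ~{population:,.0f} people")
```

On these assumptions the country empties out after roughly 30 generations, within about a millennium; the same order of magnitude as the study’s 2750 date, though not the same answer.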

The study, carried out at the request of Yang Seung-jo, a member of the opposition New Politics Alliance for Democracy, underlines the challenges facing a number of nations in the Asia-Pacific region.

Japan, Taiwan, Singapore and increasingly China are all experiencing growing financial pressures caused by rising healthcare costs and pension payments for an elderly population.

The problem is particularly serious in South Korea, where more than 38 per cent of the population is predicted to be of retirement age by 2050, according to the National Statistics Office. The equivalent figure in Japan is an estimated 39.6 per cent by 2050.

According to a 2012 study conducted by Tohoku University, Japan will go extinct in about one thousand years, with the last Japanese child born in 3011.

David Coleman, a population expert at Oxford University, has previously warned that South Korea’s fertility rate is so low that it threatens the existence of the nation.

The NARS study suggests that the southern Korean port city of Busan is most at risk, largely because of a sharp decline in the number of young and middle-aged residents, and that the last person will be born in the city in 2413.

Read the entire article here.

Those 25,000 Unread Emails

Google-search-email

It may not be you. You may not be the person who has tens of thousands of unread emails scattered across various email accounts. However, you know someone just like this — buried in a virtual avalanche of unopened text, unable to extricate herself (or himself) and with no pragmatic plan to tackle the digital morass.

Washington Post writer Brigid Schulte has some ideas to help your friend (or you, of course — your secret is safe with us).

From the Washington Post:

I was drowning in e-mail. Overwhelmed. Overloaded. Spending hours a day, it seemed, roiling in an unending onslaught of info turds and falling further and further behind. The day I returned from a two-week break, I had 23,768 messages in my inbox. And 14,460 of them were unread.

I had to do something. I kept missing stuff. Forgetting stuff. Apologizing. And getting miffed and increasingly angry e-mails from friends and others who wondered why I was ignoring them. It wasn’t just vacation that put me so far behind. I’d been behind for more than a year. Vacation only made it worse. Every time I thought of my inbox, I’d start to hyperventilate.

I’d tried tackling it before: One night a few months ago, I was determined to stay at my desk until I’d powered through all the unread e-mails. At dawn, I was still powering through and nowhere near the end. And before long, the inbox was just as crammed as it had been before I lost that entire night’s sleep.

On the advice of a friend, I’d even hired a Virtual Assistant to help me with the backlog. But I had no idea how to use one. And though I’d read about people declaring e-mail bankruptcy when their inbox was overflowing — deleting everything and starting over from scratch — I was positive there were gems somewhere in that junk, and I couldn’t bear to lose them.

I knew I wasn’t alone. I’d get automatic response messages saying someone was on vacation and the only way they could relax was by telling me they’d never, ever look at my e-mail, so please send it again when they returned. My friend, Georgetown law professor Rosa Brooks, often sends out this auto response: “My inbox looks like Pompeii, post-volcano. Will respond as soon as I have time to excavate.” And another friend, whenever an e-mail is longer than one or two lines, sends a short note, “This sounds like a conversation,” and she won’t respond unless you call her.

E-mail made the late writer Nora Ephron’s list of the 22 things she won’t miss in life. Twice. In 2013, more than 182 billion e-mails were sent every day, no doubt clogging up millions of inboxes around the globe.

Bordering on despair, I sought help from four productivity gurus. And, following their advice, in two weeks of obsession-bordering-on-compulsion, my inbox was down to zero.

Here’s how.

*CREATE A SYSTEM. Julie Gray, a time coach who helps people dig out of e-mail overload all the time, said the first thing I had to change was my mind.

“This is such a pervasive problem. People think, ‘What am I doing wrong?’ They think they don’t have discipline or focus or that there’s some huge character flaw and they’re beating themselves up all the time. Which only makes it worse,” she said.

“So I first start changing their e-mail mindset from ‘This is an example of my failure,’ to ‘This just means I haven’t found the right system for me yet.’ It’s really all about finding your own path through the craziness.”

Do not spend another minute on e-mail, she admonished me, until you’ve begun to figure out a system. Otherwise, she said, I’d never dig out.

So we talked systems. It soon became clear that I’d created a really great e-mail system for when I was writing my book — ironically enough, on being overwhelmed — spending most of my time not at all overwhelmed in yoga pants in my home office working on my iMac. I was a follower of Randy Pausch who wrote, in “The Last Lecture,” to keep your e-mail inbox down to one page and religiously file everything once you’ve handled it. And I had for a couple years.

But now that I was traveling around the country to talk about the book, and back at work at The Washington Post, using my laptop, iPhone and iPad, that system was completely broken. I had six different e-mail accounts. And my main Verizon e-mail that I’d used for years and the Mac Mail inbox with meticulous file folders that I loved on my iMac didn’t sync across any of them.

Gray asked: “If everything just blew up today, and you had to start over, how would you set up your system?”

I wanted one inbox. One e-mail account. And I wanted the same inbox on all my devices. If I deleted an e-mail on my laptop, I wanted it deleted on my iMac. If I put an e-mail into a folder on my iMac, I wanted that same folder on my laptop.

So I decided to use Gmail, which does sync, as my main account. I set up an auto responder on my Verizon e-mail saying I was no longer using it and directing people to my Gmail account. I updated all my accounts to send to Gmail. And I spent hours on the phone with Apple one Sunday (thank you, Chazz) to get my Gmail account set up in my beloved Mac mail inbox that would sync. Then I transferred old files and created new ones on Gmail. I had to keep my Washington Post account separate, but that wasn’t the real problem.

All systems go.

Read the entire article here.

Image courtesy of Google Search.

Robin Williams, You Will Be Missed

Google-search-robin-williams

Mork returned to Ork this weekend; sadly, Robin Williams, who brought him to life, passed away on August 11, 2014. He was 63. His unique comic genius will be sorely missed.

From NYT:

Some years ago, at a party at the Cannes Film Festival, I was leaning against a rail watching a fireworks display when I heard a familiar voice behind me. Or rather, at least a dozen voices, punctuating the offshore explosions with jokes, non sequiturs and off-the-wall pop-cultural, sexual and political references.

There was no need to turn around: The voices were not talking directly to me and they could not have belonged to anyone other than Robin Williams, who was extemporizing a monologue at least as pyrotechnically amazing as what was unfolding against the Mediterranean sky. I’m unable to recall the details now, but you can probably imagine the rapid-fire succession of accents and pitches — macho basso, squeaky girly, French, Spanish, African-American, human, animal and alien — entangling with curlicues of self-conscious commentary about the sheer ridiculousness of anyone trying to narrate explosions of colored gunpowder in real time.

Part of the shock of his death on Monday came from the fact that he had been on — ubiquitous, self-reinventing, insistently present — for so long. On Twitter, mourners dated themselves with memories of the first time they had noticed him. For some it was the movie “Aladdin.” For others “Dead Poets Society” or “Mrs. Doubtfire.” I go back even further, to the “Mork and Mindy” television show and an album called “Reality — What a Concept” that blew my eighth-grade mind.

Back then, it was clear that Mr. Williams was one of the most explosively, exhaustingly, prodigiously verbal comedians who ever lived. The only thing faster than his mouth was his mind, which was capable of breathtaking leaps of free-associative absurdity. Janet Maslin, reviewing his standup act in 1979, cataloged a tumble of riffs that ranged from an impression of Jacques Cousteau to “an evangelist at the Disco Temple of Comedy,” to Truman Capote Jr. at “the Kindergarten of the Stars” (whatever that was). “He acts out the Reader’s Digest condensed version of ‘Roots,’ ” Ms. Maslin wrote, “which lasts 15 seconds in its entirety. He improvises a Shakespearean-sounding epic about the Three Mile Island nuclear disaster, playing all the parts himself, including Einstein’s ghost.” (That, or something like it, was a role he would reprise more than 20 years later in Steven Spielberg’s “A.I.”)

Read the entire article here.

Image courtesy of Google Search.

Kissing for the Sake of Art

The Makeout Project

The process for many artists is often long and arduous. Despite the creative and, usually, fulfilling end result, the path is frequently punctuated with disrespect, self-deprivation, suffering and pain. Indeed, many artists have paid a heavier price for their expression: censorship, imprisonment, torture, death.

So, it’s refreshing to see an artist taking a more pleasure-filled route. Kissing. Someone has to do it!

From the Guardian:

From the naked women that Yves Klein covered in blue paint to Terry Richardson’s bevy of porny subjects, the art world is full of work that for one person seems liberated and for another exploitative. Continuing to skirt that line is Jedediah Johnson, an American photographer whose ongoing series the Makeout Project involves him putting on lipstick then kissing people, before documenting the resulting smears in portraits.

Johnson’s shots are really striking, with his LaChapellian palette of bright colours making the lipstick jump out from its surprisingly circuitous path across each person’s face. The subjects look variously flirtatious, amused and ashamed; some have strange narratives, like the woman who is holding a baby just out of shot, her partner hovering off to one side.

It’s sensational enough to have been covered in the Daily Mail with their characteristically BIZARRE use of capitalisation, perhaps chiefly because it seems cheeky – or indeed sleazy. “People say ‘oh, it’s disgusting and he’s just doing it to get cheap thrills’, and I guess that is kind of not totally untrue,” Johnson tells me, explaining the germ of his project. “I just got this thought of this lipstick mark on your face when someone kisses you as being a powerful, loaded gesture that could communicate a lot. And also, y’know, there were a lot of people I knew who I wanted to kiss.” It was a way of addressing his “romantic anxiety”, which was holding him back from kissing those he desired.

So he started asking to kiss people at parties, generally picking someone he knew first of all, so the other partygoers could see it was an art project rather than a novel way of getting his end away. After a while, he graduated to studio portraits, and not just of attractive young women. He says he didn’t want to be “the guy using art as an excuse to kiss people he wants to – and I don’t think there’s necessarily anything wrong with that, but that’s just not who I wanted to be. So I’m going to have to kiss some people I don’t want to.” This includes a series of (still pretty attractive) men, who ended up teaching Jedediah a lot. “I didn’t realise people lead a kiss – I would always just kiss people, and I was leading, and I had no idea. There have been a couple of times when I kissed guys and they led; I tried to move into different real estate on their face, and they wouldn’t let me.”

His work is understandably misinterpreted, though, with some people seeing the hand that cradles the face in each shot as a controlling, violent image. “I understand that when you are just pointing the viewer in a direction, they come up with stuff you’re not into.” But the only thing that really grates on him is when people accuse him of not making art. “I have two degrees in art, and I don’t feel I have the ability to declare whether something is art or not. It’s an awful thing to say.”

The intrigue of his images comes from trying to assess the dynamic between the pair, from the woman biting her lip faux-seductively to those trying to hide their feelings about what’s just happened. Is there ever an erotic charge? “A few times it’s got really real for me; there’s some where I was probably like oh that was nice, and they’re thinking oh that was incredible, I don’t know what to do now. The different levels are very interesting.” He has had one unfortunate bad breath incident, though: “I was like hey, let’s make out, and she was like, great, just let me finish my garlic string beans. She still had garlic in her mouth.”

Read the entire story and see more images here.

Visit Jedediah Johnson’s website to see the entire Makeout Project here.

Image: The Makeout Project by Jedediah Johnson. Courtesy of Jedediah Johnson / Guardian.

Privacy and Potato Chips

Google-search-potato-chip

Privacy, and the lack thereof, is much in the news and on our minds. New revelations of data breaches, phone taps, corporate hackers and governmental overreach surface on a daily basis. So, it is no surprise to learn that researchers have found a cheap way to eavesdrop on our conversations via a potato chip (crisp, to our British-English readers) packet. No news yet on which flavor of chip makes for the best spying!

From ars technica:

Watch enough spy thrillers, and you’ll undoubtedly see someone setting up a bit of equipment that points a laser at a distant window, letting the snoop listen to conversations on the other side of the glass. This isn’t something Hollywood made up; high-tech snooping devices of this sort do exist, and they take advantage of the extremely high-precision measurements made possible with lasers in order to measure the subtle vibrations caused by sound waves.

A team of researchers has now shown, however, that you can skip the lasers. All you really need is a consumer-level digital camera and a conveniently located bag of Doritos. A glass of water or a plant would also do.

Good vibrations

Despite the differences in the technology involved, both approaches rely on the same principle: sound travels on waves of higher and lower pressure in the air. When these waves reach a flexible object, they set off small vibrations in the object. If you can detect these vibrations, it’s possible to reconstruct the sound. Laser-based systems detect the vibrations by watching for changes in the reflections of the laser light, but researchers wondered whether you could simply observe the object directly, using the ambient light it reflects. (The team involved researchers at MIT, Adobe Research, and Microsoft Research.)

The research team started with a simple test system made from a loudspeaker playing a rising tone, a high-speed camera, and a variety of objects: water, cardboard, a candy wrapper, some metallic foil, and (as a control) a brick. Each of these (even the brick) showed some response at the lowest end of the tonal range, but the other objects, particularly the cardboard and foil, had a response into much higher tonal regions. To observe the changes in ambient light, the camera didn’t have to capture the object at high resolution—it was used at 700 x 700 pixels or less—but it did have to be high-speed, capturing as many as 20,000 frames a second.

Processing the images wasn’t simple, however. A computer had to perform a weighted average over all the pixels captured, and even a twin 3.5GHz machine with 32GB of RAM took more than two hours to process one capture. Nevertheless, the results were impressive, as the algorithm was able to detect motion on the order of a thousandth of a pixel. This enabled the system to recreate the audio waves emitted by the loudspeaker.
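
The published method recovers sub-pixel motion from local phase in a complex steerable pyramid; the gist, though, is that each frame is collapsed to a single motion value and the sequence of frames is then treated as an audio signal sampled at the camera’s frame rate. A drastically simplified sketch of that idea in Python (the function name and the contrast weighting are our own illustration, not the authors’ code):

```python
import numpy as np

def frames_to_audio(frames, fps):
    """Collapse each grayscale video frame (shape: num_frames x H x W) to one
    number and treat the resulting sequence as audio sampled at `fps` Hz.

    A crude stand-in for the paper's phase-based sub-pixel motion analysis.
    """
    # Weight each pixel by local contrast in the first frame, so regions
    # with visible texture (which actually reveal motion) dominate the average.
    gy, gx = np.gradient(frames[0].astype(float))
    weights = np.hypot(gx, gy) + 1e-8

    # Weighted average over all pixels, one value per frame.
    signal = (frames.astype(float) * weights).sum(axis=(1, 2)) / weights.sum()

    # Remove the static component and normalise; what remains is the tiny
    # frame-to-frame variation caused by the object's vibration.
    signal -= signal.mean()
    signal /= np.abs(signal).max() + 1e-12
    return signal, fps
```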

Most of the rest of the paper describing the results involved making things harder on the system, as the researchers shifted to using human voices and moving the camera outside the room. They also showed that pre-testing the vibrating object’s response to a tone scale could help them improve their processing.

But perhaps the biggest surprise came when they showed that they didn’t actually need a specialized, high-speed camera. It turns out that most consumer-grade equipment doesn’t expose its entire sensor at once and instead scans an image across the sensor grid in a line-by-line fashion. Using a consumer video camera, the researchers were able to determine that there’s a 16 microsecond delay between each line, with a five millisecond delay between frames. Using this information, they treated each line as a separate exposure and were able to reproduce sound that way.
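
To see why the rolling shutter helps, plug in the article’s timing figures: if each scan line is its own sample, the burst rate during readout is one sample every 16 microseconds, roughly 62.5 kHz, far above any consumer frame rate, with a 5 ms gap of silence between frames. A quick check (the 1,080-line frame height is our assumption):

```python
LINE_DELAY = 16e-6   # seconds between successive scan lines (from the article)
FRAME_GAP = 5e-3     # dead time between frames (from the article)
LINES = 1080         # assumed number of scan lines per frame

readout = LINES * LINE_DELAY        # time to scan one whole frame
frame_period = readout + FRAME_GAP  # readout plus inter-frame gap

print(f"burst sample rate: {1 / LINE_DELAY / 1000:.1f} kHz during readout")
print(f"effective frame rate: {1 / frame_period:.1f} fps")
# Sampling is non-uniform: a dense burst of line-samples, then ~5 ms of
# silence, so audio reconstruction must interpolate across the gaps.
```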

Read the entire article here.

Image courtesy of Google Search.

The Enigma of Privacy

Privacy is still a valued and valuable right; it should not be a mere benefit in a democratic society. But in our current age, privacy is becoming an increasingly threatened species. We are surrounded by social networks that share and mine our behaviors, and we are assaulted by snoopers and spooks from local and national governments.

From the Observer:

We have come to the end of privacy; our private lives, as our grandparents would have recognised them, have been winnowed away to the realm of the shameful and secret. To quote ex-tabloid hack Paul McMullan, “privacy is for paedos”. Insidiously, through small concessions that only mounted up over time, we have signed away rights and privileges that other generations fought for, undermining the very cornerstones of our personalities in the process. While outposts of civilisation fight pyrrhic battles, unplugging themselves from the web – “going dark” – the rest of us have come to accept that the majority of our social, financial and even sexual interactions take place over the internet and that someone, somewhere, whether state, press or corporation, is watching.

The past few years have brought an avalanche of news about the extent to which our communications are being monitored: WikiLeaks, the phone-hacking scandal, the Snowden files. Uproar greeted revelations about Facebook’s “emotional contagion” experiment (where it tweaked mathematical formulae driving the news feeds of 700,000 of its members in order to prompt different emotional responses). Cesar A Hidalgo of the Massachusetts Institute of Technology described the Facebook news feed as “like a sausage… Everyone eats it, even though nobody knows how it is made”.

Sitting behind the outrage was a particularly modern form of disquiet – the knowledge that we are being manipulated, surveyed, rendered and that the intelligence behind this is artificial as well as human. Everything we do on the web, from our social media interactions to our shopping on Amazon, to our Netflix selections, is driven by complex mathematical formulae that are invisible and arcane.

Most recently, campaigners’ anger has turned upon the so-called Drip (Data Retention and Investigatory Powers) bill in the UK, which will see internet and telephone companies forced to retain and store their customers’ communications (and provide access to this data to police, government and up to 600 public bodies). Every week, it seems, brings a new furore over corporations – Apple, Google, Facebook – sidling into the private sphere. Often, it’s unclear whether the companies act brazenly because our governments play so fast and loose with their citizens’ privacy (“If you have nothing to hide, you’ve nothing to fear,” William Hague famously intoned); or if governments see corporations feasting upon the private lives of their users and have taken this as a licence to snoop, pry, survey.

We, the public, have looked on, at first horrified, then cynical, then bored by the revelations, by the well-meaning but seemingly useless protests. But what is the personal and psychological impact of this loss of privacy? What legal protection is afforded to those wishing to defend themselves against intrusion? Is it too late to stem the tide now that scenes from science fiction have become part of the fabric of our everyday world?

Novels have long been the province of the great What If?, allowing us to see the ramifications from present events extending into the murky future. As long ago as 1921, Yevgeny Zamyatin imagined One State, the transparent society of his dystopian novel, We. For Orwell, Huxley, Bradbury, Atwood and many others, the loss of privacy was one of the establishing nightmares of the totalitarian future. Dave Eggers’s 2013 novel The Circle paints a portrait of an America without privacy, where a vast, internet-based, multimedia empire surveys and controls the lives of its people, relying on strict adherence to its motto: “Secrets are lies, sharing is caring, and privacy is theft.” We watch as the heroine, Mae, disintegrates under the pressure of scrutiny, finally becoming one of the faceless, obedient hordes. A contemporary (and because of this, even more chilling) account of life lived in the glare of the privacy-free internet is Nikesh Shukla’s Meatspace, which charts the existence of a lonely writer whose only escape is into the shallows of the web. “The first and last thing I do every day,” the book begins, “is see what strangers are saying about me.”

Our age has seen an almost complete conflation of the previously separate spheres of the private and the secret. A taint of shame has crept over from the secret into the private so that anything that is kept from the public gaze is perceived as suspect. This, I think, is why defecation is so often used as an example of the private sphere. Sex and shitting were the only actions that the authorities in Zamyatin’s One State permitted to take place in private, and these remain the battlegrounds of the privacy debate almost a century later. A rather prim leaked memo from a GCHQ operative monitoring Yahoo webcams notes that “a surprising number of people use webcam conversations to show intimate parts of their body to the other person”.

It is to the bathroom that Max Mosley turns when we speak about his own campaign for privacy. “The need for a private life is something that is completely subjective,” he tells me. “You either would mind somebody publishing a film of you doing your ablutions in the morning or you wouldn’t. Personally I would and I think most people would.” In 2008, Mosley’s “sick Nazi orgy”, as the News of the World glossed it, featured in photographs published first in the pages of the tabloid and then across the internet. Mosley’s defence argued, successfully, that the romp involved nothing more than a “standard S&M prison scenario” and the former president of the FIA won £60,000 damages under Article 8 of the European Convention on Human Rights. Now he has rounded on Google and the continued presence of both photographs and allegations on websites accessed via the company’s search engine. If you type “Max Mosley” into Google, the eager autocomplete presents you with “video,” “case”, “scandal” and “with prostitutes”. Half-way down the first page of the search we find a link to a professional-looking YouTube video montage of the NotW story, with no acknowledgment that the claims were later disproved. I watch it several times. I feel a bit grubby.

“The moment the Nazi element of the case fell apart,” Mosley tells me, “which it did immediately, because it was a lie, any claim for public interest also fell apart.”

Here we have a clear example of the blurred lines between secrecy and privacy. Mosley believed that what he chose to do in his private life, even if it included whips and nipple-clamps, should remain just that – private. The News of the World, on the other hand, thought it had uncovered a shameful secret that, given Mosley’s professional position, justified publication. There is a momentary tremor in Mosley’s otherwise fluid delivery as he speaks about the sense of invasion. “Your privacy or your private life belongs to you. Some of it you may choose to make available, some of it should be made available, because it’s in the public interest to make it known. The rest should be yours alone. And if anyone takes it from you, that’s theft and it’s the same as the theft of property.”

Mosley has scored some recent successes, notably in continental Europe, where he has found a culture more suspicious of Google’s sweeping powers than in Britain or, particularly, the US. Courts in France and then, interestingly, Germany, ordered Google to remove pictures of the orgy permanently, with far-reaching consequences for the company. Google is appealing against the rulings, seeing it as absurd that “providers are required to monitor even the smallest components of content they transmit or store for their users”. But Mosley last week extended his action to the UK, filing a claim in the high court in London.

Mosley’s willingness to continue fighting, even when he knows that it means keeping alive the image of his white, septuagenarian buttocks in the minds (if not on the computers) of the public, seems impressively principled. He has fallen victim to what is known as the Streisand Effect, where his very attempt to hide information about himself has led to its proliferation (in 2003 Barbra Streisand tried to stop people taking pictures of her Malibu home, ensuring photos were posted far and wide). Despite this, he continues to battle – in court, in the media and by directly confronting the websites that continue to display the pictures. It is as if he is using that initial stab of shame, turning it against those who sought to humiliate him. It is noticeable that, having been accused of fetishising one dark period of German history, he uses another to attack Google. “I think, because of the Stasi,” he says, “the Germans can understand that there isn’t a huge difference between the state watching everything you do and Google watching everything you do. Except that, in most European countries, the state tends to be an elected body, whereas Google isn’t. There’s not a lot of difference between the actions of the government of East Germany and the actions of Google.”

All this brings us to some fundamental questions about the role of search engines. Is Google the de facto librarian of the internet, given that it is estimated to handle 40% of all traffic? Is it something more than a librarian, since its algorithms carefully (and with increasing use of your personal data) select the sites it wants you to view? To what extent can Google be held responsible for the content it puts before us?

Read the entire article here.

Frozen Moving Pictures

green-salt-flotowarner

Recent works by artist duo Floto+Warner could be mistaken for a family of bizarrely fluid, alien life-forms rather than 3D sculptures of colorful chemicals. While these still images of fluorescent airborne liquids certainly pay homage to Jackson Pollock, they have a unique and playful character all their own. And, in this case, the creative process is just as fascinating as the end result.

From Jonathan Jones over at the Guardian:

Luridly chemical colours hang in the air in the vast wastelands of Nevada in an eye-catching set of pictures by the New York art duo Floto+Warner. To make these images of bright liquids arrested in space, Cassandra Warner and Jeremy Floto threw up cocktails of colour until their camera caught just the splashy, fluid, stilled moments they wanted to record. Apparently, Photoshop is not involved.

These images echo the great modern tradition that pictures motion, energy and flux. “Energy and motion made visible – memories arrested in space,” as Jackson Pollock said of his paintings that he made by dripping, flicking and throwing paint on to canvases laid on the floor. Pollock’s “action paintings” are the obvious source of Floto and Warner’s hurled colours: their photographs are playful riffs on Pollock. And they bring out one of the most startling things about his art: the sense it is still in motion even when it has stopped; the feel of paint being liquid long after it has dried.

Floto and Warner prove that Pollock is still the Great American Artist, 58 years after his death. American art still can’t help echoing him. Works from Robert Smithson’s Spiral Jetty to Andy Warhol’s piss paintings echo his free-ranging exploration of space and his dynamic expansion of the act of drawing.

Yet these images of arrested veils and clouds of colour also echo other attempts to capture living motion. In 1830 to 1831 Hokusai depicted The Great Wave off Kanagawa as a tower of blueness cresting into white foam and about to fall onto the boats helplessly caught in its path. Hokusai’s woodblock print is a decisive moment in the story of art. It takes motion as a topic, and distills its essence in an image at once dynamic and suspended.

Photographers would soon take up Hokusai’s challenge to understand the nature of motion. Famously, Eadweard Muybridge in the late 19th century took strange serial studies of human and animal bodies in motion. Yet the photographer whom Floto+Warner echo most vividly is Harold E Edgerton, who brought the scientific photography of movement into modern times in striking pictures of a foot kicking a ball or a bullet piercing an apple.

Read the entire story and see more of Floto+Warner’s images here.

Image: Green Salt, Floto+Warner. Courtesy of the Guardian.

The Cosmological Axis of Evil

WMAP_temp-anisotropy

The cosmos seems remarkably uniform — look in any direction with the naked eye or the most powerful telescopes and you’ll see much the same as in any other direction. Yet, on a grand scale, our universe shows some peculiar fluctuations that have cosmologists scratching their heads. The temperature of the universe, as described by the cosmic microwave background (CMB), shows some interesting fluctuations in specific, vast regions. It is the distribution of these temperature variations that seems to follow a non-random pattern. Cosmologists have dubbed the pattern the “axis of evil”.

From ars technica:

The Universe is incredibly regular. The variation of the cosmos’ temperature across the entire sky is tiny: a few millionths of a degree, no matter which direction you look. Yet the same light from the very early cosmos that reveals the Universe’s evenness also tells astronomers a great deal about the conditions that gave rise to irregularities like stars, galaxies, and (incidentally) us.

That light is the cosmic microwave background, and it provides some of the best knowledge we have about the structure, content, and history of the Universe. But it also contains a few mysteries: on very large scales, the cosmos seems to have a certain lopsidedness. That slight asymmetry is reflected in temperature fluctuations much larger than any galaxy, aligned on the sky in a pattern facetiously dubbed “the axis of evil.”

The lopsidedness is real, but cosmologists are divided over whether it reveals anything meaningful about the fundamental laws of physics. The fluctuations are sufficiently small that they could arise from random chance. We have just one observable Universe, but nobody sensible believes we can see all of it. With a sufficiently large cosmos beyond the reach of our telescopes, the rest of the Universe may balance the oddity that we can see, making it a minor, local variation.

However, if the asymmetry can’t be explained away so simply, it could indicate that some new physical mechanisms were at work in the early history of the Universe. As Amanda Yoho, a graduate student in cosmology at Case Western Reserve University, told Ars, “I think the alignments, in conjunction with all of the other large angle anomalies, must point to something we don’t know, whether that be new fundamental physics, unknown astrophysical or cosmological sources, or something else.”

Over the centuries, astronomers have provided increasing evidence that Earth, the Solar System, and the Milky Way don’t occupy a special position in the cosmos. Not only are we not at the center of existence—much less the corrupt sinkhole surrounded by the pure crystal heavens, as in early geocentric Christian theology—the Universe has no center and no edge.

In cosmology, that’s elevated to a principle. The Universe is isotropic, meaning it’s (roughly) the same in every direction. The cosmic microwave background (CMB) is the strongest evidence for the isotropic principle: the spectrum of the light reaching Earth from every direction indicates that it was emitted by matter at almost exactly the same temperature.

The Big Bang model explains why. In the early years of the Universe’s history, matter was very dense and hot, forming an opaque plasma of electrons, protons, and helium nuclei. The expansion of space-time thinned the plasma out until it cooled enough that stable atoms could form. That event, which ended roughly 380,000 years after the Big Bang, is known as recombination. The immediate side effect was to make the Universe transparent and liberate vast numbers of photons, most of which have traveled through space unmolested ever since.

We observe the relics of recombination in the form of the CMB. The temperature of the Universe today is about 2.73 degrees above absolute zero in every part of the sky. The lack of variation makes the cosmos nearly as close to a perfect thermal body as possible. However, measurements show anisotropies—tiny fluctuations in temperature, roughly 10 millionths of a degree or less. These irregularities later gave rise to areas where mass gathered. A perfectly featureless, isotropic cosmos would have no stars, galaxies, or planets full of humans.

To measure the physical size of these anisotropies, researchers turn the whole-sky map of temperature fluctuations into something called a power spectrum. That’s akin to the process of taking light from a galaxy and finding the component wavelengths (colors) that make it up. The power spectrum encompasses fluctuations over the whole sky down to very small variations in temperature. (For those with some higher mathematics knowledge, this process involves decomposing the temperature fluctuations in spherical harmonics.)
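
As a brief aside, here is a minimal sketch of that decomposition in standard cosmology notation (our addition, not part of the ars technica excerpt): the sky map of fractional temperature fluctuations is expanded in spherical harmonics, and the power spectrum collects the average power at each multipole,

\[
\frac{\Delta T}{T}(\theta,\phi) = \sum_{\ell=0}^{\infty}\sum_{m=-\ell}^{\ell} a_{\ell m}\, Y_{\ell m}(\theta,\phi),
\qquad
C_\ell = \frac{1}{2\ell+1}\sum_{m=-\ell}^{\ell} \left| a_{\ell m} \right|^{2}.
\]

A multipole ℓ probes an angular scale of roughly 180°/ℓ, so the large fluctuations discussed next (patterns covering one-fourth, one-eighth, and one-sixteenth of the sky) are the low multipoles ℓ = 2, 3, and 4: the quadrupole, octopole, and hexadecapole.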

Smaller details in the fluctuations tell cosmologists the relative amounts of ordinary matter, dark matter, and dark energy. However, some of the largest fluctuations—covering one-fourth, one-eighth, and one-sixteenth of the sky—are bigger than any structure in the Universe; they represent temperature variations across the whole sky.

Those large-scale fluctuations in the power spectrum are where something weird happens. The temperature variations are both larger than expected and aligned with each other to a high degree. That’s at odds with theoretical expectations: the CMB anisotropies should be randomly oriented, not aligned. In fact, the smaller-scale variations are random, which makes the deviation at larger scales that much stranger.

Kate Land and Joao Magueijo jokingly dubbed the strange alignment “the axis of evil” in a 2005 paper (freely available on the ArXiv), riffing on an infamous statement by then-US President George W. Bush. Their findings were based on data from an earlier observatory, the Wilkinson Microwave Anisotropy Probe (WMAP), but the follow-up Planck mission found similar results. There’s no question that the “axis of evil” is there; cosmologists just have to figure out what to think about it.

The task of interpretation is complicated by what’s called “cosmic variance,” or the fact that our observable Universe is just one region in a larger Universe. Random chance dictates that some pockets of the whole Universe will have larger or smaller fluctuations than others, and those fluctuations might even be aligned entirely by coincidence.

In other words, the “axis of evil” could very well be an illusion, a pattern that wouldn’t seem amiss if we could see more of the Universe. However, cosmic variance also predicts how big those local, random deviations should be—and the fluctuations in the CMB data are larger. They’re not so large as to rule out the possibility of a local variation entirely—above average, but not wildly so—yet cosmologists can’t easily dismiss the possibility that something else is going on.

Read the entire article here.

Image courtesy of Hinshaw et al, WMAP paper.

Don’t Hitchhike, Unless You’re a Robot

hitchbot

A Canadian is trying valiantly to hitchhike across the nation, coast to coast, from Nova Scotia to British Columbia. While others have made this trek before, this journey is peculiar in one respect: the intrepid hitchhiker is a child-sized robot. She or he — we don’t really know — is named hitchBOT.

hitchBOT is currently still in eastern Canada; New Brunswick, to be more precise. So one has to wonder whether (s)he would have made better progress by commandeering one of Google’s self-driving cars to make the 3,781-mile journey.

Read the entire story and follow hitchBOT’s progress across Canada here.

Image courtesy of hitchBOT / Independent.

Ugliness Behind the Beautiful Game

Google-map-Qatar

Qatar hosts the World Cup in 2022. This gives the emirate another 8 years to finish construction of the various football venues, hotels and infrastructure required to support the world’s biggest single sporting event.

Perhaps it will also give the emirate some time to clean up its appalling record of worker abuse and human rights violations. Numerous laborers have died during the construction process, while others are paid minimal wages or not at all. And, to top it off, most employees live in atrocious conditions, cannot move freely, and can neither change jobs nor even repatriate — many come from the Indian subcontinent or East Asia. You could be forgiven for labeling these people indentured servants rather than workers.

From the Guardian:

Migrant workers who built luxury offices used by Qatar’s 2022 football World Cup organisers have told the Guardian they have not been paid for more than a year and are now working illegally from cockroach-infested lodgings.

Officials in Qatar’s Supreme Committee for Delivery and Legacy have been using offices on the 38th and 39th floors of Doha’s landmark al-Bidda skyscraper – known as the Tower of Football – which were fitted out by men from Nepal, Sri Lanka and India who say they have not been paid for up to 13 months’ work.

The project, a Guardian investigation shows, was directly commissioned by the Qatar government and the workers’ plight is set to raise fresh doubts over the autocratic emirate’s commitment to labour rights as construction starts this year on five new stadiums for the World Cup.

The offices, which cost £2.5m to fit out, feature expensive etched glass, handmade Italian furniture, and even a heated executive toilet, project sources said. Yet some of the workers have not been paid, despite complaining to the Qatari authorities months ago and being owed wages as modest as £6 a day.

By the end of this year, several hundred thousand extra migrant workers from some of the world’s poorest countries are scheduled to have travelled to Qatar to build World Cup facilities and infrastructure. The acceleration in the building programme comes amid international concern over a rising death toll among migrant workers and the use of forced labour.

“We don’t know how much they are spending on the World Cup, but we just need our salary,” said one worker who had lost a year’s pay on the project. “We were working, but not getting the salary. The government, the company: just provide the money.”

The migrants are squeezed seven to a room, sleeping on thin, dirty mattresses on the floor and on bunk beds, in breach of Qatar’s own labour standards. They live in constant fear of imprisonment because they have been left without paperwork after the contractor on the project, Lee Trading and Contracting, collapsed. They say they are now being exploited on wages as low as 50p an hour.

Their case was raised with Qatar’s prime minister by Amnesty International last November, but the workers have said 13 of them remain stranded in Qatar. Despite having done nothing wrong, five have even been arrested and imprisoned by Qatari police because they did not have ID papers. Legal claims lodged against the former employer at the labour court in November have proved fruitless. They are so poor they can no longer afford the taxi to court to pursue their cases, they say.

A 35-year-old Nepalese worker and father of three, who said he too had lost a year’s pay, added: “If I had money to buy a ticket, I would go home.”

Qatar’s World Cup organising committee confirmed that it had been granted use of temporary offices on the floors fitted out by the unpaid workers. It said it was “heavily dismayed to learn of the behaviour of Lee Trading with regard to the timely payment of its workers”. The committee stressed it did not commission the firm. “We strongly disapprove and will continue to press for a speedy and fair conclusion to all cases,” it said.

Jim Murphy, the shadow international development secretary, said the revelation added to the pressure on the World Cup organising committee. “They work out of this building, but so far they can’t even deliver justice for the men who toiled at their own HQ,” he said.

Sharan Burrow, secretary general of the International Trade Union Confederation, said the workers’ treatment was criminal. “It is an appalling abuse of fundamental rights, yet there is no concern from the Qatar government unless they are found out,” she said. “In any other country you could prosecute this behaviour.”

Read the entire article here.

Image: Qatar. Courtesy of Google Maps.

MondayMap: Drought Mapping

US-drought

The NYT has a fascinating and detailed article, bursting with charts and statistics, that shows the pervasive grip of the drought in the United States. The desert Southwest and West continue to be parched and scorched. This is not a pretty picture for farmers, and increasingly for those (sub-)urban dwellers who rely upon a fragile and dwindling water supply.

From the NYT:

Droughts appear to be intensifying over much of the West and Southwest as a result of global warming. Over the past decade, droughts in some regions have rivaled the epic dry spells of the 1930s and 1950s. About 34 percent of the contiguous United States was in at least a moderate drought as of July 22.

Things have been particularly bad in California, where state officials have approved drastic measures to reduce water consumption. California farmers, without water from reservoirs in the Central Valley, are left to choose which of their crops to water. Parts of Texas, Oklahoma and surrounding states are also suffering from drought conditions.

The relationship between the climate and droughts is complicated. Parts of the country are becoming wetter: East of the Mississippi, rainfall has been rising. But global warming also appears to be causing moisture to evaporate faster in places that were already dry. Researchers believe drought conditions in these places are likely to intensify in coming years.

There has been little relief for some places since the summer of 2012. At the recent peak this May, about 40 percent of the country was abnormally dry or in at least a moderate drought.

Read the entire story and see the statistics for yourself here.

Image courtesy of Drought Monitor / NYT.

Computer Generated Reality

[tube]nLtmEjqzg7M[/tube]

Computer games have come a very long way since the pioneering days of Pong and Pac-Man. Games are now so realistic that many are indistinguishable from the real-world characters and scenarios they emulate. It is a testament to the skill and ingenuity of hardware and software engineers and the creativity of developers who bring all the diverse underlying elements of a game together. Now, however, they have a match in the form of a computer system that is able to generate richly imagined and rendered worlds for use in the games themselves. It’s all done through algorithms.
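
The usual technique behind such systems is seeded procedural generation: the world is computed on demand from a compact seed rather than stored. Below is a minimal Python sketch of that idea; it is a generic illustration only, assuming nothing about any particular game’s engine, and the function name cell_height is ours:

```python
import hashlib

def cell_height(seed: int, x: int, y: int) -> float:
    """Deterministic pseudo-random terrain height in [0, 1) for grid cell (x, y)."""
    # Hash the (seed, coordinates) pair so any cell can be computed
    # independently, in any order, without ever storing the world.
    digest = hashlib.sha256(f"{seed}:{x}:{y}".encode()).digest()
    return int.from_bytes(digest[:8], "big") / 2**64

# The same seed always regenerates the same terrain, so an effectively
# unlimited world "ships" as nothing more than a number.
print([round(cell_height(42, x, 0), 3) for x in range(5)])
print([round(cell_height(42, x, 0), 3) for x in range(5)])  # identical output
```

Real engines layer smoothed noise (Perlin or simplex), biome rules and rendering on top of a primitive like this, but the determinism-from-a-seed principle is the same.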

From Technology Review:

Read the entire story here.

Video: No Man’s Sky. Courtesy of Hello Games.

Gun Love

Gun Violence in America

The second amendment remains ever strong in the U.S. And, of course, so does the number of homicides and child deaths at the hands of guns. Sigh!

From the Guardian:

In February, a nine-year-old Arkansas boy called Hank asked his uncle if he could head off on his own from their remote camp to hunt a rabbit with his .22 calibre rifle. “I said all right,” recalled his uncle Brent later. “It wasn’t a concern. Some people are like, ‘a nine year old shouldn’t be off by himself,’ but he wasn’t an average nine year old.”

Hank was steeped in hunting: when he was two, his father, Brad, would put him in a rucksack on his back when he went turkey hunting. Brad regularly took Hank hunting and said that his son often went off hunting by himself. On this particular day, Hank and his uncle Brent had gone squirrel hunting together as his father was too sick to go.

When Hank didn’t return from hunting the rabbit, his uncle raised the alarm. His mother, Kelli, didn’t learn about his disappearance for seven hours. “They didn’t want to bother me unduly,” she says.

The following morning, though, after police, family and hundreds of locals searched around the camp, Hank’s body was found by a creek with a single bullet wound to the forehead. The cause of death was, according to the police, most likely a hunting accident.

“He slipped and the butt of the gun hit the ground and the gun fired,” says Kelli.

Kelli had recently bought the gun for Hank. “It was the first gun I had purchased for my son, just a youth .22 rifle. I never thought it would be a gun that would take his life.”

Both Kelli and Brad, from whom she is separated, believe that the gun was faulty – it shouldn’t have gone off unless the trigger was pulled, they claim. Since Hank’s death, she’s been posting warnings on her Facebook page about the gun her son used: “I wish someone else had posted warnings about it before what happened,” she says.

Had Kelli not bought the gun and had Brad not trained his son to use it, Hank would have celebrated his 10th birthday on 6 June, which his mother commemorated by posting Hank’s picture on her Facebook page with the message: “Happy Birthday Hank! Mommy loves you!”

Little Hank thus became one in a tally of what the makers of a Channel 4 documentary called Kids and Guns claim to be 3,000 American children who die each year from gun-related accidents. A recent Yale University study found that more than 7,000 US children and adolescents are hospitalised or killed by guns each year and estimates that about 20 children a day are treated in US emergency rooms following incidents involving guns.

Hank’s story is striking, certainly for British readers, for two reasons. One, it dramatises how hunting is for many Americans not the privileged pursuit it is overwhelmingly here, but a traditional family activity as much to do with foraging for food as it is a sport.

Francine Shaw, who directed Kids and Guns, says: “In rural America … people hunt to eat.”

Kelli has a fond memory of her son coming home with what he’d shot. “He’d come in and say: ‘Momma – I’ve got some squirrel to cook.’ And I’d say: ‘Gee, thanks.’ That child was happy to bring home meat. He was the happiest child when he came in from shooting.”

But Hank’s story is also striking because it shows how raising kids to hunt and shoot is seen as good parenting, perhaps even as an essential part of bringing up children in America – a society rife with guns and temperamentally incapable of overturning the second amendment that confers the right to bear arms, no matter how many innocent Americans die or get maimed as a result.

“People know I was a good mother and loved him dearly,” says Kelli. “We were both really good parents and no one has said anything hateful to us. The only thing that has been said is in a news report about a nine year old being allowed to hunt alone.”

Does Kelli regret that Hank was allowed to hunt alone at that young age? “Obviously I do, because I’ve lost my son,” she tells me. But she doesn’t blame Brent for letting him go off from camp unsupervised with a gun.

“We’re sure not anti-gun here, but do I wish I could go back in time and not buy that gun? Yes I do. I know you in England don’t have guns. I wish I could go back and have my son back. I would live in England, away from the guns.”

Read the entire article here.

Infographic courtesy of Care2 via visual.ly

The Best

The United States is home to many firsts and superlatives: first in democracy, wealth, openness, industry, innovation. The nation also takes great pride in its personal and cultural freedoms. Yet it is also home to another superlative: first in rates of incarceration. In fact, the US leads other nations by such a wide margin that questions continue to be asked. In the land of the free, something must be wrong.

From the Atlantic:

On Friday, the U.S. Sentencing Commission voted unanimously to allow nearly 50,000 nonviolent federal drug offenders to seek lower sentences. The commission’s decision retroactively applied an earlier change in sentencing guidelines to now cover roughly half of those serving federal drug sentences. Endorsed by both the Department of Justice and prison-reform advocates, the move is a significant—though, in a global context, still modest—step forward in reversing decades of mass incarceration.

How large is America’s prison problem? More than 2.4 million people are behind bars in the United States today, either awaiting trial or serving a sentence. That’s more than the combined population of 15 states, all but three U.S. cities, and the U.S. armed forces. They’re scattered throughout a constellation of 102 federal prisons, 1,719 state prisons, 2,259 juvenile facilities, 3,283 local jails, and many more military, immigration, territorial, and Indian Country facilities.

Compared to the rest of the world, these numbers are staggering. Here’s how the United States’ incarceration rate compares with those of other modern liberal democracies like Britain and Canada:

That graph is from a recent report by Prison Policy Initiative, an invaluable resource on mass incarceration. (PPI also has a disturbing graph comparing state incarceration rates with those of other countries around the world, which I highly recommend looking at here.) “Although our level of crime is comparable to those of other stable, internally secure, industrialized nations,” the report says, “the United States has an incarceration rate far higher than any other country.”

Some individual states like Louisiana contribute disproportionately, but no state is free from mass incarceration. Disturbingly, many states’ prison populations outrank even those of dictatorships and illiberal democracies around the world. New York jails more people per capita than Rwanda, where tens of thousands await trial for their roles in the 1994 genocide. California, Illinois, and Ohio each have a higher incarceration rate than Cuba and Russia. Even Maine and Vermont imprison a greater share of people than Saudi Arabia, Venezuela, or Egypt.

But mass incarceration is more than just an international anomaly; it’s also a relatively recent phenomenon in American criminal justice. Starting in the 1970s with the rise of tough-on-crime politicians and the War on Drugs, America’s prison population jumped eightfold between 1970 and 2010.

These two metrics—the international and the historical—have to be seen together to understand how aberrant mass incarceration is. In time or in space, the warehousing of millions of Americans knows no parallels. In keeping with American history, however, it also disproportionately harms the non-white and the non-wealthy. “For a great many poor people in America, particularly poor black men, prison is a destination that braids through an ordinary life, much as high school and college do for rich white ones,” wrote Adam Gopnik in his seminal 2012 article.

Mass incarceration on a scale almost unexampled in human history is a fundamental fact of our country today—perhaps the fundamental fact, as slavery was the fundamental fact of 1850. In truth, there are more black men in the grip of the criminal-justice system—in prison, on probation, or on parole—than were in slavery then. Over all, there are now more people under “correctional supervision” in America—more than six million—than were in the Gulag Archipelago under Stalin at its height.

Mass incarceration’s effects are not confined to the cell block. Through the inescapable stigma it imposes, a brush with the criminal-justice system can hamstring a former inmate’s employment and financial opportunities for life. The effect is magnified for those who already come from disadvantaged backgrounds. Black men, for example, made substantial economic progress between 1940 and 1980 thanks to the post-war economic boom and the dismantling of de jure racial segregation. But mass incarceration has all but ground that progress to a halt: A new University of Chicago study found that black men are no better off in 2014 than they were when Congress passed the Civil Rights Act 50 years earlier.

Read the entire article here.

Climate Change Denial: English Only

It’s official. Native English speakers are more likely to be in denial over climate change than non-English speakers. In fact, many of those who do not see a human hand in our planet’s environmental and climatic troubles are located in the United States, Britain, Australia and Canada. Enough said, in English.

Sacre bleu!

Now, the Guardian would have you believe that the media monopolist Rupert Murdoch is behind the climate change skeptics and deniers. After all, he is well known for his views on climate, and his empire controls large swathes of the media that most English-speaking people consume. However, it’s probably a little more complicated.

From the Guardian:

Here in the United States, we fret a lot about global warming denial. Not only is it a dangerous delusion, it’s an incredibly prevalent one. Depending on your survey instrument of choice, we regularly learn that substantial minorities of Americans deny, or are sceptical of, the science of climate change.

The global picture, however, is quite different. For instance, recently the UK-based market research firm Ipsos MORI released its “Global Trends 2014” report, which included a number of survey questions on the environment asked across 20 countries. (h/t Leo Hickman). And when it came to climate change, the result was very telling.

Note that these results are not perfectly comparable across countries, because the data were gathered online, and Ipsos MORI cautions that for developing countries like India and China, “the results should be viewed as representative of a more affluent and ‘connected’ population.”

Nonetheless, some pretty significant patterns are apparent. Perhaps most notably: Not only is the United States clearly the worst in its climate denial, but Great Britain and Australia are second and third worst, respectively. Canada, meanwhile, is the seventh worst.

What do these four nations have in common? They all speak the language of Shakespeare.

Why would that be? After all, presumably there is nothing about English, in and of itself, that predisposes you to climate change denial. Words and phrases like “doubt,” “natural causes,” “climate models,” and other sceptic mots are readily available in other languages. So what’s the real cause?

One possible answer is that it’s all about the political ideologies prevalent in these four countries.

“I do not find these results surprising,” says Riley Dunlap, a sociologist at Oklahoma State University who has extensively studied the climate denial movement. “It’s the countries where neo-liberalism is most hegemonic and with strong neo-liberal regimes (both in power and lurking on the sidelines to retake power) that have bred the most active denial campaigns—US, UK, Australia and now Canada. And the messages employed by these campaigns filter via the media and political elites to the public, especially the ideologically receptive portions.” (Neoliberalism is an economic philosophy centered on the importance of free markets and broadly opposed to big government interventions.)

Indeed, the English language media in three of these four countries are linked together by a single individual: Rupert Murdoch. An apparent climate sceptic or lukewarmer, Murdoch is the chairman of News Corp and 21st Century Fox. (You can watch him express his climate views here.) Some of the media outlets subsumed by the two conglomerates that he heads are responsible for quite a lot of English language climate scepticism and denial.

In the US, Fox News and the Wall Street Journal lead the way; research shows that watching Fox increases distrust of climate scientists. (You can also catch Fox News in Canada.) In Australia, a recent study found that slightly under a third of climate-related articles in 10 top Australian newspapers “did not accept” the scientific consensus on climate change, and that News Corp papers — the Australian, the Herald Sun, and the Daily Telegraph — were particular hotbeds of scepticism. “The Australian represents climate science as matter of opinion or debate rather than as a field for inquiry and investigation like all scientific fields,” noted the study.

And then there’s the UK. A 2010 academic study found that while News Corp outlets in this country from 1997 to 2007 did not produce as much strident climate scepticism as did their counterparts in the US and Australia, “the Sun newspaper offered a place for scornful sceptics on its opinion pages as did The Times and Sunday Times to a lesser extent.” (There are also other outlets in the UK, such as the Daily Mail, that feature plenty of scepticism but aren’t owned by News Corp.)

Thus, while there may not be anything inherent to the English language that impels climate denial, the fact that English language media are such a major source of that denial may in effect create a language barrier.

And media aren’t the only reason that denialist arguments are more readily available in the English language. There’s also the Anglophone nations’ concentration of climate “sceptic” think tanks, which provide the arguments and rationalisations necessary to feed this anti-science position.

According to a study in the journal Climatic Change earlier this year, the US is home to 91 different organisations (think tanks, advocacy groups, and trade associations) that collectively comprise a “climate change counter-movement.” The annual funding of these organisations, collectively, is “just over $900 million.” That is a truly massive amount of English-speaking climate “sceptic” activity, and while the study was limited to the US, it is hard to imagine that anything comparable exists in non-English speaking countries.

Read the entire article here.

A Godless Universe: Mind or Mathematics

In his science column for the NYT, George Johnson reviews several recent books by noted thinkers who, for different reasons, believe science needs to expand its borders. Philosopher Thomas Nagel and physicist Max Tegmark both agree that our current understanding of the universe is rather limited and that science needs to turn to new or alternate explanations. Nagel, still an atheist, suggests in his book Mind and Cosmos that the mind somehow needs to be considered a fundamental structure of the universe, while Tegmark, in his book Our Mathematical Universe: My Quest for the Ultimate Nature of Reality, suggests that mathematics is the core, irreducible framework of the cosmos. These are two radically different ideas — yet both are correct in one respect: we still know so very little about ourselves and our surroundings.

From the NYT:

Though he probably didn’t intend anything so jarring, Nicolaus Copernicus, in a 16th-century treatise, gave rise to the idea that human beings do not occupy a special place in the heavens. Nearly 500 years after replacing the Earth with the sun as the center of the cosmic swirl, we’ve come to see ourselves as just another species on a planet orbiting a star in the boondocks of a galaxy in the universe we call home. And this may be just one of many universes — what cosmologists, some more skeptically than others, have named the multiverse.

Despite the long string of demotions, we remain confident, out here on the edge of nowhere, that our band of primates has what it takes to figure out the cosmos — what the writer Timothy Ferris called “the whole shebang.” New particles may yet be discovered, and even new laws. But it is almost taken for granted that everything from physics to biology, including the mind, ultimately comes down to four fundamental concepts: matter and energy interacting in an arena of space and time.

There are skeptics who suspect we may be missing a crucial piece of the puzzle. Recently, I’ve been struck by two books exploring that possibility in very different ways. There is no reason why, in this particular century, Homo sapiens should have gathered all the pieces needed for a theory of everything. In displacing humanity from a privileged position, the Copernican principle applies not just to where we are in space but to when we are in time.

Since it was published in 2012, “Mind and Cosmos,” by the philosopher Thomas Nagel, is the book that has caused the most consternation. With his taunting subtitle — “Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False” — Dr. Nagel was rejecting the idea that there was nothing more to the universe than matter and physical forces. He also doubted that the laws of evolution, as currently conceived, could have produced something as remarkable as sentient life. That idea borders on anathema, and the book quickly met with a blistering counterattack. Steven Pinker, a Harvard psychologist, denounced it as “the shoddy reasoning of a once-great thinker.”

What makes “Mind and Cosmos” worth reading is that Dr. Nagel is an atheist, who rejects the creationist idea of an intelligent designer. The answers, he believes, may still be found through science, but only by expanding it further than it may be willing to go.

“Humans are addicted to the hope for a final reckoning,” he wrote, “but intellectual humility requires that we resist the temptation to assume that the tools of the kind we now have are in principle sufficient to understand the universe as a whole.”

Dr. Nagel finds it astonishing that the human brain — this biological organ that evolved on the third rock from the sun — has developed a science and a mathematics so in tune with the cosmos that it can predict and explain so many things.

Neuroscientists assume that these mental powers somehow emerge from the electrical signaling of neurons — the circuitry of the brain. But no one has come close to explaining how that occurs.

That, Dr. Nagel proposes, might require another revolution: showing that mind, along with matter and energy, is “a fundamental principle of nature” — and that we live in a universe primed “to generate beings capable of comprehending it.” Rather than being a blind series of random mutations and adaptations, evolution would have a direction, maybe even a purpose.

“Above all,” he wrote, “I would like to extend the boundaries of what is not regarded as unthinkable, in light of how little we really understand about the world.”

Dr. Nagel is not alone in entertaining such ideas. While rejecting anything mystical, the biologist Stuart Kauffman has suggested that Darwinian theory must somehow be expanded to explain the emergence of complex, intelligent creatures. And David J. Chalmers, a philosopher, has called on scientists to seriously consider “panpsychism” — the idea that some kind of consciousness, however rudimentary, pervades the stuff of the universe.

Some of this is a matter of scientific taste. It can be just as exhilarating, as Stephen Jay Gould proposed in “Wonderful Life,” to consider the conscious mind as simply a fluke, no more inevitable than the human appendix or a starfish’s five legs. But it doesn’t seem so crazy to consider alternate explanations.

Heading off in another direction, a new book by the physicist Max Tegmark suggests that a different ingredient — mathematics — needs to be admitted into science as one of nature’s irreducible parts. In fact, he believes, it may be the most fundamental of all.

In a well-known 1960 essay, the physicist Eugene Wigner marveled at “the unreasonable effectiveness of mathematics” in explaining the world. It is “something bordering on the mysterious,” he wrote, for which “there is no rational explanation.”

The best he could offer was that mathematics is “a wonderful gift which we neither understand nor deserve.”

Dr. Tegmark, in his new book, “Our Mathematical Universe: My Quest for the Ultimate Nature of Reality,” turns the idea on its head: The reason mathematics serves as such a forceful tool is that the universe is a mathematical structure. Going beyond Pythagoras and Plato, he sets out to show how matter, energy, space and time might emerge from numbers.

Read the entire article here.