Category Archives: Idea Soup

Colorless Green Ideas Sleep Furiously

Linguist, philosopher and, more recently, political activist Noam Chomsky penned the title phrase in the late 1950s. The sentence is grammatically correct but semantically nonsensical. Some now maintain that many of Chomsky’s early ideas on the innateness of human language are equally nonsensical. Chomsky popularized the idea that language is innate to humans: that somehow and somewhere the minds of human infants contain a mechanism that can make sense of language by applying rules encoded in and activated by our genes. Steven Pinker expanded on Chomsky’s theory by proposing that the mind contains an innate device that encodes a common, universal grammar, which is foundational to all languages across all human societies.

Recently, however, this notion has come under increasing criticism. A growing number of prominent linguistic scholars, including Professor Vyvyan Evans, maintain that Chomsky’s and Pinker’s linguistic models are outdated — that a universal grammar is nothing but a finely tuned myth. Evans and others argue that language arises from, and is directly embodied in, experience.

From the New Scientist:

The ideas of Noam Chomsky, popularised by Steven Pinker, come under fire in Vyvyan Evans’s book The Language Myth: Why language is not an instinct

IS THE way we think about language on the cusp of a revolution? After reading The Language Myth, it certainly looks as if a major shift is in progress, one that will open people’s minds to liberating new ways of thinking about language.

I came away excited. I found that words aren’t so much things that can be limited by a dictionary definition but are encyclopaedic, pointing to sets of concepts. There is the intriguing notion that language will always be less rich than our ideas and there will always be things we cannot quite express. And there is the growing evidence that words are rooted in concepts built out of our bodily experience of living in the world.

Its author, Vyvyan Evans, is a professor of linguistics at Bangor University, UK, and his primary purpose is not so much to map out the revolution (that comes in a sequel) but to prepare you for it by sweeping out old ideas. The book is sure to whip up a storm, because in his sights are key ideas from some of the world’s great thinkers, including philosophers Noam Chomsky and Jerry Fodor.

Ideas about language that have entered the public consciousness are more myth than reality, Evans argues. Bestsellers by Steven Pinker, the Harvard University professor who popularised Chomsky in The Language Instinct, How the Mind Works and The Stuff of Thought, come in for particular criticism. “Science has moved on,” Evans writes. “And to end it all, Pinker is largely wrong, about language and about a number of other things too…”

The commonplace view of “language as instinct” is the myth Evans wants to destroy and he attempts the operation with great verve. The myth comes from the way children effortlessly learn languages just by listening to adults around them, without being aware explicitly of the governing grammatical rules.

This “miracle” of spontaneous learning led Chomsky to argue that grammar is stored in a module of the mind, a “language acquisition device”, waiting to be activated, stage-by-stage, when an infant encounters the jumble of language. The rules behind language are built into our genes.

This innate grammar is not the grammar of a school textbook, but a universal grammar, capable of generating the rules of any of the 7000 or so languages that a child might be exposed to, however different they might appear. In The Language Instinct, Pinker puts it this way: “a Universal Grammar, not reducible to history or cognition, underlies the human language instinct”. The search for that universal grammar has kept linguists busy for half a century.

They may have been chasing a mirage. Evans marshals impressive empirical evidence to take apart different facets of the “language instinct myth”. A key criticism is that the more languages are studied, the more their diversity becomes apparent and an underlying universal grammar less probable.

In a whistle-stop tour, Evans tells stories of languages with a completely free word order, including Jiwarli and Thalanyji from Australia. Then there’s the Inuit language Inuktitut, which builds sentences out of prefixes and suffixes to create giant words like tawakiqutiqarpiit, roughly meaning: “Do you have any tobacco for sale?” And there is the native Canadian language, Straits Salish, which appears not to have nouns or verbs.

An innate language module also looks shaky, says Evans, now scholars have watched languages emerge among communities of deaf people. A sign language is as rich grammatically as a spoken one, but new ones don’t appear fully formed as we might expect if grammar is laid out in our genes. Instead, they gain grammatical richness over several generations.

Now, too, we have detailed studies of how children acquire language. Grammatical sentences don’t start to pop out of their mouths at certain developmental stages, but rather bits and pieces emerge as children learn. At first, they use chunks of particular expressions they hear often, only gradually learning patterns and generalising to a fully fledged grammar. So grammars emerge from use, and the view of “language-as-instinct”, argues Evans, should be replaced by “language-as-use”.

The “innate” view also encounters a deep philosophical problem. If the rules of language are built into our genes, how is it that sentences mean something? How do they connect to our thoughts, concepts and to the outside world?

A solution from the language-as-instinct camp is that there is an internal language of thought called “mentalese”. In The Language Instinct, Pinker explains: “Knowing a language, then, is knowing how to translate mentalese into strings of words.” But philosophers are left arguing over the same question once removed: how does mentalese come to have meaning?

Read the entire article here.

 

The Italian Canary Sings

Those who decry benefits fraud in their own nations should look to the illustrious example of Italian “miner” Carlo Cani. His adventures in dodging work over a period of 35 years (yes, years) would make a wonderful indie movie, and should be an inspiration to less ambitious slackers the world over.

From the Telegraph:

An Italian coal miner’s confession that he is drawing a pension despite hardly ever putting in a day’s work over a 35-year career has underlined the country’s problem with benefit fraud and its dysfunctional pension system.

Carlo Cani started work as a miner in 1980 but soon found that he suffered from claustrophobia and hated being underground.

He started doing everything he could to avoid hacking away at the coal face, inventing an imaginative range of excuses for not venturing down the mine in Sardinia where he was employed.

He pretended to be suffering from amnesia and haemorrhoids, rubbed coal dust into his eyes to feign an infection and on occasion staggered around pretending to be drunk.

The miner, now aged 60, managed to accumulate years of sick leave, apparently with the help of compliant doctors, and was able to stay at home to indulge his passion for jazz.

He also spent extended periods of time at home on reduced pay when demand for coal from the mine dipped, under an Italian system known as “cassa integrazione”, in which employees are kept on the payroll during periods of economic difficulty for their companies.

Despite his long periods of absence, he was still officially an employee of the mining company, Carbosulcis, and therefore eventually entitled to a pension.

“I invented everything – amnesia, pains, haemorrhoids, I used to lurch around as if I was drunk. I bumped my thumb on a wall and obviously you can’t work with a swollen thumb,” Mr Cani told La Stampa daily on Tuesday.

“Other times I would rub coal dust into my eyes. I just didn’t like the work – being a miner was not the job for me.”

But rather than find a different occupation, he managed to milk the system for 35 years, until retiring on a pension in 2006 at the age of just 52.

“I reached the pensionable age without hardly ever working. I hated being underground. Right from the start, I had no affinity for coal.”

He said he had “respect” for his fellow miners, who had earned their pensions after “years of sweat and back-breaking work”, while he had mostly rested at home.

The case only came to light this week but has caused such a furore in Italy that Mr Cani is now refusing to take telephone calls.

He could not be contacted but another Carlo Cani, who is no relation but lives in the same area of southern Sardinia and has his number listed in the phone book, said: “People round here are absolutely furious about this – to think that someone could skive off work for so long and still get his pension. He even seems to be proud of that fact.

“It’s shameful. This is a poor region and there is no work. All the young people are leaving and moving to England and Germany.”

The former miner’s work-shy ways have caused indignation in a country in which youth unemployment is more than 40 per cent.

Read the entire story here.

Image: Bituminous coal. The type of coal not mined by retired “miner” Carlo Cani. Courtesy of Wikipedia.

Cross-Connection Requires a Certain Daring

A previously unpublished essay by Isaac Asimov on the creative process shows us his well-reasoned thinking on the subject. While he believed that deriving new ideas could be done productively in a group, he seemed to gravitate more towards the notion of the lone creative genius. Both approaches, however, require the innovator(s) to cross-connect thoughts, often from disparate sources.

From Technology Review:

How do people get new ideas?

Presumably, the process of creativity, whatever it is, is essentially the same in all its branches and varieties, so that the evolution of a new art form, a new gadget, a new scientific principle, all involve common factors. We are most interested in the “creation” of a new scientific principle or a new application of an old one, but we can be general here.

One way of investigating the problem is to consider the great ideas of the past and see just how they were generated. Unfortunately, the method of generation is never clear even to the “generators” themselves.

But what if the same earth-shaking idea occurred to two men, simultaneously and independently? Perhaps, the common factors involved would be illuminating. Consider the theory of evolution by natural selection, independently created by Charles Darwin and Alfred Wallace.

There is a great deal in common there. Both traveled to far places, observing strange species of plants and animals and the manner in which they varied from place to place. Both were keenly interested in finding an explanation for this, and both failed until each happened to read Malthus’s “Essay on Population.”

Both then saw how the notion of overpopulation and weeding out (which Malthus had applied to human beings) would fit into the doctrine of evolution by natural selection (if applied to species generally).

Obviously, then, what is needed is not only people with a good background in a particular field, but also people capable of making a connection between item 1 and item 2 which might not ordinarily seem connected.

Undoubtedly in the first half of the 19th century, a great many naturalists had studied the manner in which species were differentiated among themselves. A great many people had read Malthus. Perhaps some both studied species and read Malthus. But what you needed was someone who studied species, read Malthus, and had the ability to make a cross-connection.

That is the crucial point that is the rare characteristic that must be found. Once the cross-connection is made, it becomes obvious. Thomas H. Huxley is supposed to have exclaimed after reading On the Origin of Species, “How stupid of me not to have thought of this.”

But why didn’t he think of it? The history of human thought would make it seem that there is difficulty in thinking of an idea even when all the facts are on the table. Making the cross-connection requires a certain daring. It must, for any cross-connection that does not require daring is performed at once by many and develops not as a “new idea,” but as a mere “corollary of an old idea.”

It is only afterward that a new idea seems reasonable. To begin with, it usually seems unreasonable. It seems the height of unreason to suppose the earth was round instead of flat, or that it moved instead of the sun, or that objects required a force to stop them when in motion, instead of a force to keep them moving, and so on.

A person willing to fly in the face of reason, authority, and common sense must be a person of considerable self-assurance. Since he occurs only rarely, he must seem eccentric (in at least that respect) to the rest of us. A person eccentric in one respect is often eccentric in others.

Consequently, the person who is most likely to get new ideas is a person of good background in the field of interest and one who is unconventional in his habits. (To be a crackpot is not, however, enough in itself.)

Once you have the people you want, the next question is: Do you want to bring them together so that they may discuss the problem mutually, or should you inform each of the problem and allow them to work in isolation?

My feeling is that as far as creativity is concerned, isolation is required. The creative person is, in any case, continually working at it. His mind is shuffling his information at all times, even when he is not conscious of it. (The famous example of Kekule working out the structure of benzene in his sleep is well-known.)

The presence of others can only inhibit this process, since creation is embarrassing. For every new good idea you have, there are a hundred, ten thousand foolish ones, which you naturally do not care to display.

Nevertheless, a meeting of such people may be desirable for reasons other than the act of creation itself.

Read the entire article here.

The Sandwich of Corporate Exploitation


If ever you needed a vivid example of corporate exploitation of the most vulnerable, this is it. So-called free-marketeers will sneer at any suggestion of corporate over-reach — they will chant that it’s just the free market at work. But the rules of this market, as of many others, are written and enforced by the patricians and stacked heavily against the plebs.

From NYT:

If you are a chief executive of a large company, you very likely have a noncompete clause in your contract, preventing you from jumping ship to a competitor until some period has elapsed. Likewise if you are a top engineer or product designer, holding your company’s most valuable intellectual property between your ears.

And you also probably have a noncompete agreement if you assemble sandwiches at Jimmy John’s sub sandwich chain for a living.

But what’s most startling about that information, first reported by The Huffington Post, is that it really isn’t all that uncommon. As my colleague Steven Greenhouse reported this year, employers are now insisting that workers in a surprising variety of relatively low- and moderate-paid jobs sign noncompete agreements.

Indeed, while HuffPo has no evidence that Jimmy John’s, a 2,000-location sandwich chain, ever tried to enforce the agreement to prevent some $8-an-hour sandwich maker or delivery driver from taking a job at the Blimpie down the road, there are other cases where low-paid or entry-level workers have had an employer try to restrict their employability elsewhere. The Times article tells of a camp counselor and a hair stylist who faced such restrictions.

American businesses are paying out a historically low proportion of their income in the form of wages and salaries. But the Jimmy John’s employment agreement is one small piece of evidence that workers, especially those without advanced skills, are also facing various practices and procedures that leave them worse off, even apart from what their official hourly pay might be. Collectively they tilt the playing field toward the owners of businesses and away from the workers who staff them.

You see it in disputes like the one heading to the Supreme Court over whether workers at an Amazon warehouse in Nevada must be paid for the time they wait to be screened at the end of the workday to ensure they have no stolen goods on them.

It’s evident in continuing lawsuits against Federal Express claiming that its “independent contractors” who deliver packages are in fact employees who are entitled to benefits and reimbursements of costs they incur.

And it is shown in the way many retailers assign hourly workers inconvenient schedules that can change at the last minute, giving them little ability to plan their lives (my colleague Jodi Kantor wrote memorably about the human effects of those policies on a Starbucks coffee worker in August, and Starbucks rapidly said it would end many of them).

These stories all expose the subtle ways that employers extract more value from their entry-level workers, at the cost of their quality of life (or, in the case of the noncompete agreements, freedom to leave for a more lucrative offer).

What’s striking about some of these labor practices is the absence of reciprocity. When a top executive agrees to a noncompete clause in a contract, it is typically the product of a negotiation in which there is some symmetry: The executive isn’t allowed to quit for a competitor, but he or she is guaranteed to be paid for the length of the contract even if fired.

Read the entire story here.

Image courtesy of Google Search.

Frenemies: The Religious Beheading and The Secular Guillotine

Secular ideologues in the West believe they are on the moral high-ground. The separation of church (and mosque or synagogue) from state is, they believe, the path to a more just, equal and less-violent culture. They will cite example after example in contemporary and recent culture of terrible violence in the name of religious extremism and fundamentalism.

And, yet, step back for a minute from the horrendous stories and images of atrocities wrought by religious fanatics in Europe, Africa, Asia and the Middle East. Think of the recent histories of fledgling nations in Africa; the ethnic cleansings across much of Central and Eastern Europe — several times over; the egomaniacal tribal terrorists of Central Asia; the brutality of neo-fascists and their socialist bedfellows in Latin America. Delve deeper into these tragic histories — some still unfolding before our very eyes — and you will see a much more complex view of humanity. Our tribal rivalries know no bounds, and our violence towards others is certainly not limited to the catalyst of religion. Yes, we fight over religion, but we also fight over territory, politics, resources, nationalism, revenge, poverty and ego. The coming fights will be over water and food — and these will make our wars over belief systems seem rather petty.

Scholar and author Karen Armstrong explores the complexities of religious and secular violence in the broader context of human struggle in her new book, Fields of Blood: Religion and the History of Violence.

From the Guardian:

As we watch the fighters of the Islamic State (Isis) rampaging through the Middle East, tearing apart the modern nation-states of Syria and Iraq created by departing European colonialists, it may be difficult to believe we are living in the 21st century. The sight of throngs of terrified refugees and the savage and indiscriminate violence is all too reminiscent of barbarian tribes sweeping away the Roman empire, or the Mongol hordes of Genghis Khan cutting a swath through China, Anatolia, Russia and eastern Europe, devastating entire cities and massacring their inhabitants. Only the wearily familiar pictures of bombs falling yet again on Middle Eastern cities and towns – this time dropped by the United States and a few Arab allies – and the gloomy predictions that this may become another Vietnam, remind us that this is indeed a very modern war.

The ferocious cruelty of these jihadist fighters, quoting the Qur’an as they behead their hapless victims, raises another distinctly modern concern: the connection between religion and violence. The atrocities of Isis would seem to prove that Sam Harris, one of the loudest voices of the “New Atheism”, was right to claim that “most Muslims are utterly deranged by their religious faith”, and to conclude that “religion itself produces a perverse solidarity that we must find some way to undercut”. Many will agree with Richard Dawkins, who wrote in The God Delusion that “only religious faith is a strong enough force to motivate such utter madness in otherwise sane and decent people”. Even those who find these statements too extreme may still believe, instinctively, that there is a violent essence inherent in religion, which inevitably radicalises any conflict – because once combatants are convinced that God is on their side, compromise becomes impossible and cruelty knows no bounds.

Despite the valiant attempts by Barack Obama and David Cameron to insist that the lawless violence of Isis has nothing to do with Islam, many will disagree. They may also feel exasperated. In the west, we learned from bitter experience that the fanatical bigotry which religion seems always to unleash can only be contained by the creation of a liberal state that separates politics and religion. Never again, we believed, would these intolerant passions be allowed to intrude on political life. But why, oh why, have Muslims found it impossible to arrive at this logical solution to their current problems? Why do they cling with perverse obstinacy to the obviously bad idea of theocracy? Why, in short, have they been unable to enter the modern world? The answer must surely lie in their primitive and atavistic religion.

But perhaps we should ask, instead, how it came about that we in the west developed our view of religion as a purely private pursuit, essentially separate from all other human activities, and especially distinct from politics. After all, warfare and violence have always been a feature of political life, and yet we alone drew the conclusion that separating the church from the state was a prerequisite for peace. Secularism has become so natural to us that we assume it emerged organically, as a necessary condition of any society’s progress into modernity. Yet it was in fact a distinct creation, which arose as a result of a peculiar concatenation of historical circumstances; we may be mistaken to assume that it would evolve in the same fashion in every culture in every part of the world.

We now take the secular state so much for granted that it is hard for us to appreciate its novelty, since before the modern period, there were no “secular” institutions and no “secular” states in our sense of the word. Their creation required the development of an entirely different understanding of religion, one that was unique to the modern west. No other culture has had anything remotely like it, and before the 18th century, it would have been incomprehensible even to European Catholics. The words in other languages that we translate as “religion” invariably refer to something vaguer, larger and more inclusive. The Arabic word din signifies an entire way of life, and the Sanskrit dharma covers law, politics, and social institutions as well as piety. The Hebrew Bible has no abstract concept of “religion”; and the Talmudic rabbis would have found it impossible to define faith in a single word or formula, because the Talmud was expressly designed to bring the whole of human life into the ambit of the sacred. The Oxford Classical Dictionary firmly states: “No word in either Greek or Latin corresponds to the English ‘religion’ or ‘religious’.” In fact, the only tradition that satisfies the modern western criterion of religion as a purely private pursuit is Protestant Christianity, which, like our western view of “religion”, was also a creation of the early modern period.

Traditional spirituality did not urge people to retreat from political activity. The prophets of Israel had harsh words for those who assiduously observed the temple rituals but neglected the plight of the poor and oppressed. Jesus’s famous maxim to “Render unto Caesar the things that are Caesar’s” was not a plea for the separation of religion and politics. Nearly all the uprisings against Rome in first-century Palestine were inspired by the conviction that the Land of Israel and its produce belonged to God, so that there was, therefore, precious little to “give back” to Caesar. When Jesus overturned the money-changers’ tables in the temple, he was not demanding a more spiritualised religion. For 500 years, the temple had been an instrument of imperial control and the tribute for Rome was stored there. Hence for Jesus it was a “den of thieves”. The bedrock message of the Qur’an is that it is wrong to build a private fortune but good to share your wealth in order to create a just, egalitarian and decent society. Gandhi would have agreed that these were matters of sacred import: “Those who say that religion has nothing to do with politics do not know what religion means.”

The myth of religious violence

Before the modern period, religion was not a separate activity, hermetically sealed off from all others; rather, it permeated all human undertakings, including economics, state-building, politics and warfare. Before 1700, it would have been impossible for people to say where, for example, “politics” ended and “religion” began. The Crusades were certainly inspired by religious passion but they were also deeply political: Pope Urban II let the knights of Christendom loose on the Muslim world to extend the power of the church eastwards and create a papal monarchy that would control Christian Europe. The Spanish inquisition was a deeply flawed attempt to secure the internal order of Spain after a divisive civil war, at a time when the nation feared an imminent attack by the Ottoman empire. Similarly, the European wars of religion and the thirty years war were certainly exacerbated by the sectarian quarrels of Protestants and Catholics, but their violence reflected the birth pangs of the modern nation-state.

Read the entire article here.

Past Experience is Good; Random Decision-Making is Better

We all know that making decisions from past experience is wise. We learn from the benefit of hindsight. We learn to make small improvements or radical shifts in our thinking and behaviors based on history and previous empirical evidence. Stock market gurus and investment mavens will tell you time after time that they have a proven method — based on empirical evidence and a lengthy, illustrious track record — for picking the next great stock or investing your hard-earned retirement funds.

Yet, empirical evidence shows that chimpanzees throwing darts at the WSJ stock pages are just as good at picking stocks as we humans (and the “masters of the universe”). So, it seems that random decision-making can be just as good as, if not better than, wisdom and experience.
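The dart-throwing claim is easy to test in miniature. Below is a toy simulation (my own sketch, not taken from the article quoted below) that assumes stock returns follow an independent random walk, which is precisely the condition under which past performance carries no signal, and compares a random “dartboard” pick with a naive momentum-style “expert” pick. Under that assumption the two strategies come out roughly even on average, which is the point of the monkey experiments.

```python
import random

def simulate_returns(n_stocks=100, n_periods=250, seed=0):
    """Hypothetical market: every stock's per-period return is an independent
    draw from the same distribution, so past performance carries no signal."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0004, 0.02) for _ in range(n_periods)]
            for _ in range(n_stocks)]

def dartboard_pick(n_stocks, rng, k=10):
    """The chimp with darts: k stocks chosen uniformly at random."""
    return rng.sample(range(n_stocks), k)

def momentum_pick(returns, lookback=50, k=10):
    """The 'expert': the k stocks with the best record over the lookback window."""
    ranked = sorted(range(len(returns)),
                    key=lambda i: sum(returns[i][:lookback]), reverse=True)
    return ranked[:k]

def future_return(returns, picks, start=50):
    """Average cumulative return of the picked stocks after the lookback window."""
    return sum(sum(returns[i][start:]) for i in picks) / len(picks)

if __name__ == "__main__":
    rng = random.Random(1)
    darts, experts = [], []
    for trial in range(200):
        rets = simulate_returns(seed=trial)
        darts.append(future_return(rets, dartboard_pick(len(rets), rng)))
        experts.append(future_return(rets, momentum_pick(rets)))
    print("random darts  :", round(sum(darts) / len(darts), 4))
    print("momentum picks:", round(sum(experts) / len(experts), 4))
```

Real markets are not independent draws, of course, but the sketch shows why, when they come close to it, randomness is hard to beat.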

From the Guardian:

No matter how much time you spend reading the recent crop of books on How To Decide or How To Think Clearly, you’re unlikely to encounter glowing references to a decision-making system formerly used by the Azande of central Africa. Faced with a dilemma, tribespeople would force poison down the neck of a chicken while asking questions of the “poison oracle”; the chicken answered by surviving (“yes”) or expiring (“no”). Clearly, this was cruel to chickens. That aside, was it such a terrible way to choose among options? The anthropologist EE Evans-Pritchard, who lived with the Azande in the 1920s, didn’t think so. “I always kept a supply of poison [and] we regulated our affairs in accordance with the oracle’s decisions,” he wrote, adding drily: “I found this as satisfactory a way of running my home and affairs as any other I know of.” You could dismiss that as a joke. After all, chicken-poisoning is plainly superstition, delivering random results. But what if random results are sometimes exactly what you need?

The other day, US neuroscientists published details of experiments on rats, showing that in certain unpredictable situations, they stop trying to make decisions based on past experience. Instead, a circuit in their brains switches to “random mode”. The researchers’ hunch is that this serves a purpose: past experience is usually helpful, but when uncertainty levels are high, it can mislead, so randomness is in the rats’ best interests. When we’re faced with the unfamiliar, experience can mislead humans, too, partly because we filter it through various irrational biases. According to those books on thinking clearly, we should strive to overcome these biases, thus making more rational calculations. But there’s another way to bypass our biased brains: copy the rats, and choose randomly.

In certain walks of life, the usefulness of randomness is old news: the stock market, say, is so unpredictable that, to quote the economist Burton Malkiel, “a blindfolded monkey throwing darts at a newspaper’s financial pages could select a portfolio that would do as well as one carefully selected by experts”. (This has been tried, with simulated monkeys, and they beat the market.) But, generally, as Michael Schulson put it recently in an Aeon magazine essay, “We take it for granted that the best decisions stem from empirical analysis and informed choice.” Yet consider, he suggests, the ancient Greek tradition of filling some government positions by lottery. Randomness disinfects a process that might be dirtied by corruption.

Randomness can be similarly useful in everyday life. For tiny choices, it’s a time-saver: pick randomly from a menu, and you can get back to chatting with friends. For bigger ones, it’s an acknowledgment of how little one can ever know about the complex implications of a decision. Let’s be realistic: for the biggest decisions, such as whom to marry, trusting to randomness feels absurd. But if you can up the randomness quotient for marginally less weighty choices, especially when uncertainty prevails, you may find it pays off. Though kindly refrain from poisoning any chickens.

Read the entire article here.

UnDesign

The future of good design may actually lie in intentionally doing the wrong thing. While we are drawn to the beauty of symmetry — in faces, in objects — we are also drawn to the promise of imperfection.

From Wired:

In the late 1870s, Edgar Degas began work on what would become one of his most radical paintings, Jockeys Before the Race. Degas had been schooled in techniques of the neoclassicist and romanticist masters but had begun exploring subject matter beyond the portraits and historical events that were traditionally considered suitable for fine art, training his eye on café culture, common laborers, and—most famously—ballet dancers. But with Jockeys, Degas pushed past mild provocation. He broke some of the most established formulas of composition. The painting is technically exquisite, the horses vividly sculpted with confident brushstrokes, their musculature perfectly rendered. But while composing this beautifully balanced, impressionistically rendered image, Degas added a crucial, jarring element: a pole running vertically—and asymmetrically—in the immediate foreground, right through the head of one of the horses.

Degas wasn’t just “thinking outside of the box,” as the innovation cliché would have it. He wasn’t trying to overturn convention to find a more perfect solution. He was purposely creating something that wasn’t pleasing, intentionally doing the wrong thing. Naturally viewers were horrified. Jockeys was lampooned in the magazine Punch, derided as a “mistaken impression.” But over time, Degas’ transgression provided inspiration for other artists eager to find new ways to inject vitality and dramatic tension into work mired in convention. You can see its influence across art history, from Frederic Remington’s flouting of traditional compositional technique to the crackling photojournalism of Henri Cartier-Bresson.

Degas was engaged in a strategy that has shown up periodically for centuries across every artistic and creative field. Think of it as one step in a cycle: In the early stages, practitioners dedicate themselves to inventing and improving the rules—how to craft the most pleasing chord progression, the perfectly proportioned building, the most precisely rendered amalgamation of rhyme and meter. Over time, those rules become laws, and artists and designers dedicate themselves to excelling within these agreed-upon parameters, creating work of unparalleled refinement and sophistication—the Pantheon, the Sistine Chapel, the Goldberg Variations. But once a certain maturity has been reached, someone comes along who decides to take a different route. Instead of trying to create an ever more polished and perfect artifact, this rebel actively seeks out imperfection—sticking a pole in the middle of his painting, intentionally adding grungy feedback to a guitar solo, deliberately photographing unpleasant subjects. Eventually some of these creative breakthroughs end up becoming the foundation of a new set of aesthetic rules, and the cycle begins again.


For the past 30 years, the field of technology design has been working its way through the first two stages of this cycle, an industry-wide march toward more seamless experiences, more delightful products, more leverage over the world around us. Look at our computers: beige and boxy desktop machines gave way to bright and colorful iMacs, which gave way to sleek and sexy laptops, which gave way to addictively touchable smartphones. It’s hard not to look back at this timeline and see it as a great story of human progress, a joint effort to experiment and learn and figure out the path toward a more refined and universally pleasing design.

All of this has resulted in a world where beautifully constructed tech is more powerful and more accessible than ever before. It is also more consistent. That’s why all smartphones now look basically the same—gleaming black glass with handsomely cambered edges. Google, Apple, and Microsoft all use clean, sans-serif typefaces in their respective software. After years of experimentation, we have figured out what people like and settled on some rules.

But there’s a downside to all this consensus—it can get boring. From smartphones to operating systems to web page design, it can start to feel like the truly transformational moments have come and gone, replaced by incremental updates that make our devices and interactions faster and better.

This brings us to an important and exciting moment in the design of our technologies. We have figured out the rules of creating sleek sophistication. We know, more or less, how to get it right. Now, we need a shift in perspective that allows us to move forward. We need a pole right through a horse’s head. We need to enter the third stage of this cycle. It’s time to stop figuring out how to do things the right way, and start getting it wrong.

In late 2006, when I was creative director here at WIRED, I was working on the design of a cover featuring John Hodgman. We were far along in the process—Hodgman was styled and photographed, the cover lines written, our fonts selected, the layout firmed up. I had been aiming for a timeless design with a handsome monochromatic color palette, a cover that evoked a 1960s jet-set vibe. When I presented my finished design, WIRED’s editor at the time, Chris Anderson, complained that the cover was too drab. He uttered the prescriptive phrase all graphic designers hate hearing: “Can’t you just add more colors?”

I demurred. I felt the cover was absolutely perfect. But Chris did not, and so, in a spasm of designerly “fuck you,” I drew a small rectangle into my design, a little stripe coming off from the left side of the page, rudely breaking my pristine geometries. As if that weren’t enough, I filled it with the ugliest hue I could find: neon orange— Pantone 811, to be precise. My perfect cover was now ruined!

By the time I came to my senses a couple of weeks later, it was too late. The cover had already been sent to the printer. My anger morphed into regret. To the untrained eye, that little box might not seem so offensive, but I felt that I had betrayed one of the most crucial lessons I learned in design school—that every graphic element should serve a recognizable function. This stray dash of color was careless at best, a postmodernist deviation with no real purpose or value. It confused my colleagues and detracted from the cover’s clarity, unnecessarily making the reader more conscious of the design.

But you know what? I actually came to like that crass little neon orange bar. I ended up including a version of it on the next month’s cover, and again the month after that. It added something, even though I couldn’t explain what it was. I began referring to this idea—intentionally making “bad” design choices—as Wrong Theory, and I started applying it in little ways to all of WIRED’s pages. Pictures that were supposed to run large, I made small. Where type was supposed to run around graphics, I overlapped the two. Headlines are supposed to come at the beginning of stories? I put them at the end. I would even force our designers to ruin each other’s “perfect” layouts.

At the time, this represented a major creative breakthrough for me—the idea that intentional wrongness could yield strangely pleasing results. Of course I was familiar with the idea of rule-breaking innovation—that each generation reacts against the one that came before it, starting revolutions, turning its back on tired conventions. But this was different. I wasn’t just throwing out the rulebook and starting from scratch. I was following the rules, then selectively breaking one or two for maximum impact.

Read the entire article here.

Slow Reading is Catching on Fast (Again)

Pursuing a cherished activity, uninterrupted and free of distraction, is one of life’s pleasures. Many who multi-task and brag about it have long forgotten the benefits of deep focus and immersion in a single, prolonged task. Reading can be such a process — and over the last several years researchers have found that distraction-free, thoughtful reading — slow reading — is beneficial.

So, please put down your tablet, laptop, smartphone and TV remote after you read this post, go find an unread book, shut out your daily distractions — kids, news, Facebook, boss, grocery lists, plumber — and immerse yourself in the words on a page, and nothing else. It will relieve you of stress and benefit your brain.

From WSJ:

Once a week, members of a Wellington, New Zealand, book club arrive at a cafe, grab a drink and shut off their cellphones. Then they sink into cozy chairs and read in silence for an hour.

The point of the club isn’t to talk about literature, but to get away from pinging electronic devices and read, uninterrupted. The group calls itself the Slow Reading Club, and it is at the forefront of a movement populated by frazzled book lovers who miss old-school reading.

Slow reading advocates seek a return to the focused reading habits of years gone by, before Google, smartphones and social media started fracturing our time and attention spans. Many of its advocates say they embraced the concept after realizing they couldn’t make it through a book anymore.

“I wasn’t reading fiction the way I used to,” said Meg Williams, a 31-year-old marketing manager for an annual arts festival who started the club. “I was really sad I’d lost the thing I used to really, really enjoy.”

Slow readers list numerous benefits to a regular reading habit, saying it improves their ability to concentrate, reduces stress levels and deepens their ability to think, listen and empathize. The movement echoes a resurgence in other old-fashioned, time-consuming pursuits that offset the ever-faster pace of life, such as cooking the “slow-food” way or knitting by hand.

The benefits of reading from an early age through late adulthood have been documented by researchers. A study of 300 elderly people published by the journal Neurology last year showed that regular engagement in mentally challenging activities, including reading, slowed rates of memory loss in participants’ later years.

A study published last year in Science showed that reading literary fiction helps people understand others’ mental states and beliefs, a crucial skill in building relationships. A piece of research published in Developmental Psychology in 1997 showed first-grade reading ability was closely linked to 11th grade academic achievements.

Yet reading habits have declined in recent years. In a survey this year, about 76% of Americans 18 and older said they read at least one book in the past year, down from 79% in 2011, according to the Pew Research Center.

Attempts to revive reading are cropping up in many places. Groups in Seattle, Brooklyn, Boston and Minneapolis have hosted so-called silent reading parties, with comfortable chairs, wine and classical music.

Diana La Counte of Orange County, Calif., set up what she called a virtual slow-reading group a few years ago, with members discussing the group’s book selection online, mostly on Facebook. “When I realized I read Twitter more than a book, I knew it was time for action,” she says.

Read the entire story here.

Texas and Its Textbooks: The Farce Continues

Just over a year ago I highlighted the plight of accepted scholarly fact in Texas. The state, through its infamous State Board of Education (SBOE), had just completed a lengthy effort to revise many textbooks for middle- and high-school curricula. The SBOE and its ideological supporters throughout the Texas political machine managed to insert numerous dubious claims, fictitious statements in place of agreed-upon facts and handfuls of slanted opinion into all manner of historical and social science texts. Many academics and experts in their respective fields raised alarms over the process. But the SBOE derided these “liberal elitists”, and openly flaunted its distaste for fact, preferring to distort the historical record with undertones of conservative Christianity.

Many non-Texan progressives and believers-in-fact laughingly shook their heads knowing that Texas could and should be left to its own devices. Unfortunately for the rest of the country, Texas has so much buying power that textbook publishers will often publish with Texas in mind, but distribute their books throughout the entire nation.

So now it comes as no surprise to find that many newly, or soon to be, published Texas textbooks for grades 6-12 are riddled with errors. An academic review of 43 textbooks highlights the disaster waiting to happen to young minds in Texas, and across many other states. The Texas SBOE will take a vote on which books to approve in November.

Some choice examples of the errors and half-truths below.

All of the world geography textbooks inaccurately downplay the role that conquest played in the spread of Christianity.

Discovery Education — Social Studies Techbook World Geography and Cultures

The text states: “When Europeans arrived, they brought Christianity with them and spread it among the indigenous people. Over time, Christianity became the main religion in Latin America.”

Pearson Education – Contemporary World Cultures

The text states: “Priests came to Mexico to convert Native Americans to the Roman Catholic religion. The Church became an important part of life in the new colony. Churches were built in the centers of towns and cities, and church officials became leaders in the colony.”

Houghton Mifflin Harcourt – World Geography

The text states: “The Spanish brought their language and Catholic religion, both of which dominate modern Mexico.”

Various

All but two of the world geography textbooks fail to mention the Spaniards’ forced conversions of the indigenous peoples to Christianity (e.g., the Spanish Requerimiento of 1513) and their often-systematic destruction of indigenous religious institutions. The two exceptions (Cengage Learning, Inc. – World Cultures and Geography and Houghton Mifflin Harcourt – World Geography) delay this grim news until a chapter on South America, and even there do not give it the prominence it deserves.

What’s Wrong?

The Christianization of the indigenous peoples of the Americas was most decidedly not benign. These descriptions provide a distorted picture of the spread of Christianity. An accurate account must include information about the forced conversion of native peoples and the often-systematic destruction of indigenous religious institutions and practices. (This error of omission is especially problematic when contrasted with the emphasis on conquest – often violent – to describe the spread of Islam in some textbooks.)

One world history textbook (by Worldview Software, Inc.) includes outdated – and possibly offensive – anthropological categories and racial terminology in describing African civilization.

WorldView Software – World History A: Early Civilizations to the Mid-1800s

The text states: “South of the Sahara Desert most of the people before the Age of Explorations were black Africans of the Negro race.”

 Elsewhere, the text states: “The first known inhabitants of Africa north of the Sahara in prehistory were Caucasoid Hamitic people of uncertain origin.”

What’s Wrong?

First, the term “Negro” is archaic and fraught with ulterior meaning. It should categorically not be used in a modern textbook. Further, the first passage is unforgivably misleading because it suggests that all black native Africans belong to a single “racial” group. This is typological thinking, which disappeared largely from texts after the 1940s. It harkens back to the racialization theory that all people could be classified as one of three “races”: Caucasoid, Mongoloid, or Negroid. Better to say: “…were natives of African origin.” Similarly, in the second passage, it is more accurate to simply omit reference to “Caucasoid.”

From the Washington Post:

When it comes to controversies about curriculum, textbook content and academic standards, Texas is the state that keeps on giving.

Back in 2010, we had an uproar over proposed changes to social studies standards by religious conservatives on the State Board of Education, which included a bid to recast the United States’ hideous slave trade as the “Atlantic triangular trade.” There were other doozies, too, such as one proposal to remove Thomas Jefferson from the Enlightenment curriculum and replace him with John Calvin. Some were changed, but the board’s approved standards were roundly criticized as distorted history.

There’s a new fuss about proposed social studies textbooks for Texas public schools that are based on what are called the Texas Essential Knowledge and Skills. Scholarly reviews of 43 proposed history, geography and government textbooks for Grades 6-12 — undertaken by the Education Fund of the Texas Freedom Network, a watchdog and activist group that monitors far-right issues and organizations — found extensive problems in American Government textbooks, U.S. and World History textbooks, Religion in World History textbooks, and Religion in World Geography textbooks. The state board will vote on which books to approve in November.

Ideas promoted in various proposed textbooks include the notion that Moses and Solomon inspired American democracy, that in the era of segregation only “sometimes” were schools for black children “lower in quality” and that Jews view Jesus Christ as an important prophet.

Here are the broad findings of 10 scholars, who wrote four separate reports, taken from an executive summary, followed by the names of the scholars and a list of publishers who submitted textbooks.

The findings:

  • A number of government and world history textbooks exaggerate Judeo-Christian influence on the nation’s founding and Western political tradition.
  • Two government textbooks include misleading information that undermines the Constitutional concept of the separation of church and state.
  • Several world history and world geography textbooks include biased statements that inappropriately portray Islam and Muslims negatively.
  • All of the world geography textbooks inaccurately downplay the role that conquest played in the spread of Christianity.
  • Several world geography and history textbooks suffer from an incomplete – and often inaccurate – account of religions other than Christianity.
  • Coverage of key Christian concepts and historical events is lacking in a few textbooks, often due to the assumption that all students are Christians and already familiar with Christian events and doctrine.
  • A few government and U.S. history textbooks suffer from an uncritical celebration of the free enterprise system, both by ignoring legitimate problems that exist in capitalism and failing to include coverage of government’s role in the U.S. economic system.
  • One government textbook flirts with contemporary Tea Party ideology, particularly regarding the inclusion of anti-taxation and anti-regulation arguments.
  • One world history textbook includes outdated – and possibly offensive – anthropological categories and racial terminology in describing African civilization.

Read the entire article here and check out the academic report here.

 

Theism Versus Spirituality

Prominent neo-atheist Sam Harris continues to reject theism, and does so thoughtfully and eloquently. In his latest book, Waking Up, he again argues the case against religion, but makes a powerful case for spirituality. Harris defines spirituality as an inner sense of a good and powerful reality, based on sound self-awareness and insightful questioning of one’s own consciousness. This type of spirituality, quite rightly, is devoid of theistic angels and demons. Harris reveals more in his interview with Gary Gutting, professor of philosophy at the University of Notre Dame.

From the NYT:

Sam Harris is a neuroscientist and prominent “new atheist,” who along with others like Richard Dawkins, Daniel Dennett and Christopher Hitchens helped put criticism of religion at the forefront of public debate in recent years. In two previous books, “The End of Faith” and “Letter to a Christian Nation,” Harris argued that theistic religion has no place in a world of science. In his latest book, “Waking Up,” his thought takes a new direction. While still rejecting theism, Harris nonetheless makes a case for the value of “spirituality,” which he bases on his experiences in meditation. I interviewed him recently about the book and some of the arguments he makes in it.

Gary Gutting: A common basis for atheism is naturalism — the view that only science can give a reliable account of what’s in the world. But in “Waking Up” you say that consciousness resists scientific description, which seems to imply that it’s a reality beyond the grasp of science. Have you moved away from an atheistic view?

Sam Harris: I don’t actually argue that consciousness is “a reality” beyond the grasp of science. I just think that it is conceptually irreducible — that is, I don’t think we can fully understand it in terms of unconscious information processing. Consciousness is “subjective”— not in the pejorative sense of being unscientific, biased or merely personal, but in the sense that it is intrinsically first-person, experiential and qualitative.

The only thing in this universe that suggests the reality of consciousness is consciousness itself. Many philosophers have made this argument in one way or another — Thomas Nagel, John Searle, David Chalmers. And while I don’t agree with everything they say about consciousness, I agree with them on this point.

The primary approach to understanding consciousness in neuroscience entails correlating changes in its contents with changes in the brain. But no matter how reliable these correlations become, they won’t allow us to drop the first-person side of the equation. The experiential character of consciousness is part of the very reality we are studying. Consequently, I think science needs to be extended to include a disciplined approach to introspection.

G.G.: But science aims at objective truth, which has to be verifiable: open to confirmation by other people. In what sense do you think first-person descriptions of subjective experience can be scientific?

S.H.: In a very strong sense. The only difference between claims about first-person experience and claims about the physical world is that the latter are easier for others to verify. That is an important distinction in practical terms — it’s easier to study rocks than to study moods — but it isn’t a difference that marks a boundary between science and non-science. Nothing, in principle, prevents a solitary genius on a desert island from doing groundbreaking science. Confirmation by others is not what puts the “truth” in a truth claim. And nothing prevents us from making objective claims about subjective experience.

Are you thinking about Margaret Thatcher right now? Well, now you are. Were you thinking about her exactly six minutes ago? Probably not. There are answers to questions of this kind, whether or not anyone is in a position to verify them.

And certain truths about the nature of our minds are well worth knowing. For instance, the anger you felt yesterday, or a year ago, isn’t here anymore, and if it arises in the next moment, based on your thinking about the past, it will quickly pass away when you are no longer thinking about it. This is a profoundly important truth about the mind — and it can be absolutely liberating to understand it deeply. If you do understand it deeply — that is, if you are able to pay clear attention to the arising and passing away of anger, rather than merely think about why you have every right to be angry — it becomes impossible to stay angry for more than a few moments at a time. Again, this is an objective claim about the character of subjective experience. And I invite our readers to test it in the laboratory of their own minds.

G. G.: Of course, we all have some access to what other people are thinking or feeling. But that access is through probable inference and so lacks the special authority of first-person descriptions. Suppose I told you that in fact I didn’t think of Margaret Thatcher when I read your comment, because I misread your text as referring to Becky Thatcher in “The Adventures of Tom Sawyer”? If that’s true, I have evidence for it that you can’t have. There are some features of consciousness that we will agree on. But when our first-person accounts differ, then there’s no way to resolve the disagreement by looking at one another’s evidence. That’s very different from the way things are in science.

S.H.: This difference doesn’t run very deep. People can be mistaken about the world and about the experiences of others — and they can even be mistaken about the character of their own experience. But these forms of confusion aren’t fundamentally different. Whatever we study, we are obliged to take subjective reports seriously, all the while knowing that they are sometimes false or incomplete.

For instance, consider an emotion like fear. We now have many physiological markers for fear that we consider quite reliable, from increased activity in the amygdala and spikes in blood cortisol to peripheral physiological changes like sweating palms. However, just imagine what would happen if people started showing up in the lab complaining of feeling intense fear without showing any of these signs — and they claimed to feel suddenly quite calm when their amygdalae lit up on fMRI, their cortisol spiked, and their skin conductance increased. We would no longer consider these objective measures of fear to be valid. So everything still depends on people telling us how they feel and our (usually) believing them.

However, it is true that people can be very poor judges of their inner experience. That is why I think disciplined training in a technique like “mindfulness,” apart from its personal benefits, can be scientifically important.

Read the entire story here.

An Ode to the Monopolist

Peter Thiel on why entrepreneurs should strive for monopoly and avoid competition. If only it were that simple for esoteric restaurants, innovative technology companies and all startup businesses in between.

From WSJ:

What valuable company is nobody building? This question is harder than it looks, because your company could create a lot of value without becoming very valuable itself. Creating value isn’t enough—you also need to capture some of the value you create.

This means that even very big businesses can be bad businesses. For example, U.S. airline companies serve millions of passengers and create hundreds of billions of dollars of value each year. But in 2012, when the average airfare each way was $178, the airlines made only 37 cents per passenger trip. Compare them to Google which creates less value but captures far more. Google brought in $50 billion in 2012 (versus $160 billion for the airlines), but it kept 21% of those revenues as profits—more than 100 times the airline industry’s profit margin that year. Google makes so much money that it is now worth three times more than every U.S. airline combined.
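
As a back-of-the-envelope check, the “more than 100 times” figure falls straight out of the numbers quoted above. Here is a minimal sketch in Python; the figures are simply those cited in the excerpt, and the variable names are my own.

airline_fare = 178.00             # average one-way fare, 2012 (USD)
airline_profit_per_trip = 0.37    # airline profit per passenger trip (USD)
google_margin = 0.21              # share of revenue Google kept as profit, 2012

airline_margin = airline_profit_per_trip / airline_fare        # about 0.21%
print(f"Airline margin: {airline_margin:.2%}")
print(f"Google margin:  {google_margin:.0%}")
print(f"Margin ratio:   {google_margin / airline_margin:.0f}x")  # roughly 100x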

The airlines compete with each other, but Google stands alone. Economists use two simplified models to explain the difference: perfect competition and monopoly.

“Perfect competition” is considered both the ideal and the default state in Economics 101. So-called perfectly competitive markets achieve equilibrium when producer supply meets consumer demand. Every firm in a competitive market is undifferentiated and sells the same homogeneous products. Since no firm has any market power, they must all sell at whatever price the market determines. If there is money to be made, new firms will enter the market, increase supply, drive prices down and thereby eliminate the profits that attracted them in the first place. If too many firms enter the market, they’ll suffer losses, some will fold, and prices will rise back to sustainable levels. Under perfect competition, in the long run no company makes an economic profit.

The opposite of perfect competition is monopoly. Whereas a competitive firm must sell at the market price, a monopoly owns its market, so it can set its own prices. Since it has no competition, it produces at the quantity and price combination that maximizes its profits.

To an economist, every monopoly looks the same, whether it deviously eliminates rivals, secures a license from the state or innovates its way to the top. I’m not interested in illegal bullies or government favorites: By “monopoly,” I mean the kind of company that is so good at what it does that no other firm can offer a close substitute. Google is a good example of a company that went from 0 to 1: It hasn’t competed in search since the early 2000s, when it definitively distanced itself from Microsoft and Yahoo!

Americans mythologize competition and credit it with saving us from socialist bread lines. Actually, capitalism and competition are opposites. Capitalism is premised on the accumulation of capital, but under perfect competition, all profits get competed away. The lesson for entrepreneurs is clear: If you want to create and capture lasting value, don’t build an undifferentiated commodity business.

How much of the world is actually monopolistic? How much is truly competitive? It is hard to say because our common conversation about these matters is so confused. To the outside observer, all businesses can seem reasonably alike, so it is easy to perceive only small differences between them. But the reality is much more binary than that. There is an enormous difference between perfect competition and monopoly, and most businesses are much closer to one extreme than we commonly realize.

The confusion comes from a universal bias for describing market conditions in self-serving ways: Both monopolists and competitors are incentivized to bend the truth.

Monopolists lie to protect themselves. They know that bragging about their great monopoly invites being audited, scrutinized and attacked. Since they very much want their monopoly profits to continue unmolested, they tend to do whatever they can to conceal their monopoly—usually by exaggerating the power of their (nonexistent) competition.

Think about how Google talks about its business. It certainly doesn’t claim to be a monopoly. But is it one? Well, it depends: a monopoly in what? Let’s say that Google is primarily a search engine. As of May 2014, it owns about 68% of the search market. (Its closest competitors, Microsoft and Yahoo! have about 19% and 10%, respectively.) If that doesn’t seem dominant enough, consider the fact that the word “google” is now an official entry in the Oxford English Dictionary—as a verb. Don’t hold your breath waiting for that to happen to Bing.

But suppose we say that Google is primarily an advertising company. That changes things. The U.S. search-engine advertising market is $17 billion annually. Online advertising is $37 billion annually. The entire U.S. advertising market is $150 billion. And global advertising is a $495 billion market. So even if Google completely monopolized U.S. search-engine advertising, it would own just 3.4% of the global advertising market. From this angle, Google looks like a small player in a competitive world.

What if we frame Google as a multifaceted technology company instead? This seems reasonable enough; in addition to its search engine, Google makes dozens of other software products, not to mention robotic cars, Android phones and wearable computers. But 95% of Google’s revenue comes from search advertising; its other products generated just $2.35 billion in 2012 and its consumer-tech products a mere fraction of that. Since consumer tech is a $964 billion market globally, Google owns less than 0.24% of it—a far cry from relevance, let alone monopoly. Framing itself as just another tech company allows Google to escape all sorts of unwanted attention.
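
The framing arithmetic in the last two paragraphs is easy to reproduce. Below is a quick sketch using only the market sizes quoted above; the premise that Google captures all of U.S. search-engine advertising is Thiel’s hypothetical, not an actual figure.

search_ad_take = 17e9    # hypothetical: all of U.S. search-engine advertising (USD)
markets = {
    "U.S. search-engine advertising": 17e9,
    "U.S. online advertising": 37e9,
    "U.S. advertising, all media": 150e9,
    "Global advertising": 495e9,
}
for name, size in markets.items():
    print(f"{name:32s} {search_ad_take / size:6.1%}")   # last line prints roughly 3.4%

# The consumer-tech framing in the paragraph above works the same way:
print(f"Share of global consumer tech: {2.35e9 / 964e9:.2%}")   # roughly 0.24%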

Non-monopolists tell the opposite lie: “We’re in a league of our own.” Entrepreneurs are always biased to understate the scale of competition, but that is the biggest mistake a startup can make. The fatal temptation is to describe your market extremely narrowly so that you dominate it by definition.

Read the entire article here.

The Future of History

[tube]f3nJOCfkerI[/tube]

Take an impassioned history professor and a mediocre U.S. high school history curriculum, add Bill Gates, and you get an opportunity to inject fresh perspectives and new ideas into young minds.

Not too long ago, Professor David Christian’s collection of Big History DVDs caught Gates’ attention, leading to a broad mission to overhaul the boring history lesson — one school at a time. Christian takes a thoroughly holistic approach to the subject, spanning broad and interconnected topics such as culture, biochemistry, astronomy, agriculture and physics. The sweeping narrative fundamental to his delivery reminds me somewhat of Kenneth Clark’s Civilisation and Jacob Bronowski’s The Ascent of Man, two landmark U.K. television series.

From the New York Times:

In 2008, shortly after Bill Gates stepped down from his executive role at Microsoft, he often awoke in his 66,000-square-foot home on the eastern bank of Lake Washington and walked downstairs to his private gym in a baggy T-shirt, shorts, sneakers and black socks yanked up to the midcalf. Then, during an hour on the treadmill, Gates, a self-described nerd, would pass the time by watching DVDs from the Teaching Company’s “Great Courses” series. On some mornings, he would learn about geology or meteorology; on others, it would be oceanography or U.S. history.

As Gates was working his way through the series, he stumbled upon a set of DVDs titled “Big History” — an unusual college course taught by a jovial, gesticulating professor from Australia named David Christian. Unlike the previous DVDs, “Big History” did not confine itself to any particular topic, or even to a single academic discipline. Instead, it put forward a synthesis of history, biology, chemistry, astronomy and other disparate fields, which Christian wove together into nothing less than a unifying narrative of life on earth. Standing inside a small “Mr. Rogers”-style set, flanked by an imitation ivy-covered brick wall, Christian explained to the camera that he was influenced by the Annales School, a group of early-20th-century French historians who insisted that history be explored on multiple scales of time and space. Christian had subsequently divided the history of the world into eight separate “thresholds,” beginning with the Big Bang, 13 billion years ago (Threshold 1), moving through to the origin of Homo sapiens (Threshold 6), the appearance of agriculture (Threshold 7) and, finally, the forces that gave birth to our modern world (Threshold 8).

Christian’s aim was not to offer discrete accounts of each period so much as to integrate them all into vertiginous conceptual narratives, sweeping through billions of years in the span of a single semester. A lecture on the Big Bang, for instance, offered a complete history of cosmology, starting with the ancient God-centered view of the universe and proceeding through Ptolemy’s Earth-based model, through the heliocentric versions advanced by thinkers from Copernicus to Galileo and eventually arriving at Hubble’s idea of an expanding universe. In the worldview of “Big History,” a discussion about the formation of stars cannot help including Einstein and the hydrogen bomb; a lesson on the rise of life will find its way to Jane Goodall and Dian Fossey. “I hope by the end of this course, you will also have a much better sense of the underlying unity of modern knowledge,” Christian said at the close of the first lecture. “There is a unified account.”

As Gates sweated away on his treadmill, he found himself marveling at the class’s ability to connect complex concepts. “I just loved it,” he said. “It was very clarifying for me. I thought, God, everybody should watch this thing!” At the time, the Bill & Melinda Gates Foundation had donated hundreds of millions of dollars to educational initiatives, but many of these were high-level policy projects, like the Common Core Standards Initiative, which the foundation was instrumental in pushing through. And Gates, who had recently decided to become a full-time philanthropist, seemed to pine for a project that was a little more tangible. He had been frustrated with the state of interactive coursework and classroom technology since before he dropped out of Harvard in the mid-1970s; he yearned to experiment with entirely new approaches. “I wanted to explore how you did digital things,” he told me. “That was a big issue for me in terms of where education was going — taking my previous skills and applying them to education.” Soon after getting off the treadmill, he asked an assistant to set a meeting with Christian.

A few days later, the professor, who was lecturing at San Diego State University, found himself in the lobby of a hotel, waiting to meet with the billionaire. “I was scared,” Christian recalled. “Someone took me along the corridor, knocks on a door, Bill opens it, invites me in. All I remember is that within five minutes, he had so put me at my ease. I thought, I’m a nerd, he’s a nerd and this is fun!” After a bit of small talk, Gates got down to business. He told Christian that he wanted to introduce “Big History” as a course in high schools all across America. He was prepared to fund the project personally, outside his foundation, and he wanted to be personally involved. “He actually gave me his email address and said, ‘Just think about it,’ ” Christian continued. ” ‘Email me if you think this is a good idea.’ ”

Christian emailed to say that he thought it was a pretty good idea. The two men began tinkering, adapting Christian’s college course into a high-school curriculum, with modules flexible enough to teach to freshmen and seniors alike. Gates, who insisted that the course include a strong digital component, hired a team of engineers and designers to develop a website that would serve as an electronic textbook, brimming with interactive graphics and videos. Gates was particularly insistent on the idea of digital timelines, which may have been a vestige of an earlier passion project, Microsoft Encarta, the electronic encyclopedia that was eventually overtaken by the growth of Wikipedia. Now he wanted to offer a multifaceted historical account of any given subject through a friendly user interface. The site, which is open to the public, would also feature a password-protected forum for teachers to trade notes and update and, in some cases, rewrite lesson plans based on their experiences in the classroom.

Read the entire article here.

Video: Clip from Threshold 1, The Big Bang. Courtesy of Big History Project, David Christian.

Burning Man Bucket List


As this year’s Burning Man comes to an end in the eerily beautiful Black Rock Desert in Nevada I am reminded that attending this life event should be on everyone’s bucket list, before they actually kick it.

That said, applying one or more of the Ten Principles that guide Burners should be a year-round quest — not a once-in-a-lifetime transient goal.

Read more about this year’s BM here.

See more BM visuals here.

Image: Super Pool art installation, Burning Man 2014. Courtesy of Jim Urquhart / Reuters.

 

The IBM Songbook


It would be fascinating to see a Broadway or West End show based on lyrics penned in honor of IBM and Thomas Watson, Sr., its first president. Makes you wonder if faithful employees of, say, Facebook or Apple would ever write a songbook — not in jest — for their corporate alma mater. I think not.

From ars technica:

“For thirty-seven years,” reads the opening passage in the book, “the gatherings and conventions of our IBM workers have expressed in happy songs the fine spirit of loyal cooperation and good fellowship which has promoted the signal success of our great IBM Corporation in its truly International Service for the betterment of business and benefit to mankind.”

That’s a hell of a mouthful, but it’s only the opening volley in the war on self-respect and decency that is the 1937 edition of Songs of the IBM, a booklet of corporate ditties first published in 1927 on the order of IBM company founder Thomas Watson, Sr.

The 1937 edition of the songbook is a 54-page monument to glassy-eyed corporate inhumanity, with every page overflowing with trite praise to The Company and Its Men. The booklet reads like a terrible parody of a hymnal—one that praises not the traditional Christian trinity but the new corporate triumvirate of IBM the father, Watson the son, and American entrepreneurship as the holy spirit:

Thomas Watson is our inspiration,
Head and soul of our splendid I.B.M.
We are pledged to him in every nation,
Our President and most beloved man.
His wisdom has guided each division
In service to all humanity
We have grown and broadened with his vision,
None can match him or our great company.
T. J. Watson, we all honor you,
You’re so big and so square and so true,
We will follow and serve with you forever,
All the world must know what I. B. M. can do.

—from “To Thos. J. Watson, President, I.B.M. Our Inspiration”

The wording transcends sense and sanity—these aren’t songs that normal human beings would choose to line up and sing, are they? Have people changed so much in the last 70-80 years that these songs—which seem expressly designed to debase their singers and deify their subjects—would be joyfully sung in harmony without complaint at company meetings? Were workers in the 1920s and 1930s so dehumanized by the rampaging robber barons of high industry that the only way to keep a desirable corporate job at a place like IBM was to toe the line and sing for your paycheck?

Surely no one would stand for this kind of thing in the modern world—to us, company songs seem like relics of a less-enlightened age. If anything, the mindless overflowing trite words sound like the kind of praises one would find directed at a cult of personality dictator in a decaying wreck of a country like North Korea.

Indeed, some of the songs in the book wouldn’t be out of place venerating the Juche ideal instead of IBM:

We don’t pretend we’re gay.
We always feel that way,
Because we’re filling the world with sunshine.
With I.B.M. machines,
We’ve got the finest means,
For brightly painting the clouds with sunshine.

—from “Painting the Clouds with Sunshine”


Tie an onion to your belt

All right, time to come clean: it’s incredibly easy to cherry-pick terrible examples out of a 77-year-old corporate songbook (though this songbook makes it easy because of how crazy it is to modern eyes). Moreover, to answer one of the rhetorical questions above, no—people have not changed so much over the past 80-ish years that they could sing mawkishly pro-IBM songs with an irony-free straight face. At least, not without some additional context.

There’s a decade-old writeup on NetworkWorld about the IBM corporate song phenomenon that provides a lot of the glue necessary to build a complete mental picture of what was going on in both employees’ and leadership’s heads. The key takeaway to deflate a lot of the looniness is that the majority of the songs came out of the Great Depression era, and employees lucky enough to be steadfastly employed by a company like IBM often were really that grateful.

The formal integration of singing as an aspect of IBM’s culture at the time was heavily encouraged by Thomas J. Watson Sr. Watson and his employees co-opted the era’s showtunes and popular melodies for their proto-filking, ensuring that everyone would know the way the song went, if not the exact wording. Employees belting out “To the International Ticketograph Division” to the tune of “My Bonnie Lies Over the Ocean” (“In I.B.M. There’s a division. / That’s known as the Ticketograph; / It’s peopled by men who have vision, / Progressive and hard-working staff”) really isn’t all that different from any other team-building exercise that modern companies do—in fact, in a lot of ways, it’s far less humiliating than a company picnic with Mandatory Interdepartmental Three-Legged Races.

Many of the songs mirror the kinds of things that university students of the same time period might sing in honor of their alma mater. When viewed from the perspective of the Depression and post-Depression era, the singing is still silly—but it also makes a lot more sense. Watson reportedly wanted to inspire loyalty and cohesion among employees—and, remember, this was also an era where “normal” employee behavior was to work at a single company for most of one’s professional life, and then retire with a pension. It’s certainly a lot easier to sing a company’s praises if there’s paid retirement at the end of the last verse.

Read the entire article and see more songs here.

Image: Page 99-100 of the IBM Songbook, 1937. Courtesy of IBM / ars technica.

The Idea Shower and The Strategic Staircase

Every now and then we visit the world of corporatespeak to see how business jargon is faring: which words are in, which phrases are out. Unfortunately, many of the most used and over-used phrases still find their way into common office parlance. With apologies to our state-side readers, some of the most popular British phrases follow, and, no surprise, many of these cringeworthy euphemisms seem to emanate from the U.S. Ugh!

From the Guardian:

I don’t know about you, but I’m a sucker for a bit of joined up, blue sky thinking. I love nothing more than the opportunity to touch base with my boss first thing on a Monday morning. It gives me that 24 carat feeling.

I apologise for the sarcasm, but management speak makes most people want to staple the boss’s tongue to the desk. A straw poll around my office found jargon is seen by staff as a tool for making something seem more impressive than it actually is.

The Plain English Campaign says that many staff working for big corporate organisations find themselves using management speak as a way of disguising the fact that they haven’t done their job properly. Some people think that it is easy to bluff their way through by using long, impressive-sounding words and phrases, even if they don’t know what they mean, which is telling in itself.

Furthermore, a recent survey by the Institute of Leadership & Management revealed that management speak is used in almost two thirds (64%) of offices, with nearly a quarter (23%) considering it to be a pointless irritation. “Thinking outside the box” (57%), “going forward” (55%) and “let’s touch base” (39%) were identified as the top three most overused pieces of jargon.

Walk through any office and you’ll hear this kind of thing going on every day. Here are some of the most irritating euphemisms doing the rounds:

Helicopter view – need a phrase that means broad overview of the business? Then why not say “a broad view of the business”?

Idea shower – brainstorm might be out of fashion, but surely we can thought cascade something better than this drivel.

Touch base offline – meaning let’s meet and talk. Because, contrary to popular belief, it is possible to communicate without a Wi-Fi signal. No, really, it is. Fancy a coffee?

Low hanging fruit – easy win business. This would be perfect for hungry children in orchards, but what is really happening is an admission that you don’t want to take the complicated route.

Look under the bonnet – analyse a situation. Most people wouldn’t have a clue about a car engine. When I look under a car bonnet I scratch my head, try not to look like I haven’t got a clue, jiggle a few pipes and kick the tyres before handing the job over to a qualified professional.

Get all your ducks in a row – be organised. Bert and Ernie from Sesame Street had an obsession with rubber ducks. You may think I’m disorganised, but there’s no need to talk to me like a five-year-old.

Don’t let the grass grow too long on this one – work fast. I’m looking for a polite way of suggesting that you get off your backside and get on with it.

Not enough bandwidth – too busy. Really? Try upgrading to fibre optics. I reckon I know a few people who haven’t been blessed with enough “bandwidth” and it’s got nothing to do with being busy.

Cascading relevant information – speaking to your colleagues. If anything, this is worse than touching base offline. From the flourish of cascading through to relevant, and onto information – this is complete nonsense.

The strategic staircase – business plan. Thanks, but I’ll take the lift.

Run it up the flagpole – try it out. Could you attach yourself while you’re at it?

Read the entire story here.

Sugar Is Bad For You, Really? Really!

 

In case you may not have heard, sugar is bad for you. In fact, an increasing number of food scientists will tell you that sugar is a poison, and that it’s time to fight the sugar oligarchs in much the same way that health advocates resolved to take on big tobacco many decades ago.

From the Guardian:

If you have any interest at all in diet, obesity, public health, diabetes, epidemiology, your own health or that of other people, you will probably be aware that sugar, not fat, is now considered the devil’s food. Dr Robert Lustig’s book, Fat Chance: The Hidden Truth About Sugar, Obesity and Disease, for all that it sounds like a Dan Brown novel, is the difference between vaguely knowing something is probably true, and being told it as a fact. Lustig has spent the past 16 years treating childhood obesity. His meta-analysis of the cutting-edge research on large-cohort studies of what sugar does to populations across the world, alongside his own clinical observations, has him credited with starting the war on sugar. When it reaches the enemy status of tobacco, it will be because of Lustig.

“Politicians have to come in and reset the playing field, as they have with any substance that is toxic and abused, ubiquitous and with negative consequence for society,” he says. “Alcohol, cigarettes, cocaine. We don’t have to ban any of them. We don’t have to ban sugar. But the food industry cannot be given carte blanche. They’re allowed to make money, but they’re not allowed to make money by making people sick.”

Lustig argues that sugar creates an appetite for itself by a determinable hormonal mechanism – a cycle, he says, that you could no more break with willpower than you could stop feeling thirsty through sheer strength of character. He argues that the hormone related to stress, cortisol, is partly to blame. “When cortisol floods the bloodstream, it raises blood pressure; increases the blood glucose level, which can precipitate diabetes. Human research shows that cortisol specifically increases caloric intake of ‘comfort foods’.” High cortisol levels during sleep, for instance, interfere with restfulness, and increase the hunger hormone ghrelin the next day. This differs from person to person, but I was jolted by recognition of the outrageous deliciousness of doughnuts when I haven’t slept well.

“The problem in obesity is not excess weight,” Lustig says, in the central London hotel that he has made his anti-metabolic illness HQ. “The problem with obesity is that the brain is not seeing the excess weight.” The brain can’t see it because appetite is determined by a binary system. You’re either in anorexigenesis – “I’m not hungry and I can burn energy” – or you’re in orexigenesis – “I’m hungry and I want to store energy.” The flip switch is your leptin level (the hormone that regulates your body fat) but too much insulin in your system blocks the leptin signal.

It helps here if you have ever been pregnant or remember much of puberty and that savage hunger; the way it can trick you out of your best intentions, the lure of ridiculous foods: six-month-old Christmas cake, sweets from a bin. If you’re leptin resistant – that is, if your insulin is too high as a result of your sugar intake – you’ll feel like that all the time.

Telling people to simply lose weight, he tells me, “is physiologically impossible and it’s clinically dangerous. It’s a goal that’s not achievable.” He explains further in the book: “Biochemistry drives behaviour. You see a patient who drinks 10 gallons of water a day and urinates 10 gallons of water a day. What is wrong with him? Could he have a behavioural disorder and be a psychogenic water drinker? Could be. Much more likely he has diabetes.” To extend that, you could tell people with diabetes not to drink water, and 3% of them might succeed – the outliers. But that wouldn’t help the other 97% just as losing the weight doesn’t, long-term, solve the metabolic syndrome – the addiction to sugar – of which obesity is symptomatic.

Many studies have suggested that diets tend to work for two months, some for as long as six. “That’s what the data show. And then everybody’s weight comes roaring back.” During his own time working night shifts, Lustig gained 3st, which he never lost and now uses exuberantly to make two points. The first is that weight is extremely hard to lose, and the second – more important, I think – is that he’s no diet and fitness guru himself. He doesn’t want everybody to be perfect: he’s just a guy who doesn’t want to surrender civilisation to diseases caused by industry. “I’m not a fitness guru,” he says, puckishly. “I’m 45lb overweight!”

“Sugar causes diseases: unrelated to their calories and unrelated to the attendant weight gain. It’s an independent primary-risk factor. Now, there will be food-industry people who deny it until the day they die, because their livelihood depends on it.” And here we have the reason why he sees this as a crusade and not a diet book, the reason that Lustig is in London and not Washington. This is an industry problem; the obesity epidemic began in 1980. Back then, nobody knew about leptin. And nobody knew about insulin resistance until 1984.

“What they knew was, when they took the fat out they had to put the sugar in, and when they did that, people bought more. And when they added more, people bought more, and so they kept on doing it. And that’s how we got up to current levels of consumption.” Approximately 80% of the 600,000 packaged foods you can buy in the US have added calorific sweeteners (this includes bread, burgers, things you wouldn’t add sugar to if you were making them from scratch). Daily fructose consumption has doubled in the past 30 years in the US, a pattern also observable (though not identical) here, in Canada, Malaysia, India, right across the developed and developing world. World sugar consumption has tripled in the past 50 years, while the population has only doubled; it makes sense of the obesity pandemic.

“It would have happened decades earlier; the reason it didn’t was that sugar wasn’t cheap. The thing that made it cheap was high-fructose corn syrup. They didn’t necessarily know the physiology of it, but they knew the economics of it.” Adding sugar to everyday food has become as much about the industry prolonging the shelf life as it has about palatability; if you’re shopping from corner shops, you’re likely to be eating unnecessary sugar in pretty well everything. It is difficult to remain healthy in these conditions. “You here in Britain are light years ahead of us in terms of understanding the problem. We don’t get it in the US, we have this libertarian streak. You don’t have that. You’re going to solve it first. So it’s in my best interests to help you, because that will help me solve it back there.”

The problem has mushroomed all over the world in 30 years and is driven by the profits of the food and diet industries combined. We’re not looking at a global pandemic of individual greed and fecklessness: it would be impossible for the citizens of the world to coordinate their human weaknesses with that level of accuracy. Once you stop seeing it as a problem of personal responsibility it’s easier to accept how profound and serious the war on sugar is. Life doesn’t have to become wholemeal and joyless, but traffic-light systems and five-a-day messaging are under-ambitious.

“The problem isn’t a knowledge deficit,” an obesity counsellor once told me. “There isn’t a fat person on Earth who doesn’t know vegetables are good for you.” Lustig agrees. “I, personally, don’t have a lot of hope that those things will turn things around. Education has not solved any substance of abuse. This is a substance of abuse. So you need two things, you need personal intervention and you need societal intervention. Rehab and laws, rehab and laws. Education would come in with rehab. But we need laws.”

Read the entire article here.

Image: Molecular diagrams of sucrose (left) and fructose (right). Courtesy of Wikipedia.

 

Those 25,000 Unread Emails

It may not be you. You may not be the person who has tens of thousands of unread emails scattered across various email accounts. However, you know someone just like this — buried in a virtual avalanche of unopened text, unable to extricate herself (or himself) and with no pragmatic plan to tackle the digital morass.

Washington Post writer Brigid Schulte has some ideas to help your friend (or you, of course — your secret is safe with us).

From the Washington Post:

I was drowning in e-mail. Overwhelmed. Overloaded. Spending hours a day, it seemed, roiling in an unending onslaught of info turds and falling further and further behind. The day I returned from a two-week break, I had 23,768 messages in my inbox. And 14,460 of them were unread.

I had to do something. I kept missing stuff. Forgetting stuff. Apologizing. And getting miffed and increasingly angry e-mails from friends and others who wondered why I was ignoring them. It wasn’t just vacation that put me so far behind. I’d been behind for more than a year. Vacation only made it worse. Every time I thought of my inbox, I’d start to hyperventilate.

I’d tried tackling it before: One night a few months ago, I was determined to stay at my desk until I’d powered through all the unread e-mails. At dawn, I was still powering through and nowhere near the end. And before long, the inbox was just as crammed as it had been before I lost that entire night’s sleep.

On the advice of a friend, I’d even hired a Virtual Assistant to help me with the backlog. But I had no idea how to use one. And though I’d read about people declaring e-mail bankruptcy when their inbox was overflowing — deleting everything and starting over from scratch — I was positive there were gems somewhere in that junk, and I couldn’t bear to lose them.

I knew I wasn’t alone. I’d get automatic response messages saying someone was on vacation and the only way they could relax was by telling me they’d never, ever look at my e-mail, so please send it again when they returned. My friend, Georgetown law professor Rosa Brooks, often sends out this auto response: “My inbox looks like Pompeii, post-volcano. Will respond as soon as I have time to excavate.” And another friend, whenever an e-mail is longer than one or two lines, sends a short note, “This sounds like a conversation,” and she won’t respond unless you call her.

E-mail made the late writer Nora Ephron’s list of the 22 things she won’t miss in life. Twice. In 2013, more than 182 billion e-mails were sent every day, no doubt clogging up millions of inboxes around the globe.

Bordering on despair, I sought help from four productivity gurus. And, following their advice, in two weeks of obsession-bordering-on-compulsion, my inbox was down to zero.

Here’s how.

*CREATE A SYSTEM. Julie Gray, a time coach who helps people dig out of e-mail overload all the time, said the first thing I had to change was my mind.

“This is such a pervasive problem. People think, ‘What am I doing wrong? They think they don’t have discipline or focus or that there’s some huge character flaw and they’re beating themselves up all the time. Which only makes it worse,” she said.

“So I first start changing their e-mail mindset from ‘This is an example of my failure,’ to ‘This just means I haven’t found the right system for me yet.’ It’s really all about finding your own path through the craziness.”

Do not spend another minute on e-mail, she admonished me, until you’ve begun to figure out a system. Otherwise, she said, I’d never dig out.

So we talked systems. It soon became clear that I’d created a really great e-mail system for when I was writing my book — ironically enough, on being overwhelmed — spending most of my time not at all overwhelmed in yoga pants in my home office working on my iMac. I was a follower of Randy Pausch who wrote, in “The Last Lecture,” to keep your e-mail inbox down to one page and religiously file everything once you’ve handled it. And I had for a couple years.

But now that I was traveling around the country to talk about the book, and back at work at The Washington Post, using my laptop, iPhone and iPad, that system was completely broken. I had six different e-mail accounts. And my main Verizon e-mail that I’d used for years and the Mac Mail inbox with meticulous file folders that I loved on my iMac didn’t sync across any of them.

Gray asked: “If everything just blew up today, and you had to start over, how would you set up your system?”

I wanted one inbox. One e-mail account. And I wanted the same inbox on all my devices. If I deleted an e-mail on my laptop, I wanted it deleted on my iMac. If I put an e-mail into a folder on my iMac, I wanted that same folder on my laptop.

So I decided to use Gmail, which does sync, as my main account. I set up an auto responder on my Verizon e-mail saying I was no longer using it and directing people to my Gmail account. I updated all my accounts to send to Gmail. And I spent hours on the phone with Apple one Sunday (thank you, Chazz,) to get my Gmail account set up in my beloved Mac mail inbox that would sync. Then I transferred old files and created new ones on Gmail. I had to keep my Washington Post account separate, but that wasn’t the real problem.

All systems go.

Read the entire article here.

Image courtesy of Google Search.

 

Privacy and Potato Chips


Privacy, and the lack thereof, is much in the news and on our minds. New revelations of data breaches, phone taps, corporate hackers and governmental overreach surface on a daily basis. So it is no surprise to learn that researchers have found a cheap way to eavesdrop on our conversations via a potato chip (crisp, to our British-English readers) packet. No news yet on which flavor of chip makes for the best spying!

From ars technica:

Watch enough spy thrillers, and you’ll undoubtedly see someone setting up a bit of equipment that points a laser at a distant window, letting the snoop listen to conversations on the other side of the glass. This isn’t something Hollywood made up; high-tech snooping devices of this sort do exist, and they take advantage of the extremely high-precision measurements made possible with lasers in order to measure the subtle vibrations caused by sound waves.

A team of researchers has now shown, however, that you can skip the lasers. All you really need is a consumer-level digital camera and a conveniently located bag of Doritos. A glass of water or a plant would also do.

Good vibrations

Despite the differences in the technology involved, both approaches rely on the same principle: sound travels on waves of higher and lower pressure in the air. When these waves reach a flexible object, they set off small vibrations in the object. If you can detect these vibrations, it’s possible to reconstruct the sound. Laser-based systems detect the vibrations by watching for changes in the reflections of the laser light, but researchers wondered whether you could simply observe the object directly, using the ambient light it reflects. (The team involved researchers at MIT, Adobe Research, and Microsoft Research.)

The research team started with a simple test system made from a loudspeaker playing a rising tone, a high-speed camera, and a variety of objects: water, cardboard, a candy wrapper, some metallic foil, and (as a control) a brick. Each of these (even the brick) showed some response at the lowest end of the tonal range, but the other objects, particularly the cardboard and foil, had a response into much higher tonal regions. To observe the changes in ambient light, the camera didn’t have to capture the object at high resolution—it was used at 700 x 700 pixels or less—but it did have to be high-speed, capturing as many as 20,000 frames a second.

Processing the images wasn’t simple, however. A computer had to perform a weighted average over all the pixels captured, and even a twin 3.5GHz machine with 32GB of RAM took more than two hours to process one capture. Nevertheless, the results were impressive, as the algorithm was able to detect motion on the order of a thousandth of a pixel. This enabled the system to recreate the audio waves emitted by the loudspeaker.
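
The core idea, collapsing each frame to a single number and treating that sequence as an audio-rate signal, can be sketched in a few lines of Python. To be clear, this is a simplified stand-in rather than the researchers’ actual pipeline, which detects motion on the order of a thousandth of a pixel and is far more sophisticated than a plain weighted average; the function and synthetic input below are my own illustration.

import numpy as np

def frames_to_signal(frames):
    """Collapse a stack of grayscale frames with shape (T, H, W) into a 1-D motion proxy."""
    frames = frames.astype(np.float64)
    # Weight each pixel by how much it varies over time, so static background
    # contributes little and the vibrating object dominates the average.
    weights = frames.std(axis=0)
    weights /= weights.sum() + 1e-12
    signal = (frames * weights).sum(axis=(1, 2))   # one weighted average per frame
    return signal - signal.mean()                  # drop the constant offset

# Synthetic example: 2,000 frames of a 64 x 64 crop. In the experiments described
# above, the camera ran at up to 20,000 frames per second, so the recovered signal
# would share that sample rate.
rng = np.random.default_rng(0)
frames = rng.normal(size=(2_000, 64, 64))
audio_like = frames_to_signal(frames)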

Most of the rest of the paper describing the results involved making things harder on the system, as the researchers shifted to using human voices and moving the camera outside the room. They also showed that pre-testing the vibrating object’s response to a tone scale could help them improve their processing.

But perhaps the biggest surprise came when they showed that they didn’t actually need a specialized, high-speed camera. It turns out that most consumer-grade equipment doesn’t expose its entire sensor at once and instead scans an image across the sensor grid in a line-by-line fashion. Using a consumer video camera, the researchers were able to determine that there’s a 16 microsecond delay between each line, with a five millisecond delay between frames. Using this information, they treated each line as a separate exposure and were able to reproduce sound that way.
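
A rough sketch of why the rolling-shutter trick matters, using the 16-microsecond line delay and five-millisecond frame gap quoted above. The number of scanlines per frame is my assumption, since the article does not state one; the point is simply that treating each line as its own exposure yields tens of thousands of (unevenly spaced) samples per second from an ordinary consumer camera.

line_delay = 16e-6    # seconds between successive scanlines (from the article)
frame_gap = 5e-3      # idle time between frames (from the article)
rows = 1080           # assumed scanlines per frame; not stated in the article

frame_period = rows * line_delay + frame_gap     # roughly 22 ms per frame
frame_rate = 1 / frame_period                    # roughly 45 frames per second
line_samples = rows * frame_rate                 # roughly 48,000 line-level samples per second

print(f"Nominal frame rate: {frame_rate:.1f} fps")
print(f"Line-level samples: {line_samples:,.0f} per second, with {frame_gap * 1e3:.0f} ms gaps")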

Read the entire article here.

Image courtesy of Google Search.

 

 

The Enigma of Privacy

Privacy is still a valued and valuable right; it should not be a mere benefit in a democratic society. But in our current age, privacy is becoming an increasingly threatened species. We are surrounded by social networks that share and mine our behaviors, and we are assaulted by snoopers and spooks from local and national governments.

From the Observer:

We have come to the end of privacy; our private lives, as our grandparents would have recognised them, have been winnowed away to the realm of the shameful and secret. To quote ex-tabloid hack Paul McMullan, “privacy is for paedos”. Insidiously, through small concessions that only mounted up over time, we have signed away rights and privileges that other generations fought for, undermining the very cornerstones of our personalities in the process. While outposts of civilisation fight pyrrhic battles, unplugging themselves from the web – “going dark” – the rest of us have come to accept that the majority of our social, financial and even sexual interactions take place over the internet and that someone, somewhere, whether state, press or corporation, is watching.

The past few years have brought an avalanche of news about the extent to which our communications are being monitored: WikiLeaks, the phone-hacking scandal, the Snowden files. Uproar greeted revelations about Facebook’s “emotional contagion” experiment (where it tweaked mathematical formulae driving the news feeds of 700,000 of its members in order to prompt different emotional responses). Cesar A Hidalgo of the Massachusetts Institute of Technology described the Facebook news feed as “like a sausage… Everyone eats it, even though nobody knows how it is made”.

Sitting behind the outrage was a particularly modern form of disquiet – the knowledge that we are being manipulated, surveyed, rendered and that the intelligence behind this is artificial as well as human. Everything we do on the web, from our social media interactions to our shopping on Amazon, to our Netflix selections, is driven by complex mathematical formulae that are invisible and arcane.

Most recently, campaigners’ anger has turned upon the so-called Drip (Data Retention and Investigatory Powers) bill in the UK, which will see internet and telephone companies forced to retain and store their customers’ communications (and provide access to this data to police, government and up to 600 public bodies). Every week, it seems, brings a new furore over corporations – Apple, Google, Facebook – sidling into the private sphere. Often, it’s unclear whether the companies act brazenly because our governments play so fast and loose with their citizens’ privacy (“If you have nothing to hide, you’ve nothing to fear,” William Hague famously intoned); or if governments see corporations feasting upon the private lives of their users and have taken this as a licence to snoop, pry, survey.

We, the public, have looked on, at first horrified, then cynical, then bored by the revelations, by the well-meaning but seemingly useless protests. But what is the personal and psychological impact of this loss of privacy? What legal protection is afforded to those wishing to defend themselves against intrusion? Is it too late to stem the tide now that scenes from science fiction have become part of the fabric of our everyday world?

Novels have long been the province of the great What If?, allowing us to see the ramifications from present events extending into the murky future. As long ago as 1921, Yevgeny Zamyatin imagined One State, the transparent society of his dystopian novel, We. For Orwell, Huxley, Bradbury, Atwood and many others, the loss of privacy was one of the establishing nightmares of the totalitarian future. Dave Eggers’s 2013 novel The Circle paints a portrait of an America without privacy, where a vast, internet-based, multimedia empire surveys and controls the lives of its people, relying on strict adherence to its motto: “Secrets are lies, sharing is caring, and privacy is theft.” We watch as the heroine, Mae, disintegrates under the pressure of scrutiny, finally becoming one of the faceless, obedient hordes. A contemporary (and because of this, even more chilling) account of life lived in the glare of the privacy-free internet is Nikesh Shukla’s Meatspace, which charts the existence of a lonely writer whose only escape is into the shallows of the web. “The first and last thing I do every day,” the book begins, “is see what strangers are saying about me.”

Our age has seen an almost complete conflation of the previously separate spheres of the private and the secret. A taint of shame has crept over from the secret into the private so that anything that is kept from the public gaze is perceived as suspect. This, I think, is why defecation is so often used as an example of the private sphere. Sex and shitting were the only actions that the authorities in Zamyatin’s One State permitted to take place in private, and these remain the battlegrounds of the privacy debate almost a century later. A rather prim leaked memo from a GCHQ operative monitoring Yahoo webcams notes that “a surprising number of people use webcam conversations to show intimate parts of their body to the other person”.

It is to the bathroom that Max Mosley turns when we speak about his own campaign for privacy. “The need for a private life is something that is completely subjective,” he tells me. “You either would mind somebody publishing a film of you doing your ablutions in the morning or you wouldn’t. Personally I would and I think most people would.” In 2008, Mosley’s “sick Nazi orgy”, as the News of the World glossed it, featured in photographs published first in the pages of the tabloid and then across the internet. Mosley’s defence argued, successfully, that the romp involved nothing more than a “standard S&M prison scenario” and the former president of the FIA won £60,000 damages under Article 8 of the European Convention on Human Rights. Now he has rounded on Google and the continued presence of both photographs and allegations on websites accessed via the company’s search engine. If you type “Max Mosley” into Google, the eager autocomplete presents you with “video,” “case”, “scandal” and “with prostitutes”. Half-way down the first page of the search we find a link to a professional-looking YouTube video montage of the NotW story, with no acknowledgment that the claims were later disproved. I watch it several times. I feel a bit grubby.

“The moment the Nazi element of the case fell apart,” Mosley tells me, “which it did immediately, because it was a lie, any claim for public interest also fell apart.”

Here we have a clear example of the blurred lines between secrecy and privacy. Mosley believed that what he chose to do in his private life, even if it included whips and nipple-clamps, should remain just that – private. The News of the World, on the other hand, thought it had uncovered a shameful secret that, given Mosley’s professional position, justified publication. There is a momentary tremor in Mosley’s otherwise fluid delivery as he speaks about the sense of invasion. “Your privacy or your private life belongs to you. Some of it you may choose to make available, some of it should be made available, because it’s in the public interest to make it known. The rest should be yours alone. And if anyone takes it from you, that’s theft and it’s the same as the theft of property.”

Mosley has scored some recent successes, notably in continental Europe, where he has found a culture more suspicious of Google’s sweeping powers than in Britain or, particularly, the US. Courts in France and then, interestingly, Germany, ordered Google to remove pictures of the orgy permanently, with far-reaching consequences for the company. Google is appealing against the rulings, seeing it as absurd that “providers are required to monitor even the smallest components of content they transmit or store for their users”. But Mosley last week extended his action to the UK, filing a claim in the high court in London.

Mosley’s willingness to continue fighting, even when he knows that it means keeping alive the image of his white, septuagenarian buttocks in the minds (if not on the computers) of the public, seems impressively principled. He has fallen victim to what is known as the Streisand Effect, where his very attempt to hide information about himself has led to its proliferation (in 2003 Barbra Streisand tried to stop people taking pictures of her Malibu home, ensuring photos were posted far and wide). Despite this, he continues to battle – both in court, in the media and by directly confronting the websites that continue to display the pictures. It is as if he is using that initial stab of shame, turning it against those who sought to humiliate him. It is noticeable that, having been accused of fetishising one dark period of German history, he uses another to attack Google. “I think, because of the Stasi,” he says, “the Germans can understand that there isn’t a huge difference between the state watching everything you do and Google watching everything you do. Except that, in most European countries, the state tends to be an elected body, whereas Google isn’t. There’s not a lot of difference between the actions of the government of East Germany and the actions of Google.”

All this brings us to some fundamental questions about the role of search engines. Is Google the de facto librarian of the internet, given that it is estimated to handle 40% of all traffic? Is it something more than a librarian, since its algorithms carefully (and with increasing use of your personal data) select the sites it wants you to view? To what extent can Google be held responsible for the content it puts before us?

Read the entire article here.

Ugliness Behind the Beautiful Game

Qatar hosts the World Cup in 2022. This gives the emirate another 8 years to finish construction of the various football venues, hotels and infrastructure required to support the world’s biggest single sporting event.

Perhaps it will also give the emirate some time to clean up its appalling record of worker abuse and human rights violations. Numerous laborers have died during the construction process, while others are paid minimal wages or not at all. And to top it off, most employees live in atrocious conditions, cannot move freely, nor can they change jobs or even repatriate — many come from the Indian subcontinent or East Asia. You could be forgiven for labeling these people indentured servants rather than workers.

From the Guardian:

Migrant workers who built luxury offices used by Qatar’s 2022 football World Cup organisers have told the Guardian they have not been paid for more than a year and are now working illegally from cockroach-infested lodgings.

Officials in Qatar’s Supreme Committee for Delivery and Legacy have been using offices on the 38th and 39th floors of Doha’s landmark al-Bidda skyscraper – known as the Tower of Football – which were fitted out by men from Nepal, Sri Lanka and India who say they have not been paid for up to 13 months’ work.

The project, a Guardian investigation shows, was directly commissioned by the Qatar government and the workers’ plight is set to raise fresh doubts over the autocratic emirate’s commitment to labour rights as construction starts this year on five new stadiums for the World Cup.

The offices, which cost £2.5m to fit, feature expensive etched glass, handmade Italian furniture, and even a heated executive toilet, project sources said. Yet some of the workers have not been paid, despite complaining to the Qatari authorities months ago and being owed wages as modest as £6 a day.

By the end of this year, several hundred thousand extra migrant workers from some of the world’s poorest countries are scheduled to have travelled to Qatar to build World Cup facilities and infrastructure. The acceleration in the building programme comes amid international concern over a rising death toll among migrant workers and the use of forced labour.

“We don’t know how much they are spending on the World Cup, but we just need our salary,” said one worker who had lost a year’s pay on the project. “We were working, but not getting the salary. The government, the company: just provide the money.”

The migrants are squeezed seven to a room, sleeping on thin, dirty mattresses on the floor and on bunk beds, in breach of Qatar’s own labour standards. They live in constant fear of imprisonment because they have been left without paperwork after the contractor on the project, Lee Trading and Contracting, collapsed. They say they are now being exploited on wages as low as 50p an hour.

Their case was raised with Qatar’s prime minister by Amnesty International last November, but the workers have said 13 of them remain stranded in Qatar. Despite having done nothing wrong, five have even been arrested and imprisoned by Qatari police because they did not have ID papers. Legal claims lodged against the former employer at the labour court in November have proved fruitless. They are so poor they can no longer afford the taxi to court to pursue their cases, they say.

A 35-year-old Nepalese worker and father of three said he too had lost a year’s pay: “If I had money to buy a ticket, I would go home.”

Qatar’s World Cup organising committee confirmed that it had been granted use of temporary offices on the floors fitted out by the unpaid workers. It said it was “heavily dismayed to learn of the behaviour of Lee Trading with regard to the timely payment of its workers”. The committee stressed it did not commission the firm. “We strongly disapprove and will continue to press for a speedy and fair conclusion to all cases,” it said.

Jim Murphy, the shadow international development secretary, said the revelation added to the pressure on the World Cup organising committee. “They work out of this building, but so far they can’t even deliver justice for the men who toiled at their own HQ,” he said.

Sharan Burrow, secretary general of the International Trade Union Confederation, said the workers’ treatment was criminal. “It is an appalling abuse of fundamental rights, yet there is no concern from the Qatar government unless they are found out,” she said. “In any other country you could prosecute this behaviour.”

Read the entire article here.

Image: Qatar. Courtesy of Google Maps.

Computer Generated Reality

[tube]nLtmEjqzg7M[/tube]

Computer games have come a very long way since the pioneering days of Pong and Pacman. Games are now so realistic that many are indistinguishable from the real-world characters and scenarios they emulate. It is a testament to the skill and ingenuity of hardware and software engineers and the creativity of developers who bring all the diverse underlying elements of a game together. Now, however, they have a match in the form of a computer system that is able to generate richly imagined and rendered worlds for use in the games themselves. It’s all done through algorithms.

From Technology Review:

Read the entire story here.

Video: No Man’s Sky. Courtesy of Hello Games.

 

 

Gun Love


The Second Amendment remains ever strong in the U.S. And, of course, so does the number of homicides and child deaths at the hands of guns. Sigh!

From the Guardian:

In February, a nine-year-old Arkansas boy called Hank asked his uncle if he could head off on his own from their remote camp to hunt a rabbit with his .22 calibre rifle. “I said all right,” recalled his uncle Brent later. “It wasn’t a concern. Some people are like, ‘a nine year old shouldn’t be off by himself,’ but he wasn’t an average nine year old.”

Hank was steeped in hunting: when he was two, his father, Brad, would put him in a rucksack on his back when he went turkey hunting. Brad regularly took Hank hunting and said that his son often went off hunting by himself. On this particular day, Hank and his uncle Brent had gone squirrel hunting together as his father was too sick to go.

When Hank didn’t return from hunting the rabbit, his uncle raised the alarm. His mother, Kelli, didn’t learn about his disappearance for seven hours. “They didn’t want to bother me unduly,” she says.

The following morning, though, after police, family and hundreds of locals searched around the camp, Hank’s body was found by a creek with a single bullet wound to the forehead. The cause of death was, according to the police, most likely a hunting accident.

“He slipped and the butt of the gun hit the ground and the gun fired,” says Kelli.

Kelli had recently bought the gun for Hank. “It was the first gun I had purchased for my son, just a youth .22 rifle. I never thought it would be a gun that would take his life.”

Both Kelli and Brad, from whom she is separated, believe that the gun was faulty – it shouldn’t have gone off unless the trigger was pulled, they claim. Since Hank’s death, she’s been posting warnings on her Facebook page about the gun her son used: “I wish someone else had posted warnings about it before what happened,” she says.

Had Kelli not bought the gun and had Brad not trained his son to use it, Hank would have celebrated his 10th birthday on 6 June, which his mother commemorated by posting Hank’s picture on her Facebook page with the message: “Happy Birthday Hank! Mommy loves you!”

Little Hank thus became one in a tally of what the makers of a Channel 4 documentary called Kids and Guns claim to be 3,000 American children who die each year from gun-related accidents. A recent Yale University study found that more than 7,000 US children and adolescents are hospitalised or killed by guns each year and estimates that about 20 children a day are treated in US emergency rooms following incidents involving guns.

Hank’s story is striking, certainly for British readers, for two reasons. One, it dramatises how hunting is for many Americans not the privileged pursuit it is overwhelmingly here, but a traditional family activity as much to do with foraging for food as it is a sport.

Francine Shaw, who directed Kids and Guns, says: “In rural America … people hunt to eat.”

Kelli has a fond memory of her son coming home with what he’d shot. “He’d come in and say: ‘Momma – I’ve got some squirrel to cook.’ And I’d say ‘Gee, thanks.’ That child was happy to bring home meat. He was the happiest child when he came in from shooting.”

But Hank’s story is also striking because it shows how raising kids to hunt and shoot is seen as good parenting, perhaps even as an essential part of bringing up children in America – a society rife with guns and temperamentally incapable of overturning the second amendment that confers the right to bear arms, no matter how many innocent Americans die or get maimed as a result.

“People know I was a good mother and loved him dearly,” says Kelli. “We were both really good parents and no one has said anything hateful to us. The only thing that has been said is in a news report about a nine year old being allowed to hunt alone.”

Does Kelli regret that Hank was allowed to hunt alone at that young age? “Obviously I do, because I’ve lost my son,” she tells me. But she doesn’t blame Brent for letting him go off from camp unsupervised with a gun.

“We’re sure not anti-gun here, but do I wish I could go back in time and not buy that gun? Yes I do. I know you in England don’t have guns. I wish I could go back and have my son back. I would live in England, away from the guns.”

Read the entire article here.

Infographic courtesy of Care2 via visua.ly

The Best

The United States is home to many firsts and superlatives: first in democracy, wealth, openness, innovation, and industry. The nation also takes great pride in its personal and cultural freedoms. Yet it is also home to another superlative: first in rates of incarceration. In fact, the US leads other nations by such a wide margin that serious questions continue to be asked. In the land of the free, something must be wrong.

From the Atlantic:

On Friday, the U.S. Sentencing Commission voted unanimously to allow nearly 50,000 nonviolent federal drug offenders to seek lower sentences. The commission’s decision retroactively applied an earlier change in sentencing guidelines to now cover roughly half of those serving federal drug sentences. Endorsed by both the Department of Justice and prison-reform advocates, the move is a significant—though in a global context, still modest—step forward in reversing decades of mass incarceration.

How large is America’s prison problem? More than 2.4 million people are behind bars in the United States today, either awaiting trial or serving a sentence. That’s more than the combined population of 15 states, all but three U.S. cities, and the U.S. armed forces. They’re scattered throughout a constellation of 102 federal prisons, 1,719 state prisons, 2,259 juvenile facilities, 3,283 local jails, and many more military, immigration, territorial, and Indian Country facilities.

Compared to the rest of the world, these numbers are staggering. Here’s how the United States’ incarceration rate compares with those of other modern liberal democracies like Britain and Canada:

That graph is from a recent report by Prison Policy Initiative, an invaluable resource on mass incarceration. (PPI also has a disturbing graph comparing state incarceration rates with those of other countries around the world, which I highly recommend looking at here.) “Although our level of crime is comparable to those of other stable, internally secure, industrialized nations,” the report says, “the United States has an incarceration rate far higher than any other country.”

Some individual states like Louisiana contribute disproportionately, but no state is free from mass incarceration. Disturbingly, many states’ prison populations outrank even those of dictatorships and illiberal democracies around the world. New York jails more people per capita than Rwanda, where tens of thousands await trial for their roles in the 1994 genocide. California, Illinois, and Ohio each have a higher incarceration rate than Cuba and Russia. Even Maine and Vermont imprison a greater share of people than Saudi Arabia, Venezuela, or Egypt.

But mass incarceration is more than just an international anomaly; it’s also a relatively recent phenomenon in American criminal justice. Starting in the 1970s with the rise of tough-on-crime politicians and the War on Drugs, America’s prison population jumped eightfold between 1970 and 2010.

These two metrics—the international and the historical—have to be seen together to understand how aberrant mass incarceration is. In time or in space, the warehousing of millions of Americans knows no parallels. In keeping with American history, however, it also disproportionately harms the non-white and the non-wealthy. “For a great many poor people in America, particularly poor black men, prison is a destination that braids through an ordinary life, much as high school and college do for rich white ones,” wrote Adam Gopnik in his seminal 2012 article.

Mass incarceration on a scale almost unexampled in human history is a fundamental fact of our country today—perhaps the fundamental fact, as slavery was the fundamental fact of 1850. In truth, there are more black men in the grip of the criminal-justice system—in prison, on probation, or on parole—than were in slavery then. Over all, there are now more people under “correctional supervision” in America—more than six million—than were in the Gulag Archipelago under Stalin at its height.

Mass incarceration’s effects are not confined to the cell block. Through the inescapable stigma it imposes, a brush with the criminal-justice system can hamstring a former inmate’s employment and financial opportunities for life. The effect is magnified for those who already come from disadvantaged backgrounds. Black men, for example, made substantial economic progress between 1940 and 1980 thanks to the post-war economic boom and the dismantling of de jure racial segregation. But mass incarceration has all but ground that progress to a halt: A new University of Chicago study found that black men are no better off in 2014 than they were when Congress passed the Civil Rights Act 50 years earlier.

Read the entire article here.

Isolation Fractures the Mind

Through the lens of extreme isolation, Michael Bond shows us in this fascinating article how we really are social animals. Remove a person from all meaningful social contact — even for a short while — and her mind will begin to play tricks and eventually break. Michael Bond is the author of The Power of Others.

From the BBC:

When people are isolated from human contact, their mind can do some truly bizarre things, says Michael Bond. Why does this happen?

Sarah Shourd’s mind began to slip after about two months into her incarceration. She heard phantom footsteps and flashing lights, and spent most of her day crouched on all fours, listening through a gap in the door.

That summer, the 32-year-old had been hiking with two friends in the mountains of Iraqi Kurdistan when they were arrested by Iranian troops after straying onto the border with Iran. Accused of spying, they were kept in solitary confinement in Evin prison in Tehran, each in their own tiny cell. She endured almost 10,000 hours with little human contact before she was freed. One of the most disturbing effects was the hallucinations.

“In the periphery of my vision, I began to see flashing lights, only to jerk my head around to find that nothing was there,” she wrote in the New York Times in 2011. “At one point, I heard someone screaming, and it wasn’t until I felt the hands of one of the friendlier guards on my face, trying to revive me, that I realised the screams were my own.”

We all want to be alone from time to time, to escape the demands of our colleagues or the hassle of crowds. But not alone alone. For most people, prolonged social isolation is all bad, particularly mentally. We know this not only from reports by people like Shourd who have experienced it first-hand, but also from psychological experiments on the effects of isolation and sensory deprivation, some of which had to be called off due to the extreme and bizarre reactions of those involved. Why does the mind unravel so spectacularly when we’re truly on our own, and is there any way to stop it?

We’ve known for a while that isolation is physically bad for us. Chronically lonely people have higher blood pressure, are more vulnerable to infection, and are also more likely to develop Alzheimer’s disease and dementia. Loneliness also interferes with a whole range of everyday functioning, such as sleep patterns, attention and logical and verbal reasoning. The mechanisms behind these effects are still unclear, though what is known is that social isolation unleashes an extreme immune response – a cascade of stress hormones and inflammation. This may have been appropriate in our early ancestors, when being isolated from the group carried big physical risks, but for us the outcome is mostly harmful.

Yet some of the most profound effects of loneliness are on the mind. For starters, isolation messes with our sense of time. One of the strangest effects is the ‘time-shifting’ reported by those who have spent long periods living underground without daylight. In 1961, French geologist Michel Siffre led a two-week expedition to study an underground glacier beneath the French Alps and ended up staying two months, fascinated by how the darkness affected human biology. He decided to abandon his watch and “live like an animal”. While conducting tests with his team on the surface, they discovered it took him five minutes to count to what he thought was 120 seconds.

A similar pattern of ‘slowing time’ was reported by Maurizio Montalbini, a sociologist and caving enthusiast. In 1993, Montalbini spent 366 days in an underground cavern near Pesaro in Italy that had been designed with Nasa to simulate space missions, breaking his own world record for time spent underground. When he emerged, he was convinced only 219 days had passed. His sleep-wake cycles had almost doubled in length. Since then, researchers have found that in darkness most people eventually adjust to a 48-hour cycle: 36 hours of activity followed by 12 hours of sleep. The reasons are still unclear.

As well as their time-shifts, Siffre and Montalbini reported periods of mental instability too. But these experiences were nothing compared with the extreme reactions seen in notorious sensory deprivation experiments in the mid-20th Century.

In the 1950s and 1960s, China was rumoured to be using solitary confinement to “brainwash” American prisoners captured during the Korean War, and the US and Canadian governments were all too keen to try it out. Their defence departments funded a series of research programmes that might be considered ethically dubious today.

The most extensive took place at McGill University Medical Center in Montreal, led by the psychologist Donald Hebb. The McGill researchers invited paid volunteers – mainly college students – to spend days or weeks by themselves in sound-proof cubicles, deprived of meaningful human contact. Their aim was to reduce perceptual stimulation to a minimum, to see how their subjects would behave when almost nothing was happening. They minimised what they could feel, see, hear and touch, fitting them with translucent visors, cotton gloves and cardboard cuffs extending beyond the fingertips. As Scientific American magazine reported at the time, they had them lie on U-shaped foam pillows to restrict noise, and set up a continuous hum of air-conditioning units to mask small sounds.

After only a few hours, the students became acutely restless. They started to crave stimulation, talking, singing or reciting poetry to themselves to break the monotony. Later, many of them became anxious or highly emotional. Their mental performance suffered too, struggling with arithmetic and word association tests.

But the most alarming effects were the hallucinations. They would start with points of light, lines or shapes, eventually evolving into bizarre scenes, such as squirrels marching with sacks over their shoulders or processions of eyeglasses filing down a street. They had no control over what they saw: one man saw only dogs; another, babies.

Some of them experienced sound hallucinations as well: a music box or a choir, for instance. Others imagined sensations of touch: one man had the sense he had been hit in the arm by pellets fired from guns. Another, reaching out to touch a doorknob, felt an electric shock.

When they emerged from the experiment they found it hard to shake this altered sense of reality, convinced that the whole room was in motion, or that objects were constantly changing shape and size.

Read the entire article here.

 

You Are a Neural Computation

Since the days of Aristotle, and later Descartes, thinkers have sought to explain consciousness and free will. Several thousand years on, we are still pondering the notion; science has made great strides, and yet fundamentally we still have little idea.

Many neuroscientists, now armed with new and very precise research tools, are aiming to change this. Yet, increasingly, it seems that free will may indeed be a cognitive illusion. Evidence suggests that our subconscious decides and initiates action for us long before we are aware of making a conscious decision. There seems to be no god or ghost in the machine.

From Technology Review:

It was an expedition seeking something never caught before: a single human neuron lighting up to create an urge, albeit for the minor task of moving an index finger, before the subject was even aware of feeling anything. Four years ago, Itzhak Fried, a neurosurgeon at the University of California, Los Angeles, slipped several probes, each with eight hairlike electrodes able to record from single neurons, into the brains of epilepsy patients. (The patients were undergoing surgery to diagnose the source of severe seizures and had agreed to participate in experiments during the process.) Probes in place, the patients—who were conscious—were given instructions to press a button at any time of their choosing, but also to report when they’d first felt the urge to do so.

Later, Gabriel Kreiman, a neuroscientist at Harvard Medical School and Children’s Hospital in Boston, captured the quarry. Poring over data after surgeries in 12 patients, he found telltale flashes of individual neurons in the pre-­supplementary motor area (associated with movement) and the anterior cingulate (associated with motivation and attention), preceding the reported urges by anywhere from hundreds of milliseconds to several seconds. It was a direct neural measurement of the unconscious brain at work—caught in the act of formulating a volitional, or freely willed, decision. Now Kreiman and his colleagues are planning to repeat the feat, but this time they aim to detect pre-urge signatures in real time and stop the subject from performing the action—or see if that’s even possible.

A variety of imaging studies in humans have revealed that brain activity related to decision-making tends to precede conscious action. Implants in macaques and other animals have examined brain circuits involved in perception and action. But Kreiman broke ground by directly measuring a preconscious decision in humans at the level of single neurons. To be sure, the readouts came from an average of just 20 neurons in each patient. (The human brain has about 86 billion of them, each with thousands of connections.) And ultimately, those neurons fired only in response to a chain of even earlier events. But as more such experiments peer deeper into the labyrinth of neural activity behind decisions—whether they involve moving a finger or opting to buy, eat, or kill something—science could eventually tease out the full circuitry of decision-making and perhaps point to behavioral therapies or treatments. “We need to understand the neuronal basis of voluntary decision-making—or ‘freely willed’ decision-­making—and its pathological counterparts if we want to help people such as drug, sex, food, and gambling addicts, or patients with obsessive-compulsive disorder,” says Christof Koch, chief scientist at the Allen Institute of Brain Science in Seattle (see “Cracking the Brain’s Codes”). “Many of these people perfectly well know that what they are doing is dysfunctional but feel powerless to prevent themselves from engaging in these behaviors.”

Kreiman, 42, believes his work challenges important Western philosophical ideas about free will. The Argentine-born neuroscientist, an associate professor at Harvard Medical School, specializes in visual object recognition and memory formation, which draw partly on unconscious processes. He has a thick mop of black hair and a tendency to pause and think a long moment before reframing a question and replying to it expansively. At the wheel of his Jeep as we drove down Broadway in Cambridge, Massachusetts, Kreiman leaned over to adjust the MP3 player—toggling between Vivaldi, Lady Gaga, and Bach. As he did so, his left hand, the one on the steering wheel, slipped to let the Jeep drift a bit over the double yellow lines. Kreiman’s view is that his neurons made him do it, and they also made him correct his small error an instant later; in short, all actions are the result of neural computations and nothing more. “I am interested in a basic age-old question,” he says. “Are decisions really free? I have a somewhat extreme view of this—that there is nothing really free about free will. Ultimately, there are neurons that obey the laws of physics and mathematics. It’s fine if you say ‘I decided’—that’s the language we use. But there is no god in the machine—only neurons that are firing.”

Our philosophical ideas about free will date back to Aristotle and were systematized by René Descartes, who argued that humans possess a God-given “mind,” separate from our material bodies, that endows us with the capacity to freely choose one thing rather than another. Kreiman takes this as his departure point. But he’s not arguing that we lack any control over ourselves. He doesn’t say that our decisions aren’t influenced by evolution, experiences, societal norms, sensations, and perceived consequences. “All of these external influences are fundamental to the way we decide what we do,” he says. “We do have experiences, we do learn, we can change our behavior.”

But the firing of a neuron that guides us one way or another is ultimately like the toss of a coin, Kreiman insists. “The rules that govern our decisions are similar to the rules that govern whether a coin will land one way or the other. Ultimately there is physics; it is chaotic in both cases, but at the end of the day, nobody will argue the coin ‘wanted’ to land heads or tails. There is no real volition to the coin.”

Testing Free Will

It’s only in the past three to four decades that imaging tools and probes have been able to measure what actually happens in the brain. A key research milestone was reached in the early 1980s when Benjamin Libet, a researcher in the physiology department at the University of California, San Francisco, made a remarkable study that tested the idea of conscious free will with actual data.

Libet fitted subjects with EEGs—gadgets that measure aggregate electrical brain activity through the scalp—and had them look at a clock dial that spun around every 2.8 seconds. The subjects were asked to press a button whenever they chose to do so—but told they should also take note of where the time hand was when they first felt the “wish or urge.” It turns out that the actual brain activity involved in the action began 300 milliseconds, on average, before the subject was conscious of wanting to press the button. While some scientists criticized the methods—questioning, among other things, the accuracy of the subjects’ self-reporting—the study set others thinking about how to investigate the same questions. Since then, functional magnetic resonance imaging (fMRI) has been used to map brain activity by measuring blood flow, and other studies have also measured brain activity processes that take place before decisions are made. But while fMRI transformed brain science, it was still only an indirect tool, providing very low spatial resolution and averaging data from millions of neurons. Kreiman’s own study design was the same as Libet’s, with the important addition of the direct single-neuron measurement.

When Libet was in his prime, Kreiman was a boy. As a student of physical chemistry at the University of Buenos Aires, he was interested in neurons and brains. When he went for his PhD at Caltech, his passion solidified under his advisor, Koch. Koch was deep in collaboration with Francis Crick, co-discoverer of DNA’s structure, to look for evidence of how consciousness was represented by neurons. For the star-struck kid from Argentina, “it was really life-changing,” he recalls. “Several decades ago, people said this was not a question serious scientists should be thinking about; they either had to be smoking something or have a Nobel Prize”—and Crick, of course, was a Nobelist. Crick hypothesized that studying how the brain processed visual information was one way to study consciousness (we tap unconscious processes to quickly decipher scenes and objects), and he collaborated with Koch on a number of important studies. Kreiman was inspired by the work. “I was very excited about the possibility of asking what seems to be the most fundamental aspect of cognition, consciousness, and free will in a reductionist way—in terms of neurons and circuits of neurons,” he says.

One thing was in short supply: humans willing to have scientists cut open their skulls and poke at their brains. One day in the late 1990s, Kreiman attended a journal club—a kind of book club for scientists reviewing the latest literature—and came across a paper by Fried on how to do brain science in people getting electrodes implanted in their brains to identify the source of severe epileptic seizures. Before he’d heard of Fried, “I thought examining the activity of neurons was the domain of monkeys and rats and cats, not humans,” Kreiman says. Crick introduced Koch to Fried, and soon Koch, Fried, and Kreiman were collaborating on studies that investigated human neural activity, including the experiment that made the direct neural measurement of the urge to move a finger. “This was the opening shot in a new phase of the investigation of questions of voluntary action and free will,” Koch says.

Read the entire article here.

Go Forth And Declutter

Having only recently relocated to Colorado’s wondrous Front Range of the Rocky Mountains, your friendly editor now finds himself surrounded by figurative, less inspiring mountains: moving boxes, bins, bags, and more boxes. It’s floor-to-ceiling clutter as far as the eye can see.

Some of these boxes contain essentials, yet probably around 80 percent hold stuff. Yes, just stuff — aging items that hold some kind of sentimental meaning or future promise: old CDs, baby clothes, used ticket stubs, toys from an attic three moves ago, too many socks, ill-fitting clothing, 13 allen wrenches and screwdrivers, first-grade school projects, photo negatives, fading National Geographic magazines, gummed-up fountain pens, European postcards…

So, here’s a very timely story on the psychology of clutter and hoarding.

From the WSJ:

Jennifer James and her husband don’t have a lot of clutter—but they do find it hard to part with their children’s things. The guest cottage behind their home in Oklahoma City is half-filled with old toys, outgrown clothing, artwork, school papers, two baby beds, a bassinet and a rocking horse.

“Every time I think about getting rid of it, I want to cry,” says Ms. James, a 46-year-old public-relations consultant. She fears her children, ages 6, 8 and 16, will grow up and think she didn’t love them if she doesn’t save it all. “In keeping all this stuff, I think someday I’ll be able to say to my children, ‘See—I treasured your innocence. I treasured you!’ “

Many powerful emotions are lurking amid stuff we keep. Whether it’s piles of unread newspapers, clothes that don’t fit, outdated electronics, even empty margarine tubs, the things we accumulate reflect some of our deepest thoughts and feelings.

Now there’s growing recognition among professional organizers that to come to grips with their clutter, clients need to understand why they save what they save, or things will inevitably pile up again. In some cases, therapists are working along with organizers to help clients confront their psychological demons.

“The work we do with clients goes so much beyond making their closets look pretty,” says Collette Shine, president of the New York chapter of the National Association of Professional Organizers. “It involves getting into their hearts and their heads.”

For some people—especially those with big basements—hanging onto old and unused things doesn’t present a problem. But many others say they’re drowning in clutter.

“I have clients who say they are distressed at all the clutter they have, and distressed at the thought of getting rid of things,” says Simon Rego, director of psychology training at Montefiore Medical Center in Bronx, N.Y., who makes house calls, in extreme cases, to help hoarders.

In some cases, chronic disorganization can be a symptom of Attention Deficit Hyperactivity Disorder, Obsessive-Compulsive Disorder and dementia—all of which involve difficulty with planning, focusing and making decisions.

The extreme form, hoarding, is now a distinct psychiatric disorder, defined in the new Diagnostic and Statistical Manual-5 as “persistent difficulty discarding possessions, regardless of their value” such that living areas cannot be used. Despite all the media attention, only 2% to 5% of people fit the criteria—although many more joke, or fear, they are headed that way.

Difficulty letting go of your stuff can also go hand in hand with separation anxiety, compulsive shopping, perfectionism, procrastination and body-image issues. And the reluctance to cope can create a vicious cycle of avoidance, anxiety and guilt.

In most cases, however, psychologists say that clutter can be traced to what they call cognitive errors—flawed thinking that drives dysfunctional behaviors that can get out of hand.

Among the most common clutter-generating bits of logic: “I might need these someday.” “These might be valuable.” “These might fit again if I lose (or gain) weight.”

“We all have these dysfunctional thoughts. It’s perfectly normal,” Dr. Rego says. The trick, he says, is to recognize the irrational thought that makes you cling to an item and substitute one that helps you let go, such as, “Somebody else could use this, so I’ll give it away.”

He concedes he has saved “maybe 600” disposable Allen wrenches that came with IKEA furniture over the years.

The biggest sources of clutter and the hardest to discard are things that hold sentimental meaning. Dr. Rego says it’s natural to want to hang onto objects that trigger memories, but some people confuse letting go of the object with letting go of the person.

Linda Samuels, president of the Institute for Challenging Disorganization, an education and research group, says there’s no reason to get rid of things just for the sake of doing it.

“Figure out what’s important to you and create an environment that supports that,” she says.

Robert McCollum, a state tax auditor and Ms. James’s husband, says he treasures items like the broken fairy wand one daughter carried around for months.

“I don’t want to lose my memories, and I don’t need a professional organizer,” he says. “I’ve already organized it all in bins.” The only problem would be if they ever move to a place that doesn’t have 1,000 square feet of storage, he adds.

Sometimes the memories people cling to are images of themselves in different roles or happier times. “Our closets are windows into our internal selves,” says Jennifer Baumgartner, a Baltimore psychologist and author of “You Are What You Wear.”

“Say you’re holding on to your team uniforms from college,” she says. “Ask yourself, what about that experience did you like? What can you do in your life now to recapture that?”

Somebody-might-need-this thinking is often what drives people to save stacks of newspapers, magazines, outdated electronic equipment, decades of financial records and craft supplies. With a little imagination, anything could be fodder for scrapbooks or Halloween costumes.

For people afraid to toss things they might want in the future, Dr. Baumgartner says it helps to have a worst-case scenario plan. “What if you do need that tutu you’ve given away for a Halloween costume? What would you do? You can find almost anything on eBay.”

Read the entire story here.

Image courtesy of Google search.

Iran, Women, Clothes

A fascinating essay by Haleh Anvari, Iranian writer and artist, provides an insightful view of the role that fashion takes in shaping many of our perceptions — some right, many wrong — of women.

Quite rightly she argues that the measures our culture places on women, through the lens of Western fashion or Muslim tradition, are misleading. In both cases, there remains a fundamental need to address and to continue to address women’s rights versus those of men. Fashion stereotypes may be vastly different across continents, but the underlying issues remain very much the same whether a woman wears a hijab on the street or lingerie on a catwalk.

From the NYT:

I took a series of photographs of myself in 2007 that show me sitting on the toilet, weighing myself, and shaving my legs in the bath. I shot them as an angry response to an encounter with a gallery owner in London’s artsy Brick Lane. I had offered him photos of colorful chadors — an attempt to question the black chador as the icon of Iran by showing the world that Iranian women were more than this piece of black cloth. The gallery owner wasn’t impressed. “Do you have any photos of Iranian women in their private moments?” he asked.

As an Iranian with a reinforced sense of the private-public divide we navigate daily in our country, I found his curiosity offensive. So I shot my “Private Moments” in a sardonic spirit, to show that Iranian women are like all women around the world if you get past the visual hurdle of the hijab. But I never shared those, not just because I would never get a permit to show them publicly in Iran, but also because I am prepared to go only so far to prove a point. Call me old-fashioned.

Ever since the hijab, a generic term for every Islamic modesty covering, became mandatory after the 1979 revolution, Iranian women have been used to represent the country visually. For the new Islamic republic, the all-covering cloak called a chador became a badge of honor, a trademark of fundamental change. To Western visitors, it dropped a pin on their travel maps, where the bodies of Iranian women became a stand-in for the character of Iranian society. When I worked with foreign journalists for six years, I helped produce reports that were illustrated invariably with a woman in a black chador. I once asked a photojournalist why. He said, “How else can we show where we are?”

How wonderful. We had become Iran’s Eiffel Tower or Big Ben.

Next came the manteau-and-head scarf combo — less traditional, and more relaxed, but keeping the lens on the women. Serious reports about elections used a “hair poking out of scarf” standard as an exit poll, or images of scarf-clad women lounging in coffee shops, to register change. One London newspaper illustrated a report on the rise of gasoline prices with a woman in a head scarf, photographed in a gas station, holding a pump nozzle with gasoline suggestively dripping from its tip. A visitor from Mars or a senior editor from New York might have been forgiven for imagining Iran as a strange land devoid of men, where fundamentalist chador-clad harridans vie for space with heathen babes guzzling cappuccinos. (Incidentally, women hardly ever step out of the car to pump gas here; attendants do it for us.)

The disputed 2009 elections, followed by demonstrations and a violent backlash, brought a brief respite. The foreign press was ejected, leaving the reporting to citizen journalists not bound by the West’s conventions. They depicted a politically mature citizenry, male and female, demanding civic acknowledgment together.

We are now witnessing another shift in Iran’s image. It shows Iran “unveiled” — a tired euphemism now being used to literally undress Iranian women or show them off as clotheshorses. An Iranian fashion designer in Paris receives more plaudits in the Western media for his blog’s street snapshots of stylish, affluent young women in North Tehran than he gets for his own designs. In this very publication, a male Iranian photographer depicted Iranian women through flimsy fabrics under the title “Veiled Truths”; one is shown in a one-piece pink swimsuit so minimal it could pass for underwear; others are made more sensual behind sheer “veils,” reinforcing a sense of peeking at them. Search the Internet and you can get an eyeful of nubile limbs in opposition to the country’s official image, shot by Iranian photographers of both sexes, keen to show the hidden, supposedly true, other side of Iran.

Young Iranians rightly desire to show the world the unseen sides of their lives. But their need to show themselves as like their peers in the West takes them into dangerous territory. Professional photographers and artists, encouraged by Western curators and seeking fast-track careers, are creating a new wave of homegrown neo-Orientalism. A favorite reworking of an old cliché is the thin, beautiful young woman reclining while smoking a hookah, dancing, or otherwise at leisure in her private spaces. Ingres could sue for plagiarism.

In a country where the word feminism is pejorative, there is no inkling that the values of both fundamentalism and Western consumerism are two sides of the same coin — the female body as an icon defining Iranian culture.

It is true that we Iranians live dual lives, and so it is true that to see us in focus, you must enter our inner sanctum. But the inner sanctum includes women who believe in the hijab, fat women, old women and, most important, women in professions from doctor to shopkeeper. It also includes men, not all of whom are below 30 years of age. If you wish to see Iran as it is, you need go no further than Facebook and Instagram. Here, Iran is neither fully veiled nor longing to undress itself. Its complex variety is shown through the lens of its own people, in both private and public spaces.

Read the entire essay here.

Image: Young woman from Naplouse in a hijab, c1867-1885. Courtesy of Wikipedia.

Dinosaurs of Retail

Shopping malls in the United States were in their prime in the 1970s and ’80s. Many had positioned themselves as a bright, clean, utopian alternative to inner-city blight and decay. A quarter of a century on, while the mega-malls may be thriving, their numerous smaller suburban brethren are seeing lower sales. As internet shopping and retailing pervade all reaches of our society, many midsize malls are decaying or shutting down completely. Documentary photographer Seph Lawless captures this fascinating transition in a new book: Black Friday: The Collapse of the American Shopping Mall.

From the Guardian:

It is hard to believe there has ever been any life in this place. Shattered glass crunches under Seph Lawless’s feet as he strides through its dreary corridors. Overhead lights attached to ripped-out electrical wires hang suspended in the stale air and fading wallpaper peels off the walls like dead skin.

Lawless sidesteps debris as he passes from plot to plot in this retail graveyard called Rolling Acres Mall in Akron, Ohio. The shopping centre closed in 2008, and its largest retailers, which had tried to make it as standalone stores, emptied out by the end of last year. When Lawless stops to overlook a two-storey opening near the mall’s once-bustling core, only an occasional drop of water, dribbling through missing ceiling tiles, breaks the silence.

“You came, you shopped, you dressed nice – you went to the mall. That’s what people did,” says Lawless, a pseudonymous photographer who grew up in a suburb of nearby Cleveland. “It was very consumer-driven and kind of had an ugly side, but there was something beautiful about it. There was something there.”

Gazing down at the motionless escalators, dead plants and empty benches below, he adds: “It’s still beautiful, though. It’s almost like ancient ruins.”

Dying shopping malls are speckled across the United States, often in middle-class suburbs wrestling with socioeconomic shifts. Some, like Rolling Acres, have already succumbed. Estimates on the share that might close or be repurposed in coming decades range from 15 to 50%. Americans are returning downtown; online shopping is taking a 6% bite out of brick-and-mortar sales; and to many iPhone-clutching, city-dwelling and frequently jobless young people, the culture that spawned satire like Mallrats seems increasingly dated, even cartoonish.

According to longtime retail consultant Howard Davidowitz, numerous midmarket malls, many of them born during the country’s suburban explosion after the second world war, could very well share Rolling Acres’ fate. “They’re going, going, gone,” Davidowitz says. “They’re trying to change; they’re trying to get different kinds of anchors, discount stores … [But] what’s going on is the customers don’t have the fucking money. That’s it. This isn’t rocket science.”

Shopping culture follows housing culture. Sprawling malls were therefore a natural product of the postwar era, as Americans with cars and fat wallets sprawled to the suburbs. They were thrown up at a furious pace as shoppers fled cities, peaking at a few hundred per year at one point in the 1980s, according to Paco Underhill, an environmental psychologist and author of Call of the Mall: The Geography of Shopping. Though construction has since tapered off, developers left a mall overstock in their wake.

Currently, the US contains around 1,500 of the expansive “malls” of suburban consumer lore. Most share a handful of bland features. Brick exoskeletons usually contain two storeys of inward-facing stores separated by tile walkways. Food courts serve mediocre pizza. Parking lots are big enough to easily misplace a car. And to anchor them economically, malls typically depend on department stores: huge vendors offering a variety of products across interconnected sections.

For mid-century Americans, these gleaming marketplaces provided an almost utopian alternative to the urban commercial district, an artificial downtown with less crime and fewer vermin. As Joan Didion wrote in 1979, malls became “cities in which no one lives but everyone consumes”. Peppered throughout disconnected suburbs, they were a place to see and be seen, something shoppers have craved since the days of the Greek agora. And they quickly matured into a self-contained ecosystem, with their own species – mall rats, mall cops, mall walkers – and an annual feeding frenzy known as Black Friday.

“Local governments had never dealt with this sort of development and were basically bamboozled [by developers],” Underhill says of the mall planning process. “In contrast to Europe, where shopping malls are much more a product of public-private negotiation and funding, here in the US most were built under what I call ‘cowboy conditions’.”

Shopping centres in Europe might contain grocery stores or childcare centres, while those in Japan are often built around mass transit. But the suburban American variety is hard to get to and sells “apparel and gifts and damn little else”, Underhill says.

Nearly 700 shopping centres are “super-regional” megamalls, retail leviathans usually of at least 1 million square feet and upward of 80 stores. Megamalls typically outperform their 800 slightly smaller, “regional” counterparts, though size and financial health don’t overlap entirely. It’s clearer, however, that luxury malls in affluent areas are increasingly forcing the others to fight for scraps. Strip malls – up to a few dozen tenants conveniently lined along a major traffic artery – are retail’s bottom feeders and so well-suited to the new environment. But midmarket shopping centres have begun dying off alongside the middle class that once supported them. Regional malls have suffered at least three straight years of declining profit per square foot, according to the International Council of Shopping Centres (ICSC).

Read the entire story here.

Image: Mall of America. Courtesy of Wikipedia.

Your Tax Dollars At Work — Leetspeak

It’s fascinating to see what our government agencies are doing with some of our hard-earned tax dollars.

In this head-scratching example, the FBI — the FBI’s Intelligence Research Support Unit, no less — has just completed an 83-page glossary of Internet slang or “leetspeak”. LOL and Ugh! (the latter is not an acronym).

Check out the document via Muckrock here — they obtained the “secret” document through the Freedom of Information Act.

From the Washington Post:

The Internet is full of strange and bewildering neologisms, which anyone but a text-addled teen would struggle to understand. So the fine, taxpayer-funded people of the FBI — apparently not content to trawl Urban Dictionary, like the rest of us — compiled a glossary of Internet slang.

An 83-page glossary. Containing nearly 3,000 terms.

The glossary was recently made public through a Freedom of Information request by the group MuckRock, which posted the PDF, called “Twitter shorthand,” online. Despite its name, this isn’t just Twitter slang: As the FBI’s Intelligence Research Support Unit explains in the introduction, it’s a primer on shorthand used across the Internet, including in “instant messages, Facebook and Myspace.” As if that Myspace reference wasn’t proof enough that the FBI’s a tad out of touch, the IRSU then promises the list will prove useful both professionally and “for keeping up with your children and/or grandchildren.” (Your tax dollars at work!)

All of these minor gaffes could be forgiven, however, if the glossary itself was actually good. Obviously, FBI operatives and researchers need to understand Internet slang — the Internet is, increasingly, where crime goes down these days. But then we get things like ALOTBSOL (“always look on the bright side of life”) and AMOG (“alpha male of group”) … within the first 10 entries.

ALOTBSOL has, for the record, been tweeted fewer than 500 times in the entire eight-year history of Twitter. AMOG has been tweeted far more often, but usually in Spanish … as a misspelling, it would appear, of “amor” and “amigo.”

Among the other head-scratching terms the FBI considers can’t-miss Internet slang:

  1. AYFKMWTS (“are you f—— kidding me with this s—?”) — 990 tweets
  2. BFFLTDDUP (“best friends for life until death do us part”) — 414 tweets
  3. BOGSAT (“bunch of guys sitting around talking”) — 144 tweets
  4. BTDTGTTSAWIO (“been there, done that, got the T-shirt and wore it out”) — 47 tweets
  5. BTWITIAILWY (“by the way, I think I am in love with you”) — 535 tweets
  6. DILLIGAD (“does it look like I give a damn?”) — 289 tweets
  7. DITYID (“did I tell you I’m depressed?”) — 69 tweets
  8. E2EG (“ear-to-ear grin”) — 125 tweets
  9. GIWIST (“gee, I wish I said that”) — 56 tweets
  10. HCDAJFU (“he could do a job for us”) — 25 tweets
  11. IAWTCSM (“I agree with this comment so much”) — 20 tweets
  12. IITYWIMWYBMAD (“if I tell you what it means will you buy me a drink?”) — 250 tweets
  13. LLTA (“lots and lots of thunderous applause”) — 855 tweets
  14. NIFOC (“naked in front of computer”) — 1,065 tweets, most of them referring to acronym guides like this one.
  15. PMYMHMMFSWGAD (“pardon me, you must have mistaken me for someone who gives a damn”) — 128 tweets
  16. SOMSW (“someone over my shoulder watching”) — 170 tweets
  17. WAPCE (“women are pure concentrated evil”) — 233 tweets, few relating to women
  18. YKWRGMG (“you know what really grinds my gears?”) — 1,204 tweets

In all fairness to the FBI, they do get some things right: “crunk” is helpfully defined as “crazy and drunk,” FF is “a recommendation to follow someone referenced in the tweet,” and a whole range of online patois is translated to its proper English equivalent: hafta is “have to,” ima is “I’m going to,” kewt is “cute.”

Read the entire article here.

Image: FBI Seal. Courtesy of U.S. Government.