Free Will: An Illusion?

Neuroscientists continue to find interesting experimental evidence that we do not have free will. Many philosophers continue to dispute this notion, citing inconclusive results and brain scientists’ lack of a holistic understanding of decision-making. An article by Kerri Smith over at Nature lays open this contentious and fascinating debate.

[div class=attrib]From Nature:[end-div]

The experiment helped to change John-Dylan Haynes’s outlook on life. In 2007, Haynes, a neuroscientist at the Bernstein Center for Computational Neuroscience in Berlin, put people into a brain scanner in which a display screen flashed a succession of random letters. He told them to press a button with either their right or left index fingers whenever they felt the urge, and to remember the letter that was showing on the screen when they made the decision. The experiment used functional magnetic resonance imaging (fMRI) to reveal brain activity in real time as the volunteers chose to use their right or left hands. The results were quite a surprise.

“The first thought we had was ‘we have to check if this is real’,” says Haynes. “We came up with more sanity checks than I’ve ever seen in any other study before.”

The conscious decision to push the button was made about a second before the actual act, but the team discovered that a pattern of brain activity seemed to predict that decision by as many as seven seconds. Long before the subjects were even aware of making a choice, it seems, their brains had already decided.

As humans, we like to think that our decisions are under our conscious control — that we have free will. Philosophers have debated that concept for centuries, and now Haynes and other experimental neuroscientists are raising a new challenge. They argue that consciousness of a decision may be a mere biochemical afterthought, with no influence whatsoever on a person’s actions. According to this logic, they say, free will is an illusion. “We feel we choose, but we don’t,” says Patrick Haggard, a neuroscientist at University College London.

You may have thought you decided whether to have tea or coffee this morning, for example, but the decision may have been made long before you were aware of it. For Haynes, this is unsettling. “I’ll be very honest, I find it very difficult to deal with this,” he says. “How can I call a will ‘mine’ if I don’t even know when it occurred and what it has decided to do?”

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Nature.[end-div]

Book Review: Are You Serious? Lee Siegel

“You cannot be serious”, goes the oft-quoted opening to a John McEnroe javelin thrown at an unsuspecting tennis umpire. This leads us to an earnest review of what it means to be serious from Lee Siegel’s new book, “Are You Serious?” As Michael Agger points out for Slate:

We don’t know what to take seriously anymore. Is Brian Williams a serious news anchor or is he playing at being serious? How about Jon Stewart? The New York Times exudes seriousness, but the satire of The Onion can also be very serious.

Do we indeed need a how-to manual on exuding the required seriousness in the correct circumstances? Do we need a third-party narrator to tell us when to expect seriousness or irony or serious irony? Perhaps Lee Siegel’s book can shed some light.

[div class=attrib]More from Slate’s review of Siegel’s book:[end-div]

Siegel’s business-casual jaunt through seriosity begins with the Victorian poet Matthew Arnold, who saw the decline of religious spirit and proposed the “high seriousness” of poetry and literature in its place. “Seriousness implied a trustworthy personality,” Siegel writes, “just as faith in God once implied a trustworthy soul.” The way in which Arnold connected morality to cultural refinement soothed the intellectual insecurity of Americans vis-à-vis Europe and appealed to our ethos of self-improvement. The contemporary disciples of Arnold are those friends of yours who read Ulysses along with Ulysses Annotated, actually go to art galleries, and know their way around a Ring cycle. The way they enjoy culture expresses their seriousness of purpose.

I’ve only pulled at a few of the provocative strings in Siegel’s book. His argument that Sarah Palin is someone who has signaled seriousness by being willing to humiliate herself on reality TV makes a wild sort of sense. At other times, Siegel floats some nonsense that he knows to be silly.

But I don’t want to leave you hanging without providing Siegel’s answer to the question of finding seriousness in life. He gives us his “three pillars”: attention, purpose, continuity. That could mean being a really competent lawyer. Or being so skilled at being a pilot that you land a plane on the Hudson and save everyone onboard. Or being like Socrates and drinking the hemlock to prove that you believed in your ideas. Just find the thing that makes you “fully alive” and then you’re set. Which is to say that although the cultural and political figures we should take seriously change, the prospect of becoming a serious person remains dauntingly unchanged.

[div class=attrib]More from theSource here.[end-div]

The Lanier Effect

Twenty or so years ago the economic prognosticators and technology pundits would all have had us believe that the internet would transform society; it would level the playing field; it would help the little guy compete against the corporate behemoth; it would make us all “socially” rich if not financially. Yet, the promise of those early, heady days seems remarkably narrow nowadays. What happened? Or rather, what didn’t happen?

We excerpt a lengthy interview with Jaron Lanier over at the Edge. Lanier, a pioneer in the sphere of virtual reality, offers some well-laid arguments for and against the concentration of market power enabled by information systems and the internet. He reserves his most powerful criticism for Google, whose (in)famous corporate mantra, “don’t be evil”, starts to look remarkably disingenuous.

[div class=attrib]From the Edge:[end-div]

I’ve focused quite a lot on how this stealthy component of computation can affect our sense of ourselves, what it is to be a person. But lately I’ve been thinking a lot about what it means to economics.

In particular, I’m interested in a pretty simple problem, but one that is devastating. In recent years, many of us have worked very hard to make the Internet grow, to become available to people, and that’s happened. It’s one of the great topics of mankind of this era.  Everyone’s into Internet things, and yet we have this huge global economic trouble. If you had talked to anyone involved in it twenty years ago, everyone would have said that the ability for people to inexpensively have access to a tremendous global computation and networking facility ought to create wealth. This ought to create wellbeing; this ought to create this incredible expansion in just people living decently, and in personal liberty. And indeed, some of that’s happened. Yet if you look at the big picture, it obviously isn’t happening enough, if it’s happening at all.

The situation reminds me a little bit of something that is deeply connected, which is the way that computer networks transformed finance. You have more and more complex financial instruments, derivatives and so forth, and high frequency trading, all these extraordinary constructions that would be inconceivable without computation and networking technology.

At the start, the idea was, “Well, this is all in the service of the greater good because we’ll manage risk so much better, and we’ll increase the intelligence with which we collectively make decisions.” Yet if you look at what happened, risk was increased instead of decreased.

… We were doing a great job through the turn of the century. In the ’80s and ’90s, one of the things I liked about being in the Silicon Valley community was that we were growing the middle class. The personal computer revolution could have easily been mostly about enterprises. It could have been about just fighting IBM and getting computers on desks in big corporations or something, instead of this notion of the consumer, ordinary person having access to a computer, of a little mom and pop shop having a computer, and owning their own information. When you own information, you have power. Information is power. The personal computer gave people their own information, and it enabled a lot of lives.

… But at any rate, the Apple idea is that instead of the personal computer model where people own their own information, and everybody can be a creator as well as a consumer, we’re moving towards this iPad, iPhone model where it’s not as adequate for media creation as the real media creation tools, and even though you can become a seller over the network, you have to pass through Apple’s gate to accept what you do, and your chances of doing well are very small, and it’s not a person to person thing, it’s a business through a hub, through Apple to others, and it doesn’t create a middle class, it creates a new kind of upper class.

Google has done something that might even be more destructive of the middle class, which is they’ve said, “Well, since Moore’s law makes computation really cheap, let’s just give away the computation, but keep the data.” And that’s a disaster.

What’s happened now is that we’ve created this new regimen where the bigger your computer servers are, the more smart mathematicians you have working for you, and the more connected you are, the more powerful and rich you are. (Unless you own an oil field, which is the old way.) I benefit from it because I’m close to the big servers, but basically wealth is measured by how close you are to one of the big servers, and the servers have started to act like private spying agencies, essentially.

With Google, or with Facebook, if they can ever figure out how to steal some of Google’s business, there’s this notion that you get all of this stuff for free, except somebody else owns the data, and they use the data to sell access to you, and the ability to manipulate you, to third parties that you don’t necessarily get to know about. The third parties tend to be kind of tawdry.

[div class=attrib]Read the entire article.[end-div]

[div class=attrib]Image courtesy of Jaron Lanier.[end-div]

Reading Between the Lines

In his book, “The Secret Life of Pronouns”, professor of psychology James Pennebaker describes how our use of words like “I”, “we”, “he”, “she” and “who” reveals a wealth of detail about ourselves including, and very surprisingly, our health and social status.

[div class=attrib]Excerpts from James Pennebaker’s interview with Scientific American:[end-div]

In the 1980s, my students and I discovered that if people were asked to write about emotional upheavals, their physical health improved. Apparently, putting emotional experiences into language changed the ways people thought about their upheavals. In an attempt to better understand the power of writing, we developed a computerized text analysis program to determine how language use might predict later health improvements.

Much to my surprise, I soon discovered that the ways people used pronouns in their essays predicted whose health would improve the most. Specifically, those people who benefited the most from writing changed in their pronoun use from one essay to another. Pronouns were reflecting people’s abilities to change perspective.

As I pondered these findings, I started looking at how people used pronouns in other texts — blogs, emails, speeches, class writing assignments, and natural conversation. Remarkably, how people used pronouns was correlated with almost everything I studied. For example, use of  first-person singular pronouns (I, me, my) was consistently related to gender, age, social class, honesty, status, personality, and much more.

… In my own work, we have analyzed the collected works of poets, playwrights, and novelists going back to the 1500s to see how their writing changed as they got older. We’ve compared the pronoun use of suicidal versus non-suicidal poets. Basically, poets who eventually commit suicide use I-words more than non-suicidal poets.

The analysis of language style can also serve as a psychological window into authors and their relationships. We have analyzed the poetry of Elizabeth Barrett and Robert Browning and compared it with the history of their marriage. Same thing with Ted Hughes and Sylvia Plath. Using a method we call Language Style Matching, we can isolate changes in the couples’ relationships.

… One of the most interesting results was part of a study my students and I conducted dealing with status in email correspondence. Basically, we discovered that in any interaction, the person with the higher status uses I-words less (yes, less) than people who are low in status. The effects were quite robust and, naturally, I wanted to test this on myself. I always assumed that I was a warm, egalitarian kind of guy who treated people pretty much the same.

I was the same as everyone else. When undergraduates wrote me, their emails were littered with I, me, and my. My response, although quite friendly, was remarkably detached — hardly an I-word graced the page.
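
The kind of word counting Pennebaker describes is simple to sketch. Here is a minimal, illustrative example in Python (our own sketch, not Pennebaker’s actual analysis program, which tracks many more word categories) of computing the rate of first-person singular pronouns in a text:

import re

# Illustrative sketch only: tally first-person singular pronouns
# ("I-words") as a fraction of all words in a text.
I_WORDS = {"i", "me", "my", "mine", "myself"}

def i_word_rate(text):
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return sum(word in I_WORDS for word in words) / len(words)

# Example: three of the six words below are I-words.
print(i_word_rate("I think my results surprised me."))  # prints 0.5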

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Images courtesy of University of Texas at Austin.[end-div]

Once Not So Crazy Ideas About Our Sun

Some wacky ideas about our sun from not so long ago help us realize the importance of a healthy dose of skepticism combined with good science. In fact, as you’ll see from the timestamp on the image from NASA’s Solar and Heliospheric Observatory (SOHO), science can now bring us – the public – near real-time images of our nearest star.

[div class=attrib]From Slate:[end-div]

The sun is hell.

The 18th-century English clergyman Tobias Swinden argued that hell couldn’t lie below Earth’s surface: The fires would soon go out, he reasoned, due to lack of air. Not to mention that the Earth’s interior would be too small to accommodate all the damned, especially after making allowances for future generations of the damned-to-be. Instead, wrote Swinden, it’s obvious that hell stares us in the face every day: It’s the sun.

The sun is made of ice.

In 1798, Charles Palmer—who was not an astronomer, but an accountant—argued that the sun can’t be a source of heat, since Genesis says that light already existed before the day that God created the sun. Therefore, he reasoned, the sun must merely focus light upon Earth—light that exists elsewhere in the universe. Isn’t the sun even shaped like a giant lens? The only natural, transparent substance that it could be made of, Palmer figured, is ice. Palmer’s theory was published in a widely read treatise that, its title crowed, “overturn[ed] all the received systems of the universe hitherto extant, proving the celebrated and indefatigable Sir Isaac Newton, in his theory of the solar system, to be as far distant from the truth, as any of the heathen authors of Greece or Rome.”

Earth is a sunspot.

Sunspots are magnetic regions on the sun’s surface. But in 1775, mathematician and theologian J. Wiedeberg said that the sun’s spots are created by the clumping together of countless solid “heat particles,” which he speculated were constantly being emitted by the sun. Sometimes, he theorized, these heat particles stick together even at vast distances from the sun—and this is how planets form. In other words, he believed that Earth is a sunspot.

The sun’s surface is liquid.

Throughout the 18th and 19th centuries, textbooks and astronomers were torn between two competing ideas about the sun’s nature. Some believed that its dazzling brightness was caused by luminous clouds and that small holes in the clouds, which revealed the cool, dark solar surface below, were the sunspots. But the majority view was that the sun’s body was a hot, glowing liquid, and that the sunspots were solar mountains sticking up through this lava-like substance.

The sun is inhabited.

No less a distinguished astronomer than William Herschel, who discovered the planet Uranus in 1781, often stated that the sun has a cool, solid surface on which human-like creatures live and play. According to him, these solar citizens are shielded from the heat given off by the sun’s “dazzling outer clouds” by an inner protective cloud layer—like a layer of haz-mat material—that perfectly blocks the solar emissions and allows for pleasant grassy solar meadows and idyllic lakes.

Language and Gender

As any Italian speaker would attest, the moon, of course, is utterly feminine. It is “la luna”. To a German, however, it is “der Mond”, and very masculine.

Numerous languages assign a grammatical gender to objects, which in turn influences how people see these objects as either female or male. Yet, researchers have found that sex tends to be ascribed to objects and concepts even in gender-neutral languages. Scientific American reviews this current research.

[div class=attrib]From Scientific American:[end-div]

Gender is so fundamental to the way we understand the world that people are prone to assign a sex to even inanimate objects. We all know someone, or perhaps we are that person, who consistently refers to their computer or car with a gender pronoun (“She’s been running great these past few weeks!”) New research suggests that our tendency to see gender everywhere even applies to abstract ideas such as numbers. Across cultures, people see odd numbers as male and even numbers as female.

Scientists have long known that language can influence how we perceive gender in objects. Some languages consistently refer to certain objects as male or female, and this in turn, influences how speakers of that language think about those objects. Webb Phillips of the Max Planck Institute, Lauren Schmidt of HeadLamp Research, and Lera Boroditsky at Stanford University asked Spanish- and German-speaking bilinguals to rate various objects according to whether they seemed more similar to males or females. They found that people rated each object according to its grammatical gender. For example, Germans see the moon as being more like a man, because the German word for moon is grammatically masculine (“der Mond”). In contrast, Spanish-speakers see the moon as being more like a woman, because in Spanish the word for moon is grammatically feminine (“la Luna”).

Aside from language, objects can also become infused with gender based on their appearance, who typically uses them, and whether they seem to possess the type of characteristics usually associated with men or women. David Gal and James Wilkie of Northwestern University studied how people view gender in everyday objects, such as food and furniture. They found that people see food dishes containing meat as more masculine and salads and sour dairy products as more feminine. People see furniture items, such as tables and trash cans, as more feminine when they feature rounded, rather than sharp, edges.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Scientific American.[end-div]

What do Papua New Guinea and New York City Have In Common?

Well, the simple answer is: around 800 spoken languages. Or to be more precise, Papua New Guinea is home to an astounding 830 different languages. New York City comes in a close second, with around 800 spoken languages – and that’s not counting when the United Nations is in session on Manhattan’s East Side. Sadly, some of the rarer tongues spoken in New York and Papua New Guinea, and around the globe for that matter, are rapidly becoming extinct – at the rate of around one language every two weeks.

As the Economist points out, a group of linguists in New York City is working to codify some of the city’s most endangered tongues.

[div class=attrib]From the Economist:[end-div]

New York is also home, of course, to a lot of academic linguists, and three of them have got together to create an organisation called the Endangered Language Alliance (ELA), which is ferreting out speakers of unusual tongues from the city’s huddled immigrant masses. The ELA, which was set up last year by Daniel Kaufman, Juliette Blevins and Bob Holman, has worked in detail on 12 languages since its inception. It has codified their grammars, their pronunciations and their word-formation patterns, as well as their songs and legends. Among the specimens in its collection are Garifuna, which is spoken by descendants of African slaves who made their homes on St Vincent after a shipwreck unexpectedly liberated them; Mamuju, from Sulawesi in Indonesia; Mahongwe, a language from Gabon; Shughni, from the Pamirian region of Tajikistan; and an unusual variant of a Mexican language called Totonac.

Each volunteer speaker of a language of interest is first tested with what is known as a Swadesh list. This is a set of 207 high-frequency, slow-to-change words such as parts of the body, colours and basic verbs like eat, drink, sleep and kill. The Swadesh list is intended to ascertain an individual’s fluency before he is taken on. Once he has been accepted, Dr Kaufman and his colleagues start chipping away at the language’s phonology (the sounds of which it is composed) and its syntax (how its meaning is changed by the order of words and phrases). This sort of analysis is the bread and butter of linguistics.

Every so often, though, the researchers come across a bit of jam. The Mahongwe word manono, for example, means “I like” when spoken soft and flat, and “I don’t like” when the first syllable is a tad sharper in tone. Similarly, mbaza could be either “chest” or “council house”. In both cases, the two words are nearly indistinguishable to an English speaker, but yield starkly different patterns when run through a spectrograph. Manono is a particular linguistic oddity, since it uses only tone to differentiate an affirmative from a negative—a phenomenon the ELA has since discovered applies to all verbs in Mahongwe.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

MondayPoem: A Little Language

This week theDiagonal triangulates its sights on the topic of language and communication. So, we introduce an apt poem by Robert Duncan. Of Robert Duncan, Poetry Foundation writes:

Though the name Robert Duncan is not well known outside the literary world, within that world it has become associated with a number of superlatives. Kenneth Rexroth, writing in Assays, names Duncan “one of the most accomplished, one of the most influential” of the postwar American poets.

 

By Robert Duncan:

– A Little Language
I know a little language of my cat, though Dante says
that animals have no need of speech and Nature
abhors the superfluous.   My cat is fluent.   He
converses when he wants with me.   To speak
is natural.   And whales and wolves I’ve heard
in choral soundings of the sea and air
know harmony and have an eloquence that stirs
my mind and heart—they touch the soul.   Here
Dante’s religion that would set Man apart
damns the effluence of our life from us
to build therein its powerhouse.
It’s in his animal communication Man is
true, immediate, and
in immediacy, Man is all animal.
His senses quicken in the thick of the symphony,
old circuits of animal rapture and alarm,
attentions and arousals in which an identity rearrives.
He hears
particular voices among
the concert, the slightest
rustle in the undertones,
rehearsing a nervous aptitude
yet to prove his. He sees the flick
of significant red within the rushing mass
of ruddy wilderness and catches the glow
of a green shirt
to delite him in a glowing field of green
—it speaks to him—
and in the arc of the spectrum color
speaks to color.
The rainbow articulates
a promise he remembers
he but imitates
in noises that he makes,
this speech in every sense
the world surrounding him.
He picks up on the fugitive tang of mace
amidst the savory mass,
and taste in evolution is an everlasting key.
There is a pun of scents in what makes sense.
Myrrh it may have been,
the odor of the announcement that filld the house.
He wakes from deepest sleep
upon a distant signal and waits
as if crouching, springs
to life.

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

Sleep: Defragmenting the Brain

[div class=attrib]From Neuroskeptic:[end-div]

After a period of heavy use, hard disks tend to get ‘fragmented’. Data gets written all over random parts of the disk, and it gets inefficient to keep track of it all.

That’s why you need to run a defragmentation program occasionally. Ideally, you do this overnight, while you’re asleep, so it doesn’t stop you from using the computer.

A new paper from some Stanford neuroscientists argues that the function of sleep is to reorganize neural connections – a bit like a disk defrag for the brain – although it’s also a bit like compressing files to make more room, and a bit like a system reset: Synaptic plasticity in sleep: learning, homeostasis and disease

The basic idea is simple. While you’re awake, you’re having experiences, and your brain is forming memories. Memory formation involves a process called long-term potentiation (LTP) which is essentially the strengthening of synaptic connections between nerve cells.

Yet if LTP is strengthening synapses, and we’re learning all our lives, wouldn’t the synapses eventually hit a limit? Couldn’t they max out, so that they could never get any stronger?

Worse, the synapses that strengthen during memory are primarily glutamate synapses – and these are dangerous. Glutamate is a common neurotransmitter, and it’s even a flavouring, but it’s also a toxin.

Too much glutamate damages the very cells that receive the messages. Rather like how sound is useful for communication, but stand next to a pneumatic drill for an hour, and you’ll go deaf.

So, if our brains were constantly forming stronger glutamate synapses, we might eventually run into serious problems. This is why we sleep, according to the new paper. Indeed, sleep deprivation is harmful to health, and this theory would explain why.

[div class=attrib]More from theSource here.[end-div]

The Rise of Twins

[div class=attrib]From Slate:[end-div]

Twenty-five years ago, almost no one had a cell phone. Very few of us had digital cameras, and laptop computers belonged only to the very rich. But there is something else—not electronic, but also man-made—that has climbed from the margins of the culture in the 1980s to become a standard accoutrement in upscale neighborhoods across the land: twins.

According to the latest data from the Centers for Disease Control and Prevention the U.S. twin rate has skyrocketed from one pair born out of every 53 live births in 1980 to one out of every 31 births in 2008. Where are all these double-babies coming from? And what’s going to happen in years to come—will the multiple-birth rate continue to grow until America ends up a nation of twins?

The twin boom can be explained by changes in when and how women are getting pregnant. Demographers have in recent years described a “delayer boom,” in which birth rates have risen among the sort of women—college-educated—who tend to put off starting a family into their mid-30s or beyond. There are now more in this group than ever before: In 1980, just 12.8 percent of women had attained a bachelor’s degree or higher; by 2010, that number had almost tripled, to 37 percent. And women in their mid-30s have multiple births at a higher rate than younger women. A mother who is 35, for example, is four times more likely than a mother who is 20 to give birth to twins. That seems to be on account of her producing more follicle-stimulating hormone, or FSH, which boosts ovulation. The more FSH you have in your bloodstream, the greater your chances of producing more than one egg in each cycle, and having fraternal twins as a result.

[div class=attrib]More from theSource here.[end-div]

Science: A Contest of Ideas

[div class=attrib]From Project Syndicate:[end-div]

It was recently discovered that the universe’s expansion is accelerating, not slowing, as was previously thought. Light from distant exploding stars revealed that an unknown force (dubbed “dark energy”) more than outweighs gravity on cosmological scales.

Unexpected by researchers, such a force had nevertheless been predicted in 1915 by a modification that Albert Einstein proposed to his own theory of gravity, the general theory of relativity. But he later dropped the modification, known as the “cosmological term,” calling it the “biggest blunder” of his life.

So the headlines proclaim: “Einstein was right after all,” as though scientists should be compared as one would clairvoyants: Who is distinguished from the common herd by knowing the unknowable – such as the outcome of experiments that have yet to be conceived, let alone conducted? Who, with hindsight, has prophesied correctly?

But science is not a competition between scientists; it is a contest of ideas – namely, explanations of what is out there in reality, how it behaves, and why. These explanations are initially tested not by experiment but by criteria of reason, logic, applicability, and uniqueness at solving the mysteries of nature that they address. Predictions are used to test only the tiny minority of explanations that survive these criteria.

The story of why Einstein proposed the cosmological term, why he dropped it, and why cosmologists today have reintroduced it illustrates this process. Einstein sought to avoid the implication of unmodified general relativity that the universe cannot be static – that it can expand (slowing down, against its own gravity), collapse, or be instantaneously at rest, but that it cannot hang unsupported.

This particular prediction cannot be tested (no observation could establish that the universe is at rest, even if it were), but it is impossible to change the equations of general relativity arbitrarily. They are tightly constrained by the explanatory substance of Einstein’s theory, which holds that gravity is due to the curvature of spacetime, that light has the same speed for all observers, and so on.

But Einstein realized that it is possible to add one particular term – the cosmological term – and adjust its magnitude to predict a static universe, without spoiling any other explanation. All other predictions based on the previous theory of gravity – that of Isaac Newton – that were testable at the time were good approximations to those of unmodified general relativity, with that single exception: Newton’s space was an unmoving background against which objects move. There was no evidence yet, contradicting Newton’s view – no mystery of expansion to explain. Moreover, anything beyond that traditional conception of space required a considerable conceptual leap, while the cosmological term made no measurable difference to other predictions. So Einstein added it.
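
For readers who want to see where the cosmological term sits, here is the standard textbook form of Einstein’s field equations with the term included (a well-known formula, not one quoted in the article), written in LaTeX:

% Einstein's field equations with the cosmological term \Lambda.
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}

Setting \Lambda to zero recovers unmodified general relativity. Einstein tuned a small positive \Lambda so the equations would admit a static universe; today a positive \Lambda is the simplest way to model the accelerating expansion attributed to dark energy.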

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

Creativity and Anger

It turns out that creativity gets a boost from anger. While anger is certainly not beneficial in many contexts, researchers have found that angry people are more likely to be creative.

[div class=attrib]From Scientific American:[end-div]

This counterintuitive idea was pursued by researchers Matthijs Baas, Carsten De Dreu, and Bernard Nijstad in a series of studies  recently published in The Journal of Experimental Social Psychology. They found that angry people were more likely to be creative – though this advantage didn’t last for long, as the taxing nature of anger eventually leveled out creativity. This study joins several recent lines of research exploring the relative upside to anger – the ways in which anger is not only less harmful than typically assumed, but may even be helpful (though perhaps in small doses).

In an initial study, the researchers found that feeling angry was indeed associated with brainstorming in a more unstructured manner, consistent with “creative” problem solving. In a second study, the researchers first elicited anger from the study participants (or sadness, or a non-emotional state) and then asked them to engage in a brainstorming session in which they generated ideas to preserve and improve the environment. In the beginning of this task, angry participants generated more ideas (by volume) and generated more original ideas (those thought of by 1 percent or less of the other participants), compared to the other sad or non-emotional participants. However, this benefit was only present in the beginning of the task, and eventually, the angry participants generated only as many ideas as the other participants.

These findings reported by Baas and colleagues make sense, given what we already know about anger. Though anger may be unpleasant to feel, it is associated with a variety of attributes that may facilitate creativity. First, anger is an energizing feeling, important for the sustained attention needed to solve problems creatively. Second, anger leads to more flexible, unstructured thought processes.

Anecdotal evidence from internal meetings at Apple certainly reinforces the notion that creativity may benefit from well-channeled anger. Apple is often cited as one of the world’s most creative companies.

[div class=attrib]From Jonah Lehrer over at Wired:[end-div]

Many of my favorite Steve Jobs stories feature his anger, as he unleashes his incisive temper on those who fail to meet his incredibly high standards. A few months ago, Adam Lashinsky had a fascinating article in Fortune describing life inside the sanctum of 1 Infinite Loop. The article begins with the following scene:

In the summer of 2008, when Apple launched the first version of its iPhone that worked on third-generation mobile networks, it also debuted MobileMe, an e-mail system that was supposed to provide the seamless synchronization features that corporate users love about their BlackBerry smartphones. MobileMe was a dud. Users complained about lost e-mails, and syncing was spotty at best. Though reviewers gushed over the new iPhone, they panned the MobileMe service.

Steve Jobs doesn’t tolerate duds. Shortly after the launch event, he summoned the MobileMe team, gathering them in the Town Hall auditorium in Building 4 of Apple’s campus, the venue the company uses for intimate product unveilings for journalists. According to a participant in the meeting, Jobs walked in, clad in his trademark black mock turtleneck and blue jeans, clasped his hands together, and asked a simple question:

“Can anyone tell me what MobileMe is supposed to do?” Having received a satisfactory answer, he continued, “So why the fuck doesn’t it do that?”

For the next half-hour Jobs berated the group. “You’ve tarnished Apple’s reputation,” he told them. “You should hate each other for having let each other down.” The public humiliation particularly infuriated Jobs. Walt Mossberg, the influential Wall Street Journal gadget columnist, had panned MobileMe. “Mossberg, our friend, is no longer writing good things about us,” Jobs said. On the spot, Jobs named a new executive to run the group.

Brutal, right? But those flashes of intolerant anger have always been an important part of Jobs’ management approach. He isn’t shy about the confrontation of failure and he doesn’t hold back negative feedback. He is blunt at all costs, a cultural habit that has permeated the company. Jonathan Ive, the lead designer at Apple, describes the tenor of group meetings as “brutally critical.”

[div class=attrib]More from theSource here and here.[end-div]

[div class=attrib]Image of Brandy Norwood, courtesy of Wikipedia / Creative Commons.[end-div]

CEO, COO, CFO, CTO: Acronym Soup Explained

[div class=attrib]From Slate:[end-div]

Steve Jobs resigned from his position as Apple’s CEO, or chief executive officer, Wednesday. Taking his place is Tim Cook, previously the company’s COO, or chief operating officer. They also have a CFO, and, at one point or another, the company has had a CIO and CTO, too. When did we start calling corporate bosses C-this-O and C-that-O?

The 1970s. The phrase chief executive officer has been used, if at times rarely, in connection to corporate structures since at least the 19th century. (See, for instance, this 1888 book on banking law in Canada.) About 40 years ago, the phrase began gaining ground on president as the preferred title for the top director in charge of a company’s daily operations. Around the same time, the use of CEO in printed material surged and, if the Google Books database is to be believed, surpassed the long-form chief executive officer in the early 1980s. CFO has gained popularity, too, but at a much slower rate.

The online version of the Oxford English Dictionary published its first entries for CEO and CFO in January of this year. The entries’ first citations are a 1972 article in the Harvard Business Review and a 1971 Boston Globe article, respectively. (Niche publications were using the initials at least a half-decade earlier.) The New York Times seems to have printed its first CEO in a table graphic for a 1972 article, “Executives’ Pay Still Rising,” when space for the full phrase might have been lacking.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image of Steve Jobs and Bill Gates courtesy of Wikipedia / Creative Commons.[end-div]

A Better Way to Board An Airplane

Frequent fliers the world over may soon find themselves thanking a physicist named Jason Steffen. Back in 2008 he ran some computer simulations to find a more efficient way for travelers to board an airplane. Recent tests inside a mock cabin interior confirmed Steffen’s model to be both faster for the airline and easier for passengers, and, best of all, it means less time spent waiting in the aisle and jostling for overhead bin space.

[div class=attrib]From the New Scientist:[end-div]

The simulations showed that the best way was to board every other row of window seats on one side of the plane, starting from the back, then do the mirror image on the other side. The remaining window seats on the first side would follow, again starting from the back; then their counterparts on the second side; followed by the same procedure with middle seats and lastly aisles (see illustration).

In Steffen’s computer models, the strategy minimized traffic jams in the aisle and allowed multiple people to stow their luggage simultaneously. “It spread people out along the length of the aisle,” Steffen says. “They’d all put their stuff away and get out of the way at the same time.”

Steffen published his model in the Journal of Air Transport Management in 2008, then went back to his “day job” searching for extrasolar planets. He mostly forgot about the plane study until this May, when he received an email from Jon Hotchkiss, the producer of a new TV show called “This vs That.”

“It’s a show that answers the kinds of scientific questions that come up in people’s everyday life,” Hotchkiss says. He wanted to film an episode addressing the question of the best way to board a plane, and wanted Steffen on board as an expert commentator. Steffen jumped at the chance: “I said, hey, someone wants to test my theory? Sure!”

They, along with 72 volunteers and Hollywood extras, spent a day on a mock plane that has been used in movies such as Kill Bill and Miss Congeniality 2.
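
To make the ordering described above concrete, here is a small illustrative Python script (our sketch of the published method, not Steffen’s own simulation code) that generates the boarding sequence for a single-aisle cabin with three seats on each side:

# Sketch of the Steffen boarding order for a 3-3 cabin: alternate rows
# of window seats on one side from the back, then the mirror-image side,
# then the skipped rows on each side, and repeat the whole pattern for
# middle seats and finally aisle seats.
def steffen_order(num_rows=12):
    order = []
    # Seat letters: A/F are windows, B/E middles, C/D aisles.
    for left_seat, right_seat in [("A", "F"), ("B", "E"), ("C", "D")]:
        for skipped in (0, 1):  # every other row first, remaining rows second
            for seat in (left_seat, right_seat):
                for row in range(num_rows, 0, -1):  # back of the plane first
                    if (num_rows - row) % 2 == skipped:
                        order.append(f"{row}{seat}")
    return order

print(steffen_order(6))
# ['6A', '4A', '2A', '6F', '4F', '2F', '5A', '3A', '1A', '5F', ...]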

[div class=attrib]More from theSource here.[end-div]

MondayPoem: A Sunset of the City

Labor Day traditionally signals the end of summer. A poem by Gwendolyn Brooks sets the mood. She was the first black author to win the Pulitzer Prize.

[div class=attrib]By Gwendolyn Brooks:[end-div]

A Sunset of the City —

Already I am no longer looked at with lechery or love.
My daughters and sons have put me away with marbles and dolls,
Are gone from the house.
My husband and lovers are pleasant or somewhat polite
And night is night.

It is a real chill out,
The genuine thing.
I am not deceived, I do not think it is still summer
Because sun stays and birds continue to sing.

It is summer-gone that I see, it is summer-gone.
The sweet flowers indrying and dying down,
The grasses forgetting their blaze and consenting to brown.

It is a real chill out. The fall crisp comes.
I am aware there is winter to heed.
There is no warm house
That is fitted with my need.
I am cold in this cold house this house
Whose washed echoes are tremulous down lost halls.
I am a woman, and dusty, standing among new affairs.
I am a woman who hurries through her prayers.

Tin intimations of a quiet core to be my
Desert and my dear relief
Come: there shall be such islanding from grief,
And small communion with the master shore.
Twang they. And I incline this ear to tin,
Consult a dual dilemma. Whether to dry
In humming pallor or to leap and die.

Somebody muffed it? Somebody wanted to joke.

[div class=attrib]Image courtesy of Poetry Foundation.[end-div]

Data, data, data: It’s Everywhere

Cities are one of the most remarkable and peculiar inventions of our species. They provide billions of the human family with a framework for food, shelter and security. Increasingly, cities are becoming hubs in a vast data network where public officials and citizens mine and leverage vast amounts of information.

[div class=attrib]Krystal D’Costa for Scientific American:[end-div]

Once upon a time there was a family that lived in homes raised on platforms in the sky. They had cars that flew and sorta drove themselves. Their sidewalks carried them to where they needed to go. Video conferencing was the norm, as were appliances which were mostly automated. And they had a robot that cleaned and dispensed sage advice.

I was always a huge fan of the Jetsons. The family dynamics I could do without—Hey, Jane, you clearly had outside interests. You totally could have pursued them, and rocked at it too!—but they were a social reflection of the times even while set in the future, so that is what it is. But their lives were a technological marvel! They could travel by tube, electronic arms dressed them (at the push of the button), and Rosie herself was astounding. If it rained, the Superintendent could move their complex to a higher altitude to enjoy the sunshine! Though it’s a little terrifying to think that Mr. Spacely could pop up on video chat at any time. Think about your boss having that sort of access. Scary, right?

The year 2062 used to seem impossibly far away. But as the setting for the space-age family’s adventures looms on the horizon, even the tech-expectant Jetsons would have to agree that our worlds are perhaps closer than we realize. The moving sidewalks and push button technology (apps, anyone?) have been realized, we’re developing cars that can drive themselves, and we’re on our way to building more Rosie-like AI. Heck, we’re even testing the limits of personal flight. No joke. We’re even working to build a smarter electrical grid, one that would automatically adjust home temperatures and more accurately measure usage.

Sure, we have a ways to go just yet, but we’re more than peering over the edge. We’ve taken the first big step in revolutionizing our management of data.

The September special issue of Scientific American focuses on the strengths of urban centers. Often disparaged for congestion, pollution, and perceived apathy, cities have a history of being vilified. And yet, they’re also seats of innovation. The Social Nexus explores the potential waiting to be unleashed by harnessing data.

If there’s one thing cities have an abundance of, it’s data. Number of riders on the subway, parking tickets given in a certain neighborhood, number of street fairs, number of parking facilities, broken parking meters—if you can imagine it, chances are the City has the data available, and it’s now open for you to review, study, compare, and shape, so that you can help build a city that’s responsive to your needs.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

Will nostalgia destroy pop culture?

[div class=attrib]Thomas Rogers for Salon:[end-div]

Over the last decade, American culture has been overtaken by a curious, overwhelming sense of nostalgia. Everywhere you look, there seems to be some new form of revivalism going on. The charts are dominated by old-school-sounding acts like Adele and Mumford & Sons. The summer concert schedule is dominated by reunion tours. TV shows like VH1’s “I Love the 90s” allow us to endlessly rehash the catchphrases of the recent past. And, thanks to YouTube and iTunes, new forms of music and pop culture are facing increasing competition from the ever-more-accessible catalog of older acts.

In his terrific new book, “Retromania,” music writer Simon Reynolds looks at how this nostalgia obsession is playing itself out everywhere from fashion to performance art to electronic music — and comes away with a worrying prognosis. If we continue looking backward, he argues, we’ll never have transformative decades, like the 1960s, or bold movements like rock ‘n’ roll, again. If all we watch and listen to are things that we’ve seen and heard before, and revive trends that have already existed, culture becomes an inescapable feedback loop.

Salon spoke to Reynolds over the phone from Los Angeles about the importance of the 1960s, the strangeness of Mumford & Sons — and why our future could be defined by boredom.

In the book you argue that our culture has increasingly been obsessed with looking backward, and that’s a bad thing. What makes you say that?

Every day, some new snippet of news comes along that is somehow connected to reconsuming the past. Just the other day I read that the famous Reading Festival in Britain is going to be screening a 1992 Nirvana concert during their festival. These events are like cultural antimatter. They won’t be remembered 20 years from now, and the more of them there are, the more alarming it is. I can understand why people want to go to them — they’re attractive and comforting. But this nostalgia seems to have crept into everything. The other day my daughter, who is 5 years old, was at camp, and they had an ’80s day. How can my daughter even understand what that means? She said the counselors were dressed really weird.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Slate.[end-div]

Dark Matter: An Illusion?

Cosmologists and particle physicists have long proposed the existence of Dark Matter. It’s so called because it cannot be seen or sensed directly; it is inferred from its gravitational effects on visible matter. Together with its theoretical cousin, Dark Energy, it is hypothesized to make up most of the universe. In fact, the regular star-stuff — matter and energy — of which we, our planet, solar system and the visible universe are made, consists of only a paltry 4 percent.

Dark Matter and Dark Energy were originally proposed to account for discrepancies in calculations of the mass of large objects such as galaxies and galaxy clusters, and calculations derived from the mass of smaller visible objects such as stars, nebulae and interstellar gas.

The problem with Dark Matter is that it remains elusive and for the most part a theoretical construct. And, now a new group of theories suggest that the dark stuff may in fact be an illusion.

[div class=attrib]From National Geographic:[end-div]

The mysterious substance known as dark matter may actually be an illusion created by gravitational interactions between short-lived particles of matter and antimatter, a new study says.

Dark matter is thought to be an invisible substance that makes up almost a quarter of the mass in the universe. The concept was first proposed in 1933 to explain why the outer galaxies in galaxy clusters orbit faster than they should, based on the galaxies’ visible mass.

At the observed speeds, the outer galaxies should be flung out into space, since the clusters don’t appear to have enough mass to keep the galaxies at their edges gravitationally bound.

So physicists proposed that the galaxies are surrounded by halos of invisible matter. This dark matter provides the extra mass, which in turn creates gravitational fields strong enough to hold the clusters together.

In the new study, physicist Dragan Hajdukovic at the European Organization for Nuclear Research (CERN) in Switzerland proposes an alternative explanation, based on something he calls the “gravitational polarization of the quantum vacuum.”

Empty Space Filled With “Virtual” Particles

The quantum vacuum is the name physicists give to what we see as empty space.

According to quantum physics, empty space is not actually barren but is a boiling sea of so-called virtual particles and antiparticles constantly popping in and out of existence.

Antimatter particles are mirror opposites of normal matter particles. For example, an antiproton is a negatively charged version of the positively charged proton, one of the basic constituents of the atom.

When matter and antimatter collide, they annihilate in a flash of energy. The virtual particles spontaneously created in the quantum vacuum appear and then disappear so quickly that they can’t be directly observed.

In his new mathematical model, Hajdukovic investigates what would happen if virtual matter and virtual antimatter were not only electrical opposites but also gravitational opposites—an idea some physicists previously proposed.

“Mainstream physics assumes that there is only one gravitational charge, while I have assumed that there are two gravitational charges,” Hajdukovic said.

According to his idea, outlined in the current issue of the journal Astrophysics and Space Science, matter has a positive gravitational charge and antimatter a negative one.

That would mean matter and antimatter are gravitationally repulsive, so that an object made of antimatter would “fall up” in the gravitational field of Earth, which is composed of normal matter.

Particles and antiparticles could still collide, however, since gravitational repulsion is much weaker than electrical attraction.

How Galaxies Could Get Gravity Boost

While the idea of particle antigravity might seem exotic, Hajdukovic says his theory is based on well-established tenets in quantum physics.

For example, it’s long been known that particles can team up to create a so-called electric dipole, with positively charged particles at one end and negatively charged particles at the other. (See “Universe’s Existence May Be Explained by New Material.”)

According to theory, there are countless electric dipoles created by virtual particles in any given volume of the quantum vacuum.

All of these electric dipoles are randomly oriented—like countless compass needles pointing every which way. But if the dipoles form in the presence of an existing electric field, they immediately align along the same direction as the field.

According to quantum field theory, this sudden snapping to order of electric dipoles, called polarization, generates a secondary electric field that combines with and strengthens the first field.

Hajdukovic suggests that a similar phenomenon happens with gravity. If virtual matter and antimatter particles have different gravitational charges, then randomly oriented gravitational dipoles would be generated in space.

[div class=attrib]More from theSource here.[end-div]

Improvements to Our Lives Through Science

Ask a hundred people how science can be used for the good and you’re likely to get a hundred different answers. Well, Edge Magazine did just that, posing the question “What scientific concept would improve everybody’s cognitive toolkit?” to 159 critical thinkers. Below we excerpt some of our favorites. The thoroughly engrossing, novel-length article can be found here in its entirety.

[div class=attrib]From Edge:[end-div]

ether
Richard H. Thaler. Father of behavioral economics.

I recently posted a question in this space asking people to name their favorite example of a wrong scientific belief. One of my favorite answers came from Clay Shirky. Here is an excerpt:
The existence of ether, the medium through which light (was thought to) travel. It was believed to be true by analogy — waves propagate through water, and sound waves propagate through air, so light must propagate through X, and the name of this particular X was ether.
It’s also my favorite because it illustrates how hard it is to accumulate evidence for deciding something doesn’t exist. Ether was both required by 19th century theories and undetectable by 19th century apparatus, so it accumulated a raft of negative characteristics: it was odorless, colorless, inert, and so on.

Ecology
Brian Eno. Artist; Composer; Recording Producer: U2, Coldplay, Talking Heads, Paul Simon.

That idea, or bundle of ideas, seems to me the most important revolution in general thinking in the last 150 years. It has given us a whole new sense of who we are, where we fit, and how things work. It has made commonplace and intuitive a type of perception that used to be the province of mystics — the sense of wholeness and interconnectedness.
Beginning with Copernicus, our picture of a semi-divine humankind perfectly located at the centre of The Universe began to falter: we discovered that we live on a small planet circling a medium sized star at the edge of an average galaxy. And then, following Darwin, we stopped being able to locate ourselves at the centre of life. Darwin gave us a matrix upon which we could locate life in all its forms: and the shocking news was that we weren’t at the centre of that either — just another species in the innumerable panoply of species, inseparably woven into the whole fabric (and not an indispensable part of it either). We have been cut down to size, but at the same time we have discovered ourselves to be part of the most unimaginably vast and beautiful drama called Life.

We Are Not Alone In The Universe
J. Craig Venter. Leading scientist of the 21st century.

I cannot imagine any single discovery that would have more impact on humanity than the discovery of life outside of our solar system. There is a human-centric, Earth-centric view of life that permeates most cultural and societal thinking. Finding that there are multiple, perhaps millions of origins of life and that life is ubiquitous throughout the universe will profoundly affect every human.

Correlation is not a cause
Susan Blackmore. Psychologist; Author, Consciousness: An Introduction.

The phrase “correlation is not a cause” (CINAC) may be familiar to every scientist but has not found its way into everyday language, even though critical thinking and scientific understanding would improve if more people had this simple reminder in their mental toolkit.
One reason for this lack is that CINAC can be surprisingly difficult to grasp. I learned just how difficult when teaching experimental design to nurses, physiotherapists and other assorted groups. They usually understood my favourite example: imagine you are watching at a railway station. More and more people arrive until the platform is crowded, and then — hey presto — along comes a train. Did the people cause the train to arrive (A causes B)? Did the train cause the people to arrive (B causes A)? No, they both depended on a railway timetable (C caused both A and B).

A Statistically Significant Difference in Understanding the Scientific Process
Diane F. Halpern. Professor, Claremont McKenna College; Past-president, American Psychological Society.

Statistically significant difference — It is a simple phrase that is essential to science and that has become common parlance among educated adults. These three words convey a basic understanding of the scientific process, random events, and the laws of probability. The term appears almost everywhere that research is discussed — in newspaper articles, advertisements for “miracle” diets, research publications, and student laboratory reports, to name just a few of the many diverse contexts where the term is used. It is a shorthand abstraction for a sequence of events that includes an experiment (or other research design), the specification of a null and alternative hypothesis, (numerical) data collection, statistical analysis, and the probability of an unlikely outcome. That is a lot of science conveyed in a few words.

 

Confabulation
Fiery Cushman. Post-doctoral fellow, Mind/Brain/Behavior Interfaculty Initiative, Harvard University.

We are shockingly ignorant of the causes of our own behavior. The explanations that we provide are sometimes wholly fabricated, and certainly never complete. Yet, that is not how it feels. Instead it feels like we know exactly what we’re doing and why. This is confabulation: Guessing at plausible explanations for our behavior, and then regarding those guesses as introspective certainties. Every year psychologists use dramatic examples to entertain their undergraduate audiences. Confabulation is funny, but there is a serious side, too. Understanding it can help us act better and think better in everyday life.

We are Lost in Thought
Sam Harris. Neuroscientist; Chairman, The Reason Project; Author, Letter to a Christian Nation.

I invite you to pay attention to anything — the sight of this text, the sensation of breathing, the feeling of your body resting against your chair — for a mere sixty seconds without getting distracted by discursive thought. It sounds simple enough: Just pay attention. The truth, however, is that you will find the task impossible. If the lives of your children depended on it, you could not focus on anything — even the feeling of a knife at your throat — for more than a few seconds, before your awareness would be submerged again by the flow of thought. This forced plunge into unreality is a problem. In fact, it is the problem from which every other problem in human life appears to be made.
I am by no means denying the importance of thinking. Linguistic thought is indispensable to us. It is the basis for planning, explicit learning, moral reasoning, and many other capacities that make us human. Thinking is the substance of every social relationship and cultural institution we have. It is also the foundation of science. But our habitual identification with the flow of thought — that is, our failure to recognize thoughts as thoughts, as transient appearances in consciousness — is a primary source of human suffering and confusion.

Knowledge
Mark Pagel. Professor of Evolutionary Biology, Reading University, England, and the Santa Fe Institute.

The Oracle of Delphi famously pronounced Socrates to be “the most intelligent man in the world because he knew that he knew nothing”. Over 2,000 years later the physicist-turned-historian Jacob Bronowski would emphasize — in the last episode of his landmark 1970s television series The Ascent of Man — the danger of our all-too-human conceit of thinking we know something. What Socrates knew, and what Bronowski had come to appreciate, is that knowledge — true knowledge — is difficult, maybe even impossible, to come by. It is prone to misunderstanding and counterfactuals, and, most importantly, it can never be acquired with exact precision; there will always be some element of doubt about anything we come to “know” from our observations of the world.

[div class=attrib]More from theSource here.[end-div]

The Business of Making Us Feel Good

Advertisers have long known how to pull at our fickle emotions and inner motivations to sell their products. Further still, many corporations fine-tune their products to the nth degree to ensure we learn to crave more of the same. Whether it’s the comforting feel of an armchair, the soft yet lingering texture of yogurt, the fresh scent of hand soap, or the crunchiness of the perfect potato chip, myriad focus groups, industrial designers and food scientists are hard at work engineering our addictions.

[div class=attrib]From the New York Times:[end-div]

Feeling low? According to a new study in the Journal of Consumer Research, when people feel bad, their sense of touch quickens and they instinctively want to hug something or someone. Tykes cling to a teddy bear or blanket. It’s a mammal thing. If young mammals feel gloomy, it’s usually because they’re hurt, sick, cold, scared or lost. So their brain rewards them with a gust of pleasure if they scamper back to mom for a warm nuzzle and a meal. No need to think it over. All they know is that, when a negative mood hits, a cuddle just feels right; and if they’re upbeat and alert, then their eyes hunger for new sights and they’re itching to explore.

It’s part of evolution’s gold standard, the old carrot-and-stick gambit, an impulse that evades reflection because it evolved to help infants thrive by telling them what to do — not in words but in sequins of taste, heartwarming touches, piquant smells, luscious colors.

Back in the days before our kind knew what berries to eat, let alone which merlot to choose or HD-TV to buy, the question naturally arose: How do you teach a reckless animal to live smart? Some brains endorsed correct, lifesaving behavior by doling out sensory rewards. Healthy food just tasted yummy, which is why we now crave the sweet, salty, fatty foods our ancestors did — except that for them such essentials were rare, needing to be painstakingly gathered or hunted. The seasoned hedonists lived to explore and nuzzle another day — long enough to pass along their snuggly, junk-food-bedeviled genes.

[div class=attrib]More from theSource here.[end-div]

Cities Might Influence Not Just Our Civilizations, but Our Evolution

[div class=attrib]From Scientific American:[end-div]

Cities reverberate through history as centers of civilization. Ur. Babylon. Rome. Baghdad. Tenochtitlan. Beijing. Paris. London. New York. As pivotal as cities have been for our art and culture, our commerce and trade, our science and technology, our wars and peace, it turns out that cities might have been even more important than we had suspected, influencing our very genes and evolution.

Cities have been painted as hives of scum and villainy, dens of filth and squalor, with unsafe water, bad sanitation, industrial pollution and overcrowded neighborhoods. It turns out that by bringing people closer together and spreading disease, cities might increase the chance that, over time, the descendants of survivors could resist infections.

Evolutionary biologist Ian Barnes at the University of London and his colleagues focused on a genetic variant with the alphabet-soup name of SLC11A1 1729+55del4. This variant is linked with natural resistance to germs that dwell within cells, such as tuberculosis and leprosy.

The scientists analyzed DNA samples from 17 modern populations that had occupied their cities for various lengths of time. The cities ranged from Çatalhöyük in Turkey, settled in roughly 6000 B.C., to Juba in Sudan, settled in the 20th century.

The researchers discovered an apparently highly significant link between the occurrence of this genetic variant and the duration of urban settlement. People from a long-populated urban area often seemed better adapted to resisting these specific types of infections — for instance, those in areas settled for more than 5,200 years, such as Susa in Iran, were almost certain to possess this variant, while in cities settled for only a few hundred years, such as Yakutsk in Siberia, only 70 percent to 80 percent of people would have it.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Scientific American.[end-div]

Our Kids’ Glorious New Age of Distraction

[div class=attrib]From Slate:[end-div]

Children are not what they used to be. They tweet and blog and text without batting an eyelash. Whenever they need the answer to a question, they simply log onto their phone and look it up on Google. They live in a state of perpetual, endless distraction, and, for many parents and educators, it’s a source of real concern. Will future generations be able to finish a whole book? Will they be able to sit through an entire movie without checking their phones? Are we raising a generation of impatient brats?

According to Cathy N. Davidson, a professor of interdisciplinary studies at Duke University and the author of the new book “Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn,” much of the panic about children’s shortened attention spans isn’t just misguided, it’s harmful. Younger generations, she argues, don’t just think about technology more casually, they’re actually wired to respond to it in a different manner than we are, and it’s up to us — and our education system — to catch up to them.

Davidson is personally invested in finding a solution to the problem. As vice provost at Duke, she spearheaded a project to hand out a free iPod to every member of the incoming class, and began using wikis and blogs as part of her teaching. In a move that garnered national media attention, she crowd-sourced the grading in her course. In her book, she explains how everything from video gaming to redesigned schools can enhance our children’s education — and ultimately, our future.

Salon spoke to Davidson over the phone about the structure of our brains, the danger of multiple-choice testing, and what the workplace of the future will actually look like.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

A Holiday in Hell

Not quite as poetic and intricate as Dante’s circuitous map of hell but a fascinating invention by Tom Gauld nonetheless.

[div class=attrib]From Frank Jacobs for Strange Maps:[end-div]

“A perpetual holiday is a good working definition of hell”, said George Bernard Shaw; in fact, just the odd few weeks of summer vacation may be near enough unbearable – what with all the frantic packing and driving, the getting lost and having forgotten, the primitive lodgings, lousy food and miserable weather, not to mention the risk of disease and unfriendly natives.

And yet, even for the bored teenagers forced to join their parents on their annual work detox, the horrors of the summer holiday mix with the chance of thrilling adventures, beckoning beyond the unfamiliar horizon.

Tom Gauld may well have been such a teenager, for this cartoon of his deftly expresses both the eeriness and the allure of obligatory relaxation in a less than opportune location. It evokes both the ennui of being where you don’t want to be, and the exhilarating exoticism of those surroundings – as if they were an endpaper map of a Boys’ Own adventure, waiting for the dotted line of your very own expedition.

[div class=attrib]More from theSource here.[end-div]

People Who Become Nouns

John Montagu, 4th Earl of Sandwich.

The world of science is replete with nouns derived from people. There is the amp (named after André-Marie Ampère), the volt (after Alessandro Giuseppe Antonio Anastasio Volta) and the watt (after the Scottish engineer James Watt). And the list goes on: the kelvin, ohm, coulomb, degree Celsius, hertz, joule and sievert. We also have more commonly used nouns in circulation that derive from people. The mackintosh, cardigan and sandwich are perhaps the most frequently used.

[div class=attrib]From Slate:[end-div]

Before there were silhouettes, there was a French fellow named Silhouette. And before there were Jacuzzi parties there were seven inventive brothers by that name. It’s easy to forget that some of the most common words in the English language came from living, breathing people. Explore these real-life namesakes courtesy of Slate’s partnership with LIFE.com.

Jules Leotard: Tight Fit

French acrobat Jules Leotard didn’t just invent the art of the trapeze, he also lent his name to the skin-tight, one-piece outfit that allowed him to keep his limbs free while performing.

It would be fascinating to see if today’s popular culture might lend surnames with equal staying power to our language.

[div class=attrib]Slate has some more fascinating examples, here.[end-div]

[div class=attrib]Image of John Montagu, 4th Earl of Sandwich, 1783, by Thomas Gainsborough. Courtesy of Wikipedia / Creative Commons.[end-div]

Why Does “Cool” Live On and Not “Groovy”?

Why do some words take hold in the public consciousness and persist through generations while others fall by the wayside after one season?

Despite the fleetingness of many new slang terms, such as txtnesia (“when you forget what you texted someone last”), a visit to Urban Dictionary will undoubtedly amuse with the inventiveness of our language, though gobsmacked and codswallop may come to mind as well.

[div class=attrib]From Slate:[end-div]

Feeling nostalgic for a journalistic era I never experienced, I recently read Tom Wolfe’s 1968 The Electric Kool-Aid Acid Test. I’d been warned that the New Journalists slathered their prose with slang, so I wasn’t shocked to find nonstandard English on nearly every line: dig, trippy, groovy, grok, heads, hip, mysto and, of course, cool. This psychedelic time capsule led me to wonder about the relative stickiness of all these words—the omnipresence of cool versus the datedness of groovy and the dweeb cachet of grok, a Robert Heinlein coinage from Stranger in a Strange Land literally signifying to drink but implying profound understanding. Mysto, an abbreviation for mystical, seems to have fallen into disuse. It doesn’t even have an Urban Dictionary entry.

There’s no grand unified theory for why some slang terms live and others die. In fact, it’s even worse than that: The very definition of slang is tenuous and clunky. Writing for the journal American Speech, Bethany Dumas and Jonathan Lighter argued in 1978 that slang must meet at least two of the following criteria: It lowers “the dignity of formal or serious speech or writing,” it implies that the user is savvy (he knows what the word means, and knows people who know what it means), it sounds taboo in ordinary discourse (as in with adults or your superiors), and it replaces a conventional synonym. This characterization seems to open the door to words that most would not recognize as slang, including like in the quotative sense: “I was like … and he was like.” It replaces a conventional synonym (said), and certainly lowers seriousness, but is probably better categorized as a tic.

At least it’s widely agreed that young people, seeking to make a mark, are especially prone to generating such dignity-reducing terms. (The editor of The New Partridge Dictionary of Slang and Unconventional English, Tom Dalzell, told me that “every generation comes up with a new word for a marijuana cigarette.”) Oppressed people, criminals, and sports fans make significant contributions, too. There’s also a consensus that most slang, like mysto, is ephemeral. Connie Eble, a linguist at the University of North Carolina, has been collecting slang from her students since the early 1970s. (She asks them to write down terms heard around campus.) In 1996, when she reviewed all the submissions she’d received, she found that more than half were only turned in once. While many words made it from one year to the next, only a tiny minority lasted a decade.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Slate.[end-div]

MondayPoem: Silence

A poem by Billy Collins ushers in another week. Collins served two terms as U.S. Poet Laureate, from 2001 to 2003. He is known for poetry imbued with left-field humor and deep insight.

[div]By Billy Collins:[end-div]

Silence —

There is the sudden silence of the crowd
above a player not moving on the field,
and the silence of the orchid.

The silence of the falling vase
before it strikes the floor,
the silence of the belt when it is not striking the child.

The stillness of the cup and the water in it,
the silence of the moon
and the quiet of the day far from the roar of the sun.

The silence when I hold you to my chest,
the silence of the window above us,
and the silence when you rise and turn away.

And there is the silence of this morning
which I have broken with my pen,
a silence that had piled up all night

like snow falling in the darkness of the house—
the silence before I wrote a word
and the poorer silence now.

[div class=attrib]Image courtesy of Poetry Foundation.[end-div]

Undesign

Jonathan Ive, the design brains behind such iconic contraptions as the iMac, iPod and iPhone, discusses his notion of “undesign”. Ive has over 300 patents and is often cited as one of the most influential industrial designers of the last 20 years. Perhaps it’s purely coincidental that Ive’s understated “undesign” comes from his unassuming Britishness.

[div class=attrib]From Slate:[end-div]

Macworld, 1999. That was the year Apple introduced the iMac in five candy colors. The iMac was already a translucent computer that tried its best not to make you nervous. Now it strove to be even more welcoming, almost silly. And here was Apple’s newish head of design, Jonathan Ive, talking about the product in a video—back when he let his hair grow and before he had permanently donned his dark T-shirt uniform. Even then, Ive had the confessional intimacy that makes him the star of Apple promotional videos today. His statement is so ridiculous that he laughs at it himself: “A computer absolutely can be sexy, it’s um … yeah, it can.”

A decade later, no one would laugh (too loudly) if you said that an Apple product was sexy. Look at how we all caress our iPhones. This is not an accident. In interviews, Ive talks intensely about the tactile quality of industrial design. The team he runs at Apple is obsessed with mocking up prototypes. There is a now-legendary story from Ive’s student days of an apartment filled with foam models of his projects. Watch this scene in the documentary Objectified where Ive explains the various processes used to machine a MacBook Air keyboard. He gazes almost longingly upon a titanium blank. This is a man who loves his materials.

Ive’s fixation on how a product feels in your hand, and his micro-focus on aspects like the shininess of the stainless steel, or the exact amount of reflectivity in the screen, were first fully realized with the iPod. From that success, you can see how Ive and Steve Jobs led Apple to glory in the past decade. The iPod begat the iPhone, which in turn inspired the iPad. A new kind of tactile computing was born. Ive’s primary concern for physicality, and his perfectionist desire to think through every aspect of the manufacturing process (even the boring parts), were the exact gifts needed to make a singular product like the iPhone a reality and to guide Apple products through a new era of human-computer interaction. Putting design first has reaped huge financial rewards: Apple is now vying with Exxon to be the world’s most valuable company.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of CNNMoney.[end-div]

Business Transforms Street Art

Street art, once known as graffiti, used to be a derided outlet for social misfits and cultural rebels. Now it is big business. Corporations have embraced the medium, and some street artists have even “sold out” to commercial interests.

Jonathan Jones laments the demise of this art form and its transformation into just another type of corporate advertising.

[div class=attrib]By Jonathan Jones for the Guardian:[end-div]

Street art is so much part of the establishment that when David Cameron spoke about this summer’s riots, he was photographed in front of a bright and bulbous Oxfordshire graffiti painting. Contradiction? Of course not. The efforts of Banksy and all the would-be Banksys have so deeply inscribed the “coolness” of street art into the middle-class mind that it is now as respectable as the Proms, and enjoyed by the same crowd – who can now take a picnic basket down to watch a painting marathon under the railway arches.

No wonder an event described as “the UK’s biggest street art project” (60 artists from all over the world decorating Nelson Street in Bristol last week) went down fairly quietly in the national press. It’s not that new or surprising any more, let alone controversial. Nowadays, doing a bit of street art is as routine as checking your emails. There’s probably an app for it.

Visitors to London buy Banksy prints on canvas from street stalls, while in Tripoli photographers latch on to any bloke with a spray can near any wall that’s still standing. Graffiti and street art have become instant – and slightly lazy – icons of everything our culture lauds, from youth to rebellion to making a fast buck from art.

Is this how street art will die – not with a bang, but with a whimper? Maybe there was a time when painting a wittily satirical or cheekily rude picture or comment on a wall was genuinely disruptive and shocking. That time is gone. Councils still do their bit to keep street art alive by occasionally obliterating it, and so confirming that it has edge. But basically it has been absorbed so deep into the mainstream that old folk who once railed at graffiti in their town are now more likely to have a Banksy book on their shelves than a collection of Giles cartoons.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image of highly decorative graffiti typically found in Olinda, Pernambuco, Brazil. Courtesy of Bjørn Christian Tørrissen.[end-div]