Tag Archives: culture

MondayMap: National Business Emotional Intelligence

A recent article in the Harvard Business Review (HBR) gives would-be business negotiators some general tips on how best to deal with counterparts from other regions of the world. After all, getting to yes and reaching a mutually beneficial agreement among all parties does require a good degree of cultural sensitivity and emotional intelligence.


While there is no substitute for understanding other nations through travel and cultural immersion, the HBR article describes some interesting nuances to help those lacking in geographic awareness, international business experience, and cross-cultural wisdom. The first step in this exotic journey is, rather appropriately, a map.

No surprise, the Japanese and Filipinos shun business confrontation, whereas the Russians and French savor it. Northern Europeans are less emotional, while Southern Europeans and Latin Americans are much more emotionally expressive.

From Frank Jacobs over at Strange Maps:

Negotiating with Filipinos? Be warm and personal, but stay polite. Cutting das Deal with Germans? Stay cool as ice, and be tough as nails. So what happens if you’re a German doing business in the Philippines?

That’s not the question this map was designed to answer. This map — actually, a diagram — shows differences in attitudes to business negotiations in a number of countries. Familiarise yourself with them, then burn the drawing. From now on, you’ll be a master international dealmaker.

Vertically, the map distinguishes between countries where it is highly haram to show emotions during business proceedings (Japan being the prime example) and countries where emotions are an accepted part of il commercio (yes, Italians are emotional extroverts — also in business).

The horizontal axis differentiates countries with a very confrontational negotiating style — think heated arguments and slammed doors — from places where decorum is the alpha and omega of commercial dealings. For an extreme example of the former, try trading with an Israeli company. For the latter, I refer you to those personable but (apparently also) persnickety Filipinos.

Read the entire article here.

Map courtesy of Erin Meyer, professor and the program director for Managing Global Virtual Teams at INSEAD. Courtesy of HBR / Strange Maps.

Colonizing the Milky Way 101


The human race is likely to spend many future generations grappling with the aftermath of its colonial sojourns across the globe. Over our documented history, almost every race and creed has actively encroached upon and displaced others. By our very nature we are territorial animals, and very good ones at that.

Yet despite the untold volumes of suffering, pain and death wrought on those we colonize, our small blue planet is not enough for our fantasies and follies. We send our space probes throughout the solar system to test for habitability. We dream of human outposts on the Moon and on Mars. But even our solar system is too minuscule for our expansive, acquisitive ambitions. Why not colonize our entire galaxy? Now we’re talking!

Kim Stanley Robinson, author extraordinaire of numerous speculative and science fiction novels, gives us an idea of what it may take to spread our wings across the Milky Way in a recent article for Scientific American, excerpted here.

It will be many centuries before humans move beyond our solar system. But before we do so, I’d propose that we get our own house in order first. That will be our biggest challenge, not the invention of yet-to-be-imagined technologies.

From Scientific American:

The idea that humans will eventually travel to and inhabit other parts of our galaxy was well expressed by the early Russian rocket scientist Konstantin Tsiolkovsky, who wrote, “Earth is humanity’s cradle, but you’re not meant to stay in your cradle forever.” Since then the idea has been a staple of science fiction, and thus become part of a consensus image of humanity’s future. Going to the stars is often regarded as humanity’s destiny, even a measure of its success as a species. But in the century since this vision was proposed, things we have learned about the universe and ourselves combine to suggest that moving out into the galaxy may not be humanity’s destiny after all.

The problem that tends to underlie all the other problems with the idea is the sheer size of the universe, which was not known when people first imagined we would go to the stars. Tau Ceti, one of the closest stars to us at around 12 light-years away, is 100 billion times farther from Earth than our moon. A quantitative difference that large turns into a qualitative difference; we can’t simply send people over such immense distances in a spaceship, because a spaceship is too impoverished an environment to support humans for the time it would take, which is on the order of centuries. Instead of a spaceship, we would have to create some kind of space-traveling ark, big enough to support a community of humans and other plants and animals in a fully recycling ecological system.

On the other hand it would have to be small enough to accelerate to a fairly high speed, to shorten the voyagers’ time of exposure to cosmic radiation, and to breakdowns in the ark. Regarded from some angles bigger is better, but the bigger the ark is, the proportionally more fuel it would have to carry along to slow itself down on reaching its destination; this is a vicious circle that can’t be squared. For that reason and others, smaller is better, but smallness creates problems for resource metabolic flow and ecologic balance. Island biogeography suggests the kinds of problems that would result from this miniaturization, but a space ark’s isolation would be far more complete than that of any island on Earth. The design imperatives for bigness and smallness may cross each other, leaving any viable craft in a non-existent middle.

The biological problems that could result from the radical miniaturization, simplification and isolation of an ark, no matter what size it is, now must include possible impacts on our microbiomes. We are not autonomous units; about eighty percent of the DNA in our bodies is not human DNA, but the DNA of a vast array of smaller creatures. That array of living beings has to function in a dynamic balance for us to be healthy, and the entire complex system co-evolved on this planet’s surface in a particular set of physical influences, including Earth’s gravity, magnetic field, chemical make-up, atmosphere, insolation, and bacterial load. Traveling to the stars means leaving all these influences, and trying to replace them artificially. What the viable parameters are on the replacements would be impossible to be sure of in advance, as the situation is too complex to model. Any starfaring ark would therefore be an experiment, its inhabitants lab animals. The first generation of the humans aboard might have volunteered to be experimental subjects, but their descendants would not have. These generations of descendants would be born into a set of rooms a trillion times smaller than Earth, with no chance of escape.

In this radically diminished environment, rules would have to be enforced to keep all aspects of the experiment functioning. Reproduction would not be a matter of free choice, as the population in the ark would have to maintain minimum and maximum numbers. Many jobs would be mandatory to keep the ark functioning, so work too would not be a matter of choices freely made. In the end, sharp constraints would force the social structure in the ark to enforce various norms and behaviors. The situation itself would require the establishment of something like a totalitarian state.
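Robinson's point about deceleration fuel, a few paragraphs up, is worth a back-of-the-envelope check. The Tsiolkovsky rocket equation makes the penalty concrete; note that the cruise speed and exhaust velocity below are purely illustrative assumptions of mine, not figures from the article:

```python
import math

def mass_ratio(delta_v, exhaust_v):
    """Tsiolkovsky rocket equation: wet-to-dry mass ratio for one burn."""
    return math.exp(delta_v / exhaust_v)

# Illustrative numbers: cruise at 1% of light speed, with a (very optimistic)
# fusion-drive exhaust velocity of 1,000 km/s.
cruise_speed = 0.01 * 299_792_458   # m/s
exhaust = 1_000_000                 # m/s

accelerate = mass_ratio(cruise_speed, exhaust)
# Braking compounds: the fuel needed to slow down is itself payload during
# the initial acceleration, so the two mass ratios multiply rather than add.
round_trip = accelerate ** 2
print(f"fuel-to-dry-mass ratio, speed up only: {accelerate:.0f}x; "
      f"speed up and slow down: {round_trip:.0f}x")
```

Even under these generous assumptions, stopping at the destination squares an already exponential fuel requirement, which is why "big enough to sustain an ecology" and "small enough to brake" pull in opposite directions.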

Read the entire article here.

Image: The Milky Way panorama. Courtesy: ESO/S. Brunier – Licensed under Creative Commons.

Man-With-Beard and Negative Frequency-Dependent Sexual Selection

[tube]6i8IER7nTfc[/tube]

Culture watchers pronounced “peak beard” around the time of the US Academy Awards in 2013. Since then celebrities (male) of all stripes and colors have been ditching the hairy chin for a more clean-shaven look. While I have no interest in the amount or type of stubble on George Clooney’s face, the beard/no-beard debate does raise a more fascinating issue with profound evolutionary consequences. Research shows that certain physical characteristics, including facial hair, become more appealing when they are rare. The converse is also true: certain traits are less appealing when common. Furthermore, studies of social signalling and mating preference in various animals show the same bias. So, men, if you’re trying to attract the attention of a potential mate it’s time to think more seriously about negative frequency-dependent sexual selection and ditch the conforming hirsute hipster look for something else. Here’s an idea: just be yourself instead of following the herd. Though, I do still like Manuel’s Gallic mustache.
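For the curious, negative frequency-dependent selection has a simple quantitative skeleton. Here's a toy replicator model; the linear fitness function and the selection strength are my own illustrative assumptions, not the model used in the study:

```python
def simulate_beard_fashion(generations=60, start_bearded=0.8, strength=0.5):
    """Toy replicator model of negative frequency-dependent selection:
    each trait's attractiveness (fitness) falls as the trait gets more common."""
    freq = start_bearded                # fraction of the population with beards
    history = [freq]
    for _ in range(generations):
        fit_beard = 1.0 - strength * freq            # common beards lose appeal
        fit_shaven = 1.0 - strength * (1.0 - freq)   # rare clean chins gain it
        mean_fit = freq * fit_beard + (1.0 - freq) * fit_shaven
        freq = freq * fit_beard / mean_fit           # standard replicator update
        history.append(freq)
    return history

hist = simulate_beard_fashion()
print(f"start: {hist[0]:.2f} bearded; after 60 generations: {hist[-1]:.2f}")
```

Start the population past "peak beard" (80 percent bearded here) and the rarity advantage drags it back toward a 50/50 mix, where neither look confers an edge — exactly the pendulum swing the researchers describe.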

From the BBC:

The ebb and flow of men’s beard fashions may be guided by Darwinian selection, according to a new study.

The more beards there are, the less attractive they become – giving clean-shaven men a competitive advantage, say scientists in Sydney, Australia.

When “peak beard” frequency is reached, the pendulum swings back toward lesser-bristled chins – a trend we may be witnessing now, the scientists say.

Their study has been published in the Royal Society journal Biology Letters.

In the experiment, women and men were asked to rate different faces with “four standard levels of beardedness”.

Both beards and clean-shaven faces became more appealing when they were rare.

The pattern mirrors an evolutionary phenomenon – “negative frequency-dependent sexual selection”, or to put it more simply “an advantage to rare traits”.

The bright colours of male guppies vary by this force – which is driven by females’ changing preferences.

Scientists at the University of New South Wales decided to test this hypothesis for men’s facial hair – recruiting volunteers on their Facebook site, The Sex Lab.

“Big thick beards are back with an absolute vengeance and so we thought underlying this fashion, one of the dynamics that might be important is this idea of negative frequency dependence,” said Prof Rob Brooks, one of the study’s authors.

“The idea is that perhaps people start copying the George Clooneys and the Joaquin Phoenixes and start wearing those beards, but then when more and more people get onto the bandwagon the value of being on the bandwagon diminishes, so that might be why we’ve hit ‘peak beard’.”

“Peak beard” was the climax of the trend for beards in professions not naturally associated with a bristly chin – bankers, film stars, and even footballers began sporting facial hair.

Read the entire story here.

Video courtesy of Fawlty Towers / BBC Productions.

The Old School Social Network Returns

Not too long ago, newbies to a community might first have met their neighbors face-to-face by knocking on each other’s front doors, strolling around the neighborhood, or browsing at the local communal market or store. But busy schedules, privacy fences, garage doors, a car-centric culture and a general fear of strangers have raised barriers and successfully isolated us. So it’s wonderful to see the digital tools of our modern age being put to a more ancient use — meeting the neighbors and breaking down some barriers — many of which seem to be caused by our technologies. Long may the old school (face-to-face) social network prosper!

From NYT:

When Laurell Boyers, 34, and her husband, Federico Bastiani, 37, moved in together in Bologna in 2012, they did not know any of their neighbors. It was a lonely feeling.

“All my friends back home had babies, play dates, people to talk to, and I felt so left out,” Ms. Boyers, who moved from South Africa, said on a recent afternoon. “We didn’t have family or friends connections here. We knew people occasionally, but none in our same situation.”

So Mr. Bastiani took a chance and posted a flier along his street, Via Fondazza, explaining that he had created a closed group on Facebook just for the people who lived there. He was merely looking to make some new friends.

In three or four days, the group had about 20 followers. Almost two years later, the residents say, walking along Via Fondazza does not feel like strolling in a big city neighborhood anymore. Rather, it is more like exploring a small town, where everyone knows one another, as the group now has 1,100 members.

“Now I am obligated to speak to everyone when I leave the house,” Ms. Boyers said jokingly. “It’s comforting and also tiring, sometimes. You have to be careful what you ask for.”

The idea, Italy’s first “social street,” has been such a success that it has caught on beyond Bologna and the narrow confines of Via Fondazza. There are 393 social streets in Europe, Brazil and New Zealand, inspired by Mr. Bastiani’s idea, according to the Social Street Italia website, which was created out of the Facebook group to help others replicate the project.

Bologna, a midsize northern city, is known for its progressive politics and cooperatives. It is home to what is considered Italy’s oldest university, and it has a mix of a vibrant, young crowd and longtime residents, known for their strong sense of community.

Still, socially speaking, Italy — Bologna included — can be conservative. Friendships and relationships often come through family connections. It is not always easy to meet new people. In large cities, neighbors typically keep to themselves.

But today, the residents of Via Fondazza help one another fix broken appliances, run chores or recharge car batteries. They exchange train tickets and organize parties.

About half of Via Fondazza’s residents belong to the Facebook group. Those who do not use the Internet are invited to events via leaflets or word of mouth.

“I’ve noticed that people at first wonder whether they need to pay something” for the help from others, said Mr. Bastiani, referring to the experience of an 80-year-old woman who needed someone to go pick up some groceries for her, or a resident who sought help assembling a piece of Ikea furniture.

“But that’s not the point,” he added. “The best part of this is that it breaks all the schemes. We live near one another, and we help each other. That’s it.”

The impact of the experiment has surprised almost everyone here.

It “has changed the walking in Via Fondazza,” said Francesca D’Alonzo, a 27-year-old law graduate who joined the group in 2013.

“We greet each other, we speak, we ask about our lives, we feel we belong here now,” she said.

The exchanges usually start virtually but soon become concrete, allowing residents to get to know one another in person.

Everyone on Via Fondazza seems to have an anecdote. Ms. D’Alonzo remembers the party she gave on New Year’s Eve in 2013, when her then mostly unknown neighbors brought so much food and wine that she did not know where to put it.

“It’s the mental habit that is so healthy,” she said. “You let people into your house because you know some and trust them enough to bring along some more. You open up your life.”

Read the entire article here.

Entrepreneur (Introvert) Versus CEO (Extrovert)

Conventional wisdom from the corridors of corporate power seems to suggest that successful CEOs tend to be extroverts. On the other hand, it also seems that many successful entrepreneurs come from more introverted stock. This divergence must put a great deal of pressure on the leader as a company transitions from a startup to an established business. Perhaps this is another of the many reasons why around 90 percent of startups fail.

From WSJ:

A quiet, reserved introvert is probably not what first came to mind. Aren’t entrepreneurs supposed to be gregarious and commanding—verbally adept and able to inspire employees, clients and investors with the sheer force of their personality? No wonder the advice for introverts who want to be entrepreneurs has long been some form of: “Be more extroverted.”

Now, though, business experts and psychologists are starting to see that guidance is wrong. It disregards the unique skills that introverts bring to the table—the ability to focus for long periods, a propensity for balanced and critical thinking, a knack for quietly empowering others—that may make them even better suited for entrepreneurial and business success than extroverts.

Indeed, numerous entrepreneurs and CEOs are either self-admitted introverts or have so many introvert qualities that they are widely thought to be introverts. These include Bill Gates, co-founder of Microsoft, Steve Wozniak, co-founder of Apple, Larry Page, co-founder of Google, Mark Zuckerberg, co-founder of Facebook, Marissa Mayer, current president and CEO of Yahoo, and Warren Buffett, chairman and CEO of Berkshire Hathaway.

As entrepreneurs, introverts succeed because they “create and lead companies from a very focused place,” says Susan Cain, author of “Quiet: The Power of Introverts in a World That Can’t Stop Talking” and founder of Quiet Revolution, a website for introverts. This spring, she co-founded the Quiet Leadership Institute, a consulting firm with a mission to help companies harness the talent of introverted employees and to help introverts draw on their natural strengths. The company’s clients include General Electric, Procter & Gamble and NASA.

Another big plus, she says: Introverts are not interested in leadership for personal glory, and they steer clear of the cult of personality. Their emphasis is on creating something, not on themselves.

“By their nature, introverts tend to get passionate about one, two or three things in their life,” says Ms. Cain. “And in the service of their passion for an idea they will go out and build alliances and networks and acquire expertise and do whatever it takes to make it happen.”

Here are some of the traits common to most introverts that make them especially well-suited to entrepreneurship.

They crave solitude

Many people believe that introverts, by definition, are shy and extroverts are outgoing. This is incorrect. Introverts, who experts say comprise about a third of the population, get their energy and process information internally. Some may be shy and some may be outgoing, but they all prefer to spend time alone or in small groups, and often feel drained by a lot of social interaction or large groups.

Extroverts—sometimes spelled “extraverts” in psychology circles—gain energy from being with other people and typically process information externally, meaning they prefer to talk through problems instead of pondering them alone, and they sometimes form opinions while they speak. (Ambiverts, a third personality type that makes up the majority of the population, are a mix of introvert and extrovert.)

Being comfortable being alone—and thinking before acting—can give introverts a leg up as they formulate a business plan or come up with new strategies once the company is launched.

Introverts not only have the stamina to spend long periods alone—they love it. “Good entrepreneurs are able to give themselves the solitude they need to think creatively and originally—to create something where there once was nothing,” says Ms. Cain. “And this is just how introverts are wired.”

Extroverts may find it hard to cloister themselves to think through big questions—what does the company have to offer, how will it reach its audience?—because they crave stimulation. Solitude drains them, and they aren’t as creative if they spend too much time alone, says Beth Buelow, a speaker and coach who is founder of The Introvert Entrepreneur, a website for introverts. So extroverts often take a “throw the spaghetti at the wall and see if it sticks” approach to solving problems, rather than think through possibilities.

While extroverts are networking, promoting or celebrating success, introverts have their “butt on the seat,” says Laurie Helgoe, author of “Introvert Power: Why Your Inner Life is Your Hidden Strength” and assistant professor in the department of psychology and human services at Davis & Elkins College in Elkins, W.Va. “An introvert on his or her own is going to enjoy digging in and doing research—and be able to sustain him- or herself in that lonely place of forging your own way.”

They don’t need external affirmation

Another important characteristic of introverts is that they tend to rely on their own inner compass—not external signals—to know that they’re making the right move or doing a good job. That can give them an edge in several ways.

For instance, they generally don’t look for people to tell them whether an idea is worth pursuing. They tend to think it through before speaking about it to anybody, and rely on their own judgment about whether it’s worth pursuing.

Read the entire story here.

Ambition Or Greed Dotcom

When I soak in articles like this one on the vast and ever-growing empire of Amazon (the dotcom), I wonder about the difference between ambition and greed. I used to admire this company tremendously, founded by the singularly focused Jeff Bezos. But, for some reason, when Amazon expanded into retailing groceries my allegiance began to wane. Now that they’re also producing their own entertainment programming, and have their sticky fingers in hundreds of diverse pies, I think I’m starting to dislike and distrust this corporate behemoth. Amazon gave up being a pure retailer a while ago — now they produce original shows and movies; they host e-commerce and manage business services for many other corporations; they run all manner of marketplaces; they compete with distributors. The company does all of this very well.

And, yet.

When did Jeff Bezos’s ambition and that of his 150,000-plus employees — to deliver all manner of stuff so effortlessly and conveniently — morph into what increasingly seems like greed? Because somewhere along this spectrum of acquisitiveness, a noble ambition seems to have become a selfish one.

Oh, and as for the demanding, competitive, brutish workplace — the company seems to be doing nothing more than applying to its employees the same data-driven principles that power its retailing and distribution operations. Unfortunately, it seems to have lost sight — as have many companies — of the fact that employees remain stubbornly human.

From NYT:

On Monday mornings, fresh recruits line up for an orientation intended to catapult them into Amazon’s singular way of working.

They are told to forget the “poor habits” they learned at previous jobs, one employee recalled. When they “hit the wall” from the unrelenting pace, there is only one solution: “Climb the wall,” others reported. To be the best Amazonians they can be, they should be guided by the leadership principles, 14 rules inscribed on handy laminated cards. When quizzed days later, those with perfect scores earn a virtual award proclaiming, “I’m Peculiar” — the company’s proud phrase for overturning workplace conventions.

At Amazon, workers are encouraged to tear apart one another’s ideas in meetings, toil long and late (emails arrive past midnight, followed by text messages asking why they were not answered), and are held to standards that the company boasts are “unreasonably high.” The internal phone directory instructs colleagues on how to send secret feedback to one another’s bosses. Employees say it is frequently used to sabotage others. (The tool offers sample texts, including this: “I felt concerned about his inflexibility and openly complaining about minor tasks.”)

Many of the newcomers filing in on Mondays may not be there in a few years. The company’s winners dream up innovations that they roll out to a quarter-billion customers and accrue small fortunes in soaring stock. Losers leave or are fired in annual cullings of the staff — “purposeful Darwinism,” one former Amazon human resources director said. Some workers who suffered from cancer, miscarriages and other personal crises said they had been evaluated unfairly or edged out rather than given time to recover.

Even as the company tests delivery by drone and ways to restock toilet paper at the push of a bathroom button, it is conducting a little-known experiment in how far it can push white-collar workers, redrawing the boundaries of what is acceptable. The company, founded and still run by Jeff Bezos, rejects many of the popular management bromides that other corporations at least pay lip service to and has instead designed what many workers call an intricate machine propelling them to achieve Mr. Bezos’ ever-expanding ambitions.

“This is a company that strives to do really big, innovative, groundbreaking things, and those things aren’t easy,” said Susan Harker, Amazon’s top recruiter. “When you’re shooting for the moon, the nature of the work is really challenging. For some people it doesn’t work.”

Bo Olson was one of them. He lasted less than two years in a book marketing role and said that his enduring image was watching people weep in the office, a sight other workers described as well. “You walk out of a conference room and you’ll see a grown man covering his face,” he said. “Nearly every person I worked with, I saw cry at their desk.”

Thanks in part to its ability to extract the most from employees, Amazon is stronger than ever. Its swelling campus is transforming a swath of this city, a 10-million-square-foot bet that tens of thousands of new workers will be able to sell everything to everyone everywhere. Last month, it eclipsed Walmart as the most valuable retailer in the country, with a market valuation of $250 billion, and Forbes deemed Mr. Bezos the fifth-wealthiest person on earth.

Tens of millions of Americans know Amazon as customers, but life inside its corporate offices is largely a mystery. Secrecy is required; even low-level employees sign a lengthy confidentiality agreement. The company authorized only a handful of senior managers to talk to reporters for this article, declining requests for interviews with Mr. Bezos and his top leaders.

However, more than 100 current and former Amazonians — members of the leadership team, human resources executives, marketers, retail specialists and engineers who worked on projects from the Kindle to grocery delivery to the recent mobile phone launch — described how they tried to reconcile the sometimes-punishing aspects of their workplace with what many called its thrilling power to create.

In interviews, some said they thrived at Amazon precisely because it pushed them past what they thought were their limits. Many employees are motivated by “thinking big and knowing that we haven’t scratched the surface on what’s out there to invent,” said Elisabeth Rommel, a retail executive who was one of those permitted to speak.

Others who cycled in and out of the company said that what they learned in their brief stints helped their careers take off. And more than a few who fled said they later realized they had become addicted to Amazon’s way of working.

“A lot of people who work there feel this tension: It’s the greatest place I hate to work,” said John Rossman, a former executive there who published a book, “The Amazon Way.”

Amazon may be singular but perhaps not quite as peculiar as it claims. It has just been quicker in responding to changes that the rest of the work world is now experiencing: data that allows individual performance to be measured continuously, come-and-go relationships between employers and employees, and global competition in which empires rise and fall overnight. Amazon is in the vanguard of where technology wants to take the modern office: more nimble and more productive, but harsher and less forgiving.

“Organizations are turning up the dial, pushing their teams to do more for less money, either to keep up with the competition or just stay ahead of the executioner’s blade,” said Clay Parker Jones, a consultant who helps old-line businesses become more responsive to change.

On a recent morning, as Amazon’s new hires waited to begin orientation, few of them seemed to appreciate the experiment in which they had enrolled. Only one, Keith Ketzle, a freckled Texan triathlete with an M.B.A., lit up with recognition, explaining how he left his old, lumbering company for a faster, grittier one.

“Conflict brings about innovation,” he said.

Read the entire article here.

Gadzooks, Gosh, Tarnation and the F-Bomb

Blimey! How our lexicon of foul language has evolved! Up to a few hundred years ago most swear words and oaths bore some connection to God, Jesus or another religious figure or event. But the need to display some level of dubious piety and avoid a lightning bolt from the blue led many to invent and mince a whole range of creative euphemisms. Hence, even today, we still hear words like “drat”, “gosh”, “tarnation”, “by George”, “by Jove”, “heck”, “strewth”, “odsbodikins”, “gadzooks”, “doggone”.

More recently our linguistic penchant for shock and awe stems mostly from euphemistic — or not — labels for body parts and bodily functions — think: “freaking” or “shit” or “dick” and all manner of “f-words” and “c-words”. Sensitivities aside, many of us are fortunate enough to live in nations that have evolved beyond corporal or even capital punishment for uttering such blasphemous or vulgar indiscretions.

So, the next time you drop the “f-bomb” or a “dagnabbit” in public, reflect for a while and thank yourself for supporting your precious democracy over the neighboring theocracy.

From WSJ:

At street level and in popular culture, Americans are freer with profanity now than ever before—or so it might seem to judge by how often people throw around the “F-bomb” or use a certain S-word of scatological meaning as a synonym for “stuff.” Or consider the millions of fans who adore the cartoon series “South Park,” with its pint-size, raucously foul-mouthed characters.

But things might look different to an expedition of anthropologists visiting from Mars. They might conclude that Americans today are as uptight about profanity as were our 19th-century forbears in ascots and petticoats. It’s just that what we think of as “bad” words is different. To us, our ancestors’ word taboos look as bizarre as tribal rituals. But the real question is: How different from them, for better or worse, are we?

In medieval English, at a time when wars were fought in disputes over religious doctrine and authority, the chief category of profanity was, at first, invoking—that is, swearing to—the name of God, Jesus or other religious figures in heated moments, along the lines of “By God!” Even now, we describe profanity as “swearing” or as muttering “oaths.”

It might seem like a kind of obsessive piety to us now, but the culture of that day was largely oral, and swearing—making a sincere oral testament—was a key gesture of commitment. To swear by or to God lightly was considered sinful, which is the origin of the expression to take the Lord’s name in vain (translated from Biblical Hebrew for “emptily”).

The need to avoid such transgressions produced various euphemisms, many of them familiar today, such as “by Jove,” “by George,” “gosh,” “golly” and “Odsbodikins,” which started as “God’s body.” “Zounds!” was a twee shortening of “By his wounds,” as in those of Jesus. A time traveler to the 17th century would encounter variations on that theme such as “Zlids!” and “Znails!”, referring to “his” eyelids and nails.

In the 19th century, “Drat!” was a way to say “God rot.” Around the same time, darn started when people avoided saying “Eternal damnation!” by saying “Tarnation!”, which, because of the D-word hovering around, was easy to recast as “Darnation!”, from which “darn!” was a short step.

By the late 18th century, sex, excretion and the parts associated with same had come to be treated as equally profane as “swearing” in the religious sense. Such matters had always been considered bawdy topics, of course, but the space for ordinary words referring to them had been shrinking for centuries already.

Chaucer had available to him a thoroughly inoffensive word referring to the sex act, swive. An anatomy book in the 1400s could casually refer to a part of the female anatomy with what we today call the C-word. But over time, referring to these things in common conversation came to be regarded with a kind of pearl-clutching horror.

By the 1500s, as English began taking its place alongside Latin as a world language with a copious high literature, a fashion arose for using fancy Latinate terms in place of native English ones for more private matters. Thus was born a slightly antiseptic vocabulary, with words like copulate and penis. Even today modern English has no terms for such things that are neither clinical nor vulgar, along the lines of arm or foot or whistle.

The burgeoning bourgeois culture of the late 1700s, both in Great Britain and America, was especially alarmist about the “down there” aspect of things. In growing cities with stark social stratification, a new gentry developed a new linguistic self-consciousness—more English grammars were published between 1750 and 1800 than had ever appeared before that time.

In speaking of cooked fowl, “white” and “dark” meat originated as terms to avoid mention of breasts and limbs. What one does in a restroom, another euphemism of this era, is only laboriously classified as repose. Bosom and seat (for the backside) originated from the same impulse.

Passages in books of the era can be opaque to us now without an understanding of how particular people had gotten: In Dickens’s “Oliver Twist,” Giles the butler begins, “I got softly out of bed; drew on a pair of…” only to be interrupted with “Ladies present…” after which he dutifully says “…of shoes, sir.” He wanted to say trousers, but because of where pants sit on the body, well…

Or, from the gargantuan Oxford English Dictionary, published in 1884 and copious enough to take up a shelf and bend it, you would never have known in the original edition that the F-word or the C-word existed.

Such moments extend well into the early 20th century. In a number called “Shuffle Off to Buffalo” in the 1932 Broadway musical “42nd Street,” Ginger Rogers sings “He did right by little Nelly / with a shotgun at his bell-” and then interjects “tummy” instead. “Belly” was considered a rude part of the body to refer to; tummy was OK because of its association with children.

Read the entire story here.

The Thugs of Cultural Disruption

What becomes of our human culture as Amazon crushes booksellers and publishers, Twitter dumbs down journalism, knowledge is replaced by keyword search, and the internet becomes a popularity contest?

Leon Wieseltier, contributing editor at The Atlantic, has some thoughts.

From NYT:

Amid the bacchanal of disruption, let us pause to honor the disrupted. The streets of American cities are haunted by the ghosts of bookstores and record stores, which have been destroyed by the greatest thugs in the history of the culture industry. Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind. Everybody talks frantically about media, a second-order subject if ever there was one, as content disappears into “content.” What does the understanding of media contribute to the understanding of life? Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability. As the frequency of expression grows, the force of expression diminishes: Digital expectations of alacrity and terseness confer the highest prestige upon the twittering cacophony of one-liners and promotional announcements. It was always the case that all things must pass, but this is ridiculous.

Meanwhile the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms: Economists are our experts on happiness! Where wisdom once was, quantification will now be. Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology. The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past. Beyond its impact upon culture, the new technology penetrates even deeper levels of identity and experience, to cognition and to consciousness. Such transformations embolden certain high priests in the church of tech to espouse the doctrine of “transhumanism” and to suggest, without any recollection of the bankruptcy of utopia, without any consideration of the cost to human dignity, that our computational ability will carry us magnificently beyond our humanity and “allow us to transcend these limitations of our biological bodies and brains. . . . There will be no distinction, post-Singularity, between human and machine.” (The author of that updated mechanistic nonsense is a director of engineering at Google.)

And even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science. The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university, where the humanities are disparaged as soft and impractical and insufficiently new. The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy. So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.

Read the entire essay here.

Narcissistick

The pursuit of all things self continues unabated in 2015. One has to wonder what children of the self-absorbed, selfie generations will be like. Or, perhaps, there will be few or no children, because many of the self-absorbed will remain, well, rather too self-absorbed.

From NYT:

Sometimes you don’t need an analyst’s report to get a look at the future of the media industry and the challenges it will bring.

On New Year’s Eve, I was one of the poor souls working in Times Square. By about 1 p.m., it was time to evacuate, and when I stepped into the cold that would assault the huddled, partying masses that night, a couple was getting ready to pose for a photo with the logo on The New York Times Building in the background. I love that I work at a place that people deem worthy of memorializing, and I often offer to help.

My assistance was not required. As I watched, the young couple mounted their phone on a collapsible pole, then extended it outward, the camera now able to capture the moment in wide-screen glory.

I’d seen the same phenomenon when I was touring the Colosseum in Rome last month. So many people were fighting for space to take selfies with their long sticks — what some have called the “Narcissistick” — that it looked like a reprise of the gladiatorial battles the place once hosted.

The urge to stare at oneself predates mirrors — you could imagine a Neanderthal fussing with his hair, his image reflected in a pool of water — but it has some pretty modern dimensions. In the forest of billboards in Times Square, the one with a camera that captures the people looking at the billboard always draws a big crowd.

Selfies are hardly new, but the incremental improvement in technology of putting a phone on a stick — a curiously analog fix that Time magazine listed as one of the best inventions of 2014 along with something called the “high-beta fusion reactor” — suggests that the séance with the self is only going to grow. (Selfie sticks are often used to shoot from above, which any self-respecting selfie auteur will tell you is the most flattering angle.)

There are now vast, automated networks to harvest all that narcissism, along with lots of personal data, creating extensive troves of user-generated content. The tendency to listen to the holy music of the self is reflected in the abundance of messaging and self-publishing services — Vine, WhatsApp, Snapchat, Instagram, Apple’s new voice messaging and the rest — all of which pose a profound challenge for media companies. Most media outfits are in the business of one-to-many, creating single pieces of text, images or audio meant to be shared by the masses.

But most sharing does not involve traditional media companies. Consumers are increasingly glued to their Facebook feeds as a source of information about not just their friends but the broader world as well. And with the explosive growth of Snapchat, the fastest-growing social app of the last year, much of the sharing that takes place involves one-to-one images that come and go in 10 seconds or less. Getting a media message — a television show, a magazine, a website, not to mention the ads that pay for most of it — into the intimate space between consumers and a torrent of information about themselves is only going to be more difficult.

I’ve been around since before there was a consumer Internet, but my frame of reference is as neither a Luddite nor a curmudgeon. I didn’t end up with over half a million followers on social media — Twitter and Facebook combined — by posting only about broadband regulations and cable deals. (Not all self-flattering portraits are rendered in photos. You see what I did there, right?) The enhanced ability to communicate and share in the current age has many tangible benefits.

My wife travels a great deal, sometimes to conflicted regions, and WhatsApp’s global reach gives us a stable way of staying in touch. Over the holidays, our family shared endless photos, emoticons and inside jokes in group messages that were very much a part of Christmas. Not that long ago, we might have spent the time gathered around watching “Elf,” but this year, we were brought together by the here and now, the familiar, the intimate and personal. We didn’t need a traditional media company to help us create a shared experience.

Many younger consumers have become mini-media companies themselves, madly distributing their own content on Vine, Instagram, YouTube and Snapchat. It’s tough to get their attention on media created for the masses when they are so busy producing their own. And while the addiction to self is not restricted to millennials — boomers bow to no one in terms of narcissism — there are now easy-to-use platforms that amplify that self-reflecting impulse.

While legacy media companies still make products meant to be studied and savored over varying lengths of time — the movie “Boyhood,” The Atlantic magazine, the novel “The Goldfinch” — much of the content that individuals produce is ephemeral. Whatever bit of content is in front of someone — text messages, Facebook posts, tweets — is quickly replaced by more and different. For Snapchat, the fact that photos and videos disappear almost immediately is not a flaw, it’s a feature. Users can send content into the world with little fear of creating a trail of digital breadcrumbs that advertisers, parents or potential employers could follow. Warhol’s 15 minutes of fame has been replaced by less than 15 seconds on Snapchat.

Facebook, which is a weave of news encompassing both the self and the world, has become, for many, a de facto operating system on the web. And many of the people who aren’t busy on Facebook are up for grabs on the web but locked up on various messaging apps. What used to be called the audience is disappearing into apps, messaging and user-generated content. Media companies in search of significant traffic have to find a way into that stream.

“The majority of time that people are spending online is on Facebook,” said Anthony De Rosa, editor in chief of Circa, a mobile news start-up. “You have to find a way to break through or tap into all that narcissism. We are way too into ourselves.”

Read the entire article here.

Sartre: Forever Linked with Mrs Premise and Mrs Conclusion

Jean-Paul_Sartre_FP

One has to wonder how Jean-Paul Sartre would have been regarded today had he accepted the Nobel Prize in Literature in 1964, or had the characters of Monty Python not used him as a punching bag in one of their infamous, satirical philosopher sketches:

Mrs Conclusion: What was Jean-Paul like? 

Mrs Premise: Well, you know, a bit moody. Yes, he didn’t join in the fun much. Just sat there thinking. Still, Mr Rotter caught him a few times with the whoopee cushion. (she demonstrates) Le Capitalisme et La Bourgeoisie ils sont la même chose… Oooh we did laugh…

From the Guardian:

In this age in which all shall have prizes, in which every winning author knows what’s necessary in the post-award trial-by-photoshoot (Book jacket pressed to chest? Check. Wall-to-wall media? Check. Backdrop of sponsor’s logo? Check) and in which scarcely anyone has the couilles, as they say in France, to politely tell judges where they can put their prize, how lovely to recall what happened on 22 October 1964, when Jean-Paul Sartre turned down the Nobel prize for literature.

“I have always declined official honours,” he explained at the time. “A writer should not allow himself to be turned into an institution. This attitude is based on my conception of the writer’s enterprise. A writer who adopts political, social or literary positions must act only within the means that are his own – that is, the written word.”

Throughout his life, Sartre agonised about the purpose of literature. In 1947’s What is Literature?, he jettisoned a sacred notion of literature as capable of replacing outmoded religious beliefs in favour of the view that it should have a committed social function. However, the last pages of his enduringly brilliant memoir Words, published the same year as the Nobel refusal, despair over that function: “For a long time I looked on my pen as a sword; now I know how powerless we are.” Poetry, wrote Auden, makes nothing happen; politically committed literature, Sartre was saying, was no better. In rejecting the honour, Sartre worried that the Nobel was reserved for “the writers of the west or the rebels of the east”. He didn’t damn the Nobel in quite the bracing terms that led Hari Kunzru to decline the 2003 John Llewellyn Rhys prize, sponsored by the Mail on Sunday (“As the child of an immigrant, I am only too aware of the poisonous effect of the Mail’s editorial line”), but gently pointed out its Eurocentric shortcomings. Plus, one might say 50 years on, ça change. Sartre said that he might have accepted the Nobel if it had been offered to him during France’s imperial war in Algeria, which he vehemently opposed, because then the award would have helped in the struggle, rather than making Sartre into a brand, an institution, a depoliticised commodity. Truly, it’s difficult not to respect his compunctions.

But the story is odder than that. Sartre read in Figaro Littéraire that he was in the frame for the award, so he wrote to the Swedish Academy saying he didn’t want the honour. He was offered it anyway. “I was not aware at the time that the Nobel prize is awarded without consulting the opinion of the recipient,” he said. “But I now understand that when the Swedish Academy has made a decision, it cannot subsequently revoke it.”

Regrets? Sartre had a few – at least about the money. His principled stand cost him 250,000 kronor (about £21,000), prize money that, he reflected in his refusal statement, he could have donated to the “apartheid committee in London” who badly needed support at the time. All of which makes one wonder what his compatriot, Patrick Modiano, the 15th Frenchman to win the Nobel for literature earlier this month, did with his 8m kronor (about £700,000).

The Swedish Academy had selected Sartre for having “exerted a far-reaching influence on our age”. Is this still the case? Though he was lionised by student radicals in Paris in May 1968, his reputation as a philosopher was on the wane even then. His brand of existentialism had been eclipsed by structuralists (such as Lévi-Strauss and Althusser) and post-structuralists (such as Derrida and Deleuze). Indeed, Derrida would spend a great deal of effort deriding Sartrean existentialism as a misconstrual of Heidegger. Anglo-Saxon analytic philosophy, with the notable exception of Iris Murdoch and Arthur Danto, has for the most part been sniffy about Sartre’s philosophical credentials.

Sartre’s later reputation probably hasn’t benefited from being championed by Paris’s philosophical lightweight, Bernard-Henri Lévy, who subtitled his biography of his hero The Philosopher of the Twentieth Century (Really? Not Heidegger, Russell, Wittgenstein or Adorno?); still less by his appearance in Monty Python’s least funny philosophy sketch, “Mrs Premise and Mrs Conclusion visit Jean-Paul Sartre at his Paris home”. Sartre has become more risible than lisible: unremittingly depicted as laughable philosopher toad – ugly, randy, incomprehensible, forever excitably over-caffeinated at Les Deux Magots with Simone de Beauvoir, encircled with pipe smoke and mired in philosophical jargon, not so much a man as a stock pantomime figure. He deserves better.

How then should we approach Sartre’s writings in 2014? So much of his lifelong intellectual struggle and his work still seems pertinent. When we read the “Bad Faith” section of Being and Nothingness, it is hard not to be struck by the image of the waiter who is too ingratiating and mannered in his gestures, and how that image pertains to the dismal drama of inauthentic self-performance that we find in our culture today. When we watch his play Huis Clos, we might well think of how disastrous our relations with other people are, since we now require them, more than anything else, to confirm our self-images, while they, no less vexingly, chiefly need us to confirm theirs. When we read his claim that humans can, through imagination and action, change our destiny, we feel something of the burden of responsibility of choice that makes us moral beings. True, when we read such sentences as “the being by which Nothingness comes to the world must be its own Nothingness”, we might want to retreat to a dark room for a good cry, but let’s not spoil the story.

His lifelong commitments to socialism, anti-fascism and anti-imperialism still resonate. When we read, in his novel Nausea, of the protagonist Antoine Roquentin in Bouville’s art gallery, looking at pictures of self-satisfied local worthies, we can apply his fury at their subjects’ self-entitlement to today’s images of the powers that be (the suppressed photo, for example, of Cameron and his cronies in Bullingdon pomp), and share his disgust that such men know nothing of what the world is really like in all its absurd contingency.

In his short story Intimacy, we confront a character who, like all of us on occasion, is afraid of the burden of freedom and does everything possible to make others take her decisions for her. When we read his distinctions between being-in-itself (être-en-soi), being-for-itself (être-pour-soi) and being-for-others (être-pour-autrui), we are encouraged to think about the tragicomic nature of what it is to be human – a longing for full control over one’s destiny and for absolute identity, and at the same time, a realisation of the futility of that wish.

The existential plight of humanity, our absurd lot, our moral and political responsibilities that Sartre so brilliantly identified have not gone away; rather, we have chosen the easy path of ignoring them. That is not a surprise: for Sartre, such refusal to accept what it is to be human was overwhelmingly, paradoxically, what humans do.

Read the entire article here.

Image: Jean-Paul Sartre (c1950). Courtesy: Archivo del diario Clarín, Buenos Aires, Argentina

 

DarwinTunes

Charles_Darwin

Researchers at Imperial College London recently posed an intriguing question and have since developed a cool experiment to test it. Does artistic endeavor, such as music, follow the same principles of evolutionary selection in biology, as described by Darwin? That is, does the funkiest survive? Though one has to wonder what the eminent scientist would have thought about some recent fusions of rap, dubstep and classical.

From the Guardian:

There were some funky beats at Imperial College London on Saturday at its annual science festival. As well as opportunities to create bogeys, see robots dance and try to get physics PhD students to explain their wacky world, this fascinating event included the chance to participate in a public game-like experiment called DarwinTunes.

Participants select tunes and “mate” them with other tunes to create musical offspring: if the offspring are in turn selected by other players, they “survive” and get the chance to reproduce their musical DNA. The experiment is online – you too can try to immortalise your selfish musical genes.

It is a model of evolution in practice that raises fascinating questions about culture and nature. These questions apply to all the arts, not just to dance beats. How does “cultural evolution” work? How close is the analogy between Darwin’s well-proven theory of evolution in nature and the evolution of art, literature and music?

The idea of cultural evolution was boldly defined by Jacob Bronowski as our fundamental human ability “not to accept the environment but to change it”. The moment the first stone tools appeared in Africa, about 2.5m years ago, a new, faster evolution, that of human culture, became visible on Earth: from cave paintings to the Renaissance, from Galileo to the 3D printer, this cultural evolution has advanced at breathtaking speed compared with the massive periods of time it takes nature to evolve new forms.

In DarwinTunes, cultural evolution is modelled as what the experimenters call “the survival of the funkiest”. Pulsing dance beats evolve through selections made by participants, and the music (it is claimed) becomes richer through this process of selection. Yet how does the model really correspond to the story of culture?

One way Darwin’s laws of nature apply to visual art is in the need for every successful form to adapt to its environment. In the forests of west and central Africa, wood carving was until recent times a flourishing art form. In the islands of Greece, where marble could be quarried easily, stone sculpture was more popular. In the modern technological world, the things that easily come to hand are not wood or stone but manufactured products and media images – so artists are inclined to work with the readymade.

At first sight, the thesis of DarwinTunes is a bit crude. Surely it is obvious that artists don’t just obey the selections made by their audience – that is, their consumers. To think they do is to apply the economic laws of our own consumer society across all history. Culture is a lot funkier than that.

Yet just because the laws of evolution need some adjustment to encompass art, that does not mean art is a mysterious spiritual realm impervious to scientific study. In fact, the evolution of evolution – the adjustments made by researchers to Darwin’s theory since it was unveiled in the Victorian age – offers interesting ways to understand culture.

One useful analogy between art and nature is the idea of punctuated equilibrium, introduced by some evolutionary scientists in the 1970s. Just as species may evolve not through a constant smooth process but by spectacular occasional leaps, so the history of art is punctuated by massively innovative eras followed by slower, more conventional periods.

Read the entire story here.

Image: Charles Darwin, 1868, photographed by Julia Margaret Cameron. Courtesy of Wikipedia.

The Rise of McLiterature

Will-Self-2007

A sad symptom of our expanding media binge culture and the fragmentation of our shortening attention spans is the demise of literary fiction. Author Will Self believes the novel, and narrative prose in general, is on a slow, but accelerating, death-spiral. His eloquent views, presented in a May 6, 2014 lecture, are excerpted below.

From the Guardian:

If you happen to be a writer, one of the great benisons of having children is that your personal culture-mine is equipped with its own canaries. As you tunnel on relentlessly into the future, these little harbingers either choke on the noxious gases released by the extraction of decadence, or they thrive in the clean air of what we might call progress. A few months ago, one of my canaries, who’s in his mid-teens and harbours a laudable ambition to be the world’s greatest ever rock musician, was messing about on his electric guitar. Breaking off from a particularly jagged and angry riff, he launched into an equally jagged diatribe, the gist of which was already familiar to me: everything in popular music had been done before, and usually those who’d done it first had done it best. Besides, the instant availability of almost everything that had ever been done stifled his creativity, and made him feel it was all hopeless.

A miner, if he has any sense, treats his canary well, so I began gently remonstrating with him. Yes, I said, it’s true that the web and the internet have created a permanent Now, eliminating our sense of musical eras; it’s also the case that the queered demographics of our longer-living, lower-birthing population means that the middle-aged squat on top of the pyramid of endeavour, crushing the young with our nostalgic tastes. What’s more, the decimation of the revenue streams once generated by analogues of recorded music has put paid to many a musician’s income. But my canary had to appreciate this: if you took the long view, the advent of the 78rpm shellac disc had also been a disaster for musicians who in the teens and 20s of the last century made their daily bread by live performance. I repeated one of my favourite anecdotes: when the first wax cylinder recording of Feodor Chaliapin singing “The Song of the Volga Boatmen” was played, its listeners, despite a lowness of fidelity that would seem laughable to us (imagine a man holding forth from a giant bowl of snapping, crackling and popping Rice Krispies), were nonetheless convinced the portly Russian must be in the room, and searched behind drapes and underneath chaise longues for him.

So recorded sound blew away the nimbus of authenticity surrounding live performers – but it did worse things. My canaries have often heard me tell how back in the 1970s heyday of the pop charts, all you needed was a writing credit on some loathsome chirpy-chirpy-cheep-cheeping ditty in order to spend the rest of your born days lying by a guitar-shaped pool in the Hollywood Hills hoovering up cocaine. Surely if there’s one thing we have to be grateful for it’s that the web has put paid to such an egregious financial multiplier being applied to raw talentlessness. Put paid to it, and also returned musicians to the domain of live performance and, arguably, reinvigorated musicianship in the process. Anyway, I was saying all of this to my canary when I was suddenly overtaken by a great wave of noxiousness only I could smell. I faltered, I fell silent, then I said: sod you and your creative anxieties, what about me? How do you think it feels to have dedicated your entire adult life to an art form only to see the bloody thing dying before your eyes?

My canary is a perceptive songbird – he immediately ceased his own cheeping, except to chirrup: I see what you mean. The literary novel as an art work and a narrative art form central to our culture is indeed dying before our eyes. Let me refine my terms: I do not mean narrative prose fiction tout court is dying – the kidult boywizardsroman and the soft sadomasochistic porn fantasy are clearly in rude good health. And nor do I mean that serious novels will either cease to be written or read. But what is already no longer the case is the situation that obtained when I was a young man. In the early 1980s, and I would argue throughout the second half of the last century, the literary novel was perceived to be the prince of art forms, the cultural capstone and the apogee of creative endeavour. The capability words have when arranged sequentially to both mimic the free flow of human thought and investigate the physical expressions and interactions of thinking subjects; the way they may be shaped into a believable simulacrum of either the commonsensical world, or any number of invented ones; and the capability of the extended prose form itself, which, unlike any other art form, is able to enact self-analysis, to describe other aesthetic modes and even mimic them. All this led to a general acknowledgment: the novel was the true Wagnerian Gesamtkunstwerk.

This is not to say that everyone walked the streets with their head buried in Ulysses or To the Lighthouse, or that popular culture in all its forms didn’t hold sway over the psyches and imaginations of the great majority. Nor do I mean to suggest that in our culture perennial John Bull-headed philistinism wasn’t alive and snorting: “I don’t know much about art, but I know what I like.” However, what didn’t obtain is the current dispensation, wherein those who reject the high arts feel not merely entitled to their opinion, but wholly justified in it. It goes further: the hallmark of our contemporary culture is an active resistance to difficulty in all its aesthetic manifestations, accompanied by a sense of grievance that conflates it with political elitism. Indeed, it’s arguable that tilting at this papery windmill of artistic superiority actively prevents a great many people from confronting the very real economic inequality and political disenfranchisement they’re subject to, exactly as being compelled to chant the mantra “choice” drowns out the harsh background Muzak telling them they have none.

Just because you’re paranoid it doesn’t mean they aren’t out to get you. Simply because you’ve remarked a number of times on the concealed fox gnawing its way into your vitals, it doesn’t mean it hasn’t at this moment swallowed your gall bladder. Ours is an age in which omnipresent threats of imminent extinction are also part of the background noise – nuclear annihilation, terrorism, climate change. So we can be blinkered when it comes to tectonic cultural shifts. The omnipresent and deadly threat to the novel has been imminent now for a long time – getting on, I would say, for a century – and so it’s become part of culture. During that century, more books of all kinds have been printed and read by far than in the entire preceding half millennium since the invention of movable-type printing. If this was death it had a weird, pullulating way of expressing itself. The saying is that there are no second acts in American lives; the novel, I think, has led a very American sort of life: swaggering, confident, brash even – and ever aware of its world-conquering manifest destiny. But unlike Ernest Hemingway or F Scott Fitzgerald, the novel has also had a second life. The form should have been laid to rest at about the time of Finnegans Wake, but in fact it has continued to stalk the corridors of our minds for a further three-quarters of a century. Many fine novels have been written during this period, but I would contend that these were, taking the long view, zombie novels, instances of an undead art form that yet wouldn’t lie down.

Literary critics – themselves a dying breed, a cause for considerable schadenfreude on the part of novelists – make all sorts of mistakes, but some of the most egregious ones result from an inability to think outside of the papery prison within which they conduct their lives’ work. They consider the codex. They are – in Marshall McLuhan’s memorable phrase – the possessors of Gutenberg minds.

There is now an almost ceaseless murmuring about the future of narrative prose. Most of it is at once Panglossian and melioristic: yes, experts assert, there’s no disputing the impact of digitised text on the whole culture of the codex; fewer paper books are being sold, newspapers fold, bookshops continue to close, libraries as well. But … but, well, there’s still no substitute for the experience of close reading as we’ve come to understand and appreciate it – the capacity to imagine entire worlds from parsing a few lines of text; the ability to achieve deep and meditative levels of absorption in others’ psyches. This circling of the wagons comes with a number of public-spirited campaigns: children are given free books; book bags are distributed with slogans on them urging readers to put books in them; books are hymned for their physical attributes – their heft, their appearance, their smell – as if they were the bodily correlates of all those Gutenberg minds, which, of course, they are.

The seeming realists among the Gutenbergers say such things as: well, clearly, books are going to become a minority technology, but the beau livre will survive. The populist Gutenbergers prate on about how digital texts linked to social media will allow readers to take part in a public conversation. What none of the Gutenbergers are able to countenance, because it is quite literally – for once the intensifier is justified – out of their minds, is that the advent of digital media is not simply destructive of the codex, but of the Gutenberg mind itself. There is one question alone that you must ask yourself in order to establish whether the serious novel will still retain cultural primacy and centrality in another 20 years. This is the question: if you accept that by then the vast majority of text will be read in digital form on devices linked to the web, do you also believe that those readers will voluntarily choose to disable that connectivity? If your answer to this is no, then the death of the novel is sealed out of your own mouth.

Read the entire excerpt here.

Image: Will Self, 2007. Courtesy of Wikipedia / Creative Commons.

Expanding Binge Culture

The framers of the U.S. Declaration of Independence could not have known. They could not have foreseen how commoditization, consumerism, globalization and an always-on media culture would come to transform our society. They did well to insert “Life, Liberty and the pursuit of Happiness”.

But they failed to consider our collective evolution — if you wish to denote it as such — towards a sophisticated culture of binge. Significant numbers of us have long binged on physical goods, money, natural resources, food and drink. Media, however, has lagged somewhat. But no longer. Now we have at our instantaneous whim entire libraries of all-you-can-eat infotainment. Time will tell if this signals the demise of quality as it gets replaced with overwhelming quantity. One area already shows where we may be heading — witness the “fastfoodification” of our news.

From NYT:

When Beyoncé released, without warning, 17 videos around midnight on Dec. 13, millions of fans rejoiced. As a more casual listener of Ms. Knowles, I balked at the onslaught of new material and watched a few videos before throwing in the towel.

Likewise, when Netflix, in one fell swoop, made complete seasons of “House of Cards” and “Orange Is the New Black” available for streaming, I quailed at the challenge, though countless others happily immersed themselves in their worlds of Washington intrigue and incarcerated women.

Then there is the news, to which floodgates are now fully open thanks to the Internet and cable TV: Flight 370, Putin, Chris Christie, Edward Snowden, Rob Ford, Obamacare, “Duck Dynasty,” “bossy,” #CancelColbert, conscious uncoupling. When presented with 24/7 coverage of these ongoing narratives from an assortment of channels — traditional journalism sites, my Facebook feed, the log-out screen of my email — I followed some closely and very consciously uncoupled from others.

Had these content providers released their offerings in the old-media landscape, à la carte rather than in an all-you-can-eat buffet, the prospect of a seven-course meal might not have seemed so daunting. I could handle a steady drip of one article a day about Mr. Ford in a newspaper. But after two dozen, updated every 10 minutes, plus scores of tweets, videos and GIFs that keep on giving, I wanted to forget altogether about Toronto’s embattled mayor.

While media technology is now catching up to Americans’ penchant for overdoing it and finding plenty of willing indulgers, there are also those like me who recoil from the abundance of binge culture.

In the last decade, media entertainment has given far more freedom to consumers: watch, listen to and read anything at anytime. But Barry Schwartz’s 2004 book, “The Paradox of Choice,” argues that our surfeit of consumer choices engenders anxiety, not satisfaction, and sometimes even a kind of paralysis.

His thesis (which has its dissenters) applies mostly to the profusion of options within a single set: for instance, the challenge of picking out salad dressing from 175 varieties in a supermarket. Nevertheless, it is also germane to the concept of bingeing, when 62 episodes of “Breaking Bad” wait overwhelmingly in a row like bottles of Newman’s Own on a shelf.

Alex Quinlan, 31, a first-year Ph.D. student in poetry at Florida State University, said he used to spend at least an hour every morning reading the news and “putting off my responsibilities,” as well as binge-watching shows. He is busier now, and last fall had trouble installing an Internet connection in his home, which effectively “rewired my media-consumption habits,” he said. “I’m a lot more disciplined. Last night I watched one episode of ‘House of Cards’ and went to bed. A year ago, I probably would’ve watched one, gotten another beer, then watched two more.”

Even shorter-term bingeing can seem like a major commitment, because there is a distorting effect of receiving a large chunk of content at once rather than getting it piecemeal. To watch one Beyoncé video a week would eat up as much time as watching them all in one day, but their unified dissemination makes them seem intimidatingly movie-length (which they are, approximately) rather than like a series of four-minute clips.

I also experienced some first-world anxiety last year with the release of the fourth season of “Arrested Development.” I had devoured the show’s first three seasons, parceled out in 22-minute weekly installments on Fox as well as on DVD, where I would watch episodes I had already seen (in pre-streaming days, binge-watching required renting or owning a copy, which was more like a contained feast). But when Netflix uploaded 15 new episodes totaling 8.5 hours on May 26, I was not among those queuing up for it. It took me some time to get around to the show, and once I had started, the knowledge of how many episodes stretched in front of me, at my disposal whenever I wanted, proved off-putting.

This despite the keeping-up-with-the-Joneses quality to binge-viewing. If everyone is quickly exhausting every new episode of a show, and writing and talking about it the next day, it’s easy to feel left out of the conversation if you haven’t kept pace. And sometimes when you’re late to the party, you decide to stay home instead.

Because we frequently gorge when left to our own Wi-Fi-enabled devices, the antiquated methods of “scheduling our information consumption” may have been healthier, if less convenient, said Clay Johnson, 36, the author of “The Information Diet.” He recalled rushing home after choir practice when he was younger to catch “Northern Exposure” on TV.

“That idea is now preposterous,” he said. “We don’t have appointment television anymore. Just because we can watch something all the time doesn’t mean we should. Maybe we should schedule it in a way that makes sense around our daily lives.”

“It’s a lot like food,” he added. “You see some people become info-anorexic, who say the answer is to unplug and not consume anything. Much like an eating disorder, it’s just as unhealthy a decision as binge-watching the news and media. There’s a middle ground of people who are saying, ‘I need to start treating this form of input in my life like a conscious decision and to be informed in the right way.’ ”

Read the entire story here.

Zentai Coming to a City Near You

google-search-zentai

The latest Japanese export may not become as ubiquitous as Pokemon or the Toyota Camry. However, aficionados of Zentai seem to be increasing in number, and appearing outside the typical esoteric haunts such as clubs and Halloween parties. Though, it may be a while before Zentai outfits appear around the office.

From the Washington Post:

They meet on clandestine Internet forums. Or in clubs. Or sometimes at barbecue parties, where as many as 10 adherents gather every month to eat meat and frolic in an outfit that falls somewhere between a Power Ranger’s tunic and Spider-Man’s digs.

It’s called “zentai.” And in Japan, it can mean a lot of things. To 20-year-old Hokkyoku Nigo, it means liberation from the judgment and opinions of others. To a 22-year-old named Hanaka, it represents her lifelong fascination with superheroes. To a 36-year-old teacher named Nezumiko, it elicits something sexual. “I like to touch and stroke others and to be touched and stroked like this,” she told the AFP’s Harumi Ozawa.

But to most outsiders, zentai means exactly what it looks like: spandex body suits.

Where did this phenomenon come from and what does it mean? In a culture of unique displays — from men turning trucks into glowing light shows to women wearing Victorian-era clothing — zentai appears to be yet another oddity in a country well accustomed to them.

The trend can take on elements of prurience, however, and groups with names such as “zentai addict” and “zentai fetish” teem on Facebook. There are zentai ninjas. There are zentai Pokemon. There are zentai British flags and zentai American flags.

An organization called the Zentai Project, based in England, explains it as “a tight, colorful suit that transforms a normal person into amusement for all who see them. … The locals don’t know what to make of us, but the tourists love us and we get onto lots of tourist snaps — sometimes we can hardly walk 3 steps down the street before being stopped to pose for another picture.”

Though the trend is now apparently global, it was once just a group of Japanese climbing into skintight latex for unknown reasons.

“With my face covered, I cannot eat or drink like other customers,” Hokkyoku Nigo says in the AFP story. “I have led my life always worrying about what other people think of me. They say I look cute, gentle, childish or naive. I have always felt suffocated by that. But wearing this, I am just a person in a full body suit.”

Ikuo Daibo, a professor at Tokyo Mirai University, says wearing full body suits may reflect a sense of societal abandonment. People are acting out to define their individuality.

“In Japan,” he said, “many people feel lost; they feel unable to find their role in society. They have too many role models and cannot choose which one to follow.”

Read the entire article here.

Image courtesy of Google Search.

Teens and the Internet: Don’t Panic

Some view online social networks, smartphones and texting as nothing but bad news for the future socialization of our teens. After all, they’re usually hunched heads down, thumbs out, immersed in their own private worlds, oblivious to all else, all the while paradoxically and simultaneously, publishing and sharing anything and everything to anyone.

Yet others, such as Microsoft researcher Danah Boyd, have a more benign view of the technological maelstrom that surrounds our kids. In her book It’s Complicated: The Social Lives of Networked Teens, she argues that teenagers aren’t doing anything different today online than their parents and grandparents often did in person. Parents will take comfort from Boyd’s analysis that today’s teens will become much like their parents: behaving and worrying about many of the same issues that their parents did. Of course, teens will find this very, very uncool indeed.

From Technology Review:

Kids today! They’re online all the time, sharing every little aspect of their lives. What’s wrong with them? Actually, nothing, says Danah Boyd, a Microsoft researcher who studies social media. In a book coming out this winter, It’s Complicated: The Social Lives of Networked Teens, Boyd argues that teenagers aren’t doing much online that’s very different from what kids did at the sock hop, the roller rink, or the mall. They do so much socializing online mostly because they have little choice, Boyd says: parents now generally consider it unsafe to let kids roam their neighborhoods unsupervised. Boyd, 36, spoke with MIT Technology Review’s deputy editor, Brian Bergstein, at Microsoft Research’s offices in Manhattan.

I feel like you might have titled the book Everybody Should Stop Freaking Out.

It’s funny, because one of the early titles was Like, Duh. Because whenever I would show my research to young people, they’d say, “Like, duh. Isn’t this so obvious?” And it opens with the anecdote of a boy who says, “Can you just talk to my mom? Can you tell her that I’m going to be okay?” I found that refrain so common among young people.

You and your colleague Alice Marwick interviewed 166 teenagers for this book. But you’ve studied social media for a long time. What surprised you?

It was shocking how heavily constrained their mobility was. I had known it had gotten worse since I was a teenager, but I didn’t get it—the total lack of freedom to just go out and wander. Young people weren’t even trying to sneak out [of the house at night]. They were trying to get online, because that’s the place where they hung out with their friends.

And I had assumed based on the narratives in the media that bullying was on the rise. I was shocked that data showed otherwise.

Then why do narratives such as “Bullying is more common online” take hold?

It’s made more visible. There is some awful stuff out there, but it frustrates me when a panic distracts us from the reality of what’s going on. One of my frustrations is that there are some massive mental health issues, and we want to blame the technology [that brings them to light] instead of actually dealing with mental health issues.

I take your point that Facebook or Instagram is the equivalent of yesterday’s hangouts. But social media amplify everyday situations in difficult new ways. For example, kids might instantly see on Facebook that they’re missing out on something other kids are doing together.

That can be a blessing or a curse. These interpersonal conflicts ramp up much faster [and] can be much more hurtful. That’s one of the challenges for this cohort of youth: some of them have the social and emotional skills that are necessary to deal with these conflicts; others don’t. It really sucks when you realize that somebody doesn’t like you as much as you like them. Part of it is, then, how do you use that as an opportunity not to just wallow in your self-pity but to figure out how to interact and be like “Hey, let’s talk through what this friendship is like”?

You contend that teenagers are not cavalier about privacy, despite appearances, and adeptly shift sensitive conversations into chat and other private channels.

Many adults assume teens don’t care about privacy because they’re so willing to participate in social media. They want to be in public. But that doesn’t mean that they want to be public. There’s a big difference. Privacy isn’t about being isolated from others. It’s about having the capacity to control a social situation.

So if parents can let go of some common fears, what should they be doing?

One thing that I think is dangerous is that we’re trained that we are the experts at everything that goes on in our lives and our kids’ lives. So the assumption is that we should teach them by telling them. But I think the best way to teach is by asking questions: “Why are you posting that? Help me understand.” Using it as an opportunity to talk. Obviously there comes a point when your teenage child is going to roll their eyes and go, “I am not interested in explaining anything more to you, Dad.”

The other thing is being present. The hardest thing that I saw, overwhelmingly—the most unhealthy environments—were those where the parents were not present. They could be physically present and not actually present.

Read the entire article here.

The Best Place to be a Woman

By most accounts the best place to be a woman is one that offers access to quality education and comprehensive healthcare, ensures gender equality with men, and supports meaningful careers and family work-life balance. So where is this real-world Shangri-La? Some might suggest it is the land of opportunity — the United States. But, that’s not even close. Nor is it Canada or Switzerland or Germany or the UK.

According to a recent Global Gender Gap report, and a number of other surveys, the best place to be born a girl is Iceland. Next on the list come Finland, Norway, and Sweden, with another Scandinavian country, Denmark, not too far behind in seventh place. By way of comparison, the US comes in 23rd — not great, but better than Afghanistan and Yemen.

From the Social Reader:

Icelanders are among the happiest and healthiest people on Earth. They publish more books per capita than any other country, and they have more artists. They boast the most prevalent belief in evolution — and elves, too. Iceland is the world’s most peaceful nation (the cops don’t even carry guns), and the best place for kids. Oh, and they’ve got a lesbian head of state, the world’s first. Granted, the national dish is putrefied shark meat, but you can’t have everything.

Iceland is also the best place to have a uterus, according to the folks at the World Economic Forum. The Global Gender Gap Report ranks countries based on where women have the most equal access to education and healthcare, and where they can participate most fully in the country’s political and economic life.

According to the 2013 report, Icelandic women pretty much have it all. Their sisters in Finland, Norway, and Sweden have it pretty good, too: those countries came in second, third and fourth, respectively. Denmark is not far behind at number seven.

The U.S. comes in at a dismal 23rd, which is a notch down from last year. At least we’re not Yemen, which is dead last out of 136 countries.

So how did a string of countries settled by Vikings become leaders in gender enlightenment? Bloodthirsty raiding parties don’t exactly sound like models of egalitarianism, and the early days weren’t pretty. Medieval Icelandic law prohibited women from bearing arms or even having short hair. Viking women could not be chiefs or judges, and they had to remain silent in assemblies. On the flip side, they could request a divorce and inherit property. But that’s not quite a blueprint for the world’s premier egalitarian society.

The change came with literacy, for one thing. Today almost everybody in Scandinavia can read, a legacy of the Reformation and early Christian missionaries, who were interested in teaching all citizens to read the Bible. Following a long period of turmoil, Nordic states also turned to literacy as a stabilizing force in the late 18th century. By 1842, Sweden had made education compulsory for both boys and girls.

Researchers have found that the more literate the society in general, the more egalitarian it is likely to be, and vice versa. But the literacy rate is very high in the U.S., too, so there must be something else going on in Scandinavia. Turns out that a whole smorgasbord of ingredients makes gender equality a high priority in Nordic countries.

To understand why, let’s take a look at religion. The Scandinavian Lutherans, who turned away from the excesses of the medieval Catholic Church, were concerned about equality — especially the disparity between rich and poor. They thought that individuals had some inherent rights that could not just be bestowed by the powerful, and this may have opened them to the idea of rights for women. Lutheran state churches in Denmark, Sweden, Finland, Norway and Iceland have had female priests since the middle of the 20th century, and today, the Swedish Lutheran Church even has a female archbishop.

Or maybe it’s just that there’s not much religion at all. Scandinavians aren’t big churchgoers. They tend to look at morality from a secular point of view, where there’s not so much obsessive focus on sexual issues and less interest in controlling women’s behavior and activities. Scandinavia’s secularism decoupled sex from sin, and this worked out well for females. They came to be seen as having the right to sexual experience just like men, and reproductive freedom, too. Girls and boys learn about contraception in school (and even the pleasure of orgasms), and most cities have youth clinics where contraceptives are readily available. Women may have an abortion for any reason up to the eighteenth week (they can seek permission from the National Board of Health and Welfare after that), and the issue is not politically controversial.

Scandinavia’s political economy also developed along somewhat different lines than America’s did. Sweden and Norway had some big imperialist adventures, but this behavior declined following the Napoleonic Wars. After that they invested in the military to ward off invaders, but they were less interested in building it up to deal with bloated colonial structures and foreign adventures. Overall Nordic countries devoted fewer resources to the military — the arena where patriarchal values tend to get emphasized and entrenched. Iceland, for example, spends the world’s lowest percentage of GDP on its military.

Industrialization is part of the story, too: it hit the Nordic countries late. In the 19th century, Scandinavia did have a rich and powerful merchant class, but the region never produced the Gilded Age industrial titans and extreme concentration of wealth that happened in America back then, and has returned today. (Income inequality and discrimination of all kinds seem to go hand-in-hand.)

In the 20th century, farmers and workers in the newly populated Nordic cities tended to join together in political coalitions, and they could mount a serious challenge to the business elites, who were relatively weak compared to those in the U.S. Like ordinary people everywhere, Scandinavians wanted a social and economic system where everyone could get a job, expect decent pay, and enjoy a strong social safety net. And that’s what they got — kind of like Roosevelt’s New Deal without all the restrictions added by New York bankers and southern conservatives. Strong trade unions developed, which tend to promote gender equality. The public sector grew, providing women with good job opportunities. Iceland today has the highest rate of union membership out of any OECD country.

Over time, Scandinavian countries became modern social democratic states where wealth is more evenly distributed, education is typically free up through university, and the social safety net allows women to comfortably work and raise a family. Scandinavian moms aren’t agonizing over work-family balance: parents can take a year or more of paid parental leave. Dads are expected to be equal partners in childrearing, and they seem to like it. (Check them out in the adorable photo book, The Swedish Dad.)

The folks up north have just figured out — and it’s not rocket science! — that everybody is better off when men and women share power and influence. They’re not perfect — there’s still some unfinished business about how women are treated in the private sector, and we’ve sensed an undertone of darker forces in pop culture phenoms like The Girl with the Dragon Tattoo. But Scandinavians have decided that investment in women is both good for social relations and a smart economic choice. Unsurprisingly, Nordic countries have strong economies and rank high on things like innovation — Sweden is actually ahead of the U.S. on that metric. (So please, no more nonsense about how inequality makes for innovation.)

The good news is that things are getting better for women in most places in the world. But the World Economic Forum report shows that the situation either remains the same or is deteriorating for women in 20 percent of countries.

In the U.S., we’ve evened the playing field in education, and women have good economic opportunities. But according to the WEF, American women lag behind men in terms of health and survival, and they hold relatively few political offices. Both facts become painfully clear every time a Tea Party politician betrays total ignorance of how the female body works. Instead of getting more women to participate in the political process, we’ve got setbacks like a new voter ID law in Texas, which could disenfranchise one-third of the state’s woman voters. That’s not going to help the U.S. become a world leader in gender equality.

Read the entire article here.

All Conquering TV

In the almost 90 years since television was invented, it has done more to reshape our world than conquering armies or pandemics. Whether you see TV as a force for good or evil — or, more recently, as a method for delivering absurd banality — you would be hard-pressed to find another human invention that has altered us so profoundly: psychologically, socially and culturally. What would its creator — John Logie Baird — think of his invention now, almost 70 years after his death?

From the Guardian:

Like most people my age – 51 – my childhood was in black and white. That’s because my memory of childhood is in black and white, and that’s because television in the 1960s (and most photography) was black and white. Bill and Ben, the Beatles, the Biafran war, Blue Peter, they were all black and white, and their images form the monochrome memories of my early years.

That’s one of the extraordinary aspects of television – its ability to trump reality. If seeing is believing, then there’s always a troubling doubt until you’ve seen it on television. A mass medium delivered to almost every household, it’s the communal confirmation of experience.

On 30 September it will be 84 years since the world’s first-ever television transmission. In Armchair Nation, his new social history of TV, Joe Moran, professor of English and cultural history at Liverpool John Moores University, recounts the events of that momentous day. A Yorkshire comedian named Sydney Howard performed a comic monologue and someone called Lulu Stanley sang “He’s tall, and dark, and handsome” in what was perhaps the earliest progenitor of The X Factor.

The images were broadcast by the BBC and viewed by a small group of invited guests on a screen about half the size of the average smartphone in the inventor John Logie Baird’s Covent Garden studio. Logie Baird may have been a visionary but even he would have struggled to comprehend just how much the world would be changed by his vision – television, the 20th century’s defining technology.

Every major happening is now captured by television, or it’s not a major happening. Politics and politicians are determined by how they play on television. Public knowledge, charity, humour, fashion trends, celebrity and consumer demand are all subject to its critical influence. More than the aeroplane or the nuclear bomb, the computer or the telephone, TV has determined what we know and how we think, the way we believe and how we perceive ourselves and the world around us (only the motor car is a possible rival and that, strictly speaking, was a 19th-century invention).

Not only did television re-envision our sense of the world, it remains, even in the age of the internet, Facebook and YouTube, the most powerful generator of our collective memories, the most seductive and shocking mirror of society, and the most virulent incubator of social trends. It’s also stubbornly unavoidable.

There is good television, bad television, too much television and even, for some cultural puritans, no television, but whatever the equation, there is always television. It’s ubiquitously there, radiating away in the corner, even when it’s not. Moran quotes a dumbfounded Joey Tribbiani (Matt LeBlanc) from Friends on learning that a new acquaintance doesn’t have a TV set: “But what does your furniture point at?”

Like all the best comic lines, it contains a profound truth. The presence of television is so pervasive that its very absence is a kind of affront to the modern way of life. Not only has television reshaped the layout of our sitting rooms, it has also reshaped the very fabric of our lives.

Just to take Friends as one small example. Before it was first aired back in 1994, the idea of groups of young people hanging out in a coffee bar talking about relationships in a language of comic neurosis was, at least as far as pubcentric Britain was concerned, laughable. Now it’s a high-street fact of life. Would Starbucks and Costa have enjoyed the same success if Joey and friends had not showed the way?

But in 1929 no one had woken up and smelled the coffee. The images were extremely poor quality, the equipment was dauntingly expensive and reception vanishingly limited. In short, it didn’t look like the future. One of the first people to recognise television’s potential – or at least the most unappealing part of it – was Aldous Huxley. Writing in Brave New World, published in 1932, he described a hospice of the future in which every bed had a TV set at its foot. “Television was left on, a running tap, from morning till night.”

All the same, television remained a London-only hobby for a tiny metropolitan elite right up until the Second World War. Then, for reasons of national security, the BBC switched off its television signal and the experiment seemed to come to a bleak end.

It wasn’t until after the war that television was slowly spread out across the country. Some parts of the Scottish islands did not receive a signal until deep into the 1960s, but the nation was hooked. Moran quotes revealing statistics from 1971 about the contemporary British way of life: “Ten per cent of homes still had no indoor lavatory or bath, 31% had no fridge and 62% had no telephone, but only 9% had no TV.”

My family, as it happened, fitted into that strangely incongruous sector that had no inside lavatory or bath but did have a TV. This seems bizarre, if you think about society’s priorities, but it’s a common situation today throughout large parts of the developing world.

I don’t recall much anxiety about the lack of a bath, at least on my part, but I can’t imagine what the sense of social exclusion would have been like, aged nine, if I hadn’t had access to Thunderbirds and The Big Match.

The strongest memory I have of watching television in the early 1970s is in my grandmother’s flat on wintry Saturday afternoons. Invariably the gas fire was roaring, the room was baking, and that inscrutable spectacle of professional wrestling, whose appeal was a mystery to me (if not Roland Barthes), lasted an eternity before the beautifully cadenced poetry of the football results came on.

Read the entire article here.

Image: John Logie Baird. Courtesy of Wikipedia.

United States of Strange

With the United States turning another year older, it reminds us to ponder some of the lesser known components of this beautiful yet paradoxical place. All nations have their esoteric cultural wonders and benign local oddities: the British have bowler hats and the Royal Family (and the Scots have kilts); the Italians have Vespas and governments that last, on average, eight months; the French, well, they’re just French; the Germans love fast cars and lederhosen. But for sheer variety and volume the United States probably surpasses them all in extreme absurdity.

From the Telegraph:

Run by the improbably named Genghis Cohen, Machine Gun Vegas bills itself as the ‘world’s first luxury gun lounge’. It opened last year, and claims to combine “the look and feel of an ultra-lounge with the functionality of a state of the art indoor gun range”. The team of NRA-certified on-site instructors, however, may be its most unique appeal. All are female, and all are ex-US military personnel.

See other images and read the entire article here.

Image courtesy of the Telegraph.

Us and Them: Group Affinity Begins Early

Research shows that children as young as four empathize with some people but not others. It’s all about the group: which peer group you belong to versus the rest. Thus, the uphill struggle to instill tolerance in the next generation needs to begin very early in life.

From the WSJ:

Here’s a question. There are two groups, Zazes and Flurps. A Zaz hits somebody. Who do you think it was, another Zaz or a Flurp?

It’s depressing, but you have to admit that it’s more likely that the Zaz hit the Flurp. That’s an understandable reaction for an experienced, world-weary reader of The Wall Street Journal. But here’s something even more depressing—4-year-olds give the same answer.

In my last column, I talked about some disturbing new research showing that preschoolers are already unconsciously biased against other racial groups. Where does this bias come from?

Marjorie Rhodes at New York University argues that children are “intuitive sociologists” trying to make sense of the social world. We already know that very young children make up theories about everyday physics, psychology and biology. Dr. Rhodes thinks that they have theories about social groups, too.

In 2012 she asked young children about the Zazes and Flurps. Even 4-year-olds predicted that people would be more likely to harm someone from another group than from their own group. So children aren’t just biased against other racial groups: They also assume that everybody else will be biased against other groups. And this extends beyond race, gender and religion to the arbitrary realm of Zazes and Flurps.

In fact, a new study in Psychological Science by Dr. Rhodes and Lisa Chalik suggests that this intuitive social theory may even influence how children develop moral distinctions.

Back in the 1980s, Judith Smetana and colleagues discovered that very young kids could discriminate between genuinely moral principles and mere social conventions. First, the researchers asked about everyday rules—a rule that you can’t be mean to other children, for instance, or that you have to hang up your clothes. The children said that, of course, breaking the rules was wrong. But then the researchers asked another question: What would you think if teachers and parents changed the rules to say that being mean and dropping clothes were OK?

Children as young as 2 said that, in that case, it would be OK to drop your clothes, but not to be mean. No matter what the authorities decreed, hurting others, even just hurting their feelings, was always wrong. It’s a strikingly robust result—true for children from Brazil to Korea. Poignantly, even abused children thought that hurting other people was intrinsically wrong.

This might leave you feeling more cheerful about human nature. But in the new study, Dr. Rhodes asked similar moral questions about the Zazes and Flurps. The 4-year-olds said it would always be wrong for Zazes to hurt the feelings of others in their group. But if teachers decided that Zazes could hurt Flurps’ feelings, then it would be OK to do so. Intrinsic moral obligations only extended to members of their own group.

The 4-year-olds demonstrate the deep roots of an ethical tension that has divided philosophers for centuries. We feel that our moral principles should be universal, but we simultaneously feel that there is something special about our obligations to our own group, whether it’s a family, clan or country.

Read the entire article after the jump.

Image: Us and Them, Pink Floyd. Courtesy of Pink Floyd / flickr.

Great Literature and Human Progress

Professor of Philosophy Gregory Currie tackles a thorny issue in his latest article. The question he seeks to answer is, “does great literature make us better?” It’s highly likely that a poll of most nations would show the majority of people believe that literature does in fact propel us in a forward direction, intellectually, morally, emotionally and culturally. It seems like a no-brainer. But where is the hard evidence?

From the New York Times:

You agree with me, I expect, that exposure to challenging works of literary fiction is good for us. That’s one reason we deplore the dumbing-down of the school curriculum and the rise of the Internet and its hyperlink culture. Perhaps we don’t all read very much that we would count as great literature, but we’re apt to feel guilty about not doing so, seeing it as one of the ways we fall short of excellence. Wouldn’t reading about Anna Karenina, the good folk of Middlemarch and Marcel and his friends expand our imaginations and refine our moral and social sensibilities?

If someone now asks you for evidence for this view, I expect you will have one or both of the following reactions. First, why would anyone need evidence for something so obviously right? Second, what kind of evidence would he want? Answering the first question is easy: if there’s no evidence – even indirect evidence – for the civilizing value of literary fiction, we ought not to assume that it does civilize. Perhaps you think there are questions we can sensibly settle in ways other than by appeal to evidence: by faith, for instance. But even if there are such questions, surely no one thinks this is one of them.

What sort of evidence could we present? Well, we can point to specific examples of our fellows who have become more caring, wiser people through encounters with literature. Indeed, we are such people ourselves, aren’t we?

I hope no one is going to push this line very hard. Everything we know about our understanding of ourselves suggests that we are not very good at knowing how we got to be the kind of people we are. In fact we don’t really know, very often, what sorts of people we are. We regularly attribute our own failures to circumstance and the failures of others to bad character. But we can’t all be exceptions to the rule (supposing it is a rule) that people do bad things because they are bad people.

We are poor at knowing why we make the choices we do, and we fail to recognize the tiny changes in circumstances that can shift us from one choice to another. When it comes to other people, can you be confident that your intelligent, socially attuned and generous friend who reads Proust got that way partly because of the reading? Might it not be the other way around: that bright, socially competent and empathic people are more likely than others to find pleasure in the complex representations of human interaction we find in literature?

There’s an argument we often hear on the other side, illustrated earlier this year by a piece on The New Yorker’s Web site. Reminding us of all those cultured Nazis, Teju Cole notes the willingness of a president who reads novels and poetry to sign weekly drone strike permissions. What, he asks, became of “literature’s vaunted power to inspire empathy?” I find this a hard argument to like, and not merely because I am not yet persuaded by the moral case against drones. No one should be claiming that exposure to literature protects one against moral temptation absolutely, or that it can reform the truly evil among us. We measure the effectiveness of drugs and other medical interventions by thin margins of success that would not be visible without sophisticated statistical techniques; why assume literature’s effectiveness should be any different?

We need to go beyond the appeal to common experience and into the territory of psychological research, which is sophisticated enough these days to make a start in testing our proposition.

Psychologists have started to do some work in this area, and we have learned a few things so far. We know that if you get people to read a short, lowering story about a child murder they will afterward report feeling worse about the world than they otherwise would. Such changes, which are likely to be very short-term, show that fictions press our buttons; they don’t show that they refine us emotionally or in any other way.

We have learned that people are apt to pick up (purportedly) factual information stated or implied as part of a fictional story’s background. Oddly, people are more prone to do that when the story is set away from home: in a study conducted by Deborah Prentice and colleagues and published in 1997, Princeton undergraduates retained more from a story when it was set at Yale than when it was set on their own campus (don’t worry Princetonians, Yalies are just as bad when you do the test the other way around). Television, with its serial programming, is good for certain kinds of learning; according to a study from 2001 undertaken for the Kaiser Foundation, people who regularly watched the show “E.R.” picked up a good bit of medical information on which they sometimes acted. What we don’t have is compelling evidence that suggests that people are morally or socially better for reading Tolstoy.

Not nearly enough research has been conducted; nor, I think, is the relevant psychological evidence just around the corner. Most of the studies undertaken so far don’t draw on serious literature but on short snatches of fiction devised especially for experimental purposes. Very few of them address questions about the effects of literature on moral and social development, far too few for us to conclude that literature either does or doesn’t have positive moral effects.

There is a puzzling mismatch between the strength of opinion on this topic and the state of the evidence. In fact I suspect it is worse than that; advocates of the view that literature educates and civilizes don’t overrate the evidence — they don’t even think that evidence comes into it. While the value of literature ought not to be a matter of faith, it looks as if, for many of us, that is exactly what it is.

Read the entire article here.

Image: The Odyssey, Homer. Book cover. Courtesy of Goodreads.com

What’s In a Name?

Recently we posted a fascinating story about a legal ruling in Iceland that allowed parents to set aside centuries of Icelandic history by naming their girl “Blaer” — a traditionally male name. You see, Iceland has an official organization — the Iceland Naming Committee — that decides whether a given name is acceptable (by Icelandic standards).

Well, this got us thinking about rules and conventions in other nations. For instance, New Zealand will not allow parents to name a child “Pluto”, however “Number 16 Bus Shelter” and “Violence” recently got the thumbs up. Some misguided or innovative (depending upon your perspective) New Zealanders have unsuccessfully tried to name their offspring: “*” (yes, asterisk), “.” (period or full-stop), “V”, and “Emperor”.

Not to be outdone, a U.S. citizen recently legally changed his name to “In God” (first name) “We Trust” (last name). Humans are indeed a strange species.

From CNN:

Lucifer cannot be born in New Zealand.

And there’s no place for Christ or a Messiah either.

In New Zealand, parents have to run by the government any name they want to bestow on their baby.

And each year, there’s a bevy of unusual ones too bizarre to pass the taste test.

The country’s Registrar of Births, Deaths and Marriages shared that growing list with CNN on Wednesday.

Four words:

What were they thinking?

In the past 12 years, the agency had to turn down not one, not two, but six sets of parents who wanted to name their child “Lucifer.”

Also shot down were parents who wanted to grace their child with the name “Messiah.” That happened twice.

“Christ,” too, was rejected.

Specific rules

As the agency put it, acceptable names must not cause offense to a reasonable person, not be unreasonably long and should not resemble an official title and rank.

It’s no surprise then that the names nixed most often since 2001 are “Justice” (62 times) and “King” (31 times).

Some of the other entries scored points in the creativity department — but clearly didn’t take into account the lifetime of pain they’d bring.

“Mafia No Fear.” “4Real.” “Anal.”

Oh, come on!

Then there were the parents who preferred brevity through punctuation. The ones who picked “*” (the asterisk) or “.” (the period).

Slipping through

Still, some quirky names do make it through.

In 2008, the country made international news when the naming agency allowed a set of twins to be named “Benson” and “Hedges” — a popular cigarette brand — and OK’d the names “Violence” and “Number 16 Bus Shelter.”

Asked about those examples, Michael Mead of the Internal Affairs Department (under which the agency falls) said, “All names registered with the Department since 1995 have conformed to these rules.”

And what happens when parents don’t conform?

Four years ago, a 9-year-old girl was taken away from her parents by the state so that her name could be changed from “Talula Does the Hula From Hawaii.”

Not alone

To be sure, New Zealand is not the only country to act as editor for some parents’ wacky ideas.

Sweden also has a naming law and has nixed attempts to name children “Superman,” “Metallica,” and the oh-so-easy-to-pronounce “Brfxxccxxmnpcccclllmmnprxvclmnckssqlbb11116.”

In 2009, the Dominican Republic contemplated banning unusual names after a host of parents began naming their children after cars or fruit.

In the United States, however, naming fights have centered on adults.

In 2008, a judge allowed an Illinois school bus driver to legally change his first name to “In God” and his last name to “We Trust.”

But the same year, an appeals court in New Mexico ruled against a man — named Variable — who wanted to change his name to “F— Censorship!”

Here is a list of some of the names banned in New Zealand since 2001 — and how many times they came up:

Justice: 62
King: 31
Princess: 28
Prince: 27
Royal: 25
Duke: 10
Major: 9
Bishop: 9
Majesty: 7
J: 6
Lucifer: 6
using brackets around middle names: 4
Knight: 4
Lady: 3
using back slash between names: 8
Judge: 3
Royale: 2
Messiah: 2
T: 2
I: 2
Queen: 2
II: 2
Sir: 2
III: 2
Jr: 2
E: 2
V: 2
Justus: 2
Master: 2
Constable: 1
Queen Victoria: 1
Regal: 1
Emperor: 1
Christ: 1
Juztice: 1
3rd: 1
C J: 1
G: 1
Roman numerals III: 1
General: 1
Saint: 1
Lord: 1
. (full stop): 1
89: 1
Eminence: 1
M: 1
VI: 1
Mafia No Fear: 1
2nd: 1
Majesti: 1
Rogue: 1
4real: 1
* (star symbol): 1
5th: 1
S P: 1
C: 1
Sargent: 1
Honour: 1
D: 1
Minister: 1
MJ: 1
Chief: 1
Mr: 1
V8: 1
President: 1
MC: 1
Anal: 1
A.J: 1
Baron: 1
L B: 1
H-Q: 1
Queen V: 1

Read the entire article following the jump.

The War on Apostrophes

No, we don’t mean a war on apostasy, for which many have been hanged, drawn, quartered, burned and beheaded. And no, “apostrophes” are not a new sect of fundamentalist terrorists.

Apostrophes are punctuation marks, and a local council in Britain has decided to outlaw them. Why?

From the Guardian:

The sometimes vexing question of where and when to add an apostrophe appears to have been solved in one corner of Devon: the local authority is planning to do away with them altogether.

Later this month members of Mid Devon district council’s cabinet will discuss formally banning the pesky little punctuation marks from its (no apostrophe needed) street signs, apparently to avoid “confusion”.

The news of the Tory-controlled council’s (apostrophe required) decision provoked howls of condemnation on Friday from champions of plain English, fans of grammar, and politicians. Even the government felt the need to join the campaign to save the apostrophe.

The Plain English Campaign led the criticism. “It’s nonsense,” said Steve Jenner, spokesperson and radio presenter. “Where’s it going to stop? Are we going to declare war on commas, outlaw full stops?”

Jenner was puzzled over why the council appeared to think it a good idea not to have punctuation on signs. “If it’s to try to make things clearer, it’s not going to work. The whole purpose of punctuation is to make language easier to understand. Is it because someone at the council doesn’t understand how it works?”

Jenner suggested the council was providing a bad example to children who were – hopefully – being taught punctuation at school only to not see it being used correctly on street signs. “It seems a bit hypocritical,” he added.

Sian Harris, lecturer in English literature at Exeter University, said the proposals were likely to lead to greater confusion. She said: “Usually the best way to teach about punctuation is to show practical examples of it – removing [apostrophes] from everyday life would be a terrible shame and make that understanding increasingly difficult. English is a complicated language as it is — removing apostrophes is not going to help with that at all.”

Ben Bradshaw, the former culture secretary and Labour MP for Exeter, condemned the plans on Twitter. He wrote a precisely punctuated tweet: “Tory Mid Devon Council bans the apostrophe to ‘avoid confusion’ … Whole point of proper grammar is to avoid confusion!”

The council’s plans caused a stir 200 miles away in Whitehall, where the Department for Communities and Local Government came out in defence of punctuation. A spokesman said: “Whilst this is ultimately a matter for the local council, ministers’ view is that England’s apostrophes should be cherished.”

To be fair to modest Mid Devon, it is not the only authority to pick on the apostrophe. Birmingham did the same three years ago (the Mail went with the headline The city where apostrophes arent welcome).

The book retailer Waterstones caused a bit of a stir last year when it ditched the mark.

The council’s communications manager, Andrew Lacey, attempted to dampen down the controversy. Lacey said: “Our proposed policy on street naming and numbering covers a whole host of practical issues, many of which are aimed at reducing potential confusion over street names.

“Although there is no national guidance that stops apostrophes being used, for many years the convention we’ve followed here is for new street names not to be given apostrophes.”

He said there were only three official street names in Mid Devon which include them: Beck’s Square and Blundell’s Avenue, both in Tiverton, and St George’s Well in Cullompton. All were named many, many years ago.

“No final decision has yet been made and the proposed policy will be discussed at cabinet,” he said.

Read the entire story after the jump.

Image: Mid Devon District Council’s plan is presumably to avoid errors such as this (from Hackney, London). Courtesy of Guardian / Andy Drysdale / Rex Features.

Two Nations Divided by Book Covers

“England and America are two countries separated by the same language.” This oft-used quote is usually attributed to Oscar Wilde or George Bernard Shaw. Regardless of who originated the phrase, both authors would not be surprised to see that book covers are divided by the Atlantic Ocean as well. The Millions continues its fascinating annual comparative analysis.

American book covers on the left, British book covers on the right.

From The Millions:

As we’ve done for several years now, we thought it might be fun to compare the U.S. and U.K. book cover designs of this year’s Morning News Tournament of Books contenders. Book cover art is an interesting element of the literary world — sometimes fixated upon, sometimes ignored — but, as readers, we are undoubtedly swayed by the little billboard that is the cover of every book we read. And, while many of us no longer do most of our reading on physical books with physical covers, those same cover images now beckon us from their grids in the various online bookstores. From my days as a bookseller, when import titles would sometimes find their way into our store, I’ve always found it especially interesting that the U.K. and U.S. covers often differ from one another. This would seem to suggest that certain layouts and imagery will better appeal to readers on one side of the Atlantic rather than the other. These differences are especially striking when we look at the covers side by side. The American covers are on the left, and the UK are on the right. Your equally inexpert analysis is encouraged in the comments.

Read the entire article and see more book covers after the jump.

Book cover images courtesy of The Millions and their respective authors and publishers.

Light Breeze Signals the Winds of Change

The gods of Norse legend are surely turning slowly in their graves. A Reykjavik, Iceland, court recently granted a 15-year-old the right to use her given name. Her first name, “Blaer”, means “light breeze” in Icelandic, and until the ruling she was not permitted to use it under Iceland’s strict cultural preservation laws. So, before you name your next child Shoniqua or Te’o or Cruise, pause for a few moments to think how lucky you are that you live elsewhere (with apologies to our readers in Iceland).

From the Guardian:

A 15-year-old Icelandic girl has been granted the right to legally use the name given to her by her mother, despite the opposition of authorities and Iceland’s strict law on names.

Reykjavik District Court ruled Thursday that the name “Blaer” can be used. It means “light breeze.”

The decision overturns an earlier rejection by Icelandic authorities who declared it was not a proper feminine name. Until now, Blaer Bjarkardottir had been identified simply as “Girl” in communications with officials.

“I’m very happy,” she said after the ruling. “I’m glad this is over. Now I expect I’ll have to get new identity papers. Finally I’ll have the name Blaer in my passport.”

Like a handful of other countries, including Germany and Denmark, Iceland has official rules about what a baby can be named. Names are supposed to fit Icelandic grammar and pronunciation rules — choices like Carolina and Christa are not allowed because the letter “c” is not part of Iceland’s alphabet.

Blaer’s mother, Bjork Eidsdottir, had fought for the right for the name to be recognized. The court ruling means that other girls will be also allowed to use the name in Iceland.

In an interview earlier this year, Eidsdottir said she did not know the name “Blaer” was not on the list of accepted female names when she gave it to her daughter. The name was rejected because the panel viewed it as a masculine name that was inappropriate for a girl.

The court found that based on testimony and other evidence, that the name could be used by both males and females and that Blaer had a right to her own name under Iceland’s constitution and Europe’s human rights conventions. It rejected the government’s argument that her request should be denied to protect the Icelandic language.

Read the entire article after the jump.

Image: Odin holds bracelets and leans on his spear while looking towards the völva in Völuspá. Gesturing, the völva holds a spoon and sits beside a steaming kettle. Published in Gjellerup, Karl (1895). Courtesy of Wikipedia.

Las Vegas, Tianducheng and Paris: Cultural Borrowing

These three locations in Nevada, China (near Hangzhou) and Paris, France, have something in common. People the world over travel to these three places to see what they share. But only one has an original. In this case, we’re talking about the Eiffel Tower.

Now, this architectural grand theft is the subject of a lengthy debate — the merits of mimicry on a vast scale. There is even a fascinating coffee-table book dedicated to this growing trend: Original Copies: Architectural Mimicry in Contemporary China, by Bianca Bosker.

Interestingly, the copycat trend only seems worrisome if those doing the copying are in a powerful and growing nation, and the copying is done on a national scale, perhaps for some form of cultural assimilation. After all, we don’t hear similar cries when developers put up a copy of Venice in Las Vegas — that’s just for entertainment we are told.

Yet haven’t civilizations borrowed, and stolen, ideas both good and bad throughout the ages? The answer of course is an unequivocal yes. Humans are avaricious collectors of memes that work — it’s more efficient to borrow than to invent. The Greeks borrowed from the Egyptians; the Romans borrowed from the Greeks; the Turks borrowed from the Romans; the Arabs borrowed from the Turks; the Spanish from the Arabs, the French from the Spanish, the British from the French, and so on. Of course what seems to be causing a more recent stir is that China is doing the borrowing, and on such a rapid and grand scale — the nation is copying not just buildings (and most other products) but entire urban landscapes. However, this is one way that empires emerge and evolve. In this case, China’s acquisitive impulses could, perhaps, be tempered if most nations of the world borrowed less from the Chinese — money that is. But that’s another story.

From the Atlantic:

The latest and most famous case of Chinese architectural mimicry doesn’t look much like its predecessors. On December 28, German news weekly Der Spiegel reported that the Wangjing Soho, Zaha Hadid’s soaring new office and retail development under construction in Beijing, is being replicated, wall for wall and window for window, in Chongqing, a city in central China.

To most outside observers, this bold and quickly commissioned counterfeit represents a familiar form of piracy. In fashion, technology, and architecture, great ideas trickle down, often against the wishes of their progenitors. But in China, architectural copies don’t usually ape the latest designs.

In the vast space between Beijing and Chongqing lies a whole world of Chinese architectural simulacra that quietly aspire to a different ideal. In suburbs around China’s booming cities, developers build replicas of towns like Halstatt, Austria and Dorchester, England. Individual homes and offices, too, are designed to look like Versailles or the Chrysler Building. The most popular facsimile in China is the White House. The fastest-urbanizing country in history isn’t scanning design magazines for inspiration; it’s watching movies.

At Beijing’s Palais de Fortune, two hundred chateaus sit behind gold-tipped fences. At Chengdu’s British Town, pitched roofs and cast-iron street lamps dot the streets. At Shanghai’s Thames Town, a Gothic cathedral has become a tourist attraction in itself. Other developments have names like “Top Aristocrat,” (Beijing), “the Garden of Monet” (Shanghai), and “Galaxy Dante,” (Shenzhen).

Architects and critics within and beyond China have treated these derivative designs with scorn, as shameless kitsch or simply trash. Others cite China’s larger knock-off culture, from handbags to housing, as evidence of the innovation gap between China and the United States. For a larger audience on the Internet, they are merely a punchline, another example of China’s endlessly entertaining wackiness.

In short, the majority of Chinese architectural imitation, oozing with historical romanticism, is not taken seriously.

But perhaps it ought to be.

In Original Copies: Architectural Mimicry in Contemporary China, the first detailed book on the subject, Bianca Bosker argues that the significance of these constructions has been unfairly discounted. Bosker, a senior technology editor at the Huffington Post, has been visiting copycat Chinese villages for some six years, and in her view, these distorted impressions of the West offer a glance at the hopes, dreams and contradictions of China’s middle class.

“Clearly there’s an acknowledgement that there’s something great about Paris,” says Bosker. “But it’s also: ‘We can do it ourselves.'”

Armed with firsthand observation, field research, interviews, and a solid historical background, Bosker’s book is an attempt to change the way we think about Chinese duplitecture. “We’re seeing the Chinese dream in action,” she says. “It has to do with this ability to take control of your life. There’s now this plethora of options to choose from.” That is something new in China, as is the role that private enterprise is taking in molding built environments that will respond to people’s fantasies.

While the experts scoff, the people who build and inhabit these places are quite proud of them. As the saying goes, “The way to live best is to eat Chinese food, drive an American car, and live in a British house. That’s the ideal life.” The Chinese middle class is living in Orange County, Beijing, the same way you listen to reggae music or lounge in Danish furniture.

In practice, though, the depth and scale of this phenomenon has few parallels. No one knows how many facsimile communities there are in China, but the number is increasing every day. “Every time I go looking for more,” Bosker says, “I find more.”

How many are there?

“At least hundreds.”

Read the entire article after the jump.

Image: Tianducheng, 13th arrondissement, Paris in China. Courtesy of Bianca Bosker/University of Hawaii Press.

Startup Culture: New is the New New

Starting up a new business was once a demanding and complex process, often undertaken in anonymity in the long shadows between the hours of a regular job. It still is, of course. Nowadays, however, “the startup” has become more of an event. The tech sector has raised this to a fine art by spawning an entire self-sustaining and self-promoting industry around startups.

You’ll find startup gurus, serial entrepreneurs and digital prophets — yes, AOL has a digital prophet on its payroll — strutting around on stage, twittering tips in the digital world, leading business plan bootcamps, pontificating on accelerator panels, hosting incubator love-ins in coffee shops or splashed across the covers of Entrepreneur or Inc or FastCompany magazines on an almost daily basis. Beware! The back of your cereal box may be next.

From the Telegraph:

I’ve seen the best minds of my generation destroyed by marketing, shilling for ad clicks, dragging themselves through the strip-lit corridors of convention centres looking for a venture capitalist. Just as X Factor has convinced hordes of tone deaf kids they can be pop stars, the startup industry has persuaded thousands that they can be the next rockstar entrepreneur. What’s worse is that while X Factor clogs up the television schedules for a couple of months, tech conferences have proliferated to such an extent that not a week goes by without another excuse to slope off. Some founders spend more time on panels pontificating about their business plans than actually executing them.

Earlier this year, I witnessed David Shing, AOL’s Digital Prophet – that really is his job title – delivering the opening remarks at a tech conference. The show summed up the worst elements of the self-obsessed, hyperactive world of modern tech. A 42-year-old man with a shock of Russell Brand hair, expensive spectacles and paint-splattered trousers, Shingy paced the stage spouting buzzwords: “Attention is the new currency, man…the new new is providing utility, brothers and sisters…speaking on the phone is completely cliche.” The audience lapped it all up. At these rallies in praise of the startup, enthusiasm and energy matter much more than making sense.

Startup culture is driven by slinging around superlatives – every job is an “incredible opportunity”, every product is going to “change lives” and “disrupt” an established industry. No one wants to admit that most startups stay stuck right there at the start, pub singers pining for their chance in the spotlight. While the startups and hangers-on milling around in the halls bring in stacks of cash for the event organisers, it’s the already successful entrepreneurs on stage and the investors who actually benefit from these conferences. They meet up at exclusive dinners and in the speakers’ lounge where the real deals are made. It’s Studio 54 for geeks.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Startup, WA. Courtesy of Wikipedia.[end-div]

La Serrata: Why the Rise Is Always Followed by the Fall

Humans do learn from their mistakes. Yet history does repeat. Nations will continue to rise, and inevitably fall. Why? Chrystia Freeland, author of “Plutocrats: The Rise of the New Global Super-Rich and the Fall of Everyone Else,” offers an insightful analysis based in part on 14th-century Venice.

[div class=attrib]From the New York Times:[end-div]

IN the early 14th century, Venice was one of the richest cities in Europe. At the heart of its economy was the colleganza, a basic form of joint-stock company created to finance a single trade expedition. The brilliance of the colleganza was that it opened the economy to new entrants, allowing risk-taking entrepreneurs to share in the financial upside with the established businessmen who financed their merchant voyages.

Venice’s elites were the chief beneficiaries. Like all open economies, theirs was turbulent. Today, we think of social mobility as a good thing. But if you are on top, mobility also means competition. In 1315, when the Venetian city-state was at the height of its economic powers, the upper class acted to lock in its privileges, putting a formal stop to social mobility with the publication of the Libro d’Oro, or Book of Gold, an official register of the nobility. If you weren’t on it, you couldn’t join the ruling oligarchy.

The political shift, which had begun nearly two decades earlier, was so striking a change that the Venetians gave it a name: La Serrata, or the closure. It wasn’t long before the political Serrata became an economic one, too. Under the control of the oligarchs, Venice gradually cut off commercial opportunities for new entrants. Eventually, the colleganza was banned. The reigning elites were acting in their immediate self-interest, but in the longer term, La Serrata was the beginning of the end for them, and for Venetian prosperity more generally. By 1500, Venice’s population was smaller than it had been in 1330. In the 17th and 18th centuries, as the rest of Europe grew, the city continued to shrink.

The story of Venice’s rise and fall is told by the scholars Daron Acemoglu and James A. Robinson, in their book “Why Nations Fail: The Origins of Power, Prosperity, and Poverty,” as an illustration of their thesis that what separates successful states from failed ones is whether their governing institutions are inclusive or extractive. Extractive states are controlled by ruling elites whose objective is to extract as much wealth as they can from the rest of society. Inclusive states give everyone access to economic opportunity; often, greater inclusiveness creates more prosperity, which creates an incentive for ever greater inclusiveness.

The history of the United States can be read as one such virtuous circle. But as the story of Venice shows, virtuous circles can be broken. Elites that have prospered from inclusive systems can be tempted to pull up the ladder they climbed to the top. Eventually, their societies become extractive and their economies languish.

That was the future predicted by Karl Marx, who wrote that capitalism contained the seeds of its own destruction. And it is the danger America faces today, as the 1 percent pulls away from everyone else and pursues an economic, political and social agenda that will increase that gap even further — ultimately destroying the open system that made America rich and allowed its 1 percent to thrive in the first place.

You can see America’s creeping Serrata in the growing social and, especially, educational chasm between those at the top and everyone else. At the bottom and in the middle, American society is fraying, and the children of these struggling families are lagging the rest of the world at school.

Economists point out that the woes of the middle class are in large part a consequence of globalization and technological change. Culture may also play a role. In his recent book on the white working class, the libertarian writer Charles Murray blames the hollowed-out middle for straying from the traditional family values and old-fashioned work ethic that he says prevail among the rich (whom he castigates, but only for allowing cultural relativism to prevail).

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: The Grand Canal and the Church of the Salute (1730) by Canaletto. Courtesy of Museum of Fine Arts, Houston / WikiCommons.[end-div]

Brilliant! The Brits are Coming

Following decades of one-way cultural osmosis (from the United States to the UK), it seems that the trend may be reversing. Well, at least in the linguistic department. Although it may be a while before “blimey” enters the American lexicon, other words and phrases such as “spot on”, “chat up”, “ginger” to describe hair color, and “gormless” are steadily making their way into everyday American English.

[div class=attrib]From the BBC:[end-div]

There is little that irks British defenders of the English language more than Americanisms, which they see creeping insidiously into newspaper columns and everyday conversation. But bit by bit British English is invading America too.

“Spot on – it’s just ludicrous!” snaps Geoffrey Nunberg, a linguist at the University of California at Berkeley.

“You are just impersonating an Englishman when you say spot on.”

“Will do – I hear that from Americans. That should be put into quarantine,” he adds.

And don’t get him started on the chattering classes – its overtones of a distinctly British class system make him quiver.

But not everyone shares his revulsion at the drip, drip, drip of Britishisms – to use an American term – crossing the Atlantic.

“I enjoy seeing them,” says Ben Yagoda, professor of English at the University of Delaware, and author of the forthcoming book, How to Not Write Bad.

“It’s like a birdwatcher. If I find an American saying one, it makes my day!”

Last year Yagoda set up a blog dedicated to spotting the use of British terms in American English.

So far he has found more than 150 – from cheeky to chat-up via sell-by date, and the long game – an expression which appears to date back to 1856, and comes not from golf or chess, but the card game whist. President Barack Obama has used it in at least one speech.

Yagoda notices changes in pronunciation too – for example his students sometimes use “that sort of London glottal stop”, dropping the T in words like “important” or “Manhattan”.

Kory Stamper, Associate Editor for Merriam-Webster, whose dictionaries are used by many American publishers and news organisations, agrees that more and more British words are entering the American vocabulary.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Ngram graph showing online usage of the phrase “chat up”. Courtesy of Google / BBC.[end-div]

Death Cafe

“Death Cafe” sounds like the name of a group of alternative musicians from Denmark. But it’s not. Its rather more literal definition is a coffee shop where customers go to talk about death over a cup of Earl Grey tea or a double-shot espresso. And, while it’s not displacing Starbucks (yet), death cafes are a growing trend in Europe, first inspired by the pop-up Cafe Mortels of Switzerland.

[div class=attrib]From the Independent:[end-div]

“Do you have a death wish?” is not a question normally bandied about in seriousness. But have you ever actually asked whether a parent, partner or friend has a wish, or wishes, concerning their death? Burial or cremation? Where would they like to die? It’s not easy to do.

Stiff-upper-lipped Brits have a particular problem talking about death. Anyone who tries invariably gets shouted down with “Don’t talk like that!” or “If you say it, you’ll make it happen.” A survey by the charity Dying Matters reveals that more than 70 per cent of us are uncomfortable talking about death and that less than a third of us have spoken to family members about end-of-life wishes.

But despite this ingrained reluctance there are signs of burgeoning interest in exploring death. I attended my first death cafe recently and was surprised to discover that the gathering of goths, emos and the terminally ill that I’d imagined, turned out to be a collection of fascinating, normal individuals united by a wish to discuss mortality.

At a trendy coffee shop called Cakey Muto in Hackney, east London, taking tea (and scones!) with death turned out to be rather a lot of fun. What is believed to be the first official British death cafe took place in September last year, organised by former council worker Jon Underwood. Since then, around 150 people have attended death cafes in London and the one I visited was the 17th such happening.

“We don’t want to shove death down people’s throats,” Underwood says. “We just want to create an environment where talking about death is natural and comfortable.” He got the idea from the Swiss model (cafe mortel) invented by sociologist Bernard Crettaz, the popularity of which gained momentum in the Noughties and has since spread to France.

Underwood is keen to start a death cafe movement in English-speaking countries and his website (deathcafe.com) includes instructions for setting up your own. He has already inspired the first death cafe in America and groups have sprung up in Northern England too. Last month, he arranged the first death cafe targeting issues around dying for a specific group, the LGBT community, which he says was extremely positive and had 22 attendees.

Back in Cakey Muto, 10 fellow attendees and I eye each other nervously as the cafe door is locked and we seat ourselves in a makeshift circle. Conversation is kicked off by our facilitator, grief specialist Kristie West, who sets some ground rules. “This is a place for people to talk about death,” she says. “I want to make it clear that it is not about grief, even though I’m a grief specialist. It’s also not a debate platform. We don’t want you to air all your views and pick each other apart.”

A number of our party are directly involved in the “death industry”: a humanist-funeral celebrant, an undertaker and a lady who works in a funeral home. Going around the circle explaining our decision to come to a death cafe, what came across from this trio, none of whom knew each other, was their satisfaction in their work.

“I feel more alive than ever since working in a funeral home,” one of the women remarked. “It has helped me recognise that it isn’t a circle between life and death, it is more like a cosmic soup. The dead and the living are sort of floating about together.”

Others in the group include a documentary maker, a young woman whose mother died 18 months ago, a lady who doesn’t say much but was persuaded by her neighbour to come, and a woman who has attended three previous death cafes but still hasn’t managed to admit this new interest to her family or get them to talk about death.

The funeral celebrant tells the circle she’s been thinking a lot about what makes a good or bad death. She describes “the roaring corrosiveness of stepping into a household” where a “bad death” has taken place and the group meditates on what a bad death entails: suddenness, suffering and a difficult relationship between the deceased and bereaved?

“I have seen people have funerals which I don’t think they would have wanted,” says the undertaker, who has 17 years of experience. “It is possible to provide funerals more cheaply, more sensitively and with greater respect for the dead.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Death cafe menu courtesy of Death Cafe.[end-div]

Human Evolution: Stalled

It takes no expert neuroscientist, anthropologist or evolutionary biologist to recognize that human evolution has probably stalled. After all, one only needs to observe our obsession with reality TV. Yes, evolution screeched to a halt around 1999, when reality TV hit critical mass in the mainstream public consciousness. So, what of evolution?

[div class=attrib]From the Wall Street Journal:[end-div]

If you write about genetics and evolution, one of the commonest questions you are likely to be asked at public events is whether human evolution has stopped. It is a surprisingly hard question to answer.

I’m tempted to give a flippant response, borrowed from the biologist Richard Dawkins: Since any human trait that increases the number of babies is likely to gain ground through natural selection, we can say with some confidence that incompetence in the use of contraceptives is probably on the rise (though only if those unintended babies themselves thrive enough to breed in turn).

More seriously, infertility treatment is almost certainly leading to an increase in some kinds of infertility. For example, a procedure called “intra-cytoplasmic sperm injection” allows men with immobile sperm to father children. This is an example of the “relaxation” of selection pressures caused by modern medicine. You can now inherit traits that previously prevented human beings from surviving to adulthood, procreating when they got there or caring for children thereafter. So the genetic diversity of the human genome is undoubtedly increasing.

Or it was until recently. Now, thanks to pre-implantation genetic diagnosis, parents can deliberately choose to implant embryos that lack certain deleterious mutations carried in their families, with the result that genes for Tay-Sachs, Huntington’s and other diseases are retreating in frequency. The old and overblown worry of the early eugenicists—that “bad” mutations were progressively accumulating in the species—is beginning to be addressed not by stopping people from breeding, but by allowing them to breed, safe in the knowledge that they won’t pass on painful conditions.

Still, recent analyses of the human genome reveal a huge number of rare—and thus probably fairly new—mutations. One study, by John Novembre of the University of California, Los Angeles, and his colleagues, looked at 202 genes in 14,002 people and found one genetic variant in somebody every 17 letters of DNA code, much more than expected. “Our results suggest there are many, many places in the genome where one individual, or a few individuals, have something different,” said Dr. Novembre.

Another team, led by Joshua Akey of the University of Washington, studied 1,351 people of European and 1,088 of African ancestry, sequencing 15,585 genes and locating more than a half million single-letter DNA variations. People of African descent had twice as many new mutations as people of European descent, or 762 versus 382. Dr. Akey blames the population explosion of the past 5,000 years for this increase. Not only does a larger population allow more variants; it also implies less severe selection against mildly disadvantageous genes.

So we’re evolving as a species toward greater individual (rather than racial) genetic diversity. But this isn’t what most people mean when they ask if evolution has stopped. Mainly they seem to mean: “Has brain size stopped increasing?” For a process that takes millions of years, any answer about a particular instant in time is close to meaningless. Nonetheless, the short answer is probably “yes.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: The “Robot Evolution”. Courtesy of STRK3.[end-div]