Category Archives: Idea Soup

Back to the Future

Just over a hundred years ago, at the turn of the 20th century, Jean-Marc Côté and some of his fellow French artists were commissioned to imagine what the world would look like in 2000. Their colorful sketches and paintings portrayed some interesting inventions, though all seem grounded in familiar principles and incremental innovations — mechanical helpers, ubiquitous propellers and wings. Interestingly, none of these artist-futurists imagined a world beyond Victorian dress, gender inequality and wars. But these are gems nonetheless.

Some of their works found their way into cigar boxes and cigarette cases, others were exhibited at the 1900 World Exhibition in Paris. My three favorites: a Tailor of the Latest Fashion, the Aero-cab Station and the Whale Bus. See the full complement of these remarkable futuristic visions at the Public Domain Review, and check out the House Rolling Through the Countryside and At School.

I suspect our contemporary futurists — born in the late 20th or early 21st century — will fall prey to the same narrow visions when asked to sketch our planet in 3000. But despite the undoubted wealth of new gadgets and gizmos a thousand years from now, the real test will be whether their imagined worlds are at peace, with equality for all.
Images courtesy of the Public Domain Review, a project of the Open Knowledge Foundation. Public Domain.


Barbie the Surveillance Officer


There are any number of reasons that you, and your kids, may choose to steer clear of Barbie (the Mattel doll, that is). Detractors will point to a growing list of problems for which Barbie is to blame, including gender stereotyping, body image distortion, vacuum cleaner accidents with her fake hair, eating disorders and poor self-esteem. However, it may not have occurred to you that the latest incarnation of the doll — the interactive Hello Barbie — could also be spying on you and your family. Could the CIA, NSA or MI5 be keeping tabs on you through your kid’s doll? Creepy. And oh, she’s still far too thin.

From the Guardian:

Mattel’s latest Wi-Fi enabled Barbie doll can easily be hacked to turn it into a surveillance device for spying on children and listening into conversations without the owner’s knowledge.

The Hello Barbie doll is billed as the world’s first “interactive doll” capable of listening to a child and responding via voice, in a similar way to Apple’s Siri, Google’s Now and Microsoft’s Cortana.

It connects to the internet via Wi-Fi and has a microphone to record children and send that information off to third-parties for processing before responding with natural language responses.

But US security researcher Matt Jakubowski discovered that when connected to Wi-Fi the doll was vulnerable to hacking, allowing him easy access to the doll’s system information, account information, stored audio files and direct access to the microphone.

Jakubowski told NBC: “You can take that information and find out a person’s house or business. It’s just a matter of time until we are able to replace their servers with ours and have her say anything we want.”

Once Jakubowski took control of where the data was sent the snooping possibilities were apparent. The doll only listens in on a conversation when a button is pressed and the recorded audio is encrypted before being sent over the internet, but once a hacker has control of the doll the privacy features could be overridden.

It was the ease with which the doll was compromised that was most concerning. The information stored by the doll could allow hackers to take over a home Wi-Fi network and from there gain access to other internet connected devices, steal personal information and cause other problems for the owners, potentially without their knowledge.

Read the entire story here.

Image courtesy of Google Search.

Forget The Millennials — It’s Time For Generation K

Blame fickle social scientists. After the baby-boomers, the most researched generation has been that of the millennials — so-called due to their coming of age at the turn of the century. We know what millennials like to eat and drink, how they dress, their politics; we know about their proclivity to sharing, their need for meaning and fun at work; we know they need attention and constant feedback. In fact, we have learned so much — and perhaps so little — from the thousands of often-conflicting research studies of millennials that some researchers have decided to move on to new blood. Yes, it’s time to tap another rich vein of research material — Generation K. But I’ll stop after relating what the “K” means in Generation K, and let you form your own conclusions.

[tube]n-7K_OjsDCQ[/tube]

Generation K is named for Katniss, as in the Hunger Games‘ hero Katniss Everdeen. That’s right, if you were born between 1995 and 2002, according to economist Noreena Hertz, you are Gen-Katniss.

From the Guardian:

The brutal, bleak series that has captured the hearts of a generation will come to a brutal, bleak end in November when The Hunger Games: Mockingjay – Part 2 arrives in cinemas. It is the conclusion of the Hunger Games saga, which has immersed the young in a cleverly realised world of trauma, violence, mayhem and death.

For fans of Suzanne Collins’s trilogy about a young girl, Katniss Everdeen, forced to fight for survival in a country ruled by fear and fuelled by televised gladiatorial combat, this is the moment they have been waiting for.

Since the first book in the trilogy was published in 2008, Collins’s tale has sold more than 65 million copies in the US alone. The films, the first of which was released in 2012, have raked in more than $2bn worldwide at the box office and made a global star of their leading lady, Jennifer Lawrence, who plays the increasingly traumatised Katniss with a perfect mix of fury and resignation. For the huge appeal of The Hunger Games goes deeper than the fact that it’s an exciting tale well told. The generation who came to Katniss as young teens and have grown up ploughing through the books and queuing for the movies respond to her story in a particularly personal way.

As to why that might be, the economist and academic Noreena Hertz, who coined the term Generation K (after Katniss) for those born between 1995 and 2002, says that this is a generation riddled with anxiety, distrustful of traditional institutions from government to marriage, and, “like their heroine Katniss Everdeen, [imbued with] a strong sense of what is right and fair”.

“I think The Hunger Games resonates with them so much because they are Katniss navigating a dark and difficult world,” says Hertz, who interviewed 2,000 teenagers from the UK and the US about their hopes, fears and beliefs, concluding that today’s teens are shaped by three factors: technology, recession and coming of age in a time of great unease.

“This is a generation who grew up through 9/11, the Madrid bombings, the London bombings and Islamic State terrors. They see danger piped down their smartphones and beheadings on their Facebook page,” she says. “My data showed very clearly how anxious they are about everything from getting into debt or not getting a job, to wider issues such as climate change and war – 79% of those who took part in my survey worried about getting a job, 72% worried about debt, and you have to remember these are teenagers.

“In previous generations teenagers did not think in this way. Unlike the first-era millennials [who Hertz classes as those aged between 20 and 30] who grew up believing that the world was their oyster and ‘Yes we can’, this new generation knows the world is an unequal and harsh place.”

Writer and activist Laurie Penny, herself a first-era millennial at the age of 29, agrees. “I think what today’s young people have grasped that my generation didn’t get until our early 20s, is that adults don’t know everything,” she says. “They might be trying their best but they don’t always have your best interests at heart. The current generation really understands that – they’re more politically engaged and they have more sense of community because they’re able to find each other easily thanks to their use of technology.”

One of the primary appeals of the Hunger Games trilogy is its refusal to sugarcoat the scenarios Katniss finds herself in. In contrast to JK Rowling’s Harry Potter series, there are no reliable adult figures to dispense helpful advice and no one in authority she can truly trust (notably even the most likeable adult figures in the books tend to be flawed at best and fraudulent at worst). Even her friends may not always have her back, hard as they try – Dumbledore’s Army would probably find themselves taken out before they’d uttered a single counter-curse in the battlegrounds of Panem. At the end of the day, Katniss can only rely on one person, herself.

“Ultimately, the message of the Hunger Games is that everything’s not going to be OK,” says Penny. “One of the reasons Jennifer Lawrence is so good is because she lets you see that while Katniss is heroic, she’s also frightened all of the time. She spends the whole story being forced into situations she doesn’t want to be in. Kids respond because they can imagine what it’s like to be terrified but know that you have to carry on.”

It’s incontestable that we live in difficult times and that younger generations in particular may be more acutely aware that things aren’t improving any time soon, but is it a reach to say that fans of the Hunger Games are responding as much to the world around them as to the books?

Read the entire story here.

Video: The Hunger Games: Mockingjay Part 2 Official Trailer – “We March Together”. Courtesy of the Hunger Games franchise.

Perchance Art Thou Smitten by Dapper Hipsters? Verily Methinks

As the (mostly) unidirectional tide of cultural influence flows from the U.S. to the United Kingdom, the English mother tongue is becoming increasingly (and distressingly, I might add) populated by Americanisms: trash instead of rubbish, fries not chips, deplane instead of disembark, shopping cart instead of trolley, bangs rather than fringe, period instead of full stop. And there’s more: 24/7, heads-up, left-field, normalcy, a savings of, deliverable, the ask, winningest.

All, might I say, utterly cringeworthy.

Yet there may be a slight glimmer of hope, all courtesy of the hipster generation. Hipsters, you see, crave an authentic, artisanal experience — think goat cheese and bespoke hats — and that craving seems to extend to language. So, in 2015, compared with a mere decade earlier, you’re more likely to hear some of the following words, normally more attributable to an archaic, even Shakespearean, era:

perchance, mayhaps, parlor, amidst, amongst, whilst, unbeknownst, thou, thee, ere, hath

I’m all for it. My only hope now is that these words will flow against the tide and into the U.S. to repair some of the previous linguistic deforestation. Methinks I’ll put some of these to immediate, good use.

From the Independent:

Hipsters are famous for their love of all things old-fashioned: 19th Century beards, pickle-making, Amish outerwear, naming their kids Clementine or Atticus. Now, they may be excavating archaic language, too.

As Chi Luu points out at JSTOR Daily — the blog of a database of academic journals, what could be more hipster than that? — old-timey words like bespoke, peruse, smitten and dapper appear to be creeping back into the lexicon.

This data comes from Google’s Ngram viewer, which charts the frequencies of words appearing in printed sources between 1800 and 2012.

Google’s Ngram shows that lots of archaic words appear to be resurfacing — including gems like perchance, mayhaps and parlor.

The same trend is visible for words like amongst, amidst, whilst and unbeknownst, which are archaic forms of among, amid, while and unknown.
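What the Ngram viewer plots is simple in principle: for each year, a word's count divided by the total number of words printed that year. A toy sketch in Python (this is an illustration of the idea, not Google's actual corpus or API; the two-entry corpus below is invented):

```python
from collections import Counter

def relative_frequency(texts_by_year, word):
    """For each year, the target word's share of all tokens that year.

    texts_by_year: {year: "text of everything sampled from that year"}
    This mimics what an n-gram viewer plots for single words (1-grams).
    """
    freqs = {}
    for year, text in sorted(texts_by_year.items()):
        tokens = text.lower().split()
        counts = Counter(tokens)
        freqs[year] = counts[word] / len(tokens) if tokens else 0.0
    return freqs

# Invented two-year corpus showing "whilst" resurfacing.
corpus = {
    2005: "while we waited we read while others wrote",
    2015: "whilst we waited we read whilst amongst friends",
}
print(relative_frequency(corpus, "whilst"))  # {2005: 0.0, 2015: 0.25}
```

A rising line on the Ngram chart corresponds to exactly this ratio growing year over year across Google's scanned books.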

Read the story in its entirety here.

Image courtesy of Google’s Ngram viewer / Independent.

Your Job is Killing You


Many of us complain about the daily stresses of our jobs, our bosses, even our coworkers. We bemoan the morning commute and the work we increasingly bring home to finish in the evening. Many of us can be heard to say, “This job is killing me!” Metaphorically, of course.

Well, researchers at Stanford and Harvard now find that in some cases your job is actually, quite literally, killing you. This may seem self-evident, but the data show that workers with less education are significantly more likely to be employed in jobs that are more stressful and dangerous, and have less healthy workplace practices. This, in turn, leads to a significantly shorter average life span than that of workers with higher educational attainment. The researchers measured typical employment-related stressors such as unemployment, layoffs, absence of employer-subsidized health insurance, shift work, long working hours, job insecurity and work-family conflict. The less education a worker has, the more likely he or she is to suffer a greater burden from one or more of these stressors.

Looks like we’re gradually reverting to well-tested principles of Victorian worker exploitation. Check out more details from the study here.

From the Washington Post:

People often like to groan about how their job is “killing” them. Tragically, for some groups of people in the U.S., that statement appears to be true.

A new study by researchers at Harvard and Stanford has quantified just how much a stressful workplace may be shaving off of Americans’ life spans. It suggests that the amount of life lost to stress varies significantly for people of different races, educational levels and genders, and ranges up to nearly three years of life lost for some groups.

Past research has shown an incredible variation in life expectancy around the United States, depending on who you are and where you live. Mapping life expectancy around the nation by both county of residence and race, you can see that people in some parts of the U.S. live as many as 33 years longer on average than people in other parts of the country, the researchers say.

Those gaps appear to be getting worse, as the wealthy extend their life spans and other groups are stagnant. One study found that men and women with fewer than 12 years of education had life expectancies that were still on par with most adults in the 1950s and 1960s — suggesting the economic gains of the last few decades have gone mostly to more educated people. The financial crisis and subsequent recession, which put many people in economic jeopardy, may have worsened this effect.

There are lots of reasons that people with lower incomes and educations tend to have lower life expectancies: differences in access to health care, in exposure to air and water pollution, in nutrition and health care early in life, and in behaviors, such as smoking, exercise and diet. Past research has also shown that job insecurity, long hours, heavy demands at work and other stresses can also cut down on a worker’s life expectancy by taking a heavy toll on a worker’s health. (If you work in an office, here are some exercises you might try to prevent this.)

But researchers say this is the first study to look at the ways that a workplace’s influence on life expectancy specifically break down by racial and educational lines.

To do their analysis, they divided people into 18 different groups by race, education and sex. They then looked at 10 different workplace factors — including unemployment and layoffs, the absence of health insurance, shift work, long working hours, job insecurity and work-family conflict — and estimated the effect that each would have on annual mortality and life expectancy.

The data show that people with less education are much more likely to end up in jobs with more unhealthy workplace practices that cut down on one’s life span. People with the highest educational attainment were less affected by workplace stress than people with the least education, the study says.

Read the entire story here.

Image: Women mealtime at St Pancras workhouse, London. Courtesy: Peter Higginbothom. Licensed under Public Domain via Commons.

The Vicious Cycle of Stuff


Many of us in the West, and now increasingly in developing nations, are the guilty perpetrators of the seemingly never-ending cycle of consumption and accumulation. Yet for all the talk of sustainability, down-sizing, and responsible consumption we continue to gather, hoard and surround ourselves with more and more stuff.

From the Guardian:

The personal storage industry rakes in $22bn each year, and it’s only getting bigger. Why?

I’ll give you a hint: it’s not because vast nations of hoarders have finally decided to get their acts together and clean out the hall closet.

It’s also not because we’re short on space. In 1950 the average size of a home in the US was 983 square feet. Compare that to 2011, when American houses ballooned to an average size of 2,480 square feet – almost triple the size.

And finally, it’s not because of our growing families. This will no doubt come as a great relief to our helpful commenters who each week kindly suggest that for maximum environmental impact we simply stop procreating altogether: family sizes in the western world are steadily shrinking, from an average of 3.37 people in 1950 to just 2.6 today.

So, if our houses have tripled in size while the number of people living in them has shrunk, what, exactly, are we doing with all of this extra space? And why the billions of dollars tossed to an industry that was virtually nonexistent a generation or two ago?

Well, friends, it’s because of our stuff. What kind of stuff? Who cares! Whatever fits! Furniture, clothing, children’s toys (for those not fans of deprivation, that is), games, kitchen gadgets and darling tchotchkes that don’t do anything but take up space and look pretty for a season or two before being replaced by other, newer things – equally pretty and equally useless.

The simple truth is this: you can read all the books and buy all the cute cubbies and baskets and chalkboard labels, even master the life-changing magic of cleaning up – but if you have more stuff than you do space to easily store it, your life will be spent a slave to your possessions.

We shop because we’re bored, anxious, depressed or angry, and we make the mistake of buying material goods and thinking they are treats which will fill the hole, soothe the wound, make us feel better. The problem is, they’re not treats, they’re responsibilities and what we own very quickly begins to own us.

The second you open your wallet to buy something, it costs you – and in more ways than you might think. Yes, of course there’s the price tag and the corresponding amount of time it took you to earn that amount of money, but possessions also cost you space in your home and time spent cleaning and maintaining them. And as the token environmentalist in the room, I’d be remiss if I didn’t remind you that when you buy something, you’re also taking on the task of disposing of it (responsibly or not) when you’re done with it. Our addiction to consumption is a vicious one, and it’s stressing us out.

I know this because I’ve experienced it, having lived in everything from a four-bedroom house to my current one-bedroom flat I share with my daughter – but I’m also bringing some cold, hard science to the table.

A study published by UCLA showed that women’s stress hormones peaked during the times they were dealing with their possessions and material goods. Anyone who parks on the street because they can’t fit their car into the garage, or has stared down a crammed closet, can relate.

Our addiction to consuming is a vicious one, and it’s having a markedly negative impact on virtually every aspect of our lives.

Read the entire story here.

Image courtesy of Google Search.

Time for the Bucket List to Kick the Bucket

For the same reasons that New Year’s resolutions are daft, it’s time to ditch the bucket list. Columnist Steven Thrasher rightly argues that your actions to get something done or try something new should be driven by your gusto for life — passion, curiosity, wonder, joy — rather than dictated by a check box because you’re one step closer to death. Signs that it’s time to ditch the bucket list: when the idea is co-opted by corporations, advertisers and Hollywood; when motivational posters appear in hallways; and when physical bucket list buckets and notepads go on sale at Pottery Barn or Walmart.

From the Guardian:

Before each one of us dies, let’s wipe the “bucket list” from our collective vocabulary.

I hate the term “the bucket list.” The phrase, a list of things one wants to do in life before one dies or “kicks the bucket”, is the kind of hackneyed, cliche, stupid and insipid term only we Americans can come up with.

Even worse, “the bucket list” has become an excuse for people to couch things they actually desire to try doing as only socially acceptable if framed in the face of their death. It’s as if pleasure, curiosity and fun weren’t reasons enough for action.

If you want to try doing something others might find strange or unorthodox – write a novel, learn to tap dance, engage in a rim job, field dress a deer, climb Everest, go out in drag for a night – why do you need any justification at all? And certainly, why would you need an explanation that is only justifiable in terms of kicking the bucket?

According to the Wall Street Journal, the phrase “bucket list” comes to us from the banal mind of screenwriter Justin Zackham, who developed a list of things he wanted to do before he died. Years later, his “bucket list” became the title of his corny 2007 film starring Jack Nicholson and Morgan Freeman. It’s about two old men with terminal cancer who want to live it up before they die. That, if anyone at all, is who should be using the term “bucket list”. They want to do something with the finite time they know they have left? Fine.

But bucket list has trickled down to everyday use by the perfectly healthy, the exceptionally young, and most of all, to douche bags. I realized this at Burning Man last week. Often, when I asked exceptionally boring people what had drawn them to Black Rock City, they’d say: “It was on my bucket list!”

Really? You wanted to schlep out to the desert and face freezing lows, scorching highs and soul crushing techno simply because you’re going to die someday?

There’s a funny dynamic sometimes when I go on a long trip while I’m out of work. When I backpacked through Asia and Europe in 2013, people (usually friends chained to a spouse, children and a mortgage) would sometimes awkwardly say to me: “Well, it will be the trip of a lifetime!” It was a good trip, but just one of many great journeys I’ve taken in my life so far. My adventures might interrupt someone else’s idea of what’s “normal.” But travel isn’t something I do to fulfil my “bucket list”; travel is a way of life for me. I do not rush into a trip thinking: “Good Christ, I could die tomorrow!” I don’t travel in place of the stable job or partner or kids I may or may not ever have. I do it as often as I can because it brings me joy.

Read the entire column here.

The 75 Percent Versus 1 Percent

Stop the presses! Hold your horses! There seems to be some hope for humanity after all — and I was just about to seek a misanthropic-approved cave in which to hide.

A recent study by Common Cause shows that three-quarters of the one thousand people surveyed identify more closely with unselfish values (altruism, forgiveness, honesty) than selfish ones (money, fame, power). But, as George Monbiot points out, those in the 1 percent who run the globe tend to be the selfish ones. He’s also quite right to propose that we’d all be better served if the media apparatchiks who fawn upon the 1 percent spent more time delving into the stories of those who give, rather than take.

From the Guardian:

Do you find yourself thrashing against the tide of human indifference and selfishness? Are you oppressed by the sense that while you care, others don’t? That, because of humankind’s callousness, civilisation and the rest of life on Earth are basically stuffed? If so, you are not alone. But neither are you right.

A study by the Common Cause Foundation, due to be published next month, reveals two transformative findings. The first is that a large majority of the 1,000 people they surveyed – 74% – identifies more strongly with unselfish values than with selfish values. This means that they are more interested in helpfulness, honesty, forgiveness and justice than in money, fame, status and power. The second is that a similar majority – 78% – believes others to be more selfish than they really are. In other words, we have made a terrible mistake about other people’s minds.

The revelation that humanity’s dominant characteristic is, er, humanity will come as no surprise to those who have followed recent developments in behavioural and social sciences. People, these findings suggest, are basically and inherently nice.

A review article in the journal Frontiers in Psychology points out that our behaviour towards unrelated members of our species is “spectacularly unusual when compared to other animals”. While chimpanzees might share food with members of their own group, though usually only after being plagued by aggressive begging, they tend to react violently towards strangers. Chimpanzees, the authors note, behave more like the homo economicus of neoliberal mythology than people do.

Humans, by contrast, are ultrasocial: possessed of an enhanced capacity for empathy, an unparalleled sensitivity to the needs of others, a unique level of concern about their welfare, and an ability to create moral norms that generalise and enforce these tendencies.

Such traits emerge so early in our lives that they appear to be innate. In other words, it seems that we have evolved to be this way. By the age of 14 months, children begin to help each other, for example by handing over objects another child can’t reach. By the time they are two, they start sharing things they value. By the age of three, they start to protest against other people’s violation of moral norms.

A fascinating paper in the journal Infancy reveals that reward has nothing to do with it. Three- to five-year-olds are less likely to help someone a second time if they have been rewarded for doing it the first time. In other words, extrinsic rewards appear to undermine the intrinsic desire to help. (Parents, economists and government ministers, please note.) The study also discovered that children of this age are more inclined to help people if they perceive them to be suffering, and that they want to see someone helped whether or not they do it themselves. This suggests that they are motivated by a genuine concern for other people’s welfare, rather than by a desire to look good.

Why? How would the hard logic of evolution produce such outcomes? This is the subject of heated debate. One school of thought contends that altruism is a logical response to living in small groups of closely related people, and evolution has failed to catch up with the fact that we now live in large groups, mostly composed of strangers.

Another argues that large groups containing high numbers of altruists will outcompete large groups which contain high numbers of selfish people. A third hypothesis insists that a tendency towards collaboration enhances your own survival, regardless of the group in which you might find yourself. Whatever the mechanism might be, the outcome should be a cause of celebration.

So why do we retain such a dim view of human nature? Partly, perhaps, for historical reasons. Philosophers from Hobbes to Rousseau, Malthus to Schopenhauer, whose understanding of human evolution was limited to the Book of Genesis, produced persuasive, influential and catastrophically mistaken accounts of “the state of nature” (our innate, ancestral characteristics). Their speculations on this subject should long ago have been parked on a high shelf marked “historical curiosities”. But somehow they still seem to exert a grip on our minds.

Another problem is that – almost by definition – many of those who dominate public life have a peculiar fixation on fame, money and power. Their extreme self-centredness places them in a small minority, but, because we see them everywhere, we assume that they are representative of humanity.

The media worships wealth and power, and sometimes launches furious attacks on people who behave altruistically. In the Daily Mail last month, Richard Littlejohn described Yvette Cooper’s decision to open her home to refugees as proof that “noisy emoting has replaced quiet intelligence” (quiet intelligence being one of his defining qualities). “It’s all about political opportunism and humanitarian posturing,” he theorised, before boasting that he doesn’t “give a damn” about the suffering of people fleeing Syria. I note with interest the platform given to people who speak and write as if they are psychopaths.

Read the entire story here.

Wot! Proper Grammar?

It seems that there are several ways to turn off a potential dating connection online: a picture of your bad teeth, tales of your poor hygiene, political posturing, and now, a poorly written profile or introductory email. Is our children learning?

Seriously, can it be that the younger generation is finally rebelling against the tyranny of lowercase Twitteresque, incorrect punctuation, nonsensical grammar, fatuous emoticons and facile abbreviations? If so, this is wonderful news for those who care about our language. Now, perhaps, these same people can turn their talents to educating the barely literate generations holding jobs in corporate America. After decades of subservience to fractured PowerPoint haiku, many can no longer string together a coherent paragraph.

From the WSJ:

When Jeff Cohen was getting ready to meet his OkCupid date for drinks in Manhattan, he started to have second thoughts as he reread the glaring grammatical error in her last message: “I will see you their.”

The date flopped for a couple of reasons, but bad grammar bothers Mr. Cohen. Learning a potential mate doesn’t know the difference between “there,” “they’re” and “their” is like discovering she loves cats, he says. Mr. Cohen is allergic to cats. “It’s like learning I’m going to sneeze every time I see her,” he says.

With crimes against grammar rising in the age of social media, some people are beginning to take action. The online dating world is a prime battleground.

Mr. Cohen joins a number of singles picky about the grammar gaffes they’re seeing on dating sites. For love, these folks say written communications matter, from the correct use of semicolons, to understanding the difference between its and it’s, and sentences built on proper parallel construction.

“Grammar snobbery is one of the last permissible prejudices,” says John McWhorter, a linguistics professor at Columbia University. “The energy that used to go into open classism and racism now goes into disparaging people’s grammar.”

Mr. Cohen now uses an app that ranks the message quality of prospective dates. Called the Grade, the app checks messages for typos and grammar errors and assigns each user a letter grade from A+ to F.

The Grade demotes people whose messages contain certain abbreviations, like “wassup” and “YOLO,” short for “You Only Live Once,” popular among young people who want to justify doing something risky or indulgent. Clifford Lerner, chief executive of SNAP Interactive Inc., the company that makes the Grade, says the app downgrades these types of phrases in an effort to promote “meaningful conversations.”
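The Grade's actual scoring is proprietary, but the general approach it describes, penalizing flagged abbreviations and mapping the result to a letter grade, can be sketched in a few lines. Everything here is invented for illustration (the flagged-word list, the thresholds, the function name); it is not the app's real algorithm:

```python
# Hypothetical message grader in the spirit of the Grade app.
# Flagged words and grade cutoffs are made up for this sketch.
FLAGGED = {"wassup", "yolo", "u", "ur", "thx"}

def grade_message(message):
    # Crude tokenization: lowercase, strip commas and periods.
    words = message.lower().replace(",", " ").replace(".", " ").split()
    if not words:
        return "F"
    # Score is the share of words that are not flagged slang.
    penalties = sum(1 for w in words if w in FLAGGED)
    score = max(0.0, 1.0 - penalties / len(words))
    for cutoff, letter in [(0.97, "A"), (0.9, "B"), (0.8, "C"), (0.7, "D")]:
        if score >= cutoff:
            return letter
    return "F"

print(grade_message("Would you like to meet for coffee this weekend?"))  # "A"
print(grade_message("wassup, u free? YOLO"))  # "F"
```

A real grader would also need a spell-checker and a grammar model; the point of the sketch is only the shape of the pipeline: tokenize, penalize, bucket into letters.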

Dating site Match asked more than 5,000 singles in the U.S. what criteria they used most in assessing dates. Beyond personal hygiene—which 96% of women valued most, as compared with 91% of men—singles said they judged a date foremost by the person’s grammar. The survey found 88% of women and 75% of men said they cared about grammar most, putting it ahead of a person’s confidence and teeth.

“When you get a message that is grammatically correct and has a voice and is put together, it is very attractive, it definitely adds hotness points,” says New Yorker Grace Gold. “People who send me text-type messages, and horrific grammatical errors? I just delete them.” She recalls the red flag raised by one potential suitor who had written his entire dating profile in lowercase.

Language has always played a part in how people judge others, but it has become amplified in recent years with increasing informal and colloquial usage, says Ben Zimmer, a lexicographer and chair of the New Words Committee of the American Dialect Society.

Read the entire story here.

PhotoMash: Two Types of Radical

Photomash-Radical-1-vs-Radical-2Meet two faces of radicalism: one is the face of radical Islam; the second is the face of radical nationalism. Different, but similar, and both morally bankrupt.

Both have ideas that resonate with a very limited few (luckily for the rest of us); both inflame our discourse; both fuel hatred, distrust and intolerance; both project fear, racism, xenophobia and misogyny. Welcome to the new faces of fascism.

As a Londoner recently said of an attacker (reportedly belonging to the first type of radical group): #YouAintNoMuslimBruv.

I’d suggest to our second radical: #YouAintNoAmericanBro.

Both of these nightmarish visions seek a place on the world stage — both should and will rightly fail.

Image courtesy of the Washington Post, December 7, 2015.

PhotoMash: Two Kinds of Monster, One Real

I couldn’t resist this week’s photo mash-up. This one comes courtesy of the Guardian on December 3, 2015. It features two types of monster very aptly placed alongside each other by a kindly newspaper editor.

Photomash-Trump-vs-Monsters

The first monster happens to want to be President of the United States. He seems to be a racist, misogynist and raving bigot, and unfortunately (for some), he’s very real. The second is the story of photographic artist Flora Borsi. She’s tired of perfect models with perfect hair in perfect fashion photographs. So she retouches, or in her words “detouches”, the images into her “little monsters”. These are not real.

Our real world can be rather surreal.

Images courtesy of the Guardian.

The US and the UK: A Stark Difference

Terrorism-US-3Dec2015Within the space of a few days we’ve witnessed two more acts of atrocious violence and murder. One in San Bernardino, California, the other in London, England.

In California, 14 innocent people lost their lives and, by some accounts, 21 were injured. And, of course, many hundreds of police officers and first responders put their lives at risk in searching for and confronting the murderers.

In London, 3 people were injured, one seriously, by an attacker on the London Underground (subway).Terrorism-UK-6Dec2015

 

Label these attacks acts of terrorism, or acts of deranged minds. But whether driven by warped ideologies or mental health issues, the murder and violence in California and London show one very stark difference.

Guns. Lots of guns.

The attackers in California were armed to the teeth: handguns, semi-automatic weapons and thousands of rounds of ammunition. The attacker in London was wielding a knife. You see, terrorism, violent radicalism and mental health problems exist — to much the same extent — in both the US and UK (and across the globe for that matter). But more often than not the outcome will be rather different — that is, more bloody and deadly — in the US because of access to weapons that conveniently facilitate mass murder.

And, sadly, until a significant proportion of the US population comes to terms with this fact, rather than hiding behind a distorted interpretation of the 2nd Amendment, the carnage and mass murder in the US will continue.

 

Monarchy: Bad. Corporations and Oligarchs: Good

Google-search-GOP-candidates

The Founders of the United States had an inkling that federated democracy could not belong to all the people — hence they inserted the Electoral College. Yet they tried hard to design a system that improved upon the unjust corruption of hereditary power. But while they understood the dangers of autocratic British monarchy, they utterly failed to anticipate the role of corporations and vast sums of money in delivering much the same experience a couple of centuries later.

Ironically enough, all of Europe’s monarchies have given way to parliamentary democracies which are less likely to be ruled or controlled through financial puppeteering. In the United States, on the other hand, the once shining beacon of democracy is firmly in the grip of corporations, political action committees (PACs) and a handful of oligarchs awash in money, and lots of it. They control the discourse. They filter the news. They vet and anoint candidates; and destroy their foes. They shape and make policy. They lobby and “pay” lawmakers. They buy and aggregate votes. They now define and run the system.

But, of course, our corporations and billionaires are not hereditary aristocrats — they’re ordinary people with our interests at heart — according to the U.S. Supreme Court. So, all must be perfect and good, especially for those who subscribe to the constructionist view of the US Constitution.

From the Guardian:

To watch American politics today is to watch money speaking. The 2016 US elections will almost certainly be the most expensive in recent history, with total campaign expenditure exceeding the estimated $7bn (£4.6bn) splurged on the 2012 presidential and congressional contests. Donald Trump is at once the personification of this and the exception that proves the rule because – as he keeps trumpeting – at least it’s his own money. Everyone else depends on other people’s, most of it now channelled through outside groups such as “Super PACs” – political action committees – which are allowed to raise unlimited amounts from individuals and corporations.

The sums involved dwarf those in any other mature democracy. Already, during the first half of 2015, $400m has been raised, although the elections are not till next autumn. Spending on television advertising is currently projected to reach $4.4bn over the whole campaign. For comparison, all candidates and parties in Britain’s 2010 election spent less than £46m. In Canada’s recent general election the law allowed parties to lay out a maximum of about C$25m (£12.5m) for the first 37 days of an election campaign, plus an extra C$685,185 (to be precise) for each subsequent day.
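The Canadian cap quoted above is, in effect, a simple linear formula: a base allowance covering the first 37 days, plus C$685,185 for each additional day. A rough worked example (the 80-day campaign length below is hypothetical, chosen only for illustration):

```python
# The Canadian spending rule described above, expressed as a formula:
# ~C$25m covers the first 37 days of a campaign, plus C$685,185
# for each subsequent day. Figures as quoted in the article.

BASE_CAP_CAD = 25_000_000  # approximate base for a 37-day campaign
PER_DAY_CAD = 685_185      # allowance per day beyond the first 37
BASE_DAYS = 37

def spending_cap(campaign_days: int) -> int:
    """Maximum party spend (CAD) for a campaign of the given length."""
    extra_days = max(0, campaign_days - BASE_DAYS)
    return BASE_CAP_CAD + PER_DAY_CAD * extra_days

print(spending_cap(80))  # a hypothetical 80-day campaign
```

Even for a long campaign, the resulting cap is on the order of C$50m — a rounding error next to the $4.4bn projected for US television advertising alone.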

Rejecting a challenge to such campaign finance regulation back in 2004, the Canadian supreme court argued that “individuals should have an equal opportunity to participate in the electoral process”, and that “wealth is the main obstacle to equal participation”. “Where those having access to the most resources monopolise the election discourse,” it explained, “their opponents will be deprived of a reasonable opportunity to speak and be heard.”

The US supreme court has taken a very different view. In its 2010 Citizens United judgment it said, in effect, that money has a right to speak. Specifically, it affirmed that a “prohibition on corporate independent expenditures is … a ban on speech”. As the legal scholar Robert Post writes, in a persuasive demolition of the court’s reasoning, “this passage flatly equates the first amendment rights of ordinary commercial corporations with those of natural persons”. (Or, as the former presidential candidate Mitt Romney put it in response to a heckler: “Corporations are people, my friend.”)

In a book entitled Citizens Divided, Post demonstrates how the Citizens United judgment misunderstands the spirit and deeper purpose of the first amendment: for people to be best equipped to govern themselves they need not just the freedom of political speech, but also the “representative integrity” of the electoral process.

Of course, an outsize role for money in US politics is nothing new. Henry George, one of the most popular political economists of his day, wrote in 1883 that “popular government must be a sham and a fraud” so long as “elections are to be gained by the use of money, and cannot be gained without it”. Whether today’s elections are so easily to be gained by the use of money is doubtful, when so much of it is sloshing about behind so many candidates, but does anyone doubt the “cannot be gained without it”?

Money may have been shaping US politics for some time, but what is new is the scale and unconstrained character of the spending, since the 2010 Citizens United decision and the Super PACs that it (and a subsequent case in a lower court) enabled. Figures from the Center for Responsive Politics show outside spending in presidential campaign years rising significantly in 2004 and 2008 but then nearly trebling in 2012 – and, current trends suggest, we ain’t seen nothing yet.

The American political historian Doris Kearns Goodwin argues that the proliferation of Republican presidential candidates, so many that they won’t even fit on the stage for one television debate, is at least partly a result of the ease with which wealthy individuals and businesses can take a punt on their own man – or Carly Fiorina. A New York Times analysis found that around 130 families and their businesses accounted for more than half the money raised by Republican candidates and their Super PACs up to the middle of this year. (Things aren’t much better on the Democrat side.) And Goodwin urges her fellow citizens to “fight for an amendment to undo Citizens United”.

The Harvard law professor and internet guru Larry Lessig has gone a step further, himself standing for president on the single issue of cleaning up US politics, with a draft citizen equality act covering voter registration, gerrymandering, changing the voting system and reforming campaign finance. That modest goal achieved, he will resign and hand over the reins to his vice-president. Earlier this year he said he would proceed if he managed to crowdfund more than $1m, which he has done. Not peanuts for you or me, but Jeb Bush’s Super PAC, Right to Rise, is planning to spend $37m on television ads before the end of February next year. So one of the problems of the campaign for campaign finance reform is … how to finance its campaign.

Read the entire story here.

Image courtesy of Google Search.

H2O and IQ

There is great irony in NASA’s recent discovery of water flowing on Mars.

First, that the gift of our intelligence allows us to make such amazing findings on other worlds while we use the same brain cells to enable the rape and pillage of our own.

CADrought-LakeOroville

Second, the meager seasonal trickles of liquid on the Martian surface show us a dire possible future for our own planet.

Mars-Recurring-Slope-Lineae

From the Guardian:

Evidence for flowing water on Mars: this opens up the possibility of life, of wonders we cannot begin to imagine. Its discovery is an astonishing achievement. Meanwhile, Martian scientists continue their search for intelligent life on Earth.

We may be captivated by the thought of organisms on another planet, but we seem to have lost interest in our own. The Oxford Junior Dictionary has been excising the waymarks of the living world. Adders, blackberries, bluebells, conkers, holly, magpies, minnows, otters, primroses, thrushes, weasels and wrens are now surplus to requirements.

In the past four decades, the world has lost 50% of its vertebrate wildlife. But across the latter half of this period, there has been a steep decline in media coverage. In 2014, according to a study at Cardiff University, there were as many news stories broadcast by the BBC and ITV about Madeleine McCann (who went missing in 2007) as there were about the entire range of environmental issues.

Think of what would change if we valued terrestrial water as much as we value the possibility of water on Mars. Only 3% of the water on this planet is fresh; and of that, two-thirds is frozen. Yet we lay waste to the accessible portion. Sixty per cent of the water used in farming is needlessly piddled away by careless irrigation. Rivers, lakes and aquifers are sucked dry, while what remains is often so contaminated that it threatens the lives of those who drink it. In the UK, domestic demand is such that the upper reaches of many rivers disappear during the summer. Yet still we install clunky old toilets and showers that gush like waterfalls.

As for salty water, of the kind that so enthrals us when apparently detected on Mars, on Earth we express our appreciation with a frenzy of destruction. A new report suggests fish numbers have halved since 1970. Pacific bluefin tuna, which once roamed the seas in untold millions, have been reduced to an estimated 40,000, yet still they are pursued. Coral reefs are under such pressure that most could be gone by 2050. And in our own deep space, our desire for exotic fish rips through a world scarcely better known to us than the red planet’s surface. Trawlers are now working at depths of 2,000 metres. We can only guess at what they could be destroying.

A few hours before the Martian discovery was announced, Shell terminated its Arctic oil prospecting in the Chukchi Sea. For the company’s shareholders, it’s a minor disaster: the loss of $4bn; for those who love the planet and the life it sustains, it is a stroke of great fortune. It happened only because the company failed to find sufficient reserves. Had Shell succeeded, it would have exposed one of the most vulnerable places on Earth to spills, which are almost inevitable where containment is almost impossible. Are we to leave such matters to chance?

At the beginning of September, two weeks after he granted Shell permission to drill in the Chukchi Sea, Barack Obama travelled to Alaska to warn Americans about the devastating effects that climate change caused by the burning of fossil fuels could catalyse in the Arctic. “It’s not enough just to talk the talk”, he told them. “We’ve got to walk the walk.” We should “embrace the human ingenuity that can do something about it”. Human ingenuity is on abundant display at Nasa, which released those astounding images. But not when it comes to policy.

Let the market decide: this is the way in which governments seek to resolve planetary destruction. Leave it to the conscience of consumers, while that conscience is muted and confused by advertising and corporate lies. In a near-vacuum of information, we are each left to decide what we should take from other species and other people, what we should allocate to ourselves or leave to succeeding generations. Surely there are some resources and some places – such as the Arctic and the deep sea – whose exploitation should simply stop?

Read the entire article here.

Images: Lake Oroville, California, Earth, courtesy of U.S. Drought Portal. Recurring slope lineae, Mars, courtesy of NASA/JPL.

Green Friday

South Arapahoe Peak

To my US readers… Happy Thanksgiving. By this time you will no doubt have been bombarded by countless commercials, online ads, billboards, flyers and messages to your inbox, social media accounts, etc., espousing the wonders of the so-called Black Friday shopping orgy.

My advice: boycott the shopping mall and the stores — both online and brick-and-mortar — go outside, breathe some fresh air, and join Green Friday. It’s infinitely better for the heart and the soul (and your bank account). My home state of Colorado has jumped on the bandwagon this year by opening up all state parks for free on Fresh Air Friday.

Image: South Arapahoe Peak, looking East, Indian Peaks Wilderness, Colorado. Courtesy of the author.

 

Crony Capitalism Rules

The self-righteous preachers on all sides of the political aisle in the U.S. are constantly decrying corruption across the globe; one day the target may be a central African nation, the next it’s China, then a country in Latin America. Of course, this wouldn’t be so ****ing hypocritical if those in positions of power opened their eyes — and closed their wallets — to the rampant cash-fueled cronyism in their own backyards.

[tube]ETxmCCsMoD0[/tube]

The threat to this democracy from those with hoards of money is greater than any real or imagined hostility from terrorism. Money greases and fuels the well-oiled machine in Washington, D.C.; it catalyses those who peddle influence; it brokers power and it curries favor. The influence of money is insidious and pervasive, and it is eating away at the promise of democracy for all.

Our politicians pay homage to the bundlers; they crave endorsement from the millionaires; and, increasingly, they need anointment from the billionaires. And Rome burns. Then, when our so-called representatives have had their turn in the public limelight and in power, they retreat to the shadows, where as lobbyists and brokers they wield even greater power for the moneyed few. And Rome continues to burn.

So you know things must be rather dire if even huge swathes of capitalist corporate America want some form of significant campaign finance reform. You can read for yourself what the Committee for Economic Development of the Conference Board has to say in its scathing report, Crony Capitalism: Unhealthy Relations Between Business and Government.

From the Guardian:

Political corruption is eating our democracy out from the inside. Most Americans know that. But democratic and economic health can’t be easily disentangled. As it diminishes our public sphere and drowns out the myriad of citizen voices, it also sucks the energy and vitality from our economy. This causes pain to business owners.

According to a recent report from the Committee for Economic Development, an old, white-shoe non-partisan organization that came out of the aftermath of World War II (and was a booster for the Marshall Plan), the United States economy is increasingly represented by crony capitalism, not competitive capitalism.

Lobbyists and privately funded elections have, according to the CED: “exerted an important toll on the US economy”. They propose banning registered lobbyists from raising money for federal candidates and officeholders, and implementing strict revolving door policies.

Crony capitalism, the report details, leads to “rent-seeking through subsidies or taxes that benefit vested interests at the expense of others, rather than the pursuit of profit through socially and economically productive behavior”.

What is most striking about the report is who is behind it. The CEO of CED is former Romney supporter Steve Odland. A former top lobbyist for PepsiCo, a Republican called Larry Thompson – someone I never thought I’d agree with – is endorsing the single most important structural reform in America: publicly financed elections.

Thompson is the Co-Chair of CED’s Sustainable Capitalism Subcommittee, a driver in the release of the report. Paul Atkins, another member of the CED board (and the sustainable capitalism subcommittee) was a Bush-appointed SEC Commissioner who opposed rules constraining hedge funds.

“Campaign finance reform could free elected officials from their dependence on private campaign funding. Such funding is seen as an important reason why elected officials might bend their views on policy issues away from the public interest” the report said.

I disagree with a big part of the report. I don’t think we should reduce the corporate tax rate. But the crony capitalism argument is right on point, and the most striking thing about the report is its full-throated endorsement of a public financing model. And, the report persuasively shows how our current model reduces competitiveness of the economy “by favoring insiders over outsiders” and “continues to sap vitality” out of our economic life.

We haven’t always had this problem. Until the 1980s, candidates spent a fraction of their time talking to donors; just a few weeks a year, a little more right before an election. True, they’d fund raise from the wealthy interests, as they do now, but it was a minuscule part of their job: policy and constituent services were the heart of the work.

Read the entire story here.

Video: Money, money, money. ABBA. Courtesy of AbbaEVEO.

Social Media Lice

google-search-group-selfie

We know that social media helps us stay superficially connected to others. We also know many of the drawbacks — an over-inflated and skewed sense of self; poor understanding and reduced thoughtfulness; neurotic fear of missing out (FOMO); public shaming, online bullying and trolling.

But now we hear that one of the key foundations of social media — the taking and sharing of selfies — has more serious consequences. Social media has caused an explosion in head lice, especially among teenagers, particularly girls. Call it: social media head lice syndrome. While this may cause you to scratch your head, whether in disbelief or for psychosomatic reasons, the mechanics of the outbreak are rather obvious. It goes like this: a group of teens needs a quick selfie fix; the teens crowd around the smartphone and pose; they lean in, heads together; head lice crawl from one scalp to the next.

From the Independent:

Selfies have sparked an explosion in the number of head lice cases among teenagers, a group of US paediatricians has warned.

The group said there is a growing trend of “social media lice” where lice spread when teenagers cram their heads together to take a selfie.

Lice cannot jump so they are less common in older children who do not tend to swap hats or headgear.

A Wisconsin paediatrician, Dr Sharon Rink, told local news channel WBAY2 she has seen a surge of teenagers coming to see her for treatment, something which was unheard of five years ago.

Dr Rink said: “People are doing selfies like every day, as opposed to going to photo booths years and years ago.

“So you’re probably having much more contact with other people’s heads.

“If you have an extremely itchy scalp and you’re a teenager, you might want to get checked out for lice instead of chalking it up to dandruff.”

In its official online guide to preventing the spread of head lice, the Center for Disease Control recommends avoiding head-to-head contact where possible and suggests girls are more likely to get the parasite than boys because they tend to have “more frequent head-to-head contact”.

Read (and scratch) more here.

Image courtesy of Google Search.

 

Corporate Grief

amazon-france-bannerFollowing the recent horrendous mass murders in Mali, Paris, and Lebanon (and elsewhere), there has been a visible outpouring of grief on a worldwide scale. Many of us, while removed from direct involvement and having no direct connection to the victims and their families, still feel sadness, pain and loss.

The empathy expressed by strangers or distant acquaintances for those even remotely connected with the violence is tangible and genuine. For instance, we foreigners may seek out a long-lost French colleague to express our concern and condolences for Mali / France and all Bamakoans / Parisians. There is genuine concern and sense of connection, at a personal level, however frail that connection may be.

But what is going on when Amazon, Apple, eBay, Uber and other corporations wave their digital banners of solidarity — expressing grief — on the home pages of their websites?

Jessica Reed over at the Guardian makes some interesting observations. She is absolutely right to admonish those businesses that would seek to profit from such barbaric acts. In fact, we should boycott any found to be doing so. Some are taking real and positive action, such as enabling free communication or providing free transportation and products. However, she is also correct to warn us of the growing, insidious tendency to anthropomorphize and project sentience onto corporations.

Brands and companies increasingly love us, they sympathize with us, and now they grieve with us. But there is a vast difference between being hugged in sympathy by the boss of your local deli and the faceless, impersonal digital flag-waving of a dotcom home page.

Who knows where this will lead us decades from now: perhaps if there is money to be made, big corporations will terrorize us as well.

From Jessica Reed:

The pain is shared by all of us, but a golden rule should apply: don’t capitalise on grief, don’t profit from it. Perhaps this is why big companies imposing their sympathy on the rest of us leaves a bitter taste in my mouth: it is hard for me to see these gestures as anything but profiteering.

Companies are now posing as entities capable of compassion, never mind that they cannot possibly speak for all of their employees. This also brings us a step closer to endowing them with a human trait: the capacity to express emotions. They think they’re sentient.

If this sounds crazy, it’s because it is.

In the US, the debate about corporate personhood is ongoing. The supreme court already ruled that corporations are indeed people in some contexts: they have been granted the right to spend money on political issues, for example, as well as the right to refuse to cover birth control in their employee health plans on religious grounds.

Armed with these rulings, brands continue to colonise our lives, accompanying us from the cradle to the grave. They see you grow up, they see you die. They’re benevolent. They’re family.

Looking for someone to prove me wrong, I asked Ed Zitron, a PR chief executive, about these kinds of tactics. Zitron points out that tech companies are, in some cases, performing a useful service – as in Facebook’s “safety check”, T-Mobile and Verizon’s free communication with France, and Airbnb’s decision to compensate hosts for letting people stay longer for free. Those are tangible gestures – the equivalent of bringing grief-stricken neighbours meals to sustain them, rather than sending a hastily-written card.

Anything else, he says, is essentially good-old free publicity: “an empty gesture, a non-movement, a sanguine pretend-help that does nothing other than promote themselves”.

It’s hard to disagree with him and illustrates how far brands have further infiltrated our lives since the publication of Naomi Klein’s No Logo, which documented advertising as an industry not only interested in selling products, but also a dream and a message. We can now add “grief surfing” to the list.

A few years back, Jon Stewart mocked this sorry state of affairs:

If only there were a way to prove that corporations are not people, show their inability to love, to show that they lack awareness of their own mortality, to see what they do when you walk in on them masturbating …

Turns out we can’t – companies will love you, in sickness and in health, for better and for worse, whether you want it or not.

Read the entire article here.

Image: Screen grab from Amazon.fr, November 17, 2015.

Give Me Your Tired, Your Poor, Your Huddled Masses…

UnveilingTheStatueofLiberty-1886-EdwardMoran

Yesterday, November 19, 2015, the US House of Representatives voted in favor of legislation that would make it even more difficult for refugees from Iraq and Syria (and presumably other tormented lands) to enter the country. Today, I am reminded of the Emma Lazarus poem that sits at the base of the Statue of Liberty. The key lines of the New Colossus are:

 “Give me your tired, your poor,
Your huddled masses yearning to breathe free,
The wretched refuse of your teeming shore.
Send these, the homeless, tempest-tost to me,
I lift my lamp beside the golden door!”

Looks like many of our representatives, state governors and a variety of talking heads have forgotten one of the central tenets that inspired, and inspires, this nation.

Image: Statue of Liberty unveiled, by Edward Moran, 1886. Courtesy of Museum of the City of New York. Public Domain.

Xenophobia: Terrorism of the Mind

I suspect xenophobia is a spectrum disorder. At one end of the spectrum we see the acts of fundamentalist terrorists following their apocalyptic (and isolationist) scripts to their barbaric conclusions. At the other end, we hear the segregationist rants of talking heads demanding litmus tests for migrants, refugees and victims of violence. And, it’s all the more distasteful when one of the talking heads controls vast swathes of the global media.

So, shame on you, Rupert Murdoch, for suggesting that the US allow entry only to proven Christian refugees. Clearly, tolerance, understanding and inclusiveness are not concepts that Mr. Murdoch understands — a lack he shares with those he accuses.

From the Guardian:

I see Rupert Murdoch has come up with a foolproof method to ensure that the United States is safe from terrorism.

In a tweet offering advice to the American president, he wrote:

“Obama facing enormous opposition in accepting refugees. Maybe make special exception for proven Christians”

Oh yes he did. Does the News Corp boss not realise that this is just the kind of response to terrorism that the terrorists seek to provoke?

Ostracising all Muslims by refusing them sanctuary on the grounds that they are potential terrorists is likely to be counter-productive. And, incidentally, is it not unChristian?

I note that the editor of Newsnight, Ian Katz, tongue firmly in cheek, tweeted back to Murdoch:

“Interesting idea… will you come and talk about it on @BBCNewsnight”.

But he didn’t take up the offer. He obviously prefers to let his wisdom shine through in 140 characters.

I am also queasy about the Tuesday editorial in Murdoch’s favourite newspaper, the Sun, which called on British-based Muslims to prove their opposition to “the jihadis” by marching through London with placards saying “not in our name”.

Rightly, the paper points out that Isis “seeks to establish violent, oppressive fundamentalism as the only true faith and to divide Muslims from non-Muslims.”

But I wonder whether the Sun realises that its message is similar: the effect of treating Muslims, all Muslims, as some kind of homogenous entity (and as a thing apart) is more likely to foment divisions with non-Muslims and alienate Muslims still further.

Read the entire story here.

The Illness Known As Evil

What turns a seemingly ordinary person (usually male) into a brutal killer or mass murderer? How does a quiet computer engineer end up as a cold-blooded executioner of innocents on a terrorist video in 2015? How does a single guard in a concentration camp help send hundreds of thousands to their deaths during the Second World War? Why do we humans perform acts of such unspeakable brutality and horror?

For as long as the social sciences have existed, researchers have weighed these questions. Is it possible that those who commit such acts of evil are host to a disease of the brain? Some have dubbed this Syndrome E, where E stands for evil. Others are not convinced that evil is a neurological condition with biochemical underpinnings. And so the debate, and the violence, rages on.

From the New Scientist:

The idea that a civilised human being might be capable of barbaric acts is so alien that we often blame our animal instincts – the older, “primitive” areas of the brain taking over and subverting their more rational counterparts. But fresh thinking turns this long-standing explanation on its head. It suggests that people perform brutal acts because the “higher”, more evolved, brain overreaches. The set of brain changes involved has been dubbed Syndrome E – with E standing for evil.

In a world where ideological killings are rife, new insights into this problem are sorely needed. But reframing evil as a disease is controversial. Some believe it could provide justification for heinous acts or hand extreme organisations a recipe for radicalising more young people. Others argue that it denies the reality that we all have the potential for evil within us. Proponents, however, say that if evil really is a pathology, then society ought to try to diagnose susceptible individuals and reduce contagion. And if we can do that, perhaps we can put radicalisation into reverse, too.

Following the second world war, the behaviour of guards in Nazi concentration camps became the subject of study, with some researchers seeing them as willing, ideologically driven executioners, others as mindlessly obeying orders. The debate was reignited in the mid-1990s in the wake of the Rwandan genocide and the Srebrenica massacre in Bosnia. In 1996, The Lancet carried an editorial pointing out that no one was addressing evil from a biological point of view. Neurosurgeon Itzhak Fried, at the University of California, Los Angeles, decided to rise to the challenge.

In a paper published in 1997, he argued that the transformation of non-violent individuals into repetitive killers is characterised by a set of symptoms that suggests a common condition, which he called Syndrome E (see “Seven symptoms of evil“). He suggested that this is the result of “cognitive fracture”, which occurs when a higher brain region, the prefrontal cortex (PFC) – involved in rational thought and decision-making – stops paying attention to signals from more primitive brain regions and goes into overdrive.

The idea captured people’s imaginations, says Fried, because it suggested that you could start to define and describe this basic flaw in the human condition. “Just as a constellation of symptoms such as fever and a cough may signify pneumonia, defining the constellation of symptoms that signify this syndrome may mean that you could recognise it in the early stages.” But it was a theory in search of evidence. Neuroscience has come a long way since then, so Fried organised a conference in Paris earlier this year to revisit the concept.

At the most fundamental level, understanding why people kill is about understanding decision-making, and neuroscientists at the conference homed in on this. Fried’s theory starts with the assumption that people normally have a natural aversion to harming others. If he is correct, the higher brain overrides this instinct in people with Syndrome E. How might that occur?

Etienne Koechlin at the École Normale Supérieure in Paris was able to throw some empirical light on the matter by looking at people obeying rules that conflict with their own preferences. He put volunteers inside a brain scanner and let them choose between two simple tasks, guided by their past experience of which would be the more financially rewarding (paying 6 euros versus 4). After a while he randomly inserted rule-based trials: now there was a colour code indicating which of the two tasks to choose, and volunteers were told that if they disobeyed they would get no money.

Not surprisingly, they followed the rule, even when it meant that choosing the task they had learned would earn them a lower pay-off in the free-choice trials. But something unexpected happened. Although rule-following should have led to a simpler decision, they took longer over it, as if conflicted. In the brain scans, both the lateral and the medial regions of the PFC lit up. The former is known to be sensitive to rules; the latter receives information from the limbic system, an ancient part of the brain that processes emotional states, so is sensitive to our innate preferences. In other words, when following the rule, people still considered their personal preference, but activity in the lateral PFC overrode it.

Of course, playing for a few euros is far removed from choosing to kill fellow humans. However, Koechlin believes his results show that our instinctive values endure even when the game changes. “Rules do not change values, just behaviours,” he says. He interprets this as showing that it is normal, not pathological, for the higher brain to override signals coming from the primitive brain. If Fried’s idea is correct, this process goes into overdrive in Syndrome E, helping to explain how an ordinary person overcomes their squeamishness to kill. The same neuroscience may underlie famous experiments conducted by the psychologist Stanley Milgram at Yale University in the 1960s, which revealed the extraordinary lengths to which people would go out of obedience to an authority figure – even administering what they thought were lethal electric shocks to strangers.

Fried suggests that people experience a visceral reaction when they kill for the first time, but some rapidly become desensitised. And the primary instinct not to harm may be more easily overcome when people are “just following orders”. In unpublished work, Patrick Haggard at University College London has used brain scans to show that this is enough to make us feel less responsible for our actions. “There is something about being coerced that produces a different experience of agency,” he says, “as if people are subjectively able to distance themselves from this unpleasant event they are causing.”

However, what is striking about many accounts of mass killing, both contemporary and historical, is that the perpetrators often choose to kill even when not under orders to do so. In his book Ordinary Men, the historian Christopher Browning recounts the case of a Nazi unit called reserve police battalion 101. No member of this unit was forced to kill. A small minority did so eagerly from the start, but they may have had psychopathic or sadistic tendencies. However, the vast majority of those who were reluctant to kill soon underwent a transformation, becoming just as ruthless. Browning calls them “routinised” killers: it was as if, once they had decided to kill, it quickly became a habit.

Habits have long been considered unthinking, semi-automatic behaviours in which the higher brain is not involved. That seems to support the idea that the primitive brain is in control when seemingly normal people become killers. But this interpretation is challenged by new research by neuroscientist Ann Graybiel at the Massachusetts Institute of Technology. She studies people with common psychiatric disorders, such as addiction and depression, that lead them to habitually make bad decisions. In high-risk, high-stakes situations, they tend to downplay the cost with respect to the benefit and accept an unhealthy level of risk. Graybiel’s work suggests the higher brain is to blame.

In one set of experiments, her group trained rats to acquire habits – following certain runs through mazes. The researchers then suppressed the activity of neurons in an area of the PFC that blocks signals coming from a primitive part of the brain called the amygdala. The rats immediately changed their running behaviour – the habit had been broken. “The old idea that the cognitive brain doesn’t have evaluative access to that habitual behaviour, that it’s beyond its reach, is false,” says Graybiel. “It has moment-to-moment evaluative control.” That’s exciting, she says, because it suggests a way to treat people with maladaptive habits such as obsessive-compulsive disorder, or even, potentially, Syndrome E.

What made the experiment possible was a technique known as optogenetics, which allows light to regulate the activity of genetically engineered neurons in the rat PFC. That wouldn’t be permissible in humans, but cognitive or behavioural therapies, or drugs, could achieve the same effect. Graybiel believes it might even be possible to stop people deciding to kill in the first place by steering them away from the kind of cost-benefit analysis that led them to, say, blow themselves up on a crowded bus. In separate experiments with risk-taking rats, her team found that optogenetically decreasing activity in another part of the limbic system that communicates with the PFC, the striatum, made the rats more risk-averse: “We can just turn a knob and radically alter their behaviour,” she says.

Read the entire article here.

Vive La Republique

LibertyEqualityorDeath

My thoughts are with the innocent victims, and their families and friends, of the horrific and cowardly events in Paris, France.

Image: The motto of the French Republic — Liberty, Equality, Fraternity or Death. courtesy of Hector Fleischmann, La guillotine en 1793, Paris: Librairie des Publications Modernes, 1908. Public Domain.

Grandiose Narcissism

Google-search-GOP-debate

Oh America! You are locked in a painful and relentless electioneering cycle. Love it or hate it, the process of electing a president is a brutal and brutish amalgam of self-centeredness, untruth, circus-showmanship, flamboyance and ego. Psychologists have a label for these traits, often distilled to their essence in political candidates and leaders. It’s called grandiose narcissism. It would seem that the current presidential election cycle, which began several hundred years and 10 million political commercials ago, has an overstuffed share of these grandiose narcissists. This makes for tremendous entertainment. But it’s thoroughly ghastly to think that one of these performers could be in the White House a mere six months from now.

From the NYT:

With the presidential campaign in full swing, a perennial question has resurfaced: How much weight should voters give to candidates’ personalities? The political rise of Donald J. Trump has drawn attention to one personality trait in particular: narcissism. Although narcissism does not lend itself to a precise definition, most psychologists agree that it comprises self-centeredness, boastfulness, feelings of entitlement and a need for admiration.

We have never met Mr. Trump, let alone examined him, so it would be inappropriate of us to offer a formal assessment of his level of narcissism. And in all fairness, today’s constant media attention makes a sizable ego a virtual job requirement for public office. Still, the Trump phenomenon raises the question of what kinds of leaders narcissists make. Fortunately, a recent body of research has suggested some answers.

In a 2013 article in Psychological Science, we and our colleagues approached this question by studying the 42 United States presidents up to and including George W. Bush. (The primary data were collected before Barack Obama’s presidency.) First we took a data set compiled by the psychologists Steven Rubenzer and Thomas Faschingbauer, who for an earlier study asked experts on each president to complete personality surveys on the subjects of their expertise. Then, using standard formulas from the research literature on personality, we produced estimates of each president’s narcissism level. Finally, we correlated these personality ratings with data from surveys of presidential performance obtained from independent panels of historians.

We found that narcissism, specifically “grandiose narcissism” — an amalgam of flamboyance, immodesty and dominance — was associated with greater overall presidential success. (This relation was small to moderate in magnitude.) The two highest scorers on grandiose narcissism were Lyndon B. Johnson and Theodore Roosevelt, the two lowest James Monroe and Millard Fillmore.

Grandiose narcissism was tied to slightly better crisis management, public persuasiveness and agenda-setting. Presidents with high levels of this trait were also more likely to assume office by winning election in a landslide (55 percent or more of the popular vote) and to initiate new legislation.

Yet we also found that grandiose narcissism was associated with certain negative outcomes, including unethical behaviors like stealing, abusing power and bending rules. High scorers on this trait were especially likely to have been the target of impeachment resolutions (John Tyler, Andrew Johnson, Bill Clinton).

We also considered a less well-understood dimension of narcissism: “vulnerable narcissism,” a trait associated with being self-absorbed and thin-skinned (think of Richard M. Nixon, who was a high scorer on this trait). We found that vulnerable narcissism showed little relation to successful presidential leadership.

To be certain, our results were based on a small and highly select sample, and we relied on presidential experts’ judgments of personality. Still, other psychological studies of narcissism, using other data and different methods, have yielded broadly similar results.

In contrast, the psychologist W. Keith Campbell and others have found that narcissists tend to be overconfident when making decisions, to overestimate their abilities and to portray their ideas as innovative when they are not. Compared with their non-narcissistic counterparts, they are more likely to accumulate resources for themselves at others’ expense.

Read the entire story here.

Image courtesy of Google Search.

Your Perfect Lifestyle Captured, Shared, Commoditized

Socality-Barbie

Many millions of people post countless daily images of their perfect soft-focus and sepia-toned lives on Instagram (and other social media). These images are cataloged, captioned and shared so that many more millions may participate vicariously in these wonderfully perfect moments.

Recently a well-known personality joined the Instagram picture-posting, image-sharing frenzy. Not unlike movie stars, sports personalities and the music glitterati, she’s garnered millions of followers on Instagram. She posts pictures of her latest, perfect outfits with perfect hair; she shows us perfect lattes sipped at the perfect coffee shop; she shares soft-focus sunsets from perfect mountaintops; images of a perfect 5-course dinner at a perfectly expensive bistro, with or without that perfect bearded date; photographs of perfect vacations at the beach or from a yacht or a vintage train. She seems to have a perfect life, captured in a kaleidoscopic torrent of perfect visuals.

Her name is Barbie. Actually, her full name is Socality Barbie. She’s a parody of her human followers, and she’s well on her way to becoming the next social media sensation. Except, she’s not real, she’s a Barbie doll. But what’s really interesting about Socality Barbie is that she’s much like many of her human peers on social media — she’s a commoditized hipster.

My one complaint: she doesn’t take enough selfies. I wonder what’s next for her — perhaps an eponymous reality TV show.

From the Guardian:

Here she is on the sand, barefoot in the lapping waves, wearing cropped skinny jeans and shoulder-robing a blanket. And here she is in a cafe, the sleeves of her utility overshirt pushed up as she reaches for her flat white with its photogenic foam-art. Here she is in the mountains, wearing a beanie hat that perfectly offsets hair blow-dried into soft waves. Oh, and look, here’s a still-life shot of her weekend-away capsule wardrobe laid out on hardwood floors. She’s taking high-heeled hiking boots. But then, she is a Barbie doll.

Socality Barbie, the newest social-media sensation, is on a mission to take down Instagram from the inside. The account is the brainchild of an anonymous wedding photographer in Oregon, who dresses a Barbie doll in mini-hipster outfits and posts Instagram shots of doll-sized hikes (always sunny, lots of photogenic light shafts through the trees), coffee dates (whitewashed wooden tables and a calm, mindful atmosphere) and boyfriends (check shirt, facial hair).

It’s not exactly satire – I don’t think you can really satirise Instagram, that would be like satirising kittens – but Socality Barbie skewers something about how plastic Instagram has become. She is the Rosa Parks of a society oppressed by thigh gaps and tyrannised by heavily filtered brunches. She is a taking a brave stand against – OK, poking fun at – the disproportionate power and influence of Instagram, which has overtaken the Farrow & Ball paint chart as the sacred text we must live by.

Let me get one thing straight: I love Instagram. I am addicted. Sometimes I wake up in the night and, half asleep, reach for my phone and start scrolling through my feed, which at that hour is Lily-Rose Depp in novelty socks, people I vaguely know in New York taking overlit selfies in bars and insomniacs on a 3am camera-roll jag posting throwback photos with mawkish captions. And I love it. So I am absolutely not about to declare Instagram over. Anyway, that would be idiotic: in 2012, Facebook paid $1bn to buy it; it is now valued at $35bn. And in fashion, Instagram is everything. It has catwalk shows in real time, street style from all over the world, plus you get to see every time someone you know buys a new coat. What more could I possibly want?

But what Instagram isn’t any more is cutting edge. Instead of being hip, it is a world of commodified hipsterdom. All pigeon-toed loafers on pretty tiled floors and nail art on a hand holding a street-truck burger. It is a guilty pleasure, a cosy comforting world where everyone dresses really well and is also, like, super nice. It is a bit like watching reality TV, in fact. You get to watch attractive people living their lives, at a level of apparent intimacy that makes it compelling. Theoretically, Instagram is more high-minded than reality TV, because it shows you a kaleidoscope of viewpoints from all over the world. The trouble is they all look the same.

Read the entire story here.

Images courtesy of Socality Barbie.

PhotoMash: Snoopers Charter and Fast Walking

Welcome to my inaugural PhotoMash segment, a lighthearted look at juxtaposed news stories. Online media needs eyeballs, so to keep our attention media outlets cycle and recycle their news stories ever more frequently. The result is that we’re increasingly likely to find unrelated and sometimes opposing stories right next to each other on a page. Editors have little time to police these embarrassing juxtapositions of text and images, since much is now driven by automated content publishing systems, which of course paves the way for my story and/or photo mash-up service.

Photomash-Teresa_May-Fast_Walking

So, here’s my first PhotoMash, courtesy of the Independent in the UK: Home Secretary Theresa May introducing new surveillance proposals, alongside the UK’s first fast pedestrian lane for walkers. Makes for an interesting mash-up. Get the idea? Two, or more, incongruous images displayed coincidentally side-by-side. [Are those Theresa May’s legs?]

Images courtesy of the Independent.

Can Burning Man Be Saved?

Burning-Man-2015-gallery

I thought it rather appropriate to revisit Burning Man one day after Guy Fawkes Day in the UK. I must say that Burning Man has grown into more of a corporate event compared with the cheesy pyrotechnic festivities in Britain on the 5th of November. So, even though Burners have a bigger, bolder, brasher event, please remember-remember: we Brits had the original burning man — by 380 years.

The once-counter-cultural phenomenon known as Burning Man seems to be maturing into an executive-level tech-fest. Let’s face it, if I can read about the festival in the mainstream media, it can’t be as revolutionary as it once set out to be. Though the founders’ desire to keep the festival radically inclusive means that organizers can’t turn away those who may end up razing Burning Man to the ground due to corporate excess. VCs and the tech elite from Silicon Valley now descend in their hordes, having firmly placed Burning Man on their app-party circuit. Until recently, Burners mingled relatively freely throughout the week-long temporary metropolis in the Nevada desert; now, the nouveau riche arrive on private jets and “camp” in exclusive wagon-circles of luxury RVs catered to by corporate chefs and personal costume designers. It certainly seems like some of Larry Harvey’s 10 Principles delineating Burning Man’s cultural ethos are on shaky ground. Oh well, capitalism ruins another great idea! But, go once before you die.

From NYT:

There are two disciplines in which Silicon Valley entrepreneurs excel above almost everyone else. The first is making exorbitant amounts of money. The second is pretending they don’t care about that money.

To understand this, let’s enter into evidence Exhibit A: the annual Burning Man festival in Black Rock City, Nev.

If you have never been to Burning Man, your perception is likely this: a white-hot desert filled with 50,000 stoned, half-naked hippies doing sun salutations while techno music thumps through the air.

A few years ago, this assumption would have been mostly correct. But now things are a little different. Over the last two years, Burning Man, which this year runs from Aug. 25 to Sept. 1, has been the annual getaway for a new crop of millionaire and billionaire technology moguls, many of whom are one-upping one another in a secret game of I-can-spend-more-money-than-you-can and, some say, ruining it for everyone else.

Some of the biggest names in technology have been making the pilgrimage to the desert for years, happily blending in unnoticed. These include Larry Page and Sergey Brin, the Google founders, and Jeff Bezos, chief executive of Amazon. But now a new set of younger rich techies are heading east, including Mark Zuckerberg of Facebook, employees from Twitter, Zynga and Uber, and a slew of khaki-wearing venture capitalists.

Before I explain just how ridiculous the spending habits of these baby billionaires have become, let’s go over the rules of Burning Man: You bring your own place to sleep (often a tent), food to eat (often ramen noodles) and the strangest clothing possible for the week (often not much). There is no Internet or cell reception. While drugs are technically illegal, they are easier to find than candy on Halloween. And as for money, with the exception of coffee and ice, you cannot buy anything at the festival. Selling things to people is also a strict no-no. Instead, Burners (as they are called) simply give things away. What’s yours is mine. And that often means everything from a meal to saliva.

In recent years, the competition for who in the tech world could outdo who evolved from a need for more luxurious sleeping quarters. People went from spending the night in tents, to renting R.V.s, to building actual structures.

“We used to have R.V.s and precooked meals,” said a man who attends Burning Man with a group of Silicon Valley entrepreneurs. (He asked not to be named so as not to jeopardize those relationships.) “Now, we have the craziest chefs in the world and people who build yurts for us that have beds and air-conditioning.” He added with a sense of amazement, “Yes, air-conditioning in the middle of the desert!”

His camp includes about 100 people from the Valley and Hollywood start-ups, as well as several venture capital firms. And while dues for most non-tech camps run about $300 a person, he said his camp’s fees this year were $25,000 a person. A few people, mostly female models flown in from New York, get to go free, but when all is told, the weekend accommodations will collectively cost the partygoers over $2 million.

This is drastically different from the way most people experience the event. When I attended Burning Man a few years ago, we slept in tents and a U-Haul moving van. We lived on cereal and beef jerky for a week. And while Burning Man was one of the best experiences of my life, using the public Porta-Potty toilets was certainly one of the most revolting experiences thus far. But that’s what makes Burning Man so great: at least you’re all experiencing those gross toilets together.

That is, until recently. Now the rich are spending thousands of dollars to get their own luxury restroom trailers, just like those used on movie sets.

“Anyone who has been going to Burning Man for the last five years is now seeing things on a level of expense or flash that didn’t exist before,” said Brian Doherty, author of the book “This Is Burning Man.” “It does have this feeling that, ‘Oh, look, the rich people have moved into my neighborhood.’ It’s gentrifying.”

For those with even more money to squander, there are camps that come with “Sherpas,” who are essentially paid help.

Tyler Hanson, who started going to Burning Man in 1995, decided a couple of years ago to try working as a paid Sherpa at one of these luxury camps. He described the experience this way: Lavish R.V.s are driven in and connected together to create a private forted area, ensuring that no outsiders can get in. The rich are flown in on private planes, then picked up at the Burning Man airport, driven to their camp and served like kings and queens for a week. (Their meals are prepared by teams of chefs, which can include sushi, lobster boils and steak tartare — yes, in the middle of 110-degree heat.)

“Your food, your drugs, your costumes are all handled for you, so all you have to do is show up,” Mr. Hanson said. “In the camp where I was working, there were about 30 Sherpas for 12 attendees.”

Mr. Hanson said he won’t be going back to Burning Man anytime soon. The Sherpas, the money, the blockaded camps and the tech elite were too much for him. “The tech start-ups now go to Burning Man and eat drugs in search of the next greatest app,” he said. “Burning Man is no longer a counterculture revolution. It’s now become a mirror of society.”

Strangely, the tech elite won’t disagree with Mr. Hanson about it being a reflection of society. This year at the premiere of the HBO show “Silicon Valley,” Elon Musk, an entrepreneur who was a founder of PayPal, complained that Mike Judge, the show’s creator, didn’t get the tech world because — wait for it — he had not attended the annual party in the desert.

“I really feel like Mike Judge has never been to Burning Man, which is Silicon Valley,” Mr. Musk said to a Re/Code reporter, while using a number of expletives to describe the festival. “If you haven’t been, you just don’t get it.”

Read the entire story here.

Image: Burning Man gallery. Courtesy of Burners.

Remember, Remember the Fifth of November

Gunpowder_Plot_conspirators

I was born and came of age in London. So I have vivid, if somewhat mixed, memories of the 5th of November. We kids variously called it Guy Fawkes Day and Bonfire Night. We’d spend our pocket money (allowance) that week on fireworks rather than sweets (candy). We’d set off our fireworks and huddle around bonfires on the evening of the 5th. Naughtier kids would post (mail) fireworks in their neighbors’ letterboxes (mail boxes) and empty milk bottles.

Now that I live in the US, I still have difficulty explaining this strange and uniquely British celebration to Americans. So, here’s another attempt. Though I’ve since given up trying to explain the once common refrain — “Penny for the Guy!” — heard from children on street corners during the week leading up to the 5th of November [you will need to figure this out for yourself].

We celebrate it because Guy Fawkes once tried to blow up the Houses of Parliament. Oops, wrong! We celebrate it because on this day in 1605 the Gunpowder Plot planned by Mr. Fawkes and his Roman Catholic co-conspirators was successfully foiled. Correct!

From the Telegraph:

What is Bonfire Night?

Bonfire Night commemorates the failure of the Gunpowder Plot in November 1605 by a gang of Roman Catholic activists led by Warwickshire-born Robert Catesby.

When Protestant King James I began his reign, English Catholics had hoped that the persecution felt for over 45 years under his predecessor Queen Elizabeth would finally end, but this didn’t transpire so the Gunpowder Plot conspirators resolved to assassinate the King and his ministers by blowing up the Palace of Westminster during the state opening of Parliament.

Guy (Guido) Fawkes and his fellow conspirators, having rented out a house close to the Houses of Parliament, managed to smuggle 36 barrels of gunpowder into a cellar of the House of Lords – enough to completely destroy the building. (Physicists from the Institute of Physics later calculated that the 2,500kg of gunpowder beneath Parliament would have obliterated an area 500 metres from the centre of the explosion).

The plot began to unravel when an anonymous letter was sent to William Parker, the 4th Baron Monteagle, warning him to avoid the House of Lords.

The letter (which could well have been sent by Lord Monteagle’s brother-in-law Francis Tresham), was made public and this led to a search of Westminster Palace in the early hours of November 5.

Explosive expert Fawkes, who had been left in the cellars to set off the fuse, was subsequently caught when a group of guards checked the cellars at the last moment.

Fawkes was arrested, sent to the Tower of London and tortured until he gave up the names of his fellow plotters. Lord Monteagle was rewarded with 500 pounds, and 200 pounds’ worth of lands, for his service in protecting the crown.

Read the entire article here.

Image: A contemporary engraving of eight of the thirteen conspirators, by Crispijn van de Passe. Fawkes is third from the right. Public Domain.

LGBTQ Soup

LGBTQ_flag.svg

At some point we will have all moved on to a post-prudish, post-voyeuristic, post-exploitative, post-coming-out, post-gender-identity world; we’ll all be celebrated as individuals, and discrimination will no longer exist.

Slap! Well, that’s quite enough of the pipe-dream for today; let’s get back to the complexity of present-day reality. So, here’s a quick snapshot of where we are on the gender-label issue. Keep in mind, the “snapshot” is courtesy of the Guardian and the “we” refers to the British — both very peculiar institutions.

From the Guardian:

When Rugby League’s Keegan Hirst came out as gay this week, he said that he had been hiding for a long time. “How could I be gay? I’m from Batley, for goodness sake. No one is gay in Batley.” If the 27-year-old Yorkshireman had been a few years younger, he might have found some people in his hometown who are at least sexually fluid. A YouGov poll this week put the number of 18- to 24-year-old Brits who identify as entirely heterosexual at 46%, while just 6% would call themselves exclusively gay. Sexuality now falls between the lines: identity is more pliable, and fluidity more acceptable, than ever before.

The gay-straight binary is collapsing, and it’s doing so at speed. The days in which a celebrity’s sexual orientation was worthy of a tabloid scandal have long since died out. Though newspapers still report on famous people coming out and their same-sex relationships, the lurid language that once accompanied such stories has been replaced by more of a gossipy, “did you know?” tone, the sort your mum might take on the phone, when she’s telling you about what Julie round the corner has been up to. And the reaction of the celebrities involved has morphed, too, into a refusal to play the naming game. Arena-filling pop star Miley Cyrus posted an Instagram of a news story that described her as “genderqueer” with the caption, “NOTHING can/will define me! Free to be EVERYTHING!!!”. Kristen Stewart, who has been followed around by insinuations about the “gal pal” she is often photographed with for a couple of years, finally spoke about the relationship in an interview with Nylon magazine this month. She said, simply, “Google me, I’m not hiding”, but, like the people surveyed by YouGov, refused to define herself as gay or straight. “I think in three or four years, there are going to be a whole lot more people who don’t think it’s necessary to figure out if you’re gay or straight. It’s like, just do your thing.”

It’s arguable that celebrities such as Stewart are part of the reason for those parameters becoming less essential, at least in the west. It shouldn’t fall to famous people to define our social attitudes but, simply, visibility matters: if it is not seen as outrageous or transgressive that the star of Twilight will hold hands with her girlfriend in the street, then that, in a very small way, reinforces the normality of it. If Cara Delevingne tells Vogue that she loves her girlfriend, then that, too, adds to the picture. The more people who are out, the more normal it becomes; the less alone a confused kid in a small town looking at gossip websites might feel; the less baffled the parent of a teenager who brings home a same-sex date might be. Combine that with the seemingly unstoppable legislative reinforcement of equal rights, too – gay marriage becoming legal in Ireland, in the US – and suddenly, it seems less “abnormal”, less boundary-busting, to fall in love or lust with someone of the same gender.

“I would describe myself as a bisexual homoromantic,” says Alice, 23, from Sussex. For the uninitiated, I asked her to explain. “It means I like sex with men and women, but I only fall in love with women. I wouldn’t say something wishy-washy like, ‘It’s all about the person,’ because more often it’s just that I sometimes like a penis.” She says her attitude towards sex and sexuality is similar among other people in her peer group. “A lot of my friends talk about their sexuality in terms of behaviour these days, rather than in terms of labels. So they’ll say, ‘I like boys’, or ‘I get with girls too,’ rather than saying, ‘I’m gay, I’m a lesbian, I’m bisexual.’”

She says that even among those who exclusively date people of the same gender, there is a reluctance to claim an identity as prescriptive as “gay”. “Most young people who are gay don’t see it as a defining property of their character, because they don’t have to, because society doesn’t constantly remind them of their difference.” However, she is careful to point out that this is very much the case in the small, liberal part of London where she lives now. “[Not defining] is something I feel entitled to as a person who lives in London, but I didn’t feel entitled to it in a small town in the home counties. I’ve never experienced discrimination about my sexuality, but I’m aware that it’s because I ‘pass’ [as straight].”

In fact, among the young British people I spoke to, geography is vital. Lucy, 25, wonders if the number of people who say they are not straight really tallies with the number of people who are actually acting upon those desires. “Saying you’re sexually fluid means you’re part of a movement. It means you’re seen as forward-thinking,” she says, suggesting there is a certain cachet attached to being seen as open that does not come with affirmed heterosexuality. She also believes it is more of a metropolitan story than necessarily representative of Britain as a whole. “If I went back to my home town in the Midlands, we wouldn’t sit around talking about ‘sexual fluidity’. You’re a ‘dyke’, or you’re not. There’s only one type of lesbian there.”

Read the entire story here.

Image: Gay Pride Flag. Public Domain. Courtesy of Wikipedia.