Tag Archives: pop culture

I Literally Can’t Even

Literally… Can’t Even…

By the time you read this, the title phrase will be a cringeworthy embarrassment to the teens who popularized it just over a month ago. Ok, so I’m exaggerating slightly, but you get my point — new slang enters, and leaves, our pop lexicon faster than the rise and fall of internet has-been Psy. The simplified life-cycle goes something like this:

Week 1: Teens co-opt and twist an existing word or phrase to a new meaning.

Week 2: Parents of teens scratch heads; teens’ obfuscation is successful.

Week 3: Social networks both on and offline amplify the new “meme”.

Week 4: Corporations targeting the teen demographic adopt the meme themselves.

Week 5: Mass media picks up the story.

Week 5 + 1 Day: New meme is old news; teens move on; parents continue head-scratching; corporate ad agencies still promoting old meme are fired.

To an amateur linguist this process is fascinating, though I must admit to heart palpitations — metaphorical ones — when I hear people, young and old, use and misuse “literally”. As for “can’t even”, well, its time has already passed. Next!

From NYT:

A little paradox of Internet celebrity is that a YouTube personality can amass millions upon millions of young fans by making it seem as if he’s chatting with each of them one to one. Tyler Oakley, a 26-year-old man who identifies as a “professional fangirl,” is a master of the genre. He has nerd glasses, pinchable cheeks, a quiff he dyes in shades of blue and green and more YouTube subscribers than Shakira. Some of his teenage admirers have told him that he is the very first gay person that they have ever seen. He models slumber party outfits and gushes over boy bands, giving the kids who watch him from their bedrooms a peek into a wider world.

In March 2012, Oakley faced the camera, balanced a laptop in his sightline and paged through a photo set of the curly-haired actor Darren Criss, whose turn as a hunky gay singer in “Glee” made him a fixture of teenage dreams. In these new pictures, which had just been leaked online, Criss was lounging on a beach wearing only a pair of low-rise jeans and a layer of perspiration. Oakley’s videotaped reaction was exultant. “I literally cannot even,” he informed his fans. “I can’t even. I am unable to even. I have lost my ability to even. I am so unable to even. Oh, my God. Oh, my God!”

Soon, Oakley’s groupies had immortalized his soliloquy in GIF form: “Can’t” upon “can’t,” looping forever. Now they could conjure the GIF whenever they felt so overcome by emotion that they couldn’t even complete a thought. Oakley was not the first to recast the sentence fragment “I can’t even” as a stand-alone expression. He just helped shepherd it out of the insular realm of Tumblr fandom and into the wide-open Internet. That June, John Green, a writer of fiction for young adults who was awed by the praise for his breakaway novel, “The Fault in Our Stars,” pledged to “endeavor to regain my ability to even.” When Kacey Musgraves, then 25, won Best Country Album at the 2014 Grammy Awards, besting Taylor Swift, she began her acceptance speech with two “I can’t evens.” And this season, “Saturday Night Live” aired a sketch in which a trio of nasal-toned interns “literally couldn’t even” deal with their office’s frigid temperature. The punch line lands when they screech at a fourth intern to close her window, and the audience sees her sitting helplessly at her desk, both arms suspended in plaster casts. “I can’t,” she whimpers. “I literally cannot.”

For those who grew up when teenagers didn’t “can’t,” the phrase might register as a whimper, as if millennials have spun their inability to climb the staircase out of the parental basement into a mantra. At least the Valley Girls of the 1980s and ’90s, who turned every statement into a question, and the vocal-fried pop tarts of the early 2000s, who growled almost inaudibly, had the decency to finish their sentences. Kids today, it seems, are so mindless that they can’t even complete their verb phrases.

But if you really believe that teenage girls (and boys) don’t know what they’re talking about, it’s more likely that they just don’t want you to know what they’re talking about. Teenagers may not be able to drive or vote or stay out past curfew or use the bathroom during school hours without permission, but they can talk. Their speech is the site of rebellion, and their slang provides shelter from adult scrutiny.

Guarding the secret code has become tricky, though. Teenagers used to listen for the telltale click of a parent eavesdropping on the telephone line. Now somebody (or something) is monitoring every keystroke. If an adult picks up a scrap of inscrutable teenager-speak via text or Twitter or a whisper wafting up from the back seat, she can access its definition on Urban Dictionary or Genius (which explains that “‘I can’t even’ is a state of speechlessness too deep to even express in any other words”). In 1980, the linguist David Maurer, author of “The Big Con,” a book about underworld slang, wrote that “the migration of words from subculture to dominant culture is sparked by the amount of interaction between these groups,” as well as by the dominant group’s “interest in the behavior patterns” of the other. Parents are perennially nosy about what their teenagers are saying, and nowadays they can just Google it.

Read the entire article here.

Why Are We Obsessed With Zombies?


Previous generations worried about Frankenstein, evil robots, even more evil aliens, hungry dinosaurs and, more recently, vampires. Nowadays our culture seems to be singularly obsessed with zombies. Why?

From the Conversation:

The zombie invasion is here. Our bookshops, cinemas and TVs are dripping with the pustulating debris of their relentless shuffle to cultural domination.

A search for “zombie fiction” on Amazon currently provides you with more than 25,000 options. Barely a week goes by without another onslaught from the living dead on our screens. We’ve just seen the return of one of the most successful of these, The Walking Dead, starring Andrew Lincoln as small-town sheriff, Rick Grimes. The show follows the adventures of Rick and fellow survivors as they kill lots of zombies and increasingly, other survivors, as they desperately seek safety.

Generational monsters

Since at least the late 19th century each generation has created fictional enemies that reflect a broader unease with cultural or scientific developments. The “Yellow Peril” villains such as Fu Manchu were a response to the massive increase in Chinese migration to the US and Europe from the 1870s, for example.

As the industrial revolution steamed ahead, speculative fiction of authors such as H G Wells began to consider where scientific innovation would take mankind. This trend reached its height in the Cold War during the 1950s and 1960s. Radiation-mutated monsters and invasions from space seen through the paranoid lens of communism all postulated the imminent demise of mankind.

By the 1970s, in films such as The Parallax View and Three Days of the Condor, the enemy evolved into government institutions and powerful corporations. This reflected public disenchantment following years of increasing social conflict, Vietnam and the Watergate scandal.

In the 1980s and 1990s it was the threat of AIDS that was embodied in the monsters of the era, such as “bunny boiling” stalker Alex in Fatal Attraction. Alex’s obsessive pursuit of the man with whom she shared a one night stand, Susanne Leonard argues, represented “the new cultural alignment between risk and sexual contact”, a theme continued with Anne Rice’s vampire Lestat in her series The Vampire Chronicles.

Risk and anxiety

Zombies, the flesh eating undead, have been mentioned in stories for more than 4,000 years. But the genre really developed with the work of H G Wells, Poe and particularly H P Lovecraft in the early 20th century. Yet these ponderous adversaries, descendants of Mary Shelley’s Frankenstein, have little in common with the vast hordes that threaten mankind’s existence in the modern versions.

M Keith Booker argued that in the 1950s, “the golden age of nuclear fear”, radiation and its fictional consequences were the flip side to a growing faith that science would solve the world’s problems. In many respects we are now living with the collapse of this faith. Today we live in societies dominated by an overarching anxiety reflecting the risk associated with each unpredictable scientific development.

Now we know that we are part of the problem, not necessarily the solution. The “breakthroughs” that were welcomed in the last century now represent some of our most pressing concerns. People have lost faith in assumptions of social and scientific “progress”.

Globalisation

Central to this is globalisation. While generating enormous benefits, globalisation is also tearing communities apart. The political landscape is rapidly changing as established political institutions seem unable to meet the challenges presented by the social and economic dislocation.

However, although destructive, globalisation is also forging new links between people, through what Anthony Giddens calls the “emptying of time and space”. Modern digital media has built new transnational alliances, and, particularly in the West, confronted people with stark moral questions about the consequences of their own lifestyles.

As the faith in inexorable scientific “progress” recedes, politics is transformed. The groups emerging from outside the political mainstream engage in much older battles of faith and identity. Whether right-wing nationalists or Islamic fundamentalists, they seek to build “imagined communities” through race, religion or culture and “fear” is their currency.

Evolving zombies

Modern zombies are the product of this globalised, risk conscious world. No longer the work of a single “mad” scientist re-animating the dead, they now appear as the result of secret government programmes creating untreatable viruses. The zombies indiscriminately overwhelm states irrespective of wealth, technology and military strength, turning all order to chaos.

Meanwhile, the zombies themselves are evolving into much more tenacious adversaries. In Danny Boyle’s 28 Days Later it takes only 20 days for society to be devastated. Charlie Higson’s Enemy series of novels have the zombies getting leadership and using tools. In the film of Max Brooks’ novel, World War Z, the seemingly superhuman athleticism of the zombies reflects the devastating springboard that vast urban populations would provide for such a disease. The film, starring Brad Pitt, had a reported budget of US$190m, demonstrating what a big business zombies have become.

Read the entire article here.

Image courtesy of Google Search.

The Decade of the Selfie


Two recent stories are indicative of these self-obsessed times, and, of course, both center around the selfie. One gives us some added insights into SkinneePix — a smartphone app that supposedly transforms you into your thinner and more attractive self. The second shows us that perhaps, just perhaps, the selfie craze has reached its zenith — as politicians and royals and pop-stars show us what their bed-heads and double chins look like.

I’d like to hope that the trend fizzles soon, as thousands of flash-in-the-pan trends have done before. Yet, what if this is just the beginning of an era that is unabashedly more self-centered? After all, there is a vast untapped world of selfidom out there: audio selfies of our bathroom routines; selfies that automatically rate your BMI; selfies that you can print in 3D; selfies that become your personal digital assistant; selfies that text other selfies; a selfie hall-of-fame; selfies that call your analyst based on how you look; selfies that set up appointments with your hair stylist should your hair not look like the top 10 selfies of the day; selfies from inside the body; a selfie that turns off your credit card and orders celery if you look 5 lbs overweight; selfies of selfies.

From the Guardian:

If you thought Prince Andrew or Michael Gove’s attempts at selfies were the worst thing about the craze – think again.

There is now an app which is designed specifically to make you look skinnier in your selfies. Acting as a FatBooth in reverse, SkinneePix promises to make it look like you’ve shed 5, 10 or 15 lbs with just the click of a button.

The description reads: “SkinneePix makes your photos look good and helps you feel good. It’s not complicated. No one needs to know. It’s our little secret.”

It’s already the norm to add a toasted haze to pouty selfies thanks to photo filters, and some celebs have even been accused of airbrushing their own pictures before putting them up on Instagram – so it was only a matter of time before someone came up with an app like this.

Creators Susan Green and Robin J Phillips say they came up with the app after discovering they hated all the selfies they took on holiday with friends. Green told the Huffington Post: “You’ve always heard about the camera adding 15 pounds, we just wanted to level the playing field.”

They do say don’t knock something til you’ve tried it, so I handed over 69p to iTunes in order to have a poke around the app and see what it’s really like. As it boots up the camera, it flashes up little messages which range from “Good hair day!” to “Make me look good”.

You can’t alter group pictures such as the now infamous Oscars selfie, so I snapped a quick photo at my desk.

Read more here.

From the Telegraph:

RIP The Selfie. It was fun while it lasted, really it was. What larks and indeed Likes as we watched popstrels Rihanna and Rita Ora and model Cara Delevigne record their tiny bikinis and piercings and bed-heads and, once, an endangered slow loris, for posterity.

The ironic Selfie remained fun and fresh and pout-tastic even when it was ushered into the august oak-paneled annals of the Oxford English Dictionary.

The egocentric Selfie weathered President Obama taking a deeply inappropriate quickie at Nelson Mandela’s funeral with the hottie Danish PM whose name we have all forgotten, and David Cameron.

The stealth Selfie even survived the PM being snapped barefoot and snoozing on the bed of his sister-in-law on the morning of her wedding day.

And the recent Ellen DeGeneres Oscars Selfie, with every celeb that ever there was jam-packed together (and, astonishingly, in focus) pretty much qualifies as the Sgt. Pepper’s Lonely Hearts album cover de nos jours.

But then, as is the inevitable parabola of such things, this week the entire phenomenon took a nosedive and died a million pixellated deaths thanks first to Ed Milliband’s blurred, sad-sack Selfie, in which he’s barely in the frame. Bit like his political career, really.

Then came the Parthian Shot: Prince Andrew’s royal snap in which the west wing of Buck House was eclipsed by his Selfie-satisfied porky chops.

And with that, a cutting edge trend turned into the dire digital equivalent of dad-dancing.

Cause of death: Selfie-harm.

Read more here.

Image courtesy of the Guardian / Skinneepix.

The Rise and Fall of Morally Potent Obscenity

There was a time in the U.S. when the many would express shock and decry the verbal (or non-verbal) obscenity of the few. It was also easier for parents to shield the sensitive ears and eyes of their children from the infrequent obscenities of pop stars, politicians and others seeking the media spotlight.

Nowadays, we collectively yawn at the antics of the next post-pubescent alumnus of the Disney Channel. Our pop icons, politicians, news anchors and their ilk have made rudeness, vulgarity and narcissism the norm. Most of us no longer seem to be outraged — some are saddened, some are titillated — and then we shift our ever-decreasing attention spans to the next 15-minute teen sensation. The vulgar and the vain are now ever-present. So we become desensitized, and our public figures and wannabe stars seek the next even-bigger-thing to get themselves noticed before we look elsewhere.

The essayist Lee Siegel seems to be on to something — many of our current obscenity-makers harken back to a time when their vulgarity actually conveyed meaning and could raise a degree of moral indignation in the audience. But now it’s just the new norm and a big yawn.

From Lee Siegel / WSJ:

“What’s celebrity sex, Dad?” It was my 7-year-old son, who had been looking over my shoulder at my computer screen. He mispronounced “celebrity” but spoke the word “sex” as if he had been using it all his life. “Celebrity six,” I said, abruptly closing my AOL screen. “It’s a game famous people play in teams of three,” I said, as I ushered him out of my office and downstairs into what I assumed was the safety of the living room.

No such luck. His 3-year-old sister had gotten her precocious little hands on my wife’s iPhone as it was charging on a table next to the sofa. By randomly tapping icons on the screen, she had conjured up an image of Beyoncé barely clad in black leather, caught in a suggestive pose that I hoped would suggest nothing at all to her or her brother.

And so it went on this typical weekend. The eff-word popped out of TV programs we thought were friendly enough to have on while the children played in the next room. Ads depicting all but naked couples beckoned to them from the mainstream magazines scattered around the house. The kids peered over my shoulder as I perused my email inbox, their curiosity piqued by the endless stream of solicitations having to do with one aspect or another of sex, sex, sex!

When did the culture become so coarse? It’s a question that quickly gets you branded as either an unsophisticated rube or some angry culture warrior. But I swear on my hard drive that I’m neither. My favorite movie is “Last Tango in Paris.” I agree (on a theoretical level) with the notorious rake James Goldsmith, who said that when a man marries his mistress, he creates a job vacancy. I once thought of writing a book-length homage to the eff-word in American culture, the apotheosis of which was probably Sir Ben Kingsley pronouncing it with several syllables in an episode of “The Sopranos.”

I’m cool, and I’m down with everything, you bet, but I miss a time when there were powerful imprecations instead of mere obscenity—or at least when sexual innuendo, because it was innuendo, served as a delicious release of tension between our private and public lives. Long before there was twerking, there were Elvis’s gyrations, which shocked people because gyrating hips are more associated with women (thrusting his hips forward would have had a masculine connotation). But Elvis’s physical motions on stage were all allusion, just as his lyrics were:

Touch it, pound it, what good does it do

There’s just no stoppin’ the way I feel for you

Cos’ every minute, every hour you’ll be shaken

By the strength and mighty power of my love

The relative subtlety stimulates the imagination, while casual obscenity drowns it out. And such allusiveness maintains social norms even as they are being violated—that’s sexy. The lyrics of Elvis’s “Power of My Love” gave him authority as a respected social figure, which made his asocial insinuations all the more gratifying.

The same went, in a later era, for the young Madonna: “Two by two their bodies become one.” It’s an electric image because you are actively engaged in completing it. Contrast that with the aging Madonna trash-talking like a kid:

Some girls got an attitude

Fake t— and a nasty mood

Hot s— when she’s in the nude

(In the naughty naked nude)

It’s the difference between locker-room talk and the language of seduction and desire. As Robbie Williams and the Pet Shop Boys observed a few years ago in their song “She’s Madonna”: “She’s got to be obscene to be believed.”

Everyone remembers the Rolling Stones’ “Brown Sugar,” whose sexual and racial provocations were perfectly calibrated for 1971. Few, if any, people can recall their foray into explicit obscenity two years later with “Star Star.” The earlier song was sly and licentious; behind the sexual allusions were the vitality and energy to carry them out. The explicitness of “Star Star” was for bored, weary, repressed squares in the suburbs, with their swingers parties and “key clubs.”

Just as religious vows of abstinence mean nothing without the temptations of desire—which is why St. Augustine spends so much time in his “Confessions” detailing the way he abandoned himself to the “fleshpots of Carthage”—violating a social norm when the social norm is absent yields no real pleasure. The great provocations are also great releases because they exist side by side with the prohibitions that they are provoking. Once you spell it all out, the tension between temptation and taboo disappears.

The open secret of violating a taboo with language that—through its richness, wit or rage—acknowledges the taboo is that it represents a kind of moralizing. In fact, all the magnificent potty mouths—from D.H. Lawrence to Norman Mailer, the Beats, the rockers, the proto-punks, punks and post-punks, Richard Pryor, Sam Kinison, Patti Smith, and up through, say, Sarah Silverman and the creators of “South Park”—have been moralizers. The late Lou Reed’s “I Wanna Be Black” is so full of racial slurs, obscenity and repugnant sexual imagery that I could not find one meaningful phrase to quote in this newspaper. It is also a wryly indignant song that rips into the racism of liberals whose reverence for black culture is a crippling caricature of black culture.

Though many of these vulgar outlaws were eventually warily embraced by the mainstream, to one degree or another, it wasn’t until long after their deaths that society assimilated them, still warily, and sometimes not at all. In their own lifetimes, they mostly existed on the margins or in the depths; you had to seek them out in society’s obscure corners. That was especially the case during the advent of new types of music. Swing, bebop, Sinatra, cool jazz, rock ‘n’ roll—all were specialized, youth-oriented upheavals in sound and style, and they drove the older generation crazy.

These days, with every new ripple in the culture transmitted, commented-on, analyzed, mocked, mashed-up and forgotten on countless universal devices every few minutes, everything is available to everyone instantly, every second, no matter how coarse or abrasive. You used to have to find your way to Lou Reed. Now as soon as some pointlessly vulgar song gets recorded, you hear it in a clothing store.

The shock value of earlier vulgarity partly lay in the fact that a hitherto suppressed impulse erupted into the public realm. Today Twitter, Snapchat, Instagram and the rest have made impulsiveness a new social norm. No one is driving anyone crazy with some new form of expression. You’re a parent and you don’t like it when Kanye West sings: “I sent this girl a picture of my d—. I don’t know what it is with females. But I’m not too good with that s—”? Shame on you.

The fact is that you’re hearing the same language, witnessing the same violence, experiencing the same graphic sexual imagery on cable, or satellite radio, or the Internet, or even on good old boring network TV, where almost explicit sexual innuendo and nakedly explicit violence come fast and furious. Old and young, high and low, the idiom is the same. Everything goes.

Graphic references to sex were once a way to empower the individual. The unfair boss, the dishonest general, the amoral politician might elevate themselves above other mortals and abuse their power, but everyone has a naked body and a sexual capacity with which to throw off balance the enforcers of some oppressive social norm. That is what Montaigne meant when he reminded his readers that “both kings and philosophers defecate.” Making public the permanent and leveling truths of our animal nature, through obscenity or evocations of sex, is one of democracy’s sacred energies. “Even on the highest throne in the world,” Montaigne writes, “we are still sitting on our asses.”

But we’ve lost the cleansing quality of “dirty” speech. Now it’s casual, boorish, smooth and corporate. Everybody is walking around sounding like Howard Stern. The trash-talking Jay-Z and Kanye West are superwealthy businessmen surrounded by bodyguards, media consultants and image-makers. It’s the same in other realms, too. What was once a cable revolution against treacly, morally simplistic network television has now become a formulaic ritual of “complex,” counterintuitive, heroic bad-guy characters like the murderous Walter White on “Breaking Bad” and the lovable serial killer in “Dexter.” And the constant stream of Internet gossip and brainless factoids passing themselves off as information has normalized the grossest references to sex and violence.

Back in the 1990s, growing explicitness and obscenity in popular culture gave rise to the so-called culture wars, in which the right and the left fought over the limits of free speech. Nowadays no one blames the culture for what the culture itself has become. This is, fundamentally, a positive development. Culture isn’t an autonomous condition that develops in isolation from other trends in society.

The JFK assassination, the bloody rampage of Charles Manson and his followers, the incredible violence of the Vietnam War—shocking history-in-the-making that was once hidden now became visible in American living rooms, night after night, through new technology, TV in particular. Culture raced to catch up with the straightforward transcriptions of current events.

And, of course, the tendency of the media, as old as Lord Northcliffe and the first mass-circulation newspapers, to attract business through sex and violence only accelerated. Normalized by TV and the rest of the media, the counterculture of the 1970s was smoothly assimilated into the commercial culture of the 1980s. Recall the 15-year-old Brooke Shields appearing in a commercial for Calvin Klein jeans in 1980, spreading her legs and saying, “Do you know what comes between me and my Calvins? Nothing.” From then on, there was no going back.

Today, our cultural norms are driven in large part by technology, which in turn is often shaped by the lowest impulses in the culture. Behind the Internet’s success in making obscene images commonplace is the dirty little fact that it was the pornography industry that revolutionized the technology of the Internet. Streaming video, technology like Flash, sites that confirm the validity of credit cards were all innovations of the porn business. The Internet and pornography go together like, well, love and marriage. No wonder so much culture seems to aspire to porn’s depersonalization, absolute transparency and intolerance of secrets.

Read the entire article here.

Nineteenth Century Celebrity

You could be forgiven for believing that celebrity is a peculiar and pervasive symptom of our contemporary culture. After all, in our multi-channel, always-on, 24×7, event-driven, media-obsessed pop-culture maelstrom, celebrities come and go in the blink of an eye. This is the age of celebrity.

Well, the U.S. had its own national and international celebrity almost two hundred years ago, and he wasn’t an auto-tuned pop star or a viral internet sensation with a cute cat. His name — Marie-Joseph Paul Yves Roch Gilbert du Motier, the Marquis de La Fayette, a French nobleman and officer, and a major general in the Continental Army.

From Slate:

The Marquis de Lafayette, French nobleman and officer, was a major general in the Continental Army by the age of nineteen. When he returned for a comprehensive tour of the United States in 1824-1825, Lafayette was 67, and was the last man still living who had served at his rank in the Continental Army.

Americans loved the aging soldier for his role in the Revolutionary War, and for his help after the war in smoothing diplomatic relations between the United States and France. Moreover, he was a living connection to his friend and mentor George Washington. The combination made him a celebrity who enjoyed a frenzied reception as he made his way through all 24 states.

Women, especially, poured forth affection for the Marquis. In one beautifully lettered address, the “Young Ladies of the Lexington Female Academy” (Kentucky) showered their visitor with assurances that he was remembered by the new generation of Americans: “Even the youngest, gallant Warrior, know you; even the youngest have been taught to lisp your name.”

Lafayette’s visit inspired the production of souvenir merchandise embroidered, painted, or printed with his face and name. This napkin and glove are two examples of such products.

In his book Souvenir Nation: Relics, Keepsakes, and Curios from the Smithsonian’s National Museum of American History, William L. Bird, Jr. reports that Lafayette was uncomfortable when he encountered ladies wearing these gloves—particularly because a gentleman was expected to kiss a lady’s hand upon first meeting. Bird writes:

When offered a gloved hand at a ball in Philadelphia, Lafayette “murmur[ed] a few graceful words to the effect that he did not care to kiss himself, he [then] made a very low bow, and the lady passed on.”

Read the entire article here.

Image: La Fayette as a Lieutenant General, in 1791. Portrait by Joseph-Désiré Court. Courtesy of Wikipedia.

Frankenlanguage

An interesting story on the adoption of pop culture words into our common lexicon. Beware! The next blockbuster sci-fi movie that you see may influence your next choice of noun.

From the Guardian:

Water cooler conversation at a dictionary company tends towards the odd. A while ago I was chatting with one of my colleagues about our respective defining batches. “I’m not sure,” he said, “what to do about the plural of ‘hobbit’. There are some citations for ‘hobbitses’, but I think they may be facetious uses. Have any thoughts?”

I did: “We enter ‘hobbit’ into the dictionary?” You learn something new every day.

Pop culture is a goldmine of neologisms, and science fiction and fantasy is one rich seam that has been contributing to English for hundreds of years. Yes, hundreds: because what is Gulliver’s Travels but a fantasy satire of 18th-century travel novels? And what is Frankenstein but science fiction? The name of Mary Shelley’s monster lives on both as its own word and as a combining form used in words like “frankenfood”. And Swift’s fantasy novel was so evocative, we adopted a number of words from it, such as “Lilliputian”, the tongue-twisting “Brobdingnagian”, and – surprise – “yahoo”.

Don’t be surprised. Many words have their origins in science fiction and fantasy writing, but have been so far removed from their original contexts that we’ve forgotten. George Orwell gave us “doublespeak”; Carl Sagan is responsible for the term “nuclear winter”; and Isaac Asimov coined “microcomputer” and “robotics”. And, yes, “blaster”, as in “Hokey religions and ancient weapons are no match for a good blaster at your side, kid.”

Which brings us to the familiar and more modern era of sci-fi and fantasy, ones filled with tricorders, lightsabers, dark lords in fiery mountain fortresses, and space cowboys. Indeed, we have whole cable channels devoted to sci-fi and fantasy shows, and the big blockbuster movie this season is Star Trek (again). So why haven’t we seen “tricorder” and “lightsaber” entered into the dictionary? When will the dictionary give “Quidditch” its due? Whither “gorram”?

All fields have their own vocabulary and, as often happens, that vocabulary is often isolated to that field. When an ad executive talks about a “deck”, they are not referring to the same “deck” that poker players use, or the same “deck” that sailors work on. When specialized vocabulary does appear outside of its particular field and in more general literature, it’s often long after its initial point of origin. This process is no different with words from science fiction and fantasy. “Tricorder”, for instance, is used in print, but most often only to refer to the medical diagnostic device used in the Star Trek movies. It’s not quite generic enough to merit entry as a general vocabulary word.

In some cases, the people who gave us the word aren’t keen to see it taken outside of its intended world and used with an extended meaning. Consequently, some coinages don’t get into print as often as you’d think: “Jedi mind trick” only appears four times in the Corpus of Contemporary American English. That corpus contains over 450 million indexed words.

Savvy writers of each genre also liked to resurrect and breathe new life into old words. JRR Tolkien not only gave us “hobbit”, he also popularized the plural “dwarves”, which has appeared in English with increasing frequency since the publication of The Hobbit in 1937. “Eldritch”, which dates to the 1500s, is linked in the modern mind almost exclusively to the stories of HP Lovecraft. The verb “terraform” that was most recently popularized by Joss Whedon’s show Firefly dates back to the 1940s, though it was uncommon until Firefly aired. Prior to 1977, storm troopers were Nazis.

Even new words can look old: JK Rowling’s “muggle” is a coinage of her own devising – but there are earlier, rarer “muggles” entered into the Oxford English Dictionary (one meaning “a tail resembling that of a fish”, and another meaning “a young woman or sweetheart”), along with a “dumbledore” (“a bumble-bee”) and a “hagrid” (a variant of “hag-ridden” meaning “afflicted by nightmares”).

More interesting to the lexicographer is that, in spite of the devoted following that sci-fi and fantasy each have – of the top 10 highest-grossing film franchises in history, at least five of them are science fiction or fantasy – we haven’t adopted more sci-fi and fantasy words into general use. Perhaps, in the case of sci-fi, we just need to wait for technology to improve to the point that we can talk with our co-workers about jumping into hyperspace or hanging out on the holodeck.

Read the entire article here.

Nordic Noir and Scandinavian Cool

Apparently the world once thought of the countries that make up the Scandinavian region as dull and boring. Nothing much happened in Norway, Sweden, Finland and Denmark besides endless winters, ABBA, Volvo and utopian socialist experiments. Not any longer. Over the last couple of decades this region has become a hotbed of artistic, literary and business creativity.

From the Economist:

Twenty years ago the Nordic region was a cultural backwater. Even the biggest cities were dead after 8pm. The restaurants offered meatballs or pale versions of Italian or French favourites. The region did come up with a few cultural icons such as Ingmar Bergman and Abba, and managed to produce world-class architects and designers even at the height of post-war brutalism. But the few successes served only to emphasise the general dullness.

The backwater has now turned into an entrepot. Stockholm relishes its reputation as one of the liveliest cities in Europe (and infuriates its neighbours by billing itself as “the capital of Scandinavia”). Scandinavian crime novels have become a genre in their own right. Danish television shows such as “The Killing” and “Borgen” are syndicated across the world. Swedish music producers are fixtures in Hollywood. Copenhagen’s Noma is one of the world’s most highly rated restaurants and has brought about a food renaissance across the region.

Why has the land of the bland become a cultural powerhouse? Jonas Bonnier, CEO of the Bonnier Group, Sweden’s largest media company, thinks that it is partly because new technologies are levelling the playing field. Popular music was once dominated by British and American artists who were able to use all sorts of informal barriers to protect their position. Today, thanks to the internet, somebody sitting in a Stockholm attic can reach the world. Rovio’s Michael Hed suggests that network effects are much more powerful in small countries: as soon as one writer cracks the global detective market, dozens of others quickly follow.

All true. But there is no point in giving people microphones if they have nothing to say. The bigger reason why the region’s writers and artists—and indeed chefs and game designers—are catching the world’s attention is that they are so full of vim. They are reinventing old forms such as the detective story or the evening meal but also coming up with entirely new forms such as video games for iPads.

The cultural renaissance is thus part of the other changes that have taken place in the region. A closed society that was dominated by a single political orthodoxy (social democracy) and by a narrow definition of national identity (say, Swedishness or Finnishness) is being shaken up by powerful forces such as globalisation and immigration. All the Nordics are engaged in a huge debate about their identity in a post-social democratic world. Think-tanks such as Denmark’s Cepos flaunt pictures of Milton Friedman in the same way that student radicals once flaunted pictures of Che Guevara. Writers expose the dark underbelly of the old social democratic regime. Chefs will prepare anything under the sun as long as it is not meatballs.

The region’s identity crisis is creating a multicultural explosion. The Nordics are scavenging the world for ideas. They continue to enjoy a love-hate relationship with America. They are discovering inspiration from their growing ethnic minorities but are also reaching back into their own cultural traditions. Swedish crime writers revel in the peculiarities of their culture. Danish chefs refuse to use foreign ingredients. A region that has often felt the need to apologise for its culture—those bloodthirsty Vikings! Those toe-curling Abba lyrics! Those naff fishermen’s jumpers!—is enjoying a surge of regional pride.

Blood and snow

Over the past decade Scandinavia has become the world’s leading producer of crime novels. The two Swedes who did more than anyone else to establish Nordic noir—Stieg Larsson and Henning Mankell—have both left the scene of crime. Larsson died of a heart attack in 2004 before his three books about a girl with a dragon tattoo became a global sensation. Mr Mankell consigned his hero, Kurt Wallander, to Alzheimer’s after a dozen bestsellers. But their books continue to be bought in their millions: “Dragon Tattoo” has sold more than 50m, and the Wallander books collectively even more.

A group of new writers, such as Jo Nesbo in Norway and Camilla Lackberg in Sweden, are determined to keep the flame burning. And the crime wave is spreading beyond adult fiction and the written word. Sweden’s Martin Widmark writes detective stories for children. Swedish and British television producers compete to make the best version of Wallander. “The Killing” established a new standard for televised crime drama.

The region has a long tradition of crime writing. Per Wahloo and Maj Sjowall, a Swedish husband-and-wife team, earned a dedicated following among aficionados with their police novels in the 1960s and 1970s. They also established two of Nordic noir’s most appealing memes. Martin Beck is an illness-prone depressive who gets to the truth by dint of relentless plodding. The ten Martin Beck novels present Sweden as a capitalist hellhole that can be saved only by embracing Soviet-style communism (the crime at the heart of the novels is the social democratic system’s betrayal of its promise).

Today’s crime writers continue to profit from these conventions. Larsson’s Sweden, for example, is a crypto-fascist state run by a conspiracy of psychopathic businessmen and secret-service agents. But today’s Nordic crime writers have two advantages over their predecessors. The first is that their hitherto homogenous culture is becoming more variegated and their peaceful society has suffered inexplicable bouts of violence (such as the assassination in 1986 of Sweden’s prime minister, Olof Palme, and in 2003 of its foreign minister, Anna Lindh, and Anders Breivik’s murderous rampage in Norway in 2011). Nordic noir is in part an extended meditation on the tension between the old Scandinavia, with its low crime rate and monochrome culture, and the new one, with all its threats and possibilities. Mr Mankell is obsessed by the disruption of small-town life by global forces such as immigration and foreign criminal gangs. Each series of “The Killing” focuses as much on the fears—particularly of immigrant minorities—that the killing exposes as it does on the crime itself.

The second advantage is something that Wahloo and Sjowall would have found repulsive: a huge industry complete with support systems and the promise of big prizes. Ms Lackberg began her career in an all-female crime-writing class. Mr Mankell wrote unremunerative novels and plays before turning to a life of crime. Thanks in part to Larsson, crime fiction is one of the region’s biggest exports: a brand that comes with a guarantee of quality and a distribution system that stretches from Stockholm to Hollywood.

Dinner in Copenhagen can come as a surprise to even the most jaded foodie. The dishes are more likely to be served on slabs of rock or pieces of wood than on plates. The garnish often takes the form of leaves or twigs. Many ingredients, such as sea cabbage or wild flowers, are unfamiliar, and the more familiar sort, such as pike, are often teamed with less familiar ones, such as unripe elderberries.

Read the entire article after the jump.

Image: ABBA, Eurovision, 1974. Courtesy of Time.

A Peek inside Lichtenstein’s Head

Residents and visitors to London are fortunate — they are bombarded by the rich sights, sounds and smells of one of the world’s great cities. One such sight is Tate Modern, ex-power station, now iconic home to some really good art. In fact, they’re hosting what promises to be a great exhibit soon — a retrospective of Roy Lichtenstein from February 21 to May 27.

From the Telegraph:

Black paintwork, white brickwork, in tree-lined Greenwich Village. We’re spitting distance from Bleecker, whose elongated vowels once made music for Simon and Garfunkel and Steely Dan. When the floodwaters of the nearby Hudson inched upward and east during Hurricane Sandy, they ceased their creep yards from the steps outside.

Inside are the wood floors and fireplace of the area’s typical brownstone, but the cosy effect ends when an alcove ‘bookcase’ turns revolving door, stairway leading downwards. It’s straight from the pages of Agatha Christie, even Indiana Jones.

This is one of two entries (the other far less thrilling) to the cavernous room beneath that was once Roy Lichtenstein’s studio. The house above was used as a bolthole for visiting friends and family, ensuring he could work undisturbed, day in, day out. His watch was rigorous: 10 to 6, with 90 minutes for lunch.

The building is now home to the Lichtenstein Foundation, where every reference to his work, even wrapping paper, is assiduously filed away alongside the artist’s sketchbooks, scrapbooks and working materials. The studio is set up as it was when he was alive. Charts by the sink show dots and lines in every size, colour and combination. The walls have wooden racks designed to tip forward, preventing paint drip. One of his vast murals still hangs there – an incongruous combination of Etruscan meets Henry Moore meets a slice of Swiss cheese.

Beside a scalpel-scored worktable stands the paint-splattered stool at which the artist sat whilst drafting and redrafting his compositions. And this is the thing about Lichtenstein. His finished works look so effortless, so without their maker’s mark, that we rarely think of the hours, methods and materials that went into producing them. He sought to erase all trace of the selective artist engaged in difficult work. He is as apt to slip through our pressing fingers, as one observer put it, as drops of liquid mercury.

Roy Fox Lichtenstein had a long, uncommonly successful career, even if he did spend most of it in his studio rather than out basking in its rewards. With a retrospective of his work – the first since his death from pneumonia in 1997 aged 73 – opening at the Tate this month comes the chance to assess the painterly approach behind the Pop-inspired sheen, and it isn’t so hands-off after all.

Lichtenstein, born and raised in 1930s Manhattan, began his creative career at a time when Abstract Expressionism reigned supreme, emotional work predicated on a belief that each work is impossible to repeat. Artists sought to impress upon their public a unique signature that would reveal their inner sensibility. Brushwork, the hand-drawn line – these were the lauded aim.

Now, exiting the woodwork, were artists like Claes Oldenburg and Andy Warhol, using banal subjects to skewer such bloated clichés. The Pop crew drew plugs, step-on trash cans, dollar bills and Don Draper’s fizzy saviour, Alka Seltzer. But while most still used a grainy, obviously hand-drawn hatching or line to convey realism, Lichtenstein went a step further.

“I’d always wanted to know the difference between a mark that was art and one that wasn’t,” he said, “so I chose among the crudest types of illustration – product packaging, mail order catalogues.” It provided the type of drawing that was most opposite individual expression, and its lack of nuance appealed greatly. “I don’t care what a cup of coffee looks like,” he said. “I only care about how it’s drawn.”

Read the entire article after the jump.

Image: Ohhh…Alright… by Roy Lichtenstein, 1964. Courtesy of Roy Lichtenstein Foundation / Wikipedia.

 

The Rise of Neurobollocks

For readers of thediagonal in North America, “neurobollocks” would roughly translate to “neurobullshit”.

So what is this growing “neuro-trend”, why is there an explosion in “neuro-babble” and all things with a “neuro-” prefix, and is Malcolm Gladwell to blame?

From the New Statesman:

An intellectual pestilence is upon us. Shop shelves groan with books purporting to explain, through snazzy brain-imaging studies, not only how thoughts and emotions function, but how politics and religion work, and what the correct answers are to age-old philosophical controversies. The dazzling real achievements of brain research are routinely pressed into service for questions they were never designed to answer. This is the plague of neuroscientism – aka neurobabble, neurobollocks, or neurotrash – and it’s everywhere.

In my book-strewn lodgings, one literally trips over volumes promising that “the deepest mysteries of what makes us who we are are gradually being unravelled” by neuroscience and cognitive psychology. (Even practising scientists sometimes make such grandiose claims for a general audience, perhaps urged on by their editors: that quotation is from the psychologist Elaine Fox’s interesting book on “the new science of optimism”, Rainy Brain, Sunny Brain, published this summer.) In general, the “neural” explanation has become a gold standard of non-fiction exegesis, adding its own brand of computer-assisted lab-coat bling to a whole new industry of intellectual quackery that affects to elucidate even complex sociocultural phenomena. Chris Mooney’s The Republican Brain: the Science of Why They Deny Science – and Reality disavows “reductionism” yet encourages readers to treat people with whom they disagree more as pathological specimens of brain biology than as rational interlocutors.

The New Atheist polemicist Sam Harris, in The Moral Landscape, interprets brain and other research as showing that there are objective moral truths, enthusiastically inferring – almost as though this were the point all along – that science proves “conservative Islam” is bad.

Happily, a new branch of the neuroscience-explains-everything genre may be created at any time by the simple expedient of adding the prefix “neuro” to whatever you are talking about. Thus, “neuroeconomics” is the latest in a long line of rhetorical attempts to sell the dismal science as a hard one; “molecular gastronomy” has now been trumped in the scientised gluttony stakes by “neurogastronomy”; students of Republican and Democratic brains are doing “neuropolitics”; literature academics practise “neurocriticism”. There is “neurotheology”, “neuromagic” (according to Sleights of Mind, an amusing book about how conjurors exploit perceptual bias) and even “neuromarketing”. Hoping it’s not too late to jump on the bandwagon, I have decided to announce that I, too, am skilled in the newly minted fields of neuroprocrastination and neuroflâneurship.

Illumination is promised on a personal as well as a political level by the junk enlightenment of the popular brain industry. How can I become more creative? How can I make better decisions? How can I be happier? Or thinner? Never fear: brain research has the answers. It is self-help armoured in hard science. Life advice is the hook for nearly all such books. (Some cram the hard sell right into the title – such as John B Arden’s Rewire Your Brain: Think Your Way to a Better Life.) Quite consistently, their recommendations boil down to a kind of neo-Stoicism, drizzled with brain-juice. In a self-congratulatory egalitarian age, you can no longer tell people to improve themselves morally. So self-improvement is couched in instrumental, scientifically approved terms.

The idea that a neurological explanation could exhaust the meaning of experience was already being mocked as “medical materialism” by the psychologist William James a century ago. And today’s ubiquitous rhetorical confidence about how the brain works papers over a still-enormous scientific uncertainty. Paul Fletcher, professor of health neuroscience at the University of Cambridge, says that he gets “exasperated” by much popular coverage of neuroimaging research, which assumes that “activity in a brain region is the answer to some profound question about psychological processes. This is very hard to justify given how little we currently know about what different regions of the brain actually do.” Too often, he tells me in an email correspondence, a popular writer will “opt for some sort of neuro-flapdoodle in which a highly simplistic and questionable point is accompanied by a suitably grand-sounding neural term and thus acquires a weightiness that it really doesn’t deserve. In my view, this is no different to some mountebank selling quacksalve by talking about the physics of water molecules’ memories, or a beautician talking about action liposomes.”

Read the entire article after the jump.

Image courtesy of Amazon.

FOMO: An Important New Acronym

FOMO is an increasing “problem” for college students and other young adults. Interestingly, and somewhat ironically, FOMO seems to be a more chronic issue in a culture mediated by online social networks. So, what is FOMO? And do you have it?

From the Washington Post:

Over the past academic year, there has been an explosion of new or renewed campus activities, pop culture phenomena, tech trends, generational shifts, and social movements started by or significantly impacting students. Most can be summed up in a single word.

As someone who monitors student life and student media daily, I’ve noticed a small number of words appearing more frequently, prominently or controversially during the past two semesters on campuses nationwide. Some were brand-new. Others were redefined or reached a tipping point of interest or popularity. And still others showed a remarkable staying power, carrying over from semesters and years past.

I’ve selected 15 as finalists for what I am calling the “2011-2012 College Word of the Year Contest.” Okay, a few are actually acronyms or short phrases. But altogether the terms — whether short-lived or seemingly permanent — offer a unique glimpse at what students participated in, talked about, fretted over, and fought for this past fall and spring.

As Time Magazine’s Touré confirms, “The words we coalesce around as a society say so much about who we are. The language is a mirror that reflects our collective soul.”

Let’s take a quick look in the collegiate rearview mirror. In alphabetical order, here are my College Word of the Year finalists.

1) Boomerangers: Right after commencement, a growing number of college graduates are heading home, diploma in hand and futures on hold. They are the boomerangers, young 20-somethings who are spending their immediate college afterlife in hometown purgatory. A majority move back into their childhood bedroom due to poor employment or graduate school prospects or to save money so they can soon travel internationally, engage in volunteer work or launch their own business.

A brief homestay has long been an option favored by some fresh graduates, but it’s recently reemerged in the media as a defining activity of the current student generation.

“Graduation means something completely different than it used to 30 years ago,” student columnist Madeline Hennings wrote in January for the Collegiate Times at Virginia Tech. “At my age, my parents were already engaged, planning their wedding, had jobs, and thinking about starting a family. Today, the economy is still recovering, and more students are moving back in with mom and dad.”

2) Drunkorexia: This five-syllable word has become the most publicized new disorder impacting college students. Many students, researchers and health professionals consider it a dangerous phenomenon. Critics, meanwhile, dismiss it as a media-driven faux-trend. And others contend it is nothing more than a fresh label stamped onto an activity that students have been carrying out for years.

The affliction, which leaves students hungry and at times hung over, involves “starving all day to drink at night.” As a March report in Daily Pennsylvanian at the University of Pennsylvania further explained, it centers on students “bingeing or skipping meals in order to either compensate for alcohol calories consumed later at night, or to get drunk faster… At its most severe, it is a combination of an eating disorder and alcohol dependency.”

4) FOMO: Students are increasingly obsessed with being connected — to their high-tech devices, social media chatter and their friends during a night, weekend or roadtrip in which something worthy of a Facebook status update or viral YouTube video might occur. (For an example of the latter, check out this young woman “tree dancing” during a recent music festival.)

This ever-present emotional-digital anxiety now has a defining acronym: FOMO or Fear of Missing Out.  Recent Georgetown University graduate Kinne Chapin confirmed FOMO “is a widespread problem on college campuses. Each weekend, I have a conversation with a friend of mine in which one of us expresses the following: ‘I’m not really in the mood to go out, but I feel like I should.’ Even when we’d rather catch up on sleep or melt our brain with some reality television, we feel compelled to seek bigger and better things from our weekend. We fear that if we don’t partake in every Saturday night’s fever, something truly amazing will happen, leaving us hopelessly behind.”

Read the entire article after the jump.

Image courtesy of Urban Dictionary.

How Religions Are Born: Church of Jedi

May the Fourth was Star Wars Day. Why? Say, “May the Fourth” slowly while pretending to lisp slightly, and you’ll understand. Appropriately, Matt Cresswen over at the Guardian took this day to review the growing Jedi religion in the UK.

Would that make George Lucas God?

From the Guardian:

Today [May 4] is Star Wars Day, being May the Fourth. (Say the date slowly, several times.) Around the world, film buffs, storm troopers and Jedi are gathering to celebrate one of the greatest science fiction romps of all time. It would be easy to let the fan boys enjoy their day and be done with it. However, Jediism is a growing religion in the UK. Although the results of the 2001 census, in which 390,000 recipients stated their religion as Jedi, have been widely interpreted as a pop at the government, the UK does actually have serious Jedi.

For those of you who, like BBC producer Bill Dare, have never seen Star Wars, the Jedi are “good” characters from the films. They draw from a mystical entity binding the universe, called “the Force”. Sporting hoodies, the Jedi are generally altruistic, swift-footed and handy with a light sabre. Their enemies, Emperor Palpatine, Darth Vader and other cohorts use the dark side of the Force. By tapping into its powers, the dark side command armies of demented droids, kill Jedi and are capable of wiping out entire planets.

This week, Chi-Pa Amshe from the Church of Jediism in Anglesey, Wales, emailed me with some responses to questions. He said Jediism was growing and that they were gaining hundreds of members each month. The church made the news three years ago, after its founder, Daniel Jones, had a widely reported run-in with Tesco.

Chi-Pa Amshe, speaking as a spokesperson for the Jedi council (Falkna Kar, Anzai Kooji Cutpa and Daqian Xiong), believes that Jediism can merge with other belief systems, rather like a bolt-on accessory.

“Many of our members are in fact both Christian and Jedi,” he says. “We can no more understand the Force and our place within it than a gear in a clock could comprehend its function in moving the hands across the face. I’d like to point out that each of our members interprets their beliefs through the prism of their own lives and although we offer guidance and support, ultimately, like with the Qur’an, it is up to them to find what they need and choose their own path.”

Meeting up as a church is hard, the council explained, and members rely heavily on Skype and Facebook. They have an annual physical meeting, “where the church council is available for face-to-face questions and guidance”. They also support charity events and attend computer gaming conventions.

Meanwhile, in New Zealand, a web-based group called the Jedi Church believes that Jediism has always been around.

It states: “The Jedi religion is just like the sun, it existed before a popular movie gave it a name, and now that it has a name, people all over the world can share their experiences of the Jedi religion, here in the Jedi church.”

There are many other Jedi groups on the web, although Chi-Pa Amshe said some were “very unpleasant”. The dark side, perhaps.

[div class=attrib]Read the entire article after the jump.[end-div]

Pop art + Money = Mind Candy

[div class=attrib]From the Guardian:[end-div]

The first pop artists were serious people. The late Richard Hamilton was being double-edged and sceptical when he called a painting Hommage à Chrysler Corp. Far from emptily celebrating what Andy Warhol called “all the great modern things”, pop art in the 1950s and early 1960s took a quizzical, sideways look at what was still a very new world of consumer goods. Claes Oldenburg made floppy, saggy sculptures of stuff, which rendered the new look worn out. Warhol painted car crashes. These artists saw modern life in the same surreal and eerie way as the science fiction writer JG Ballard does in his stories and novels.

When, then, did pop art become mind candy, bubblegum, an uncritical adoration of bright lights and synthetic colours? Probably when money got involved, and Warhol was shot, never again to be as brave as he was in the 60s, or when Jeff Koons gave Reaganomics its art, or when Damien Hirst made his tenth million. Who knows? The moment when pop art sank from radical criticism to bland adulation is impossible to pinpoint.

So here we are in Qatar, where today’s pop art guru Takashi Murakami has a new show. We’re not really there, of course, but do we need to be? Murakami is pop for the digital age, a designer of images that make more sense as screensavers than as any kind of high art. In Doha, the artist who celebrated a recent British show with a giveaway cardboard sculpture exhibits a six-metre balloon self-portrait and a 100-metre work inspired by the earthquake in Japan. This follows on from a 2010 exhibition in Versailles, no less. All over the world, in settings old and new, the bright and spectacular art of Murakami is as victorious as Twitter. It is art for computers: all stimuli, no soul.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Takashi Murakami’s six-metre balloon self-portrait, part of the artist’s latest exhibition in Qatar. Courtesy of Chika Okazumi / Guardian.[end-div]

What of the Millennials?

The beatniks sought transcendence; the hippies of the sixties wanted love. Then came the punks, who were all about rage. The slackers of Generation X stood for apathy and worry. And now, coming of age, we have Generation Y, also known as the “millennials”, whose birthdays fall roughly between 1982 and 2000.

A fascinating article by William Deresiewicz, excerpted below, posits the millennials as a “post-emotional” generation. Interestingly, while this generation seems fragmented, its members are much more focused on their own “brand identity” than members of previous generations were.

[div class=attrib]From the New York Times:[end-div]

EVER since I moved three years ago to Portland, Ore., that hotbed of all things hipster, I’ve been trying to get a handle on today’s youth culture. The style is easy enough to describe — the skinny pants, the retro hats, the wall-to-wall tattoos. But style is superficial. The question is, what’s underneath? What idea of life? What stance with respect to the world?

So what’s the affect of today’s youth culture? Not just the hipsters, but the Millennial Generation as a whole, people born between the late ’70s and the mid-’90s, more or less — of whom the hipsters are a lot more representative than most of them care to admit. The thing that strikes me most about them is how nice they are: polite, pleasant, moderate, earnest, friendly. Rock ’n’ rollers once were snarling rebels or chest-beating egomaniacs. Now the presentation is low-key, self-deprecating, post-ironic, eco-friendly. When Vampire Weekend appeared on “The Colbert Report” last year to plug their album “Contra,” the host asked them, in view of the title, what they were against. “Closed-mindedness,” they said.

According to one of my students at Yale, where I taught English in the last decade, a colleague of mine would tell his students that they belonged to a “post-emotional” generation. No anger, no edge, no ego.

What is this about? A rejection of culture-war strife? A principled desire to live more lightly on the planet? A matter of how they were raised — everybody’s special and everybody’s point of view is valid and everybody’s feelings should be taken care of?

Perhaps a bit of each, but mainly, I think, something else. The millennial affect is the affect of the salesman. Consider the other side of the equation, the Millennials’ characteristic social form. Here’s what I see around me, in the city and the culture: food carts, 20-somethings selling wallets made from recycled plastic bags, boutique pickle companies, techie start-ups, Kickstarter, urban-farming supply stores and bottled water that wants to save the planet.

Today’s ideal social form is not the commune or the movement or even the individual creator as such; it’s the small business. Every artistic or moral aspiration — music, food, good works, what have you — is expressed in those terms.

Call it Generation Sell.

Bands are still bands, but now they’re little businesses, as well: self-produced, self-published, self-managed. When I hear from young people who want to get off the careerist treadmill and do something meaningful, they talk, most often, about opening a restaurant. Nonprofits are still hip, but students don’t dream about joining one, they dream about starting one. In any case, what’s really hip is social entrepreneurship — companies that try to make money responsibly, then give it all away.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Millennial Momentum, Authors: Morley Winograd and Michael D. Hais, Rutgers University Press.[end-div]

Will nostalgia destroy pop culture?

[div class=attrib]Thomas Rogers for Salon:[end-div]

Over the last decade, American culture has been overtaken by a curious, overwhelming sense of nostalgia. Everywhere you look, there seems to be some new form of revivalism going on. The charts are dominated by old-school-sounding acts like Adele and Mumford & Sons. The summer concert schedule is dominated by reunion tours. TV shows like VH1’s “I Love the 90s” allow us to endlessly rehash the catchphrases of the recent past. And, thanks to YouTube and iTunes, new forms of music and pop culture are facing increasing competition from the ever-more-accessible catalog of older acts.

In his terrific new book, “Retromania,” music writer Simon Reynolds looks at how this nostalgia obsession is playing itself out everywhere from fashion to performance art to electronic music — and comes away with a worrying prognosis. If we continue looking backward, he argues, we’ll never have transformative decades, like the 1960s, or bold movements like rock ‘n’ roll, again. If all we watch and listen to are things that we’ve seen and heard before, and revive trends that have already existed, culture becomes an inescapable feedback loop.

Salon spoke to Reynolds over the phone from Los Angeles about the importance of the 1960s, the strangeness of Mumford & Sons — and why our future could be defined by boredom.

In the book you argue that our culture has increasingly been obsessed with looking backward, and that’s a bad thing. What makes you say that?

Every day, some new snippet of news comes along that is somehow connected to reconsuming the past. Just the other day I read that the famous Reading Festival in Britain is going to be screening a 1992 Nirvana concert during their festival. These events are like cultural antimatter. They won’t be remembered 20 years from now, and the more of them there are, the more alarming it is. I can understand why people want to go to them — they’re attractive and comforting. But this nostalgia seems to have crept into everything. The other day my daughter, who is 5 years old, was at camp, and they had an ’80s day. How can my daughter even understand what that means? She said the counselors were dressed really weird.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Salon.[end-div]