Tag Archives: media

Fake News: Who’s To Blame?

Should we blame the creative originators of fake news, conspiracy theories, disinformation and click-bait hype? Or, should we blame the media for disseminating, spinning and aggrandizing these stories for their own profit or political motives? Or, should we blame us — the witless consumers?

I subscribe to the opinion that all three constituencies share responsibility — it’s very much a symbiotic relationship.

James Warren, chief media writer for Poynter, has a different opinion; he lays the blame squarely at the feet of gullible and unquestioning citizens. He makes a very compelling argument.

Perhaps if any educated political scholars remain several hundred years from now, they’ll hold up the US presidential election of 2016 as the culmination of a process in which lazy stupidity triumphed over healthy skepticism and reason.

From Hive:

The rise of “fake news” inspires the press to uncover its many practitioners worldwide, discern its economics and herald the alleged guilt-ridden soul-searching by its greatest enablers, Facebook and Google.

But the media dances around another reality with the dexterity of Beyonce, Usher and septuagenarian Mick Jagger: the stupidity of a growing number of Americans.

So thanks to Neal Gabler for taking to Bill Moyers’ website to pen, “Who’s Really to Blame for Fake News.” (Moyers)

Fake news, of course, “is an assault on the very principle of truth itself: a way to upend the reference points by which mankind has long operated. You could say, without exaggeration, that fake news is actually an attempt to reverse the Enlightenment. And because a democracy relies on truth — which is why dystopian writers have always described how future oligarchs need to undermine it — fake news is an assault on democracy as well.”

Gabler is identified here as the author of five books, without mentioning any. Well, one is 1995’s Winchell: Gossip, Power and the Culture of Celebrity. It’s a superb look at Walter Winchell, the man who really invented the gossip column and wound up with a readership and radio audience of 50 million, or two-thirds of the then-population, as he helped create our modern media world of privacy-invading gossip and personal destruction as entertainment.

“What is truly horrifying is that fake news is not the manipulation of an unsuspecting public,” Gabler writes of our current mess. “Quite the opposite. It is willful belief by the public. In effect, the American people are accessories in their own disinformation campaign. That is our current situation, and it is no sure thing that either truth or democracy survives.”

Think of it. The goofy stories, the lies, the conspiracy theories that now routinely gain credibility among millions who can’t be bothered to read a newspaper or decent digital site and can’t differentiate between Breitbart and The New York Times. Ask all those pissed-off Trump loyalists in rural towns to name their two U.S. senators.

We love convincing ourselves of the strengths of democracy, including the inevitable collective wisdom setting us back on a right track if ever we go astray. And while the media may hold itself out as cultural anthropologists in explaining the “anger” or “frustration” of “real people,” as is the case after Donald Trump’s election victory, we won’t really underscore rampant illiteracy and incomprehension.

So read Gabler. “Above all else, fake news is a lazy person’s news. It provides passive entertainment, demanding nothing of us. And that is a major reason we now have a fake news president.”

Read the entire essay here.

Image: Artist’s conception of an alien spacecraft tractor-beaming a human victim. Courtesy: unknown artist, Wikipedia. Public Domain.

The Elitist Media

The sad, low-energy, loser, elitist media just can’t get it right.

Local and national newspapers, magazines, TV and online media — on the left and right — continue to endorse Hillary Clinton and bash Donald Trump. Some have never endorsed a candidate before, while others have never endorsed a Democrat for President. Perhaps not surprisingly, the non-elitist National Enquirer has endorsed Trump. Here are just a few of those elitist endorsements:

The Atlantic: Against Donald Trump

“[O]ur interest here is not to advance the prospects of the Democratic Party, nor to damage those of the Republican Party,” the editorial concludes. “We believe in American democracy, in which individuals from various parties of different ideological stripes can advance their ideas and compete for the affection of voters. But Trump is not a man of ideas. He is a demagogue, a xenophobe, a sexist, a know-nothing, and a liar. He is spectacularly unfit for office, and voters—the statesmen and thinkers of the ballot box—should act in defense of American democracy and elect his opponent.

USA Today: Trump is ‘unfit for the presidency’

From the day he declared his candidacy 15 months ago through this week’s first presidential debate, Trump has demonstrated repeatedly that he lacks the temperament, knowledge, steadiness and honesty that America needs from its presidents.

Arizona Republic: Hillary Clinton is the only choice to move America ahead

Since The Arizona Republic began publication in 1890, we have never endorsed a Democrat over a Republican for president. Never. This reflects a deep philosophical appreciation for conservative ideals and Republican principles. This year is different. The 2016 Republican candidate is not conservative and he is not qualified.

Dallas Morning News: We recommend Hillary Clinton for president 

Trump’s values are hostile to conservatism. He plays on fear — exploiting base instincts of xenophobia, racism and misogyny — to bring out the worst in all of us, rather than the best. His serial shifts on fundamental issues reveal an astounding absence of preparedness. And his improvisational insults and midnight tweets exhibit a dangerous lack of judgment and impulse control.

Houston Chronicle: These are unsettling times that require a steady hand: That’s Hillary Clinton

Any one of Trump’s less-than-sterling qualities – his erratic temperament, his dodgy business practices, his racism, his Putin-like strongman inclinations and faux-populist demagoguery, his contempt for the rule of law, his ignorance – is enough to be disqualifying. His convention-speech comment, “I alone can fix it,” should make every American shudder. He is, we believe, a danger to the Republic.

Cincinnati Enquirer: It has to be Hillary Clinton

Trump is a clear and present danger to our country. He has no history of governance that should engender any confidence from voters. Trump has no foreign policy experience, and the fact that he doesn’t recognize it – instead insisting that, “I know more about ISIS than the generals do” – is even more troubling.

How to Tell the Difference Between a Liar and a Bulls**t Artist

In this age of media-fueled online vitriol, denigration, falsehood, and shamelessness — elevated to an art form by the Republican nominee for President — it’s critically important for us to understand the difference between a liar and a bulls**t artist.

The liar is interested in the truth, deep down, but she prefers to hide it behind a veil. The liar often has knowledge or expertise about the truth, but hides it. The bulls**t artist, on the other hand, is an entirely different animal. He is detached from reality, caring not for truth or lies; he only cares for his desired effect on his intended audience. Knowledge or expertise is not required; indeed, its absence helps.

I’ll let you determine to which group Mr. Trump belongs. But, if you need help, check out CNN’s Fareed Zakaria reminding us about Mr. Trump’s unabashed ignorance and bulls**t-artistry.

[Video: Fareed Zakaria on CNN, youtube.com/watch?v=Y8nhV4HM3jI]

From the Guardian:

As the past few decades have shown, the trolling mindset is awesomely well adapted to a digital age. It ignores rational argument. It ignores evidence. It misreads, deliberately. It uses anything and everything somebody says against them. To argue with trolls is to lose – to give them what they want. A troll is interested in impact to the exclusion of all else.

Trolls themselves are hairy Nordic creatures who live under bridges, but trolling doesn’t take its name from them. It comes from the Old French verb troller, meaning to hunt by wandering around in the hope of stumbling upon prey. The word made its way into English as a description of similar fishing tactics: slowly towing a lure in hope of a bite.

Then, in the early 1990s, a Usenet group took up the term to describe some users’ gleeful baiting of the naive: posting provocative comments in hope of attracting an outraged “bite”, then winding up their unwitting victim as thoroughly as possible.

In this, trolling is a form of bullshit art. “The essence of bullshit,” argues the philosopher Harry Frankfurt in his 2005 book of the same name, “is not that it is false but that it is phony”.

Both a liar and an honest person are interested in the truth – they’re playing on opposite sides in the same game. A bullshitter, however, has no such constraint. As Frankfurt puts it, a bullshitter “is neither on the side of the true nor on the side of the false … He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose”.

Once again, impact is all. The total absence of knowledge or expertise is no barrier to bullshit. In fact, it helps. The artistry lies in knowing your audience, and saying whatever is needed in order to achieve a desired effect.

Read the entire article here.

Video courtesy of CNN.

Facebook’s Growing Filter Bubble

I’ve been writing about the filter bubble for quite some time. The filter bubble refers to the tendency of online search tools, and now social media, to screen and deliver results that fit our online history and profile, thereby returning only results that are deemed relevant. Eli Pariser coined the term in his book The Filter Bubble, published in 2011.

The filter bubble presents us with a clear Faustian bargain: give up knowledge and serendipitous discovery of the wider world for narrow, personalized news and information that matches our immediate needs and agrees with our profile.

The great irony is that our technologies promise a limitless, interconnected web of data and information, but these same technologies ensure that we will see only the small sliver of information that passes through our personal, and social, filters. This consigns us to live inside our very own personal echo chambers, separated from disagreeable information that does not pass criteria in our profiles or measures gleaned across our social networks.
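As a minimal illustration of the mechanism (a sketch only, not any platform’s actual algorithm), profile-based filtering can be as simple as ranking items by how much they overlap with a user’s past engagement, so that unfamiliar or disagreeable topics simply never surface:

```python
from collections import Counter

def rank_for_user(items, profile, top_k=3):
    """Return the top_k items whose topics best match the user's profile.

    items   -- list of dicts, each with 'title' and 'topics' keys
    profile -- Counter mapping a topic to the user's past engagement with it
    """
    def score(item):
        # Sum the user's past engagement over the item's topics.
        return sum(profile[topic] for topic in item["topics"])

    return sorted(items, key=score, reverse=True)[:top_k]

# Hypothetical engagement history: this user clicks mostly on sport and celebrity.
profile = Counter({"sport": 12, "celebrity": 8, "politics": 1, "science": 0})

items = [
    {"title": "Transfer rumours roundup", "topics": ["sport", "celebrity"]},
    {"title": "Election policy analysis", "topics": ["politics"]},
    {"title": "New neutrino result explained", "topics": ["science"]},
    {"title": "Awards night fashion recap", "topics": ["celebrity"]},
]

for item in rank_for_user(items, profile, top_k=2):
    print(item["title"])
# The politics and science stories never make the cut: the bubble in miniature.
```

Real systems rely on engagement signals and machine-learned relevance scores rather than simple topic counts, but the feedback loop is the same: what you have already clicked on determines what you are shown next.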

So, we should all be concerned as Facebook turns its attention to delivering and filtering news, and curating it in a quest for a more profitable return. Without question we are in the early stages of the reinvention of journalism as a whole and digital news in particular. The logical conclusion of this evolution has yet to be written, but it is certainly clear that handing so much power over the dissemination of news and information to one company cannot be in our long-term interests. If Mr. Zuckerberg and team deem certain political news to be personally distasteful or contrary to their corporate mission, should we sit back and allow them to filter it for us? I think not.

From Wired:

When Facebook News Feed guru Will Cathcart took the stage at F8 to talk about news, the audience was packed. Some followed along on Twitter. Others streamed the session online. Journalists, developers, and media types all clamored to catch a glimpse of “Creating Value for News Publishers and Readers on Facebook”—value that has become the most coveted asset in the news business as Facebook becomes a primary way the public finds and shares news.

As Cathcart kicked off the session, he took the captive audience to a Syrian refugee camp via Facebook’s new, innovative, and immersive 360 video experience. He didn’t say much about where the camp was (“I believe in Greece?”), nor anything about the camp situation. He didn’t offer the audio of the journalist describing the scene. No matter!

The refugee camp is a placeholder. A placeholder, in fact, that has become so overused that it was actually the second time yesterday that Facebook execs waved their hands about the importance of media before playing a video clip of refugees. It could have been a tour of the White House, the Boston bombing, Coachella. It could have been anything to Facebook. It’s “content.” It’s a commodity. What matters to Facebook is the product it’s selling—and who’s buying is you and the news industry.

What Facebook is selling you is pretty simple. It’s selling an experience, part of which includes news. That experience is dependent on content creators—you know, journalists and newsrooms—who come up with ideas, use their own resources to realize them, and then put them out into the world. All of which takes time, money, and skill. For its “media partners” (the CNNs, BuzzFeeds, and WIREDs of the world), Facebook is selling a promise that their future will be bright if they use Facebook’s latest news products to distribute those new, innovative, and immersive stories to Facebook’s giant audience.

The only problem is that Facebook’s promise isn’t a real one. It’s false hope; or at its worst, a threat.

Read the entire article here.

A Case For Less News


I find myself agreeing with columnist Oliver Burkeman over at the Guardian that we need to carefully manage our access to the 24/7 news cycle. Our news media has learned to thrive on hyperbole and sensationalism, which — let’s face it — tends to be mostly negative. This unending and unnerving stream of gloom and doom tends to make us believe that we are surrounded by more badness than there actually is. I have to believe that most of the 7 billion+ personal stories each day that we could be hearing about — however mundane — are unlikely to be bad or evil. So, while it may not be wise to switch off cable or satellite news completely, we should consider a more measured, and balanced, approach to the media monster.

From the Guardian:

A few days before Christmas, feeling rather furtive about it, I went on a media diet: I quietly unsubscribed from, unfollowed or otherwise disconnected from several people and news sources whose output, I’d noticed, did nothing but bring me down. This felt like defeat. I’ve railed against the popular self-help advice that you should “give up reading the news” on the grounds that it’s depressing and distracting: if bad stuff’s happening out there, my reasoning goes, I don’t want to live in an artificial bubble of privilege and positivity; I want to face reality. But at some point during 2015’s relentless awfulness, it became unignorable: the days when I read about another mass shooting, another tale of desperate refugees or anything involving the words “Donald Trump” were the days I’d end up gloomier, tetchier, more attention-scattered. Needless to say, I channelled none of this malaise into making the planet better. I just got grumbly about the world, like a walking embodiment of that bumper-sticker: “Where are we going, and why are we in this handbasket?”

One problem is that merely knowing that the news focuses disproportionately on negative and scary stories doesn’t mean you’ll adjust your emotions accordingly. People like me scorn Trump and the Daily Mail for sowing unwarranted fears. We know that the risk of dying in traffic is vastly greater than from terrorism. We may even know that US gun crime is in dramatic decline, that global economic inequality is decreasing, or that there’s not much evidence that police brutality is on the rise. (We just see more of it, thanks to smartphones.) But, apparently, the part of our minds that knows these facts isn’t the same part that decides whether to feel upbeat or despairing. It’s entirely possible to know things are pretty good, yet feel as if they’re terrible.

This phenomenon has curious parallels with the “busyness epidemic”. Data on leisure time suggests we’re not much busier than we were, yet we feel busier, partly because – for “knowledge workers”, anyway – there’s no limit to the number of emails we can get, the demands that can be made of us, or the hours of the day we can be in touch with the office. Work feels infinite, but our capacities are finite, therefore overwhelm is inevitable. Similarly, technology connects us to more and more of the world’s suffering, of which there’s an essentially infinite amount, until feeling steamrollered by it becomes structurally inevitable – not a sign that life’s getting worse. And the consequences go beyond glumness. They include “compassion fade”, the well-studied effect whereby our urge to help the unfortunate declines as their numbers increase.

Read the whole column here.

Image courtesy of Google Search.

Neutrinos in the News

Something’s up. Perhaps there’s some degree of hope that we may be reversing the tide of “dumbeddownness” in the stories that the media pumps through its many tubes to reach us. So, it comes as a welcome surprise to see articles about the very, very small making big news in publications like the New Yorker. Stories about neutrinos no less. Thank you New Yorker for dumbing us up. And, kudos to the latest Nobel laureates — Takaaki Kajita and Arthur B. McDonald — for helping us understand just a little bit more about our world.

From the New Yorker:

This week the 2015 Nobel Prize in Physics was awarded jointly to Takaaki Kajita and Arthur B. McDonald for their discovery that elementary particles called neutrinos have mass. This is, remarkably, the fourth Nobel Prize associated with the experimental measurement of neutrinos. One might wonder why we should care so much about these ghostly particles, which barely interact with normal matter.

Even though the existence of neutrinos was predicted in 1930, by Wolfgang Pauli, none were experimentally observed until 1956. That’s because neutrinos almost always pass through matter without stopping. Every second of every day, more than six trillion neutrinos stream through your body, coming directly from the fiery core of the sun—but most of them go right through our bodies, and the Earth, without interacting with the particles out of which those objects are made. In fact, on average, those neutrinos would be able to traverse more than one thousand light-years of lead before interacting with it even once.

The very fact that we can detect these ephemeral particles is a testament to human ingenuity. Because the rules of quantum mechanics are probabilistic, we know that, even though almost all neutrinos will pass right through the Earth, a few will interact with it. A big enough detector can observe such an interaction. The first detector of neutrinos from the sun was built in the nineteen-sixties, deep within a mine in South Dakota. An area of the mine was filled with a hundred thousand gallons of cleaning fluid. On average, one neutrino each day would interact with an atom of chlorine in the fluid, turning it into an atom of argon. Almost unfathomably, the physicist in charge of the detector, Raymond Davis, Jr., figured out how to detect these few atoms of argon, and, four decades later, in 2002, he was awarded the Nobel Prize in Physics for this amazing technical feat.

Because neutrinos interact so weakly, they can travel immense distances. They provide us with a window into places we would never otherwise be able to see. The neutrinos that Davis detected were emitted by nuclear reactions at the very center of the sun, escaping this incredibly dense, hot place only because they so rarely interact with other matter. We have been able to detect neutrinos emerging from the center of an exploding star more than a hundred thousand light-years away.

But neutrinos also allow us to observe the universe at its very smallest scales—far smaller than those that can be probed even at the Large Hadron Collider, in Geneva, which, three years ago, discovered the Higgs boson. It is for this reason that the Nobel Committee decided to award this year’s Nobel Prize for yet another neutrino discovery.
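As a back-of-envelope check on the “thousand light-years of lead” figure quoted above (my own rough inputs, assuming a weak-interaction cross-section of roughly 1e-44 cm² for solar-energy neutrinos, not a value taken from the article):

```python
# Rough estimate of a solar neutrino's mean free path in lead.
# The cross-section is an assumed order of magnitude, not a measured input.
AVOGADRO = 6.022e23        # nuclei per mole
LEAD_DENSITY = 11.34       # g/cm^3
LEAD_MOLAR_MASS = 207.2    # g/mol
CROSS_SECTION = 1e-44      # cm^2, assumed typical for MeV-scale neutrinos
LIGHT_YEAR_CM = 9.461e17   # cm

n = LEAD_DENSITY * AVOGADRO / LEAD_MOLAR_MASS   # number density of lead nuclei
mean_free_path_cm = 1.0 / (n * CROSS_SECTION)   # mean free path = 1 / (n * sigma)

print(f"~{mean_free_path_cm / LIGHT_YEAR_CM:,.0f} light-years of lead")
# With these inputs the result comes out at a few thousand light-years, in line
# with the article's "more than one thousand light-years" claim.
```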

Read the entire story here.

Xenophobia: Terrorism of the Mind

I suspect xenophobia is a spectrum disorder. At one end of the spectrum we see the acts of fundamentalist terrorists following their apocalyptic (and isolationist) scripts to their barbaric conclusions. At the other end, we hear the segregationist rants of talking heads demanding litmus tests for migrants, refugees and victims of violence. And, it’s all the more distasteful when one of the talking heads controls vast swathes of the global media.

So, shame on you, Rupert Murdoch, for suggesting that the US allow entry only to proven Christian refugees. Clearly, tolerance, understanding and inclusiveness are not concepts that Mr. Murdoch understands — a lack he has in common with those he accuses.

From the Guardian:

I see Rupert Murdoch has come up with a foolproof method to ensure that the United States is safe from terrorism.

In a tweet offering advice to the American president, he wrote:

“Obama facing enormous opposition in accepting refugees. Maybe make special exception for proven Christians”

Oh yes he did. Does the News Corp boss not realise that this is just the kind of response to terrorism that the terrorists seek to provoke?

Ostracising all Muslims by refusing them sanctuary on the grounds that they are potential terrorists is likely to be counter-productive. And, incidentally, is it not unChristian?

I note that the editor of Newsnight, Ian Katz, tongue firmly in cheek, tweeted back to Murdoch:

“Interesting idea… will you come and talk about it on @BBCNewsnight”.

But he didn’t take up the offer. He obviously prefers to let his wisdom shine through in 140 characters.

I am also queasy about the Tuesday editorial in Murdoch’s favourite newspaper, the Sun, which called on British-based Muslims to prove their opposition to “the jihadis” by marching through London with placards saying “not in our name”.

Rightly, the paper points out that Isis “seeks to establish violent, oppressive fundamentalism as the only true faith and to divide Muslims from non-Muslims.”

But I wonder whether the Sun realises that its message is similar: the effect of treating Muslims, all Muslims, as some kind of homogenous entity (and as a thing apart) is more likely to foment divisions with non-Muslims and alienate Muslims still further.

Read the entire story here.

Psychic Media Watch

Watching the media is one of my favorite amateur pursuits. It’s a continuous source of paradox, infotainment, hypocrisy, truthiness (Stephen Colbert, 2005), loud-mouthery (me, 2015) and, hence, enjoyment. So, when two opposing headlines collide mid-way across the Atlantic, it’s hard for me to resist highlighting the dissonance. I snapped both these stories on the same day, August 28, 2015. The headlines read:

New York Times:

Apparently, fortunetelling is “a scam”, according to convicted New York psychic Celia Mitchell.

The Independent:


Yet, in the UK, the College of Policing recommends using psychics to find missing persons.

Enjoy.

Your Goldfish is Better Than You


Well, perhaps not at philosophical musings or mathematics. But, your little orange aquatic friend now has an attention span that is longer than yours. And, it’s all thanks to mobile devices and multi-tasking on multiple media platforms. [Psst, by the way, multi-tasking at the level of media consumption is a fallacy]. On average, the adult attention span is now down to a laughably paltry 8 seconds, whereas the lowly goldfish comes in at 9 seconds. Where that leaves your inbetweeners and teenagers is, of course, anyone’s guess.

From the Independent:

Humans have become so obsessed with portable devices and overwhelmed by content that we now have attention spans shorter than that of the previously jokingly juxtaposed goldfish.

Microsoft surveyed 2,000 people and used electroencephalograms (EEGs) to monitor the brain activity of another 112 in the study, which sought to determine the impact that pocket-sized devices and the increased availability of digital media and information have had on our daily lives.

Among the good news in the 54-page report is that our ability to multi-task has drastically improved in the information age, but unfortunately attention spans have fallen.

In 2000 the average attention span was 12 seconds, but this has now fallen to just eight. The goldfish is believed to be able to maintain a solid nine.

“Canadians [who were tested] with more digital lifestyles (those who consume more media, are multi-screeners, social media enthusiasts, or earlier adopters of technology) struggle to focus in environments where prolonged attention is needed,” the study reads.

“While digital lifestyles decrease sustained attention overall, it’s only true in the long-term. Early adopters and heavy social media users front load their attention and have more intermittent bursts of high attention. They’re better at identifying what they want/don’t want to engage with and need less to process and commit things to memory.”

Anecdotally, many of us can relate to the increasing inability to focus on tasks, being distracted by checking your phone or scrolling down a news feed.

Another recent study by the National Centre for Biotechnology Information and the National Library of Medicine in the US found that 79 per cent of respondents used portable devices while watching TV (known as dual-screening) and 52 per cent check their phone every 30 minutes.

Read the entire story here.

Image: Common Goldfish. Public Domain.

 

Baroness Thatcher and the Media Baron

The cozy yet fraught relationship between politicians and powerful figures in the media has been with us since the first days of newsprint. It’s a delicate symbiosis of sorts — the politician needs the media magnate to help acquire and retain power; the media baron needs the politician to help shape and centralize it. The underlying motivations seem similar for both parties, hence the symbiosis — self-absorption, power, vanity.

So, it comes as no surprise to read intimate details of the symbiotic Rupert Murdoch / Margaret Thatcher years. Prime Minister Thatcher would sometimes actively, but often surreptitiously, support Murdoch’s megalomaniacal desire to corner the UK (and global) media, while Murdoch would ensure his media outlets channeled suitably Thatcher-friendly news, spin and op-ed. But the Thatcher-Murdoch story is just the latest in a long line of business deals between puppet and puppet-master [you may decide which is which, dear reader]. Over the last hundred years we’ve had William Randolph Hearst and Roosevelt, Lloyd George and Northcliffe, Harold Wilson and Robert Maxwell, Baldwin and Beaverbrook.

Thomas Jefferson deplored newspapers — seeing them as vulgar and cancerous. His prescient analysis of the troubling and complex relationship between the news and politics is just as valid today: “an evil for which there is no remedy; our liberty depends on the freedom of the press, and this cannot be limited without being lost”.

Yet for all the grievous faults and dubious shenanigans of the brutish media barons and their fickle political spouses, the Thatcher-Murdoch story is perhaps not as sinister as one might first think. We now live in an age where faceless corporations and billionaires broker political power and shape policy behind mountains of money, obfuscated institutions and closed doors. This is far more troubling for our democracies. I would rather fight an evil that has a face.

From the Guardian:

The coup that transformed the relationship between British politics and journalism began at a quiet Sunday lunch at Chequers, the official country retreat of the prime minister, Margaret Thatcher. She was trailing in the polls, caught in a recession she had inherited, eager for an assured cheerleader at a difficult time. Her guest had an agenda too. He was Rupert Murdoch, eager to secure her help in acquiring control of nearly 40% of the British press.

Both parties got what they wanted.

The fact that they met at all, on 4 January 1981, was vehemently denied for 30 years. Since their lie was revealed, it has been possible to uncover how the greatest extension of monopoly power in modern press history was planned and executed with such furtive brilliance.

All the wretches in the subsequent hacking sagas – the predators in the red-tops, the scavengers and sleaze merchants, the blackmailers and bribers, the liars, the bullies, the cowed politicians and the bent coppers – were but the detritus of a collapse of integrity in British journalism and political life. At the root of the cruelties and extortions exposed in the recent criminal trials at the Old Bailey, was Margaret Thatcher’s reckless engorgement of the media power of her guest that January Sunday. The simple genesis of the hacking outrages is that Murdoch’s News International came to think it was above the law, because it was.

Thatcher achieved much as a radical prime minister confronted by political turmoil and economic torpor. So did Murdoch, in his liberation of British newspapers from war with the pressroom unions, and by wresting away the print unions’ monopoly of access to computer technology. I applauded his achievements, and still do, as I applauded many of Thatcher’s initiatives when I chaired the editorial boards of the Sunday Times (1967-81) and then the Times (1981-2). It is sad that her successes are stained by recent evidence of her readiness to ensure sunshine headlines for herself in the Murdoch press (especially when it was raining), at a heavy cost to the country. She enabled her guest to avoid a reference to the Monopolies and Mergers Commission, even though he already owned the biggest-selling daily newspaper, the Sun, and the biggest selling Sunday newspaper, the News of the World, and was intent on acquiring the biggest-selling quality weekly, the Sunday Times, and its stablemate, the Times. 

 Times Newspapers had long cherished their independence. In 1966, when the Times was in financial difficulty, the new owner who came to the rescue, Lord Roy Thomson of Fleet, promised to sustain it as an independent non-partisan newspaper – precisely how he had conducted the profitable Sunday Times. Murdoch was able to acquire both publications in 1981 only because he began making solemn pledges that he would maintain the tradition of independence. He broke every one of those promises in the first years. His breach of the undertakings freely made for Times Newspapers was a marked contrast with the independent journalism we at the Sunday Times (and William Rees-Mogg at the Times) had enjoyed under the principled ownership of the Thomson family. Thatcher was a vital force in reviving British competitiveness, but she abetted a concentration of press power that became increasingly arrogant and careless of human dignity in ways that would have appalled her, had she remained in good health long enough to understand what her actions had wrought.

Documents released by the Thatcher Archive Trust, now housed at Churchill College, Cambridge, give the lie to a litany of Murdoch-Thatcher denials about collusion during the bidding for Times Newspapers. They also expose a crucial falsehood in the seventh volume of The History of the Times: The Murdoch Years – the official story of the newspaper from 1981-2002, published in 2005 by the Murdoch-owned HarperCollins. In it Graham Stewart wrote, in all innocence, that Murdoch and Thatcher “had no communication whatsoever during the period in which the Times bid and presumed referral to the Monopolies and Mergers Commission was up for discussion”.

Read the entire story here.

 

Endless Political Campaigning


The great capitalist market has decided — endless political campaigning in the United States is beneficial. If you think the presidential campaign to elect the next leader in 2016 began sometime last year, you are not mistaken. In fact, it really does seem that political posturing for the next election often begins before the current one is even decided. We all complain: too many ads, too much negativity, far too much inanity and little substance. Yet, we allow the process to continue, and to grow in scale. Would you put up with a political campaign that lasts a mere 38 days? The British seem to do it. But, then again, the United States is so much more advanced, right?

From WSJ:

On March 23, Ted Cruz announced he is running for president in a packed auditorium at Liberty University in Lynchburg, Va. On April 7, Rand Paul announced he is running for president amid the riverboat décor of the Galt House hotel in Louisville, Ky. On April 12, Hillary Clinton announced she is running for president in a brief segment of a two-minute video. On April 13, Marco Rubio announced he is running before a cheering crowd at the Freedom Tower in Miami. And these are just the official announcements.

Jeb Bush made it known in December that he is interested in running. Scott Walker’s rousing speech at the Freedom Summit in Des Moines, Iowa, on Jan. 24 left no doubt that he will enter the race. Chris Christie’s appearance in New Hampshire last week strongly suggests the same. Previous presidential candidates Mike Huckabee, Rick Perry and Rick Santorum seem almost certain to run. Pediatric surgeon Ben Carson is reportedly ready to announce his run on May 4 at the Detroit Music Hall.

With some 570 days left until Election Day 2016, the race for president is very much under way—to the dismay of a great many Americans. They find the news coverage of the candidates tiresome (what did Hillary order at Chipotle?), are depressed by the negative campaigning that is inevitable in an adversarial process, and dread the onslaught of political TV ads. Too much too soon!

They also note that other countries somehow manage to select their heads of government much more quickly. The U.K. has a general election campaign going on right now. It began on March 30, when the queen, on the advice of the prime minister, dissolved Parliament, and voting will take place on May 7. That’s 38 days later. Britons are complaining that the electioneering goes on too long.

American presidential campaigns did not always begin so soon, but they have for more than a generation now. As a young journalist, Sidney Blumenthal (in recent decades a consigliere to the Clintons) wrote quite a good book titled “The Permanent Campaign.” It was published in 1980. Mr. Blumenthal described what was then a relatively new phenomenon.

When Jimmy Carter announced his candidacy for president in January 1975, he was not taken particularly seriously. But his perseverance paid off, and he took the oath of office two years later. His successors—Ronald Reagan, George H.W. Bush and Bill Clinton—announced their runs in the fall before their election years, although they had all been busy assembling campaigns before that. George W. Bush announced in June 1999, after the adjournment of the Texas legislature. Barack Obama announced in February 2007, two days before Lincoln’s birthday, in Lincoln’s Springfield, Ill. By that standard, declared candidates Mr. Cruz, Mr. Paul, Mrs. Clinton and Mr. Rubio got a bit of a late start.

Why are American presidential campaigns so lengthy? And is there anything that can be done to compress them to a bearable timetable?

One clue to the answers: The presidential nominating process, the weakest part of our political system, is also the one part that was not envisioned by the Founding Fathers. The framers of the Constitution created a powerful presidency, confident (justifiably, as it turned out) that its first incumbent, George Washington, would set precedents that would guide the republic for years to come.

But they did not foresee that even in Washington’s presidency, Americans would develop political parties, which they abhorred. The Founders expected that later presidents would be chosen, usually by the House of Representatives, from local notables promoted by different states in the Electoral College. They did not expect that the Federalist and Republican parties would coalesce around two national leaders—Washington’s vice president, John Adams, and Washington’s first secretary of state, Thomas Jefferson—in the close elections of 1796 and 1800.

The issue then became: When a president followed George Washington’s precedent and retired after two terms, how would the parties choose nominees, in a republic that, from the start, was regionally, ethnically and religiously diverse?

Read the entire story here.

Image courtesy of Google Search.

News Anchor as Cult Hero


Why and when did the news anchor, or newsreader as he or she is known in non-US parts of the world, acquire the status of cult hero? And, why is this a peculiarly US phenomenon? Let’s face it, TV newsreaders in the UK, on the BBC or ITV, certainly do not have a following along the lines of their US celebrity counterparts like Brian Williams, Megyn Kelly or Anderson Cooper. Why?

From the Guardian:

A game! Spot the odd one out in the following story. This year has been a terrible one so far for those who care about American journalism: the much-loved New York Times journalist David Carr died suddenly on 12 February; CBS correspondent Bob Simon was killed in a car crash the day before; Jon Stewart, famously the “leading news source for young Americans”, announced that he is quitting the Daily Show; his colleague Stephen Colbert is moving over from news satire to the softer arena of a nightly talk show; NBC anchor Brian Williams, as famous in America as Jeremy Paxman is in Britain, has been suspended after it was revealed he had “misremembered” events involving himself while covering the war in Iraq; Bill O’Reilly, an anchor on Fox News, the most watched cable news channel in the US, has been accused of being on similarly vague terms with the truth.

News of the Fox News anchor probably sounds like “dog bites man” to most Britons, who remember that this network recently described Birmingham as a no-go area for non-Muslims. But this latest scandal involving O’Reilly reveals something quite telling about journalism in America.

Whereas in Britain journalists are generally viewed as occupying a place on the food chain somewhere between bottom-feeders and cockroaches, in America there remains, still, a certain idealisation of journalists, protected by a gilded halo hammered out by sentimental memories of Edward R Murrow and Walter Cronkite.

Even while Americans’ trust in mass media continues to plummet, journalists enjoy a kind of heroic fame that would baffle their British counterparts. Television anchors and commentators, from Rachel Maddow on the left to Sean Hannity on the right, are lionised in a way that, say, Huw Edwards, is, quite frankly, not. A whole genre of film exists in the US celebrating the heroism of journalists, from All the President’s Men to Good Night, and Good Luck. In Britain, probably the most popular depiction of journalists came from Spitting Image, where they were snuffling pigs in pork-pie hats.

So whenever a journalist in the US has been caught lying, the ensuing soul-searching and garment-rending discovery has been about as prolonged and painful as a PhD on proctology. The New York Times and the New Republic both imploded when it was revealed that their journalists, respectively Jayson Blair and Stephen Glass, had fabricated their stories. Their tales have become part of American popular culture – The Wire referenced Blair in its fifth season and a film was made about the New Republic’s scandal – like national myths that must never be forgotten.

By contrast, when it was revealed that The Independent’s Johann Hari had committed plagiarism and slandered his colleagues on Wikipedia, various journalists wrote bewildering defences of him and the then Independent editor said initially that Hari would return to the paper. Whereas Hari’s return to the public sphere three years after his resignation has been largely welcomed by the British media, Glass and Blair remain shunned figures in the US, more than a decade after their scandals.

Which brings us back to the O’Reilly scandal, now unfolding in the US. Once it was revealed that NBC’s liberal Brian Williams had exaggerated personal anecdotes – claiming to have been in a helicopter that was shot at when he was in the one behind, for starters – the hunt was inevitably on for an equally big conservative news scalp. Enter stage left: Bill O’Reilly.

So sure, O’Reilly claimed that in his career he has been in “active war zones” and “in the Falklands” when he in fact covered a protest in Buenos Aires during the Falklands war. And sure, O’Reilly’s characteristically bullish defence that he “never said” he was “on the Falkland Islands” (original quote: “I was in a situation one time, in a war zone in Argentina, in the Falklands …”) and that being at a protest thousands of miles from combat constitutes “a war zone” verges on the officially bonkers (as the Washington Post put it, “that would mean that any reporter who covered an anti-war protest in Washington during the Iraq War was doing combat reporting”). But does any of this bother either O’Reilly or Fox News? It does not.

Unlike Williams, who slunk away in shame, O’Reilly has been bullishly combative, threatening journalists who dare to cover the story and saying that they deserve to be “in the kill zone”. Fox News too has been predictably untroubled by allegations of lies: “Fox News chairman and CEO Roger Ailes and all senior management are in full support of Bill O’Reilly,” it said in a statement.

Read the entire story here.

Image courtesy of Google Search.

Why Are We Obsessed With Zombies?


Previous generations worried about Frankenstein, evil robots, even more evil aliens, hungry dinosaurs and, more recently, vampires. Nowadays our culture seems to be singularly obsessed with zombies. Why?

From the Conversation:

The zombie invasion is here. Our bookshops, cinemas and TVs are dripping with the pustulating debris of their relentless shuffle to cultural domination.

A search for “zombie fiction” on Amazon currently provides you with more than 25,000 options. Barely a week goes by without another onslaught from the living dead on our screens. We’ve just seen the return of one of the most successful of these, The Walking Dead, starring Andrew Lincoln as small-town sheriff, Rick Grimes. The show follows the adventures of Rick and fellow survivors as they kill lots of zombies and increasingly, other survivors, as they desperately seek safety.

Generational monsters

Since at least the late 19th century each generation has created fictional enemies that reflect a broader unease with cultural or scientific developments. The “Yellow Peril” villains such as Fu Manchu were a response to the massive increase in Chinese migration to the US and Europe from the 1870s, for example.

As the industrial revolution steamed ahead, speculative fiction of authors such as H G Wells began to consider where scientific innovation would take mankind. This trend reached its height in the Cold War during the 1950s and 1960s. Radiation-mutated monsters and invasions from space seen through the paranoid lens of communism all postulated the imminent demise of mankind.

By the 1970s, in films such as The Parallax View and Three Days of the Condor, the enemy evolved into government institutions and powerful corporations. This reflected public disenchantment following years of increasing social conflict, Vietnam and the Watergate scandal.

In the 1980s and 1990s it was the threat of AIDS that was embodied in the monsters of the era, such as “bunny boiling” stalker Alex in Fatal Attraction. Alex’s obsessive pursuit of the man with whom she shared a one night stand, Susanne Leonard argues, represented “the new cultural alignment between risk and sexual contact”, a theme continued with Anne Rice’s vampire Lestat in her series The Vampire Chronicles.

Risk and anxiety

Zombies, the flesh eating undead, have been mentioned in stories for more than 4,000 years. But the genre really developed with the work of H G Wells, Poe and particularly H P Lovecraft in the early 20th century. Yet these ponderous adversaries, descendants of Mary Shelley’s Frankenstein, have little in common with the vast hordes that threaten mankind’s existence in the modern versions.

M Keith Booker argued that in the 1950s, “the golden age of nuclear fear”, radiation and its fictional consequences were the flip side to a growing faith that science would solve the world’s problems. In many respects we are now living with the collapse of this faith. Today we live in societies dominated by an overarching anxiety reflecting the risk associated with each unpredictable scientific development.

Now we know that we are part of the problem, not necessarily the solution. The “breakthroughs” that were welcomed in the last century now represent some of our most pressing concerns. People have lost faith in assumptions of social and scientific “progress”.

Globalisation

Central to this is globalisation. While generating enormous benefits, globalisation is also tearing communities apart. The political landscape is rapidly changing as established political institutions seem unable to meet the challenges presented by the social and economic dislocation.

However, although destructive, globalisation is also forging new links between people, through what Anthony Giddens calls the “emptying of time and space”. Modern digital media has built new transnational alliances, and, particularly in the West, confronted people with stark moral questions about the consequences of their own lifestyles.

As the faith in inexorable scientific “progress” recedes, politics is transformed. The groups emerging from outside the political mainstream engage in much older battles of faith and identity. Whether right-wing nationalists or Islamic fundamentalists, they seek to build “imagined communities” through race, religion or culture and “fear” is their currency.

Evolving zombies

Modern zombies are the product of this globalised, risk conscious world. No longer the work of a single “mad” scientist re-animating the dead, they now appear as the result of secret government programmes creating untreatable viruses. The zombies indiscriminately overwhelm states irrespective of wealth, technology and military strength, turning all order to chaos.

Meanwhile, the zombies themselves are evolving into much more tenacious adversaries. In Danny Boyle’s 28 Days Later it takes only 20 days for society to be devastated. Charlie Higson’s Enemy series of novels have the zombies getting leadership and using tools. In the film of Max Brooks’ novel, World War Z, the seemingly superhuman athleticism of the zombies reflects the devastating springboard that vast urban populations would provide for such a disease. The film, starring Brad Pitt, had a reported budget of US$190m, demonstrating what a big business zombies have become.

Read the entire article here.

Image courtesy of Google Search.

Where Will I Get My News (and Satire)?


Jon Stewart. Jon Stewart, you dastardly, villainous so-and-so. How could you? How could you decide to leave the most important show in media history — The Daily Show — after a mere 16 years? Where will I get my news? Where will I find another hypocrisy-meter? Where will I find another truth-seeking David to fend us from the fear-mongering neocon Goliaths led by Roger Ailes over at the Foxion News Channel? Where will I find such a thoroughly delicious merging of news, fact and satire? Jon Stewart, how could you?!

From the Guardian:

“Where will I get my news each night,” lamented Bill Clinton this week. This might have been a reaction to the fall from grace of Brian Williams, America’s top-rated news anchor, who was suspended for embellishing details of his adventures in Iraq. In fact the former US president was anticipating withdrawal symptoms for the impending departure of the comedian Jon Stewart, who – on the same day as Williams’s disgrace – announced that he will step down as the Daily Show host.

Stewart, who began his stint 16 years ago, has achieved something extraordinary from behind a studio desk on a comedy cable channel. Merging the intense desire for factual information with humour, irreverence, scepticism and usually appropriate cynicism, Stewart’s show proved a magnet for opinion formers, top politicians – who clamoured to appear – and most significantly the young, for whom the mix proved irresistible. His ridiculing of neocons became a nightly staple. His rejection from the outset of the Iraq war was prescient. And always he was funny, not least this week in using Williams’s fall to castigate the media for failing to properly scrutinise the Iraq war. Bill Clinton does not mourn alone.

Read the entire story here.

Image courtesy of Google Search.

The Thugs of Cultural Disruption

What becomes of our human culture as Amazon crushes booksellers and publishers, Twitter dumbs down journalism, knowledge is replaced by keyword search, and the internet becomes a popularity contest?

Leon Wieseltier, contributing editor at The Atlantic, has some thoughts.

From NYT:

Amid the bacchanal of disruption, let us pause to honor the disrupted. The streets of American cities are haunted by the ghosts of bookstores and record stores, which have been destroyed by the greatest thugs in the history of the culture industry. Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind. Everybody talks frantically about media, a second-order subject if ever there was one, as content disappears into “content.” What does the understanding of media contribute to the understanding of life? Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability. As the frequency of expression grows, the force of expression diminishes: Digital expectations of alacrity and terseness confer the highest prestige upon the twittering cacophony of one-liners and promotional announcements. It was always the case that all things must pass, but this is ridiculous.

Meanwhile the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms: Economists are our experts on happiness! Where wisdom once was, quantification will now be. Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology. The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past. Beyond its impact upon culture, the new technology penetrates even deeper levels of identity and experience, to cognition and to consciousness. Such transformations embolden certain high priests in the church of tech to espouse the doctrine of “transhumanism” and to suggest, without any recollection of the bankruptcy of utopia, without any consideration of the cost to human dignity, that our computational ability will carry us magnificently beyond our humanity and “allow us to transcend these limitations of our biological bodies and brains. . . . There will be no distinction, post-Singularity, between human and machine.” (The author of that updated mechanistic nonsense is a director of engineering at Google.)

And even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science. The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university, where the humanities are disparaged as soft and impractical and insufficiently new. The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy. So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.

Read the entire essay here.

The Rise of McLiterature

A sad symptom of our expanding media binge culture and the fragmentation of our shortening attention spans is the demise of literary fiction. Author Will Self believes the novel, and narrative prose in general, is on a slow, but accelerating, death-spiral. His eloquent views, presented in a May 6, 2014 lecture, are excerpted below.

From the Guardian:

If you happen to be a writer, one of the great benisons of having children is that your personal culture-mine is equipped with its own canaries. As you tunnel on relentlessly into the future, these little harbingers either choke on the noxious gases released by the extraction of decadence, or they thrive in the clean air of what we might call progress. A few months ago, one of my canaries, who’s in his mid-teens and harbours a laudable ambition to be the world’s greatest ever rock musician, was messing about on his electric guitar. Breaking off from a particularly jagged and angry riff, he launched into an equally jagged diatribe, the gist of which was already familiar to me: everything in popular music had been done before, and usually those who’d done it first had done it best. Besides, the instant availability of almost everything that had ever been done stifled his creativity, and made him feel it was all hopeless.

A miner, if he has any sense, treats his canary well, so I began gently remonstrating with him. Yes, I said, it’s true that the web and the internet have created a permanent Now, eliminating our sense of musical eras; it’s also the case that the queered demographics of our longer-living, lower-birthing population means that the middle-aged squat on top of the pyramid of endeavour, crushing the young with our nostalgic tastes. What’s more, the decimation of the revenue streams once generated by analogues of recorded music have put paid to many a musician’s income. But my canary had to appreciate this: if you took the long view, the advent of the 78rpm shellac disc had also been a disaster for musicians who in the teens and 20s of the last century made their daily bread by live performance. I repeated one of my favourite anecdotes: when the first wax cylinder recording of Feodor Chaliapin singing “The Song of the Volga Boatmen” was played, its listeners, despite a lowness of fidelity that would seem laughable to us (imagine a man holding forth from a giant bowl of snapping, crackling and popping Rice Krispies), were nonetheless convinced the portly Russian must be in the room, and searched behind drapes and underneath chaise longues for him.

So recorded sound blew away the nimbus of authenticity surrounding live performers – but it did worse things. My canaries have often heard me tell how back in the 1970s heyday of the pop charts, all you needed was a writing credit on some loathsome chirpy-chirpy-cheep-cheeping ditty in order to spend the rest of your born days lying by a guitar-shaped pool in the Hollywood Hills hoovering up cocaine. Surely if there’s one thing we have to be grateful for it’s that the web has put paid to such an egregious financial multiplier being applied to raw talentlessness. Put paid to it, and also returned musicians to the domain of live performance and, arguably, reinvigorated musicianship in the process. Anyway, I was saying all of this to my canary when I was suddenly overtaken by a great wave of noxiousness only I could smell. I faltered, I fell silent, then I said: sod you and your creative anxieties, what about me? How do you think it feels to have dedicated your entire adult life to an art form only to see the bloody thing dying before your eyes?

My canary is a perceptive songbird – he immediately ceased his own cheeping, except to chirrup: I see what you mean. The literary novel as an art work and a narrative art form central to our culture is indeed dying before our eyes. Let me refine my terms: I do not mean narrative prose fiction tout court is dying – the kidult boywizardsroman and the soft sadomasochistic porn fantasy are clearly in rude good health. And nor do I mean that serious novels will either cease to be written or read. But what is already no longer the case is the situation that obtained when I was a young man. In the early 1980s, and I would argue throughout the second half of the last century, the literary novel was perceived to be the prince of art forms, the cultural capstone and the apogee of creative endeavour. The capability words have when arranged sequentially to both mimic the free flow of human thought and investigate the physical expressions and interactions of thinking subjects; the way they may be shaped into a believable simulacrum of either the commonsensical world, or any number of invented ones; and the capability of the extended prose form itself, which, unlike any other art form, is able to enact self-analysis, to describe other aesthetic modes and even mimic them. All this led to a general acknowledgment: the novel was the true Wagnerian Gesamtkunstwerk.

This is not to say that everyone walked the streets with their head buried in Ulysses or To the Lighthouse, or that popular culture in all its forms didn’t hold sway over the psyches and imaginations of the great majority. Nor do I mean to suggest that in our culture perennial John Bull-headed philistinism wasn’t alive and snorting: “I don’t know much about art, but I know what I like.” However, what didn’t obtain is the current dispensation, wherein those who reject the high arts feel not merely entitled to their opinion, but wholly justified in it. It goes further: the hallmark of our contemporary culture is an active resistance to difficulty in all its aesthetic manifestations, accompanied by a sense of grievance that conflates it with political elitism. Indeed, it’s arguable that tilting at this papery windmill of artistic superiority actively prevents a great many people from confronting the very real economic inequality and political disenfranchisement they’re subject to, exactly as being compelled to chant the mantra “choice” drowns out the harsh background Muzak telling them they have none.

Just because you’re paranoid it doesn’t mean they aren’t out to get you. Simply because you’ve remarked a number of times on the concealed fox gnawing its way into your vitals, it doesn’t mean it hasn’t at this moment swallowed your gall bladder. Ours is an age in which omnipresent threats of imminent extinction are also part of the background noise – nuclear annihilation, terrorism, climate change. So we can be blinkered when it comes to tectonic cultural shifts. The omnipresent and deadly threat to the novel has been imminent now for a long time – getting on, I would say, for a century – and so it’s become part of culture. During that century, more books of all kinds have been printed and read by far than in the entire preceding half millennium since the invention of movable-type printing. If this was death it had a weird, pullulating way of expressing itself. The saying is that there are no second acts in American lives; the novel, I think, has led a very American sort of life: swaggering, confident, brash even – and ever aware of its world-conquering manifest destiny. But unlike Ernest Hemingway or F Scott Fitzgerald, the novel has also had a second life. The form should have been laid to rest at about the time of Finnegans Wake, but in fact it has continued to stalk the corridors of our minds for a further three-quarters of a century. Many fine novels have been written during this period, but I would contend that these were, taking the long view, zombie novels, instances of an undead art form that yet wouldn’t lie down.

Literary critics – themselves a dying breed, a cause for considerable schadenfreude on the part of novelists – make all sorts of mistakes, but some of the most egregious ones result from an inability to think outside of the papery prison within which they conduct their lives’ work. They consider the codex. They are – in Marshall McLuhan’s memorable phrase – the possessors of Gutenberg minds.

There is now an almost ceaseless murmuring about the future of narrative prose. Most of it is at once Panglossian and melioristic: yes, experts assert, there’s no disputing the impact of digitised text on the whole culture of the codex; fewer paper books are being sold, newspapers fold, bookshops continue to close, libraries as well. But … but, well, there’s still no substitute for the experience of close reading as we’ve come to understand and appreciate it – the capacity to imagine entire worlds from parsing a few lines of text; the ability to achieve deep and meditative levels of absorption in others’ psyches. This circling of the wagons comes with a number of public-spirited campaigns: children are given free books; book bags are distributed with slogans on them urging readers to put books in them; books are hymned for their physical attributes – their heft, their appearance, their smell – as if they were the bodily correlates of all those Gutenberg minds, which, of course, they are.

The seeming realists among the Gutenbergers say such things as: well, clearly, books are going to become a minority technology, but the beau livre will survive. The populist Gutenbergers prate on about how digital texts linked to social media will allow readers to take part in a public conversation. What none of the Gutenbergers are able to countenance, because it is quite literally – for once the intensifier is justified – out of their minds, is that the advent of digital media is not simply destructive of the codex, but of the Gutenberg mind itself. There is one question alone that you must ask yourself in order to establish whether the serious novel will still retain cultural primacy and centrality in another 20 years. This is the question: if you accept that by then the vast majority of text will be read in digital form on devices linked to the web, do you also believe that those readers will voluntarily choose to disable that connectivity? If your answer to this is no, then the death of the novel is sealed out of your own mouth.

Read the entire excerpt here.

Image: Will Self, 2007. Courtesy of Wikipedia / Creative Commons.

Expanding Binge Culture

The framers of the U.S. Declaration of Independence could not have known. They could not have foreseen how commoditization, consumerism, globalisation and an always-on media culture would come to transform our lives. They did well to insert “Life, Liberty and the pursuit of Happiness”.

But they failed to consider our collective evolution — if one can call it that — towards a sophisticated culture of binge. Significant numbers of us have long binged on physical goods, money, natural resources, food and drink. Media, however, lagged somewhat. No longer: we now have at our instantaneous whim entire libraries of all-you-can-eat infotainment. Time will tell whether this signals the demise of quality as it is overwhelmed by sheer quantity. One area shows where we may be heading: witness the “fastfoodification” of our news.

From the New York Times:

When Beyoncé released, without warning, 17 videos around midnight on Dec. 13, millions of fans rejoiced. As a more casual listener of Ms. Knowles, I balked at the onslaught of new material and watched a few videos before throwing in the towel.

Likewise, when Netflix, in one fell swoop, made complete seasons of “House of Cards” and “Orange Is the New Black” available for streaming, I quailed at the challenge, though countless others happily immersed themselves in their worlds of Washington intrigue and incarcerated women.

Then there is the news, to which floodgates are now fully open thanks to the Internet and cable TV: Flight 370, Putin, Chris Christie, Edward Snowden, Rob Ford, Obamacare, “Duck Dynasty,” “bossy,” #CancelColbert, conscious uncoupling. When presented with 24/7 coverage of these ongoing narratives from an assortment of channels — traditional journalism sites, my Facebook feed, the log-out screen of my email — I followed some closely and very consciously uncoupled from others.

Had these content providers released their offerings in the old-media landscape, à la carte rather than in an all-you-can-eat buffet, the prospect of a seven-course meal might not have seemed so daunting. I could handle a steady drip of one article a day about Mr. Ford in a newspaper. But after two dozen, updated every 10 minutes, plus scores of tweets, videos and GIFs that keep on giving, I wanted to forget altogether about Toronto’s embattled mayor.

While media technology is now catching up to Americans’ penchant for overdoing it and finding plenty of willing indulgers, there are also those like me who recoil from the abundance of binge culture.

In the last decade, media entertainment has given far more freedom to consumers: watch, listen to and read anything at anytime. But Barry Schwartz’s 2004 book, “The Paradox of Choice,” argues that our surfeit of consumer choices engenders anxiety, not satisfaction, and sometimes even a kind of paralysis.

His thesis (which has its dissenters) applies mostly to the profusion of options within a single set: for instance, the challenge of picking out salad dressing from 175 varieties in a supermarket. Nevertheless, it is also germane to the concept of bingeing, when 62 episodes of “Breaking Bad” wait overwhelmingly in a row like bottles of Newman’s Own on a shelf.

Alex Quinlan, 31, a first-year Ph.D. student in poetry at Florida State University, said he used to spend at least an hour every morning reading the news and “putting off my responsibilities,” as well as binge-watching shows. He is busier now, and last fall had trouble installing an Internet connection in his home, which effectively “rewired my media-consumption habits,” he said. “I’m a lot more disciplined. Last night I watched one episode of ‘House of Cards’ and went to bed. A year ago, I probably would’ve watched one, gotten another beer, then watched two more.”

Even shorter-term bingeing can seem like a major commitment, because there is a distorting effect of receiving a large chunk of content at once rather than getting it piecemeal. To watch one Beyoncé video a week would eat as much time as watching all in one day, but their unified dissemination makes them seem intimidatingly movie-length (which they are, approximately) rather than like a series of four-minute clips.

I also experienced some first-world anxiety last year with the release of the fourth season of “Arrested Development.” I had devoured the show’s first three seasons, parceled out in 22-minute weekly installments on Fox as well as on DVD, where I would watch episodes I had already seen (in pre-streaming days, binge-watching required renting or owning a copy, which was more like a contained feast). But when Netflix uploaded 15 new episodes totaling 8.5 hours on May 26, I was not among those queuing up for it. It took me some time to get around to the show, and once I had started, the knowledge of how many episodes stretched in front of me, at my disposal whenever I wanted, proved off-putting.

This despite the keeping-up-with-the-Joneses quality to binge-viewing. If everyone is quickly exhausting every new episode of a show, and writing and talking about it the next day, it’s easy to feel left out of the conversation if you haven’t kept pace. And sometimes when you’re late to the party, you decide to stay home instead.

Because we frequently gorge when left to our own Wi-Fi-enabled devices, the antiquated methods of “scheduling our information consumption” may have been healthier, if less convenient, said Clay Johnson, 36, the author of “The Information Diet.” He recalled rushing home after choir practice when he was younger to catch “Northern Exposure” on TV.

“That idea is now preposterous,” he said. “We don’t have appointment television anymore. Just because we can watch something all the time doesn’t mean we should. Maybe we should schedule it in a way that makes sense around our daily lives.”

“It’s a lot like food,” he added. “You see some people become info-anorexic, who say the answer is to unplug and not consume anything. Much like an eating disorder, it’s just as unhealthy a decision as binge-watching the news and media. There’s a middle ground of people who are saying, ‘I need to start treating this form of input in my life like a conscious decision and to be informed in the right way.’ ”

Read the entire story here.

Content Versus Innovation

The entertainment and media industry is not known for its innovation. Left to its own devices, we would all still be consuming news from broadsheets and a town crier, and watching shows at the theater. Not too long ago the industry, led by Hollywood heavyweights, was doing its utmost to kill emerging forms of media consumption, such as the video tape cassette and the VCR.

Following numerous regulatory, legal and political skirmishes, innovation finally triumphed over entrenched interests, allowing VHS tape, and later the DVD, to flourish, albeit for a while. This of course paved the way for new forms of distribution — the rise of Blockbuster and a myriad of neighborhood video rental stores.

In a great ironic twist, the likes of Blockbuster then failed to recognize the market signals that, without significant and continual innovation, their own business models would crumble. Now Netflix and other streaming services have put an end to our weekend visits to the movie rental store.

A fascinating article, excerpted below, looks back at the lengthy, and continuing, fight between the conservative media empires and the constant pull of technological innovation. (A quick sketch of the inflation arithmetic behind the article’s price comparisons follows the excerpt.)

[For a fresh perspective on the future of media distribution, see our recent posting here.]

From TechCrunch:

The once iconic video rental giant Blockbuster is shutting down its remaining stores across the country. Netflix, meanwhile, is emerging as the leader in video rental, now primarily through online streaming. But Blockbuster, Netflix and home media consumption (VCR/DVD/Blu-ray) may never have existed at all in their current form if the content industry had been successful in banning or regulating them. In 1983, nearly 30 years before thousands of websites blacked out in protest of SOPA/PIPA, video stores across the country closed in protest against legislation that would bar their market model.

A Look Back

In 1977, the first video-rental store opened. It was 600 square feet and located on Wilshire Boulevard in Los Angeles. George Atkinson, the entrepreneur who decided to launch this idea, charged $50 for an “annual membership” and $100 for a “lifetime membership” but the memberships only allowed people to rent videos for $10 a day. Despite an unusual business model, Atkinson’s store was an enormous success, growing to 42 affiliated stores in fewer than 20 months and resulting in numerous competitors.

In retrospect, Atkinson’s success represented the emergence of an entirely new market: home consumption of paid content. It would become an $18 billion domestic market, and, rather than cannibalize from the existing movie theater market, it would eclipse it and thereby become a massive revenue source for the industry.

Atkinson’s success in 1977 is particularly remarkable as the Sony Betamax (the first VCR) had only gone on sale domestically in 1975 at a cost of $1,400 (which in 2013 U.S. dollars is $6,093). As a comparison, the first DVD player in 1997 cost $1,458 in 2013 dollars and the first Blu-ray player in 2006 cost $1,161 in 2013 dollars. And unlike the DVD and Blu-ray player, it would take eight years, until 1983, for the VCR to reach 10 percent of U.S. television households. Atkinson’s success, and that of his early competitors, was in catering to a market of well under 10 percent of U.S. households.

While many content companies realized this as a massive new revenue stream — e.g. 20th Century Fox buying one video rental company for $7.5 million in 1979 — the content industry lawyers and lobbyists tried to stop the home content market through litigation and regulation.

The content industry sued to ban the sale of the Betamax, the first VCR. This legal strategy was coupled by leveraging the overwhelming firepower of the content industry in Washington. If they lost in court to ban the technology and rental business model, then they would ban the technology and rental business model in Congress.

Litigation Attack

In 1976, the content industry filed suit against Sony, seeking an injunction to prevent the company from “manufacturing, distributing, selling or offering for sale Betamax or Betamax tapes.” Essentially granting this remedy would have banned the VCR for all Americans. The content industry’s motivation behind this suit was largely to deal with individuals recording live television, but the emergence of the rental industry was likely a contributing factor.

While Sony won at the district court level in 1979, in 1981 it lost at the Court of Appeals for the Ninth Circuit where the court found that Sony was liable for copyright infringement by their users — recording broadcast television. The Appellate court ordered the lower court to impose an appropriate remedy, advising in favor of an injunction to block the sale of the Betamax.

And in 1981, under normal circumstances, the VCR would have been banned then and there. Sony faced liability well beyond its net worth, so it may well have been the end of Sony, or at least its U.S. subsidiary, and the end of the VCR. Millions of private citizens could have been liable for damages for copyright infringement for recording television shows for personal use. But Sony appealed this ruling to the Supreme Court.

The Supreme Court is able to take very few cases. For example in 2009, 1.1 percent of petitions for certiorari were granted, and of these approximately 70 percent are cases where there is a conflict among different courts (here there was no conflict). But in 1982, the Supreme Court granted certiorari and agreed to hear the case.

After an oral hearing, the justices took a vote internally, and originally only one of them was persuaded to keep the VCR as legal (but after discussion, the number of justices in favor of the VCR would eventually increase to four).

With five votes in favor of affirming the previous ruling the Betamax (VCR) was to be illegal in the United States (see Justice Blackmun’s papers).

But then, something even more unusual happened – which is why we have the VCR and subsequent technologies: The Supreme Court decided for both sides to re-argue a portion of the case. Under the Burger Court (when he was Chief Justice), this only happened in 2.6 percent of the cases that received oral argument. In the re-argument of the case, a crucial vote switched sides, which resulted in a 5-4 decision in favor of Sony. The VCR was legal. There would be no injunction barring its sale.

The majority opinion characterized the lawsuit as an “unprecedented attempt to impose copyright liability upon the distributors of copying equipment” and rejected “[s]uch an expansion of the copyright privilege” as “beyond the limits” given by Congress. The Court even cited Mr. Rogers, who testified during the trial:

I have always felt that with the advent of all of this new technology that allows people to tape the ‘Neighborhood’ off-the-air . . . Very frankly, I am opposed to people being programmed by others.

On the absolute narrowest of legal grounds, through a highly unusual legal process (and significant luck), the VCR was saved by one vote at the Supreme Court in 1984.

Regulation Attack

In 1982 legislation was introduced in Congress to give copyright holders the exclusive right to authorize the rental of prerecorded videos. Legislation was reintroduced in 1983, the Consumer Video Sales Rental Act of 1983. This legislation would have allowed the content industry to shut down the rental market, or charge exorbitant fees, by making it a crime to rent out movies purchased commercially. In effect, this legislation would have ended the existing market model of rental stores. With 34 co-sponsors, major lobbyists and significant campaign contributions to support it, this legislation had substantial support at the time.

Video stores saw the Consumer Video Sales Rental Act as an existential threat, and on October 21, 1983, about 30 years before the SOPA/PIPA protests, video stores across the country closed down for several hours in protest. While the 1983 legislation died in committee, the legislation would be reintroduced in 1984. In 1984, similar legislation was enacted, The Record Rental Amendment of 1984, which banned the renting and leasing of music. In 1990, Congress banned the renting of computer software.

But in the face of public backlash from video retailers and customers, Congress did not pass the Consumer Video Sales Rental Act.

At the same time, the movie studios tried to ban the Betamax VCR through legislation. Eventually the content industry decided to support legislation that would require compulsory licensing rather than an outright ban. But such a compulsory licensing scheme would have drastically driven up the costs of video tape players and may have effectively banned the technology (similar regulations did ban other technologies).

For the content industry, banning the technology was a feature, not a bug.

Read the entire article here.
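
A quick footnote on the price comparisons in the excerpt above: converting an old sticker price into 2013 dollars is simply a matter of scaling by the ratio of consumer price indexes. Below is a minimal Python sketch of that arithmetic; the CPI values are approximate annual averages assumed here for illustration, not figures taken from the TechCrunch piece.

    # Scale a nominal price by the ratio of consumer price indexes (CPI).
    # These CPI values are rough annual averages, included only for illustration.
    CPI = {1975: 53.8, 2013: 233.0}

    def to_2013_dollars(nominal_price, year):
        """Convert a nominal price from `year` into approximate 2013 dollars."""
        return nominal_price * CPI[2013] / CPI[year]

    # The article's Betamax example: $1,400 in 1975.
    print(f"${to_2013_dollars(1400, 1975):,.0f}")  # prints $6,063 -- close to the article's $6,093

Any gap between this rough estimate and the article’s figure simply reflects which inflation index and reference months one assumes.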

Image: Video Home System (VHS) cassette tape. Courtesy of Wikipedia.

Dangerous Foreign Films

The next time you cringe because your date or significant other wants to go see a foreign movie with you, count your blessings. After all, you don’t live in North Korea.

So, take a deep breath and go see La Dolce Vita, The Discreet Charm of the Bourgeoisie and Rashomon.

From the Telegraph:

South Korea’s JoongAng Ilbo newspaper reported that the co-ordinated public executions took place in seven separate cities earlier this month.

In one case, the local authorities rounded up 10,000 people, including children, and forced them to watch, it reported.

Those put to death were found guilty by the state of minor misdemeanors, including watching videos of South Korean television programmes or possessing a Bible.

Sources told the paper that witnesses saw eight people tied to stakes in the Shinpoong Stadium, in Kangwon Province, before having sacks placed over their heads and being executed by soldiers firing machineguns.

“I heard from the residents that they watched in terror as the corpses were so riddled by machinegun fire that they were hard to identify afterwards,” the source said.

Relatives and friends of the victims were reportedly sent to prison camps, a tactic that North Korea frequently uses to dissuade anyone from breaking the law.

“Reports on public executions across the country would be certain to have a chilling effect on the rest of the people,” Daniel Pinkston, a North Korea analyst with The International Crisis Group in Seoul, said. “All these people want to do is to survive and for their families to survive. The incentives for not breaking the law are very clear now.”

The mass executions could signal a broader crackdown on any hints of discontent among the population – and even rival groups in Pyongyang – against the rule of Kim Jong-un, who came to power after the death of his father in December 2011.

In a new report, the Rand Corporation think tank claims that Kim survived an assassination attempt in 2012 and that his personal security has since been stepped up dramatically. The report concurs with South Korean intelligence sources that stated in March that a faction within the North Korean army had been involved in an attempt on Kim’s life in November of last year.

Read the entire article here.

Image: Kim Jong-un. Supreme leader of North Korea. Courtesy of Time.

Ethical Meat and Idiotic Media

Lab-grown meat is now possible, but it is not yet available at the industrial scale needed to satisfy the human desire for burgers, steak and ribs. While this does represent a breakthrough, it’s likely to be a while before the last cow, chicken or pig is slaughtered. Of course, the mainstream media picked up this important event and immediately labeled it with captivating headlines featuring the word “frankenburger”. Perhaps a well-intentioned lab will someday come up with an intelligent form of media organization.

From the New York Times (dot earth):

I first explored livestock-free approaches to keeping meat on menus in 2008 in a piece titled “Can People Have Meat and a Planet, Too?”

It’s been increasingly clear since then that there are both environmental and — obviously — ethical advantages to using technology to sustain omnivory on a crowding planet. This presumes humans will not all soon shift to a purely vegetarian lifestyle, even though there are signs of what you might call “peak meat” (consumption, that is) in prosperous societies (Mark Bittman wrote a nice piece on this). Given dietary trends as various cultures rise out of poverty, I would say it’s a safe bet meat will remain a favored food for decades to come.

Now non-farmed meat is back in the headlines, with a patty of in-vitro beef – widely dubbed a “frankenburger” — fried and served in London earlier today.

The beef was grown in a lab by a pioneer in this arena — Mark Post of Maastricht University in the Netherlands. My colleague Henry Fountain has reported the details in a fascinating news article. Here’s an excerpt followed by my thoughts on next steps in what I see as an important area of research and development:

According to the three people who ate it, the burger was dry and a bit lacking in flavor. One taster, Josh Schonwald, a Chicago-based author of a book on the future of food, said “the bite feels like a conventional hamburger” but that the meat tasted “like an animal-protein cake.”

But taste and texture were largely beside the point: The event, arranged by a public relations firm and broadcast live on the Web, was meant to make a case that so-called in-vitro, or cultured, meat deserves additional financing and research…

Dr. Post, one of a handful of scientists working in the field, said there was still much research to be done and that it would probably take 10 years or more before cultured meat was commercially viable. Reducing costs is one major issue — he estimated that if production could be scaled up, cultured beef made as this one burger was made would cost more than $30 a pound.

The two-year project to make the one burger, plus extra tissue for testing, cost $325,000. On Monday it was revealed that Sergey Brin, one of the founders of Google, paid for the project. Dr. Post said Mr. Brin got involved because “he basically shares the same concerns about the sustainability of meat production and animal welfare.”

The enormous potential environmental benefits of shifting meat production, where feasible, from farms to factories were estimated in “Environmental Impacts of Cultured Meat Production,” a 2011 study in Environmental Science and Technology.

Read the entire article here.

Image: Professor Mark Post holds the world’s first lab-grown hamburger. Courtesy of Reuters/David Parry / The Atlantic.

Portrait of a Royal Baby

Royal-watchers from all corners of the globe, especially the British one, have been agog over the arrival of the latest royal earlier this week. The overblown media circus got us thinking about baby pictures. Will the Prince of Cambridge be the first heir to the throne to have his portrait enshrined via Instagram? Or, as is more likely, will his royal essence be captured in oil on canvas, as with the 35 or more generations that preceded him?

From Jonathan Jones over at the Guardian:

Royal children have been portrayed by some of the greatest artists down the ages, preserving images of childhood that are still touching today. Will this royal baby fare better than its mother in the portraits that are sure to come? Are there any artists out there who can go head to head with the greats of royal child portraiture?

Agnolo Bronzino has to be first among those greats, because he painted small children in a way that set the tone for many royal images to come. Some might say the Medici rulers of Florence, for whom he worked, were not properly royal – but they definitely acted like a royal family, and the artists who worked for them set the tone of court art all over Europe. In Giovanni de’ Medici As a Child, Bronzino expresses the joy of children and the pleasure of parents in a way that was revolutionary in the 16th century. Chubby-cheeked and jolly, Giovanni clutches a pet goldfinch. In paintings of the Holy Family you know that if Jesus has a pet bird it probably has some dire symbolic meaning. But this pet is just a pet. Giovanni is just a happy kid. Actually, a happy baby: he was about 18 months old.

Hans Holbein took more care to clarify the regal uniqueness of his subject when he portrayed Edward, only son of King Henry VIII of England, in about 1538. Holbein, too, captures the face of early childhood brilliantly. But how old is Edward meant to be? In fact, he was two. Holbein expresses his infancy – his baby face, his baby hands – while having him stand holding out a majestic hand, dressed like his father, next to an inscription that praises the paternal glory of Henry. Who knows, perhaps he really stood like that for a second or two, long enough for Holbein to take a mental photograph.

Diego Velázquez recorded a more nuanced, even anxious, view of royal childhood in his paintings of the royal princesses of 17th-century Spain. In the greatest of them, Las Meninas, the five-year-old Infanta Margarita Teresa stands looking at us, accompanied by her ladies in waiting (meninas) and two dwarves, while Velázquez works on a portrait of her parents, the king and queen. The infanta is beautiful and confident, attended by her own micro-court – but as she looks out of the painting at her parents (who are standing where the spectator of the painting stands) she is performing. And she is under pressure to look and act like a little princess.

The 19th-century painter Stephen Poyntz Denning may not be in the league of these masters. In fact, let’s be blunt: he definitely isn’t. But his painting Queen Victoria, Aged 4 is a fascinating curiosity. Like the Infanta, this royal princess is not allowed to be childlike. She is dressed in an oppressively formal way, in dark clothes that anticipate her mature image – a childhood lost to royal destiny.

Read the entire article here.

Image: Princess Victoria aged Four, Denning, Stephen Poyntz (c. 1787 – 1864). Courtesy of Wikimedia.

The Death of Photojournalism

Really, it was only a matter of time. First, digital cameras killed off their film-dependent predecessors and sounded the death knell for Kodak. Now social media and the #hashtag are doing the same to the professional photographer.

Camera-enabled smartphones are ubiquitous, making everyone a photographer. And with almost everyone jacked into at least one social network or photo-sharing site, it takes only one point and a couple of clicks to get a fresh image posted to the internet. Ironically, the newsprint media, despite being in the business of news, failed to recognize this news until recently.

So now, with an eye to cutting costs and to making images more immediate and compelling — via citizens — news organizations are re-tooling their staffs in four ways: first, fire the photographers; second, re-train reporters to take photographs with their smartphones; third, video, video, video; fourth, rely on the ever-willing public to snap images, post, tweet, #hashtag and like — for free, of course.

From Cult of Mac:

The Chicago Sun-Times, one of the remnants of traditional paper journalism, has let go its entire photography staff of 28 people. Now its reporters will start receiving “iPhone photography basics” training to start producing their own photos and videos.

The move is part of a growing trend towards publications using the iPhone as a replacement for fancy, expensive DSLRs. It’s also a sign of how traditional journalism is being changed by technology like the iPhone and the advent of digital publishing.

When Hurricane Sandy hit New York City, reporters for Time used the iPhone to take photos on the field and upload to the publication’s Instagram account. Even the cover photo used on the corresponding issue of Time was taken on an iPhone.

Sun-Times photographer Alex Garcia argues that the “idea that freelancers and reporters could replace a photo staff with iPhones is idiotic at worst, and hopelessly uninformed at best.” Garcia believes that reporters cannot both write articles and produce quality media, but he’s fighting an uphill battle.

Big newspaper companies aren’t making anywhere near the amount of money they used to due to the popularity of online publications and blogs. Free news is a click away nowadays. Getting rid of professional photographers and equipping reporters with iPhones is another way to cut costs.

The iPhone has a better camera than most digital point-and-shoots, and more importantly, it is in everyone’s pocket. It’s a great camera that’s always with you, and that makes it an invaluable tool for any journalist. There will always be a need for videographers and pro photographers that can make studio-level work, but the iPhone is proving to be an invaluable tool for reporters in the modern world.

Read the entire article here.

Image: Kodak 1949-56 Retina IIa 35mm Camera. Courtesy of Wikipedia / Kodak.

Media Consolidation

The age of the rambunctious and megalomaniacal newspaper baron has passed, excepting, of course, Rupert Murdoch. And while the colorful personalities of the late 19th and early 20th centuries have mostly disappeared, the 21st century has replaced those aging white men with faceless international corporations, all of which are, of course, run by aging white men.

The infographic below puts the current media landscape in perspective; one trend stands out: more and more people are consuming news and entertainment from fewer and fewer sources.

Infographic courtesy of Frugal Dad.