Category Archives: Idea Soup

The Rich and Powerful Live by Different Rules

Never has there been such a wonderful example of blatant hypocrisy. This time from the United States Department of Justice. It would be refreshing to convey to our leaders that not only do “Black Lives Matter”, but “Less Privileged Lives Matter” as well.

Former director of the CIA no less, and ex-four star general David Petraeus copped a mere two years of probation and a $100,000 fine for leaking classified information to his biographer. Chelsea Manning, formerly Bradley Manning, intelligence analyst and ex-army private, was sentenced to 35 years in prison in 2013 for disclosing classified documents to WikiLeaks.

And, there are many other similar examples.

We wince when hearing of oligarchic corruption and favoritism in other nations, such as Russia and China. But, in this country it goes by the euphemism of “justice”, so it must be OK.

From arstechnica:

Yesterday [April 23, 2015], former CIA Director David Petraeus was handed two years of probation and a $100,000 fine after agreeing to a plea deal that ends in no jail time for leaking classified information to Paula Broadwell, his biographer and lover.

“I now look forward to moving on with the next phase of my life and continuing to serve our great nation as a private citizen,” Petraeus said outside the federal courthouse in Charlotte, North Carolina on Thursday.

Lower-level government leakers have not, however, been as likely to walk out of a courthouse applauding the US as Petraeus did. Trevor Timm, executive director of the Freedom of the Press Foundation, called the Petraeus plea deal a “gross hypocrisy.”

“At the same time as Petraeus got off virtually scot-free, the Justice Department has been bringing the hammer down upon other leakers who talk to journalists—sometimes for disclosing information much less sensitive than Petraeus did,” he said.

The Petraeus sentencing came days after the Justice Department demanded (PDF) up to a 24-year term for Jeffrey Sterling, a former CIA agent who leaked information to a Pulitzer Prize-winning writer about a botched mission to sell nuclear plans to Iran in order to hinder its nuclear-weapons progress.

“A substantial sentence in this case would send an appropriate and much needed message to all persons entrusted with the handling of classified information, i.e., that intentional breaches of the laws governing the safeguarding of national defense information will be pursued aggressively, and those who violate the law in this manner will be tried, convicted, and punished accordingly,” the Justice Department argued in Sterling’s case this week.

The Daily Beast sums up the argument that the Petraeus deal involves a double standard by noting other recent penalties for lower-level leakers:

“Chelsea Manning, formerly Bradley Manning, was sentenced to 35 years in prison in 2013 for disclosing classified documents to WikiLeaks. Stephen Jin-Woo Kim, a former State Department contractor, entered a guilty plea last year to one felony count of disclosing classified information to a Fox News reporter in February 2014. He was sentenced to 13 months in prison. On Monday, prosecutors urged a judge to sentence Jeffrey Sterling, a former CIA officer, to at least 20 years in prison for leaking classified plans to sabotage Iran’s nuclear-weapons program to a New York Times reporter. Sterling will be sentenced next month. And former CIA officer John C. Kiriakou served 30 months in federal prison after he disclosed the name of a covert operative to a reporter. He was released in February and is finishing up three months of house arrest.”

The information Petraeus was accused of leaking, according to the original indictment, contained “classified information regarding the identities of covert officers, war strategy, intelligence capabilities and mechanisms, diplomatic discussions, quotes and deliberative discussions from high-level National Security Council meetings.” The leak also included “discussions with the president of the United States.”

The judge presiding over the case, US Magistrate Judge David Keesler, increased the government’s recommended fine of $40,000 to $100,000 because of Petraeus’ “grave but uncharacteristic error in judgement.”

Read the entire story here.

Images: Four-Star General David Petraeus; Private Chelsea Manning. Courtesy of Wikipedia.

Belief and the Falling Light

Many of us now accept that lights falling from the sky are rocky interlopers from the asteroid belt within our solar system, rather than visiting angels or signs from an angry (or mysteriously benevolent) God. New analysis of the meteor that exploded over Chelyabinsk in Russia in 2013 suggests that one of the key founders of Christianity may have witnessed a similar natural phenomenon around two thousand years ago. However, at the time, Saul (later to become Paul the evangelist) interpreted the dazzling light on the road to Damascus – Acts of the Apostles, New Testament – as a message from a Christian God. The rest, as they say, is history. Luckily, recent scientific progress now means that most of us no longer establish new religious movements based on fireballs in the sky. But, we are awed nonetheless.

From the New Scientist:

Nearly two thousand years ago, a man named Saul had an experience that changed his life, and possibly yours as well. According to Acts of the Apostles, the fifth book of the biblical New Testament, Saul was on the road to Damascus, Syria, when he saw a bright light in the sky, was blinded and heard the voice of Jesus. Changing his name to Paul, he became a major figure in the spread of Christianity.

William Hartmann, co-founder of the Planetary Science Institute in Tucson, Arizona, has a different explanation for what happened to Paul. He says the biblical descriptions of Paul’s experience closely match accounts of the fireball meteor seen above Chelyabinsk, Russia, in 2013.

Hartmann has detailed his argument in the journal Meteoritics & Planetary Science (doi.org/3vn). He analyses three accounts of Paul’s journey, thought to have taken place around AD 35. The first is a third-person description of the event, thought to be the work of one of Jesus’s disciples, Luke. The other two quote what Paul is said to have subsequently told others.

“Everything they are describing in those three accounts in the book of Acts are exactly the sequence you see with a fireball,” Hartmann says. “If that first-century document had been anything other than part of the Bible, that would have been a straightforward story.”

But the Bible is not just any ancient text. Paul’s Damascene conversion and subsequent missionary journeys around the Mediterranean helped build Christianity into the religion it is today. If his conversion was indeed as Hartmann explains it, then a random space rock has played a major role in determining the course of history (see “Christianity minus Paul”).

That’s not as strange as it sounds. A large asteroid impact helped kill off the dinosaurs, paving the way for mammals to dominate the Earth. So why couldn’t a meteor influence the evolution of our beliefs?

“It’s well recorded that extraterrestrial impacts have helped to shape the evolution of life on this planet,” says Bill Cooke, head of NASA’s Meteoroid Environment Office in Huntsville, Alabama. “If it was a Chelyabinsk fireball that was responsible for Paul’s conversion, then obviously that had a great impact on the growth of Christianity.”

Hartmann’s argument is possible now because of the quality of observations of the Chelyabinsk incident. The 2013 meteor is the most well-documented example of larger impacts that occur perhaps only once in 100 years. Before 2013, the 1908 blast in Tunguska, also in Russia, was the best example, but it left just a scattering of seismic data, millions of flattened trees and some eyewitness accounts. With Chelyabinsk, there is a clear scientific argument to be made, says Hartmann. “We have observational data that match what we see in this first-century account.”

Read the entire article here.

Video: Meteor above Chelyabinsk, Russia in 2013. Courtesy of Tuvix72.

Endless Political Campaigning

The great capitalist market has decided — endless political campaigning in the United States is beneficial. If you think the presidential campaign to elect the next leader in 2016 began sometime last year, you are not mistaken. In fact, it really does seem that political posturing for the next election often begins before the current one is even decided. We all complain: too many ads, too much negativity, far too much inanity and little substance. Yet, we allow the process to continue, and to grow in scale. Would you put up with a political campaign that lasts a mere 38 days? The British seem to do it. But, then again, the United States is so much more advanced, right?

From WSJ:

On March 23, Ted Cruz announced he is running for president in a packed auditorium at Liberty University in Lynchburg, Va. On April 7, Rand Paul announced he is running for president amid the riverboat décor of the Galt House hotel in Louisville, Ky. On April 12, Hillary Clinton announced she is running for president in a brief segment of a two-minute video. On April 13, Marco Rubio announced he is running before a cheering crowd at the Freedom Tower in Miami. And these are just the official announcements.

Jeb Bush made it known in December that he is interested in running. Scott Walker’s rousing speech at the Freedom Summit in Des Moines, Iowa, on Jan. 24 left no doubt that he will enter the race. Chris Christie’s appearance in New Hampshire last week strongly suggests the same. Previous presidential candidates Mike Huckabee, Rick Perry and Rick Santorum seem almost certain to run. Pediatric neurosurgeon Ben Carson is reportedly ready to announce his run on May 4 at the Detroit Music Hall.

With some 570 days left until Election Day 2016, the race for president is very much under way—to the dismay of a great many Americans. They find the news coverage of the candidates tiresome (what did Hillary order at Chipotle?), are depressed by the negative campaigning that is inevitable in an adversarial process, and dread the onslaught of political TV ads. Too much too soon!

They also note that other countries somehow manage to select their heads of government much more quickly. The U.K. has a general election campaign going on right now. It began on March 30, when the queen, on the advice of the prime minister, dissolved Parliament, and voting will take place on May 7. That’s 38 days later. Britons are complaining that the electioneering goes on too long.

American presidential campaigns did not always begin so soon, but they have for more than a generation now. As a young journalist, Sidney Blumenthal (in recent decades a consigliere to the Clintons) wrote quite a good book titled “The Permanent Campaign.” It was published in 1980. Mr. Blumenthal described what was then a relatively new phenomenon.

When Jimmy Carter announced his candidacy for president in January 1975, he was not taken particularly seriously. But his perseverance paid off, and he took the oath of office two years later. His successors—Ronald Reagan, George H.W. Bush and Bill Clinton—announced their runs in the fall before their election years, although they had all been busy assembling campaigns before that. George W. Bush announced in June 1999, after the adjournment of the Texas legislature. Barack Obama announced in February 2007, two days before Lincoln’s birthday, in Lincoln’s Springfield, Ill. By that standard, declared candidates Mr. Cruz, Mr. Paul, Mrs. Clinton and Mr. Rubio got a bit of a late start.

Why are American presidential campaigns so lengthy? And is there anything that can be done to compress them to a bearable timetable?

One clue to the answers: The presidential nominating process, the weakest part of our political system, is also the one part that was not envisioned by the Founding Fathers. The framers of the Constitution created a powerful presidency, confident (justifiably, as it turned out) that its first incumbent, George Washington, would set precedents that would guide the republic for years to come.

But they did not foresee that even in Washington’s presidency, Americans would develop political parties, which they abhorred. The Founders expected that later presidents would be chosen, usually by the House of Representatives, from local notables promoted by different states in the Electoral College. They did not expect that the Federalist and Republican parties would coalesce around two national leaders—Washington’s vice president, John Adams, and Washington’s first secretary of state, Thomas Jefferson—in the close elections of 1796 and 1800.

The issue then became: When a president followed George Washington’s precedent and retired after two terms, how would the parties choose nominees, in a republic that, from the start, was regionally, ethnically and religiously diverse?

Read the entire story here.

Image courtesy of Google Search.

Religious Dogma and DNA

Despite ongoing conflicts around the globe that are fueled or governed by religious fanaticism, it is entirely plausible that our general tendency toward supernatural belief is encoded in our DNA. Of course, this does not mean that a God or that various gods exist; it merely implies that, over time, natural selection generally favored those who believed in deities over those who did not. We are such complex and contradictory animals.

From NYT:

Most of us find it mind-boggling that some people seem willing to ignore the facts — on climate change, on vaccines, on health care — if the facts conflict with their sense of what someone like them believes. “But those are the facts,” you want to say. “It seems weird to deny them.”

And yet a broad group of scholars is beginning to demonstrate that religious belief and factual belief are indeed different kinds of mental creatures. People process evidence differently when they think with a factual mind-set rather than with a religious mind-set. Even what they count as evidence is different. And they are motivated differently, based on what they conclude. On what grounds do scholars make such claims?

First of all, they have noticed that the very language people use changes when they talk about religious beings, and the changes mean that they think about their realness differently. You do not say, “I believe that my dog is alive.” The fact is so obvious it is not worth stating. You simply talk in ways that presume the dog’s aliveness — you say she’s adorable or hungry or in need of a walk. But to say, “I believe that Jesus Christ is alive” signals that you know that other people might not think so. It also asserts reverence and piety. We seem to regard religious beliefs and factual beliefs with what the philosopher Neil Van Leeuwen calls different “cognitive attitudes.”

Second, these scholars have remarked that when people consider the truth of a religious belief, what the belief does for their lives matters more than, well, the facts. We evaluate factual beliefs often with perceptual evidence. If I believe that the dog is in the study but I find her in the kitchen, I change my belief. We evaluate religious beliefs more with our sense of destiny, purpose and the way we think the world should be. One study found that over 70 percent of people who left a religious cult did so because of a conflict of values. They did not complain that the leader’s views were mistaken. They believed that he was a bad person.

Third, these scholars have found that religious and factual beliefs play different roles in interpreting the same events. Religious beliefs explain why, rather than how. People who understand readily that diseases are caused by natural processes might still attribute sickness at a particular time to demons, or healing to an act of God. The psychologist Cristine H. Legare and her colleagues recently demonstrated that people use both natural and supernatural explanations in this interdependent way across many cultures. They tell a story, as recounted by Tracy Kidder’s book on the anthropologist and physician Paul Farmer, about a woman who had taken her tuberculosis medication and been cured — and who then told Dr. Farmer that she was going to get back at the person who had used sorcery to make her ill. “But if you believe that,” he cried, “why did you take your medicines?” In response to the great doctor she replied, in essence, “Honey, are you incapable of complexity?”

Moreover, people’s reliance on supernatural explanations increases as they age. It may be tempting to think that children are more likely than adults to reach out to magic to explain something, and that they increasingly put that mind-set to the side as they grow up, but the reverse is true. It’s the young kids who seem skeptical when researchers ask them about gods and ancestors, and the adults who seem clear and firm. It seems that supernatural ideas do things for adults they do not yet do for children.

Finally, scholars have determined that people don’t use rational, instrumental reasoning when they deal with religious beliefs. The anthropologist Scott Atran and his colleagues have shown that sacred values are immune to the normal cost-benefit trade-offs that govern other dimensions of our lives. Sacred values are insensitive to quantity (one cartoon can be a profound insult). They don’t respond to material incentives (if you offer people money to give up something that represents their sacred value, they often become more intractable in their refusal). Sacred values may even have different neural signatures in the brain.

The danger point seems to be when people feel themselves to be completely fused with a group defined by its sacred value. When Mr. Atran and his colleagues surveyed young men in two Moroccan neighborhoods associated with militant jihad (one of them home to five men who helped plot the 2004 Madrid train bombings, and then blew themselves up), they found that those who described themselves as closest to their friends and who upheld Shariah law were also more likely to say that they would suffer grievous harm to defend Shariah law. These people become what Mr. Atran calls “devoted actors” who are unconditionally committed to their sacred value, and they are willing to die for it.

Read the entire article here.

MondayMap: Imagining a Post-Post-Ottoman World

The United States is often portrayed as the world’s bully and nefarious geo-political schemer — a nation responsible for many of the world’s current political ills. However, it is the French and British who should be called to account for much of the globe’s ongoing turmoil, particularly in the Middle East. After the end of WWI the victors expeditiously carved up the spoils of the vanquished Austro-Hungarian and Ottoman Empires. Much of Eastern Europe and the Middle East was divvied up and traded just as kids might swap baseball or football (soccer) cards today. French prime minister Georges Clemenceau and British prime minister David Lloyd George famously bartered and gifted — amongst themselves and their friends — entire regions and cities without thought to historical precedent, geographic and ethnic boundaries, or even the basic needs of entire populations. Their decisions were merely lines to be drawn and re-drawn on a map.

So, it would be a fascinating — though rather naive — exercise to re-draw many of today’s arbitrary and contrived boundaries, and to revert regions to their more appropriate owners. Of course, where and when should this thought experiment begin and end? Pre-Roman Empire, post-Normans, before the Prussians, prior to the Austro-Hungarian Empire, or after the Ottomans, post-Soviets, or after Tito, or way before the Huns, the Vandals and any number of the other barbarian and Germanic tribes?

Nevertheless, essayist Yaroslav Trofimov takes a stab at re-districting to pre-Ottoman boundaries and imagines a world with less bloodshed. A worthy dream.

From WSJ:

Shortly after the end of World War I, the French and British prime ministers took a break from the hard business of redrawing the map of Europe to discuss the easier matter of where frontiers would run in the newly conquered Middle East.

Two years earlier, in 1916, the two allies had agreed on their respective zones of influence in a secret pact—known as the Sykes-Picot agreement—for divvying up the region. But now the Ottoman Empire lay defeated, and the United Kingdom, having done most of the fighting against the Turks, felt that it had earned a juicier reward.

“Tell me what you want,” France’s Georges Clemenceau said to Britain’s David Lloyd George as they strolled in the French embassy in London.

“I want Mosul,” the British prime minister replied.

“You shall have it. Anything else?” Clemenceau asked.

In a few seconds, it was done. The huge Ottoman imperial province of Mosul, home to Sunni Arabs and Kurds and to plentiful oil, ended up as part of the newly created country of Iraq, not the newly created country of Syria.

The Ottomans ran a multilingual, multireligious empire, ruled by a sultan who also bore the title of caliph—commander of all the world’s Muslims. Having joined the losing side in the Great War, however, the Ottomans saw their empire summarily dismantled by European statesmen who knew little about the region’s people, geography and customs.

The resulting Middle Eastern states were often artificial creations, sometimes with implausibly straight lines for borders. They have kept going since then, by and large, remaining within their colonial-era frontiers despite repeated attempts at pan-Arab unification.

The built-in imbalances in some of these newly carved-out states—particularly Syria and Iraq—spawned brutal dictatorships that succeeded for decades in suppressing restive majorities and perpetuating the rule of minority groups.

But now it may all be coming to an end. Syria and Iraq have effectively ceased to function as states. Large parts of both countries lie beyond central government control, and the very meaning of Syrian and Iraqi nationhood has been hollowed out by the dominance of sectarian and ethnic identities.

The rise of Islamic State is the direct result of this meltdown. The Sunni extremist group’s leader, Abu Bakr al-Baghdadi, has proclaimed himself the new caliph and vowed to erase the shame of the “Sykes-Picot conspiracy.” After his men surged from their stronghold in Syria last summer and captured Mosul, now one of Iraq’s largest cities, he promised to destroy the old borders. In that offensive, one of the first actions taken by ISIS (as his group is also known) was to blow up the customs checkpoints between Syria and Iraq.

“What we are witnessing is the demise of the post-Ottoman order, the demise of the legitimate states,” says Francis Ricciardone, a former U.S. ambassador to Turkey and Egypt who is now at the Atlantic Council, a Washington think tank. “ISIS is a piece of that, and it is filling in a vacuum of the collapse of that order.”

In the mayhem now engulfing the Middle East, it is mostly the countries created a century ago by European colonialists that are coming apart. In the region’s more “natural” nations, a much stronger sense of shared history and tradition has, so far, prevented a similar implosion.

“Much of the conflict in the Middle East is the result of insecurity of contrived states,” says Husain Haqqani, an author and a former Pakistani ambassador to the U.S. “Contrived states need state ideologies to make up for lack of history and often flex muscles against their own people or against neighbors to consolidate their identity.”

In Egypt, with its millennial history and strong sense of identity, almost nobody questioned the country’s basic “Egyptian-ness” throughout the upheaval that has followed President Hosni Mubarak’s ouster in a 2011 revolution. As a result, most of Egypt’s institutions have survived the turbulence relatively intact, and violence has stopped well short of outright civil war.

Turkey and Iran—both of them, in bygone eras, the center of vast empires—have also gone largely unscathed in recent years, even though both have large ethnic minorities of their own, including Arabs and Kurds.

The Middle East’s “contrived” countries weren’t necessarily doomed to failure, and some of them—notably Jordan—aren’t collapsing, at least not yet. The world, after all, is full of multiethnic and multiconfessional states that are successful and prosperous, from Switzerland to Singapore to the U.S., which remains a relative newcomer as a nation compared with, say, Iran.

Read the entire article here.

Image: Map of Sykes–Picot Agreement showing Eastern Turkey in Asia, Syria and Western Persia, and areas of control and influence agreed between the British and the French. Royal Geographical Society, 1910-15. Signed by Mark Sykes and François Georges-Picot, 8 May 1916. Courtesy of Wikipedia.

Yes M’Lady

Beneath the shell that envelops us as adults lies the child. We all have one inside — that vulnerable being who dreams, plays and improvises. Sadly, our contemporary society does a wonderful job of selectively numbing these traits, usually as soon as we enter school; our work finishes the process by quashing all remnants of our once colorful and unbounded imaginations. OK, I’m exaggerating a little to make my point. But I’m certain this strikes a chord.

Keeping this in mind, it’s awesomely brilliant to see Thunderbirds making a comeback. You may recall the original Thunderbirds TV show from the mid-sixties. Created by Gerry and Sylvia Anderson, the marionette puppets and their International Rescue science-fiction machines would save us weekly from the forces of evil, destruction and chaos. The child who lurks within me utterly loved this show — everything would come to a halt to make way for this event on Saturday mornings. Now I have a chance of reliving it with my kids, and maintaining some degree of childhood wonder in the process. Thunderbirds are go…

From the Guardian:

5, 4, 3, 2, 1 … Thunderbirds are go – but not quite how older viewers will remember. International Rescue has been given a makeover for the modern age, with the Tracy brothers, Brains, Lady Penelope and Parker smarter, fitter and with better gadgets than they ever had when the “supermarionation” show began on ITV half a century ago.

But fans fearful that its return, complete with Hollywood star Rosamund Pike voicing Lady Penelope, will trample all over their childhood memories can rest easy.

Unlike the 2004 live action film which Thunderbirds creator, the late Gerry Anderson, described as the “biggest load of crap I have ever seen in my life”, the new take on the children’s favourite, called Thunderbirds Are Go, remains remarkably true to the spirit of the 50-year-old original.

Gone are the puppet strings – audience research found that younger viewers wanted something more dynamic – but along with computer generated effects are models and miniature sets (“actually rather huge” said executive producer Estelle Hughes) that faithfully recall the original Thunderbirds.

Speaking after the first screening of the new ITV series on Tuesday, executive producer Giles Ridge said: “We felt we should pay tribute to all those elements that made it special but at the same time update it so it’s suitable and compelling for a modern audience.

“The basic DNA of the show – five young brothers on a secret hideaway island with the most fantastic craft you could imagine, helping people around the world who are in trouble, that’s not a bad place to start.”

The theme music is intact, albeit given a 21st century makeover, as is the Tracy Island setting – complete with the avenue of palm trees that makes way for Thunderbird 2 and the swimming pool that slides into the mountain for the launch of Thunderbird 1.

Lady Penelope – as voiced by Pike – still has a cut-glass accent and is entirely unflappable. When she is not saving the world she is visiting Buckingham Palace or attending receptions at 10 Downing Street. There is also a nod – blink and you miss it – to another Anderson puppet series, Stingray.

David Graham, who voiced Parker in the original series, returns in the same role. “I think they were checking me out to see if I was still in one piece,” said Graham, now 89, of the meeting when he was first approached to appear in the new series.

“I was absolutely thrilled to repeat the voice and character of Parker. Although I am older my voice hasn’t changed too much over the years.”

He said the voice of Parker had come from a wine waiter who used to work in the royal household, whom Anderson had taken him to see in a pub in Cookham, Berkshire.

“He came over and said, ‘Would you like to see the wine list, sir?’ And Parker was born. Thank you, old mate.”

Brains, as voiced by Fonejacker star Kayvan Novak, now has an Indian accent.

Sylvia Anderson, Anderson’s widow, who co-created the show, will make a guest appearance as Lady Penelope’s “crazy aunt”.

Read the entire story here.

Image courtesy of Google Search.

Your Current Dystopian Nightmare: In Just One Click

Amazon was supposed to give you back precious time by making shopping and spending painless and simple. Apps on your smartphone were supposed to do the same for all manner of re-tooled on-demand services. What wonderful time-saving inventions! So, now you can live in the moment and make use of all this extra free time. It’s your time now. You’ve won it back and no one can take it away.

And, what do you spend this newly earned free time doing? Well, you sit at home in your isolated cocoon, you shop for more things online, you download some more great apps that promise to bring even greater convenience, you interact less with real humans, and, best of all, you spend more time working. Welcome to your new dystopian nightmare, and it’s happening right now. Click.

From Medium:

Angel the concierge stands behind a lobby desk at a luxe apartment building in downtown San Francisco, and describes the residents of this imperial, 37-story tower. “Ubers, Squares, a few Twitters,” she says. “A lot of work-from-homers.”

And by late afternoon on a Tuesday, they’re striding into the lobby at a just-get-me-home-goddammit clip, some with laptop bags slung over their shoulders, others carrying swank leather satchels. At the same time a second, temporary population streams into the building: the app-based meal delivery people hoisting thermal carrier bags and sacks. Green means Sprig. A huge M means Munchery. Down in the basement, Amazon Prime delivery people check in packages with the porter. The Instacart groceries are plunked straight into a walk-in fridge.

This is a familiar scene. Five months ago I moved into a spartan apartment a few blocks away, where dozens of startups and thousands of tech workers live. Outside my building there’s always a phalanx of befuddled delivery guys who seem relieved when you walk out, so they can get in. Inside, the place is stuffed with the goodies they bring: Amazon Prime boxes sitting outside doors, evidence of the tangible, quotidian needs that are being serviced by the web. The humans who live there, though, I mostly never see. And even when I do, there seems to be a tacit agreement among residents to not talk to one another. I floated a few “hi’s” in the elevator when I first moved in, but in return I got the monosyllabic, no-eye-contact mumble. It was clear: Lady, this is not that kind of building.

Back in the elevator in the 37-story tower, the messengers do talk, one tells me. They end up asking each other which apps they work for: Postmates. Seamless. EAT24. GrubHub. Safeway.com. A woman hauling two Whole Foods sacks reads the concierge an apartment number off her smartphone, along with the resident’s directions: “Please deliver to my door.”

“They have a nice kitchen up there,” Angel says. The apartments rent for as much as $5,000 a month for a one-bedroom. “But so much, so much food comes in. Between 4 and 8 o’clock, they’re on fire.”

I start to walk toward home. En route, I pass an EAT24 ad on a bus stop shelter, and a little further down the street, a Dungeons & Dragons–type dude opens the locked lobby door of yet another glass-box residential building for a Sprig deliveryman:

“You’re…”

“Jonathan?”

“Sweet,” Dungeons & Dragons says, grabbing the bag of food. The door clanks behind him.

And that’s when I realized: the on-demand world isn’t about sharing at all. It’s about being served. This is an economy of shut-ins.

In 1998, Carnegie Mellon researchers warned that the internet could make us into hermits. They released a study monitoring the social behavior of 169 people making their first forays online. The web-surfers started talking less with family and friends, and grew more isolated and depressed. “We were surprised to find that what is a social technology has such anti-social consequences,” said one of the researchers at the time. “And these are the same people who, when asked, describe the Internet as a positive thing.”

We’re now deep into the bombastic buildout of the on-demand economy—with investment in the apps, platforms and services surging exponentially. Right now Americans buy nearly eight percent of all their retail goods online, though that seems a wild underestimate in the most congested, wired, time-strapped urban centers.

Many services promote themselves as life-expanding—there to free up your time so you can spend it connecting with the people you care about, not standing at the post office with strangers. Rinse’s ad shows a couple chilling at a park, their laundry being washed by someone, somewhere beyond the picture’s frame. But plenty of the delivery companies are brutally honest that, actually, they never want you to leave home at all.

GrubHub’s advertising banks on us secretly never wanting to talk to a human again: “Everything great about eating, combined with everything great about not talking to people.” DoorDash, another food delivery service, goes for the all-caps, batshit extreme:

“NEVER LEAVE HOME AGAIN.”

Katherine van Ekert isn’t a shut-in, exactly, but there are only two things she ever has to run errands for any more: trash bags and saline solution. For those, she must leave her San Francisco apartment and walk two blocks to the drug store, “so woe is my life,” she tells me. (She realizes her dry humor about #firstworldproblems may not translate, and clarifies later: “Honestly, this is all tongue in cheek. We’re not spoiled brats.”) Everything else is done by app. Her husband’s office contracts with Washio. Groceries come from Instacart. “I live on Amazon,” she says, buying everything from curry leaves to a jogging suit for her dog, complete with hoodie.

She’s so partial to these services, in fact, that she’s running one of her own: A veterinarian by trade, she’s a co-founder of VetPronto, which sends an on-call vet to your house. It’s one of a half-dozen on-demand services in the current batch at Y Combinator, the startup factory, including a marijuana delivery app called Meadow (“You laugh, but they’re going to be rich,” she says). She took a look at her current clients—they skew late 20s to late 30s, and work in high-paying jobs: “The kinds of people who use a lot of on-demand services and hang out on Yelp a lot.”

Basically, people a lot like herself. That’s the common wisdom: the apps are created by the urban young for the needs of urban young. The potential of delivery with a swipe of the finger is exciting for van Ekert, who grew up without such services in Sydney and recently arrived in wired San Francisco. “I’m just milking this city for all it’s worth,” she says. “I was talking to my father on Skype the other day. He asked, ‘Don’t you miss a casual stroll to the shop?’ Everything we do now is time-limited, and you do everything with intention. There’s not time to stroll anywhere.”

Suddenly, for people like van Ekert, the end of chores is here. After hours, you’re free from dirty laundry and dishes. (TaskRabbit’s ad rolls by me on a bus: “Buy yourself time—literally.”)

So here’s the big question. What does she, or you, or any of us do with all this time we’re buying? Binge on Netflix shows? Go for a run? Van Ekert’s answer: “It’s more to dedicate more time to working.”

Read the entire story here.

April Can Mean Only One Thing

The advent of April in the United States usually brings the impending tax day to mind. In the UK, when April rolls in, it means the media goes overboard with April Fool’s jokes. Here’s a smattering of the silliest from Britain’s most serious media outlets.

From the Telegraph: transparent Marmite, Yessus Juice, prison release voting app, Burger King cologne (for men).

From the Guardian: Jeremy Clarkson and fossil fuel divestment.

From the Independent: a round-up of the best gags, including the proposed Edinburgh suspension bridge featuring a gap, Simon Cowell’s effigy on the new £5 note, grocery store aisle trampolines for the short of stature.

Image: Hailo’s new piggyback rideshare service.

Women Are From Venus, Men Can’t Remember

Yet another body of research underscores how different women are from men. This time, we are told, the sexes generally encode and recall memories differently. So, the next time you take issue with a spouse (of a different gender) about a — typically trivial — past event, keep in mind that your own actions, mood and gender will affect your recall. If you’re female, your memories may be much more vivid than your male counterpart’s, but not necessarily more correct. If you (male) won last night’s argument, your spouse (female) will — unfortunately for you — remember it more accurately than you, which of course will lead to another argument.

From WSJ:

Carrie Aulenbacher remembers the conversation clearly: Her husband told her he wanted to buy an arcade machine he found on eBay. He said he’d been saving up for it as a birthday present to himself. The spouses sat at the kitchen table and discussed where it would go in the den.

Two weeks later, Ms. Aulenbacher came home from work and found two arcade machines in the garage—and her husband beaming with pride.

“What are these?” she demanded.

“I told you I was picking them up today,” he replied.

She asked him why he’d bought two. He said he’d told her he was getting “a package deal.” She reminded him they’d measured the den for just one. He stood his ground.

“I believe I told her there was a chance I was going to get two,” says Joe Aulenbacher, who is 37 and lives in Erie, Pa.

“It still gets me going to think about it a year later,” says Ms. Aulenbacher, 36. “My home is now overrun with two machines I never agreed upon.” The couple compromised by putting one game in the den and the other in Mr. Aulenbacher’s weight room.

It is striking how many arguments in a relationship start with two different versions of an event: “Your tone of voice was rude.” “No it wasn’t.” “You didn’t say you’d be working late.” “Yes I did.” “I told you we were having dinner with my mother tonight.” “No, honey. You didn’t.”

How can two people have different memories of the same event? It starts with the way each person perceives the event in the first place—and how they encoded that memory. “You may recall something differently at least in part because you understood it differently at the time,” says Dr. Michael Ross, professor emeritus in the psychology department at the University of Waterloo in Ontario, Canada, who has studied memory for many years.

Researchers know that spouses sometimes can’t even agree on concrete events that happened in the past 24 hours—such as whether they had an argument or whether one received a gift from the other. A study in the early 1980s, published in the journal “Behavioral Assessment,” found that couples couldn’t perfectly agree on whether they had sex the previous night.

Women tend to remember more about relationship issues than men do. When husbands and wives are asked to recall concrete relationship events, such as their first date, an argument or a recent vacation, women’s memories are more vivid and detailed.

But not necessarily more accurate. When given a standard memory test where they are shown names or pictures and then asked to recall them, women do just about the same as men.

Researchers have found that women report having more emotions during relationship events than men do. They may remember events better because they pay more attention to the relationship and reminisce more about it.

People also remember their own actions better. So they can recall what they did, just not what their spouse did. Researchers call this an egocentric bias, and study it by asking people to recall their contributions to events, as well as their spouse’s. Who cleans the kitchen more? Who started the argument? Whether the event is positive or negative, people tend to believe that they had more responsibility.

Your mood—both when an event happens and when you recall it later—plays a big part in memory, experts say. If you are in a positive mood or feeling positive about the other person, you will more likely recall a positive experience or give a positive interpretation to a negative experience. Similarly, negative moods tend to reap negative memories.

Negative moods may also cause stronger memories. A person who lost an argument remembers it more clearly than the person who won it, says Dr. Ross. Men tend to win more arguments, he says, which may help to explain why women remember the spat more. But men who lost an argument remember it as well as women who lost.

Read the entire article here.

We Are All Always Right, All of the Time

You already know this: you believe that your opinion is correct all the time, about everything. And, interestingly enough, your friends and neighbors believe that they are always right too. Oh, and the colleague at the office with whom you argue all the time — she’s right all the time too.

How can this be, when in an increasingly science-driven, objective universe facts trump opinion? Well, not so fast. It seems that we humans have an internal mechanism that colors our views based on a need for acceptance within a broader group. That is, we generally tend to spin our rational views in favor of group consensus, versus supporting the views of a subject matter expert, which might polarize the group. This is both good and bad. Good because it reinforces the broader benefits of being within a group; bad because we are more likely to reject opinion, evidence and fact from experts outside of our group — think climate change.

From the Washington Post:

It’s both the coolest — and also in some ways the most depressing — psychology study ever.

Indeed, it’s so cool (and so depressing) that the name of its chief finding — the Dunning-Kruger effect — has at least halfway filtered into public consciousness. In the classic 1999 paper, Cornell researchers David Dunning and Justin Kruger found that the less competent people were in three domains — humor, logic, and grammar — the less likely they were to be able to recognize that. Or as the researchers put it:

We propose that those with limited knowledge in a domain suffer from a dual burden: Not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it.

Dunning and Kruger didn’t directly apply this insight to our debates about science. But I would argue that the effect named after them certainly helps to explain phenomena like vaccine denial, in which medical authorities have voiced a very strong opinion, but some parents just keep on thinking that, somehow, they’re in a position to challenge or ignore this view.

So why do I bring this classic study up now?

The reason is that an important successor to the Dunning-Kruger paper has just come out — and it, too, is pretty depressing (at least for those of us who believe that domain expertise is a thing to be respected and, indeed, treasured). This time around, psychologists have not uncovered an endless spiral of incompetence and the inability to perceive it. Rather, they’ve shown that people have an “equality bias” when it comes to competence or expertise, such that even when it’s very clear that one person in a group is more skilled, expert, or competent (and the other less), they are nonetheless inclined to seek out a middle ground in determining how correct different viewpoints are.

Yes, that’s right — we’re all right, nobody’s wrong, and nobody gets hurt feelings.

The new study, just published in the Proceedings of the National Academy of Sciences, is by Ali Mahmoodi of the University of Tehran and a long list of colleagues from universities in the UK, Germany, China, Denmark, and the United States. And no wonder: The research was transnational, and the same experiment — with the same basic results — was carried out across cultures in China, Denmark, and Iran.

Read the entire story here.

Hyper-Parenting and Couch Potato Kids

Parents who are overly engaged in micro-managing the academic, athletic and social lives of their kids may be responsible for ensuring their offspring lead less active lives. A new research study finds children of so-called hyper-parents are significantly less active than peers with less involved parents. Hyper-parenting seems to come in four flavors: helicopter parents who hover over their child’s every move; tiger moms who constantly push for superior academic attainment; little-emperor parents who shower their kids with material things; and concerted-cultivation parents who over-schedule their kids with never-ending after-school activities. If you recognize yourself in one of these parenting styles, take a deep breath, think back on when, as a 7-12 year-old, you had the most fun, and let your kids play outside — preferably in the rain and mud!

From the WSJ / Preventive Medicine:

Hyper-parenting may increase the risk of physical inactivity in children, a study in the April issue of Preventive Medicine suggests.

Children with parents who tended to be overly involved in their academic, athletic and social lives—a child-rearing style known as hyper-parenting—spent less time outdoors, played fewer after-school sports and were less likely to bike or walk to school, friends’ homes, parks and playgrounds than children with less-involved parents.

Hyperparenting, although it’s intended to benefit children by giving them extra time and attention, could have adverse consequences for their health, the researchers said.

The study, at Queen’s University in Ontario, surveyed 724 parents of children, ages 7 to 12 years old, born in the U.S. and Canada from 2002 to 2007. (The survey was based on parents’ interaction with the oldest child.)

Questionnaires assessed four hyper-parenting styles: helicopter or overprotective parents; little-emperor parents who shower children with material goods; so-called tiger moms who push for exceptional achievement; and parents who schedule excessive extracurricular activities, termed concerted cultivation. Hyperparenting was ranked in five categories from low to high based on average scores in the four styles.

Children’s preferred play location was their yard at home, and 64% of the children played there at least three times a week. Only 12% played on streets and cul-de-sacs away from home. Just over a quarter walked or cycled to school or friends’ homes, and slightly fewer to parks and playgrounds. Organized sports participation was 26%.

Of parents, about 40% had high hyper-parenting scores and 6% had low scores. The most active children had parents with low to below-average scores in all four hyper-parenting styles, while the least active had parents with average-to-high hyper-parenting scores. The difference between children in the low and high hyper-parenting groups was equivalent to about 20 physical-activity sessions a week, the researchers said.

Read the entire story here.

Image courtesy of Google Search.

Humor Versus Horror

Faced with unspeakable horror, many of us usually turn away. Some courageous souls turn to humor to counter the vileness of others. So, it is heartwarming to see comedians and satirists taking up rhetorical arms in the backyards of murderers and terrorists. Fighting violence and terror with more of the same may show progress in the short term, but ridiculing our enemies with humor and thoughtful dialogue is the only long-term way to fight evil in its many human forms. A profound thank you to these four brave Syrian refugees who, in the face of much personal danger, are able to laugh at their foes.

From the Guardian:

They don’t have much to laugh about. But four young Syrian refugees from Aleppo believe humour may be the only antidote to the horrors taking place back home.

Settled in a makeshift studio in the Turkish city of Gaziantep 40 miles from the Syrian border, the film-makers decided ridicule was an effective way of responding to Islamic State and its grisly record of extreme violence.

“The entire world seems to be terrified of Isis, so we want to laugh at them, expose their hypocrisy and show that their interpretation of Islam does not represent the overwhelming majority of Muslims,” says Maen Watfe, 27. “The media, especially the western media, obsessively reproduce Isis propaganda portraying them as strong and intimidating. We want to show their weaknesses.”

The films and videos on Watfe and his three friends’ website mock the Islamist extremists and depict them as naive simpletons, hypocritical zealots and brutal thugs. It’s a high-risk undertaking. They have had to move house and keep their addresses secret from even their best friends after receiving death threats.

But the video activists – Watfe, Youssef Helali, Mohammed Damlakhy and Aya Brown – will not be deterred.

Their film The Prince shows Isis leader and self-appointed caliph Abu Bakr al-Baghdadi drinking wine, listening to pop music and exchanging selfies with girls on his smartphone. A Moroccan jihadi arrives saying he came to Syria to “liberate Jerusalem”. The leader swaps the wine for milk and switches the music to Islamic chants praising martyrdom. Then he hands the Moroccan a suicide belt and sends him off against a unit of Free Syrian Army fighters. The grenades detonate, and Baghdadi reaches for his glass of wine and turns the pop music back on.

It is pieces like this that have brought hate mail and threats via social media.

“One of them said that they would finish us off like they finished off Charlie [Hebdo],” Brown, 26, recalls. She declined to give her real name out of fear for her family, who still live in Aleppo. “In the end we decided to move from our old apartment.”

The Turkish landlord told them Arabic-speaking men had repeatedly asked for their whereabouts after they left, and kept the studio under surveillance.

Follow the story here.

Video: Happy Valentine. Courtesy of Dayaaltaseh Productions.

Household Chores for Kids Are Good

Apparently household chores are becoming rather yesterday. Several recent surveys — no doubt commissioned by my children — show that shared duties in the home are a dying phenomenon. No, I hear you cry. Not only do chores provide a necessary respite from the otherwise 24/7 videogame-and-texting addiction, they help establish a sense of responsibility and reinforce our increasingly imperiled altruistic tendencies. So, parents, get out the duster, vacuum, fresh sheets and laundry basket, and put those (little) people to work before it’s too late. But first of all, let’s rename “chores” as “responsibilities”.

From WSJ:

Today’s demands for measurable childhood success—from the Common Core to college placement—have chased household chores from the to-do lists of many young people. In a survey of 1,001 U.S. adults released last fall by Braun Research, 82% reported having regular chores growing up, but only 28% said that they require their own children to do them. With students under pressure to learn Mandarin, run the chess club or get a varsity letter, chores have fallen victim to the imperatives of resume-building—though it is hardly clear that such activities are a better use of their time.

“Parents today want their kids spending time on things that can bring them success, but ironically, we’ve stopped doing one thing that’s actually been a proven predictor of success—and that’s household chores,” says Richard Rende, a developmental psychologist in Paradise Valley, Ariz., and co-author of the forthcoming book “Raising Can-Do Kids.” Decades of studies show the benefits of chores—academically, emotionally and even professionally.

Giving children household chores at an early age helps to build a lasting sense of mastery, responsibility and self-reliance, according to research by Marty Rossmann, professor emeritus at the University of Minnesota. In 2002, Dr. Rossmann analyzed data from a longitudinal study that followed 84 children across four periods in their lives—in preschool, around ages 10 and 15, and in their mid-20s. She found that young adults who began chores at ages 3 and 4 were more likely to have good relationships with family and friends, to achieve academic and early career success and to be self-sufficient, as compared with those who didn’t have chores or who started them as teens.

Chores also teach children how to be empathetic and responsive to others’ needs, notes psychologist Richard Weissbourd of the Harvard Graduate School of Education. In research published last year, he and his team surveyed 10,000 middle- and high-school students and asked them to rank what they valued more: achievement, happiness or caring for others.

Almost 80% chose either achievement or happiness over caring for others. As he points out, however, research suggests that personal happiness comes most reliably not from high achievement but from strong relationships. “We’re out of balance,” says Dr. Weissbourd. A good way to start readjusting priorities, he suggests, is by learning to be kind and helpful at home.

Read the entire story here.

Image courtesy of Google Search.

The Damned Embuggerance

Sadly, genre-busting author Sir Terry Pratchett succumbed to DEATH on March 12, 2015. Luckily, for those of us still fending off the clutches of Reaper Man we have seventy-plus works of his to keep us company in the darkness.

So now that our world contains a little less magic, it’s important to remind ourselves of a few choice words of his:

A man is not truly dead while his name is still spoken.

Stories of imagination tend to upset those without one.

It’s not worth doing something unless someone, somewhere, would much rather you weren’t doing it.

The truth may be out there, but the lies are inside your head.

Goodness is about what you do. Not who you pray to.

From the Guardian:

Neil Gaiman led tributes from the literary, entertainment and fantasy worlds to Terry Pratchett after the author’s death on Thursday, aged 66.

The author of the Discworld novels, which sold in the tens of millions worldwide, had been afflicted with a rare form of early-onset Alzheimer’s disease.

Gaiman, who collaborated with Pratchett on the huge hit Good Omens, tweeted: “I will miss you, Terry, so much,” pointing to “the last thing I wrote about you”, on the Guardian.

“Terry Pratchett is not a jolly old elf at all,” wrote Gaiman last September. “Not even close. He’s so much more than that. As Terry walks into the darkness much too soon, I find myself raging too: at the injustice that deprives us of – what? Another 20 or 30 books? Another shelf-full of ideas and glorious phrases and old friends and new, of stories in which people do what they really do best, which is use their heads to get themselves out of the trouble they got into by not thinking? … I rage at the imminent loss of my friend. And I think, ‘What would Terry do with this anger?’ Then I pick up my pen, and I start to write.”

Appealing to readers to donate to Alzheimer’s research, Gaiman added on his blog: “Thirty years and a month ago, a beginning author met a young journalist in a Chinese Restaurant, and the two men became friends, and they wrote a book, and they managed to stay friends despite everything. Last night, the author died.

“There was nobody like him. I was fortunate to have written a book with him, when we were younger, which taught me so much.

“I knew his death was coming and it made it no easier.”

Read the entire article here.

Image courtesy of Google Search.

Luck

Some think they have it constantly at their side, like a well-trained puppy. Others crave and seek it. And yet others believe they have been shunned by it. Some put their love lives down to it, and many believe it has had a hand in guiding their careers, friendships, and finances. Of course, many know that it — luck — plays a crucial part in their fortunes at the poker table, the roulette wheel or the races. So what really is luck? Does it stem from within or does it envelop us like a benevolent (mostly) aether? And more importantly, how can more of us find some and tune it to our purposes?

Carlin Flora over at Aeon presents an insightful analysis, with some rather simple answers. Oh, and you may wish to give away that rabbit’s foot.

From Aeon:

In 1992, Archie Karas, then a waiter, headed out to Las Vegas. By 1995, he had turned $50 into $40 million, in what has become known as the biggest winning streak in gambling history. Most of us would call it an instance of great luck, or we might say of Archie himself: ‘What a lucky guy!’ The cold-hearted statistician would laugh at our superstitious notions, and instead describe a series of chance processes that happened to work out for Karas. In the larger landscape where randomness reigns, anything can happen at any given casino. Calling its beneficiaries lucky is simply sticking a label on it after the fact.

To investigate luck is to take on one of the grandest of all questions: how can we explain what happens to us, and whether we will be winners, losers or somewhere in the middle at love, work, sports, gambling and life overall? As it turns out, new findings suggest that luck is not a phenomenon that appears exclusively in hindsight, like a hail storm on your wedding day. Nor is it an expression of our desire to see patterns where none exist, like a conviction that your yellow sweater is lucky. The concept of luck is not a myth.

Instead, the studies show, luck can be powered by past good or bad luck, personality and, in a meta-twist, even our own ideas and beliefs about luck itself. Lucky streaks are real, but they are the product of more than just blind fate. Our ideas about luck influence the way we behave in risky situations. We really can make our own luck, though we don’t like to think of ourselves as lucky – a descriptor that undermines other qualities, like talent and skill. Luck can be a force, but it’s one we interact with, shape and cultivate. Luck helps determine our fate here on Earth, even if you think its ultimate cause is divine.

Luck is perspective and point of view: if a secular man happened to survive because he took a meeting outside his office at the World Trade Center on the morning of 11 September 2001, he might simply acknowledge random chance in life without assigning a deeper meaning. A Hindu might conclude he had good karma. A Christian might say God was watching out for him so that he could fulfil a special destiny in His service. The mystic could insist he was born under lucky stars, as others are born with green eyes.

Traditionally, the Chinese think luck is an inner trait, like intelligence or upbeat mood, notes Maia Young, a management expert at the University of California, Los Angeles. ‘My mom always used to tell me, “You have a lucky nose”, because its particular shape was a lucky one, according to Chinese lore.’ Growing up in the American Midwest, it dawned on Young that the fleeting luck that Americans often talked about – a luck that seemed to visit the same person at certain times (‘I got lucky on that test!’) but not others (‘I got caught in traffic before my interview!’) – was not equivalent to the unchanging, stable luck her mother saw in her daughter, her nose being an advertisement of its existence within.

‘It’s something that I have that’s a possession of mine, that can be more relied upon than just dumb luck,’ says Young. The distinction stuck with her. You might think someone with a lucky nose wouldn’t roll up their sleeves to work hard – why bother? – but here’s another cultural difference in perceptions of luck. ‘In Chinese culture,’ she says, ‘hard work can go hand-in-hand with being lucky. The belief system accommodates both.’

On the other hand, because Westerners see effort and good fortune as taking up opposite corners of the ring, they are ambivalent about luck. They might pray for it and sincerely wish others they care about ‘Good luck!’ but sometimes they just don’t want to think of themselves as lucky. They’d rather be deserving. The fact that they live in a society that is neither random nor wholly meritocratic makes for an even messier slamdance between ‘hard work’ and ‘luck’. Case in point: when a friend gets into a top law or medical school, we might say: ‘Congratulations! You’ve persevered. You deserve it.’ Were she not to get in, we would say: ‘Acceptance is arbitrary. Everyone’s qualified these days – it’s the luck of the draw.’

Read the entire article here.

Image: Four-leaf clover. Some consider it a sign of good luck. Courtesy of Phyzome.

Nuisance Flooding = Sea-Level Rise

Government officials in Florida are barred from using the terms “climate change”, “global warming”, “sustainable” and other related phrases. Apparently, they’ll have to use the euphemism “nuisance flooding” in place of “sea-level rise”. One wonders what literary trick they’ll conjure up next time the state gets hit by a hurricane — “Oh, that? Just a ‘mischievous little breeze’, I’m not a scientist you know.”

From the Guardian:

Officials with the Florida Department of Environmental Protection (DEP), the agency in charge of setting conservation policy and enforcing environmental laws in the state, issued directives in 2011 barring thousands of employees from using the phrases “climate change” and “global warming”, according to a bombshell report by the Florida Center for Investigative Reporting (FCIR).

The report ties the alleged policy, which is described as “unwritten”, to the election of Republican governor Rick Scott and his appointment of a new department director that year. Scott, who was re-elected last November, has declined to say whether he believes in climate change caused by human activity.

“I’m not a scientist,” he said in one appearance last May.

Scott’s office did not return a call Sunday from the Guardian, seeking comment. A spokesperson for the governor told the FCIR team: “There’s no policy on this.”

The FCIR report was based on statements by multiple named former employees who worked in different DEP offices around Florida. The instruction not to refer to “climate change” came from agency supervisors as well as lawyers, according to the report.

“We were told not to use the terms ‘climate change’, ‘global warming’ or ‘sustainability’,” the report quotes Christopher Byrd, who was an attorney with the DEP’s Office of General Counsel in Tallahassee from 2008 to 2013, as saying. “That message was communicated to me and my colleagues by our superiors in the Office of General Counsel.”

“We were instructed by our regional administrator that we were no longer allowed to use the terms ‘global warming’ or ‘climate change’ or even ‘sea-level rise’,” said a second former DEP employee, Kristina Trotta. “Sea-level rise was to be referred to as ‘nuisance flooding’.”

According to the employees’ accounts, the ban left damaging holes in everything from educational material published by the agency to training programs to annual reports on the environment that could be used to set energy and business policy.

The 2014 national climate assessment for the US found an “imminent threat of increased inland flooding” in Florida due to climate change and called the state “uniquely vulnerable to sea level rise”.

Read the entire story here.

Image: Hurricane Floyd 1999, a “mischievous little breeze”. Courtesy of NASA.

The Power of Mediocrity

Over-achievers may well frown upon the slacking mediocre souls who strive to do less. But, mediocrity has a way of pervading the lives of the constantly striving, 18-hour-a-day multitaskers as well. The figure of speech “jack of all trades, master of none” sums up the inevitability of mediocrity for those who strive to do everything, but do nothing well. In fact, pursuit of the mediocre may well be an immutable universal law — both for under-achievers and over-achievers, and for that vast, second-rate, mediocre middle-ground of averageness.

From the Guardian:

In the early years of the last century, Spanish philosopher José Ortega y Gasset proposed a solution to society’s ills that still strikes me as ingenious, in a deranged way. He argued that all public sector workers from the top down (though, come to think of it, why not everyone else, too?) should be demoted to the level beneath their current job. His reasoning foreshadowed the Peter Principle: in hierarchies, people “rise to their level of incompetence”. Do your job well, and you’re rewarded with promotion, until you reach a job you’re less good at, where you remain.

In a recent book, The Hard Thing About Hard Things, the tech investor Ben Horowitz adds a twist: “The Law of Crappy People”. As soon as someone on a given rung at a company gets as good as the worst person the next rung up, he or she may expect a promotion. Yet, if it’s granted, the firm’s talent levels will gradually slide downhill. No one person need be peculiarly crappy for this to occur; bureaucracies just tend to be crappier than the sum of their parts.

Yet it’s wrong to think of these pitfalls as restricted to organisations. There’s a case to be made that the gravitational pull of the mediocre affects all life – as John Stuart Mill put it, that “the general tendency of things throughout the world is to render mediocrity the ascendant power among mankind”. True, it’s most obvious in the workplace (hence the observation that “a meeting moves at the pace of the slowest mind in the room”), but the broader point is that in any domain – work, love, friendship, health – crappy solutions crowd out good ones time after time, so long as they’re not so bad as to destroy the system. People and organisations hit plateaux not because they couldn’t do better, but because a plateau is a tolerable, even comfortable place. Even evolution – life itself! – is all about mediocrity. “Survival of the fittest” isn’t a progression towards greatness; it just means the survival of the sufficiently non-terrible.

And mediocrity is cunning: it can disguise itself as achievement. The cliche of a “mediocre” worker is a Dilbert-esque manager with little to do. But as Greg McKeown notes, in his book Essentialism: The Disciplined Pursuit Of Less, the busyness of the go-getter can lead to mediocrity, too. Throw yourself at every opportunity and you’ll end up doing unimportant stuff – and badly. You can’t fight this with motivational tricks or cheesy mission statements: you need a discipline, a rule you apply daily, to counter the pull of the sub-par. For a company, that might mean stricter, more objective promotion policies. For the over-busy person, there’s McKeown’s “90% Rule” – when considering an option, ask: does it score at least 9/10 on some relevant criterion? If not, say no. (Ideally, that criterion is: “Is this fulfilling?”, but the rule still works if it’s “Does this pay the bills?”).

Read the entire story here.

News Anchor as Cult Hero

Why and when did the news anchor, or newsreader as he or she is known in non-US parts of the world, acquire the status of cult hero? And, why is this a peculiarly US phenomenon? Let’s face it: TV newsreaders in the UK, on the BBC or ITV, certainly do not have a following along the lines of their US celebrity counterparts like Brian Williams, Megyn Kelly or Anderson Cooper. Why?

From the Guardian:

A game! Spot the odd one out in the following story. This year has been a terrible one so far for those who care about American journalism: the much-loved New York Times journalist David Carr died suddenly on 12 February; CBS correspondent Bob Simon was killed in a car crash the day before; Jon Stewart, famously the “leading news source for young Americans”, announced that he is quitting the Daily Show; his colleague Stephen Colbert is moving over from news satire to the softer arena of a nightly talk show; NBC anchor Brian Williams, as famous in America as Jeremy Paxman is in Britain, has been suspended after it was revealed he had “misremembered” events involving himself while covering the war in Iraq; Bill O’Reilly, an anchor on Fox News, the most watched cable news channel in the US, has been accused of being on similarly vague terms with the truth.

News of the Fox News anchor probably sounds like “dog bites man” to most Britons, who remember that this network recently described Birmingham as a no-go area for non-Muslims. But this latest scandal involving O’Reilly reveals something quite telling about journalism in America.

Whereas in Britain journalists are generally viewed as occupying a place on the food chain somewhere between bottom-feeders and cockroaches, in America there remains, still, a certain idealisation of journalists, protected by a gilded halo hammered out by sentimental memories of Edward R Murrow and Walter Cronkite.

Even while Americans’ trust in mass media continues to plummet, journalists enjoy a kind of heroic fame that would baffle their British counterparts. Television anchors and commentators, from Rachel Maddow on the left to Sean Hannity on the right, are lionised in a way that, say, Huw Edwards, is, quite frankly, not. A whole genre of film exists in the US celebrating the heroism of journalists, from All the President’s Men to Good Night, and Good Luck. In Britain, probably the most popular depiction of journalists came from Spitting Image, where they were snuffling pigs in pork-pie hats.

So whenever a journalist in the US has been caught lying, the ensuing soul-searching and garment-rending discovery has been about as prolonged and painful as a PhD on proctology. The New York Times and the New Republic both imploded when it was revealed that their journalists, respectively Jayson Blair and Stephen Glass, had fabricated their stories. Their tales have become part of American popular culture – The Wire referenced Blair in its fifth season and a film was made about the New Republic’s scandal – like national myths that must never be forgotten.

By contrast, when it was revealed that The Independent’s Johann Hari had committed plagiarism and slandered his colleagues on Wikipedia, various journalists wrote bewildering defences of him and the then Independent editor said initially that Hari would return to the paper. Whereas Hari’s return to the public sphere three years after his resignation has been largely welcomed by the British media, Glass and Blair remain shunned figures in the US, more than a decade after their scandals.

Which brings us back to the O’Reilly scandal, now unfolding in the US. Once it was revealed that NBC’s liberal Brian Williams had exaggerated personal anecdotes – claiming to have been in a helicopter that was shot at when he was in the one behind, for starters – the hunt was inevitably on for an equally big conservative news scalp. Enter stage left: Bill O’Reilly.

So sure, O’Reilly claimed that in his career he has been in “active war zones” and “in the Falklands” when he in fact covered a protest in Buenos Aires during the Falklands war. And sure, O’Reilly’s characteristically bullish defence that he “never said” he was “on the Falkland Islands” (original quote: “I was in a situation one time, in a war zone in Argentina, in the Falklands …”) and that being at a protest thousands of miles from combat constitutes “a war zone” verges on the officially bonkers (as the Washington Post put it, “that would mean that any reporter who covered an anti-war protest in Washington during the Iraq War was doing combat reporting”). But does any of this bother either O’Reilly or Fox News? It does not.

Unlike Williams, who slunk away in shame, O’Reilly has been bullishly combative, threatening journalists who dare to cover the story and saying that they deserve to be “in the kill zone”. Fox News too has been predictably untroubled by allegations of lies: “Fox News chairman and CEO Roger Ailes and all senior management are in full support of Bill O’Reilly,” it said in a statement.

Read the entire story here.

Image courtesy of Google Search.

The US Senator From Oklahoma and the Snowball

By their own admission, Republicans in the US Congress are not scientists, and clearly most, if not all, have no grasp of science, the scientific method, or the meaning of scientific theory or broad scientific consensus. The Senator from Oklahoma, James Inhofe, is the perfect embodiment of this extraordinary condition — perhaps a psychosis even — whereby a human living in the 21st century has no clue. Senator Inhofe recently gave us his infantile analysis of climate change on the Senate floor, accompanied by a snowball. This will make you first laugh, then cry.

From Scientific American:

“In case we have forgotten, because we keep hearing that 2014 has been the warmest year on record, I ask the chair, you know what this is? It’s a snowball. And that’s just from outside here. So it’s very, very cold out.”

Oklahoma Senator James Inhofe, the biggest and loudest climate change denier in Congress, last week on the floor of the Senate. But his facile argument, that it’s cold enough for snow to exist in Washington, D.C., therefore climate change is a hoax, was rebutted in the same venue by Rhode Island Senator Sheldon Whitehouse:

“You can believe NASA and you can believe what their satellites measure on the planet, or you can believe the Senator with the snowball. The United States Navy takes this very seriously, to the point where Admiral Locklear, who is the head of the Pacific Command, has said that climate change is the biggest threat that we face in the Pacific…you can either believe the United States Navy or you can believe the Senator with the snowball…every major American scientific society has put itself on record, many of them a decade ago, that climate change is deadly real. They measure it, they see it, they know why it happens. The predictions correlate with what we see as they increasingly come true. And the fundamental principles, that it is derived from carbon pollution, which comes from burning fossil fuels, are beyond legitimate dispute…so you can believe every single major American scientific society, or you can believe the Senator with the snowball.”

Read the entire story here.

Video: Senator Inhofe with Snowball. Courtesy of C-Span.

Time For a New Body, Literally

Let me be clear. I’m not referring to a hair transplant, but a head transplant.

A disturbing story has been making the media rounds recently. Dr. Sergio Canavero, from the Turin Advanced Neuromodulation Group in Italy, suggests that the time is right to attempt the transplantation of a human head onto a different body. Canavero believes that advances in surgical techniques and immunotherapy are such that a transplantation could be attempted by 2017. Interestingly enough, he has already had several people volunteer for a new body.

Ethics aside, it certainly doesn’t stretch the imagination to believe Hollywood’s elite would clamor for this treatment. Now, I wonder if some people, liking their own body, would want a new head?

From New Scientist:

It’s heady stuff. The world’s first attempt to transplant a human head will be launched this year at a surgical conference in the US. The move is a call to arms to get interested parties together to work towards the surgery.

The idea was first proposed in 2013 by Sergio Canavero of the Turin Advanced Neuromodulation Group in Italy. He wants to use the surgery to extend the lives of people whose muscles and nerves have degenerated or whose organs are riddled with cancer. Now he claims the major hurdles, such as fusing the spinal cord and preventing the body’s immune system from rejecting the head, are surmountable, and the surgery could be ready as early as 2017.

Canavero plans to announce the project at the annual conference of the American Academy of Neurological and Orthopaedic Surgeons (AANOS) in Annapolis, Maryland, in June. Is society ready for such momentous surgery? And does the science even stand up?

The first attempt at a head transplant was carried out on a dog by Soviet surgeon Vladimir Demikhov in 1954. A puppy’s head and forelegs were transplanted onto the back of a larger dog. Demikhov conducted several further attempts but the dogs only survived between two and six days.

The first successful head transplant, in which one head was replaced by another, was carried out in 1970. A team led by Robert White at Case Western Reserve University School of Medicine in Cleveland, Ohio, transplanted the head of one monkey onto the body of another. They didn’t attempt to join the spinal cords, though, so the monkey couldn’t move its body, but it was able to breathe with artificial assistance. The monkey lived for nine days until its immune system rejected the head. Although few head transplants have been carried out since, many of the surgical procedures involved have progressed. “I think we are now at a point when the technical aspects are all feasible,” says Canavero.

This month, he published a summary of the technique he believes will allow doctors to transplant a head onto a new body (Surgical Neurology International, doi.org/2c7). It involves cooling the recipient’s head and the donor body to extend the time their cells can survive without oxygen. The tissue around the neck is dissected and the major blood vessels are linked using tiny tubes, before the spinal cords of each person are cut. Cleanly severing the cords is key, says Canavero.

The recipient’s head is then moved onto the donor body and the two ends of the spinal cord – which resemble two densely packed bundles of spaghetti – are fused together. To achieve this, Canavero intends to flush the area with a chemical called polyethylene glycol, and follow up with several hours of injections of the same stuff. Just like hot water makes dry spaghetti stick together, polyethylene glycol encourages the fat in cell membranes to mesh.

Next, the muscles and blood supply would be sutured and the recipient kept in a coma for three or four weeks to prevent movement. Implanted electrodes would provide regular electrical stimulation to the spinal cord, because research suggests this can strengthen new nerve connections.

When the recipient wakes up, Canavero predicts they would be able to move and feel their face and would speak with the same voice. He says that physiotherapy would enable the person to walk within a year. Several people have already volunteered to get a new body, he says.

The trickiest part will be getting the spinal cords to fuse. Polyethylene glycol has been shown to prompt the growth of spinal cord nerves in animals, and Canavero intends to use brain-dead organ donors to test the technique. However, others are sceptical that this would be enough. “There is no evidence that the connectivity of cord and brain would lead to useful sentient or motor function following head transplantation,” says Richard Borgens, director of the Center for Paralysis Research at Purdue University in West Lafayette, Indiana.

Read the entire article here.

Image: Theatrical poster for the movie The Brain That Wouldn’t Die (1962). Courtesy of Wikipedia.

Jon Ronson Versus His Spambot Infomorph Imposter

While this may sound like a 1980s monster flick, it’s rather more serious.

Author, journalist, filmmaker Jon Ronson weaves a fun but sinister tale of the theft of his own identity. The culprits: a researcher in technology and cyberculture, a so-called “creative technologist” and a university lecturer in English and American literature. Not your typical collection of “identity thieves”, trolls, revenge pornographers, and online shamers. But an unnerving, predatory trio nevertheless.

From the Guardian:

In early January 2012, I noticed that another Jon Ronson had started posting on Twitter. His photograph was a photograph of my face. His Twitter name was @jon_ronson. His most recent tweet read: “Going home. Gotta get the recipe for a huge plate of guarana and mussel in a bap with mayonnaise :D #yummy.”

“Who are you?” I tweeted him.

“Watching #Seinfeld. I would love a big plate of celeriac, grouper and sour cream kebab with lemongrass #foodie,” he tweeted. I didn’t know what to do.

The next morning, I checked @jon_ronson’s timeline before I checked my own. In the night he had tweeted, “I’m dreaming something about #time and #cock.” He had 20 followers.

I did some digging. A young academic from Warwick University called Luke Robert Mason had a few weeks earlier posted a comment on the Guardian site. It was in response to a short video I had made about spambots. “We’ve built Jon his very own infomorph,” he wrote. “You can follow him on Twitter here: @jon_ronson.”

I tweeted him: “Hi!! Will you take down your spambot please?”

Ten minutes passed. Then he replied, “We prefer the term infomorph.”

“But it’s taken my identity,” I wrote.

“The infomorph isn’t taking your identity,” he wrote back. “It is repurposing social media data into an infomorphic aesthetic.”

I felt a tightness in my chest.

“#woohoo damn, I’m in the mood for a tidy plate of onion grill with crusty bread. #foodie,” @jon_ronson tweeted.

I was at war with a robot version of myself.

A month passed. @jon_ronson was tweeting 20 times a day about its whirlwind of social engagements, its “soirées” and wide circle of friends. The spambot left me feeling powerless and sullied.

I tweeted Luke Robert Mason. If he was adamant that he wouldn’t take down his spambot, perhaps we could at least meet? I could film the encounter and put it on YouTube. He agreed.

I rented a room in central London. He arrived with two other men – the team behind the spambot. All three were academics. Luke was the youngest, handsome, in his 20s, a “researcher in technology and cyberculture and director of the Virtual Futures conference”. David Bausola was a “creative technologist” and the CEO of the digital agency Philter Phactory. Dan O’Hara had a shaved head and a clenched jaw. He was in his late 30s, a lecturer in English and American literature at the University of Cologne.

I spelled out my grievances. “Academics,” I began, “don’t swoop into a person’s life uninvited and use him for some kind of academic exercise, and when I ask you to take it down you’re, ‘Oh, it’s not a spambot, it’s an infomorph.’”

Dan nodded. He leaned forward. “There must be lots of Jon Ronsons out there?” he began. “People with your name? Yes?”

I looked suspiciously at him. “I’m sure there are people with my name,” I replied, carefully.

“I’ve got the same problem,” Dan said with a smile. “There’s another academic out there with my name.”

“You don’t have exactly the same problem as me,” I said, “because my exact problem is that three strangers have stolen my identity and have created a robot version of me and are refusing to take it down.”

Dan let out a long-suffering sigh. “You’re saying, ‘There is only one Jon Ronson’,” he said. “You’re proposing yourself as the real McCoy, as it were, and you want to maintain that integrity and authenticity. Yes?”

I stared at him.

“We’re not quite persuaded by that,” he continued. “We think there’s already a layer of artifice and it’s your online personality – the brand Jon Ronson – you’re trying to protect. Yeah?”

“No, it’s just me tweeting,” I yelled.

“The internet is not the real world,” said Dan.

“I write my tweets,” I replied. “And I press send. So it’s me on Twitter.” We glared at each other. “That’s not academic,” I said. “That’s not postmodern. That’s the fact of it. It’s a misrepresentation of me.”

“You’d like it to be more like you?” Dan said.

“I’d like it to not exist,” I said.

“I find that quite aggressive,” he said. “You’d like to kill these algorithms? You must feel threatened in some way.” He gave me a concerned look. “We don’t go around generally trying to kill things we find annoying.”

“You’re a troll!” I yelled.

I dreaded uploading the footage to YouTube, because I’d been so screechy. I steeled myself for mocking comments and posted it. I left it 10 minutes. Then, with apprehension, I had a look.

“This is identity theft,” read the first comment I saw. “They should respect Jon’s personal liberty.”

Read the entire story here.

Video: JON VS JON Part 2 | Escape and Control. Courtesy of Jon Ronson.

The Killer Joke and the Killer Idea

Some jokes can make you laugh until you cry. Some jokes can kill. And, research shows that thoughts alone can have equally devastating consequences.

From the BBC:

Beware the scaremongers. Like a witch doctor’s spell, their words might be spreading modern plagues.

We have long known that expectations of a malady can be as dangerous as a virus. In the same way that voodoo shamans could harm their victims through the power of suggestion, priming someone to think they are ill can often produce the actual symptoms of a disease. Vomiting, dizziness, headaches, and even death, could be triggered through belief alone. It’s called the “nocebo effect”.

But it is now becoming clear just how easily those dangerous beliefs can spread through gossip and hearsay – with potent effect. It may be the reason why certain houses seem cursed with illness, and why people living near wind turbines report puzzling outbreaks of dizziness, insomnia and vomiting. If you have ever felt “fluey” after a vaccination, believed your cell phone was giving you a headache, or suffered an inexplicable food allergy, you may have also fallen victim to a nocebo jinx. “The nocebo effect shows the brain’s power,” says Dimos Mitsikostas, from Athens Naval Hospital in Greece. “And we cannot fully explain it.”

A killer joke

Doctors have long known that beliefs can be deadly – as demonstrated by a rather nasty student prank that went horribly wrong. The 18th Century Viennese medic, Erich Menninger von Lerchenthal, describes how students at his medical school picked on a much-disliked assistant. Planning to teach him a lesson, they sprung upon him before announcing that he was about to be decapitated. Blindfolding him, they bowed his head onto the chopping block, before dropping a wet cloth on his neck. Convinced it was the kiss of a steel blade, the poor man “died on the spot”.

While anecdotes like this abound, modern researchers had mostly focused on the mind’s ability to heal, not harm – the “placebo effect”, from the Latin for “I will please”. Every clinical trial now randomly assigns patients to either a real drug, or a placebo in the form of an inert pill. The patient doesn’t know which they are taking, and even those taking the inert drug tend to show some improvement – thanks to their faith in the treatment.

Yet alongside the benefits, people taking placebos often report puzzling side effects – nausea, headaches, or pain – that are unlikely to come from an inert tablet. The problem is that people in a clinical trial are given exactly the same health warnings whether they are taking the real drug or the placebo – and somehow, the expectation of the symptoms can produce physical manifestations in some placebo takers. “It’s a consistent phenomenon, but medicine has never really dealt with it,” says Ted Kaptchuk at Harvard Medical School.

Over the last 10 years, doctors have shown that this nocebo effect – Latin for “I will harm” – is very common. Reviewing the literature, Mitsikostas has so far documented strong nocebo effects in many treatments for headache, multiple sclerosis, and depression. In trials for Parkinson’s disease, as many as 65% report adverse events as a result of their placebo. “And around one out of 10 treated will drop out of a trial because of nocebo, which is pretty high,” he says.

Although many of the side-effects are somewhat subjective – like nausea or pain – nocebo responses do occasionally show up as rashes and skin complaints, and they are sometimes detectable on physiological tests too. “It’s unbelievable – they are taking sugar pills and when you measure liver enzymes, they are elevated,” says Mitsikostas.

And for those who think these side effects are somehow “deliberately” willed or imagined, measures of nerve activity following nocebo treatment have shown that the spinal cord begins responding to heightened pain before conscious deliberation would even be possible.

Consider the near fatal case of “Mr A”, reported by doctor Roy Reeves in 2007. Mr A was suffering from depression when he consumed a whole bottle of pills. Regretting his decision, Mr A rushed to ER, and promptly collapsed at reception. It looked serious; his blood pressure had plummeted, and he was hyperventilating; he was immediately given intravenous fluids. Yet blood tests could find no trace of the drug in his system. Four hours later, another doctor arrived to inform Reeves that the man had been in the placebo arm of a drugs trial; he had “overdosed” on sugar tablets. Upon hearing the news, the relieved Mr A soon recovered.

We can never know whether the nocebo effect would have actually killed Mr A, though Fabrizio Benedetti at the University of Turin Medical School thinks it is certainly possible. He has scanned subjects’ brains as they undergo nocebo suggestions, which seems to set off a chain of activation in the hypothalamus, and the pituitary and adrenal glands – areas that deal with extreme threats to our body. If your fear and belief were strong enough, the resulting cocktail of hormones could be deadly, he says.

Read the entire story here.

Why Are We Obsessed With Zombies?

Previous generations worried about Frankenstein, evil robots, even more evil aliens, hungry dinosaurs and, more recently, vampires. Nowadays our culture seems to be singularly obsessed with zombies. Why?

From the Conversation:

The zombie invasion is here. Our bookshops, cinemas and TVs are dripping with the pustulating debris of their relentless shuffle to cultural domination.

A search for “zombie fiction” on Amazon currently provides you with more than 25,000 options. Barely a week goes by without another onslaught from the living dead on our screens. We’ve just seen the return of one of the most successful of these, The Walking Dead, starring Andrew Lincoln as small-town sheriff, Rick Grimes. The show follows the adventures of Rick and fellow survivors as they kill lots of zombies and increasingly, other survivors, as they desperately seek safety.

Generational monsters

Since at least the late 19th century each generation has created fictional enemies that reflect a broader unease with cultural or scientific developments. The “Yellow Peril” villains such as Fu Manchu were a response to the massive increase in Chinese migration to the US and Europe from the 1870s, for example.

As the industrial revolution steamed ahead, speculative fiction of authors such as H G Wells began to consider where scientific innovation would take mankind. This trend reached its height in the Cold War during the 1950s and 1960s. Radiation-mutated monsters and invasions from space seen through the paranoid lens of communism all postulated the imminent demise of mankind.

By the 1970s, in films such as The Parallax View and Three Days of the Condor, the enemy evolved into government institutions and powerful corporations. This reflected public disenchantment following years of increasing social conflict, Vietnam and the Watergate scandal.

In the 1980s and 1990s it was the threat of AIDS that was embodied in the monsters of the era, such as “bunny boiling” stalker Alex in Fatal Attraction. Alex’s obsessive pursuit of the man with whom she shared a one-night stand, Susanne Leonard argues, represented “the new cultural alignment between risk and sexual contact”, a theme continued with Anne Rice’s vampire Lestat in her series The Vampire Chronicles.

Risk and anxiety

Zombies, the flesh-eating undead, have been mentioned in stories for more than 4,000 years. But the genre really developed with the work of H G Wells, Poe and particularly H P Lovecraft in the early 20th century. Yet these ponderous adversaries, descendants of Mary Shelley’s Frankenstein, have little in common with the vast hordes that threaten mankind’s existence in the modern versions.

M Keith Booker argued that in the 1950s, “the golden age of nuclear fear”, radiation and its fictional consequences were the flip side to a growing faith that science would solve the world’s problems. In many respects we are now living with the collapse of this faith. Today we live in societies dominated by an overarching anxiety reflecting the risk associated with each unpredictable scientific development.

Now we know that we are part of the problem, not necessarily the solution. The “breakthroughs” that were welcomed in the last century now represent some of our most pressing concerns. People have lost faith in assumptions of social and scientific “progress”.

Globalisation

Central to this is globalisation. While generating enormous benefits, globalisation is also tearing communities apart. The political landscape is rapidly changing as established political institutions seem unable to meet the challenges presented by the social and economic dislocation.

However, although destructive, globalisation is also forging new links between people, through what Anthony Giddens calls the “emptying of time and space”. Modern digital media has built new transnational alliances, and, particularly in the West, confronted people with stark moral questions about the consequences of their own lifestyles.

As the faith in inexorable scientific “progress” recedes, politics is transformed. The groups emerging from outside the political mainstream engage in much older battles of faith and identity. Whether right-wing nationalists or Islamic fundamentalists, they seek to build “imagined communities” through race, religion or culture and “fear” is their currency.

Evolving zombies

Modern zombies are the product of this globalised, risk conscious world. No longer the work of a single “mad” scientist re-animating the dead, they now appear as the result of secret government programmes creating untreatable viruses. The zombies indiscriminately overwhelm states irrespective of wealth, technology and military strength, turning all order to chaos.

Meanwhile, the zombies themselves are evolving into much more tenacious adversaries. In Danny Boyle’s 28 Days Later it takes only 20 days for society to be devastated. Charlie Higson’s Enemy series of novels have the zombies getting leadership and using tools. In the film of Max Brooks’ novel, World War Z, the seemingly superhuman athleticism of the zombies reflects the devastating springboard that vast urban populations would provide for such a disease. The film, starring Brad Pitt, had a reported budget of US$190m, demonstrating what a big business zombies have become.

Read the entire article here.

Image courtesy of Google Search.

The Missing Sock Law

If you share a household with children, or adults who continually misplace things, you’ll be intimately familiar with the Missing Sock Law (MSL). No matter how hard you try to keep clothing, and people, organized, and no matter how diligent you are during the laundry process, you will always lose socks. After your weekly laundry you will always end up with an odd number of socks; they will always be mismatched, and you will never find the missing ones again. This is the MSL, and science has yet to come up with a solution.

However, an increasing number of enterprising youngsters, non-OCD parents, and even some teens, are adopting a solution that’s been staring them in the face since socks were invented. Apparently, it is now a monumentally cool fashion statement (at the time of writing) to wear mismatched socks — there are strict rules of course, and parents, this is certainly not for you.

From WSJ:

Susana Yourcheck keeps a basket of mismatched socks in her laundry room, hoping that the missing match will eventually reappear. The pile is getting smaller these days, but not because the solitary socks are magically being reunited with their mates.

The credit for the smaller stash goes to her two teenage daughters, who no longer fuss to find socks that match. That’s because fashionable tweens and teens favor a jamboree of solids, colors and patterns on their feet.

“All my friends do it. Everyone in school wears them this way,” says 15-year-old Amelia Yourcheck.

For laundry-folding parents, the best match is sometimes a mismatch.

Generations of adults have cringed at their children’s fashion choices, suffering through bell bottoms, crop tops, piercings and tattoos. Socks have gone through various iterations of coolness: knee-high, no-see, wild patterns, socks worn with sandals, and no socks at all.

But the current trend has advantages for parents like Ms. Yourcheck. She has long been flummoxed by the mystery of socks that “disappear to the land of nowhere.”

“I’m not going to lie—[the mismatched look] bothers me. But I’m also kind of happy because at least we get some use out of them,” says Ms. Yourcheck, who is 40 years old and lives in Holly Springs, N.C.

“It definitely makes laundry way easier because they just go in a pile and you don’t have to throw the odd ones away,” agrees Washington, D.C., resident Jennifer Swanson Prince, whose 15-year-old daughter, Eleni, rocks the unmatched look. “And if we are lucky, the pile will go in a drawer.”

Some parents say they first noticed the trend a few years ago. Some saw girls whip off their shoes at a bat mitzvah celebration and go through a basket of mismatched socks that were supplied by the hosts for more comfortable dancing.

 For some teenage fashionistas, however, the style dictates that certain rules be followed. Among the most important: The socks must always be more or less the same length—no mixing a knee high with a short one. And while patterns can be combined, clashing seasons—as with snowflakes and flowers—are frowned upon.

The trend is so popular that retailers sell socks that go together, but don’t really go together.

“Matching is mundane, but mixing patterns and colors is monumentally cool,” states the website of LittleMissMatched, which has stores in New York, Florida and California. The company sells socks in sets of three that often sport the same pattern—stars, animal prints, argyles, but in different colors.

Read the entire article here.

Image courtesy of Google Search.

Why Are Most Satirists Liberal?

Oliver Morrison over at The Atlantic has a tremendous article that ponders the comedic divide that spans our political landscape. Why, he asks, do most political satirists identify with left-of-center thought? And, why are the majority of radio talk show hosts right-wing? Why is there no right-wing Stephen Colbert, and why no leftie Rush? These are very interesting questions.

You’ll find some surprising answers, which go beyond the Liberal stereotype of the humorless Republican with no grasp of satire or irony.

From the Atlantic:

Soon after Jon Stewart arrived at The Daily Show in 1999, the world around him began to change. First, George W. Bush moved into the White House. Then came 9/11, and YouTube, and the advent of viral videos. Over the years, Stewart and his cohort mastered the very difficult task of sorting through all the news quickly and turning it around into biting, relevant satire that worked both for television and the Internet.

Now, as Stewart prepares to leave the show, the brand of comedy he helped invent is stronger than ever. Stephen Colbert is getting ready to bring his deadpan smirk to The Late Show. Bill Maher is continuing to provoke pundits and politicians with his blunt punch lines. John Oliver’s Last Week Tonight is about to celebrate the end of a wildly popular first year. Stewart has yet to announce his post-Daily Show plans, but even if he retires, the genre seems more than capable of carrying on without him.

Stewart, Colbert, Maher, Oliver and co. belong to a type of late-night satire that’s typically characterized as liberal, skewering Republicans (and, less frequently, Democrats) for absurd statements or pompousness or flagrant hypocrisy. “The Daily Show, The Colbert Report, Funny Or Die, and The Onion, while not partisan organs, all clearly have a left-of-center orientation,” wrote Jonathan Chait in The New Republic in 2011. This categorization, though, begs the question of why the form has no equal on the other side of the ideological spectrum. Some self-identified conservative comics argue that the biased liberal media hasn’t given them a chance to thrive. Others point out that Obama is a more difficult target than his Republican predecessor: He was the first African-American president, which meant comedians have had to tip-toe around anything with racial connotations, and his restrained personality has made him difficult to parody.

But six years in, Obama’s party has been thoroughly trounced in the midterms and publicly excoriated by right-wing politicians, yet there’s a dearth of conservative satirists taking aim, even though the niche-targeted structure of cable media today should make it relatively easy for them to find an audience. After all, it would have been difficult for Stewart or Colbert to find an audience during the era when three broadcast stations competed for the entire country and couldn’t afford to alienate too many viewers. But cable TV news programs need only find a niche viewership. Why then, hasn’t a conservative Daily Show found its own place on Fox?

Liberal satirists are certainly having no trouble making light of liberal institutions and societies. Portlandia is about to enter its fifth season skewering the kinds of liberals who don’t understand that eco-terrorism and militant feminism may not be as politically effective as they think. Jon Stewart has had success poking fun at Obama’s policies. And Alison Dagnes, a professor of political science at Shippensburg University, has found that the liberal Clinton was the butt of more jokes on late-night shows of the 1990s than either George W. Bush or Obama would later be.

So if liberals are such vulnerable targets for humor, why do relatively few conservative comedians seem to be taking aim at them?

One explanation is simply that proportionately fewer people with broadly conservative sensibilities choose to become comedians. Just as liberals dominate academia, journalism, and other writing professions, there are nearly three times as many liberal- as conservative-minded people in the creative arts according to a recent study. Alison Dagnes, a professor of political science at Shippensburg University, argues that the same personality traits that shape political preferences also guide the choice of professions. These tendencies just get more pronounced in the case of comedy, which usually requires years of irregular income, late hours, and travel, as well as a certain tolerance for crudeness and heckling.

There are, of course, high-profile conservative comedians in America, such as the members of the Blue Collar Comedy Tour. But these performers, who include Jeff Foxworthy and Larry the Cable Guy, tend carefully to avoid politicized topics, mocking so-called “rednecks” in the same spirit as Borscht Belt acts mocked Jewish culture.

When it comes to actual political satire, one of the most well-known figures nationally is Dennis Miller, a former Saturday Night Live cast member who now has a weekly segment on Fox News’ O’Reilly Factor. On a recent show, O’Reilly brought up the Democrats’ election losses, and Miller took the bait. “I think liberalism is like a nude beach,” Miller said. “It’s better off in your mind than actually going there.” His jokes are sometimes amusing, but they tend to be grounded in vague ideologies, not the attentive criticism to the news of the day that has given liberal satires plenty of fodder five days a week. The real problem, Frank Rich wrote about Miller, “is that his tone has become preachy. He too often seems a pundit first and a comic second.”

The Flipside, a more recent attempt at conservative satire, was launched this year by Kfir Alfia, who got his start in political performance a decade ago when he joined the Protest Warriors, a conservative group that counter-demonstrated at anti-war protests. The Flipside started airing this fall in more than 200 stations across the country, but its growth is hampered by its small budget, according to The Flipside’s producer, Rodney Lee Connover, who said he has to work 10 times as hard because his show has 10 times fewer resources than the liberal shows supported by cable networks.

Connover was a writer along with Miller on The 1/2 Hour News Hour, the first major attempt to create a conservative counterpart to The Daily Show in 2007. It was cancelled after just 13 episodes and has remained the worst-rated show of all time on Metacritic. It was widely panned by critics who complained that it was trying to be political first and funny second, so the jokes were unsurprising and flat.

The host of The Flipside, Michael Loftus, says he’s doing the same thing as Jon Stewart, just with some conservative window-dressing. Wearing jeans, Loftus stands and delivers his jokes on a set that looks like the set of Tool Time, the fictional home-improvement show Tim Allen hosts on the sitcom Home Improvement: The walls are decorated with a dartboard, a “Men at Work” sign, and various other items the producers might expect to find in a typical American garage. In a recent episode, after Republicans won the Senate, Loftus sang the song, “Looks like we made it …” to celebrate the victory.

But rather than talking about the news, as Colbert and Stewart do, or deconstructing a big political issue, as Oliver does, Loftus frequently makes dated references without offering new context to freshen them up. “What’s the deal with Harry Reid?” he asked in a recent episode. “You either hate him or you hate him, am I right? The man is in the business of telling people how greedy they are, and how they don’t pay their fair share, and he lives in the Ritz Carlton … This guy is literally Mr. Burns from The Simpsons.” Much of his material seems designed to resonate with only the most ardent Fox News viewers. Loftus obviously can’t yet attract the kinds of celebrity guests his network competitors can. But instead of playing games with the guests he can get, he asks softball questions that simply allow them to spout off.

Greg Gutfeld, the host of Fox’s Red Eye, can also be funny, but his willing-to-be-controversial style often comes across as more hackneyed than insightful. “You know you’re getting close to the truth when someone is calling you a racist,” he once said. Gutfeld has also railed against “greenie” leftists who shop at Whole Foods, tolerance, and football players who are openly gay. Gutfeld’s shtick works okay during its 3 a.m. timeslot, but a recent controversy over sexist jokes about a female fighter pilot highlighted just how far his humor is from working in prime time.

So if conservatives have yet to produce their own Jon Stewart, it could be the relatively small number of working conservative comedians, or their lack of power in the entertainment industry. Or it could be that shows like The Flipside are failing at least, in part, because they’re just not that funny. But what is it about political satire that makes it so hard for conservatives to get it right?

Read the entire article here.

Image: Stephen Colbert at the 2014 Montclair Film Festival. Courtesy of the 2014 Montclair Film Festival.

Apocalypse Now in Three Simple Steps

Step One: Return to the Seventh Century.

Step Two: Fight the armies from Rome.

Step Three: Await… the apocalypse.

Just three simple steps — pretty straightforward really. Lots of violence, bloodshed and torture along the way. But apparently it’s worth every beheaded infidel, every crucified apostate, every subjugated or raped woman, every tormented child. This is the world according to ISIS, and it makes all other apocalyptic traditions seem like a trip to the candy store.

This makes one believe that apocalyptic Jews and Christians really don’t take their end-of-days beliefs very seriously — otherwise wouldn’t they be fighting alongside their Muslim brothers to reach the other side as quickly as possible?

Hmm. Which God to believe?

If you do nothing else today, read the entire in-depth article below.

From the Atlantic:

What is the Islamic State?

Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.

The group seized Mosul, Iraq, last June, and already rules an area larger than the United Kingdom. Abu Bakr al-Baghdadi has been its leader since May 2010, but until last summer, his most recent known appearance on film was a grainy mug shot from a stay in U.S. captivity at Camp Bucca during the occupation of Iraq. Then, on July 5 of last year, he stepped into the pulpit of the Great Mosque of al-Nuri in Mosul, to deliver a Ramadan sermon as the first caliph in generations—upgrading his resolution from grainy to high-definition, and his position from hunted guerrilla to commander of all Muslims. The inflow of jihadists that followed, from around the world, was unprecedented in its pace and volume, and is continuing.

Our ignorance of the Islamic State is in some ways understandable: It is a hermit kingdom; few have gone there and returned. Baghdadi has spoken on camera only once. But his address, and the Islamic State’s countless other propaganda videos and encyclicals, are online, and the caliphate’s supporters have toiled mightily to make their project knowable. We can gather that their state rejects peace as a matter of principle; that it hungers for genocide; that its religious views make it constitutionally incapable of certain types of change, even if that change might ensure its survival; and that it considers itself a harbinger of—and headline player in—the imminent end of the world.

The Islamic State, also known as the Islamic State of Iraq and al-Sham (ISIS), follows a distinctive variety of Islam whose beliefs about the path to the Day of Judgment matter to its strategy, and can help the West know its enemy and predict its behavior. Its rise to power is less like the triumph of the Muslim Brotherhood in Egypt (a group whose leaders the Islamic State considers apostates) than like the realization of a dystopian alternate reality in which David Koresh or Jim Jones survived to wield absolute power over not just a few hundred people, but some 8 million.

We have misunderstood the nature of the Islamic State in at least two ways. First, we tend to see jihadism as monolithic, and to apply the logic of al-Qaeda to an organization that has decisively eclipsed it. The Islamic State supporters I spoke with still refer to Osama bin Laden as “Sheikh Osama,” a title of honor. But jihadism has evolved since al-Qaeda’s heyday, from about 1998 to 2003, and many jihadists disdain the group’s priorities and current leadership.

Bin Laden viewed his terrorism as a prologue to a caliphate he did not expect to see in his lifetime. His organization was flexible, operating as a geographically diffuse network of autonomous cells. The Islamic State, by contrast, requires territory to remain legitimate, and a top-down structure to rule it. (Its bureaucracy is divided into civil and military arms, and its territory into provinces.)

We are misled in a second way, by a well-intentioned but dishonest campaign to deny the Islamic State’s medieval religious nature. Peter Bergen, who produced the first interview with bin Laden in 1997, titled his first book Holy War, Inc. in part to acknowledge bin Laden as a creature of the modern secular world. Bin Laden corporatized terror and franchised it out. He requested specific political concessions, such as the withdrawal of U.S. forces from Saudi Arabia. His foot soldiers navigated the modern world confidently. On Mohammad Atta’s last full day of life, he shopped at Walmart and ate dinner at Pizza Hut.

There is a temptation to rehearse this observation—that jihadists are modern secular people, with modern political concerns, wearing medieval religious disguise—and make it fit the Islamic State. In fact, much of what the group does looks nonsensical except in light of a sincere, carefully considered commitment to returning civilization to a seventh-century legal environment, and ultimately to bringing about the apocalypse.

The most-articulate spokesmen for that position are the Islamic State’s officials and supporters themselves. They refer derisively to “moderns.” In conversation, they insist that they will not—cannot—waver from governing precepts that were embedded in Islam by the Prophet Muhammad and his earliest followers. They often speak in codes and allusions that sound odd or old-fashioned to non-Muslims, but refer to specific traditions and texts of early Islam.

To take one example: In September, Sheikh Abu Muhammad al-Adnani, the Islamic State’s chief spokesman, called on Muslims in Western countries such as France and Canada to find an infidel and “smash his head with a rock,” poison him, run him over with a car, or “destroy his crops.” To Western ears, the biblical-sounding punishments—the stoning and crop destruction—juxtaposed strangely with his more modern-sounding call to vehicular homicide. (As if to show that he could terrorize by imagery alone, Adnani also referred to Secretary of State John Kerry as an “uncircumcised geezer.”)

But Adnani was not merely talking trash. His speech was laced with theological and legal discussion, and his exhortation to attack crops directly echoed orders from Muhammad to leave well water and crops alone—unless the armies of Islam were in a defensive position, in which case Muslims in the lands of kuffar, or infidels, should be unmerciful, and poison away.

The reality is that the Islamic State is Islamic. Very Islamic. Yes, it has attracted psychopaths and adventure seekers, drawn largely from the disaffected populations of the Middle East and Europe. But the religion preached by its most ardent followers derives from coherent and even learned interpretations of Islam.

Virtually every major decision and law promulgated by the Islamic State adheres to what it calls, in its press and pronouncements, and on its billboards, license plates, stationery, and coins, “the Prophetic methodology,” which means following the prophecy and example of Muhammad, in punctilious detail. Muslims can reject the Islamic State; nearly all do. But pretending that it isn’t actually a religious, millenarian group, with theology that must be understood to be combatted, has already led the United States to underestimate it and back foolish schemes to counter it. We’ll need to get acquainted with the Islamic State’s intellectual genealogy if we are to react in a way that will not strengthen it, but instead help it self-immolate in its own excessive zeal.

Read the entire article here.

Image: Apocalypse. Courtesy of Google Search.


Kims as Modern Messiahs


Read this deconstruction of the modern North Korean state under the auspices of the Kim dynasty and you’ll see how easy it is to establish a cult of personality of messianic proportions.

From the Guardian:

In 1994, as it descended into famine, the Democratic People’s Republic of Korea (DPRK) spent millions of dollars raising a ziggurat on top of the mausoleum of Tangun, the founder of the ancient Korean Kojoson dynasty. Despite other more pressing matters, the regime felt it had urgent reasons to commemorate the life of a man whose reign began in 2333 BC.

Unlike later Korean kingdoms, Tangun’s capital was close to Pyongyang, not Seoul. And so, in 1994, as South Korea blazed ahead in the battle for economic and political legitimacy on the Korean peninsula, the North reached into the past to claim its own.

It was said Tangun’s father had come to earth from heaven near the holy Mount Paektu on North Korea’s border with China. And despite all evidence to the contrary, it was also claimed as the birthplace of North Korea’s late leader Kim Jong-il, and its “founding father” Kim Il-sung’s base for his anti-Japanese guerrilla struggle.

When it came into being in 1948, official history writers dated Kim Il-sung’s Korea back to the year of his own birth. The now familiar Juche calendar, inaugurated in 1997, recalculated time from the year Kim Il-sung was said to have come to earth from heaven in 1912. Like some ancient creation myth newly minted, time itself began, or was renewed, with the birth of Kim Il-sung.

Equally importantly, in 1994 the renovation of Tangun’s Tomb coincided with another multi-million dollar renovation of the Kumsusan Memorial Palace, in which the embalmed body of Kim Il-sung would be displayed, preserving him as the country’s Eternal President.

To this day, the childhood hagiography of Kim Il-sung remains one of the key didactic tools of the North Korean state. The stories of his childhood resound from the walls of “Kim Il-sung Research Institutes” in schools, to the books children enjoy, to the texts electronically loaded on their Samjiyeon tablets.

He was born an ordinary man named Kim Song-ju on 15 April 1912, at the zenith of western and Japanese imperialism. In the first of his eight-volume memoir, he describes the era before his birth as a time of subjugation and national humiliation for the Korean race, and trumpets the new era of his guerrilla struggle.

Yet his birth also coincided with an omen of imperialism’s doom; it was the day the Titanic disappeared beneath the waters of the North Atlantic. In North Korea’s revolutionary cosmology, there is no such thing as chance. There is only destiny.

According to Kim Il-sung, his great-grandfather moved from North Jeolla Province, settling his family in Mangyongdae, then a village on the outskirts of the capital Pyongyang. For generations his family laboured there as farmers and grave keepers, and their suffering would come to symbolise the Korean nation under feudalism and Japanese imperialism. Kim described them as “the epitome of the misfortune and distress that befell our people after they lost their country”.

In the memoir, Kim Il-sung’s childhood reminiscences lurch from affectations of modesty to statements of self-aggrandisement. In his preface, for example, the Great Leader claims: “I have never considered my life to be extraordinary.” Two pages later he declares: “my whole life… is the epitome of the history of my country and my people.”

Kim even insists it was his own great-grandfather who led the attack on the General Sherman when it sailed up the Taedong into Pyongyang in 1866, achieving one of Korea’s first great victories against western economic and military might. Kim’s ancestors’ glories foreshadow the greater ones to come.

The greatest influence upon the young Kim Il-sung is said to be his father, Kim Hyong-jik. A charismatic teacher and self-taught physician, Kim Hyong-jik becomes a prophetic figure in the history of his nation, raising an heir who will return as saviour to a liberated homeland.

Kim Il-sung’s account says he prepared for his vocation from a tender age; he recalls vowing to defeat the forces of imperialism at the age of five, when he was playing on a swing in his mother’s arms. There could be no clearer distillation of North Korean children’s culture, rehearsed to this day via the Korean Children’s Union and military games in which toddlers and primary school students eviscerate effigies of American and Japanese imperialists. In the revolutionary imagination there is no difference between warriors and innocents.

He wrote himself into the history of the March 1st Movement of 1919, when Korean protests against Japanese imperial rule were violently crushed. “I, then six years old, also joined the ranks of demonstrators,” he says. “When the adults cheered for independence, I joined them. The enemy used swords and guns indiscriminately against the masses … This was the day when I witnessed Korean blood being spilled for the first time. My young heart burned with indignation.”

From that point, the Kim family’s instinctive resistance to Japanese imperialism becomes increasingly bound to the political vision articulated by the Soviet Union. Kim Il-sung recalls his father’s realisation that “the national liberation movement in our country should shift from a nationalist movement to a communist movement.” Instead of bedtime stories of old Korea, his father teaches Kim of Lenin and the October Revolution.

In a series of semi-comic interludes, the young Kim Il-sung scores early victories against the enemy, setting the model for countless juvenile heroes in North Korean children’s literature. For instance, he recalls “wrestling with a Japanese boy bigger than me who I got down with a belly throw.”

In other acts of resistance, Kim lines roads with spikes to tear the wheels of Japanese police bicycles, and defaces Japanese primary school textbooks in protest at linguistic imperialism. Such antics are undoubtedly exaggerated, yet the hagiography is careful to limit Kim Il-sung’s proto-guerrilla struggle to plausible feats of childhood derring-do. Unlike his son, Kim Jong-il, he is not depicted as a Napoleonic genius at 10 years old.

Kim Hyong-jik does not live to see Korea free with his own eyes. Before he dies in exile in Manchuria, he issues a command to his now 14-year-old son: “You must not forget that you belong to the country and the people. You must win back your country at all costs, even if your bones are broken and your bodies are torn apart.”

Despite his father’s rousing words, Kim Il-sung is still too young to lead a guerrilla war that many North Koreans, until recently, could still recall from living memory. So before Kim’s war begins he studies in Manchuria, albeit in a middle school transformed into a kind of revolutionary Hogwarts.

Even today, the legend of Yuwen Middle School endures. During Kim Jong-il’s state visit to China in September 2010 he detoured to Jilin, undertaking a pilgrimage to his father’s school. There, according to state television, the Dear Leader became “immersed in thoughts while looking at the precious historic objects that contain the bodily odour of our Supreme Leader from his school years some 80 years back.” It was an exquisite act of political theatre. Only days later, returning to Pyongyang, Kim Jong-il revealed that Kim Jong-un would be his young successor.

Read the entire article here.

Image: Kim Il-sung and adoring children. Courtesy of AP.


Where Will I Get My News (and Satire)


Jon Stewart. Jon Stewart, you dastardly, villainous so-and-so. How could you? How could you decide to leave the most important show in media history — The Daily Show — after a mere 16 years? Where will I get my news? Where will I find another hypocrisy-meter? Where will I find another truth-seeking David to fend us from the fear-mongering neocon Goliaths led by Roger Ailes over at the Foxion News Channel? Where will I find such a thoroughly delicious merging of news, fact and satire? Jon Stewart, how could you?!

From the Guardian:

“Where will I get my news each night,” lamented Bill Clinton this week. This might have been a reaction to the fall from grace of Brian Williams, America’s top-rated news anchor, who was suspended for embellishing details of his adventures in Iraq. In fact the former US president was anticipating withdrawal symptoms for the impending departure of the comedian Jon Stewart, who – on the same day as Williams’s disgrace – announced that he will step down as the Daily Show host.

Stewart, who began his stint 16 years ago, has achieved something extraordinary from behind a studio desk on a comedy cable channel. Merging the intense desire for factual information with humour, irreverence, scepticism and usually appropriate cynicism, Stewart’s show proved a magnet for opinion formers, top politicians – who clamoured to appear – and most significantly the young, for whom the mix proved irresistible. His ridiculing of neocons became a nightly staple. His rejection from the outset of the Iraq war was prescient. And always he was funny, not least this week in using Williams’s fall to castigate the media for failing to properly scrutinise the Iraq war. Bill Clinton does not mourn alone.

Read the entire story here.

Image courtesy of Google Search.


Social Media Metes Out Social (Networking) Justice

Before the age of Facebook and Twitter, if you said something utterly stupid, bigoted, sexist or racist among a small group of friends or colleagues, it would usually have gone no further. Some members of your audience might have chastised you, while others might have agreed or ignored you. But then the comment would have been largely forgotten.

This is no longer so in our age of social networking and constant inter-connectedness. Our technologies distribute, repeat and amplify our words and actions, which now seem to take on lives of their very own. Love it or hate it — welcome to the age of social networking justice — a 21st century digital pillory.

Say something stupid or do something questionable today — and you’re likely to face a consequential backlash that stretches beyond the present and into your future. Just take the case of Justine Sacco.

From NYT:

As she made the long journey from New York to South Africa, to visit family during the holidays in 2013, Justine Sacco, 30 years old and the senior director of corporate communications at IAC, began tweeting acerbic little jokes about the indignities of travel. There was one about a fellow passenger on the flight from John F. Kennedy International Airport:

“‘Weird German Dude: You’re in First Class. It’s 2014. Get some deodorant.’ — Inner monologue as I inhale BO. Thank God for pharmaceuticals.”

Then, during her layover at Heathrow:

“Chilly — cucumber sandwiches — bad teeth. Back in London!”

And on Dec. 20, before the final leg of her trip to Cape Town:

“Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!”

She chuckled to herself as she pressed send on this last one, then wandered around Heathrow’s international terminal for half an hour, sporadically checking her phone. No one replied, which didn’t surprise her. She had only 170 Twitter followers.

Sacco boarded the plane. It was an 11-hour flight, so she slept. When the plane landed in Cape Town and was taxiing on the runway, she turned on her phone. Right away, she got a text from someone she hadn’t spoken to since high school: “I’m so sorry to see what’s happening.” Sacco looked at it, baffled.

Then another text: “You need to call me immediately.” It was from her best friend, Hannah. Then her phone exploded with more texts and alerts. And then it rang. It was Hannah. “You’re the No. 1 worldwide trend on Twitter right now,” she said.

Sacco’s Twitter feed had become a horror show. “In light of @Justine-Sacco disgusting racist tweet, I’m donating to @care today” and “How did @JustineSacco get a PR job?! Her level of racist ignorance belongs on Fox News. #AIDS can affect anyone!” and “I’m an IAC employee and I don’t want @JustineSacco doing any communications on our behalf ever again. Ever.” And then one from her employer, IAC, the corporate owner of The Daily Beast, OKCupid and Vimeo: “This is an outrageous, offensive comment. Employee in question currently unreachable on an intl flight.” The anger soon turned to excitement: “All I want for Christmas is to see @JustineSacco’s face when her plane lands and she checks her inbox/voicemail” and “Oh man, @JustineSacco is going to have the most painful phone-turning-on moment ever when her plane lands” and “We are about to watch this @JustineSacco bitch get fired. In REAL time. Before she even KNOWS she’s getting fired.”

The furor over Sacco’s tweet had become not just an ideological crusade against her perceived bigotry but also a form of idle entertainment. Her complete ignorance of her predicament for those 11 hours lent the episode both dramatic irony and a pleasing narrative arc. As Sacco’s flight traversed the length of Africa, a hashtag began to trend worldwide: #HasJustineLandedYet. “Seriously. I just want to go home to go to bed, but everyone at the bar is SO into #HasJustineLandedYet. Can’t look away. Can’t leave” and “Right, is there no one in Cape Town going to the airport to tweet her arrival? Come on, Twitter! I’d like pictures #HasJustineLandedYet.”

A Twitter user did indeed go to the airport to tweet her arrival. He took her photograph and posted it online. “Yup,” he wrote, “@JustineSacco HAS in fact landed at Cape Town International. She’s decided to wear sunnies as a disguise.”

By the time Sacco had touched down, tens of thousands of angry tweets had been sent in response to her joke. Hannah, meanwhile, frantically deleted her friend’s tweet and her account — Sacco didn’t want to look — but it was far too late. “Sorry @JustineSacco,” wrote one Twitter user, “your tweet lives on forever.”

Read the entire article here.


Are Most CEOs Talented or Lucky?

According to Harold G. Hamm, founder and CEO of Continental Resources, most CEOs are lucky, not talented. You see, Hamm’s net worth has reached around $18 billion, yet in recent divorce filings he claims to have been responsible for generating only around 10 percent of this wealth since his marriage in 1988. Interestingly, even though he made most of the key company appointments and oversaw all the key business decisions, he seems rather reticent about claiming much of the company’s success as his own. Strange, then, that his company would compensate him to the tune of around $43 million during 2006-2013 for essentially being a lucky slacker!

This, of course, enables him to minimize the amount owed to his ex-wife. Thus, one has to surmise from these shenanigans that some CEOs are not only lucky, they’re also stupid.

On a broader note, this does raise the question of why so many CEOs are rewarded with such extraordinary sums when it’s mostly luck guiding their companies’ progress!

From NYT:

The divorce of the oil billionaire Harold G. Hamm from Sue Ann Arnall has gained attention largely for its outsize dollar amounts. Mr. Hamm, the chief executive and founder of Continental Resources, who was worth more than $18 billion at one point, wrote his ex-wife a check last month for $974,790,317.77 to settle their split. She’s appealing to get more; he’s appealing to pay less.

Yet beyond the staggering sums, the Hamm divorce raises a fundamental question about the wealth of executives and entrepreneurs: How much do they owe their fortunes to skill and hard work, and how much comes from happenstance and luck?

Mr. Hamm, seeking to exploit a wrinkle in divorce law, made the unusual argument that his wealth came largely from forces outside his control, like global oil prices, the expertise of his deputies and other people’s technology. During the nine-week divorce trial, his lawyers claimed that although Mr. Hamm had founded Continental Resources and led the company to become a multibillion-dollar energy giant, he was responsible for less than 10 percent of his personal and corporate success.

Some in the courtroom started calling it the “Jed Clampett defense,” after the lead character in “The Beverly Hillbillies” TV series who got rich after tapping a gusher in his swampland.

In a filing last month supporting his appeal, Mr. Hamm cites the recent drop in oil prices and subsequent 50 percent drop in Continental’s share price and his fortune as further proof that forces outside his control direct his company’s fortunes.

Lawyers for Ms. Arnall argue that Mr. Hamm is responsible for more than 90 percent of his fortune.

While rooted in a messy divorce, the dispute frames a philosophical and ethical debate over inequality and the obligations of the wealthy. If wealth comes mainly from luck or circumstance, many say the wealthy owe a greater debt to society in the form of taxes or charity. If wealth comes from skill and hard work, perhaps higher taxes would discourage that effort.

Sorting out what value is created by luck or skill is a tricky proposition in itself. The limited amount of academic research on the topic, which mainly looks at how executives can influence a company’s value, has often found that broader market forces often have a bigger impact on a company’s success than an executive’s actions.

“As we know from the research, the performance of a large firm is due primarily to things outside the control of the top executive,” said J. Scott Armstrong, a professor at the Wharton School at the University of Pennsylvania. “We call that luck. Executives freely admit this — when they encounter bad luck.”

A study conducted from 1992 to 2011 of how C.E.O. compensation changed in response to luck or events beyond the executives’ control showed that their pay was 25 percent higher when luck favored the C.E.O.

Some management experts say the role of luck is nearly impossible to measure because it depends on the particular industry. Oil, for instance, is especially sensitive to outside forces.

“Within any industry, a more talented management team is going to tend to do better,” said Steven Neil Kaplan of the University of Chicago Booth School of Business. “That is why investors and boards of directors look for the best talent to run their companies. That is why company stock prices often move a lot, in both directions, when a C.E.O. dies or a new C.E.O. is hired.”

The Hamm case hinged on a quirk in divorce law known as “active versus passive appreciation.” In Oklahoma, and many other states, if a spouse owns an asset before the marriage, the increase in the value of an asset during marriage is not subject to division if the increase was because of “passive” appreciation. Passive appreciation is when an asset grows on its own because of factors outside either spouse’s control, like land that appreciates without any improvements or passively held stocks. Any value that’s not deemed as “passive” is considered “active” — meaning it increased because of the efforts, skills or funding of a spouse and can therefore be subject to division in a divorce.

The issue has been at the center of some other big divorces. In the 2002 divorce of the Chicago taxi magnate David Markin and Susan Markin, filed in Palm Beach, Fla., Mr. Markin claimed he was “merely a passenger on this corporate ship traveling through the ocean,” according to the judge. But he ruled that Mr. Markin was more like “the captain of the ship. Certainly he benefited by sailing through some good weather. However, he picked the course and he picked the crew. In short, he was directly responsible for everything that happened.” Ms. Markin was awarded more than $30 million, along with other assets.

Mr. Hamm, now 69, also had favorable conditions after founding Continental Resources well before his marriage in 1988 to Sue Ann, then a lawyer at the company. By this fall, when the trial ended, Continental had a market capitalization of over $30 billion; Mr. Hamm’s stake of 68 percent and other wealth exceeded $18 billion.

Their divorce trial was closed to the public, and all but a few of the documents are under seal. Neither Mr. Hamm nor his lawyers or representatives would comment. Ms. Arnall and her spokesman also declined to comment.

According to people with knowledge of the case, however, Mr. Hamm’s chief strategy was to claim most of his wealth as passive appreciation, and therefore not subject to division. During his testimony, the typically commanding Mr. Hamm, who had been the face of the company for decades, said he couldn’t recall certain decisions, didn’t know much about the engineering aspects of oil drilling and didn’t attend critical meetings.

Mr. Hamm’s lawyers calculated that only 5 to 10 percent of his wealth came from his own effort, skill, management or investment. It’s unclear how they squared this argument with his compensation, which totaled $42.7 million from 2006 to 2013, according to Equilar, an executive compensation data company.

Ms. Arnall called more than 80 witnesses — from Continental executives to leading economists like Glenn Hubbard and Kenneth Button — to show how much better Continental had done than its peers and that Mr. Hamm made most or all of the key decisions about the company’s strategy, finances and operations. They estimated that Mr. Hamm was responsible for $14 billion to $17 billion of his $18 billion fortune.

Read the entire article here.
