Tag Archives: politics

Lead a Congressional Committee on Science: No Grasp of Science Required

[div class=attrib]From ars technica:[end-div]

The House Committee on Space, Science, and Technology hears testimony on climate change in March 2011.

If you had the chance to ask questions of one of the world’s leading climatologists, would you select a set of topics that would be at home in the heated discussions that take place in the Ars forums? If you watch the video below, you’d find that’s precisely what Dana Rohrabacher (R-CA) chose to do when Penn State’s Richard Alley (a fellow Republican) was called before the House Science Committee, which has already had issues with its grasp of science. Rohrabacher took Alley on a tour of some of the least convincing arguments about climate change, all in an attempt to convince him that changes in the Sun were to blame for a changing climate. (Alley, for his part, noted that we have actually measured the Sun, and we’ve seen no such changes.)

Now, if he has his way, Rohrabacher will be chairing the committee once the next Congress is seated. Even if he doesn’t get the job, the alternatives aren’t much better.

There has been some good news for the Science Committee to come out of the last election. Representative Todd Akin (R-MO), whose lack of understanding of biology was made clear by his comments on “legitimate rape,” had to give up his seat to run for the Senate, a race he lost. Meanwhile, Paul Broun (R-GA), who said that evolution and cosmology are “lies straight from the pit of Hell,” won reelection, but he received a bit of a warning in the process: dead English naturalist Charles Darwin, who is ineligible to serve in Congress, managed to draw thousands of write-in votes. And, thanks to limits on chairmanships, Ralph Hall (R-TX), who accused climate scientists of being in it for the money (if so, they’re doing it wrong), will have to step down.

In addition to Rohrabacher, the other Representatives vying to lead the Committee are Wisconsin’s James Sensenbrenner and Texas’ Lamar Smith. All three suggest that they will focus on topics like NASA’s budget and the Department of Energy’s plans for future energy technology, but all have been embroiled in the controversy over climate change in the past.

In an interview with Science Insider about his candidacy, Rohrabacher engaged in a bit of triumphalism and suggested that his beliefs were winning out. “There were a lot of scientists who were just going along with the flow on the idea that mankind was causing a change in the world’s climate,” he said. “I think that after 10 years of debate, we can show that there are hundreds if not thousands of scientists who have come over to being skeptics, and I don’t know anyone [who was a skeptic] who became a believer in global warming.”

[div class=attrib]Read the entire article following the jump.[end-div]

USANIT

Ever-present in Europe, nationalism continues to grow as austerity measures across the continent catalyze xenophobia. And now it’s spreading westwards across the Atlantic to the United States. Well, to be more precise, nationalistic fervor is spreading to Texas. Perhaps in our lifetimes we’ll have to contend with USANIT — the United States of America Not Including Texas. Seventy-seven thousand Texans, so far, want the Lone Star flag to fly again over their nascent nation.

[div class=attrib]From the Guardian:[end-div]

Less than a week after Barack Obama was re-elected president, a slew of petitions have appeared on the White House’s We the People site, asking for states to be granted the right to peacefully withdraw from the union.

On Tuesday, all but one of the 33 states listed were far from reaching the 25,000 signature mark needed to get a response from the White House. Texas, however, had gained more than 77,000 online signatures in three days.

People from other states had signed the Texas petition. Another petition on the website was titled: “Deport everyone that signed a petition to withdraw their state from the United States of America.” It had 3,536 signatures.

The Texas petition reads:

Given that the state of Texas maintains a balanced budget and is the 15th largest economy in the world, it is practically feasible for Texas to withdraw from the union, and to do so would protect it’s citizens’ standard of living and re-secure their rights and liberties in accordance with the original ideas and beliefs of our founding fathers which are no longer being reflected by the federal government.

Activists across the country have advocated for independent statehood since the union was restored after the end of the Civil War in 1865. Texas has been host to some of the most fervent fights for independence.

Daniel Miller is the president of the Texas Nationalist Movement, which supports Texan independence and has its own online petition.

“We want to be able to govern ourselves without having some government a thousand-plus miles away that we have to go ask ‘mother may I’ to,” Miller said. “We want to protect our political, our cultural and our economic identities.”

Miller is not a fan of the word “secession”, because he views it as an over-generalization of what his group hopes to accomplish, but he encourages advocates for Texan independence to show their support when they can, including by signing the White House website petition.

“Given the political, cultural and economic pressures the United States is under, it’s not beyond the pale where one could envision the break up of the United States,” he said. “I don’t look at it as possibility, I look at it as an inevitability.”

Miller has been working for Texas independence for 16 years. He pointed to last week’s federal elections as evidence that a state independence movement is gaining traction, citing the legalization of the sale of marijuana in Colorado and Washington in defiance of federal law.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]State Flag of Texas courtesy of Wikipedia.[end-div]

Socialism and Capitalism Share the Same Parent

Expanding on the work of Immanuel Kant in the late 18th century, German philosopher Georg Wilhelm Friedrich Hegel laid the foundations for what would later become two opposing political systems: socialism and free market capitalism. His comprehensive framework of Absolute Idealism influenced philosophers and thinkers of all shades, including Karl Marx and Ralph Waldo Emerson. While many later rounded on Hegel’s worldview as little more than a thinly veiled justification of totalitarianism in his own nation, there is no disputing the profound influence of his work on later thinkers from both the left and the right of the political spectrum.

[div class=attrib]From FairObserver:[end-div]

It is common knowledge that among developed western countries the two leading socioeconomic systems are socialism and capitalism. The former is often associated more closely with European systems of governance and the latter with the American free market economy. It is also generally known that these two systems are rooted in two fundamentally different assumptions about how a healthy society progresses. What is not as well known is that they both stem from the same philosophical roots, namely the evolutionary philosophy of Georg Wilhelm Friedrich Hegel.

Georg Wilhelm Friedrich Hegel was a leading figure in the movement known as German Idealism that had its beginnings in the late 18th century. That philosophical movement was initiated by another prominent German thinker, Immanuel Kant. Kant published “The Critique of Pure Reason” in 1781, offering a radical new way to understand how we as human beings get along in the world. Hegel expanded on Kant’s theory of knowledge by adding a theory of social and historical progress. Both socialism and capitalism were inspired by different, and to some extent opposing, interpretations of Hegel’s philosophical system.

Immanuel Kant recognized that human beings create their view of reality by incorporating new information into their previous understanding of reality using the laws of reason. As this integrative process unfolds we are compelled to maintain a coherent picture of what is real in order to operate effectively in the world. The coherent picture of reality that we maintain Kant called a necessary transcendental unity. It can be understood as the overarching picture of reality, or worldview, that helps us make sense of the world and against which we interpret and judge all new experiences and information.

Hegel realized that not only must individuals maintain a cohesive picture of reality, but societies and cultures must also maintain a collectively held and unified understanding of what is real. To use a gross example, it is not enough for me to know what a dollar bill is and what it is worth. If I am to be able to buy something with my money, then other people must agree on its value. Reality is not merely an individual event; it is a collective affair of shared agreement. Hegel further saw that the collective understanding of reality that is held in common by many human beings in any given society develops over the course of history. In his book “The Philosophy of History”, Hegel outlines his theory of how this development occurs. Karl Marx started with Hegel’s philosophy and then added his own profound insights – especially with regard to how oppression and class struggle drive the course of history.

Across the Atlantic in America, there was another thinker, Ralph Waldo Emerson, who was strongly influenced by German Idealism and especially the philosophy of Hegel. In the development of the American mind one cannot overstate the role that Emerson played as the pathfinder who marked trails of thought that continue to guide the current American worldview. His ideas became grooves in consciousness set so deeply in the American psyche that they are often simply experienced as truth. What excited Emerson about Hegel was his description of how reality emerged from a universal mind. Emerson similarly believed that what we as human beings experience as real has emerged through time from a universal source of intelligence. This distinctly Hegelian tone in Emerson can be heard clearly in this passage from his essay entitled “History”:

“There is one mind common to all individual men. Of the works of this mind history is the record. Man is explicable by nothing less than all his history. All the facts of history pre-exist as laws. Each law in turn is made by circumstances predominant. The creation of a thousand forests is in one acorn, and Egypt, Greece, Rome, Gaul, Britain, America, lie folded already in the first man. Epoch after epoch, camp, kingdom, empire, republic, democracy, are merely the application of this manifold spirit to the manifold world.”

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The portrait of G.W.F. Hegel (1770-1831); Steel engraving by Lazarus Sichling after a lithograph by Julius L. Sebbers. Courtesy of Wikipedia.[end-div]

Charles Darwin Runs for Office

British voters may recall Screaming Lord Sutch, 3rd Earl of Harrow, of the Official Monster Raving Loony Party, who ran in over 40 parliamentary elections during the 1980s and 90s. He never won, but garnered a respectable number of votes and many fans (he was also a musician).

The United States followed a more dignified path in the 2012 elections, when Charles Darwin ran for a Congressional seat in Georgia. Darwin failed to win, but collected a respectable 4,000 votes. His opponent, Paul Broun, believes that the Earth “is but about 9,000 years old”. Interestingly, Representative Broun serves on the United States House Committee on Science, Space and Technology.

[div class=attrib]From Slate:[end-div]

Anti-evolution Congressman Paul Broun (R-Ga.) ran unopposed in Tuesday’s election, but nearly 4,000 voters wrote in Charles Darwin to protest their representative’s views. (Broun called evolution “lies straight from the pit of hell.”) Darwin fell more than 205,000 votes short of victory, but what would have happened if the father of evolution had out-polled Broun?

Broun still would have won. Georgia, like many other states, doesn’t count votes for write-in candidates who have not filed a notice of intent to stand for election. Even if the final tally had been reversed, with Charles Darwin winning 209,000 votes and Paul Broun 4,000, Broun would have kept his job.

That’s not to say dead candidates can’t win elections. It happens all the time, but only when the candidate dies after being placed on the ballot. In Tuesday’s election, Orange County, Fla., tax collector Earl Wood won more than 56 percent of the vote, even though he died in October at the age of 96 after holding the office for more than 40 years. Florida law allowed the Democratic Party, of which Wood was a member, to choose a candidate to receive Wood’s votes. In Alabama, Charles Beasley won a seat on the Bibb County Commission despite dying on Oct. 12. (Beasley’s opponent lamented the challenge of running a negative campaign against a dead man.) The governor will appoint a replacement.

[div class=attrib]Read the entire article after the jump.[end-div]

Big Data Versus Talking Heads

With the election in the United States now decided, the dissection of the result is well underway. And perhaps the biggest winner of all is the science of big data. Yes, mathematical analysis of vast quantities of demographic and polling data won out over the voodoo proclamations and gut-feel predictions of the punditocracy. Now, that’s a result truly worth celebrating.

[div class=attrib]From ReadWriteWeb:[end-div]

Political pundits, mostly Republican, went into a frenzy when Nate Silver, a New York Times pollster and stats blogger, predicted that Barack Obama would win reelection.

But Silver was right and the pundits were wrong – and the impact of this goes way beyond politics.

Silver won because, um, science. As ReadWrite’s own Dan Rowinski noted, Silver’s methodology is all based on data. He “takes deep data sets and applies logical analytical methods” to them. It’s all just numbers.

Silver runs a blog called FiveThirtyEight, which is licensed by the Times. In 2008 he called the presidential election with incredible accuracy, getting 49 out of 50 states right. But this year he rolled a perfect score, 50 out of 50, even nailing the margins in many cases. His uncanny accuracy on this year’s election represents what Rowinski calls a victory of “logic over punditry.”
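None of this is magic. Silver’s actual model is far more elaborate, but the underlying idea — blending many noisy polls into one estimate instead of trusting any single gut call — can be sketched in a few lines of Python. The weighting scheme below (sample size times an exponential recency decay) and the poll numbers are purely illustrative, not FiveThirtyEight’s:

```python
from datetime import date

# Each poll: (share backing the candidate, sample size, field date).
# These figures are invented for illustration.
polls = [
    (0.52, 800,  date(2012, 10, 28)),
    (0.49, 1200, date(2012, 11, 1)),
    (0.51, 600,  date(2012, 11, 4)),
]

def aggregate(polls, today=date(2012, 11, 6), half_life_days=7.0):
    """Weight each poll by sample size and recency, then average.

    The recency weight halves every `half_life_days`; bigger samples
    count for more. A toy scheme, not Silver's actual model.
    """
    total_weight = 0.0
    weighted_sum = 0.0
    for share, n, when in polls:
        age_days = (today - when).days
        weight = n * 0.5 ** (age_days / half_life_days)
        weighted_sum += weight * share
        total_weight += weight
    return weighted_sum / total_weight

print(round(aggregate(polls), 3))  # -> 0.503
```

The point is not the particular weights but that the output is reproducible: anyone running the same numbers through the same formula gets the same answer, which is what separates this from punditry.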

In fact it’s bigger than that. Bear in mind that before turning his attention to politics in 2007 and 2008, Silver was using computer models to make predictions about baseball. What does it mean when some punk kid baseball nerd can just wade into politics and start kicking butt on all these long-time “experts” who have spent their entire lives covering politics?

It means something big is happening.

Man Versus Machine

This is about the triumph of machines and software over gut instinct.

The age of voodoo is over. The era of talking about something as a “dark art” is done. In a world with big computers and big data, there are no dark arts.

And thank God for that. One by one, computers and the people who know how to use them are knocking off these crazy notions about gut instinct and intuition that humans like to cling to. For far too long we’ve applied this kind of fuzzy thinking to everything, from silly stuff like sports to important stuff like medicine.

Someday, and I hope it’s soon, we will enter the age of intelligent machines, when true artificial intelligence becomes a reality, and when we look back on the late 20th and early 21st century it will seem medieval in its simplicity and reliance on superstition.

What most amazes me is the backlash and freak-out that occurs every time some “dark art” gets knocked over in a particular domain. Watch Moneyball (or read the book) and you’ll see the old guard (in that case, baseball scouts) grow furious as they realize that computers can do their job better than they can. (Of course it’s not computers; it’s people who know how to use computers.)

We saw the same thing when IBM’s Deep Blue defeated Garry Kasparov in 1997. We saw it when Watson beat humans at Jeopardy.

It’s happening in advertising, which used to be a dark art but is increasingly a computer-driven numbers game. It’s also happening in my business, the news media, prompting the same kind of furor as happened with the baseball scouts in Moneyball.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Political pundits, Left to right: Mark Halperin, David Brooks, Jon Stewart, Tim Russert, Matt Drudge, John Harris & Jim VandeHei, Rush Limbaugh, Sean Hannity, Chris Matthews, Karl Rove. Courtesy of Telegraph.[end-div]

The Military-Industrial Complex

[tube]8y06NSBBRtY[/tube]

In his op-ed, author Aaron B. O’Connell reminds us of Eisenhower’s prescient warning to the nation about the growing power of the military-industrial complex in national affairs.

[div class=attrib]From the New York Times:[end-div]

In 1961, President Dwight D. Eisenhower left office warning of the growing power of the military-industrial complex in American life. Most people know the term the president popularized, but few remember his argument.

In his farewell address, Eisenhower called for a better equilibrium between military and domestic affairs in our economy, politics and culture. He worried that the defense industry’s search for profits would warp foreign policy and, conversely, that too much state control of the private sector would cause economic stagnation. He warned that unending preparations for war were incongruous with the nation’s history. He cautioned that war and warmaking took up too large a proportion of national life, with grave ramifications for our spiritual health.

The military-industrial complex has not emerged in quite the way Eisenhower envisioned. The United States spends an enormous sum on defense — over $700 billion last year, about half of all military spending in the world — but in terms of our total economy, it has steadily declined to less than 5 percent of gross domestic product from 14 percent in 1953. Defense-related research has not produced an ossified garrison state; in fact, it has yielded a host of beneficial technologies, from the Internet to civilian nuclear power to GPS navigation. The United States has an enormous armaments industry, but it has not hampered employment and economic growth. In fact, Congress’s favorite argument against reducing defense spending is the job loss such cuts would entail.

Nor has the private sector infected foreign policy in the way that Eisenhower warned. Foreign policy has become increasingly reliant on military solutions since World War II, but we are a long way from the Marines’ repeated occupations of Haiti, Nicaragua and the Dominican Republic in the early 20th century, when commercial interests influenced military action. Of all the criticisms of the 2003 Iraq war, the idea that it was done to somehow magically decrease the cost of oil is the least credible. Though it’s true that mercenaries and contractors have exploited the wars of the past decade, hard decisions about the use of military force are made today much as they were in Eisenhower’s day: by the president, advised by the Joint Chiefs of Staff and the National Security Council, and then more or less rubber-stamped by Congress. Corporations do not get a vote, at least not yet.

But Eisenhower’s least heeded warning — concerning the spiritual effects of permanent preparations for war — is more important now than ever. Our culture has militarized considerably since Eisenhower’s era, and civilians, not the armed services, have been the principal cause. From lawmakers’ constant use of “support our troops” to justify defense spending, to TV programs and video games like “NCIS,” “Homeland” and “Call of Duty,” to NBC’s shameful and unreal reality show “Stars Earn Stripes,” Americans are subjected to a daily diet of stories that valorize the military while the storytellers pursue their own opportunistic political and commercial agendas. Of course, veterans should be thanked for serving their country, as should police officers, emergency workers and teachers. But no institution — particularly one financed by the taxpayers — should be immune from thoughtful criticism.

[div class=attrib]Read the entire article after the jump.[end-div]

Democracy is Ugly and Petty

While this election cycle in the United States has been especially partisan, it’s worth remembering that politics in an open democracy is sometimes brutal, frequently nasty and often petty. Partisan fights, both metaphorical and physical, have been occurring since the Republic was founded.

[div class=attrib]From the New York Times:[end-div]

As the cable news channels count down the hours before the first polls close on Tuesday, an entire election cycle will have passed since President Obama last sat down with Fox News. The organization’s standing request to interview the president is now almost two years old.

At NBC News, the journalists reporting on the Romney campaign will continue to absorb taunts from their sources about their sister cable channel, MSNBC. “You mean, Al Sharpton’s network,” as Stuart Stevens, a senior Romney adviser, is especially fond of reminding them.

Spend just a little time watching either Fox News or MSNBC, and it is easy to see why such tensions run high. In fact, by some measures, the partisan bitterness on cable news has never been as stark — and in some ways, as silly or small.

Martin Bashir, the host of MSNBC’s 4 p.m. hour, recently tried to assess why Mitt Romney seemed irritable on the campaign trail and offered a provocative theory: that he might have mental problems.

“Mrs. Romney has expressed concerns about her husband’s mental well-being,” Mr. Bashir told one of his guests. “But do you get the feeling that perhaps there’s more to this than she’s saying?”

Over on Fox News, similar psychological evaluations were under way on “Fox & Friends.” Keith Ablow, a psychiatrist and a member of the channel’s “Medical A-Team,” suggested that Joseph R. Biden Jr.’s “bizarre laughter” during the vice-presidential debate might have something to do with a larger mental health issue. “You have to put dementia on the differential diagnosis,” he noted matter-of-factly.

Neither outlet has built its reputation on moderation and restraint, but during this presidential election, research shows that both are pushing their stridency to new levels.

A Pew Research Center study found that of Fox News stories about Mr. Obama from the end of August through the end of October, just 6 percent were positive and 46 percent were negative.

Pew also found that Mr. Obama was covered far more than Mr. Romney. The president was a significant figure in 74 percent of Fox’s campaign stories, compared with 49 percent for Romney. In 2008, Pew found that the channel reported on Mr. Obama and John McCain in roughly equal amounts.

The greater disparity was on MSNBC, which gave Mr. Romney positive coverage just 3 percent of the time, Pew found. It examined 259 segments about Mr. Romney and found that 71 percent were negative.

MSNBC, whose programs are hosted by a new crop of extravagant partisans like Mr. Bashir, Mr. Sharpton and Lawrence O’Donnell, has tested the limits of good taste this year. Mr. O’Donnell was forced to apologize in April after describing the Mormon Church as nothing more than a scheme cooked up by a man who “got caught having sex with the maid and explained to his wife that God told him to do it.”

The channel’s hosts recycle talking points handed out by the Obama campaign, even using them as titles for program segments, like Mr. Bashir did recently with a segment he called “Romnesia,” referring to Mr. Obama’s term to explain his opponent’s shifting positions.

The hosts insult and mock, like Alex Wagner did in recently describing Mr. Romney’s trip overseas as “National Lampoon’s European Vacation” — a line she borrowed from an Obama spokeswoman. Mr. Romney was not only hapless, Ms. Wagner said, he also looked “disheveled” and “a little bit sweaty” in a recent appearance.

Not that they save their scorn just for their programs. Some MSNBC hosts even use the channel’s own ads promoting its slogan “Lean Forward,” to criticize Mr. Romney and the Republicans. Mr. O’Donnell accuses the Republican nominee of basing his campaign on the false notion that Mr. Obama is inciting class warfare. “You have to come up with a lie,” he says, when your campaign is based on empty rhetoric.

In her ad, Rachel Maddow breathlessly decodes the logic behind the push to overhaul state voting laws. “The idea is to shrink the electorate,” she says, “so a smaller number of people get to decide what happens to all of us.”

Such stridency has put NBC News journalists who cover Republicans in awkward and compromised positions, several people who work for the network said. To distance themselves from their sister channel, they have started taking steps to reassure Republican sources, like pointing out that they are reporting for NBC programs like “Today” and “Nightly News” — not for MSNBC.

At Fox News, there is a palpable sense that the White House punishes the outlet for its coverage, not only by withholding the president, who has done interviews with every other major network, but also by denying them access to Michelle Obama.

This fall, Mrs. Obama has done a spate of television appearances, from CNN to “Jimmy Kimmel Live” on ABC. But when officials from Fox News recently asked for an interview with the first lady, they were told no. She has not appeared on the channel since 2010, when she sat down with Mike Huckabee.

Lately the White House and Fox News have been at odds over the channel’s aggressive coverage of the attack on the American diplomatic mission in Benghazi, Libya. Fox initially raised questions over the White House’s explanation of the events that led to the attack — questions that other news organizations have since started reporting on more fully.

But the commentary on the channel quickly and often turns to accusations that the White House played politics with American lives. “Everything they told us was a lie,” Sean Hannity said recently as he and John H. Sununu, a former governor of New Hampshire and a Romney campaign supporter, took turns raising questions about how the Obama administration misled the public. “A hoax,” Mr. Hannity called the administration’s explanation. “A cover-up.”

Mr. Hannity has also taken to selectively fact-checking Mr. Obama’s claims, co-opting a journalistic tool that has proliferated in this election as news outlets sought to bring more accountability to their coverage.

Mr. Hannity’s guest fact-checkers have included hardly objective sources, like Dick Morris, the former Clinton aide turned conservative commentator; Liz Cheney, the daughter of former Vice President Dick Cheney; and Michelle Malkin, the right-wing provocateur.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of University of Maine at Farmington.[end-div]

It’s About Equality, Stupid

[div class=attrib]From Project Syndicate:[end-div]

The king of Bhutan wants to make us all happier. Governments, he says, should aim to maximize their people’s Gross National Happiness rather than their Gross National Product. Does this new emphasis on happiness represent a shift or just a passing fad?

It is easy to see why governments should de-emphasize economic growth when it is proving so elusive. The eurozone is not expected to grow at all this year. The British economy is contracting. Greece’s economy has been shrinking for years. Even China is expected to slow down. Why not give up growth and enjoy what we have?

No doubt this mood will pass when growth revives, as it is bound to. Nevertheless, a deeper shift in attitude toward growth has occurred, which is likely to make it a less important lodestar in the future – especially in rich countries.

The first factor to undermine the pursuit of growth was concern about its sustainability. Can we continue growing at the old rate without endangering our future?

When people started talking about the “natural” limits to growth in the 1970’s, they meant the impending exhaustion of food and non-renewable natural resources. Recently the debate has shifted to carbon emissions. As the Stern Review of 2006 emphasized, we must sacrifice some growth today to ensure that we do not all fry tomorrow.

Curiously, the one taboo area in this discussion is population. The fewer people there are, the less risk we face of heating up the planet. But, instead of accepting the natural decline in their populations, rich-country governments absorb more and more people to hold down wages and thereby grow faster.

A more recent concern focuses on the disappointing results of growth. It is increasingly understood that growth does not necessarily increase our sense of well-being. So why continue to grow?

The groundwork for this question was laid some time ago. In 1974, the economist Richard Easterlin published a famous paper, “Does Economic Growth Improve the Human Lot? Some Empirical Evidence.” After correlating per capita income and self-reported happiness levels across a number of countries, he reached a startling conclusion: probably not.

Above a rather low level of income (enough to satisfy basic needs), Easterlin found no correlation between happiness and GNP per head. In other words, GNP is a poor measure of life satisfaction.

That finding reinforced efforts to devise alternative indexes. In 1972, two economists, William Nordhaus and James Tobin, introduced a measure that they called “Net Economic Welfare,” obtained by deducting from GNP “bad” outputs, like pollution, and adding non-market activities, like leisure. They showed that a society with more leisure and less work could have as much welfare as one with more work – and therefore more GNP – and less leisure.

More recent metrics have tried to incorporate a wider range of “quality of life” indicators. The trouble is that you can measure quantity of stuff, but not quality of life. How one combines quantity and quality in some index of “life satisfaction” is a matter of morals rather than economics, so it is not surprising that most economists stick to their quantitative measures of “welfare.”

But another finding has also started to influence the current debate on growth: poor people within a country are less happy than rich people. In other words, above a low level of sufficiency, peoples’ happiness levels are determined much less by their absolute income than by their income relative to some reference group. We constantly compare our lot with that of others, feeling either superior or inferior, whatever our income level; well-being depends more on how the fruits of growth are distributed than on their absolute amount.

Put another way, what matters for life satisfaction is the growth not of mean income but of median income – the income of the typical person. Consider a population of ten people (say, a factory) in which the managing director earns $150,000 a year and the other nine, all workers, earn $10,000 each. The mean average of their incomes is $24,000, but 90% earn $10,000. With this kind of income distribution, it would be surprising if growth increased the typical person’s sense of well-being.
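The arithmetic of that hypothetical factory is easy to verify with Python’s standard statistics module:

```python
from statistics import mean, median

# One managing director at $150,000 plus nine workers at $10,000 each.
incomes = [150_000] + [10_000] * 9

print(mean(incomes))    # -> 24000
print(median(incomes))  # -> 10000.0
```

The mean is dragged upward by the single outlier at the top, while the median reports what the typical worker actually takes home — precisely the gap the author is pointing at.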

[div class=attrib]Read the entire article after the jump.[end-div]

Connectedness: A Force For Good

The internet has the potential to make our current political process obsolete. A review of “The End of Politics” by British politician Douglas Carswell shows how connectedness provides a significant opportunity to reshape the political process, and in some cases completely undermine government, for the good.

[div class=attrib]Charles Moore for the Telegraph:[end-div]

I think I can help you tackle this thought-provoking book. First of all, the title misleads. Enchanting though the idea will sound to many people, this is not about the end of politics. It is, after all, written by a Member of Parliament, Douglas Carswell (Con., Clacton) and he is fascinated by the subject. There’ll always be politics, he is saying, but not as we know it.

Second, you don’t really need to read the first half. It is essentially a passionately expressed set of arguments about why our current political arrangements do not work. It is good stuff, but there is plenty of it in the more independent-minded newspapers most days. The important bit is Part Two, beginning on page 145 and running for a modest 119 pages. It is called “The Birth of iDemocracy”.

Mr Carswell resembles those old barometers in which, in bad weather (Part One), a man with a mackintosh, an umbrella and a scowl comes out of the house. In good weather (Part Two), he pops out wearing a white suit, a straw hat and a broad smile. What makes him happy is the feeling that the digital revolution can restore to the people the power which, in the early days of the universal franchise, they possessed – and much, much more. He believes that the digital revolution has at last harnessed technology to express the “collective brain” of humanity. We develop our collective intelligence by exchanging the properties of our individual ones.

Throughout history, we have been impeded in doing this by physical barriers, such as distance, and by artificial ones, such as priesthoods of bureaucrats and experts. Today, i-this and e-that are cutting out these middlemen. He quotes the internet sage, Clay Shirky: “Here comes everybody”. Mr Carswell directs magnificent scorn at the aides to David Cameron who briefed the media that the Prime Minister now has an iPad app which will allow him, at a stroke of his finger, “to judge the success or failure of ministers with reference to performance-related data”.

The effect of the digital revolution is exactly the opposite of what the aides imagine. Far from now being able to survey everything, always, like God, the Prime Minister – any prime minister – is now in an unprecedentedly weak position in relation to the average citizen: “Digital technology is starting to allow us to choose for ourselves things that until recently Digital Dave and Co decided for us.”

A non-physical business, for instance, can often decide pretty freely where, for the purposes of taxation, it wants to live. Naturally, it will choose benign jurisdictions. Governments can try to ban it from doing so, but they will either fail, or find that they are cutting off their nose to spite their face. The very idea of a “tax base”, on which treasuries depend, wobbles when so much value lies in intellectual property and intellectual property is mobile. So taxes need to be flatter to keep their revenues up. If they are flatter, they will be paid by more people.

Therefore it becomes much harder for government to grow, since most people do not want to pay more.

[div class=attrib]Read the entire article after the jump.[end-div]

The United Swing States of America

Frank Jacobs over at Strange Maps has found a timely reminder of the inordinate influence that a few voters in several crucial states have over the rest of us.

[div class=attrib]From Strange Maps:[end-div]

At the stroke of midnight on November 6th, the 21 registered voters of Dixville Notch, gathering in the wood-panelled Ballot Room of the Balsams Grand Resort Hotel, will have just one minute to cast their vote. Speed is of the essence, if the tiny New Hampshire town is to uphold its reputation (est. 1960) as the first place to declare its results in the US presidential elections.

Later that day, well over 200 million other American voters will face the same choice as the good folks of the Notch: returning Barack Obama to the White House for a second and final four-year term, or electing Mitt Romney as the 45th President of the United States.

The winner of that contest will not be determined by whoever wins a simple majority (i.e. 50% of all votes cast, plus at least one). Like many electoral processes across the world, the system to elect the next president of the United States is riddled with idiosyncrasies and peculiarities – the quadrennial quorum in Dixville Notch being just one example.

Most US Presidents have indeed gained office by winning the popular vote, but this is not always the case. What is needed is winning the electoral vote. For the US presidential election is an indirect one: depending on the outcome in each of the 50 states, an Electoral College convenes in Washington DC to elect the President.

The total of 538 electors is distributed across the states in proportion to their population size, and is regularly adjusted to reflect increases or decreases. In 2008 Louisiana had 9 electors and South Carolina had 8; reflecting a relative population decrease in the former and increase in the latter, those numbers are now reversed.

Maine and Nebraska are the only states to assign their electors proportionally; the other 48 states (and DC) operate on the ABBA principle: however slight a candidate’s majority in any of those states, he wins all of its electoral votes. This rather convoluted system underlines the fact that the US presidential election is the sum of 50 state-level contests. It also brings into focus that some states are more important than others.
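The two allocation rules described here – winner-take-all versus a proportional split – can be sketched in a few lines. (An illustration only: the candidate names and vote totals are invented, and Maine and Nebraska in fact allocate most of their electors by congressional district rather than by a strict proportional formula.)

```python
def winner_take_all(electors: int, votes: dict) -> dict:
    """The ABBA rule: the candidate with the most votes in the state
    wins every one of its electors, however slim the margin."""
    winner = max(votes, key=votes.get)
    return {cand: electors if cand == winner else 0 for cand in votes}

def proportional(electors: int, votes: dict) -> dict:
    """A naive proportional split (rounding down), for contrast."""
    total = sum(votes.values())
    return {cand: electors * v // total for cand, v in votes.items()}

# A hypothetical 10-elector state decided by a 2-point margin.
votes = {"A": 51_000, "B": 49_000}
print(winner_take_all(10, votes))  # {'A': 10, 'B': 0}
print(proportional(10, votes))     # {'A': 5, 'B': 4}
```

Under winner-take-all, the 49% who voted for B contribute nothing to the national tally – which is why candidates pour their resources into the handful of states where that margin is genuinely in doubt.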

Obviously, in this system the more populous states carry much more weight than the emptier ones. Consider the map of the United States, and focus on the 17 states west of the straight-ish line of state borders from North Dakota-Minnesota in the north to Texas-Louisiana in the south. Just two states – Texas and California – outweigh the electoral votes of the 15 others.

So presidential candidates concentrate their efforts on the states where they can hope to gain the greatest advantage. This excludes the fairly large number of states that are solidly ‘blue’ (i.e. Democratic) or ‘red’ (Republican). Texas, for example, is reliably Republican, while California can be expected to fall in the Democratic column.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Map courtesy of Strange Maps / Big Think.[end-div]

Want Your Kids to Be Conservative or Liberal?

Researchers have confirmed what we already know: parents who endorse a more authoritarian style of parenting are more likely to have children who are ideologically conservative by age 18; parents who support more egalitarian parenting are more likely to have children who grow up to be liberal.

[div class=attrib]From the Pacific Standard:[end-div]

Parents: Do you find yourselves arguing with your adult children over who deserves to win the upcoming election? Does it confuse and frustrate you to realize your political viewpoints are so different?

Newly published research suggests you may only have yourself to blame.

Providing the best evidence yet to back up a decades-old theory, researchers writing in the journal Psychological Science report a link between a mother’s attitude toward parenting and the political ideology her child eventually adopts. In short, authoritarian parents are more prone to produce conservatives, while those who gave their kids more latitude are more likely to produce liberals.

This dynamic was theorized as early as 1950. But until now, almost all the research supporting it has been based on retrospective reports, with parents assessing their child-rearing attitudes in hindsight.

This new study, by a team led by psychologist R. Chris Fraley of the University of Illinois at Urbana-Champaign, begins with new mothers describing their intentions and approach in 1991, and ends with a survey of their children 18 years later. In between, it features an assessment of the child’s temperament at age 4.

The study looked at roughly 700 American children and their parents, who were recruited for the National Institute of Child Health and Human Development’s Study of Early Child Care and Youth Development. When each child was one month old, his or her mother completed a 30-item questionnaire designed to reveal her approach to parenting.

Those who strongly agreed with such statements as “the most important thing to teach children is absolute obedience to whoever is in authority” were categorized as holding authoritarian parenting attitudes. Those who robustly endorsed such sentiments as “children should be allowed to disagree with their parents” were categorized as holding egalitarian parenting attitudes.

When their kids were 54 months old, the mothers assessed their child’s temperament by answering 80 questions about their behavior. The children were evaluated for such traits as shyness, restlessness, attentional focusing (determined by their ability to follow directions and complete tasks) and fear.

Finally, at age 18, the youngsters completed a 28-item survey measuring their political attitudes on a liberal-to-conservative scale.

“Parents who endorsed more authoritarian parenting attitudes when their children were one month old were more likely to have children who were conservative in their ideologies at age 18,” the researchers report. “Parents who endorsed more egalitarian parenting attitudes were more likely to have children who were liberal.”

Temperament at age 4—which, of course, was very likely impacted by those parenting styles—was also associated with later ideological leanings.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of the Daily Show with Jon Stewart and the Colbert Report via Wired.[end-div]

How Great Companies Fail

A fascinating case study shows how Microsoft failed its employees through misguided HR (human resources) policies that pitted colleague against colleague.

[div class=attrib]From the Guardian:[end-div]

The idea for today’s off-topic note came to me when I read “Microsoft’s lost decade”, an aptly titled Vanity Fair story. In the piece, Kurt Eichenwald tracks Microsoft’s decline as he revisits a decade of technical missteps and bad business decisions. Predictably, the piece has generated strong retorts from Microsoft’s Ministry of Truth and from Ballmer himself (“It’s not been a lost decade for me!” he barked from the tumbrel).

But I don’t come to bury Caesar – not yet; I’ll wait until actual numbers for Windows 8 and the Surface tablets emerge. Instead, let’s consider the centerpiece of Eichenwald’s article, his depiction of the cultural degeneracy and intramural paranoia that comes of a badly implemented performance review system.

Performance assessments are, of course, an important aspect of a healthy company. In order to maintain fighting weight, an organisation must honestly assay its employees’ contributions and cull the dead wood. This is tournament play, after all, and the coach must “release” players who can’t help get the team to the finals.

But Microsoft’s implementation – “stack ranking”, a bell curve that pits employees and groups against one another like rats in a cage – plunged the company into internecine fights, horse trading, and backstabbing.

…every unit was forced to declare a certain percentage of employees as top performers, then good performers, then average, then below average, then poor…For that reason, executives said, a lot of Microsoft superstars did everything they could to avoid working alongside other top-notch developers, out of fear that they would be hurt in the rankings.

Employees quickly realised that it was more important to focus on organisation politics than actual performance:

Every current and former Microsoft employee I interviewed – every one – cited stack ranking as the most destructive process inside of Microsoft, something that drove out untold numbers of employees.

This brought back bad memories of my corpocrat days working for a noted Valley company. When I landed here in 1985, I was dismayed by the pervasive presence of human resources, an éminence grise that cast a shadow across the entire organisation. Humor being the courtesy of despair, engineers referred to HR as the KGB or, for a more literary reference, the Bene Gesserit, monikers that knowingly imputed an efficiency to a department that offered anything but. Granted, there was no bell curve grading, no obligation to sacrifice the bottom 5%, but the politics were stifling nonetheless, the review process a painful charade.

In memory of those shenanigans, I’ve come up with a possible antidote to manipulative reviews, an attempt to deal honestly and pleasantly with the imperfections of life at work. (Someday I’ll write a Note about an equally important task: How to let go of people with decency – and without lawyers.)

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Telegraph / Microsoft.[end-div]

Are You Cold or Hot? Depends on Your Politics

The United States is gripped by political deadlock. The Do-Nothing Congress consistently gets lower approval ratings than our banks, Paris Hilton, lawyers and BP during the catastrophe in the Gulf of Mexico. This stasis is driven by seemingly intractable ideological beliefs and a no-compromise attitude from both the left and right sides of the aisle.

So, it should come as no surprise that even your opinion of the weather and temperature is colored by your political persuasion.

Daniel Engber over at Slate sifts through some fascinating studies that highlight how our ingrained ideologies determine our worldview, down to even our basic view of the weather and our home thermostat setting.

[div class=attrib]From Slate:[end-div]

A few weeks ago, an academic journal called Weather, Climate and Society posted a curious finding about how Americans perceive the heat and cold. A team of researchers at the University of Oklahoma asked 8,000 adults living across the country to state both their political leanings and their impressions of the local weather. Are you a liberal or a conservative? Have average temperatures where you live been rising, falling, or staying about the same as previous years? Then they compared the answers to actual temperature records from each respondent’s ZIP code. Would their sense of how it feels outside be colored by the way they think?

Yes it would, the study found. So much so, in fact, that the people surveyed all but ignored their actual experience. No matter what the weather records showed for a given neighborhood (despite the global trend, it had gotten colder in some places and warmer in others), conservatives and liberals fell into the same two camps. The former said that temperatures were decreasing or had stayed the same, and the latter claimed they were going up. “Actual temperature deviations proved to be a relatively weak predictor of perceptions,” wrote the authors. (Hat tip to Ars Technica for finding the study.)

People’s opinions, then, seem to have an effect on how they feel the air around them. If you believe in climate change and think the world is getting warmer, you’ll be more inclined to sense that warmth on a walk around the block. And if you tend to think instead in terms of crooked scientists and climate conspiracies, then the local weather will seem a little cooler. Either way, the Oklahoma study suggests that the experience of heat and cold derives from “a complex mix of direct observation, ideology, and cultural cognitions.”

It’s easy to see how these factors might play out when people make grand assessments of the weather that rely on several years’ worth of noisy data. But another complex mix of ideology and culture affects how we experience the weather from moment to moment—and how we choose to cope with it. In yesterday’s column, I discussed the environmental case against air conditioning, and the belief that it’s worse to be hypothermic than overheated. But there are other concerns, too, that make their rounds among the anti-A/C brrr-geoisie. Some view air conditioning itself as a threat to their comfort and their health.

The notion that stale, recycled air might be sickening or dangerous has been circulating for as long as we’ve had home cooling. According to historian Marsha E. Ackermann’s Cool Comfort: America’s Romance With Air-Conditioning, the invention of the air conditioner set off a series of debates among high-profile scholars over whether it was better to fill a building with fresh air or to close it off from the elements altogether. One side argued for ventilation even in the most miserable summer weather; the other claimed that a hot, damp breeze could be a hazard to your health. (The precursor to the modern air conditioner, invented by a Floridian named John Gorrie, was designed according to the latter theory. Gorrie thought his device would stave off malaria and yellow fever.)

The cooling industry worked hard to promote the idea that A/C makes us more healthy and productive, and in the years after World War II it gained acceptance as a standard home appliance. Still, marketers worried about a lingering belief in the importance of fresh air, and especially the notion that the “shock effect” of moving too quickly from warm to cold would make you sick. Some of these fears would be realized in a new and deadly form of pneumonia known as Legionnaires’ disease. In the summer of 1976, around 4,000 members of the Pennsylvania State American Legion met for a conference at the fancy, air-conditioned Bellevue Stratford Hotel in Philadelphia, and over the next month, more than 180 Legionnaires took ill. The bacteria responsible for their condition were found to be propagating in the hotel’s cooling tower. Twenty-nine people died from the disease, and we finally had proof that air conditioning posed a mortal danger to America.

A few years later, a new diagnosis began to spread around the country, based on a nebulous array of symptoms including sore throats and headache that seemed to be associated with indoor air. Epidemiologists called the illness “Sick Building Syndrome,” and looked for its source in large-scale heating and cooling ducts. Even today, the particulars of the condition—and the question of whether or not it really exists—have not been resolved. But there is some good evidence for the idea that climate-control systems can breed allergenic mold or other micro-organisms. For a study published in 2004, researchers in France checked the medical records of 920 middle-aged women, and found that the ones who worked in air-conditioned offices (about 15 percent of the total pool) were almost twice as likely to take sick days or make a visit to an ear-nose-throat doctor.

This will come as no surprise to those who already shun the air conditioner and worship in the cult of fresh air. Like the opponents of A/C from a hundred years ago, they blame the sealed environment for creating a miasma of illness and disease. Well, of course it’s unhealthy to keep the windows closed; you need a natural breeze to blow all those spores and germs away. But their old-fashioned plea invites a response that’s just as antique. Why should the air be any fresher in summer than winter (when so few would let it in)? And what about the dangers that “fresh air” might pose in cities where the breeze swirls with soot and dust? A 2009 study in the journal Epidemiology confirmed that air conditioning can help stave off the effects of particulate matter in the environment. Researchers checked the health records of senior citizens who did or didn’t have air conditioners installed in their homes and found that those who were forced to leave their windows open in the summer—and suck down the dirty air outside—were more likely to end up in the hospital for pollution-related cardiovascular disease. Other studies have found similar correlations between a lack of A/C on sooty days and hospitalization for chronic obstructive pulmonary disease and pneumonia.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of Crosley Air Conditioning / Treehugger.[end-div]

The Benefits of Self-Deception

Psychologists have long studied the causes and characteristics of deception. In recent times they have had a huge pool of talented liars from which to draw – bankers, mortgage lenders, Enron executives, borrowers and, of course, politicians. Now researchers have begun to look at the art of self-deception, with some interesting results: self-deception may be a useful tool in influencing others.

[div class=attrib]From the Wall Street Journal:[end-div]

Lying to yourself—or self-deception, as psychologists call it—can actually have benefits. And nearly everybody does it, based on a growing body of research using new experimental techniques.

Self-deception isn’t just lying or faking, but is deeper and more complicated, says Del Paulhus, psychology professor at University of British Columbia and author of a widely used scale to measure self-deceptive tendencies. It involves strong psychological forces that keep us from acknowledging a threatening truth about ourselves, he says.

Believing we are more talented or intelligent than we really are can help us influence and win over others, says Robert Trivers, an anthropology professor at Rutgers University and author of “The Folly of Fools,” a 2011 book on the subject. An executive who talks himself into believing he is a great public speaker may not only feel better as he performs, but increase “how much he fools people, by having a confident style that persuades them that he’s good,” he says.

Researchers haven’t studied large population samples to compare rates of self-deception or compared men and women, but they know based on smaller studies that it is very common. And scientists in many different disciplines are drawn to studying it, says Michael I. Norton, an associate professor at Harvard Business School. “It’s also one of the most puzzling things that humans do.”

Researchers disagree over what exactly happens in the brain during self-deception. Social psychologists say people deceive themselves in an unconscious effort to boost self-esteem or feel better. Evolutionary psychologists, who say different parts of the brain can harbor conflicting beliefs at the same time, say self-deception is a way of fooling others to our own advantage.

In some people, the tendency seems to be an inborn personality trait. Others may develop a habit of self-deception as a way of coping with problems and challenges.

Behavioral scientists in recent years have begun using new techniques in the laboratory to predict when and why people are likely to deceive themselves. For example, they may give subjects opportunities to inflate their own attractiveness, skill or intelligence. Then, they manipulate such variables as subjects’ mood, promises of rewards or opportunities to cheat. They measure how the prevalence of self-deception changes.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Truth or Consequences. Courtesy of CBS 1950-51 / Wikia.[end-div]

Crony Capitalism

We excerpt below a fascinating article from the WSJ on the increasingly incestuous and damaging relationship between the finance industry and our political institutions.

[div class=attrib]From the Wall Street Journal:[end-div]

Mitt Romney’s résumé at Bain should be a slam dunk. He has been a successful capitalist, and capitalism is the best thing that has ever happened to the material condition of the human race. From the dawn of history until the 18th century, every society in the world was impoverished, with only the thinnest film of wealth on top. Then came capitalism and the Industrial Revolution. Everywhere that capitalism subsequently took hold, national wealth began to increase and poverty began to fall. Everywhere that capitalism didn’t take hold, people remained impoverished. Everywhere that capitalism has been rejected since then, poverty has increased.

Capitalism has lifted the world out of poverty because it gives people a chance to get rich by creating value and reaping the rewards. Who better to be president of the greatest of all capitalist nations than a man who got rich by being a brilliant capitalist?

Yet it hasn’t worked out that way for Mr. Romney. “Capitalist” has become an accusation. The creative destruction that is at the heart of a growing economy is now seen as evil. Americans increasingly appear to accept the mind-set that kept the world in poverty for millennia: If you’ve gotten rich, it is because you made someone else poorer.

What happened to turn the mood of the country so far from our historic celebration of economic success?

Two important changes in objective conditions have contributed to this change in mood. One is the rise of collusive capitalism. Part of that phenomenon involves crony capitalism, whereby the people on top take care of each other at shareholder expense (search on “golden parachutes”).

But the problem of crony capitalism is trivial compared with the collusion engendered by government. In today’s world, every business’s operations and bottom line are affected by rules set by legislators and bureaucrats. The result has been corruption on a massive scale. Sometimes the corruption is retail, whereby a single corporation creates a competitive advantage through the cooperation of regulators or politicians (search on “earmarks”). Sometimes the corruption is wholesale, creating an industrywide potential for profit that would not exist in the absence of government subsidies or regulations (like ethanol used to fuel cars and low-interest mortgages for people who are unlikely to pay them back). Collusive capitalism has become visible to the public and increasingly defines capitalism in the public mind.

Another change in objective conditions has been the emergence of great fortunes made quickly in the financial markets. It has always been easy for Americans to applaud people who get rich by creating products and services that people want to buy. That is why Thomas Edison and Henry Ford were American heroes a century ago, and Steve Jobs was one when he died last year.

When great wealth is generated instead by making smart buy and sell decisions in the markets, it smacks of inside knowledge, arcane financial instruments, opportunities that aren’t accessible to ordinary people, and hocus-pocus. The good that these rich people have done in the process of getting rich is obscure. The benefits of more efficient allocation of capital are huge, but they are really, really hard to explain simply and persuasively. It looks to a large proportion of the public as if we’ve got some fabulously wealthy people who haven’t done anything to deserve their wealth.

The objective changes in capitalism as it is practiced plausibly account for much of the hostility toward capitalism. But they don’t account for the unwillingness of capitalists who are getting rich the old-fashioned way—earning it—to defend themselves.

I assign that timidity to two other causes. First, large numbers of today’s successful capitalists are people of the political left who may think their own work is legitimate but feel no allegiance to capitalism as a system or kinship with capitalists on the other side of the political fence. Furthermore, these capitalists of the left are concentrated where it counts most. The most visible entrepreneurs of the high-tech industry are predominantly liberal. So are most of the people who run the entertainment and news industries. Even leaders of the financial industry increasingly share the politics of George Soros. Whether measured by fundraising data or by the members of Congress elected from the ZIP Codes where they live, the elite centers with the most clout in the culture are filled with people who are embarrassed to identify themselves as capitalists, and it shows in the cultural effect of their work.

Another factor is the segregation of capitalism from virtue. Historically, the merits of free enterprise and the obligations of success were intertwined in the national catechism. McGuffey’s Readers, the books on which generations of American children were raised, have plenty of stories treating initiative, hard work and entrepreneurialism as virtues, but just as many stories praising the virtues of self-restraint, personal integrity and concern for those who depend on you. The freedom to act and a stern moral obligation to act in certain ways were seen as two sides of the same American coin. Little of that has survived.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: The Industrial Revolution brought about the end of true capitalism. Courtesy: Time Life Pictures/Mansell/Time Life Pictures/Getty Images.[end-div]

Truthiness 101

Strangely and ironically, it takes a satirist to tell the truth – and, of course, academics now study the phenomenon.

[div class=attrib]From Washington Post:[end-div]

Nation, our so-called universities are in big trouble, and not just because attending one of them leaves you with more debt than the Greek government. No, we’re talking about something even more unsettling: the academic world’s obsession with Stephen Colbert.

Last we checked, Colbert was a mere TV comedian, or a satirist if you want to get fancy about it. (And, of course, being college professors, they do.) He’s a TV star, like Donald Trump, only less of a caricature.

Yet ever since Colbert’s show, “The Colbert Report,” began airing on Comedy Central in 2005, these ivory-tower eggheads have been devoting themselves to studying all things Colbertian. They’ve sliced and diced his comic stylings more ways than a Ginsu knife. Every academic discipline — well, among the liberal arts, at least — seems to want a piece of him. Political science. Journalism. Philosophy. Race relations. Communications studies. Theology. Linguistics. Rhetoric.

There are dozens of scholarly articles, monographs, treatises and essays about Colbert, as well as books of scholarly articles, monographs and essays. A University of Oklahoma student even earned her doctorate last year by examining him and his “Daily Show” running mate Jon Stewart. It was called “Political Humor and Third-Person Perception.”

The academic cult of Colbert (or is it “the cul of Colbert”?) is everywhere. Here’s a small sample.

- “Is Stephen Colbert America’s Socrates?,” chapter heading in “Stephen Colbert and Philosophy: I Am Philosophy (And So Can You!),” published by Open Court, 2009.

- “The Wørd Made Fresh: A Theological Exploration of Stephen Colbert,” published in Concepts (“an interdisciplinary journal of graduate studies”), Villanova University, 2010.

- “It’s All About Meme: The Art of the Interview and the Insatiable Ego of the Colbert Bump,” chapter heading in “The Stewart/Colbert Effect: Essays on the Real Impacts of Fake News,” published by McFarland Press, 2011.

- “The Irony of Satire: Political Ideology and the Motivation to See What You Want to See in The Colbert Report,” a 2009 study in the International Journal of Press/Politics that its authors described as an investigation of “biased message processing” and “the influence of political ideology on perceptions of Stephen Colbert.” After much study, the authors found “no significant difference between [conservatives and liberals] in thinking Colbert was funny.”

Colbert-ism has insinuated itself into the undergraduate curriculum, too.

Boston University has offered a seminar called “The Colbert Report: American Satire” for the past two years, which explores Colbert’s use of “syllogism, logical fallacy, burlesque, and travesty,” as lecturer Michael Rodriguez described it on the school’s Web site.

This fall, Towson University will roll out a freshman seminar on politics and popular culture, with Colbert as its focus.

All this for a guy who would undoubtedly mock-celebrate the serious study of himself.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Colbert Report. Courtesy of Business Insider / Comedy Central.[end-div]

Extreme Equals Happy, Moderate Equals Unhappy

[div class=attrib]From the New York Times:[end-div]

WHO is happier about life — liberals or conservatives? The answer might seem straightforward. After all, there is an entire academic literature in the social sciences dedicated to showing conservatives as naturally authoritarian, dogmatic, intolerant of ambiguity, fearful of threat and loss, low in self-esteem and uncomfortable with complex modes of thinking. And it was the candidate Barack Obama in 2008 who infamously labeled blue-collar voters “bitter,” as they “cling to guns or religion.” Obviously, liberals must be happier, right?

Wrong. Scholars on both the left and right have studied this question extensively, and have reached a consensus that it is conservatives who possess the happiness edge. Many data sets show this. For example, the Pew Research Center in 2006 reported that conservative Republicans were 68 percent more likely than liberal Democrats to say they were “very happy” about their lives. This pattern has persisted for decades. The question isn’t whether this is true, but why.

Many conservatives favor an explanation focusing on lifestyle differences, such as marriage and faith. They note that most conservatives are married; most liberals are not. (The percentages are 53 percent to 33 percent, according to my calculations using data from the 2004 General Social Survey, and almost none of the gap is due to the fact that liberals tend to be younger than conservatives.) Marriage and happiness go together. If two people are demographically the same but one is married and the other is not, the married person will be 18 percentage points more likely to say he or she is very happy than the unmarried person.

An explanation for the happiness gap more congenial to liberals is that conservatives are simply inattentive to the misery of others. If they recognized the injustice in the world, they wouldn’t be so cheerful. In the words of Jaime Napier and John Jost, New York University psychologists, in the journal Psychological Science, “Liberals may be less happy than conservatives because they are less ideologically prepared to rationalize (or explain away) the degree of inequality in society.” The academic parlance for this is “system justification.”

The data show that conservatives do indeed see the free enterprise system in a sunnier light than liberals do, believing in each American’s ability to get ahead on the basis of achievement. Liberals are more likely to see people as victims of circumstance and oppression, and doubt whether individuals can climb without governmental help. My own analysis using 2005 survey data from Syracuse University shows that about 90 percent of conservatives agree that “While people may begin with different opportunities, hard work and perseverance can usually overcome those disadvantages.” Liberals — even upper-income liberals — are a third less likely to say this.

So conservatives are ignorant, and ignorance is bliss, right? Not so fast, according to a study from the University of Florida psychologists Barry Schlenker and John Chambers and the University of Toronto psychologist Bonnie Le in the Journal of Research in Personality. These scholars note that liberals define fairness and an improved society in terms of greater economic equality. Liberals then condemn the happiness of conservatives, because conservatives are relatively untroubled by a problem that, it turns out, their political counterparts defined.

There is one other noteworthy political happiness gap that has gotten less scholarly attention than conservatives versus liberals: moderates versus extremists.

Political moderates must be happier than extremists, it always seemed to me. After all, extremists actually advertise their misery with strident bumper stickers that say things like, “If you’re not outraged, you’re not paying attention!”

But it turns out that’s wrong. People at the extremes are happier than political moderates. Correcting for income, education, age, race, family situation and religion, the happiest Americans are those who say they are either “extremely conservative” (48 percent very happy) or “extremely liberal” (35 percent). Everyone else is less happy, with the nadir at dead-center “moderate” (26 percent).

What explains this odd pattern? One possibility is that extremists have the whole world figured out, and sorted into good guys and bad guys. They have the security of knowing what’s wrong, and whom to fight. They are the happy warriors.

Whatever the explanation, the implications are striking. The Occupy Wall Street protesters may have looked like a miserable mess. In truth, they were probably happier than the moderates making fun of them from the offices above. And none, it seems, are happier than the Tea Partiers, many of whom cling to guns and faith with great tenacity. Which some moderately liberal readers of this newspaper might find quite depressing.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Psychology Today.[end-div]

Resurgence of Western Marxism

The death-knell for Western capitalism has yet to sound. However, increasing economic turmoil, continued shenanigans in the financial industry, burgeoning inequity, and acute global political unease, are combining to undermine the appeal of capitalism to a growing number of young people. Welcome to Marxism 2.012.

[div class=attrib]From the Guardian:[end-div]

Class conflict once seemed so straightforward. Marx and Engels wrote in the second best-selling book of all time, The Communist Manifesto: “What the bourgeoisie therefore produces, above all, are its own grave-diggers. Its fall and the victory of the proletariat are equally inevitable.” (The best-selling book of all time, incidentally, is the Bible – it only feels like it’s 50 Shades of Grey.)

Today, 164 years after Marx and Engels wrote about grave-diggers, the truth is almost the exact opposite. The proletariat, far from burying capitalism, are keeping it on life support. Overworked, underpaid workers ostensibly liberated by the largest socialist revolution in history (China’s) are driven to the brink of suicide to keep those in the west playing with their iPads. Chinese money bankrolls an otherwise bankrupt America.

The irony is scarcely wasted on leading Marxist thinkers. “The domination of capitalism globally depends today on the existence of a Chinese Communist party that gives de-localised capitalist enterprises cheap labour to lower prices and deprive workers of the rights of self-organisation,” says Jacques Rancière, the French Marxist thinker and professor of philosophy at the University of Paris VIII. “Happily, it is possible to hope for a world less absurd and more just than today’s.”

That hope, perhaps, explains another improbable truth of our economically catastrophic times – the revival in interest in Marx and Marxist thought. Sales of Das Kapital, Marx’s masterpiece of political economy, have soared ever since 2008, as have those of The Communist Manifesto and the Grundrisse (or, to give it its English title, Outlines of the Critique of Political Economy). Their sales rose as British workers bailed out the banks to keep the degraded system going and the snouts of the rich firmly in their troughs while the rest of us struggle in debt, job insecurity or worse. There’s even a Chinese theatre director called He Nian who capitalised on Das Kapital’s renaissance to create an all-singing, all-dancing musical.

And in perhaps the most lovely reversal of the luxuriantly bearded revolutionary theorist’s fortunes, Karl Marx was recently chosen from a list of 10 contenders to appear on a new issue of MasterCard by customers of German bank Sparkasse in Chemnitz. In communist East Germany from 1953 to 1990, Chemnitz was known as Karl Marx Stadt. Clearly, more than two decades after the fall of the Berlin Wall, the former East Germany hasn’t airbrushed its Marxist past. In 2008, Reuters reports, a survey of east Germans found 52% believed the free-market economy was “unsuitable” and 43% said they wanted socialism back. Karl Marx may be dead and buried in Highgate cemetery, but he’s alive and well among credit-hungry Germans. Would Marx have appreciated the irony of his image being deployed on a card to get Germans deeper in debt? You’d think.

Later this week in London, several thousand people will attend Marxism 2012, a five-day festival organised by the Socialist Workers’ Party. It’s an annual event, but what strikes organiser Joseph Choonara is how, in recent years, many more of its attendees are young. “The revival of interest in Marxism, especially for young people comes because it provides tools for analysing capitalism, and especially capitalist crises such as the one we’re in now,” Choonara says.

There has been a glut of books trumpeting Marxism’s relevance. English literature professor Terry Eagleton last year published a book called Why Marx Was Right. French Maoist philosopher Alain Badiou published a little red book called The Communist Hypothesis with a red star on the cover (very Mao, very now) in which he rallied the faithful to usher in the third era of the communist idea (the previous two having gone from the establishment of the French Republic in 1792 to the massacre of the Paris communards in 1871, and from 1917 to the collapse of Mao’s Cultural Revolution in 1976). Isn’t this all a delusion?

Aren’t Marx’s venerable ideas as useful to us as the hand loom would be to shoring up Apple’s reputation for innovation? Isn’t the dream of socialist revolution and communist society an irrelevance in 2012? After all, I suggest to Rancière, the bourgeoisie has failed to produce its own gravediggers. Rancière refuses to be downbeat: “The bourgeoisie has learned to make the exploited pay for its crisis and to use them to disarm its adversaries. But we must not reverse the idea of historical necessity and conclude that the current situation is eternal. The gravediggers are still here, in the form of workers in precarious conditions like the over-exploited workers of factories in the far east. And today’s popular movements – Greece or elsewhere – also indicate that there’s a new will not to let our governments and our bankers inflict their crisis on the people.”

That, at least, is the perspective of a seventysomething Marxist professor. What about younger people of a Marxist temper? I ask Jaswinder Blackwell-Pal, a 22-year-old who has just finished her BA in English and drama at Goldsmiths College, London, why she considers Marxist thought still relevant. “The point is that younger people weren’t around when Thatcher was in power or when Marxism was associated with the Soviet Union,” she says. “We tend to see it more as a way of understanding what we’re going through now. Think of what’s happening in Egypt. When Mubarak fell it was so inspiring. It broke so many stereotypes – democracy wasn’t supposed to be something that people would fight for in the Muslim world. It vindicates revolution as a process, not as an event. So there was a revolution in Egypt, and a counter-revolution and a counter-counter revolution. What we learned from it was the importance of organisation.”

This, surely, is the key to understanding Marxism’s renaissance in the west: for younger people, it is untainted by association with Stalinist gulags. For younger people too, Francis Fukuyama’s triumphalism in his 1992 book The End of History – in which capitalism seemed incontrovertible, its overthrow impossible to imagine – exercises less of a choke-hold on their imaginations than it does on those of their elders.

Blackwell-Pal will be speaking Thursday on Che Guevara and the Cuban revolution at the Marxism festival. “It’s going to be the first time I’ll have spoken on Marxism,” she says nervously. But what’s the point thinking about Guevara and Castro in this day and age? Surely violent socialist revolution is irrelevant to workers’ struggles today? “Not at all!” she replies. “What’s happening in Britain is quite interesting. We have a very, very weak government mired in in-fighting. I think if we can really organise we can oust them.” Could Britain have its Tahrir Square, its equivalent to Castro’s 26th of July Movement? Let a young woman dream. After last year’s riots and today with most of Britain alienated from the rich men in its government’s cabinet, only a fool would rule it out.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Portrait of Karl Marx. Courtesy of International Institute of Social History in Amsterdam, Netherlands / Wikipedia.[end-div]

Persecution of Scientists: Old and New

The debate over the theory of evolution continues into the 21st century, particularly in societies with a religious bent, including the United States of America. Yet, while the theory and its corresponding evidence come under continuous attack from mostly religious apologists, we generally do not see scientists themselves persecuted for supporting evolution.

This cannot be said for climate scientists in Western countries who, while not physically abused, tortured or imprisoned, do continue to be targets of verbal abuse and threats from corporate interests or dogmatic politicians and their followers. But, as we know, persecution of scientists for embodying new, and thus threatening, ideas has been with us since the dawn of the scientific age. In fact, this behavior probably has been with us since our tribal ancestors moved out of Africa.

So, it is useful to remind ourselves how far we have come and of the distance we still have to travel.

[div class=attrib]From Wired:[end-div]

Turing was famously chemically castrated after admitting to homosexual acts in the 1950s. He is one of a long line of scientists who have been persecuted for their beliefs or practices.

After admitting to “homosexual acts” in early 1952, Alan Turing was prosecuted and had to make the choice between a custodial sentence or chemical castration through hormone injections. Injections of oestrogen were intended to deal with “abnormal and uncontrollable” sexual urges, according to literature at the time.
He chose this option so that he could stay out of jail and continue his research, although his security clearance was revoked, meaning he could not continue with his cryptographic work. Turing experienced some disturbing side effects, including impotence, from the hormone treatment. Other known side effects include breast swelling, mood changes and an overall “feminization”. Turing completed his year of treatment without major incident. His medication was discontinued in April 1953 and the University of Manchester created a five-year readership position just for him, so it came as a shock when he committed suicide on 7 June, 1954.

Turing isn’t the only scientist to have been persecuted for his personal or professional beliefs or lifestyle. Here’s a list of other prominent scientific luminaries who have been punished throughout history.

Rhazes (865-925)
Muhammad ibn Zakariyā Rāzī, or Rhazes, was a medical pioneer from Baghdad who lived between 865 and 925 AD. He was responsible for introducing western teachings, rational thought and the works of Hippocrates and Galen to the Arabic world. One of his books, Continens Liber, was a compendium of everything known about medicine. The book made him famous, but offended a Muslim priest who ordered the doctor to be beaten over the head with his own manuscript, which caused him to go blind, preventing him from future practice.

Michael Servetus (1511-1553)
Servetus was a Spanish physician credited with discovering pulmonary circulation. He wrote a book outlining his discovery along with his ideas about reforming Christianity — it was deemed to be heretical. He escaped from Spain and the Catholic Inquisition but came up against the Protestant Inquisition in Switzerland, which held him in equal disregard. Under orders from John Calvin, Servetus was arrested, tortured and burned at the stake on the shores of Lake Geneva – copies of his book were burned alongside him for good measure.

Galileo Galilei (1564-1642)
The Italian astronomer and physicist Galileo Galilei was tried and convicted in 1633 for publishing his evidence that supported the Copernican theory that the Earth revolves around the Sun. His research was instantly criticized by the Catholic Church for going against the established scripture that placed Earth, and not the Sun, at the center of the universe. Galileo was found “vehemently suspect of heresy” for his heliocentric views and was required to “abjure, curse and detest” his opinions. He was sentenced to house arrest, where he remained for the rest of his life, and his offending texts were banned.

Henry Oldenburg (1619-1677)
Oldenburg was a founding member of the Royal Society in London and became its first secretary in 1662. He sought high-quality scientific papers to publish. In order to do this he had to correspond with many foreigners across Europe, including in the Netherlands and Italy. The sheer volume of his correspondence caught the attention of the authorities, who arrested him as a spy. He was held in the Tower of London for several months.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Engraving of Galileo Galilei offering his telescope to three women (possibly Urania and attendants) seated on a throne; he is pointing toward the sky where some of his astronomical discoveries are depicted, 1655. Courtesy of Library of Congress.[end-div]

King Canute or Mother Nature in North Carolina, Virginia, Texas?

Legislators in North Carolina recently went one better than King Cnut (Canute). The king of Denmark, England, Norway and parts of Sweden during various periods between 1018 and 1035 famously and unsuccessfully tried to hold back the incoming tide. The now mythic story tells of Canute’s arrogance. Not to be outdone, North Carolina’s state legislature recently passed a law that bans state agencies from reporting that sea-level rise is accelerating.

The bill from North Carolina states:

“… rates shall only be determined using historical data, and these data shall be limited to the time period following the year 1900. Rates of sea-level rise may be extrapolated linearly to estimate future rates of rise but shall not include scenarios of accelerated rates of sea-level rise.”

This comes hot on the heels of the recent revisionist push in Virginia, where references to phrases such as “sea level rise” and “climate change” are forbidden in official state communications. Last year, of course, Texas led the way for other states following the climate science denial program when the Texas Commission on Environmental Quality, which had commissioned a scientific study of Galveston Bay, removed all references to “rising sea levels”.

For more detailed reporting on this unsurprising and laughable state of affairs check out this article at Skeptical Science.

[div class=attrib]From Scientific American:[end-div]

Less than two weeks after the state’s senate passed a climate science-squelching bill, research shows that sea level along the coast between N.C. and Massachusetts is rising faster than anywhere on Earth.

Could nature be mocking North Carolina’s law-makers? Less than two weeks after the state’s senate passed a bill banning state agencies from reporting that sea-level rise is accelerating, research has shown that the coast between North Carolina and Massachusetts is experiencing the fastest sea-level rise in the world.

Asbury Sallenger, an oceanographer at the US Geological Survey in St Petersburg, Florida, and his colleagues analysed tide-gauge records from around North America. On 24 June, they reported in Nature Climate Change that since 1980, sea-level rise between Cape Hatteras, North Carolina, and Boston, Massachusetts, has accelerated to between 2 and 3.7 millimetres per year. That is three to four times the global average, and it means the coast could see 20–29 centimetres of sea-level rise on top of the metre predicted for the world as a whole by 2100 (A. H. Sallenger Jr et al. Nature Clim. Change http://doi.org/hz4; 2012).

“Many people mistakenly think that the rate of sea-level rise is the same everywhere as glaciers and ice caps melt,” says Marcia McNutt, director of the US Geological Survey. But variations in currents and land movements can cause large regional differences. The hotspot is consistent with the slowing measured in Atlantic Ocean circulation, which may be tied to changes in water temperature, salinity and density.

North Carolina’s senators, however, have tried to stop state-funded researchers from releasing similar reports. The law approved by the senate on 12 June banned scientists in state agencies from using exponential extrapolation to predict sea-level rise, requiring instead that they stick to linear projections based on historical data.

Following international opprobrium, the state’s House of Representatives rejected the bill on 19 June. However, a compromise between the house and the senate forbids state agencies from basing any laws or plans on exponential extrapolations for the next three to four years, while the state conducts a new sea-level study.

According to local media, the bill was the handiwork of industry lobbyists and coastal municipalities who feared that investors and property developers would be scared off by predictions of high sea-level rises. The lobbyists invoked a paper published in the Journal of Coastal Research last year by James Houston, retired director of the US Army Corps of Engineers’ research centre in Vicksburg, Mississippi, and Robert Dean, emeritus professor of coastal engineering at the University of Florida in Gainesville. They reported that global sea-level rise has slowed since 1930 (J. R. Houston and R. G. Dean J. Coastal Res. 27, 409–417; 2011) — a contention that climate sceptics around the world have seized on.

Speaking to Nature, Dean accused the oceanographic community of ideological bias. “In the United States, there is an overemphasis on unrealistically high sea-level rise,” he says. “The reason is budgets. I am retired, so I have the freedom to report what I find without any bias or need to chase funding.” But Sallenger says that Houston and Dean’s choice of data sets masks acceleration in the sea-level-rise hotspot.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Policymic.[end-div]

Science and Politics

The tension between science, religion and politics that began several millennia ago continues unabated.

[div class=attrib]From ars technica:[end-div]

In the US, science has become a bit of a political punching bag, with a number of presidential candidates accusing climatologists of fraud, even as state legislators seek to inject phony controversies into science classrooms. It’s enough to make one long for the good old days when science was universally respected. But did those days ever actually exist?

A new look at decades of survey data suggests that there was never a time when science was universally respected, but one political group in particular—conservative voters—has seen its confidence in science decline dramatically over the last 30 years.

The researcher behind the new work, North Carolina’s Gordon Gauchat, figures there are three potential trajectories for the public’s view of science. One possibility is that the public, appreciating the benefits of the technological advances that science has helped to provide, would show a general increase in its affinity for science. An alternative prospect is that this process will inevitably peak, either because there are limits to how admired a field can be, or because a more general discomfort with modernity spills over to a field that helped bring it about.

The last prospect Gauchat considers is that there has been a change in views about science among a subset of the population. He cites previous research that suggests some view the role of science as having changed from one where it enhances productivity and living standards to one where it’s the primary justification for regulatory policies. “Science has always been politicized,” Gauchat writes. “What remains unclear is how political orientations shape public trust in science.”

To figure out which of these trends might apply, he turned to the General Social Survey, which has been gathering information on the US public’s views since 1972. During that time, the survey consistently contained a series of questions about confidence in US institutions, including the scientific community. The answers are divided pretty crudely—”a great deal,” “only some,” and “hardly any”—but they do provide a window into the public’s views on science. (In fact, “hardly any” was the choice of less than 7 percent of the respondents, so Gauchat simply lumped it in with “only some” for his analysis.)

The data showed a few general trends. For much of the study period, moderates actually had the lowest levels of confidence in science, with liberals typically having the highest; the levels of trust for both these groups were fairly steady across the 34 years of data. Conservatives were the odd one out. At the very start of the survey in 1974, they actually had the highest confidence in scientific institutions. By the 1980s, however, they had dropped so that they had significantly less trust than liberals did; in recent years, they’ve become the least trusting of science of any political affiliation.

Examining other demographic trends, Gauchat noted that the only other group to see a significant decline over time is regular churchgoers. Crunching the data, he states, indicates that “The growing force of the religious right in the conservative movement is a chief factor contributing to conservatives’ distrust in science.” This decline in trust occurred even among those who had college or graduate degrees, despite the fact that advanced education typically correlated with enhanced trust in science.

[div class=attrib]Read the entire article after the jump.[end-div]

Do We Become More Conservative as We Age?

A popular stereotype suggests that we become increasingly conservative in our values as we age. Thus, one would expect that older voters would be more likely to vote for Republican candidates. However, a recent social study debunks this view.

[div class=attrib]From Discovery:[end-div]

Amidst the bipartisan banter of election season, there persists an enduring belief that people get more conservative as they age — making older people more likely to vote for Republican candidates.

Ongoing research, however, fails to back up the stereotype. While there is some evidence that today’s seniors may be more conservative than today’s youth, that’s not because older folks are more conservative than they used to be. Instead, our modern elders likely came of age at a time when the political situation favored more conservative views.

In fact, studies show that people may actually get more liberal over time when it comes to certain kinds of beliefs. That suggests that we are not pre-determined to get stodgy, set in our ways or otherwise more inflexible in our retirement years.

Contrary to popular belief, old age can be an open-minded and enlightening time.

NEWS: Is There a Liberal Gene?

“Pigeonholing older people into these rigid attitude boxes or conservative boxes is not a good idea,” said Nick Danigelis, a sociologist and gerontologist at the University of Vermont in Burlington.

“Rather, when they were born, what experiences they had growing up, as well as political, social and economic events have a lot to do with how people behave,” he said. “Our results are showing that these have profound effects.”

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: A Board of Elections volunteer watches people cast their ballots during early voting October 23, 2008 in Savannah, Georgia. Courtesy of MSNBC.[end-div]

Stephen Colbert: Seriously Funny

A fascinating article about Stephen Colbert, a funny man with some serious jokes about our broken political process.

[div class=attrib]From the New York Times magazine:[end-div]

There used to be just two Stephen Colberts, and they were hard enough to distinguish. The main difference was that one thought the other was an idiot. The idiot Colbert was the one who made a nice paycheck by appearing four times a week on “The Colbert Report” (pronounced in the French fashion, with both t’s silent), the extremely popular fake news show on Comedy Central. The other Colbert, the non-idiot, was the 47-year-old South Carolinian, a practicing Catholic, who lives with his wife and three children in suburban Montclair, N.J., where, according to one of his neighbors, he is “extremely normal.” One of the pleasures of attending a live taping of “The Colbert Report” is watching this Colbert transform himself into a Republican superhero.

Suburban Colbert comes out dressed in the other Colbert’s guise — dark two-button suit, tasteful Brooks Brothersy tie, rimless Rumsfeldian glasses — and answers questions from the audience for a few minutes. (The questions are usually about things like Colbert’s favorite sport or favorite character from “The Lord of the Rings,” but on one memorable occasion a young black boy asked him, “Are you my father?” Colbert hesitated a moment and then said, “Kareem?”) Then he steps onstage, gets a last dab of makeup while someone sprays his hair into an unmussable Romney-like helmet, and turns himself into his alter ego. His body straightens, as if jolted by a shock. A self-satisfied smile creeps across his mouth, and a manically fatuous gleam steals into his eyes.

Lately, though, there has emerged a third Colbert. This one is a version of the TV-show Colbert, except he doesn’t exist just on screen anymore. He exists in the real world and has begun to meddle in it. In 2008, the old Colbert briefly ran for president, entering the Democratic primary in his native state of South Carolina. (He hadn’t really switched parties, but the filing fee for the Republican primary was too expensive.) In 2010, invited by Representative Zoe Lofgren, he testified before Congress about the problem of illegal-immigrant farmworkers and remarked that “the obvious answer is for all of us to stop eating fruits and vegetables.”

But those forays into public life were spoofs, more or less. The new Colbert has crossed the line that separates a TV stunt from reality and a parody from what is being parodied. In June, after petitioning the Federal Election Commission, he started his own super PAC — a real one, with real money. He has run TV ads, endorsed (sort of) the presidential candidacy of Buddy Roemer, the former governor of Louisiana, and almost succeeded in hijacking and renaming the Republican primary in South Carolina. “Basically, the F.E.C. gave me the license to create a killer robot,” Colbert said to me in October, and there are times now when the robot seems to be running the television show instead of the other way around.

“It’s bizarre,” remarked an admiring Jon Stewart, whose own program, “The Daily Show,” immediately precedes “The Colbert Report” on Comedy Central and is where the Colbert character got his start. “Here is this fictional character who is now suddenly interacting in the real world. It’s so far up its own rear end,” he said, or words to that effect, “that you don’t know what to do except get high and sit in a room with a black light and a poster.”

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Images courtesy of Google search.[end-div]

Levelling the Political Playing Field

Let’s face it: taking money out of politics in the United States, especially since the Supreme Court’s 2010 decision in Citizens United v. Federal Election Commission, is akin to asking a hardcore addict to give up his or her favorite substance — unlikely to be easy, if possible at all.

So, another approach might be to “re-distribute” the funds more equitably. Not a new idea — a number of European nations do this today. However, Max Frankel over at the NY Review of Books offers a thoughtful proposal with a new twist.

[div class=attrib]By Max Frankel:[end-div]

Every election year brings vivid reminders of how money distorts our politics, poisons our lawmaking, and inevitably widens the gulf between those who can afford to buy influence and the vast majority of Americans who cannot. In 2012, this gulf will become a chasm: one analysis predicts that campaign spending on presidential, congressional, and state elections may exceed $6 billion and all previous records. The Supreme Court has held that money is, in effect, speech; it talks. And those without big money have become progressively voiceless.

That it may cost as much as a billion dollars to run for President is scandal enough, but the multimillions it now takes to pursue or defend a seat in Congress are even more corrupting. Many of our legislators spend hours of every day begging for contributions from wealthy constituents and from the lobbyists for corporate interests. The access and influence that they routinely sell give the moneyed a seat at the tables where laws are written, to the benefit of those contributors and often to the disadvantage of the rest of us.

And why do the candidates need all that money? Because electoral success requires them to buy endless hours of expensive television time for commercials that advertise their virtues and, more often, roundly assail their opponents with often spurious claims. Of the more than a billion dollars spent on political commercials this year, probably more than half will go for attack ads.

It has long been obvious that television ads dominate electioneering in America. Most of those thirty-second ads are glib at best, but much of the time they are unfair smears of the opposition. And we all know that those sordid slanders work—the more negative the better—unless they are instantly answered with equally facile and equally expensive rebuttals.

Other election expenses pale beside the ever larger TV budgets. Campaign staffs, phone and email solicitations, billboards and buttons and such could easily be financed with the small contributions of ordinary voters. But the decisive TV competitions leave politicians at the mercy of self-interested wealthy individuals, corporations, unions, and groups, now often disguised in “Super PACs” that can spend freely on any candidate so long as they are not overtly coordinating with that candidate’s campaign. Even incumbents who face no immediate threat feel a need to keep hoarding huge war chests with which to discourage potential challengers. Senator Charles Schumer of New York, for example, was easily reelected to a third term in 2010 but stands poised five years before his next run with a rapidly growing fund of $10 million.

A rational people looking for fairness in their politics would have long ago demanded that television time be made available at no cost and apportioned equally among rival candidates. But no one expects that any such arrangement is now possible. Political ads are jealously guarded as a major source of income by television stations. And what passes for news on most TV channels gives short shrift to most political campaigns except perhaps to “cover” the advertising combat.

As a political reporter and editor, I concluded long ago that efforts to limit campaign contributions and expenditures have been either disingenuous or futile. Most spending caps are too porous. In fact, they have further distorted campaigns by favoring wealthy candidates whose spending on their own behalf the Supreme Court has exempted from all limitations. And the public has overwhelmingly rejected the use of tax money to subsidize campaigning. In any case, private money that wants to buy political influence tends to behave like water running downhill: it will find a way around most obstacles. Since the court’s decision in the 2010 Citizens United case, big money is now able to find endless new paths, channeling even tax-exempt funds into political pools.

There are no easy ways to repair our entire election system. But I believe that a large degree of fairness could be restored to our campaigns if we level the TV playing field. And given the television industry’s huge stake in paid political advertising, it (and the Supreme Court) would surely resist limiting campaign ads, as many European countries do. With so much campaign cash floating around, there is only one attractive remedy I know of: double the price of political commercials so that every candidate’s purchase of TV time automatically pays for a comparable slot awarded to an opponent. The more you spend, the more your rival benefits as well. The more you attack, the more you underwrite the opponent’s responses. The desirable result would likely be that rival candidates would negotiate an arms control agreement, setting their own limits on their TV budgets and maybe even on their rhetoric.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Alliance for a Just Society.[end-div]

Do We Need Intellectuals in Politics?

The question, as posed by the New York Times, may have been somewhat rhetorical. However, as we can see from the rise of the technocratic classes in Europe, intellectuals still seem to be in reasonably strong demand, albeit no longer revered.

[div class=attrib]From the New York Times:[end-div]

The rise of Newt Gingrich, Ph.D.— along with the apparent anti-intellectualism of many of the other Republican candidates — has once again raised the question of the role of intellectuals in American politics.

In writing about intellectuals, my temptation is to begin by echoing Marianne Moore on poetry: I, too, dislike them.  But that would be a lie: all else equal, I really like intellectuals.  Besides, I’m an intellectual myself, and their self-deprecation is one thing I really do dislike about many intellectuals.

What is an intellectual?  In general, someone seriously devoted to what used to be called the “life of the mind”: thinking pursued not instrumentally, for the sake of practical goals, but simply for the sake of knowing and understanding.  Nowadays, universities are the most congenial spots for intellectuals, although even there corporatism and careerism are increasing threats.

Intellectuals tell us things we need to know: how nature and society work, what happened in our past, how to analyze concepts, how to appreciate art and literature.   They also keep us in conversation with the great minds of our past.  This conversation may not, as some hope, tap into a source of enduring wisdom, but it at least provides a critical standpoint for assessing the limits of our current cultural assumptions.

In his “Republic,” Plato put forward the ideal of a state ruled by intellectuals who combined comprehensive theoretical knowledge with the practical capacity for applying it to concrete problems.  In reality, no one has theoretical expertise in more than a few specialized subjects, and there is no strong correlation between having such knowledge and being able to use it to resolve complex social and political problems.  Even more important, our theoretical knowledge is often highly limited, so that even the best available expert advice may be of little practical value.  An experienced and informed non-expert may well have a better sense of these limits than experts strongly invested in their disciplines.  This analysis supports the traditional American distrust of intellectuals: they are not in general highly suited for political office.

But it does not support the anti-intellectualism that tolerates or even applauds candidates who disdain or are incapable of serious engagement with intellectuals.   Good politicians need not be intellectuals, but they should have intellectual lives.  Concretely, they should have an ability and interest in reading the sorts of articles that appear in, for example, Scientific American, The New York Review of Books, and the science, culture and op-ed sections of major national newspapers — as well as the books discussed in such articles.

It’s often said that what our leaders need is common sense, not fancy theories.  But common-sense ideas that work in individuals’ everyday lives are often useless for dealing with complex problems of society as a whole.  For example, it’s common sense that government payments to the unemployed will lead to more jobs because those receiving the payments will spend the money, thereby increasing demand, which will lead businesses to hire more workers.  But it’s also common sense that if people are paid for not working, they will have less incentive to work, which will increase unemployment.  The trick is to find the amount of unemployment benefits that will strike the most effective balance between stimulating demand and discouraging employment.  This is where our leaders need to talk to economists.

[div class=attrib]Read the entire article here.[end-div]

Supercommittee and Innovation: Oxymoron Du Jour

Today is deadline day for the U.S. Congressional Joint Select Committee on Deficit Reduction to deliver. Perhaps a little ironically, the committee was commonly mistitled the “Super Committee”. Interestingly, pundits and public alike do not expect the committee to deliver any significant, long-term solution to the United States’ fiscal problems. In fact, many do not believe the committee will deliver anything at all beyond reinforcement of right- and left-leaning ideologies, political posturing, pandering to special interests of all colors and, of course, recriminations and spin.

Could the Founders have had such dysfunction in mind when they designed the branches of government, with their many checks and balances to guard against excess and tyranny? So perhaps it’s finally time for the United States Congress to gulp a large dose of corporate-style innovation.

[div class=attrib]From the Washington Post:[end-div]

… Fiscal catastrophe has been around the corner, on and off, for 15 years. In that period, Dole and President Bill Clinton, a Democrat, came together to produce a record-breaking $230 billion surplus. That was later depleted by actions undertaken by both sides, bringing us to the tense situation we have today.

What does this have to do with innovation?

As the profession of innovation management matures, we are learning a few key things, including that constraints can be a good thing — and the “supercommittee” clock is a big constraint. Given this, what is the best strategy when you need to innovate in a hurry?

When innovating under the gun, the first thing you must do is assemble a small, diverse team to own and attack the challenge. The “supercommittee” team is handicapped from the start, since it is neither small (think 4-5 people) nor diverse (in either age or expertise). Second, successful innovators envision what success looks like and pursue it single-mindedly – failure is not an option.

Innovators also divide big challenges into smaller challenges that a small team can feel passionate about and assault on an even shorter timeline than the overall challenge. This requires that you put as much (or more) effort into determining the questions that form the challenges as you do into trying to solve them. Innovators ask big questions that challenge the status quo, such as “How could we generate revenue without taxes?” or “What spending could we avoid and how?” or “How would my son or my grandmother approach this?”

To solve the challenges, successful innovators recruit people not only with expertise most relevant to the challenge, but also people with expertise in distant specialties, which, in innovation, is often where the best solutions come from.

But probably most importantly, all nine innovation roles — the revolutionary, the conscript, the connector, the artist, the customer champion, the troubleshooter, the judge, the magic maker and the evangelist — must be filled for an innovation effort to be successful.

[div class=attrib]Read the entire article here.[end-div]