Tag Archives: power

HR and the Evil Omnipotence of the Passive Construction

Next time you browse through your company’s compensation or business expense policies, or for that matter anything else written by the human resources (HR) department, cast your mind to George Orwell. In his critical essay Politics and the English Language, Orwell makes a clear case for the connection between linguistic obfuscation and political power. While Orwell’s obsession was with the political machine, you could just as well apply his reasoning to the mangled literary machinations of every corporate HR department.

Oh, the pen is indeed mightier than the sword, especially when it is used to construct obtuse passive sentences without a subject — perfect for a rulebook that all citizens must follow and that no one can challenge.

From the Guardian:

In our age there is no such thing as ‘keeping out of human resources’. All issues are human resource issues, and human resources itself is a mass of lies, evasions, folly, hatred and schizophrenia.

OK, that’s not exactly what Orwell wrote. The hair-splitters among you will moan that I’ve taken the word “politics” out of the above and replaced it with “human resources”. Sorry.

But I think there’s no denying that had he been alive today, Orwell – the great opponent and satirist of totalitarianism – would have deplored the bureaucratic repression of HR. He would have hated their blind loyalty to power, their unquestioning faithfulness to process, their abhorrence of anything or anyone deviating from the mean.

In particular, Orwell would have utterly despised the language that HR people use. In his excellent essay Politics and the English Language (where he began the thought that ended with Newspeak), Orwell railed against the language crimes committed by politicians.

In our time, political speech and writing are largely the defence of the indefensible … Thus political language has to consist largely of euphemism, question-begging and sheer cloudy vagueness. Defenceless villages are bombarded from the air, the inhabitants driven out into the countryside, the cattle machine-gunned, the huts set on fire with incendiary bullets: this is called pacification. Millions of peasants are robbed of their farms and sent trudging along the roads with no more than they can carry: this is called transfer of population or rectification of frontiers. People are imprisoned for years without trial, or shot in the back of the neck or sent to die of scurvy in Arctic lumber camps: this is called elimination of unreliable elements.

Repeat the politics/human resources switch in the above and the argument remains broadly the same. Yes, HR is not explaining away murders, but it nonetheless deliberately misuses language as a sort of low-tech mind control to avert our eyes from office atrocities and keep us fixed on our inboxes. Thus mass sackings are wrapped up in cowardly sophistry and called rightsizings, individuals are offboarded to the jobcentre and the few hardy souls left are consoled by their membership of a more streamlined organisation.

Orwell would have despised the passive constructions that are the HR department’s default setting. Want some flexibility in your contract? HR says company policy is unable to support that. Forgotten to accede to some arbitrary and impractical office rule? HR says we are minded to ask everyone to remember that it is essential to comply by rule X. Try to question whether an ill-judged commitment could be reversed? HR apologises meekly that the decision has been made.

Not giving subjects to any of these responses is a deliberate ploy. Subjects give ownership. They confer accountability. Not giving sentences subjects means that HR is passing the buck, but to no one in particular. And with no subject, no one can be blamed, or protested against.

The passive construction is also designed to give the sense that it’s not HR speaking, but that they are the conduit for a higher-up and incontestable power. It’s designed to be both authoritative and banal, so that we torpidly accept it, like the sovereignty of the Queen. It’s saying: “This is the way things are – deal with it because it isn’t changing.” It’s indifferent and deliberately opaque. It’s the worst kind of utopianism (the kind David Graeber targets in his recent book on “stupidity and the secret joys of bureaucracy”), where system and rule are king and hang the individual. It’s deeply, deeply oppressive.

Annual leave is perhaps an even worse example of HR’s linguistic malpractice. The phrase gives the sense that we are not sitting in the office but rather fighting some dismal war and that we should be grateful for the mercy of Field Marshal HR in allowing us a finite absence from the front line. Is it too indulgent and too frivolous to say that we are going on holiday (even if we’re just taking the day to go to Ikea)? Would it so damage our career prospects? Would the emerging markets of the world be emboldened by the decadence and complacency of saying we’re going on hols? I don’t think so, but they clearly do.

Actually, I don’t think it’s so much of a stretch to imagine Orwell himself establishing the whole HR enterprise as a sort of grim parody of Stalinism; a never-ending, ever-expanding live action art installation sequel to Animal Farm and Nineteen Eighty-Four.

Look at your office’s internal newsletter. Is it an incomprehensible black hole of sense? Is it trying to prod you into a place of content, incognisant of all the everyday hardships and irritations you endure? If your answer is yes, then I think that like me, you find it fairly easy to imagine Orwell composing these Newspeak emails from beyond the grave to make us believe that War is Peace, Freedom is Slavery and 2+2=5.

Delving deeper, the parallels become increasingly hard to ignore. Company restructures and key performance indicators make no sense in the abstract, merely serving to demotivate the workforce, sap confidence and obstruct productivity. So are they actually cleverly designed parodies of Stalin’s purges and the cult of Stakhanovism?

Read the entire story here.

 

Google: The Standard Oil of Our Age

Google’s aim to organize the world’s information sounds benign enough. But delve a little deeper into its research and development efforts, or witness its boundless encroachment into advertising, software, phones, glasses, cars, home automation, travel, internet services, artificial intelligence, robotics, online shopping (and so on), and you may get a more uneasy, prickly sensation. Is Google out to organize information or you? Perhaps it’s time to begin thinking about Google as a corporate hegemon: not quite a monopoly yet, but so powerful that counter-measures become warranted.

An open letter, excerpted below, from Mathias Döpfner, CEO of Axel Springer AG, does us all a service by sounding the alarm.

From the Guardian:

Dear Eric Schmidt,

As you know, I am a great admirer of Google’s entrepreneurial success. Google’s employees are always extremely friendly to us and to other publishing houses, but we are not communicating with each other on equal terms. How could we? Google doesn’t need us. But we need Google. We are afraid of Google. I must state this very clearly and frankly, because few of my colleagues dare do so publicly. And as the biggest among the small, perhaps it is also up to us to be the first to speak out in this debate. You yourself speak of the new power of the creators, owners, and users.

In the long term I’m not so sure about the users. Power is soon followed by powerlessness. And this is precisely the reason why we now need to have this discussion in the interests of the long-term integrity of the digital economy’s ecosystem. This applies to competition – not only economic, but also political. As the situation stands, your company will play a leading role in the various areas of our professional and private lives – in the house, in the car, in healthcare, in robotronics. This is a huge opportunity and a no less serious threat. I am afraid that it is simply not enough to state, as you do, that you want to make the world a “better place”.

Google lists its own products, from e-commerce to pages from its own Google+ network, higher than those of its competitors, even if these are sometimes of less value for consumers and should not be displayed in accordance with the Google algorithm. It is not even clearly pointed out to the user that these search results are the result of self-advertising. Even when a Google service has fewer visitors than that of a competitor, it appears higher up the page until it eventually also receives more visitors.

You know very well that this would result in long-term discrimination against, and weakening of, any competition, meaning that Google would be able to develop its superior market position still further. And that this would further weaken the European digital economy in particular.

This also applies to the large and even more problematic set of issues concerning data security and data utilisation. Ever since Edward Snowden triggered the NSA affair, and ever since the close relations between major American online companies and the American secret services became public, the social climate – at least in Europe – has fundamentally changed. People have become more sensitive about what happens to their user data. Nobody knows as much about its customers as Google. Even private or business emails are read by Gmail and, if necessary, can be evaluated. You yourself said in 2010: “We know where you are. We know where you’ve been. We can more or less know what you’re thinking about.” This is a remarkably honest sentence. The question is: are users happy with the fact that this information is used not only for commercial purposes – which may have many advantages, yet a number of spooky negative aspects as well – but could end up in the hands of the intelligence services, and to a certain extent already has?

Google is sitting on the entire current data trove of humanity, like the giant Fafner in The Ring of the Nibelung: “Here I lie and here I hold.” I hope you are aware of your company’s special responsibility. If fossil fuels were the fuels of the 20th century, then those of the 21st century are surely data and user profiles. We need to ask ourselves whether competition can generally still function in the digital age, if data is so extensively concentrated in the hands of one party.

There is a quote from you in this context that concerns me. In 2009 you said: “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” The essence of freedom is precisely the fact that I am not obliged to disclose everything that I am doing, that I have a right to confidentiality and, yes, even to secrets; that I am able to determine for myself what I wish to disclose about myself. The individual right to this is what makes a democracy. Only dictatorships want transparent citizens instead of a free press.

Against this background, it greatly concerns me that Google – which has just announced the acquisition of drone manufacturer Titan Aerospace – has been seen for some time as being behind a number of planned enormous ships and floating working environments that can cruise and operate in the open ocean. What is the reason for this development? You don’t have to be a conspiracy theorist to find this alarming.

Historically, monopolies have never survived in the long term. Either they have failed as a result of their complacency, which breeds its own success, or they have been weakened by competition – both unlikely scenarios in Google’s case. Or they have been restricted by political initiatives.

Another way would be voluntary self-restraint on the part of the winner. Is it really smart to wait until the first serious politician demands the breakup of Google? Or even worse – until the people refuse to follow?

Sincerely yours,

Mathias Döpfner

Read the entire article here.

 

It’s Official: The U.S. is an Oligarchy


Until recently the term oligarchy was usually only applied to Russia and some ex-Soviet satellites. A new study out of Princeton and Northwestern universities makes a case for the oligarchic label right here in the United States. Jaded voters will yawn at this so-called news — most ordinary citizens have known for decades that the U.S. political system is thoroughly broken, polluted with money (“free speech” as the U.S. Supreme Court would deem it) and serves only special interests (on the right or the left).

From the Telegraph:

The US government does not represent the interests of the majority of the country’s citizens, but is instead ruled by those of the rich and powerful, a new study from Princeton and Northwestern Universities has concluded.

The report, entitled Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens, used extensive policy data collected from between the years of 1981 and 2002 to empirically determine the state of the US political system.

After sifting through nearly 1,800 US policies enacted in that period and comparing them to the expressed preferences of average Americans (50th percentile of income), affluent Americans (90th percentile) and large special-interest groups, researchers concluded that the United States is dominated by its economic elite.

The peer-reviewed study, which will be taught at these universities in September, says: “The central point that emerges from our research is that economic elites and organised groups representing business interests have substantial independent impacts on US government policy, while mass-based interest groups and average citizens have little or no independent influence.”

Researchers concluded that US government policies rarely align with the preferences of the majority of Americans, but do favour special interests and lobbying organisations: “When a majority of citizens disagrees with economic elites and/or with organised interests, they generally lose. Moreover, because of the strong status quo bias built into the US political system, even when fairly large majorities of Americans favour policy change, they generally do not get it.”

The positions of powerful interest groups are “not substantially correlated with the preferences of average citizens”, but the politics of average Americans and affluent Americans do sometimes overlap. This is merely a coincidence, the report says, with the interests of the average American being served almost exclusively when they also serve those of the richest 10 per cent.

The theory of “biased pluralism” that the Princeton and Northwestern researchers believe the US system fits holds that policy outcomes “tend to tilt towards the wishes of corporations and business and professional associations.”

Read more here.

Image: U.S. Capitol. Courtesy of Wikipedia.

It’s a Woman’s World

[tube]V4UWxlVvT1A[/tube]

Well, not really. There is no doubting, though, that the planet would look rather different if the genders had truly equal opportunities and pay-offs, or if women generally had all the power that tends to be concentrated in masculine hands.

A short movie by French actor and film-maker Eléonore Pourriat imagines what our Western culture might look like if the traditional female-male roles were reversed.

A portent of the future? Perhaps not, but thought-provoking nonetheless. One has to believe that if women had all the levers and trappings of power, they could do a better job than men. Or, perhaps not. It may just be possible that power corrupts — regardless of the gender of the empowered.

From the Independent:

Imagine a world where it is the women who pee in the street, jog bare-chested and harass and physically assault the men. Such a world has just gone viral on the internet. A nine-minute satirical film made by Eléonore Pourriat, the French actress, script-writer and director, has clocked up hundreds of thousands of views in recent days.

The movie, Majorité Opprimée or “Oppressed Majority”, was made in 2010. It caused a flurry of interest when it was first posted on YouTube early last year. But now its time seems to have come. “It is astonishing, just incredible that interest in my film has suddenly exploded in this way,” Ms Pourriat told The Independent. “Obviously, I have touched a nerve. Women in France, but not just in France, feel that everyday sexism has been allowed to go on for too long.”

The star of the short film is Pierre, who is played very convincingly by Pierre Bénézit. He is a slightly gormless stay-at-home father, who spends a day besieged by the casual or aggressive sexism of women in a female-dominated planet. The film, in French with English subtitles, begins in a jokey way and turns gradually, and convincingly, nasty. It is not played for cheap laughs. It has a Swiftian capacity to disturb by the simple trick of reversing roles.

Pierre, pushing his baby-buggy, is casually harassed by a bare-breasted female jogger. He meets a male, Muslim babysitter, who is forced by his wife to wear a balaclava in public. He is verbally abused – “Think I don’t see you shaking your arse at me?” – by a drunken female down-and-out. He is sexually assaulted and humiliated by a knife-wielding girl gang. (“Say your dick is small or I’ll cut off your precious jewels.”)

He is humiliated a second time by a policewoman, who implies that he invented the gang assault. “Daylight and no witnesses, that’s strange,” she says. As she takes Pierre’s statement, the policewoman patronises a pretty, young policeman. “I need a coffee, cutie.”

Pierre’s self-important working wife arrives to collect him. She comforts him at first, calling him “kitten” and “pumpkin”. When he complains that he can no longer stand the permanent aggression of a female-dominated society, she says that he is to blame because of the way he dresses: in short sleeves, flip-flops and Bermudas.

At the second, or third, time of asking, interest in Ms Pourriat’s highly charged little movie has exploded in recent days on social media and on feminist and anti-feminist websites on both sides of the Channel and on both sides of the Atlantic. Some men refuse to see the point. “Sorry, but I would adore to live such a life,” said one French male blogger. “To be raped by a gang of girls. Great! That’s every man’s fantasy.”

Ms Pourriat, 42, acts and writes scripts for comedy movies in France. This was her first film as director. “It is rooted absolutely in my own experience as a woman living in France,” she tells me. “I think French men are worse than men elsewhere, but the incredible success of the movie suggests that it is not just a French problem.

“What angers me is that many women seem to accept this kind of behaviour from men or joke about it. I had long wanted to make a film that would turn the situation on its head.”

Read the entire article here.

Video: Majorité Opprimée or “Oppressed Majority” by Eléonore Pourriat.

 

Your iPhone is Worth $3,000


There is a slight catch.

Your iPhone is worth around $3,000, based on the combined value of a sack full of gadgets from over 20 years ago. We all know that no iPhone existed in the early nineties — not even inside Steve Jobs’ head. So intrepid tech-sleuth Steve Cichon calculated the iPhone’s value by combining the prices of fifteen or so consumer electronics devices from a 1991 Radio Shack ad which, taken together, offer features comparable to one of today’s iPhones.

From the Washington Post:

Buffalo writer Steve Cichon dug up an old Radio Shack ad, offering a variety of what were then cutting-edge gadgets. There are 15 items listed on the page, and Cichon points out that all but two of them — the exceptions are a radar detector and a set of speakers — do jobs that can now be performed with a modern iPhone.

The other 13 items, including a desktop computer, a camcorder, a CD player and a mobile phone, have a combined price of $3,071.21. The unsubsidized price of an iPhone is $549. And, of course, your iPhone is superior to these devices in many respects. The VHS camcorder, for example, captured video at a quality vastly inferior to the crystal-clear 1080p video an iPhone can record. That $1,599 Tandy computer would have struggled to browse the Web of the 1990s, to say nothing of the sophisticated Web sites iPhones access today. The CD player only lets you carry a few albums’ worth of music at a time; an iPhone can hold thousands of songs. And of course, the iPhone fits in your pocket.

This example is important to remember in the debate over whether the government’s official inflation figures understate or overstate inflation. In computing the inflation rate, economists assemble a representative “basket of goods” and see how its price changes over time. This isn’t difficult when the items in the basket are milk or gallons of gasoline. But it becomes extremely tricky when thinking about high-tech products. This year’s products are dramatically better than last year’s, so economists include a “quality adjustment” factor to reflect the change. But making apples-to-apples comparisons is difficult.

There’s no basket of 1991 gadgets that exactly duplicates the functionality of a modern iPhone, so deciding what to put into that basket is an inherently subjective enterprise. It’s not obvious that the average customer really gets as much value from his or her iPhone as a gadget lover in 1991 would have gotten from $3,000 worth of Radio Shack gadgets. On the other hand, iPhones do a lot of other things, too, like check Facebook, show movies on the go and provide turn-by-turn directions, that would have been hard to do on any gadget in 1991. So if anything, I suspect the way we measure inflation understates how quickly our standard of living has been improving.
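As an editorial aside, the basket arithmetic is easy to play with yourself. Below is a minimal sketch in Python using the article’s two headline prices; the threefold quality multiplier is an invented number purely for illustration, not a factor any statistical agency actually uses.

```python
# Toy "basket of goods" comparison using the article's figures.
# The quality multiplier is a made-up illustration, not a real
# BLS-style hedonic adjustment.

basket_1991 = 3071.21   # combined Radio Shack price of the 13 gadgets
iphone_price = 549.00   # unsubsidized iPhone price quoted above

# Naive comparison: the same bundle of functions costs ~82% less.
naive_change = (iphone_price - basket_1991) / basket_1991
print(f"Naive price change: {naive_change:.1%}")          # -82.1%

# Suppose the iPhone delivers three times the services of the old
# basket (better video, more music, fits in a pocket). Then the
# quality-adjusted price of those services has fallen even further.
quality_multiplier = 3.0  # hypothetical
adjusted_price = iphone_price / quality_multiplier
adjusted_change = (adjusted_price - basket_1991) / basket_1991
print(f"Quality-adjusted change: {adjusted_change:.1%}")  # -94.0%
```

The answer swings wildly with the multiplier you pick, which is exactly the subjectivity the article describes.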

Read the entire story here.

Image: Apple iPhone 5c. Courtesy of ABC News / Apple.

The Power of the Female Artist


Despite progress, gender equality remains a myth in most areas of our modern world. In most endeavors women have made significant strides in catching men — vying for the same levels of attention, education, fame, wealth and power. It is certainly the case in the art world too — women have made, and are continuing to make, progress in attaining parity — but it remains a male-dominated culture. That said, some female artists have managed to rise above the male tide to capture the global imagination with their powerful works and ideas.

Jonathan Jones over at his On Art Blog lists for us his top ten most subversive female artists from the last several hundred years. While it would be right to take issue with his notion of subversive, many of the names on the list quite rightly deserve as much mind-share as their male contemporaries.

From Jonathan Jones:

Artemisia Gentileschi

When she was a teenager, this 17th-century baroque artist was raped by a painter. She responded by turning her art into a weapon. In Gentileschi’s repeated paintings of the biblical story of Judith slaying Holofernes, the Israelite hero is helped by her muscular servant. As one woman holds down Holofernes on his bed, the other saws through his neck with a sword. Blood spurts everywhere in a sensational image of women taking revenge on patriarchy.

Hannah Wilke

In her SOS Starification Object Series (1974-82), Wilke was photographed with blobs of chewing gum stuck on to her flesh. Dotting her face and bare body, these bizarre markings resembled a modern form of tribal scarification (this was before ritualistic body modification became fashionable) and resemble vaginas. Or are they eyes? Wilke’s “starification” marked her with the burden of being objectified by the male gaze.

Adrian Piper

In her Catalysis performances (1970), Piper turned herself into a human provocation in public places such as the New York subway. In one performance, she rode the subway after soaking her clothes in pungent substances for a week to make them stink. She muttered in the street, entered the elevator of the Empire State Building with a red towel stuffed in her mouth or simply made eye contact with strangers. Her purpose was to dramatise social unease and ultimately the unspoken tensions of race in America.

Georgia O’Keeffe

In the early 20th century, Georgia O’Keeffe posed nude for her lover, the modernist photographer and art impresario Alfred Stieglitz, and painted abstractions that have an explicitly vaginal beauty. Compared with some artists in this list she may seem soft, but her cussed exploration of her own body and soul mapped out a new expressive freedom for women making art in the modern age.

Claude Cahun

In photographs taken from the 1920s to 1940s, this French artist often portrays herself in male clothes and hairstyles, contemplating her own transformed image as she experiments with the fictions of gender. Cahun’s pioneering art is typical of the freedom the surrealist movement gave artists to question sexual and social convention.

Louise Bourgeois

The labyrinthine mind of the last great surrealist envelops the spectator of her art in memories of an early 20th-century French childhood, intense secret worlds and the very interior of the body. Collapsing the masculinist art form of sculpture into something organic and ripely carnal, she is the spider of subversion weaving a web that has transformed the very nature of art.

Read the entire list here.

Image: Judith Beheading Holofernes, Artemisia Gentileschi, c1612. Courtesy of Wikimedia.

Me, Myself and I

It’s common sense — the frequency with which you use the personal pronoun “I” says a lot about you. Now there’s some great research that backs this up, but not in the way you might have expected.

From the Wall Street Journal:

You probably don’t think about how often you say the word “I.”

You should. Researchers say that your usage of the pronoun says more about you than you may realize.

Surprising new research from the University of Texas suggests that people who often say “I” are less powerful and less sure of themselves than those who limit their use of the word. Frequent “I” users subconsciously believe they are subordinate to the person to whom they are talking.

Pronouns, in general, tell us a lot about what people are paying attention to, says James W. Pennebaker, chair of the psychology department at the University of Texas at Austin and an author on the study. Pronouns signal where someone’s internal focus is pointing, says Dr. Pennebaker, who has pioneered this line of research. Often, people using “I” are being self-reflective. But they may also be self-conscious or insecure, in physical or emotional pain, or simply trying to please.

Dr. Pennebaker and colleagues conducted five studies of the way relative rank is revealed by the use of pronouns. The research was published last month in the Journal of Language and Social Psychology. In each experiment, people deemed to have higher status used “I” less.

The findings go against the common belief that people who say “I” a lot are full of themselves, maybe even narcissists.

“I” is more powerful than you may realize. It drives perceptions in a conversation so much so that marriage therapists have long held that people should use “I” instead of “you” during a confrontation with a partner or when discussing something emotional. (“I feel unheard.” Not: “You never listen.”) The word “I” is considered less accusatory.

“There is a misconception that people who are confident, have power, have high status tend to use ‘I’ more than people who are low status,” says Dr. Pennebaker, author of “The Secret Life of Pronouns.” “That is completely wrong. The high-status person is looking out at the world and the low-status person is looking at himself.”

So, how often should you use “I”? More—to sound humble (and not critical when speaking to your spouse)? Or less—to come across as more assured and authoritative?

The answer is “mostly more,” says Dr. Pennebaker. (Although he does say you should try and say it at the same rate as your spouse or partner, to keep the power balance in the relationship.)

In the first language-analysis study Dr. Pennebaker led, business-school students were divided into 41 four-person, mixed-sex groups and asked to work as a team to improve customer service for a fictitious company. One person in each group was randomly assigned to be the leader. The result: the leaders used “I” in 4.5% of their words; non-leaders used it in 5.6% of theirs. (The leaders also used “we” more than followers did.)

In the second study, 112 psychology students were assigned to same-sex groups of two. The pairs worked to solve a series of complex problems. All interaction took place online. No one was assigned to a leadership role, but participants were asked at the end of the experiment who they thought had power and status. Researchers found that the higher the person’s perceived power, the less he or she used “I.”

In study three, 50 pairs of people chatted informally face-to-face, asking questions to get to know one another, as if at a cocktail party. When asked which person had more status or power, they tended to agree—and that person had used “I” less.

Study four looked at emails. Nine people turned over their incoming and outgoing emails with about 15 other people. They rated how much status they had in relation to each correspondent. In each exchange, the person with the higher status used “I” less.

The fifth study was the most unusual. Researchers looked at email communication that the U.S. government had collected (and translated) from the Iraqi military, made public for a period of time as the Iraqi Perspectives Project. They randomly selected 40 correspondences. In each case, the person with higher military rank used “I” less.
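A technical footnote: the raw quantity behind all five studies is just a word frequency. Here is a minimal sketch, in Python, of how such an “I”-rate could be computed; the tokenizer and the two sample sentences are invented stand-ins, not the researchers’ actual method or data.

```python
import re

def i_rate(text: str) -> float:
    """Return the fraction of words in `text` that are the pronoun 'I'."""
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    return sum(1 for w in words if w.lower() == "i") / len(words)

# Invented examples in the spirit of the leader/follower finding.
leader = "We should split the task. Take the data; we can merge results later."
follower = "I think I could try the data part, if I understood it right."

print(f"leader:   {i_rate(leader):.1%}")    # 0.0%
print(f"follower: {i_rate(follower):.1%}")  # about 23%
```

Real analysis would also have to decide how to treat contractions such as “I’m” and “I’d”, which this toy version simply ignores (its token pattern keeps them whole, so they are not counted).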

People curb their use of “I” subconsciously, Dr. Pennebaker says. “If I am the high-status person, I am thinking of what you need to do. If I am the low-status person, I am more humble and am thinking, ‘I should be doing this.’ “

Dr. Pennebaker has found heavy “I” users among many kinds of people: women (who are typically more reflective than men), people who are more at ease with personal topics, younger people, and caring as well as anxious and depressed people. (Surprisingly, he says, narcissists do not use “I” more than others, according to a meta-analysis of a large number of studies.)

And who avoids using “I,” other than the high-powered? People who are hiding the truth. Avoiding the first-person pronoun is distancing.

Read the entire article here.

Orwell Lives On

George Orwell passed away on January 21, 1950 — an untimely death. He was only 46 years old. The anniversary of his death leads some to wonder what the great author would be doing were he still alive. Some believe he would be a food and restaurant critic. Or perhaps he would still, at the age of 109, be writing about injustice, falsehood and hypocrisy. One suspects that he might still be speaking truth to power as he did back in the 1940s, the difference being that this time power is in private hands rather than the public sector. Corporate Big Brother is now watching you.

[div class=attrib]From the Guardian:[end-div]

What if George Orwell hadn’t died of tuberculosis in 1950? What if, instead of expiring aged 46 in University College hospital, he had climbed from his sick-bed, taken the fishing rod a friend had brought him for his convalescence and checked out? What if today he was alive and well (perhaps after a period in cryogenic storage – the details aren’t important now)? What would he think of 2013? What, if anything, would he be writing about?

In many respects Orwell is ubiquitous and more relevant than ever. His once-visionary keywords have grotesque afterlives: Big Brother is a TV franchise to make celebrities of nobodies and Room 101 a light-entertainment show on BBC2 currently hosted by Frank Skinner for celebrities to witter about stuff that gets their goat. Meanwhile, Orwellian is the second-most-overused literary-generated adjective (after Kafkaesque). And now St Vince of Cable has been busted down from visionary analyst of recession to turncoat enabler of George Osborne’s austerity measures. Orwell is the go-to thinker to account for our present woes – even though he is 63 years dead. Which, in the Newspeak of 1984, is doubleplusgood.

As we celebrate the first Orwell Day this week, it’s irresistible to play the game of “what if”? If Orwell was fighting in a war akin to the Spanish civil war in 2012, where would he be – Syria? Would he write Homage to Aleppo, perhaps? Or would he have written Homage to Zuccotti Park or Tottenham? If he was writing Down and Out in Paris and London today would it be very different – and, if so, how? If he took a journey to Wigan pier in 2013, what would he find that would resemble the original trip and what would be different? Would there still be a full chamber pot under his hosts’ breakfast table? Let’s hope not.

Would he be working in a call centre rather than going down a mine? Would he feel as patriotic as he did in some of his essays? Would the man born Eric Arthur Blair have spent much of the past decade tilting at the man born Anthony Charles Lynton Blair? The answers to the last three questions are, you’d hope: yes, probably not, and oh, please God, yes.

“It’s almost impossible to imagine,” says Orwell’s biographer, the novelist and critic DJ Taylor. “One of his closest friends, the novelist Anthony Powell, suggested in his journals that Orwell’s politics would have drifted rightwards. He would have been anti-CND, in favour of the Falklands war, disapproved of the miners’ strikes. Powell was a high Tory right winger, but he was very close to Orwell and so those possibilities of what he would have been like had he lived on shouldn’t be dismissed.”

Adam Stock, an Orwell scholar at Newcastle University who did his PhD on mid-20th-century dystopian fiction and political thought, says: “If he were alive today, then Orwell would surely be writing about many of the sorts of areas you identify, bringing to light inequalities, injustices and arguing for what he termed ‘democratic socialism’, and I would like to think – though this may be projection on my part – that at this moment he would be writing specifically in defence of the welfare state.”

You’d hope. But Stock reckons that in 2013 Orwell would also be writing about the politics of food. “Orwell’s novels are marked by their rich detailing of taste, touch and especially smell. Tinned and processed food is a recurring image in his fiction, and it often represents a smoothing out of difference and individuality, a process which mirrors political attempts to make people conform to certain ideological visions of the world in the 1930s and 1940s,” says Stock.

Indeed, during last week’s horsemeat scandal, Stock says a passage from Orwell’s 1939 novel Coming Up for Air came to mind. The character George Bowling bites into a frankfurter he has bought in a milk bar decorated in chrome and mirrors: “The thing burst in my mouth like a rotten pear. A sort of horrible soft stuff was oozing all over my tongue. But the taste! For a moment I just couldn’t believe it. Then I rolled my tongue round it again and had another try. It was fish! A sausage, a thing calling itself a frankfurter, filled with fish! I got up and walked straight out without touching my coffee. God knows what that might have tasted of.”

What’s the present-day significance of that? “The point, I think, is that appearances mask quite different realities in the milk-bar modernity of mirrors in which the character is sitting, trapped between endless reflections,” says Stock. “Orwell had an abiding interest in the countryside, rural life and growing his own food. One thing I suspect he would be campaigning vociferously about in our time is issues surrounding big agribusiness and the provenance of our food, the biological commons, and particularly the patenting of GM crops.”

[div class=attrib]Read more after the jump.[end-div]

[div class=attrib]Image: George Orwell. Courtesy of the BBC.[end-div]

The Military-Industrial Complex

[tube]8y06NSBBRtY[/tube]

In his op-ed, author Aaron B. O’Connell reminds us of Eisenhower’s prescient warning to the nation about the growing power of the military-industrial complex in national affairs.

[div class=attrib]From the New York Times:[end-div]

IN 1961, President Dwight D. Eisenhower left office warning of the growing power of the military-industrial complex in American life. Most people know the term the president popularized, but few remember his argument.

In his farewell address, Eisenhower called for a better equilibrium between military and domestic affairs in our economy, politics and culture. He worried that the defense industry’s search for profits would warp foreign policy and, conversely, that too much state control of the private sector would cause economic stagnation. He warned that unending preparations for war were incongruous with the nation’s history. He cautioned that war and warmaking took up too large a proportion of national life, with grave ramifications for our spiritual health.

The military-industrial complex has not emerged in quite the way Eisenhower envisioned. The United States spends an enormous sum on defense — over $700 billion last year, about half of all military spending in the world — but in terms of our total economy, it has steadily declined to less than 5 percent of gross domestic product from 14 percent in 1953. Defense-related research has not produced an ossified garrison state; in fact, it has yielded a host of beneficial technologies, from the Internet to civilian nuclear power to GPS navigation. The United States has an enormous armaments industry, but it has not hampered employment and economic growth. In fact, Congress’s favorite argument against reducing defense spending is the job loss such cuts would entail.

Nor has the private sector infected foreign policy in the way that Eisenhower warned. Foreign policy has become increasingly reliant on military solutions since World War II, but we are a long way from the Marines’ repeated occupations of Haiti, Nicaragua and the Dominican Republic in the early 20th century, when commercial interests influenced military action. Of all the criticisms of the 2003 Iraq war, the idea that it was done to somehow magically decrease the cost of oil is the least credible. Though it’s true that mercenaries and contractors have exploited the wars of the past decade, hard decisions about the use of military force are made today much as they were in Eisenhower’s day: by the president, advised by the Joint Chiefs of Staff and the National Security Council, and then more or less rubber-stamped by Congress. Corporations do not get a vote, at least not yet.

But Eisenhower’s least heeded warning — concerning the spiritual effects of permanent preparations for war — is more important now than ever. Our culture has militarized considerably since Eisenhower’s era, and civilians, not the armed services, have been the principal cause. From lawmakers’ constant use of “support our troops” to justify defense spending, to TV programs and video games like “NCIS,” “Homeland” and “Call of Duty,” to NBC’s shameful and unreal reality show “Stars Earn Stripes,” Americans are subjected to a daily diet of stories that valorize the military while the storytellers pursue their own opportunistic political and commercial agendas. Of course, veterans should be thanked for serving their country, as should police officers, emergency workers and teachers. But no institution — particularly one financed by the taxpayers — should be immune from thoughtful criticism.

[div class=attrib]Read the entire article after the jump.[end-div]

Power and Baldness

Since behavioral scientists and psychologists first began roaming the globe, we have come to know how and (sometimes) why visual appearance is so important in human interactions. Of course, anecdotally, humans have known for thousands of years that image is everything. After all, it was not Mary Kay or L’Oreal who brought us make-up but the ancient Egyptians. Yet it is still fascinating to see how markedly the perception of an individual can change with a basic alteration, and only at the surface. Witness the profound difference in characteristics that we project onto a male with male pattern baldness (wimp) when he shaves his head (tough guy). And, of course, corporations can now assign a monetary value to the shaven look. As for comb-overs, well, that is another topic entirely.

[div class=attrib]From the Wall Street Journal:[end-div]

Up for a promotion? If you’re a man, you might want to get out the clippers.

Men with shaved heads are perceived to be more masculine, dominant and, in some cases, to have greater leadership potential than those with longer locks or with thinning hair, according to a recent study out of the University of Pennsylvania’s Wharton School.

That may explain why the power-buzz look has caught on among business leaders in recent years. Venture capitalist and Netscape founder Marc Andreessen, 41 years old, DreamWorks Animation Chief Executive Jeffrey Katzenberg, 61, and Amazon.com Inc. CEO Jeffrey Bezos, 48, all sport some variant of the close-cropped look.

Some executives say the style makes them appear younger—or at least, makes their age less evident—and gives them more confidence than a comb-over or monk-like pate.

“I’m not saying that shaving your head makes you successful, but it starts the conversation that you’ve done something active,” says tech entrepreneur and writer Seth Godin, 52, who has embraced the bare look for two decades. “These are people who decide to own what they have, as opposed to trying to pretend to be something else.”

Wharton management lecturer Albert Mannes conducted three experiments to test people’s perceptions of men with shaved heads. In one of the experiments, he showed 344 subjects photos of the same men in two versions: one showing the man with hair and the other showing him with his hair digitally removed, so his head appears shaved.

In all three tests, the subjects reported finding the men with shaved heads more dominant than their hirsute counterparts. In one test, men with shorn heads were even perceived as an inch taller and about 13% stronger than those with fuller manes. The paper, “Shorn Scalps and Perceptions of Male Dominance,” was published online, and will be included in a coming issue of the journal Social Psychological and Personality Science.

The study found that men with thinning hair were viewed as the least attractive and powerful of the bunch, a finding that tracks with other studies showing that people perceive men with typical male-pattern baldness—which affects roughly 35 million Americans—as older and less attractive. For those men, the solution could be as cheap and simple as a shave.

According to Wharton’s Dr. Mannes—who says he was inspired to conduct the research after noticing that people treated him more deferentially when he shaved off his own thinning hair—head shavers may seem powerful because the look is associated with hypermasculine images, such as the military, professional athletes and Hollywood action heroes like Bruce Willis. (Male-pattern baldness, by contrast, conjures images of “Seinfeld” character George Costanza.)

New York image consultant Julie Rath advises her clients to get closely cropped when they start thinning up top. “There’s something really strong, powerful and confident about laying it all bare,” she says, describing the thinning or combed-over look as “kind of shlumpy.”

The look is catching on. A 2010 study from razor maker Gillette, a unit of Procter & Gamble Co., found that 13% of respondents said they shaved their heads, citing reasons as varied as fashion, sports and already thinning hair, according to a company spokesman. HeadBlade Inc., which sells head-shaving accessories, says revenues have grown 30% a year in the past decade.

Shaving his head gave 60-year-old Stephen Carley, CEO of restaurant chain Red Robin Gourmet Burgers Inc., a confidence boost when he was working among 20-somethings at tech start-ups in the 1990s. With his thinning hair shorn, “I didn’t feel like the grandfather in the office anymore.” He adds that the look gave him “the impression that it was much harder to figure out how old I was.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Comb-over patent, 1977. Courtesy of Wikipedia.[end-div]

GigaBytes and TeraWatts

Online social networks have expanded to include hundreds of millions of twitterati and their followers. An ever increasing volume of data, images, videos and documents continues to move into the expanding virtual “cloud”, hosted in many nameless data centers. Virtual processing and computation on demand is growing by leaps and bounds.

Yet while business models for the providers of these internet services remain ethereal, one segment of this business ecosystem, electricity companies and utilities, is salivating at the staggering demand for electrical power.

[div class=attrib]From the New York Times:[end-div]

Jeff Rothschild’s machines at Facebook had a problem he knew he had to solve immediately. They were about to melt.

The company had been packing a 40-by-60-foot rental space here with racks of computer servers that were needed to store and process information from members’ accounts. The electricity pouring into the computers was overheating Ethernet sockets and other crucial components.

Thinking fast, Mr. Rothschild, the company’s engineering chief, took some employees on an expedition to buy every fan they could find — “We cleaned out all of the Walgreens in the area,” he said — to blast cool air at the equipment and prevent the Web site from going down.

That was in early 2006, when Facebook had a quaint 10 million or so users and the one main server site. Today, the information generated by nearly one billion people requires outsize versions of these facilities, called data centers, with rows and rows of servers spread over hundreds of thousands of square feet, and all with industrial cooling systems.

They are a mere fraction of the tens of thousands of data centers that now exist to support the overall explosion of digital information. Stupendous amounts of data are set in motion each day as, with an innocuous click or tap, people download movies on iTunes, check credit card balances through Visa’s Web site, send Yahoo e-mail with files attached, buy products on Amazon, post on Twitter or read newspapers online.

A yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness.

Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found.

To guard against a power failure, they further rely on banks of generators that emit diesel exhaust. The pollution from data centers has increasingly been cited by the authorities for violating clean air regulations, documents show. In Silicon Valley, many data centers appear on the state government’s Toxic Air Contaminant Inventory, a roster of the area’s top stationary diesel polluters.

Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centers in the United States account for one-quarter to one-third of that load, the estimates show.
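Putting those estimates side by side makes the scale vivid. Here is a minimal back-of-the-envelope sketch in Python; every input is a figure quoted above, and the one-gigawatt-per-plant assumption is simply what the article’s own thirty-plants comparison implies.

```python
# Back-of-the-envelope arithmetic from the figures quoted above.

world_dc_watts = 30e9   # worldwide data-center draw (watts)
plant_watts = 1e9       # ~1 GW per plant, implied by "30 nuclear plants"
us_share_low, us_share_high = 1 / 4, 1 / 3
waste_fraction = 0.90   # "90 percent or more" wasted, per The Times

print(f"{world_dc_watts / plant_watts:.1f} plant-equivalents worldwide")    # 30.0
print(f"{world_dc_watts * us_share_low / plant_watts:.1f} US, low end")     # 7.5
print(f"{world_dc_watts * us_share_high / plant_watts:.1f} US, high end")   # 10.0
print(f"{world_dc_watts * waste_fraction / plant_watts:.1f} wasted")        # 27.0
```

Even the low end of the US range is seven or eight large power plants running around the clock.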

“It’s staggering for most people, even people in the industry, to understand the numbers, the sheer size of these systems,” said Peter Gross, who helped design hundreds of data centers. “A single data center can take more power than a medium-size town.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of the AP / Thanassis Stavrakis.[end-div]

What is the True Power of Photography?

Hint. The answer is not shameless self-promotion or exploitative voyeurism; images used in these ways may scratch a personal itch, but they rarely influence fundamental societal or political behavior. Importantly, photography has given us a rich, nuanced and lasting medium for artistic expression since cameras and film were first invented. However, the principal answer lies in photography’s ability to tell the truth about, and to, power.

Michael Glover reminds us of this critical role through the works of a dozen of the most influential photographers from the 1960s and 1970s. Their collective works are on display at a new exhibit at the Barbican Art Gallery, London, which runs until mid-January 2013.

[div class=attrib]From the Independent:[end-div]

Photography has become so thoroughly prostituted as a means of visual exchange, available to all or none for every purpose under the sun (or none worthy of the name), that it is easy to forget that until relatively recently one of the most important consequences of fearless photographic practice was to tell the truth about power.

This group show at the Barbican focuses on the work of 12 photographers from around the world, including Vietnam, India, the US, Mexico, Japan, China, Ukraine, Germany, Mali and South Africa, examining their photographic practice in relation to the particular historical moments through which they lived. The covert eye of the camera often shows us what the authorities do not want us to see: the bleak injustice of life lived under apartheid; the scarring aftermath of the allied bombing and occupation of Japan; the brutish day-to-day realities of the Vietnam war.

Photography, it has often been said, documents the world. This suggests that the photographer might be a dispassionate observer of neutral spaces, more machine than emotive being. Nonsense. Using a camera is the photographer’s own way of discovering his or her own particular angle of view. It is a point of intersection between self and world. There is no such thing as a neutral landscape; there is only ever a personal landscape, cropped by the ever quizzical human eye. The good photographer, in the words of Bruce Davidson, the man (well represented in this show) who tirelessly and fearlessly chronicled the fight for civil rights in America in the early 1960s, seeks out the “emotional truth” of a situation.

For more than half a century, David Goldblatt, born in the mining town of Randfontein of Lithuanian Jewish parentage, has been chronicling the social divisions of South Africa. Goldblatt’s images are stark, forensic and pitiless, from the matchbox houses in the dusty, treeless streets of 1970s Soweto, to the lean man in the hat who is caught wearily and systematically butchering the coal-merchant’s dead horse for food in a bleak scrubland of wrecked cars. Goldblatt captures the day-to-day life of the Afrikaners: their narrowness of view; that tenacious conviction of rightness; the visceral bond with the soil. There is nothing demonstrative or rhetorical about his work. It is utterly, monochromatically sober, and quite subtly focused on the job in hand, as if he wishes to say to the onlooker that reality is quite stark enough.

Boris Mikhailov, wild, impish and contrarian in spirit, turns photography into a self-consciously subversive art form. Born in Kharkov in Ukraine under communism, his photographic montages represent a ferociously energetic fight-back against the grinding dullness, drabness and tedium of accepted notions of conformity. He frames a sugary image of a Kremlin tower in a circlet of slabs of raw meat. He reduces accepted ideas of beauty to kitsch. Underwear swings gaily in the air beside a receding railway track. He mercilessly lampoons the fact that the authorities forbade the photographing of nudity. This is the not-so-gentle art of blowing red raspberries.

Shomei Tomatsu has been preoccupied all his life by a single theme that he circles around obsessively: the American occupation of Japan in the aftermath of its humiliating military capitulation. Born in 1930, he still lives in Okinawa, the island from which the Americans launched their B52s during the Vietnam war. His angle of view suggests a mixture of abhorrence with the invasion of an utterly alien culture and a fascination with its practical consequences: a Japanese child blows a huge chewing gum bubble beside a street sign that reads “Bar Oasis”. The image of the child is distorted in the bubble.

But this show is not all about cocking a snook at authority. It is also about aesthetic issues: the use of colour as a way of shaping a different kind of reality, for example. William Eggleston made his series of photographic portraits of ordinary people from Memphis, Tennessee, often at night, in the 1970s. These are seemingly casual and immediate moments of intimate engagement between photographer and subject. Until this moment, colour had often been used by the camera (and especially the movie camera), not to particularise but to glamorise. Not so here. Eggleston is especially good at registering the lonely decrepitude of objects – a jukebox on a Memphis wall; the reptilian patina of a rusting street light; the resonance of an empty room in Las Vegas.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of “Everything Was Moving: Photography from the 60s and 70s”, Barbican Art Gallery. Copyright Bruce Davidson / Magnum Photos.[end-div]

The Benefits of Self-Deception

 

Psychologists have long studied the causes and characteristics of deception. In recent times they have had a huge pool of talented liars from which to draw — bankers, mortgage lenders, Enron executives, borrowers, and of course politicians. Now researchers have begun to look at the art of self-deception, with some interesting results. Self-deception may be a useful tool in influencing others.

[div class=attrib]From the Wall Street Journal:[end-div]

Lying to yourself—or self-deception, as psychologists call it—can actually have benefits. And nearly everybody does it, based on a growing body of research using new experimental techniques.

Self-deception isn’t just lying or faking, but is deeper and more complicated, says Del Paulhus, psychology professor at University of British Columbia and author of a widely used scale to measure self-deceptive tendencies. It involves strong psychological forces that keep us from acknowledging a threatening truth about ourselves, he says.

Believing we are more talented or intelligent than we really are can help us influence and win over others, says Robert Trivers, an anthropology professor at Rutgers University and author of “The Folly of Fools,” a 2011 book on the subject. An executive who talks himself into believing he is a great public speaker may not only feel better as he performs, but increase “how much he fools people, by having a confident style that persuades them that he’s good,” he says.

Researchers haven’t studied large population samples to compare rates of self-deception or compared men and women, but they know based on smaller studies that it is very common. And scientists in many different disciplines are drawn to studying it, says Michael I. Norton, an associate professor at Harvard Business School. “It’s also one of the most puzzling things that humans do.”

Researchers disagree over what exactly happens in the brain during self-deception. Social psychologists say people deceive themselves in an unconscious effort to boost self-esteem or feel better. Evolutionary psychologists, who say different parts of the brain can harbor conflicting beliefs at the same time, say self-deception is a way of fooling others to our own advantage.

In some people, the tendency seems to be an inborn personality trait. Others may develop a habit of self-deception as a way of coping with problems and challenges.

Behavioral scientists in recent years have begun using new techniques in the laboratory to predict when and why people are likely to deceive themselves. For example, they may give subjects opportunities to inflate their own attractiveness, skill or intelligence. Then, they manipulate such variables as subjects’ mood, promises of rewards or opportunities to cheat. They measure how the prevalence of self-deception changes.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: Truth or Consequences. Courtesy of CBS 1950-51 / Wikia.[end-div]

All Power Corrupts

[div class=attrib]From the Economist:[end-div]

DURING the second world war a new term of abuse entered the English language. To call someone “a little Hitler” meant he was a menial functionary who employed what power he had in order to annoy and frustrate others for his own gratification. From nightclub bouncers to the squaddies at Abu Ghraib prison who tormented their prisoners for fun, little Hitlers plague the world. The phenomenon has not, though, hitherto been subject to scientific investigation.

Nathanael Fast of the University of Southern California has changed that. He observed that lots of psychological experiments have been done on the effects of status and lots on the effects of power. But few, if any, have been done on both combined. He and his colleagues Nir Halevy of Stanford University and Adam Galinsky of Northwestern University, in Chicago, set out to correct this. In particular they wanted to see if it is circumstances that create little Hitlers or, rather, whether people of that type simply gravitate into jobs which allow them to behave badly. Their results have just been published in the Journal of Experimental Social Psychology.

Dr Fast’s experiment randomly assigned each of 213 participants to one of four situations that manipulated their status and power. All participants were informed that they were taking part in a study on virtual organisations and would be interacting with, but not meeting, a fellow student who worked in the same fictional consulting firm. Participants were then assigned either the role of “idea producer”, a job that entailed generating and working with important ideas, or of “worker”, a job that involved menial tasks like checking for typos. A post-experiment questionnaire demonstrated that participants did, as might be expected, look upon the role of idea producer with respect and admiration. Equally unsurprisingly, they looked down on the role of worker.

Participants who had both status and power did not greatly demean their partners. They chose an average of 0.67 demeaning activities for those partners to perform. Low-power/low-status and low-power/high-status participants behaved similarly. They chose, on average, 0.67 and 0.85 demeaning activities. However, participants who were low in status but high in power—the classic “little Hitler” combination—chose an average of 1.12 deeply demeaning tasks for their partners to engage in. That was a highly statistically significant distinction.

Of course, not everybody in the high-power/low-status quadrant of the experiment behaved badly. Underlying personality may still have a role. But as with previous experiments in which random members of the public have been asked to play prison guard or interrogator, Dr Fast’s result suggests that many quite ordinary people will succumb to bad behaviour if the circumstances are right.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of the Economist / Getty Images.[end-div]

The Lanier Effect

Twenty or so years ago the economic prognosticators and technology pundits would all have had us believe that the internet would transform society: it would level the playing field, help the little guy compete against the corporate behemoth, and make us all “socially” rich if not financially. Yet nowadays the promise of those early, heady days seems to have gone largely unfulfilled. What happened? Or rather, what didn’t happen?

We excerpt a lengthy interview with Jaron Lanier over at the Edge. Lanier, a pioneer in the field of virtual reality, offers some well-reasoned arguments about how information systems and the internet enable the concentration of market power. He reserves his most powerful criticism, though, for Google, whose (in)famous corporate motto, “don’t be evil”, starts to look remarkably disingenuous in his telling.

[div class=attrib]From the Edge:[end-div]

I’ve focused quite a lot on how this stealthy component of computation can affect our sense of ourselves, what it is to be a person. But lately I’ve been thinking a lot about what it means to economics.

In particular, I’m interested in a pretty simple problem, but one that is devastating. In recent years, many of us have worked very hard to make the Internet grow, to become available to people, and that’s happened. It’s one of the great topics of mankind of this era.  Everyone’s into Internet things, and yet we have this huge global economic trouble. If you had talked to anyone involved in it twenty years ago, everyone would have said that the ability for people to inexpensively have access to a tremendous global computation and networking facility ought to create wealth. This ought to create wellbeing; this ought to create this incredible expansion in just people living decently, and in personal liberty. And indeed, some of that’s happened. Yet if you look at the big picture, it obviously isn’t happening enough, if it’s happening at all.

The situation reminds me a little bit of something that is deeply connected, which is the way that computer networks transformed finance. You have more and more complex financial instruments, derivatives and so forth, and high frequency trading, all these extraordinary constructions that would be inconceivable without computation and networking technology.

At the start, the idea was, “Well, this is all in the service of the greater good because we’ll manage risk so much better, and we’ll increase the intelligence with which we collectively make decisions.” Yet if you look at what happened, risk was increased instead of decreased.

… We were doing a great job through the turn of the century. In the ’80s and ’90s, one of the things I liked about being in the Silicon Valley community was that we were growing the middle class. The personal computer revolution could have easily been mostly about enterprises. It could have been about just fighting IBM and getting computers on desks in big corporations or something, instead of this notion of the consumer, ordinary person having access to a computer, of a little mom and pop shop having a computer, and owning their own information. When you own information, you have power. Information is power. The personal computer gave people their own information, and it enabled a lot of lives.

… But at any rate, the Apple idea is that instead of the personal computer model where people own their own information, and everybody can be a creator as well as a consumer, we’re moving towards this iPad, iPhone model where it’s not as adequate for media creation as the real media creation tools, and even though you can become a seller over the network, you have to pass through Apple’s gate to accept what you do, and your chances of doing well are very small, and it’s not a person to person thing, it’s a business through a hub, through Apple to others, and it doesn’t create a middle class, it creates a new kind of upper class.

Google has done something that might even be more destructive of the middle class, which is they’ve said, “Well, since Moore’s law makes computation really cheap, let’s just give away the computation, but keep the data.” And that’s a disaster.

What’s happened now is that we’ve created this new regimen where the bigger your computer servers are, the more smart mathematicians you have working for you, and the more connected you are, the more powerful and rich you are. (Unless you own an oil field, which is the old way.) I benefit from it because I’m close to the big servers, but basically wealth is measured by how close you are to one of the big servers, and the servers have started to act like private spying agencies, essentially.

With Google, or with Facebook, if they can ever figure out how to steal some of Google’s business, there’s this notion that you get all of this stuff for free, except somebody else owns the data, and they use the data to sell access to you, and the ability to manipulate you, to third parties that you don’t necessarily get to know about. The third parties tend to be kind of tawdry.

[div class=attrib]Read the entire article.[end-div]

[div class=attrib]Image courtesy of Jaron Lanier.[end-div]