Tag Archives: deception

Influencing and Bullying

We sway our co-workers. We coach teams. We cajole our spouses and we parent our kids. But what distinguishes this behavior from more overt and negative forms of influence, such as bullying? It’s a question very much worth exploring, since we are all bullies at some point, far more often than we tend to think. And, not surprisingly, this goes hand-in-hand with deceit.

From the NYT:

WHAT is the chance that you could get someone to lie for you? What about vandalizing public property at your suggestion?

Most of us assume that others would go along with such schemes only if, on some level, they felt comfortable doing so. If not, they’d simply say “no,” right?

Yet research suggests that saying “no” can be more difficult than we believe — and that we have more power over others’ decisions than we think.

Social psychologists have spent decades demonstrating how difficult it can be to say “no” to other people’s propositions, even when they are morally questionable — consider Stanley Milgram’s infamous experiments, in which participants were persuaded to administer what they believed to be dangerous electric shocks to a fellow participant.

Countless studies have subsequently shown that we find it similarly difficult to resist social pressure from peers, friends and colleagues. Our decisions regarding everything from whether to turn the lights off when we leave a room to whether to call in sick to take a day off from work are affected by the actions and opinions of our neighbors and colleagues.

But what about those times when we are the ones trying to get someone to act unethically? Do we realize how much power we wield with a simple request, suggestion or dare? New research by my students and me suggests that we don’t.

We examined this question in a series of studies in which we had participants ask strangers to perform unethical acts. Before making their requests, participants predicted how many people they thought would comply. In one study, 25 college students asked 108 unfamiliar students to vandalize a library book. Targets who complied wrote the word “pickle” in pen on one of the pages.

As in the Milgram studies, many of the targets protested. They asked the instigators to take full responsibility for any repercussions. Yet, despite their hesitation, a large portion still complied.

Most important for our research question, more targets complied than participants had anticipated. Our participants predicted that an average of 28.5 percent would go along. In fact, fully half of those who were approached agreed. Moreover, 87 percent of participants underestimated the number they would be able to persuade to vandalize the book.

In another study, we asked 155 participants to think about a series of ethical dilemmas — for example, calling in sick to work to attend a baseball game. One group was told to think about these misdeeds from the perspective of a person deciding whether to commit them, and to imagine receiving advice from a colleague suggesting they do it or not. Another group took the opposite side, and thought about them from the perspective of someone advising another person about whether or not to do each deed.

Those in the first group were strongly influenced by the advice they received. When they were urged to engage in the misdeed, they said they would be more comfortable doing so than when they were advised not to. Their average reported comfort level was around the midpoint of a 7-point scale after receiving unethical advice, but closer to the low end after receiving ethical advice.

However, participants in the “advisory” role thought that their opinions would hold little sway over the other person’s decision, assuming that participants in the first group would feel equally comfortable regardless of whether they had received unethical or ethical advice.

Taken together, our research, which was recently published in the journal Personality and Social Psychology Bulletin, suggests that we often fail to recognize the power of social pressure when we are the ones doing the pressuring.

Notably, this tendency may be especially pronounced in cultures like that of the United States, where independence is so highly valued. American culture idolizes individuals who stand up to peer pressure. But that doesn’t mean that most do; in fact, such idolatry may hide, and thus facilitate, compliance under social pressure, especially when we are the ones applying the pressure.

Consider the roles in the Milgram experiments: Most people have probably fantasized about being one of the subjects and standing up to the pressure. But in daily life, we play the role of the metaphorical experimenter in those studies as often as we play the participant. We bully. We pressure others to blow off work to come out for a drink or stiff a waitress who is having a bad night. These suggestions are not always wrong or unethical, but they may impact others’ behaviors more than we realize.

Read the entire story here.

Pretending to be Smart

Have you ever taken a date to a cerebral movie or the opera? Have you ever taken a classic work of literature to read at the beach? If so, you are not alone. But why are you doing it?

From the Telegraph:

Men try to impress their friends almost twice as much as women do by quoting Shakespeare and pretending to like jazz to seem more clever.

A fifth of all adults admitted they have tried to impress others by making out they are more cultured than they really are, but this rises to 41 per cent in London.

Scotland is the least pretentious country, as only 14 per cent of the 1,000 UK adults surveyed had faked their intelligence there, according to Ask Jeeves research.

Typical methods of trying to seem cleverer ranged from deliberately reading a ‘serious’ novel on the beach to passing off other people’s witty remarks as one’s own and talking loudly about politics in front of others.

Two thirds put on the pretensions for friends, while 36 per cent did it to seem smarter in their workplace and 32 per cent tried to impress a potential partner.

One in five swapped their usual holiday read for something more serious on the beach and one in four went to an art gallery to look more cultured.

When it came to music tastes, 20 per cent have pretended to prefer Beethoven to Beyonce and many have referenced operas they have never seen.

A spokesman for Ask Jeeves said: “We were surprised by just how many people think they should go to such lengths in order to impress someone else.

“They obviously think they will make a better impression if they pretend to like Beethoven rather than admit they listen to Beyonce or read The Spectator rather than Loaded.

“Social media and the internet mean it is increasingly easy for people to present this kind of false image of themselves.

“But in the end, if they are really going to be liked then it is going to be for the person they really are rather than the person they are pretending to be.”

Social media also plays a large part, with people sharing Facebook posts on politics or re-tweeting clever tweets to raise their intellectual profile.

Men were the biggest offenders, with 26 per cent of men admitting to the acts of pretence compared to 14 per cent of women.

Top things people have done to seem smarter:

Repeated someone else’s joke as your own

Gone to an art gallery

Listened to classical music in front of others

Read a ‘serious’ book on the beach

Re-tweeted a clever tweet

Talked loudly about politics in front of others

Read a ‘serious’ magazine on public transport

Shared an intellectual article on Facebook

Quoted Shakespeare

Pretended to know about wine

Worn glasses with clear lenses

Mentioned an opera you’d ‘seen’

Pretended to like jazz

Read the entire article here.

Image: Opera. Courtesy of the New York Times.

Leadership and the Tyranny of Big Data

“There are three kinds of lies: lies, damned lies, and statistics,” goes the adage popularized by Mark Twain, who attributed it to Benjamin Disraeli.

Most people take for granted that numbers can be persuasive — just take a look at your bank balance. Also, most accept the notion that data can be used, misused, misinterpreted, re-interpreted and distorted to support or counter almost any argument. Just listen to a politician quote polling numbers and then hear an opposing politician make a contrary argument using the very same statistics. Or, better still, familiarize yourself with the pseudo-science of economics.

Authors Kenneth Cukier (data editor for The Economist) and Viktor Mayer-Schönberger (professor of Internet governance) examine this phenomenon in their book Big Data: A Revolution That Will Transform How We Live, Work, and Think. They eloquently present the example of Robert McNamara, U.S. defense secretary during the Vietnam War, who (in)famously used his detailed spreadsheets, including the daily body count, to manage and measure progress. After the war, many U.S. generals described this over-reliance on numbers as a misguided dictatorship of data that led officers to make ill-informed decisions based solely on the figures, and to fudge them.

This classic example leads them to a timely and important caution: as the range and scale of big data become ever greater, the same data that offer us great benefits can and will be used to mislead.

From Technology Review:

Big data is poised to transform society, from how we diagnose illness to how we educate children, even making it possible for a car to drive itself. Information is emerging as a new economic input, a vital resource. Companies, governments, and even individuals will be measuring and optimizing everything possible.

But there is a dark side. Big data erodes privacy. And when it is used to make predictions about what we are likely to do but haven’t yet done, it threatens freedom as well. Yet big data also exacerbates a very old problem: relying on the numbers when they are far more fallible than we think. Nothing underscores the consequences of data analysis gone awry more than the story of Robert McNamara.

McNamara was a numbers guy. Appointed the U.S. secretary of defense when tensions in Vietnam rose in the early 1960s, he insisted on getting data on everything he could. Only by applying statistical rigor, he believed, could decision makers understand a complex situation and make the right choices. The world in his view was a mass of unruly information that—if delineated, denoted, demarcated, and quantified—could be tamed by human hand and fall under human will. McNamara sought Truth, and that Truth could be found in data. Among the numbers that came back to him was the “body count.”

McNamara developed his love of numbers as a student at Harvard Business School and then as its youngest assistant professor at age 24. He applied this rigor during the Second World War as part of an elite Pentagon team called Statistical Control, which brought data-driven decision making to one of the world’s largest bureaucracies. Before this, the military was blind. It didn’t know, for instance, the type, quantity, or location of spare airplane parts. Data came to the rescue. Just making armament procurement more efficient saved $3.6 billion in 1943. Modern war demanded the efficient allocation of resources; the team’s work was a stunning success.

At war’s end, the members of this group offered their skills to corporate America. The Ford Motor Company was floundering, and a desperate Henry Ford II handed them the reins. Just as they knew nothing about the military when they helped win the war, so too were they clueless about making cars. Still, the so-called “Whiz Kids” turned the company around.

McNamara rose swiftly up the ranks, trotting out a data point for every situation. Harried factory managers produced the figures he demanded—whether they were correct or not. When an edict came down that all inventory from one car model must be used before a new model could begin production, exasperated line managers simply dumped excess parts into a nearby river. The joke at the factory was that a fellow could walk on water—atop rusted pieces of 1950 and 1951 cars.

McNamara epitomized the hyper-rational executive who relied on numbers rather than sentiments, and who could apply his quantitative skills to any industry he turned them to. In 1960 he was named president of Ford, a position he held for only a few weeks before being tapped to join President Kennedy’s cabinet as secretary of defense.

As the Vietnam conflict escalated and the United States sent more troops, it became clear that this was a war of wills, not of territory. America’s strategy was to pound the Viet Cong to the negotiation table. The way to measure progress, therefore, was by the number of enemy killed. The body count was published daily in the newspapers. To the war’s supporters it was proof of progress; to critics, evidence of its immorality. The body count was the data point that defined an era.

McNamara relied on the figures, fetishized them. With his perfectly combed-back hair and his flawlessly knotted tie, McNamara felt he could comprehend what was happening on the ground only by staring at a spreadsheet—at all those orderly rows and columns, calculations and charts, whose mastery seemed to bring him one standard deviation closer to God.

In 1977, two years after the last helicopter lifted off the rooftop of the U.S. embassy in Saigon, a retired Army general, Douglas Kinnard, published a landmark survey called The War Managers that revealed the quagmire of quantification. A mere 2 percent of America’s generals considered the body count a valid way to measure progress. “A fake—totally worthless,” wrote one general in his comments. “Often blatant lies,” wrote another. “They were grossly exaggerated by many units primarily because of the incredible interest shown by people like McNamara,” said a third.

Read the entire article after the jump.

Image: Robert McNamara at a cabinet meeting, 22 Nov 1967. Courtesy of Wikipedia / Public domain.

QTWTAIN: Are there Nazis living on the moon?

QTWTAIN is a Twitterspeak acronym for a Question To Which The Answer Is No.

QTWTAINs are a relatively recent journalistic phenomenon. They are often used to great effect as headlines by media organizations to grab a reader’s attention. More importantly, QTWTAINs imply that something ridiculous is true: by posing the headline as a question, no evidence seems to be required. Here’s an example of a recent headline:

“Europe: Are there Nazis living on the moon?”

Author and journalist John Rentoul has done all connoisseurs of QTWTAINs a great service by collecting an outstanding selection from hundreds of his favorites into a new book, Questions to Which the Answer is No. Rentoul tells his story in the excerpt below.

From the Independent:

I have an unusual hobby. I collect headlines in the form of questions to which the answer is no. This is a specialist art form that has long been a staple of “prepare to be amazed” journalism. Such questions allow newspapers, television programmes and websites to imply that something preposterous is true without having to provide the evidence.

If you see a question mark after a headline, ask yourself why it is not expressed as a statement, such as “Church of England threatened by excess of cellulite” or “Revealed: Marlene Dietrich plotted to murder Hitler” or, “This penguin is a communist”.

My collection started with a bishop, a grudge against Marks & Spencer and a theft in broad daylight. The theft was carried out by me: I had been inspired by Oliver Kamm, a friend and hero of mine, who wrote about Great Historical Questions to Which the Answer is No on his blog. Then I came across this long headline in Britain’s second-best-selling newspaper three years ago: “He’s the outcast bishop who denies the Holocaust – yet has been welcomed back by the Pope. But are Bishop Williamson’s repugnant views the result of a festering grudge against Marks & Spencer?” Thus was an internet meme born.

Since then readers of The Independent blog and people on Twitter with nothing better to do have supplied me with a constant stream of QTWTAINs. If this game had a serious purpose, which it does not, it would be to make fun of conspiracy theories. After a while, a few themes recurred: flying saucers, yetis, Jesus, the murder of John F Kennedy, the death of Marilyn Monroe and reincarnation.

An enterprising PhD student could use my series as raw material for a thesis entitled: “A Typology of Popular Irrationalism in Early 21st-Century Media”. But that would be to take it too seriously. The proper use of the series is as a drinking game, to be followed by a rousing chorus of “Jerusalem”, which consists largely of questions to which the answer is no.

My only rule in compiling the series is that the author or publisher of the question has to imply that the answer is yes (“Does Nick Clegg Really Expect Us to Accept His Apology?” for example, would be ruled out of order). So far I have collected 841 of them, and the best have been selected for a book published this week. I hope you like them.

Is the Loch Ness monster on Google Earth?

Daily Telegraph, 26 August 2009

A picture of something that actually looked like a giant squid had been spotted by a security guard as he browsed the digital planet. A similar question had been asked by the Telegraph six months earlier, on 19 February, about a different picture: “Has the Loch Ness Monster emigrated to Borneo?”

Would Boudicca have been a Liberal Democrat?

This one is cheating, because Paul Richards, who asked it in an article in Progress magazine, 12 March 2010, did not imply that the answer was yes. He was actually making a point about the misuse of historical conjecture, comparing Douglas Carswell, the Conservative MP, who suggested that the Levellers were early Tories, to the spiritualist interviewed by The Sun in 1992, who was asked how Winston Churchill, Joseph Stalin, Karl Marx and Chairman Mao would have voted (Churchill was for John Major; the rest for Neil Kinnock, naturally).

Is Tony Blair a Mossad agent?

A question asked by Peza, who appears to be a cat, on an internet forum on 9 April 2010. One reader had a good reply: “Peza, are you drinking that vodka-flavoured milk?”

Could Angelina Jolie be the first female US President?

Daily Express, 24 June 2009

An awkward one this, because one of my early QTWTAIN was “Is the Express a newspaper?” I had formulated an arbitrary rule that its headlines did not count. But what are rules for, if not for changing?

Read the entire article after the jump?

Book Cover: Questions to Which the Answer is No, by John Rentoul. Courtesy of the Independent / John Rentoul.

An Evolutionary Benefit to Self-deception

From Scientific American:

We lie to ourselves all the time. We tell ourselves that we are better than average — that we are more moral, more capable, less likely to become sick or suffer an accident. It’s an odd phenomenon, and an especially puzzling one to those who think about our evolutionary origins. Self-deception is so pervasive that it must confer some advantage. But how could we be well served by a brain that deceives us? This is one of the topics tackled by Robert Trivers in his new book, “The Folly of Fools,” a colorful survey of deception that includes plane crashes, neuroscience and the transvestites of the animal world. He answered questions from Mind Matters editor Gareth Cook.

Cook: Do you have any favorite examples of deception in the natural world?
Trivers: Tough call. They are so numerous, intricate and bizarre. But you can hardly beat female mimics for general interest. These are males that mimic females in order to achieve closeness to a territory-holding male, who then attracts a real female ready to lay eggs. The territory-holding male imagines that he is in bed (so to speak) with two females, when really he is in bed with one female and another male, who, in turn, steals part of the paternity of the eggs being laid by the female. The internal dynamics of such transvestite threesomes are only just being analyzed. But for pure reproductive artistry one cannot beat the tiny blister beetles that assemble in arrays of hundreds to thousands, linking together to produce the larger illusion of a solitary female bee, which attracts a male bee who flies into the mirage in order to copulate and thereby carries the beetles to their next host.

Cook: At what age do we see the first signs of deception in humans?
Trivers: In the last trimester of pregnancy, that is, while the offspring is still inside its mother. The baby takes over control of the mother’s blood sugar level (raising it), pulse rate (raising it) and blood distribution (withdrawing it from extremities and positioning it above the developing baby). It does so by putting into the maternal blood stream the same chemicals—or close mimics—as those that the mother normally produces to control these variables. You could argue that this benefits mom. She says, my child knows better what it needs than I do so let me give the child control. But it is not in the mother’s best interests to allow the offspring to get everything it wants; the mother must apportion her biological investment among other offspring, past, present and future. The proof is in the inefficiency of the new arrangement, the hallmark of conflict. The offspring produces these chemicals at 1000 times the level that the mother does. This suggests a co-evolutionary struggle in which the mother’s body becomes deafer as the offspring becomes louder.
After birth, the first clear signs of deception come at about age 6 months, when the child fakes need where there appears to be no good reason. The child will scream and bawl, roll on the floor in apparent agony and yet stop within seconds after the audience leaves the room, only to resume within seconds when the audience is back. Later, the child will hide objects from the view of others and deny that it cares about a punishment when it clearly does. So-called ‘white lies’, of the sort “The meal you served was delicious,” appear after age 5.

Read the entire article here.