Tag Archives: intelligence

Dishonesty and Intelligence

Another day, another survey. This time it’s one that links honesty and intelligence. Apparently, the more intelligent you are — as measured by a quick intelligence test — the less likely you are to lie. Fascinatingly, the survey also shows that those in the most intelligent subgroup who do lie tell smaller whoppers, while those in the less intelligent subgroup tell bigger lies, for a bigger payoff.

From the Washington Post:

Last summer, a couple of researchers ran a funny experiment about honesty. They went to an Israeli shopping mall and recruited people, one-by-one, into a private booth. Alone inside the booth, each subject rolled a six-sided die. Then they stepped out and reported the number that came up.

There was an incentive to lie. The higher the number, the more money people received. If they rolled a one, they got a bonus of about $2.50. If they rolled a two, they got a bonus of $5, and so on. If they rolled a six, the bonus was about $15. (Everyone also received $5 just for participating.)

Before I reveal the results, think about what you would do in that situation. Someone comes up to you at the mall and offers you free money to roll a die. If you wanted to make a few extra bucks, you could lie about what you rolled. Nobody would know, and nobody would be harmed.

Imagine you went into that booth and rolled a one. What would you do? Would you be dishonest? Would you say you rolled a six, just to get the largest payout?

The researchers, Bradley Ruffle of Wilfrid Laurier University and Yossef Tobol of the Jerusalem College of Technology, wanted to know what kinds of people would lie in this situation. So they asked everyone about their backgrounds, whether they considered themselves honest, whether they thought honesty was important. They asked whether people were employed, how much money they earned, and whether they were religious. They also gave people a quick intelligence test.

Out of all those attributes, brainpower stood out. Smarter people were less likely to lie about the number they rolled.

It didn’t matter whether they claimed they were honest or not; it didn’t matter whether they were religious, whether they were male or female, or whether they lived in a city. Money didn’t seem to be a factor either. Even after controlling for incomes, the researchers found that the most honest people were the ones who scored highest on the intelligence test.
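The statistical logic behind the experiment is worth spelling out: no single report can be called a lie, but a crowd of honest rollers must average about 3.5 pips and report sixes about one time in six. Here is a minimal Python sketch of that idea; the `liar_fraction` parameter is a hypothetical knob for illustration, not a figure from the study.

```python
import random

random.seed(0)

def reported_rolls(n, liar_fraction):
    """Simulate n participants; a 'liar' always claims the top payout."""
    reports = []
    for _ in range(n):
        roll = random.randint(1, 6)      # a fair six-sided die
        if random.random() < liar_fraction:
            reports.append(6)            # dishonest report: claim a six
        else:
            reports.append(roll)         # honest report
    return reports

honest = reported_rolls(100_000, 0.0)
cheaters = reported_rolls(100_000, 0.3)

# An honest crowd averages close to 3.5; lying shifts the mean upward
# and inflates the share of reported sixes above 1/6 (about 0.167).
print(sum(honest) / len(honest))           # close to 3.5
print(sum(cheaters) / len(cheaters))       # noticeably above 3.5
print(cheaters.count(6) / len(cheaters))   # well above 1/6
```

This is how such studies detect dishonesty in aggregate even though each individual claim stays private.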

Read the entire article here.

Focus on Process, Not Perfect Grades

If you are a parent of a school-age child, it is highly likely that you have, on multiple occasions, chastised him or her and withheld privileges for poor grades. It’s also likely that you have rewarded the same child for being smart at math or for having Picasso-like artistic talent. I have done this myself. But there is a better way to nurture young minds, and it is through “telling stories about achievements that result from hard work.”

From Scientific American:

A brilliant student, Jonathan sailed through grade school. He completed his assignments easily and routinely earned As. Jonathan puzzled over why some of his classmates struggled, and his parents told him he had a special gift. In the seventh grade, however, Jonathan suddenly lost interest in school, refusing to do homework or study for tests. As a consequence, his grades plummeted. His parents tried to boost their son’s confidence by assuring him that he was very smart. But their attempts failed to motivate Jonathan (who is a composite drawn from several children). Schoolwork, their son maintained, was boring and pointless.

Our society worships talent, and many people assume that possessing superior intelligence or ability—along with confidence in that ability—is a recipe for success. In fact, however, more than 35 years of scientific investigation suggests that an overemphasis on intellect or talent leaves people vulnerable to failure, fearful of challenges and unwilling to remedy their shortcomings.

The result plays out in children like Jonathan, who coast through the early grades under the dangerous notion that no-effort academic achievement defines them as smart or gifted. Such children hold an implicit belief that intelligence is innate and fixed, making striving to learn seem far less important than being (or looking) smart. This belief also makes them see challenges, mistakes and even the need to exert effort as threats to their ego rather than as opportunities to improve. And it causes them to lose confidence and motivation when the work is no longer easy for them.

Praising children’s innate abilities, as Jonathan’s parents did, reinforces this mind-set, which can also prevent young athletes or people in the workforce and even marriages from living up to their potential. On the other hand, our studies show that teaching people to have a “growth mind-set,” which encourages a focus on “process” (consisting of personal effort and effective strategies) rather than on intelligence or talent, helps make them into high achievers in school and in life.

The Opportunity of Defeat
I first began to investigate the underpinnings of human motivation—and how people persevere after setbacks—as a psychology graduate student at Yale University in the 1960s. Animal experiments by psychologists Martin Seligman, Steven Maier and Richard Solomon, all then at the University of Pennsylvania, had shown that after repeated failures, most animals conclude that a situation is hopeless and beyond their control. After such an experience, the researchers found, an animal often remains passive even when it can effect change—a state they called learned helplessness.

People can learn to be helpless, too, but not everyone reacts to setbacks this way. I wondered: Why do some students give up when they encounter difficulty, whereas others who are no more skilled continue to strive and learn? One answer, I soon discovered, lay in people’s beliefs about why they had failed.

In particular, attributing poor performance to a lack of ability depresses motivation more than does the belief that lack of effort is to blame. In 1972, when I taught a group of elementary and middle school children who displayed helpless behavior in school that a lack of effort (rather than lack of ability) led to their mistakes on math problems, the kids learned to keep trying when the problems got tough. They also solved many more problems even in the face of difficulty. Another group of helpless children who were simply rewarded for their success on easier problems did not improve their ability to solve hard math problems. These experiments were an early indication that a focus on effort can help resolve helplessness and engender success.

Subsequent studies revealed that the most persistent students do not ruminate about their own failure much at all but instead think of mistakes as problems to be solved. At the University of Illinois in the 1970s I, along with my then graduate student Carol Diener, asked 60 fifth graders to think out loud while they solved very difficult pattern-recognition problems. Some students reacted defensively to mistakes, denigrating their skills with comments such as “I never did have a good rememory,” and their problem-solving strategies deteriorated.

Others, meanwhile, focused on fixing errors and honing their skills. One advised himself: “I should slow down and try to figure this out.” Two schoolchildren were particularly inspiring. One, in the wake of difficulty, pulled up his chair, rubbed his hands together, smacked his lips and said, “I love a challenge!” The other, also confronting the hard problems, looked up at the experimenter and approvingly declared, “I was hoping this would be informative!” Predictably, the students with this attitude outperformed their cohorts in these studies.

Read the entire article here.

Pretending to be Smart

Have you ever taken a date to a cerebral movie or the opera? Have you ever taken a classic work of literature to read at the beach? If so, you are not alone. But why are you doing it?

From the Telegraph:

Men try to impress their friends almost twice as much as women do by quoting Shakespeare and pretending to like jazz to seem more clever.

A fifth of all adults admitted they have tried to impress others by making out they are more cultured than they really are, but this rises to 41 per cent in London.

Scotland is the least pretentious country as only 14 per cent of the 1,000 UK adults surveyed had faked their intelligence there, according to Ask Jeeves research.

Typical methods of trying to seem cleverer included deliberately reading a ‘serious’ novel on the beach, passing off other people’s witty remarks as one’s own and talking loudly about politics in front of others.

Two thirds put on the pretensions for friends, while 36 per cent did it to seem smarter in their workplace and 32 per cent tried to impress a potential partner.

One in five swapped their usual holiday read for something more serious on the beach and one in four went to an art gallery to look more cultured.

When it came to music tastes, 20 per cent have pretended to prefer Beethoven to Beyonce and many have referenced operas they have never seen.

A spokesman for Ask Jeeves said: “We were surprised by just how many people think they should go to such lengths in order to impress someone else.

“They obviously think they will make a better impression if they pretend to like Beethoven rather than admit they listen to Beyonce or read The Spectator rather than Loaded.

“Social media and the internet mean it is increasingly easy to present this kind of false image about themselves.

“But in the end, if they are really going to be liked then it is going to be for the person they really are rather than the person they are pretending to be.”

Social media also plays a large part with people sharing Facebook posts on politics or re-tweeting clever tweets to raise their intellectual profile.

Men were the biggest offenders, with 26 per cent of men admitting to the acts of pretence compared to 14 per cent of women.

Top things people have done to seem smarter:

Repeated someone else’s joke as your own

Gone to an art gallery

Listened to classical music in front of others

Read a ‘serious’ book on the beach

Re-tweeted a clever tweet

Talked loudly about politics in front of others

Read a ‘serious’ magazine on public transport

Shared an intellectual article on Facebook

Quoted Shakespeare

Pretended to know about wine

Worn glasses with clear lenses

Mentioned an opera you’d ‘seen’

Pretended to like jazz

Read the entire article here.

Image: Opera. Courtesy of the New York Times.

The Benefits of Human Stupidity

Human intelligence is a wonderful thing. At both the individual and collective level it drives our complex communication, our fundamental discoveries and inventions, and our impressive and accelerating progress. Intelligence allows us to innovate, to design, to build; and it underlies our superior capacity, over other animals, for empathy, altruism, art, and social and cultural evolution. Yet, despite our intellectual abilities and seemingly limitless potential, we humans still do lots of stupid things. Why is this?

From New Scientist:

“EARTH has its boundaries, but human stupidity is limitless,” wrote Gustave Flaubert. He was almost unhinged by the fact. Colourful fulminations about his fatuous peers filled his many letters to Louise Colet, the French poet who inspired his novel Madame Bovary. He saw stupidity everywhere, from the gossip of middle-class busybodies to the lectures of academics. Not even Voltaire escaped his critical eye. Consumed by this obsession, he devoted his final years to collecting thousands of examples for a kind of encyclopedia of stupidity. He died before his magnum opus was complete, and some attribute his sudden death, aged 58, to the frustration of researching the book.

Documenting the extent of human stupidity may itself seem a fool’s errand, which could explain why studies of human intellect have tended to focus on the high end of the intelligence spectrum. And yet, the sheer breadth of that spectrum raises many intriguing questions. If being smart is such an overwhelming advantage, for instance, why aren’t we all uniformly intelligent? Or are there drawbacks to being clever that sometimes give slower thinkers the upper hand? And why are even the smartest people prone to – well, stupidity?

It turns out that our usual measures of intelligence – particularly IQ – have very little to do with the kind of irrational, illogical behaviours that so enraged Flaubert. You really can be highly intelligent, and at the same time very stupid. Understanding the factors that lead clever people to make bad decisions is beginning to shed light on many of society’s biggest catastrophes, including the recent economic crisis. More intriguingly, the latest research may suggest ways to evade a condition that can plague us all.

The idea that intelligence and stupidity are simply opposing ends of a single spectrum is a surprisingly modern one. The Renaissance theologian Erasmus painted Folly – or Stultitia in Latin – as a distinct entity in her own right, descended from the god of wealth and the nymph of youth; others saw it as a combination of vanity, stubbornness and imitation. It was only in the middle of the 18th century that stupidity became conflated with mediocre intelligence, says Matthijs van Boxsel, a Dutch historian who has written many books about stupidity. “Around that time, the bourgeoisie rose to power, and reason became a new norm with the Enlightenment,” he says. “That put every man in charge of his own fate.”

Modern attempts to study variations in human ability tended to focus on IQ tests that put a single number on someone’s mental capacity. They are perhaps best recognised as a measure of abstract reasoning, says psychologist Richard Nisbett at the University of Michigan in Ann Arbor. “If you have an IQ of 120, calculus is easy. If it’s 100, you can learn it but you’ll have to be motivated to put in a lot of work. If your IQ is 70, you have no chance of grasping calculus.” The measure seems to predict academic and professional success.

Various factors will determine where you lie on the IQ scale. Possibly a third of the variation in our intelligence is down to the environment in which we grow up – nutrition and education, for example. Genes, meanwhile, contribute more than 40 per cent of the differences between two people.

These differences may manifest themselves in our brain’s wiring. Smarter brains seem to have more efficient networks of connections between neurons. That may determine how well someone is able to use their short-term “working” memory to link disparate ideas and quickly access problem-solving strategies, says Jennie Ferrell, a psychologist at the University of the West of England in Bristol. “Those neural connections are the biological basis for making efficient mental connections.”

This variation in intelligence has led some to wonder whether superior brain power comes at a cost – otherwise, why haven’t we all evolved to be geniuses? Unfortunately, evidence is in short supply. For instance, some proposed that depression may be more common among more intelligent people, leading to higher suicide rates, but no studies have managed to support the idea. One of the only studies to report a downside to intelligence found that soldiers with higher IQs were more likely to die during the second world war. The effect was slight, however, and other factors might have skewed the data.

Intellectual wasteland

Alternatively, the variation in our intelligence may have arisen from a process called “genetic drift”, after human civilisation eased the challenges driving the evolution of our brains. Gerald Crabtree at Stanford University in California is one of the leading proponents of this idea. He points out that our intelligence depends on around 2000 to 5000 constantly mutating genes. In the distant past, people whose mutations had slowed their intellect would not have survived to pass on their genes; but Crabtree suggests that as human societies became more collaborative, slower thinkers were able to piggyback on the success of those with higher intellect. In fact, he says, someone plucked from 1000 BC and placed in modern society, would be “among the brightest and most intellectually alive of our colleagues and companions” (Trends in Genetics, vol 29, p 1).

This theory is often called the “idiocracy” hypothesis, after the eponymous film, which imagines a future in which the social safety net has created an intellectual wasteland. Although it has some supporters, the evidence is shaky. We can’t easily estimate the intelligence of our distant ancestors, and the average IQ has in fact risen slightly in the immediate past. At the very least, “this disproves the fear that less intelligent people have more children and therefore the national intelligence will fall”, says psychologist Alan Baddeley at the University of York, UK.

In any case, such theories on the evolution of intelligence may need a radical rethink in the light of recent developments, which have led many to speculate that there are more dimensions to human thinking than IQ measures. Critics have long pointed out that IQ scores can easily be skewed by factors such as dyslexia, education and culture. “I would probably soundly fail an intelligence test devised by an 18th-century Sioux Indian,” says Nisbett. Additionally, people with scores as low as 80 can still speak multiple languages and even, in the case of one British man, engage in complex financial fraud. Conversely, high IQ is no guarantee that a person will act rationally – think of the brilliant physicists who insist that climate change is a hoax.

It was this inability to weigh up evidence and make sound decisions that so infuriated Flaubert. Unlike the French writer, however, many scientists avoid talking about stupidity per se – “the term is unscientific”, says Baddeley. However, Flaubert’s understanding that profound lapses in logic can plague the brightest minds is now getting attention. “There are intelligent people who are stupid,” says Dylan Evans, a psychologist and author who studies emotion and intelligence.

Read the entire article after the jump.

Helplessness and Intelligence Go Hand in Hand

From the Wall Street Journal:

Why are children so, well, so helpless? Why did I spend a recent Sunday morning putting blueberry pancake bits on my 1-year-old grandson’s fork and then picking them up again off the floor? And why are toddlers most helpless when they’re trying to be helpful? Augie’s vigorous efforts to sweep up the pancake detritus with a much-too-large broom (“I clean!”) were adorable but not exactly effective.

This isn’t just a caregiver’s cri de coeur—it’s also an important scientific question. Human babies and young children are an evolutionary paradox. Why must big animals invest so much time and energy just keeping the little ones alive? This is especially true of our human young, helpless and needy for far longer than the young of other primates.

One idea is that our distinctive long childhood helps to develop our equally distinctive intelligence. We have both a much longer childhood and a much larger brain than other primates. Restless humans have to learn about more different physical environments than stay-at-home chimps, and with our propensity for culture, we constantly create new social environments. Childhood gives us a protected time to master new physical and social tools, from a whisk broom to a winning comment, before we have to use them to survive.

The usual museum diorama of our evolutionary origins features brave hunters pursuing a rearing mammoth. But a Pleistocene version of the scene in my kitchen, with ground cassava roots instead of pancakes, might be more accurate, if less exciting.

Of course, many scientists are justifiably skeptical about such “just-so stories” in evolutionary psychology. The idea that our useless babies are really useful learners is appealing, but what kind of evidence could support (or refute) it? There’s still controversy, but two recent studies at least show how we might go about proving the idea empirically.

One of the problems with much evolutionary psychology is that it just concentrates on humans, or sometimes on humans and chimps. To really make an evolutionary argument, you need to study a much wider variety of animals. Is it just a coincidence that we humans have both needy children and big brains? Or will we find the same evolutionary pattern in animals who are very different from us? In 2010, Vera Weisbecker of Cambridge University and a colleague found a correlation between brain size and dependence across 52 different species of marsupials, from familiar ones like kangaroos and opossums to more exotic ones like quokkas.

Quokkas are about the same size as Virginia opossums, but baby quokkas nurse for three times as long, their parents invest more in each baby, and their brains are twice as big.

Read the entire article after the jump.

Intelligenetics

Intelligenetics isn’t recognized as a real word by Webster’s or the Oxford English Dictionary. We just coined the term because it might best represent the growing field of research examining the genetic basis of human intelligence. Of course, it’s not a new subject, and it comes with many cautionary tales. Past research into the genetic foundations of intelligence has often been misused by one group seeking racial, ethnic or political power over another. However, with strong and appropriate safeguards in place, science does have a legitimate place in uncovering what makes some brains excel while others do not.

From the Wall Street Journal:

At a former paper-printing factory in Hong Kong, a 20-year-old wunderkind named Zhao Bowen has embarked on a challenging and potentially controversial quest: uncovering the genetics of intelligence.

Mr. Zhao is a high-school dropout who has been described as China’s Bill Gates. He oversees the cognitive genomics lab at BGI, a private company that is partly funded by the Chinese government.

At the Hong Kong facility, more than 100 powerful gene-sequencing machines are deciphering about 2,200 DNA samples, reading off their 3.2 billion chemical base pairs one letter at a time. These are no ordinary DNA samples. Most come from some of America’s brightest people—extreme outliers in the intelligence sweepstakes.

The majority of the DNA samples come from people with IQs of 160 or higher. By comparison, average IQ in any population is set at 100. The average Nobel laureate registers at around 145. Only one in every 30,000 people is as smart as most of the participants in the Hong Kong project—and finding them was a quest of its own.

“People have chosen to ignore the genetics of intelligence for a long time,” said Mr. Zhao, who hopes to publish his team’s initial findings this summer. “People believe it’s a controversial topic, especially in the West. That’s not the case in China,” where IQ studies are regarded more as a scientific challenge and therefore are easier to fund.

The roots of intelligence are a mystery. Studies show that at least half of the variation in intelligence quotient, or IQ, is inherited. But while scientists have identified some genes that can significantly lower IQ—in people afflicted with mental retardation, for example—truly important genes that affect normal IQ variation have yet to be pinned down.

The Hong Kong researchers hope to crack the problem by comparing the genomes of super-high-IQ individuals with the genomes of people drawn from the general population. By studying the variation in the two groups, they hope to isolate some of the hereditary factors behind IQ.

Their conclusions could lay the groundwork for a genetic test to predict a person’s inherited cognitive ability. Such a tool could be useful, but it also might be divisive.

“If you can identify kids who are going to have trouble learning, you can intervene” early on in their lives, through special schooling or other programs, says Robert Plomin, a professor of behavioral genetics at King’s College, London, who is involved in the BGI project.

Read the entire article following the jump.

Building Character in Kids

Many parents have known this for a long time: it takes more than a stellar IQ, SAT or ACT score to make a well-rounded kid. Arguably there are many more important traits that never feature on these quantitative tests. Such qualities as leadership, curiosity, initiative, perseverance, motivation, courage and empathy come to mind.

Below is an excerpt from Paul Tough’s book, “How Children Succeed: Grit, Curiosity and the Hidden Power of Character.”

From the Wall Street Journal:

We are living through a particularly anxious moment in the history of American parenting. In the nation’s big cities these days, the competition among affluent parents over slots in favored preschools verges on the gladiatorial. A pair of economists from the University of California recently dubbed this contest for early academic achievement the “Rug Rat Race,” and each year, the race seems to be starting earlier and growing more intense.

At the root of this parental anxiety is an idea you might call the cognitive hypothesis. It is the belief, rarely spoken aloud but commonly held nonetheless, that success in the U.S. today depends more than anything else on cognitive skill—the kind of intelligence that gets measured on IQ tests—and that the best way to develop those skills is to practice them as much as possible, beginning as early as possible.

There is something undeniably compelling about the cognitive hypothesis. The world it describes is so reassuringly linear, such a clear case of inputs here leading to outputs there. Fewer books in the home means less reading ability; fewer words spoken by your parents means a smaller vocabulary; more math work sheets for your 3-year-old means better math scores in elementary school. But in the past decade, and especially in the past few years, a disparate group of economists, educators, psychologists and neuroscientists has begun to produce evidence that calls into question many of the assumptions behind the cognitive hypothesis.

What matters most in a child’s development, they say, is not how much information we can stuff into her brain in the first few years of life. What matters, instead, is whether we are able to help her develop a very different set of qualities, a list that includes persistence, self-control, curiosity, conscientiousness, grit and self-confidence. Economists refer to these as noncognitive skills, psychologists call them personality traits, and the rest of us often think of them as character.

If there is one person at the hub of this new interdisciplinary network, it is James Heckman, an economist at the University of Chicago who in 2000 won the Nobel Prize in economics. In recent years, Mr. Heckman has been convening regular invitation-only conferences of economists and psychologists, all engaged in one form or another with the same questions: Which skills and traits lead to success? How do they develop in childhood? And what kind of interventions might help children do better?

The transformation of Mr. Heckman’s career has its roots in a study he undertook in the late 1990s on the General Educational Development program, better known as the GED, which was at the time becoming an increasingly popular way for high-school dropouts to earn the equivalent of high-school diplomas. The GED’s growth was founded on a version of the cognitive hypothesis, on the belief that what schools develop, and what a high-school diploma certifies, is cognitive skill. If a teenager already has the knowledge and the smarts to graduate from high school, according to this logic, he doesn’t need to waste his time actually finishing high school. He can just take a test that measures that knowledge and those skills, and the state will certify that he is, legally, a high-school graduate, as well-prepared as any other high-school graduate to go on to college or other postsecondary pursuits.

Mr. Heckman wanted to examine this idea more closely, so he analyzed a few large national databases of student performance. He found that in many important ways, the premise behind the GED was entirely valid. According to their scores on achievement tests, GED recipients were every bit as smart as high-school graduates. But when Mr. Heckman looked at their path through higher education, he found that GED recipients weren’t anything like high-school graduates. At age 22, Mr. Heckman found, just 3% of GED recipients were either enrolled in a four-year university or had completed some kind of postsecondary degree, compared with 46% of high-school graduates. In fact, Heckman discovered that when you consider all kinds of important future outcomes—annual income, unemployment rate, divorce rate, use of illegal drugs—GED recipients look exactly like high-school dropouts, despite the fact that they have earned this supposedly valuable extra credential, and despite the fact that they are, on average, considerably more intelligent than high-school dropouts.

These results posed, for Mr. Heckman, a confounding intellectual puzzle. Like most economists, he had always believed that cognitive ability was the single most reliable determinant of how a person’s life would turn out. Now he had discovered a group—GED holders—whose good test scores didn’t seem to have any positive effect on their eventual outcomes. What was missing from the equation, Mr. Heckman concluded, were the psychological traits, or noncognitive skills, that had allowed the high-school graduates to make it through school.

So what can parents do to help their children develop skills like motivation and perseverance? The reality is that when it comes to noncognitive skills, the traditional calculus of the cognitive hypothesis—start earlier and work harder—falls apart. Children can’t get better at overcoming disappointment just by working at it for more hours. And they don’t lag behind in curiosity simply because they didn’t start doing curiosity work sheets at an early enough age.

Read the entire article after the jump.