This post’s title belongs to the great physicist and bongo player Richard Feynman. It brings into sharp relief one of the many challenges in our current fractured political discourse — that objective fact has become just another political tool, and scientific denialism is now worn as a badge of honor by many politicians (mostly on the right).
Climate science is a great example of the chasm between rational debate and established facts on the one hand, and anti-science conspiracy mythologists [I’m still searching for a better word] on the other. Some climate deniers simply wave away the evidence as nothing but regular weather. Others pronounce that climate change is a plot by the Chinese.
I firmly believe in the scientific method and objective fact; the progress we have witnessed over the last 150 or so years, thanks to science and scientists, is spectacular. Long may it continue. Yet, as Scientific American tells us, we need to be alarmed and remain vigilant — it wouldn’t take much effort to return to the Dark Ages.
From Scientific American:
Four years ago in these pages, writer Shawn Otto warned our readers of the danger of a growing antiscience current in American politics. “By turning public opinion away from the antiauthoritarian principles of the nation’s founders,” Otto wrote, “the new science denialism is creating an existential crisis like few the country has faced before.”
Otto wrote those words in the heat of a presidential election race that now seems quaint by comparison to the one the nation now finds itself in. As if to prove his point, one of the two major party candidates for the highest office in the land has repeatedly and resoundingly demonstrated a disregard, if not outright contempt, for science. Donald Trump also has shown an authoritarian tendency to base policy arguments on questionable assertions of fact and a cult of personality.
Americans have long prided themselves on their ability to see the world for what it is, as opposed to what someone says it is or what most people happen to believe. In one of the most powerful lines in American literature, Huck Finn says: “It warn’t so. I tried it.” A respect for evidence is not just a part of the national character. It goes to the heart of the country’s particular brand of democratic government. When the founding fathers, including Benjamin Franklin, scientist and inventor, wrote arguably the most important line in the Declaration of Independence—“We hold these truths to be self-evident”—they were asserting the fledgling nation’s grounding in the primacy of reason based on evidence.
Evolutionary biologist Richard Dawkins sprang to the public’s attention via his immensely popular book The Selfish Gene. Since its publication almost 40 years ago, its author has assumed the unofficial mantle of Atheist-In-Chief. His passionate and impatient defense — some would call it crusading offense — of all things godless has rubbed many the wrong way, including numerous unbelievers. That said, his reasoning remains crystal clear and his focus laser-like. I just wish he would stay away from Twitter.
In Dublin, not long ago, Richard Dawkins visited a steakhouse called Darwin’s. He was in town to give a talk on the origins of life at Trinity College with the American physicist Lawrence Krauss. In the restaurant, a large model gorilla squatted in a corner and a series of sepia paintings of early man hung in the dining room – though, Dawkins pointed out, not quite in the right chronological order. A space by the bar had been refitted to resemble the interior of the Beagle, the vessel on which Charles Darwin sailed to South America in 1831 and conceived his theory of natural selection. “Oh look at this!” Dawkins said, examining the decor. “It’s terrific! Oh, wonderful.”
Over the years, Dawkins, a zoologist by training, has expressed admiration for Darwin in the way a schoolboy might worship a sporting giant. In his first memoir, Dawkins noted the “serendipitous realisation” that his full name – Clinton Richard Dawkins – shared the same initials as Charles Robert Darwin. He owns a prized first edition of On The Origin of Species, which he can quote from memory. For Dawkins, the book is totemic, the founding text of his career. “It’s such a thorough, unanswerable case,” he said one afternoon. “[Darwin] called it one long argument.” As a description of Dawkins’s own life, particularly its late phase, “one long argument” serves fairly well. As the global face of atheism over the last decade, Dawkins has ratcheted up the rhetoric in his self-declared war against religion. He is the general who chooses to fight on the front line – whose scorched-earth tactics have won him fervent admirers, and ferocious enemies. What is less clear, however, is whether he is winning.
Over dinner – chicken for Dawkins, steak for everyone else – he spoke little. He was anxious to leave early in order to discuss the format of the event with Krauss. Though Dawkins gives a talk roughly once a fortnight, he still obsessively overprepares. On this occasion, there was no need – he and Krauss had put on a similar show the night before at the University of Ulster in Belfast. They had also appeared on a radio talkshow, during which they had attempted to debate a creationist (an “idiot”, in Dawkins’s terminology). “She simply tried to shout down everything Lawrence and I said. So she was in effect going la la la la la.” Dawkins stuck his fingers in his ears as he sang.
Krauss and Dawkins have toured frequently as a double act, partners in a global quest to broadcast the wonder of science and the nonexistence of God. Dawkins has been on this mission ever since 1976, when he published The Selfish Gene, the book that made him famous, which has now sold over a million copies. Since then, he has written another 10 influential books on science and evolution, plus The God Delusion, his atheist blockbuster, and become the most prominent of the so-called New Atheists – a group of writers, including Christopher Hitchens and Sam Harris, who published anti-religion polemics in the years after 9/11.
An hour or so after dinner, the Burke Theatre in Trinity College, a large modern lecture hall with banked seating, was full. After separate presentations, Krauss and Dawkins conversed freely, swapping ideas on the origins of life. As he spoke, Dawkins took on a grandfatherly air, as though passing on hard-earned wisdom. He has always sought to inject beauty into biology, and his voice wavered with emotion as he shifted from dry fact to lyrical metaphor.
Dawkins has the stately confidence of one who has spent half a life behind a lectern. He has aged well, thanks to the determined jaw and carved cheekbones of a 1950s matinee idol. His hair remains in the style that has served him for 70 years, a lopsided sweep. A prominent brow and hawkish stare give him a look of constant urgency, as though he is waiting for everyone to catch up. In Dublin, his outfit was academic-on-tour: jacket, woolly jumper and tie, one of a collection hand-painted by his wife, Lalla Ward, which depict penguins, fish, birds of prey.
At the end of the Trinity event, a crowd of about 40 audience members descended on to the stage, clutching books to be signed. Dawkins eventually retreated into the wings to avoid a crush. One young schoolteacher lingered in the hallway long after the rest of the audience had left, in the hope of shaking Dawkins’s hand. Earlier that day, Dawkins had expressed bewilderment at his own celebrity. “I find the epidemic of selfies disconcerting,” he said. “It’s always, ‘one quick photo.’ One quick. But it never is.” Though he is used to receiving a steady flow of letters from fans of The God Delusion and new converts to atheism, he does not perceive himself as a figurehead. “I don’t need to say if I think of myself as a leader,” he said a few weeks later. “I simply need to say the book has sold three million copies.”
Dawkins turned 74 in March this year. To celebrate, he had dinner with Ward at Cherwell Boathouse, a smart restaurant overlooking the river in Oxford; the occasion was marred only slightly by a loud-voiced fellow diner, Dawkins recalled, “who quacked like Donald Duck”. An academic of his eminence could, by now, have eased into a distinguished late period: more books, the odd speech, master of an Oxford college, a gentle tending to his legacy. Though he is in a retrospective phase – one memoir published, a second on its way later this year – peaceful retreat from public life has not been the Dawkins way. “Some people might say why don’t you just get on with gardening,” he said. “I think [there’s a] passion for truth and a passion for justice that doesn’t allow me to do that.”
Instead, Dawkins remains indefatigably active. He rarely takes a holiday, but travels frequently to give talks – in the last four months he has been to Ireland, the Czech Republic, Bulgaria and Brazil. Though he says he prefers to speak about science, God inevitably looms. “I suppose some of what I do is an attempt to change people’s minds about religion,” he said, with some understatement, between events in Ireland. “And I do think that’s a politically important thing to be doing.” For Dawkins, who describes his own politics as “vaguely left”, this means a concern for the state of the world, and a desire, ultimately, to eradicate religion from society. In his mission, Dawkins is still, at heart, a teacher. “I would like to leave the world a better place,” he said. “I like to think my science books have had a positive educational effect, but I also want to leave the world a better place in influencing opinion in other fields where there is illogic, obscurantism, pretension.” Religious faith, for Dawkins, is above all a sign of faulty thinking, of ignorance; he wants to educate the ill-informed out of their mistakes. He sees religion, as he once put it on Twitter, as “an organised licence to be acceptably stupid”.
The two strands of Dawkins’s mission – promoting science, demolishing religion – are intended to be complementary. “If they are antagonistic to each other, that would be regrettable,” he said, “but I don’t see why they should be.” But antagonism is part of Dawkins’s daily life. “I suppose some of the passions that I show are more appropriate to a young man than somebody of my age.” Since his arrival on Twitter in 2008, his public pronouncements have become more combative – and, at times, flamboyantly irritable: “How dare you force your dopey unsubstantiated superstitions on innocent children too young to resist?,” he tweeted last June. “How DARE you?”
A recent popular refrain from politicians in the US is “I am not a scientist”. This is code, mostly from the mouths of Republicans, for a train of thought that goes something like this:
1. Facts discovered through the scientific method are nothing more than opinion.
2. However, my beliefs are fact.
3. Hence, anything that is explained through science is wrong.
4. Thus, case closed.
Those who would have us believe that climate change is an illusion now take cover behind this quaint “I am not a scientist” phrase, and in so doing are able to dodge questions of any consequence. So, it’s good to hear potential Republican presidential candidate Scott Walker toe the party line recently by telling us that he’s no scientist and “punting” (aka ignoring) on questions of climate change. This on the same day that NASA, Cornell and Columbia warn that global warming is likely to bring severe, multi-decade megadroughts — the worst in a thousand years — to the central and southwestern US in our children’s lifetimes.
The optimist in me hopes that when my children come of age they will elect politicians who are scientists or leaders who accept the scientific method. Please. It’s time to ditch Flat Earthers, creationists and “believers”. It’s time to shun those who shun critical thinking, reason and evidence. It’s time to move beyond those who merely say anything or nothing to get elected.
From ars technica:
Given that February 12 would be Charles Darwin’s 206th birthday, having people spare some thought for the theory of evolution doesn’t seem outrageously out of place this week. But, for a US politician visiting London, a question on the matter was clearly unwelcome.
Scott Walker, governor of Wisconsin and possible presidential candidate, was obviously hoping for a chance to have a few experiences that would make him seem more credible on the foreign policy scene. But the host of a British TV show asked some questions that, for many in the US, touch on matters of personal belief and the ability to think critically: “Are you comfortable with the idea of evolution? Do you believe in it? Do you accept it?” (A video that includes these questions along with extensive commentary is available here.)
Walker, rather than oblige his host, literally answered that he was going to dodge the question, saying, “For me, I’m going to punt on that one as well. That’s a question a politician shouldn’t be involved in one way or another.”
“Punting,” for those not up on their sports metaphors, is a means of tactically giving up. When a football team punts, it hands the other team control of the ball but prevents a variety of worse situations from developing.
In some ways, this is an improvement for a politician. When it comes to climate change, many politicians perform a dodge by saying “I’m not a scientist” and then proceed to make stupid pronouncements about the state of science. Here, Walker didn’t make any statements whatsoever.
So, that’s a step up from excusing stupidity. But is this really a question that should be punted? To begin with, Walker may not feel it’s a question a politician should be involved with, but plenty of other politicians clearly do, some going so far as to interfere in science education. At a minimum, punting meant Walker passed on an opportunity to explain why he feels those efforts are misguided and why his stand is more principled.
But, much more realistically, Walker is punting not because he feels the question shouldn’t be answered by politicians, but because he sees lots of political downsides to answering. Politicians had been getting hit with the evolution question since at least 2007, and our initial analysis of it still stands. If you agree with over a century of scientific exploration, you run the risk of alienating a community that has established itself as a reliable contributor of votes to Republican politicians such as Walker. We could see why he would want to avoid that.
Saying you refuse to accept evolution raises valid questions about your willingness to analyze evidence rationally and accept the opinions of people with expertise in a topic. Either that, or it suggests you’re willing to say anything in order to improve your chances of being elected. But punting is effectively the same thing—it suggests you’ll avoid saying anything in order to improve your chances of being elected.
Completing an annual tax return, and sending even more hard-earned cash to the government, is not much fun for anyone. So, it’s no surprise that many people procrastinate. In the UK, the organization entrusted with gathering pounds and pennies from the public is Her Majesty’s Revenue and Customs (HMRC) — the equivalent of the Internal Revenue Service (IRS) in the US.
HMRC recently released a list of the worst excuses from taxpayers for not filing their returns on time. It includes such gems as “late due to death of a pet goldfish” and “late due to run in with a cow.” This re-confirms that the British are indeed the eighth wonder of the world.
From the Telegraph:
A builder who handed in his tax return late blamed the death of his pet goldfish, while a farmer said it was the fault of an unruly cow.
A third culprit said he failed to send in his forms after becoming obsessed with an erupting volcano on the television news.
They were among thousands of excuses used by individuals and businesses last year in a bid to avoid paying a penalty for a late tax return.
But, while HM Revenue & Customs says it considers genuine explanations, it has little regard for lame excuses.
As the top ten was disclosed, officials said all had been hit with £100 fines for late returns. They had all appealed, but lost their actions.
The list was released to encourage the self-employed, and other taxpayers, to meet this year’s January 31 deadline. In all, 10.9 million people are due to file tax returns this month. The number required to fill in a self-assessment form has been inflated by changes to Child Benefit. Any household with an individual earning more than £50,000 must now complete the form if they still receive the benefit.
Ruth Owen, the director general of personal tax, said: “There will always be unforeseen events that mean a taxpayer could not file their tax return on time.
“However, your pet goldfish passing away isn’t one of them.”
The ten worst excuses:
1. My pet goldfish died (self-employed builder)
2. I had a run-in with a cow (Midlands farmer)
3. After seeing a volcanic eruption on the news, I couldn’t concentrate on anything else (London woman)
4. My wife won’t give me my mail (self-employed trader)
5. My husband told me the deadline was March 31, and I believed him (Leicester hairdresser)
6. I’ve been far too busy touring the country with my one-man play (Coventry writer)
7. My bad back means I can’t go upstairs. That’s where my tax return is (a working taxi driver)
8. I’ve been cruising round the world in my yacht, and only picking up post when I’m on dry land (South East man)
9. Our business doesn’t really do anything (Kent financial services firm)
10. I’ve been too busy submitting my clients’ tax returns (London accountant)
Imagine a nation, or even a world, where political decisions and policy are driven by science rather than emotion. Well, small experiments are underway, so this may not be as far off as many would believe, or even dare to hope.
[div class=attrib]From the New Scientist:[end-div]
In your wildest dreams, could you imagine a government that builds its policies on carefully gathered scientific evidence? One that publishes the rationale behind its decisions, complete with data, analysis and supporting arguments? Well, dream no longer: that’s where the UK is heading.
It has been a long time coming, according to Chris Wormald, permanent secretary at the Department for Education. The civil service is not short of clever people, he points out, and there is no lack of desire to use evidence properly. More than 20 years of working with serving politicians has convinced him that they are as keen as anyone to create effective policies. “I’ve never met a minister who didn’t want to know what worked,” he says. What has changed now is that informed policy-making is at last becoming a practical possibility.
That is largely thanks to the abundance of accessible data and the ease with which new, relevant data can be created. This has supported a desire to move away from hunch-based politics.
Last week, for instance, Rebecca Endean, chief scientific advisor and director of analytical services at the Ministry of Justice, announced that the UK government is planning to open up its data for analysis by academics, accelerating the potential for use in policy planning.
At the same meeting, hosted by innovation-promoting charity NESTA, Wormald announced a plan to create teaching schools based on the model of teaching hospitals. In education, he said, the biggest single problem is a culture that often relies on anecdotal experience rather than systematically reported data from practitioners, as happens in medicine. “We want to move teacher training and research and practice much more onto the health model,” Wormald said.
Test, learn, adapt
In June last year the Cabinet Office published a paper called “Test, Learn, Adapt: Developing public policy with randomised controlled trials”. One of its authors, the doctor and campaigning health journalist Ben Goldacre, has also been working with the Department of Education to compile a comparison of education and health research practices, to be published in the BMJ.
In education, the evidence-based revolution has already begun. A charity called the Education Endowment Foundation is spending £1.4 million on a randomised controlled trial of reading programmes in 50 British schools.
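For readers unfamiliar with the mechanics, the core of a randomised controlled trial is simple enough to sketch in a few lines of Python. Everything below — the score scale, the effect size, the noise levels — is invented for illustration; this is not the Education Endowment Foundation’s actual design, just the shape of the method.

```python
import random
import statistics

def run_trial(baseline_scores, treatment_effect, seed=0):
    """Randomly assign schools to treatment or control arms,
    then compare mean outcomes. All data here are simulated."""
    rng = random.Random(seed)
    schools = list(baseline_scores)
    rng.shuffle(schools)                      # the randomisation step
    half = len(schools) // 2
    treated, control = schools[:half], schools[half:]
    # Simulated outcome: baseline plus noise, plus the programme's
    # effect for the treated schools only.
    treated_out = [s + treatment_effect + rng.gauss(0, 2) for s in treated]
    control_out = [s + rng.gauss(0, 2) for s in control]
    return statistics.mean(treated_out) - statistics.mean(control_out)

# Fifty schools with baseline reading scores around 100.
rng = random.Random(1)
baselines = [100 + rng.gauss(0, 5) for _ in range(50)]
estimated_effect = run_trial(baselines, treatment_effect=3.0)
print(f"Estimated effect: {estimated_effect:+.1f}")
```

Because assignment is random, the difference in means estimates the programme’s effect rather than pre-existing differences between schools — which is the whole point of the method.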
There are reservations though. The Ministry of Justice is more circumspect about the role of such trials. Where it has carried out randomised controlled trials, they often failed to change policy, or even irked politicians with conclusions that were obvious. “It is not a panacea,” Endean says.
Power of prediction
The biggest need is perhaps foresight. Ministers often need instant answers, and sometimes the data are simply not available. Bang goes any hope of evidence-based policy.
“The timescales of policy-making and evidence-gathering don’t match,” says Paul Wiles, a criminologist at the University of Oxford and a former chief scientific adviser to the Home Office. Wiles believes that to get round this we need to predict the issues that the government is likely to face over the next decade. “We can probably come up with 90 per cent of them now,” he says.
Crucial to the process will be convincing the public about the value and use of data, so that everyone is on board. This is not going to be easy. When the government launched its Administrative Data Taskforce, which set out to examine data across all departments and open it up for use in evidence-based policy, the effort attracted minimal media interest.
The taskforce’s remit includes finding ways to increase trust in data security. Then there is the problem of whether different departments are legally allowed to exchange data. There are other practical issues: many departments format data in incompatible ways. “At the moment it’s incredibly difficult,” says Jonathan Breckon, manager of the Alliance for Useful Evidence, a collaboration between NESTA and the Economic and Social Research Council.
[div class=attrib]Read the entire article after the jump.[end-div]
With daily headlines focusing on war, terrorism, and the abuses of repressive governments, and religious leaders frequently bemoaning declining standards of public and private behavior, it is easy to get the impression that we are witnessing a moral collapse. But I think that we have grounds to be optimistic about the future.
Thirty years ago, I wrote a book called The Expanding Circle, in which I asserted that, historically, the circle of beings to whom we extend moral consideration has widened, first from the tribe to the nation, then to the race or ethnic group, then to all human beings, and, finally, to non-human animals. That, surely, is moral progress.
We might think that evolution leads to the selection of individuals who think only of their own interests, and those of their kin, because genes for such traits would be more likely to spread. But, as I argued then, the development of reason could take us in a different direction.
On the one hand, having a capacity to reason confers an obvious evolutionary advantage, because it makes it possible to solve problems and to plan to avoid dangers, thereby increasing the prospects of survival. Yet, on the other hand, reason is more than a neutral problem-solving tool. It is more like an escalator: once we get on it, we are liable to be taken to places that we never expected to reach. In particular, reason enables us to see that others, previously outside the bounds of our moral view, are like us in relevant respects. Excluding them from the sphere of beings to whom we owe moral consideration can then seem arbitrary, or just plain wrong.
Steven Pinker’s recent book The Better Angels of Our Nature lends weighty support to this view. Pinker, a professor of psychology at Harvard University, draws on recent research in history, psychology, cognitive science, economics, and sociology to argue that our era is less violent, less cruel, and more peaceful than any previous period of human existence.
The decline in violence holds for families, neighborhoods, tribes, and states. In essence, humans living today are less likely to meet a violent death, or to suffer from violence or cruelty at the hands of others, than their predecessors in any previous century.
Many people will doubt this claim. Some hold a rosy view of the simpler, supposedly more placid lives of tribal hunter-gatherers relative to our own. But examination of skeletons found at archaeological sites suggests that as many as 15% of prehistoric humans met a violent death at the hands of another person. (For comparison, in the first half of the twentieth century, the two world wars caused a death rate in Europe of not much more than 3%.)
Even those tribal peoples extolled by anthropologists as especially “gentle” – for example, the Semai of Malaysia, the Kung of the Kalahari, and the Central Arctic Inuit – turn out to have murder rates that are, relative to population, comparable to Detroit, which has one of the highest murder rates in the United States. In Europe, your chance of being murdered is now less than one-tenth, and in some countries only one-fiftieth, of what it would have been had you lived 500 years ago.
Pinker accepts that reason is an important factor underlying the trends that he describes. In support of this claim, he refers to the “Flynn Effect” – the remarkable finding by the philosopher James Flynn that since IQ tests were first administered, scores have risen considerably. The average IQ is, by definition, 100; but, to achieve that result, raw test results have to be standardized. If the average teenager today took an IQ test in 1910, he or she would score 130, which would be better than 98% of those taking the test then.
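That standardisation is just the normal distribution at work. As a quick check of the figures above — assuming the conventional scale, with mean 100 and standard deviation 15 — a score of 130 sits two standard deviations above the mean:

```python
from math import erf, sqrt

def iq_percentile(score, mean=100.0, sd=15.0):
    """Fraction of the population scoring below `score`,
    assuming scores are normally distributed on the standard scale."""
    z = (score - mean) / sd
    return 0.5 * (1.0 + erf(z / sqrt(2)))  # normal CDF via the error function

print(round(iq_percentile(130) * 100, 1))  # → 97.7
```

which is essentially the “better than 98%” figure quoted above.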
It is not easy to attribute this rise to improved education, because the aspects of the tests on which scores have risen the most do not require a good vocabulary, or even mathematical ability, but instead assess powers of abstract reasoning.
[div class=attrib]Read the entire article after the jump.[end-div]
Over the last couple of years a number of researchers have upended conventional wisdom by finding that complex decisions (those with many variables, for instance) are better “made” through our emotional system. This flies in the face of the commonly held belief that complexity is best handled by our rational side.
[div class=attrib]Jonah Lehrer over at the Frontal Cortex brings us up to date on current thinking.[end-div]
We live in a world filled with difficult decisions. In fact, we’ve managed to turn even trivial choices – say, picking a toothpaste – into a tortured mental task, as the typical supermarket has more than 200 different dental cleaning options. Should I choose a toothpaste based on fluoride content? Do I need a whitener in my toothpaste? Is Crest different than Colgate? The end result is that the banal selection becomes cognitively demanding, as I have to assess dozens of alternatives and take an array of variables into account. And it’s not just toothpaste: The same thing has happened to nearly every consumption decision, from bottled water to blue jeans to stocks. There are no simple choices left – capitalism makes everything complicated.
How should we make all these hard choices? How does one navigate a world of seemingly infinite alternatives? For thousands of years, the answer has seemed obvious: when faced with a difficult dilemma, we should carefully assess our options and spend a few moments consciously deliberating the information. Then, we should choose the toothpaste that best fits our preferences. This is how we maximize utility and get the most bang for the buck. We are rational agents – we should make decisions in a rational manner.
But what if rationality backfires? What if we make better decisions when we trust our gut instincts? While there is an extensive literature on the potential wisdom of human emotion, it’s only in the last few years that researchers have demonstrated that the emotional system (aka Type 1 thinking) might excel at complex decisions, or those involving lots of variables. If true, this would suggest that the unconscious is better suited for difficult cognitive tasks than the conscious brain, that the very thought process we’ve long disregarded as irrational and impulsive might actually be “smarter” than reasoned deliberation. This is largely because the unconscious is able to handle a surfeit of information, digesting the facts without getting overwhelmed. (Human reason, in contrast, has a very strict bottleneck and can only process about four bits of data at any given moment.) When confused in the toothpaste aisle, bewildered by all the different options, we should go with the product that feels the best.
The most widely cited demonstration of this theory is a 2006 Science paper led by Ap Dijksterhuis. (I wrote about the research in How We Decide.) The experiment went like this: Dijksterhuis got together a group of Dutch car shoppers and gave them descriptions of four different used cars. Each of the cars was rated in four different categories, for a total of sixteen pieces of information. Car number 1, for example, was described as getting good mileage, but had a shoddy transmission and poor sound system. Car number 2 handled poorly, but had lots of legroom. Dijksterhuis designed the experiment so that one car was objectively ideal, with “predominantly positive aspects”. After showing people these car ratings, Dijksterhuis then gave them a few minutes to consciously contemplate their decision. In this “easy” situation, more than fifty percent of the subjects ended up choosing the best car.
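The structure of that “easy” condition is simple to mock up. The ratings below are invented stand-ins for Dijksterhuis’s materials (the paper used richer verbal descriptions), but they preserve the key feature: sixteen pieces of information, with one car predominantly positive.

```python
# Four cars, four attributes each: +1 for a positive aspect, -1 for a negative.
cars = {
    "car_1": [+1, -1, -1, +1],   # e.g. good mileage, shoddy transmission, ...
    "car_2": [-1, +1, -1, -1],   # handles poorly, but lots of legroom, ...
    "car_3": [+1, +1, +1, -1],   # predominantly positive: the objectively best car
    "car_4": [-1, -1, +1, -1],
}

def deliberate(ratings):
    """Conscious deliberation, as the experiment frames it:
    tally each car's aspects and pick the highest total."""
    return max(ratings, key=lambda car: sum(ratings[car]))

print(deliberate(cars))  # → car_3
```

With only sixteen facts the tally is easy and conscious deliberation works; the paper’s striking result came in the harder condition, where far more attributes made conscious tallying start to fail.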
[div class=attrib]Read more of the article and Ap Dijksterhuis’ classic experiment here.[end-div]
[div class=attrib]Image courtesy of CustomerSpeak.[end-div]
For all its stellar achievements, human reason seems particularly ill suited to, well, reasoning. Study after study demonstrates reason’s deficiencies, such as the oft-noted confirmation bias (the tendency to recall, select, or interpret evidence in a way that supports one’s preexisting beliefs) and people’s poor performance on straightforward logic puzzles. Why is reason so defective?
To the contrary, reason isn’t defective in the least, argue cognitive scientists Hugo Mercier of the University of Pennsylvania and Dan Sperber of the Jean Nicod Institute in Paris. The problem is that we’ve misunderstood why reason exists and measured its strengths and weaknesses against the wrong standards.
Mercier and Sperber argue that reason did not evolve to allow individuals to think through problems and make brilliant decisions on their own. Rather, it serves a fundamentally social purpose: It promotes argument. Research shows that people solve problems more effectively when they debate them in groups—and the interchange also allows people to hone essential social skills. Supposed defects such as the confirmation bias are well fitted to this purpose because they enable people to efficiently marshal the evidence they need in arguing with others.
[div class=attrib]More from theSource here.[end-div]
[div class=attrib]By Massimo Pigliucci at Rationally Speaking:[end-div]
A recent paper on the evolutionary psychology of reasoning has made mainstream news, with extensive coverage by the New York Times, among others. Too bad the “research” is badly flawed, and the lesson drawn by Patricia Cohen’s commentary in the Times is precisely the wrong one.
Readers of this blog and listeners to our podcast know very well that I tend to be pretty skeptical of evolutionary psychology in general. The reason isn’t because there is anything inherently wrong about thinking that (some) human behavioral traits evolved in response to natural selection. That’s just an uncontroversial consequence of standard evolutionary theory. The devil, rather, is in the details: it is next to impossible to test specific evopsych hypotheses because the crucial data are often missing. The fossil record hardly helps (if we are talking about behavior), there are precious few closely related species for comparison (and they are not at all that closely related), and the current ecological-social environment is very different from the “ERE,” the Evolutionarily Relevant Environment (which means that measuring selection on a given trait in today’s humans is pretty much irrelevant).
That said, I was curious about Hugo Mercier and Dan Sperber’s paper, “Why do humans reason? Arguments for an argumentative theory,” published in Behavioral and Brain Sciences (volume 34, pp. 57-111, 2011), which is accompanied by an extensive peer commentary. My curiosity was piqued in particular because of the Times’ headline from the June 14 article: “Reason Seen More as Weapon Than Path to Truth.” Oh crap, I thought.
Mercier and Sperber’s basic argument is that reason did not evolve to allow us to seek truth, but rather to win arguments with our fellow human beings. We are natural lawyers, not natural philosophers. This, according to them, explains why people are so bad at reasoning, for instance why we tend to fall for basic mistakes such as the well known confirmation bias — a tendency to seek evidence in favor of one’s position and discount contrary evidence that is well on display in politics and pseudoscience. (One could immediately raise the obvious “so what?” objection to all of this: language possibly evolved to coordinate hunting and gossip about your neighbor. That doesn’t mean we can’t take writing and speaking courses and dramatically improve on our given endowment, natural selection be damned.)
The first substantive thing to notice about the paper is that there isn’t a single new datum to back up the central hypothesis. It is one (long) argument in which the authors review well known cognitive science literature and simply apply evopsych speculation to it. If that’s the way to get into the New York Times, I better increase my speculation quotient.
[div class=attrib]More from theSource here.[end-div]
Ideas on complexity and randomness originally suggested by Gottfried W. Leibniz in 1686, combined with modern information theory, imply that there can never be a “theory of everything” for all of mathematics.
In 1956 Scientific American published an article by Ernest Nagel and James R. Newman entitled “Gödel’s Proof.” Two years later the writers published a book with the same title–a wonderful work that is still in print. I was a child, not even a teenager, and I was obsessed by this little book. I remember the thrill of discovering it in the New York Public Library. I used to carry it around with me and try to explain it to other children.
It fascinated me because Kurt Gödel used mathematics to show that mathematics itself has limitations. Gödel refuted the position of David Hilbert, who about a century ago declared that there was a theory of everything for math, a finite set of principles from which one could mindlessly deduce all mathematical truths by tediously following the rules of symbolic logic. But Gödel demonstrated that mathematics contains true statements that cannot be proved that way. His result is based on two self-referential paradoxes: “This statement is false” and “This statement is unprovable.”
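The second of those paradoxes can be made precise. Schematically — writing Prov(x) for “the formula with Gödel number x is provable in the system” — Gödel constructed a sentence G that asserts its own unprovability:

```latex
% Prov(x): "the formula with Goedel number x is provable in the system S"
G \;\leftrightarrow\; \neg\,\mathrm{Prov}(\ulcorner G \urcorner)
```

If the system is consistent it cannot prove G; but unprovability is exactly what G asserts, so G is true yet unprovable within the system.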
[div class=attrib]More from theSource here.[end-div]