Tag Archives: cognition

Towards an Understanding of Consciousness


The modern scientific method has helped us make great strides in our understanding of much that surrounds us. From knowledge of the infinitesimally small building blocks of atoms to the vast structures of the universe, theory and experiment have enlightened us considerably over the last several hundred years.

Yet a detailed understanding of consciousness still eludes us. Despite John Locke’s intricate philosophical essay of 1690, which laid the foundations for our modern-day views of consciousness, a fundamental grasp of its mechanisms remains as elusive as our knowledge of the universe’s dark matter.

So, it’s encouraging to come across a refreshing view of consciousness, described in the context of evolutionary biology. Michael Graziano, associate professor of psychology and neuroscience at Princeton University, makes a thoughtful case for Attention Schema Theory (AST), which centers on the simple notion that there is adaptive value for the brain to build awareness. According to AST, the brain is constantly constructing and refreshing a model — in Graziano’s words an “attention schema” — that describes what its covert attention is doing from one moment to the next. The brain constructs this schema as an analog to its awareness of attention in others — a sound adaptive perception.

Yet, while this view may hold promise from a purely adaptive and evolutionary standpoint, it does have some way to go before it is able to explain how the brain’s abstraction of a holistic awareness is constructed from the physical substrate — the neurons and connections between them.

Read more of Michael Graziano’s essay, A New Theory Explains How Consciousness Evolved. Graziano is the author of Consciousness and the Social Brain, which serves as his introduction to AST. And, for a compelling rebuttal, check out R. Scott Bakker’s article, Graziano, the Attention Schema Theory, and the Neuroscientific Explananda Problem.

Unfortunately, until our experimentalists make some definitive progress in this area, our understanding will remain just as abstract as the theories themselves, however compelling. But, ideas such as these inch us towards a deeper understanding.

Image: Representation of consciousness from the seventeenth century. Robert Fludd, Utriusque cosmi maioris scilicet et minoris […] historia, tomus II (1619), tractatus I, sectio I, liber X, De triplici animae in corpore visione. Courtesy: Wikipedia. Public Domain.

Multitasking: A Powerful and Diabolical Illusion

Our increasingly ubiquitous technology makes possible all manner of things that would have been insurmountable just decades ago. We carry smartphones that pack more computational power than the mainframes of a generation ago. Yet for all this power at our fingertips we seem to forget that we are still very much human animals with limitations. One such “shortcoming” [your friendly editor believes it’s a boon] is our inability to multitask like our phones. I’ve written about this before, and am compelled to do so again after reading this thoughtful essay by Daniel J. Levitin, extracted from his book The Organized Mind: Thinking Straight in the Age of Information Overload. I even had to use his phrasing for the title of this post.

From the Guardian:

Our brains are busier than ever before. We’re assaulted with facts, pseudo facts, jibber-jabber, and rumour, all posing as information. Trying to figure out what you need to know and what you can ignore is exhausting. At the same time, we are all doing more. Thirty years ago, travel agents made our airline and rail reservations, salespeople helped us find what we were looking for in shops, and professional typists or secretaries helped busy people with their correspondence. Now we do most of those things ourselves. We are doing the jobs of 10 different people while still trying to keep up with our lives, our children and parents, our friends, our careers, our hobbies, and our favourite TV shows.

Our smartphones have become Swiss army knife–like appliances that include a dictionary, calculator, web browser, email, Game Boy, appointment calendar, voice recorder, guitar tuner, weather forecaster, GPS, texter, tweeter, Facebook updater, and flashlight. They’re more powerful and do more things than the most advanced computer at IBM corporate headquarters 30 years ago. And we use them all the time, part of a 21st-century mania for cramming everything we do into every single spare moment of downtime. We text while we’re walking across the street, catch up on email while standing in a queue – and while having lunch with friends, we surreptitiously check to see what our other friends are doing. At the kitchen counter, cosy and secure in our domicile, we write our shopping lists on smartphones while we are listening to that wonderfully informative podcast on urban beekeeping.

But there’s a fly in the ointment. Although we think we’re doing several things at once, multitasking, this is a powerful and diabolical illusion. Earl Miller, a neuroscientist at MIT and one of the world experts on divided attention, says that our brains are “not wired to multitask well… When people think they’re multitasking, they’re actually just switching from one task to another very rapidly. And every time they do, there’s a cognitive cost in doing so.” So we’re not actually keeping a lot of balls in the air like an expert juggler; we’re more like a bad amateur plate spinner, frantically switching from one task to another, ignoring the one that is not right in front of us but worried it will come crashing down any minute. Even though we think we’re getting a lot done, ironically, multitasking makes us demonstrably less efficient.

Multitasking has been found to increase the production of the stress hormone cortisol as well as the fight-or-flight hormone adrenaline, which can overstimulate your brain and cause mental fog or scrambled thinking. Multitasking creates a dopamine-addiction feedback loop, effectively rewarding the brain for losing focus and for constantly searching for external stimulation. To make matters worse, the prefrontal cortex has a novelty bias, meaning that its attention can be easily hijacked by something new – the proverbial shiny objects we use to entice infants, puppies, and kittens. The irony here for those of us who are trying to focus amid competing activities is clear: the very brain region we need to rely on for staying on task is easily distracted. We answer the phone, look up something on the internet, check our email, send an SMS, and each of these things tweaks the novelty-seeking, reward-seeking centres of the brain, causing a burst of endogenous opioids (no wonder it feels so good!), all to the detriment of our staying on task. It is the ultimate empty-caloried brain candy. Instead of reaping the big rewards that come from sustained, focused effort, we instead reap empty rewards from completing a thousand little sugar-coated tasks.

In the old days, if the phone rang and we were busy, we either didn’t answer or we turned the ringer off. When all phones were wired to a wall, there was no expectation of being able to reach us at all times – one might have gone out for a walk or been between places – and so if someone couldn’t reach you (or you didn’t feel like being reached), it was considered normal. Now more people have mobile phones than have toilets. This has created an implicit expectation that you should be able to reach someone when it is convenient for you, regardless of whether it is convenient for them. This expectation is so ingrained that people in meetings routinely answer their mobile phones to say, “I’m sorry, I can’t talk now, I’m in a meeting.” Just a decade or two ago, those same people would have let a landline on their desk go unanswered during a meeting, so different were the expectations for reachability.

Just having the opportunity to multitask is detrimental to cognitive performance. Glenn Wilson, former visiting professor of psychology at Gresham College, London, calls it info-mania. His research found that being in a situation where you are trying to concentrate on a task, and an email is sitting unread in your inbox, can reduce your effective IQ by 10 points. And although people ascribe many benefits to marijuana, including enhanced creativity and reduced pain and stress, it is well documented that its chief ingredient, cannabinol, activates dedicated cannabinol receptors in the brain and interferes profoundly with memory and with our ability to concentrate on several things at once. Wilson showed that the cognitive losses from multitasking are even greater than the cognitive losses from pot-smoking.

Russ Poldrack, a neuroscientist at Stanford, found that learning information while multitasking causes the new information to go to the wrong part of the brain. If students study and watch TV at the same time, for example, the information from their schoolwork goes into the striatum, a region specialised for storing new procedures and skills, not facts and ideas. Without the distraction of TV, the information goes into the hippocampus, where it is organised and categorised in a variety of ways, making it easier to retrieve. MIT’s Earl Miller adds, “People can’t do [multitasking] very well, and when they say they can, they’re deluding themselves.” And it turns out the brain is very good at this deluding business.

Then there are the metabolic costs that I wrote about earlier. Asking the brain to shift attention from one activity to another causes the prefrontal cortex and striatum to burn up oxygenated glucose, the same fuel they need to stay on task. And the kind of rapid, continual shifting we do with multitasking causes the brain to burn through fuel so quickly that we feel exhausted and disoriented after even a short time. We’ve literally depleted the nutrients in our brain. This leads to compromises in both cognitive and physical performance. Among other things, repeated task switching leads to anxiety, which raises levels of the stress hormone cortisol in the brain, which in turn can lead to aggressive and impulsive behaviour. By contrast, staying on task is controlled by the anterior cingulate and the striatum, and once we engage the central executive mode, staying in that state uses less energy than multitasking and actually reduces the brain’s need for glucose.

To make matters worse, lots of multitasking requires decision-making: Do I answer this text message or ignore it? How do I respond to this? How do I file this email? Do I continue what I’m working on now or take a break? It turns out that decision-making is also very hard on your neural resources and that little decisions appear to take up as much energy as big ones. One of the first things we lose is impulse control. This rapidly spirals into a depleted state in which, after making lots of insignificant decisions, we can end up making truly bad decisions about something important. Why would anyone want to add to their daily weight of information processing by trying to multitask?

Read the entire article here.

Technology: Mind Exp(a/e)nder

Rattling off esoteric facts to friends and colleagues at a party or in the office is often seen as a simple way to impress. You may have tried this at some point — to impress a prospective boyfriend or girlfriend, a group of peers, or even your boss. Not surprisingly, your facts will impress if they are relevant to the discussion at hand. However, your audience will be even more agog at your uncanny intellectual prowess if the facts and figures relate to some wildly obscure domain — quotes from authors, local bird species, gold prices through the years, land-speed records through the ages, how electrolysis works, etymology of polysyllabic words, and so it goes.

So, it comes as no surprise that many technology companies fall over themselves to promote their products as a way to make you, the smart user, even smarter. But does having constant, real-time access to a powerful computer or smartphone or spectacles linked to an immense library of interconnected content make you smarter? Some would argue that it does; that having access to a vast, virtual disk drive of information will improve your cognitive abilities. There is no doubt that our technology puts an unparalleled repository of information within instant and constant reach: we can read all the classic literature — for that matter we can read the entire contents of the Library of Congress; we can find an answer to almost any question — it’s just a Google search away; we can find fresh research and rich reference material on every subject imaginable.

Yet, all this information will not directly make us any smarter; it is not applied knowledge nor is it experiential wisdom. It will not make us more creative or insightful. However, it is more likely to influence our cognition indirectly — freed from our need to carry volumes of often useless facts and figures in our heads, we will be able to turn our minds to more consequential and noble pursuits — to think, rather than to memorize. That is a good thing.

From Slate:

Quick, what’s the square root of 2,130? How many Roadmaster convertibles did Buick build in 1949? What airline has never lost a jet plane in a crash?

If you answered “46.1519,” “8,000,” and “Qantas,” there are two possibilities. One is that you’re Rain Man. The other is that you’re using the most powerful brain-enhancement technology of the 21st century so far: Internet search.

True, the Web isn’t actually part of your brain. And Dustin Hoffman rattled off those bits of trivia a few seconds faster in the movie than you could with the aid of Google. But functionally, the distinctions between encyclopedic knowledge and reliable mobile Internet access are less significant than you might think. Math and trivia are just the beginning. Memory, communication, data analysis—Internet-connected devices can give us superhuman powers in all of these realms. A growing chorus of critics warns that the Internet is making us lazy, stupid, lonely, or crazy. Yet tools like Google, Facebook, and Evernote hold at least as much potential to make us not only more knowledgeable and more productive but literally smarter than we’ve ever been before.

The idea that we could invent tools that change our cognitive abilities might sound outlandish, but it’s actually a defining feature of human evolution. When our ancestors developed language, it altered not only how they could communicate but how they could think. Mathematics, the printing press, and science further extended the reach of the human mind, and by the 20th century, tools such as telephones, calculators, and Encyclopedia Britannica gave people easy access to more knowledge about the world than they could absorb in a lifetime.

Yet it would be a stretch to say that this information was part of people’s minds. There remained a real distinction between what we knew and what we could find out if we cared to.

The Internet and mobile technology have begun to change that. Many of us now carry our smartphones with us everywhere, and high-speed data networks blanket the developed world. If I asked you the capital of Angola, it would hardly matter anymore whether you knew it off the top of your head. Pull out your phone and repeat the question using Google Voice Search, and a mechanized voice will shoot back, “Luanda.” When it comes to trivia, the difference between a world-class savant and your average modern technophile is perhaps five seconds. And Watson’s Jeopardy! triumph over Ken Jennings suggests even that time lag might soon be erased—especially as wearable technology like Google Glass begins to collapse the distance between our minds and the cloud.

So is the Internet now essentially an external hard drive for our brains? That’s the essence of an idea called “the extended mind,” first propounded by philosophers Andy Clark and David Chalmers in 1998. The theory was a novel response to philosophy’s long-standing “mind-brain problem,” which asks whether our minds are reducible to the biology of our brains. Clark and Chalmers proposed that the modern human mind is a system that transcends the brain to encompass aspects of the outside environment. They argued that certain technological tools—computer modeling, navigation by slide rule, long division via pencil and paper—can be every bit as integral to our mental operations as the internal workings of our brains. They wrote: “If, as we confront some task, a part of the world functions as a process which, were it done in the head, we would have no hesitation in recognizing as part of the cognitive process, then that part of the world is (so we claim) part of the cognitive process.”

Fifteen years on and well into the age of Google, the idea of the extended mind feels more relevant today. “Ned Block [an NYU professor] likes to say, ‘Your thesis was false when you wrote the article—since then it has come true,’ ” Chalmers says with a laugh.

The basic Google search, which has become our central means of retrieving published information about the world, is only the most obvious example. Personal-assistant tools like Apple’s Siri instantly retrieve information such as phone numbers and directions that we once had to memorize or commit to paper. Potentially even more powerful as memory aids are cloud-based note-taking apps like Evernote, whose slogan is, “Remember everything.”

So here’s a second pop quiz. Where were you on the night of Feb. 8, 2010? What are the names and email addresses of all the people you know who currently live in New York City? What’s the exact recipe for your favorite homemade pastry?

Read the entire article after the jump.

Image: Google Glass. Courtesy of Google.

Chocolate for the Soul and Mind (But Not Body)

Hot on the heels of the recent research finding that the Mediterranean diet improves heart health comes news that choc-a-holics the world over have been anxiously awaiting — chocolate improves brain function.

Researchers have found that chocolate rich in compounds known as flavanols can improve cognitive function. Now, before you rush out the door to visit the local grocery store to purchase a mountain of Mars bars (perhaps not coincidentally, Mars, Inc., partly funded the research study), Godiva pralines, Cadbury flakes or a slab of Dove, take note that not all chocolate is created equal. Flavanols are found in their highest concentrations in raw cocoa; in fact, during the making of most chocolate, including the dark kind, most flavanols are removed or destroyed. Perhaps the silver lining here is that to replicate the dose of flavanols found to have a positive effect on brain function, you would have to eat around 20 bars of chocolate per day for several months. This may be good news for your brain, but not your waistline!

From Scientific American:

It’s news chocolate lovers have been craving: raw cocoa may be packed with brain-boosting compounds. Researchers at the University of L’Aquila in Italy, with scientists from Mars, Inc., and their colleagues published findings last September that suggest cognitive function in the elderly is improved by ingesting high levels of natural compounds found in cocoa called flavanols. The study included 90 individuals with mild cognitive impairment, a precursor to Alzheimer’s disease. Subjects who drank a cocoa beverage containing either moderate or high levels of flavanols daily for eight weeks demonstrated greater cognitive function than those who consumed low levels of flavanols on three separate tests that measured factors that included verbal fluency, visual searching and attention.

Exactly how cocoa causes these changes is still unknown, but emerging research points to one flavanol in particular: (-)-epicatechin, pronounced “minus epicatechin.” Its name signifies its structure, differentiating it from other catechins, organic compounds highly abundant in cocoa and present in apples, wine and tea. The graph below shows how (-)-epicatechin fits into the world of brain-altering food molecules. Other studies suggest that the compound supports increased circulation and the growth of blood vessels, which could explain improvements in cognition, because better blood flow would bring the brain more oxygen and improve its function.

Animal research has already demonstrated how pure (-)-epicatechin enhances memory. Findings published last October in the Journal of Experimental Biology note that snails can remember a trained task—such as holding their breath in deoxygenated water—for more than a day when given (-)-epicatechin but for less than three hours without the flavanol. Salk Institute neuroscientist Fred Gage and his colleagues found previously that (-)-epicatechin improves spatial memory and increases vasculature in mice. “It’s amazing that a single dietary change could have such profound effects on behavior,” Gage says. If further research confirms the compound’s cognitive effects, flavanol supplements—or raw cocoa beans—could be just what the doctor ordered.

So, Can We Binge on Chocolate Now?

Nope, sorry. A food’s origin, processing, storage and preparation can each alter its chemical composition. As a result, it is nearly impossible to predict which flavanols—and how many—remain in your bonbon or cup of tea. Tragically for chocoholics, most methods of processing cocoa remove many of the flavanols found in the raw plant. Even dark chocolate, touted as the “healthy” option, can be treated such that the cocoa darkens while flavanols are stripped.

Researchers are only beginning to establish standards for measuring flavanol content in chocolate. A typical one and a half ounce chocolate bar might contain about 50 milligrams of flavanols, which means you would need to consume 10 to 20 bars daily to approach the flavanol levels used in the University of L’Aquila study. At that point, the sugars and fats in these sweet confections would probably outweigh any possible brain benefits. Mars Botanical nutritionist and toxicologist Catherine Kwik-Uribe, an author on the University of L’Aquila study, says, “There’s now even more reasons to enjoy tea, apples and chocolate. But diversity and variety in your diet remain key.”

Read the entire article after the jump.

Image courtesy of Google Search.

Single-tasking is Human

If you’re an office worker you will relate. Recently, you will have participated in a team meeting or conference call only to have at least one person say, when asked a question, “Sorry, can you please repeat that? I was multitasking.”

Many of us believe, or have been tricked into believing, that doing multiple things at once makes us more productive. This phenomenon was branded by business as multitasking. After all, if computers can do it, why not humans? Yet experience shows that humans are woefully inadequate at performing multiple concurrent tasks that require dedicated attention. Of course, humans are experts at walking and chewing gum at the same time; however, in the majority of cases these activities require very little involvement from the higher functions of the brain. There is a growing body of anecdotal and experimental evidence that shows poorer performance on multiple tasks done concurrently versus the same tasks performed sequentially. In fact, for quite some time, researchers have shown that dealing with multiple streams of information at once is a real problem for our limited brains.
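The concurrent-versus-sequential gap can be made concrete with a toy model — my own illustration, not drawn from any of the studies mentioned here, and with entirely hypothetical numbers. Give each task a cost in effort units and charge a fixed overhead every time attention switches to a different task; only the shape of the result matters, not the figures.

```python
# Toy model of task switching (hypothetical numbers, for illustration).
# Each task needs some effort units; interleaving pays a fixed overhead
# on every attention switch, so it always costs more than finishing
# tasks one at a time.

SWITCH_COST = 2  # overhead charged per attention switch (hypothetical)

def sequential(tasks):
    """Finish each task completely before starting the next."""
    return sum(tasks)

def interleaved(tasks, slice_size=1):
    """Alternate between tasks in small slices, paying SWITCH_COST
    each time attention moves to a different unfinished task."""
    remaining = list(tasks)
    total = 0
    current = 0
    while sum(remaining) > 0:
        step = min(slice_size, remaining[current])
        remaining[current] -= step
        total += step
        if sum(remaining) > 0:
            # find the next task that still has work left
            nxt = (current + 1) % len(remaining)
            while remaining[nxt] == 0:
                nxt = (nxt + 1) % len(remaining)
            if nxt != current:        # only charge for a real switch
                total += SWITCH_COST
                current = nxt
    return total

print(sequential([10, 10]))                 # 20 units, heads-down
print(interleaved([10, 10]))                # 58 units: 20 of work + 19 switches
print(interleaved([10, 10], slice_size=5))  # 26 units: longer slices, fewer switches
```

Even this crude sketch reproduces the pattern in the research: the finer the interleaving, the more of the total effort goes to switching rather than to the work itself.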

Yet, most businesses seem to demand or reward multitasking behavior. And damagingly, the multitasking epidemic now seems to be the norm in the home as well.

From the WSJ:

In the few minutes it takes to read this article, chances are you’ll pause to check your phone, answer a text, switch to your desktop to read an email from the boss’s assistant, or glance at the Facebook or Twitter messages popping up in the corner of your screen. Off-screen, in your open-plan office, crosstalk about a colleague’s preschooler might lure you away, or a co-worker may stop by your desk for a quick question.

And bosses wonder why it is tough to get any work done.

Distraction at the office is hardly new, but as screens multiply and managers push frazzled workers to do more with less, companies say the problem is worsening and is affecting business.

While some firms make noises about workers wasting time on the Web, companies are realizing the problem is partly their own fault.

Even though digital technology has led to significant productivity increases, the modern workday seems custom-built to destroy individual focus. Open-plan offices and an emphasis on collaborative work leave workers with little insulation from colleagues’ chatter. A ceaseless tide of meetings and internal emails means that workers increasingly scramble to get their “real work” done on the margins, early in the morning or late in the evening. And the tempting lure of social-networking streams and status updates make it easy for workers to interrupt themselves.

“It is an epidemic,” says Lacy Roberson, a director of learning and organizational development at eBay Inc. At most companies, it’s a struggle “to get work done on a daily basis, with all these things coming at you,” she says.

Office workers are interrupted—or self-interrupt—roughly every three minutes, academic studies have found, with numerous distractions coming in both digital and human forms. Once thrown off track, it can take some 23 minutes for a worker to return to the original task, says Gloria Mark, a professor of informatics at the University of California, Irvine, who studies digital distraction.

Companies are experimenting with strategies to keep workers focused. Some are limiting internal emails—with one company moving to ban them entirely—while others are reducing the number of projects workers can tackle at a time.

Last year, Jamey Jacobs, a divisional vice president at Abbott Vascular, a unit of health-care company Abbott Laboratories, learned that his 200 employees had grown stressed trying to squeeze in more heads-down, focused work amid the daily thrum of email and meetings.

“It became personally frustrating that they were not getting the things they wanted to get done,” he says. At meetings, attendees were often checking email, trying to multitask and in the process obliterating their focus.

Part of the solution for Mr. Jacobs’s team was that oft-forgotten piece of office technology: the telephone.

Mr. Jacobs and productivity consultant Daniel Markovitz found that employees communicated almost entirely over email, whether the matter was mundane, such as cake in the break room, or urgent, like an equipment issue.

The pair instructed workers to let the importance and complexity of their message dictate whether to use cellphones, office phones or email. Truly urgent messages and complex issues merited phone calls or in-person conversations, while email was reserved for messages that could wait.

Workers now pick up the phone more, logging fewer internal emails and say they’ve got clarity on what’s urgent and what’s not, although Mr. Jacobs says staff still have to stay current with emails from clients or co-workers outside the group.

Read the entire article after the jump, and learn more in this insightful article on multitasking over at Big Think.

Image courtesy of Big Think.

Cocktail Party Science and Multitasking


The hit drama Mad Men shows us that cocktail parties can be fun — colorful drinks and colorful conversations with a host of very colorful characters. Yet cocktail parties also highlight one of our limitations: the inability to multitask. We are single-threaded animals, despite the constant and simultaneous demands on our attention from all directions and on all our senses.

Melinda Beck over at the WSJ Health Journal summarizes recent research that shows the deleterious effects of our attempts to multitask — why it’s so hard and why it’s probably not a good idea anyway, especially while driving.

From the Wall Street Journal:

You’re at a party. Music is playing. Glasses are clinking. Dozens of conversations are driving up the decibel level. Yet amid all those distractions, you can zero in on the one conversation you want to hear.

This ability to hyper-focus on one stream of sound amid a cacophony of others is what researchers call the “cocktail-party effect.” Now, scientists at the University of California, San Francisco have pinpointed where that sound-editing process occurs in the brain—in the auditory cortex just behind the ear, not in areas of higher thought. The auditory cortex boosts some sounds and turns down others so that when the signal reaches the higher brain, “it’s as if only one person was speaking alone,” says principal investigator Edward Chang.

These findings, published in the journal Nature last week, underscore why people aren’t very good at multitasking—our brains are wired for “selective attention” and can focus on only one thing at a time. That innate ability has helped humans survive in a world buzzing with visual and auditory stimulation. But we keep trying to push the limits with multitasking, sometimes with tragic consequences. Drivers talking on cellphones, for example, are four times as likely to get into traffic accidents as those who aren’t.

Many of those accidents are due to “inattentional blindness,” in which people can, in effect, turn a blind eye to things they aren’t focusing on. Images land on our retinas and are either boosted or played down in the visual cortex before being passed to the brain, just as the auditory cortex filters sounds, as shown in the Nature study last week. “It’s a push-pull relationship—the more we focus on one thing, the less we can focus on others,” says Diane M. Beck, an associate professor of psychology at the University of Illinois.

That people can be completely oblivious to things in their field of vision was demonstrated famously in the “Invisible Gorilla experiment” devised at Harvard in the 1990s. Observers are shown a short video of youths tossing a basketball and asked to count how often the ball is passed by those wearing white. Afterward, the observers are asked several questions, including, “Did you see the gorilla?” Typically, about half the observers failed to notice that someone in a gorilla suit walked through the scene. They’re usually flabbergasted because they’re certain they would have noticed something like that.

“We largely see what we expect to see,” says Daniel Simons, one of the study’s creators and now a professor of psychology at the University of Illinois. As he notes in his subsequent book, “The Invisible Gorilla,” the more attention a task demands, the less attention we can pay to other things in our field of vision. That’s why pilots sometimes fail to notice obstacles on runways and radiologists may overlook anomalies on X-rays, especially in areas they aren’t scrutinizing.

And it isn’t just that sights and sounds compete for the brain’s attention. All the sensory inputs vie to become the mind’s top priority.

Read the entire article after the jump.

Image courtesy of Getty Images / Wall Street Journal.

Crossword Puzzles and Cognition

From the New Scientist:

TACKLING a crossword can crowd the tip of your tongue. You know that you know the answers to 3 down and 5 across, but the words just won’t come out. Then, when you’ve given up and moved on to another clue, comes blessed relief. The elusive answer suddenly occurs to you, crystal clear.

The processes leading to that flash of insight can illuminate many of the human mind’s curious characteristics. Crosswords can reflect the nature of intuition, hint at the way we retrieve words from our memory, and reveal a surprising connection between puzzle solving and our ability to recognise a human face.

“What’s fascinating about a crossword is that it involves many aspects of cognition that we normally study piecemeal, such as memory search and problem solving, all rolled into one ball,” says Raymond Nickerson, a psychologist at Tufts University in Medford, Massachusetts. In a paper published earlier this year, he brought profession and hobby together by analysing the mental processes of crossword solving (Psychonomic Bulletin and Review, vol 18, p 217).

1 across: “You stinker!” – audible cry that allegedly marked displacement activity (6)

Most of our mental machinations take place pre-consciously, with the results dropping into our conscious minds only after they have been decided elsewhere in the brain. Intuition plays a big role in solving a crossword, Nickerson observes. Indeed, sometimes your pre-conscious mind may be so quick that it produces the goods instantly.

At other times, you might need to take a more methodical approach and consider possible solutions one by one, perhaps listing synonyms of a word in the clue.

Even if your list doesn’t seem to make much sense, it might reflect the way your pre-conscious mind is homing in on the solution. Nickerson points to work in the 1990s by Peter Farvolden at the University of Toronto in Canada, who gave his subjects four-letter fragments of seven-letter target words (as may happen in some crossword layouts, especially in the US, where many words overlap). While his volunteers attempted to work out the target, they were asked to give any other word that occurred to them in the meantime. The words tended to be associated in meaning with the eventual answer, hinting that the pre-conscious mind solves a problem in steps.

Should your powers of deduction fail you, it may help to let your mind chew over the clue while your conscious attention is elsewhere. Studies back up our everyday experience that a period of incubation can lead you to the eventual “aha” moment. Don’t switch off entirely, though. For verbal problems, a break from the clue seems to be more fruitful if you occupy yourself with another task, such as drawing a picture or reading (Psychological Bulletin, vol 135, p 94).

So if 1 across has you flummoxed, you could leave it and take a nice bath, or better still read a novel. Or just move on to the next clue.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Newspaper crossword puzzle. Courtesy of Polytechnic West.[end-div]

Boost Your Brainpower: Chew Gum

So you wish to boost your brain function? Well, forget the folate, B vitamins, omega-3 fatty acids, ginkgo biloba, and the countless array of other supplements. Researchers have found that chewing gum increases cognitive abilities. However, while gum chewers perform significantly better on a battery of psychological tests, the boost is fleeting, lasting on average only for the first 20 minutes of testing.

[div class=attrib]From Wired:[end-div]

Why do people chew gum? If an anthropologist from Mars ever visited a typical supermarket, they’d be confounded by those shelves near the checkout aisle that display dozens of flavored gum options. Chewing without eating seems like such a ridiculous habit, the oral equivalent of running on a treadmill. And yet, people have been chewing gum for thousands of years, ever since the ancient Greeks began popping wads of mastic tree resin into their mouths to sweeten the breath. Socrates probably chewed gum.

It turns out there’s an excellent rationale for this long-standing cultural habit: Gum is an effective booster of mental performance, conferring all sorts of benefits without any side effects. The latest investigation of gum chewing comes from a team of psychologists at St. Lawrence University. The experiment went like this: 159 students were given a battery of demanding cognitive tasks, such as repeating random numbers backward and solving difficult logic puzzles. Half of the subjects chewed gum (sugar-free and sugar-added) while the other half were given nothing. Here’s where things get peculiar: Those randomly assigned to the gum-chewing condition significantly outperformed those in the control condition on five out of six tests. (The one exception was verbal fluency, in which subjects were asked to name as many words as possible from a given category, such as “animals.”) The sugar content of the gum had no effect on test performance.

While previous studies achieved similar results — chewing gum is often a better test aid than caffeine — this latest research investigated the time course of the gum advantage. It turns out to be rather short-lived, as gum chewers only showed an increase in performance during the first 20 minutes of testing. After that, they performed identically to non-chewers.

What’s responsible for this mental boost? Nobody really knows. It doesn’t appear to depend on glucose, since sugar-free gum generated the same benefits. Instead, the researchers propose that gum enhances performance due to “mastication-induced arousal.” The act of chewing, in other words, wakes us up, ensuring that we are fully focused on the task at hand.

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image: Chewing gum tree, Mexico D.F. Courtesy of mexicolore.[end-div]

Improvements to Our Lives Through Science

Ask a hundred people how science can be used for the good and you’re likely to get a hundred different answers. Well, Edge Magazine did just that, posing the question “What scientific concept would improve everybody’s cognitive toolkit?” to 159 critical thinkers. Below we excerpt some of our favorites. The thoroughly engrossing, novel-length article can be found here in its entirety.

[div class=attrib]From Edge:[end-div]

Ether
Richard H. Thaler. Father of behavioral economics.

I recently posted a question in this space asking people to name their favorite example of a wrong scientific belief. One of my favorite answers came from Clay Shirky. Here is an excerpt:
The existence of ether, the medium through which light (was thought to) travel. It was believed to be true by analogy — waves propagate through water, and sound waves propagate through air, so light must propagate through X, and the name of this particular X was ether.
It’s also my favorite because it illustrates how hard it is to accumulate evidence for deciding something doesn’t exist. Ether was both required by 19th century theories and undetectable by 19th century apparatus, so it accumulated a raft of negative characteristics: it was odorless, colorless, inert, and so on.

Ecology
Brian Eno. Artist; Composer; Recording Producer: U2, Coldplay, Talking Heads, Paul Simon.

That idea, or bundle of ideas, seems to me the most important revolution in general thinking in the last 150 years. It has given us a whole new sense of who we are, where we fit, and how things work. It has made commonplace and intuitive a type of perception that used to be the province of mystics — the sense of wholeness and interconnectedness.
Beginning with Copernicus, our picture of a semi-divine humankind perfectly located at the centre of The Universe began to falter: we discovered that we live on a small planet circling a medium-sized star at the edge of an average galaxy. And then, following Darwin, we stopped being able to locate ourselves at the centre of life. Darwin gave us a matrix upon which we could locate life in all its forms: and the shocking news was that we weren’t at the centre of that either — just another species in the innumerable panoply of species, inseparably woven into the whole fabric (and not an indispensable part of it either). We have been cut down to size, but at the same time we have discovered ourselves to be part of the most unimaginably vast and beautiful drama called Life.

We Are Not Alone In The Universe
J. Craig Venter. Leading scientist of the 21st century.

I cannot imagine any single discovery that would have more impact on humanity than the discovery of life outside of our solar system. There is a human-centric, Earth-centric view of life that permeates most cultural and societal thinking. Finding that there are multiple, perhaps millions of origins of life and that life is ubiquitous throughout the universe will profoundly affect every human.

Correlation is not a cause
Susan Blackmore. Psychologist; Author, Consciousness: An Introduction.

The phrase “correlation is not a cause” (CINAC) may be familiar to every scientist but has not found its way into everyday language, even though critical thinking and scientific understanding would improve if more people had this simple reminder in their mental toolkit.
One reason for this lack is that CINAC can be surprisingly difficult to grasp. I learned just how difficult when teaching experimental design to nurses, physiotherapists and other assorted groups. They usually understood my favourite example: imagine you are watching at a railway station. More and more people arrive until the platform is crowded, and then — hey presto — along comes a train. Did the people cause the train to arrive (A causes B)? Did the train cause the people to arrive (B causes A)? No, they both depended on a railway timetable (C caused both A and B).
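Blackmore’s railway-platform example can be made concrete with a small simulation. The sketch below is purely illustrative (the “timetable” variable, crowd sizes, and noise levels are all invented numbers, not from any study): a common cause C (minutes until the scheduled departure) drives both A (crowd size on the platform) and B (whether a train arrives), so A and B correlate strongly even though neither causes the other.

```python
import random

random.seed(0)

def simulate(n=1000):
    """Hypothetical scenario: the timetable (C) drives both the crowd (A)
    and the train's arrival (B); A never causes B, nor B, A."""
    crowd, train = [], []
    for _ in range(n):
        minutes_to_departure = random.uniform(0, 30)   # C: the timetable
        # A: the crowd builds as departure approaches (plus some noise)
        crowd.append(30 - minutes_to_departure + random.gauss(0, 3))
        # B: the train shows up only in the last five minutes
        train.append(1 if minutes_to_departure < 5 else 0)
    return crowd, train

def correlation(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

crowd, train = simulate()
print(correlation(crowd, train))  # clearly positive, despite no A->B link
```

Running this prints a strongly positive correlation between crowd size and train arrival, which is exactly the trap CINAC warns against: the data alone cannot distinguish “A causes B” from “C causes both”.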

A Statistically Significant Difference in Understanding the Scientific Process
Diane F. Halpern. Professor, Claremont McKenna College; Past-president, American Psychological Society.

Statistically significant difference — It is a simple phrase that is essential to science and that has become common parlance among educated adults. These three words convey a basic understanding of the scientific process, random events, and the laws of probability. The term appears almost everywhere that research is discussed — in newspaper articles, advertisements for “miracle” diets, research publications, and student laboratory reports, to name just a few of the many diverse contexts where the term is used. It is a shorthand abstraction for a sequence of events that includes an experiment (or other research design), the specification of a null and alternative hypothesis, (numerical) data collection, statistical analysis, and the probability of an unlikely outcome. That is a lot of science conveyed in a few words.

Confabulation
Fiery Cushman. Post-doctoral fellow, Mind/Brain/Behavior Interfaculty Initiative, Harvard University.

We are shockingly ignorant of the causes of our own behavior. The explanations that we provide are sometimes wholly fabricated, and certainly never complete. Yet, that is not how it feels. Instead it feels like we know exactly what we’re doing and why. This is confabulation: Guessing at plausible explanations for our behavior, and then regarding those guesses as introspective certainties. Every year psychologists use dramatic examples to entertain their undergraduate audiences. Confabulation is funny, but there is a serious side, too. Understanding it can help us act better and think better in everyday life.

We are Lost in Thought
Sam Harris. Neuroscientist; Chairman, The Reason Project; Author, Letter to a Christian Nation.

I invite you to pay attention to anything — the sight of this text, the sensation of breathing, the feeling of your body resting against your chair — for a mere sixty seconds without getting distracted by discursive thought. It sounds simple enough: Just pay attention. The truth, however, is that you will find the task impossible. If the lives of your children depended on it, you could not focus on anything — even the feeling of a knife at your throat — for more than a few seconds, before your awareness would be submerged again by the flow of thought. This forced plunge into unreality is a problem. In fact, it is the problem from which every other problem in human life appears to be made.
I am by no means denying the importance of thinking. Linguistic thought is indispensable to us. It is the basis for planning, explicit learning, moral reasoning, and many other capacities that make us human. Thinking is the substance of every social relationship and cultural institution we have. It is also the foundation of science. But our habitual identification with the flow of thought — that is, our failure to recognize thoughts as thoughts, as transient appearances in consciousness — is a primary source of human suffering and confusion.

Knowledge
Mark Pagel. Professor of Evolutionary Biology, Reading University, England, and the Santa Fe Institute.

The Oracle of Delphi famously pronounced Socrates to be “the most intelligent man in the world because he knew that he knew nothing”. Over 2,000 years later the physicist-turned-historian Jacob Bronowski would emphasize, in the last episode of his landmark 1970s television series “The Ascent of Man”, the danger of our all-too-human conceit of thinking we know something. What Socrates knew and what Bronowski had come to appreciate is that knowledge, true knowledge, is difficult, maybe even impossible, to come by; it is prone to misunderstanding and counterfactuals; and, most importantly, it can never be acquired with exact precision. There will always be some element of doubt about anything we come to “know” from our observations of the world.

[div class=attrib]More from theSource here.[end-div]