Tag Archives: perception

Thoughts As Shapes

Jonathan Jackson has a very rare form of an already rare neurological condition. He has synesthesia, a cross-connection of two (or more) unrelated senses in which a perception in one sense causes an automatic experience in another. Some synesthetes, for instance, see sounds or musical notes as distinct colors (chromesthesia); others perceive different words as distinct tastes (lexical-gustatory synesthesia).

Jackson, on the other hand, experiences his thoughts as shapes in a visual mindmap. This is so fascinating I’ve excerpted a short piece of his story below.

Also, if you are further intrigued, I recommend three great reads on the subject: Wednesday Is Indigo Blue: Discovering the Brain of Synesthesia by Richard Cytowic and David M. Eagleman; Musicophilia: Tales of Music and the Brain by Oliver Sacks; and The Man Who Tasted Shapes by Richard Cytowic.

From the Atlantic:

One spring evening in the mid 2000s, Jonathan Jackson and Andy Linscott sat on some seaside rocks near their college campus, smoking the kind of cigarettes reserved for heartbreak. Linscott was, by his own admission, “emotionally spewing” over a girl, and Jackson was consoling him.

Jackson had always been a particularly good listener. But in the middle of their talk, he did something Linscott found deeply odd.

“He got up and jumped over to this much higher rock,” Linscott says. “He was like, ‘Andy, I’m listening, I just want to get a different angle. I want to see what you’re saying and the shape of your words from a different perspective.’ I was baffled.”

For Jackson, moving physically to think differently about an idea seemed totally natural. “People say, ‘Okay, we need to think about this from a new angle’ all the time!” he says. “But for me that’s literal.”

Jackson has synesthesia, a neurological phenomenon that has long been defined as the co-activation of two or more conventionally unrelated senses. Some synesthetes see music (known as auditory-visual synesthesia) or read letters and numbers in specific hues (grapheme-color synesthesia). But recent research has complicated that definition, exploring where in the sensory process those overlaps start and opening up the term to include types of synesthesia in which senses interact in a much more complex manner.

Read the entire story here.

Image: Wednesday Is Indigo Blue, book cover. Courtesy of Richard E. Cytowic and David M. Eagleman, MIT Press.

Paper is the Next Big Thing

Luddites and technophobes rejoice: paper-bound books may be with us for quite some time. And there may be genuinely scientific reasons why physical books will endure. Recent research shows that people learn more effectively when reading from paper than from its digital offspring.

From Wired:

Paper books were supposed to be dead by now. For years, information theorists, marketers, and early adopters have told us their demise was imminent. Ikea even redesigned a bookshelf to hold something other than books. Yet in a world of screen ubiquity, many people still prefer to do their serious reading on paper.

Count me among them. When I need to read deeply—when I want to lose myself in a story or an intellectual journey, when focus and comprehension are paramount—I still turn to paper. Something just feels fundamentally richer about reading on it. And researchers are starting to think there’s something to this feeling.

To those who see dead tree editions as successors to scrolls and clay tablets in history’s remainder bin, this might seem like literary Luddism. But I e-read often: when I need to copy text for research or don’t want to carry a small library with me. There’s something especially delicious about late-night sci-fi by the light of a Kindle Paperwhite.

What I’ve read on screen seems slippery, though. When I later recall it, the text is slightly translucent in my mind’s eye. It’s as if my brain better absorbs what’s presented on paper. Pixels just don’t seem to stick. And often I’ve found myself wondering, why might that be?

The usual explanation is that internet devices foster distraction, or that my late-thirty-something brain isn’t that of a true digital native, accustomed to screens since infancy. But I have the same feeling when I am reading a screen that’s not connected to the internet and Twitter or online Boggle can’t get in the way. And research finds that kids these days consistently prefer their textbooks in print rather than pixels. Whatever the answer, it’s not just about habit.

Another explanation, expressed in a recent Washington Post article on the decline of deep reading, blames a sweeping change in our lifestyles: We’re all so multitasked and attention-fragmented that our brains are losing the ability to focus on long, linear texts. I certainly feel this way, but if I don’t read deeply as often or easily as I used to, it does still happen. It just doesn’t happen on screen, and not even on devices designed specifically for that experience.

Maybe it’s time to start thinking of paper and screens another way: not as an old technology and its inevitable replacement, but as different and complementary interfaces, each stimulating particular modes of thinking. Maybe paper is a technology uniquely suited for imbibing novels and essays and complex narratives, just as screens are for browsing and scanning.

“Reading is human-technology interaction,” says literacy professor Anne Mangen of Norway’s University of Stavanger. “Perhaps the tactility and physical permanence of paper yields a different cognitive and emotional experience.” This is especially true, she says, for “reading that can’t be done in snippets, scanning here and there, but requires sustained attention.”

Mangen is among a small group of researchers who study how people read on different media. It’s a field that goes back several decades, but yields no easy conclusions. People tended to read slowly and somewhat inaccurately on early screens. The technology, particularly e-paper, has improved dramatically, to the point where speed and accuracy aren’t now problems, but deeper issues of memory and comprehension are not yet well-characterized.

Complicating the scientific story further, there are many types of reading. Most experiments involve short passages read by students in an academic setting, and for this sort of reading, some studies have found no obvious differences between screens and paper. Those don’t necessarily capture the dynamics of deep reading, though, and nobody’s yet run the sort of experiment, involving thousands of readers in real-world conditions who are tracked for years on a battery of cognitive and psychological measures, that might fully illuminate the matter.

In the meantime, other research does suggest possible differences. A 2004 study found that students more fully remembered what they’d read on paper. Those results were echoed by an experiment that looked specifically at e-books, and another by psychologist Erik Wästlund at Sweden’s Karlstad University, who found that students learned better when reading from paper.

Wästlund followed up that study with one designed to investigate screen reading dynamics in more detail. He presented students with a variety of on-screen document formats. The most influential factor, he found, was whether they could see pages in their entirety. When they had to scroll, their performance suffered.

According to Wästlund, scrolling had two impacts, the most basic being distraction. Even the slight effort required to drag a mouse or swipe a finger requires a small but significant investment of attention, one that’s higher than flipping a page. Text flowing up and down a page also disrupts a reader’s visual attention, forcing eyes to search for a new starting point and re-focus.

Read the entire electronic article here.

Image: Leicester or Hammer Codex, by Leonardo da Vinci (1452-1519). Courtesy of Wikipedia / Public domain.

 

You Are Middle-Aged

If you are losing touch with new technology, growing increasingly hairy — in all the wrong places — and increasingly detest noisy environments, then you are middle-aged. Notably, many now characterize the middle-aged years as 44-60. And, of course, if you continually misplace your glasses or feed the neighborhood birds more often, though you are still younger than 44, then you may just be acting middle-aged. Read on for some more telltale signs of your imminent demise.

From the Washington Post:

How do you know you’re middle-aged? How about when you wear clothes and shoes based on comfort rather than style, or grow hair in all the wrong places: nose, ears, eyebrows? Those are just two of the signs mentioned in a recent British survey about when middle age begins and how to identify it.

The 2,000 people surveyed by Benenden, a health-care and insurance firm, also made clear that middle age was no longer something for 30- or 40-year-olds to worry about. The life change, they said, began at 53. In fact, nearly half of the older-than-50s who were surveyed said they personally had not experienced “middle age” yet.

“A variety of factors — including more active lifestyles and healthier living — mean that people find their attitudes towards getting older are changing. Over half of the people surveyed didn’t feel that there even was such a thing as ‘middle age’ anymore,” Paul Keenan, head of communications at Benenden Health, said in a statement when the survey was released in August.

“Being ‘old’ appears to be a state of mind rather than being a specific age,” he added. “People no longer see ‘middle age’ as a numerical milestone and don’t tend to think of themselves as ‘old’ as they hit their fifties and beyond. I’m 54 myself, with the mind-set of a thirty-something — perhaps sometimes even that of a teenager!”

So beyond comfort shoes and ear hair, what are some signs that you’re no longer young? Here’s the full list offered up by respondents to the survey. Some are particularly British (e.g., joining the National Trust, taking a flask of tea on a day out). But you’ll get the point.

Losing touch with everyday technology such as tablets and TVs

Finding you have no idea what “young people” are talking about

Feeling stiff

Needing an afternoon nap

Groaning when you bend down

Not remembering the name of any modern bands

Talking a lot about your joints/ailments

Hating noisy pubs

Getting more hairy — ears, eyebrows, nose, face, etc.

Thinking policemen/teachers/doctors look really young

Preferring a night in with a board game to a night on the town

You don’t know any songs in the top 10

Choosing clothes and shoes for comfort rather than style

Taking a flask of tea on a day out

Obsessive gardening or bird feeding

Thinking there is nothing wrong with wearing an anorak

Forgetting people’s names

Booking a cruise

Misplacing your glasses, bag, car keys, etc.

Complaining about the rubbish on television these days

Gasping for a cup of tea

Getting bed socks for Christmas and being very grateful

Taking a keen interest in “The Antiques Road Show”

When you start complaining about more things

Listening to the Archers

You move from Radio 1 to Radio 2

Joining the National Trust

Being told off for politically incorrect opinions

Flogging the family car for something sportier

When you can’t lose six pounds in two days anymore

You get shocked by how racy music videos are

Taking a keen interest in the garden

Buying travel sweets for the car

Considering going on a “no children” cruise for a holiday

When you know your alcohol limit

Obsessively recycling/ knowing the collection dates

Always carrying a handy pack of tissues

Falling asleep after one glass of wine

Spending more money on face creams/anti-aging products

Preferring a Sunday walk to a lie-in

By comparison to those who participated in the British survey, Americans have a different take on when middle age begins, at least according to a paper published in 2011 by researchers at Florida State University. That study, which used nationally representative data collected in 1995-1996 and 2004-2006, showed that the perceived beginning of middle age varied, not surprisingly, depending on the age group that was providing the estimate. Overall, the researchers said, most people think of middle age as beginning at 44 and ending at 60.

Read the entire article here.

Image courtesy of Google Search.

 

Happy Listening to Sad Music

Why do we listen to sad music, and how is it that sad music can be as attractive as its lighter, happier cousin? After all, we tend to steer clear of sad situations. New research suggests that the answer is more complex than a desire for catharsis: rather, there is a disconnect between the felt emotion and the perceived emotion.

From the New York Times:

Sadness is an emotion we usually try to avoid. So why do we choose to listen to sad music?

Musicologists and philosophers have wondered about this. Sad music can induce intense emotions, yet the type of sadness evoked by music also seems pleasing in its own way. Why? Aristotle famously suggested the idea of catharsis: that by overwhelming us with an undesirable emotion, music (or drama) somehow purges us of it.

But what if, despite their apparent similarity, sadness in the realm of artistic appreciation is not the same thing as sadness in everyday life?

In a study published this summer in the journal Frontiers in Psychology, my colleagues and I explored the idea that “musical emotion” encompasses both the felt emotion that the music induces in the listener and the perceived emotion that the listener judges the music to express. By isolating these two overlapping sets of emotions and observing how they related to each other, we hoped to gain a better understanding of sad music.

Forty-four people served as participants in our experiment. We asked them to listen to one of three musical excerpts of approximately 30 seconds each. The excerpts were from Mikhail Glinka’s “La Séparation” (F minor), Felix Blumenfeld’s “Sur Mer” (G minor) and Enrique Granados’s “Allegro de Concierto” (C sharp major, though the excerpt was in G major, which we transposed to G minor).

We were interested in the minor key because it is canonically associated with sad music, and we steered clear of well-known compositions to avoid interference from any personal memories related to the pieces.

(Our participants were more or less split between men and women, as well as between musicians and nonmusicians, though these divisions turned out to be immaterial to our findings.)

A participant would listen to an excerpt and then answer a question about his felt emotions: “How did you feel when listening to this music?” Then he would listen to a “happy” version of the excerpt — i.e., transposed into the major key — and answer the same question. Next he would listen to the excerpt, again in both sad and happy versions, each time answering a question about other listeners that was designed to elicit perceived emotion: “How would normal people feel when listening to this music?”

(This is a slight simplification: in the actual study, the order in which the participant answered questions about felt and perceived emotion, and listened to sad and happy excerpts, varied from participant to participant.)

Our participants answered each question by rating 62 emotion-related descriptive words and phrases — from happy to sad, from bouncy to solemn, from heroic to wistful — on a scale from 0 (not at all) to 4 (very much).

We found, as anticipated, that felt emotion did not correspond exactly to perceived emotion. Although the sad music was both perceived and felt as “tragic” (e.g., gloomy, meditative and miserable), the listeners did not actually feel the tragic emotion as much as they perceived it. Likewise, when listening to sad music, the listeners felt more “romantic” emotion (e.g., fascinated, dear and in love) and “blithe” emotion (e.g., merry, animated and feel like dancing) than they perceived.
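
For the programmatically inclined, here is a minimal sketch of the felt-versus-perceived comparison, using invented numbers rather than the study’s data: each emotion word gets a 0-4 rating under the “How did you feel?” question and under the “How would normal people feel?” question, and the two ratings are then compared word by word.

```python
# Illustrative sketch only; the ratings below are invented, not from the study.
felt = {"gloomy": 1.2, "miserable": 0.9, "fascinated": 2.8, "merry": 1.6}
perceived = {"gloomy": 2.6, "miserable": 2.1, "fascinated": 2.0, "merry": 0.8}

for word in felt:
    diff = felt[word] - perceived[word]
    label = "felt more than perceived" if diff > 0 else "perceived more than felt"
    print(f"{word:<11} felt={felt[word]:.1f}  perceived={perceived[word]:.1f}  ({label})")
```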

Read the entire article here.

Image: Detail of Marie-Magdalene, Entombment of Christ, 1672. Courtesy of Wikipedia.

Heavenly Light or Neuronal Hallucination

Many who have survived near-death experiences recount approaching a distant light, as if closing in on the exit from a dark tunnel. Is it a heavenly light beckoning us towards the eternal afterlife in paradise? Perhaps there is a simpler, scientific explanation.

From the Washington Post:

It’s called a near-death experience, but the emphasis is on “near.” The heart stops, you feel yourself float up and out of your body. You glide toward the entrance of a tunnel, and a searing bright light envelops your field of vision.

It could be the afterlife, as many people who have come close to dying have asserted. But a new study says it might well be a show created by the brain, which is still very much alive. When the heart stops, neurons in the brain appeared to communicate at an even higher level than normal, perhaps setting off the last picture show, packed with special effects.

“A lot of people believed that what they saw was heaven,” said lead researcher and neurologist Jimo Borjigin. “Science hadn’t given them a convincing alternative.”

Scientists from the University of Michigan recorded electroencephalogram (EEG) signals in nine anesthetized rats after inducing cardiac arrest. Within the first 30 seconds after the heart had stopped, all the mammals displayed a surge of highly synchronized brain activity that had features associated with consciousness and visual activation. The burst of electrical patterns even exceeded levels seen during a normal, awake state.

In other words, they may have been having the rodent version of a near-death experience.

“On a fundamental level, this study makes us think about the neurobiology of the dying brain,” said senior author and anesthesiologist George A. Mashour. It was published Monday online by the Proceedings of the National Academy of Sciences.

Near-death experiences have been reported by many who have faced death, worldwide and across cultures. About 20 percent of cardiac arrest survivors report visions or perceptions during clinical death, with features such as a bright light, life playback or an out-of-body feeling.

“There’s hundreds of thousands of people reporting these experiences,” Borjigin said. “If that experience comes from the brain, there has to be a fingerprint of that.”

An unanswered question from a previous experiment set her down the path of exploring the phenomenon. In 2007, Borjigin had been monitoring neurotransmitter secretion in rats when, in the middle of the night, two of her animals unexpectedly died. Upon reviewing the overnight data, she saw several unknown peaks near the time of death.

This got her thinking: What kinds of changes does the brain go through at the moment of death?

Then last year, Borjigin turned to Mashour, a colleague with expertise in EEG and consciousness, for help conducting the first experiment to systematically investigate the brain after cardiac arrest. EEG uses electrodes to measure voltage fluctuations in the brain caused by many neurons firing at once. A normal, awake brain should show spikes depending on what types of processing are going on; in a completely dead brain, it flat-lines.

When the heart suddenly stops, blood flow to the brain stops and causes death in a human within minutes. A likely assumption would be that, without a fresh supply of oxygen, any sort of brain activity would go flat. But after the rats went into cardiac arrest, Mashour and his colleagues observed the opposite happening.

Read the entire article here.

Image courtesy of Discovery.

Science and Art of the Brain

Nobel laureate and professor of brain science Eric Kandel describes how our perception of art can help us define a better functional map of the mind.

From the New York Times:

This month, President Obama unveiled a breathtakingly ambitious initiative to map the human brain, the ultimate goal of which is to understand the workings of the human mind in biological terms.

Many of the insights that have brought us to this point arose from the merger over the past 50 years of cognitive psychology, the science of mind, and neuroscience, the science of the brain. The discipline that has emerged now seeks to understand the human mind as a set of functions carried out by the brain.

This new approach to the science of mind not only promises to offer a deeper understanding of what makes us who we are, but also opens dialogues with other areas of study — conversations that may help make science part of our common cultural experience.

Consider what we can learn about the mind by examining how we view figurative art. In a recently published book, I tried to explore this question by focusing on portraiture, because we are now beginning to understand how our brains respond to the facial expressions and bodily postures of others.

The portraiture that flourished in Vienna at the turn of the 20th century is a good place to start. Not only does this modernist school hold a prominent place in the history of art, it consists of just three major artists — Gustav Klimt, Oskar Kokoschka and Egon Schiele — which makes it easier to study in depth.

As a group, these artists sought to depict the unconscious, instinctual strivings of the people in their portraits, but each painter developed a distinctive way of using facial expressions and hand and body gestures to communicate those mental processes.

Their efforts to get at the truth beneath the appearance of an individual both paralleled and were influenced by similar efforts at the time in the fields of biology and psychoanalysis. Thus the portraits of the modernists in the period known as “Vienna 1900” offer a great example of how artistic, psychological and scientific insights can enrich one another.

The idea that truth lies beneath the surface derives from Carl von Rokitansky, a gifted pathologist who was dean of the Vienna School of Medicine in the middle of the 19th century. Baron von Rokitansky compared what his clinician colleague Josef Skoda heard and saw at the bedsides of his patients with autopsy findings after their deaths. This systematic correlation of clinical and pathological findings taught them that only by going deep below the skin could they understand the nature of illness.

This same notion — that truth is hidden below the surface — was soon steeped in the thinking of Sigmund Freud, who trained at the Vienna School of Medicine in the Rokitansky era and who used psychoanalysis to delve beneath the conscious minds of his patients and reveal their inner feelings. That, too, is what the Austrian modernist painters did in their portraits.

Klimt’s drawings display a nuanced intuition of female sexuality and convey his understanding of sexuality’s link with aggression, picking up on things that even Freud missed. Kokoschka and Schiele grasped the idea that insight into another begins with understanding of oneself. In honest self-portraits with his lover Alma Mahler, Kokoschka captured himself as hopelessly anxious, certain that he would be rejected — which he was. Schiele, the youngest of the group, revealed his vulnerability more deeply, rendering himself, often nude and exposed, as subject to the existential crises of modern life.

Such real-world collisions of artistic, medical and biological modes of thought raise the question: How can art and science be brought together?

Alois Riegl, of the Vienna School of Art History in 1900, was the first to truly address this question. He understood that art is incomplete without the perceptual and emotional involvement of the viewer. Not only does the viewer collaborate with the artist in transforming a two-dimensional likeness on a canvas into a three-dimensional depiction of the world, the viewer interprets what he or she sees on the canvas in personal terms, thereby adding meaning to the picture. Riegl called this phenomenon the “beholder’s involvement” or the “beholder’s share.”

Art history was now aligned with psychology. Ernst Kris and Ernst Gombrich, two of Riegl’s disciples, argued that a work of art is inherently ambiguous and therefore that each person who sees it has a different interpretation. In essence, the beholder recapitulates in his or her own brain the artist’s creative steps.

This insight implied that the brain is a creativity machine, which obtains incomplete information from the outside world and completes it. We can see this with illusions and ambiguous figures that trick our brain into thinking that we see things that are not there. In this sense, a task of figurative painting is to convince the beholder that an illusion is true.

Some of this creative process is determined by the way the structure of our brain develops, which is why we all see the world in pretty much the same way. However, our brains also have differences that are determined in part by our individual experiences.

Read the entire article following the jump.

Yourself, The Illusion

A growing body of evidence suggests that our brains live in the future and construct explanations for the past, and that our notion of the present is an entirely fictitious concoction. On the surface this makes our lives seem like nothing more than a construction taken straight out of The Matrix movies. However, while we may not be pawns in an illusion constructed by malevolent aliens, our perception of “self” does appear to be illusory. As researchers delve deeper into the inner workings of the brain, it becomes clearer that our conscious selves are a beautifully derived narrative, built by the brain to make sense of the past and prepare for our future actions.

[div class=attrib]From the New Scientist:[end-div]

It seems obvious that we exist in the present. The past is gone and the future has not yet happened, so where else could we be? But perhaps we should not be so certain.

Sensory information reaches us at different speeds, yet appears unified as one moment. Nerve signals need time to be transmitted and time to be processed by the brain. And there are events – such as a light flashing, or someone snapping their fingers – that take less time to occur than our system needs to process them. By the time we become aware of the flash or the finger-snap, it is already history.

Our experience of the world resembles a television broadcast with a time lag; conscious perception is not “live”. This on its own might not be too much cause for concern, but in the same way the TV time lag makes last-minute censorship possible, our brain, rather than showing us what happened a moment ago, sometimes constructs a present that has never actually happened.

Evidence for this can be found in the “flash-lag” illusion. In one version, a screen displays a rotating disc with an arrow on it, pointing outwards (see “Now you see it…”). Next to the disc is a spot of light that is programmed to flash at the exact moment the spinning arrow passes it. Yet this is not what we perceive. Instead, the flash lags behind, apparently occurring after the arrow has passed.

One explanation is that our brain extrapolates into the future. Visual stimuli take time to process, so the brain compensates by predicting where the arrow will be. The static flash – which it can’t anticipate – seems to lag behind.
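
As a rough, hypothetical sketch, here is what that extrapolation account would predict: the arrow’s position is projected forward by the brain’s processing delay while the unpredictable flash is not, so the arrow appears ahead of the flash by an angle proportional to the delay. The speed, flash position and delay below are our own illustration values, not figures from the experiments, and as the next paragraph explains, this tidy account turns out to be wrong.

```python
# Sketch of the extrapolation account of the flash-lag effect described above.
# Speed, flash position and delay are assumed illustration values, not figures
# from the experiments in the article.
import math

omega = 2 * math.pi            # arrow speed: one revolution per second (rad/s)
flash_angle = math.pi / 2      # where the stationary flash sits (radians)
delay = 0.08                   # assumed visual processing delay (seconds)

t_flash = flash_angle / omega                   # moment of physical alignment
arrow_extrapolated = omega * (t_flash + delay)  # brain projects the arrow forward
lag = math.degrees(arrow_extrapolated - flash_angle)
print(f"predicted lag: arrow ~{lag:.0f} degrees past the flash")  # ~29 degrees
```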

Neat as this explanation is, it cannot be right, as was shown by a variant of the illusion designed by David Eagleman of the Baylor College of Medicine in Houston, Texas, and Terrence Sejnowski of the Salk Institute for Biological Studies in La Jolla, California.

If the brain were predicting the spinning arrow’s trajectory, people would see the lag even if the arrow stopped at the exact moment it was pointing at the spot. But in this case the lag does not occur. What’s more, if the arrow starts stationary and moves in either direction immediately after the flash, the movement is perceived before the flash. How can the brain predict the direction of movement if it doesn’t start until after the flash?

The explanation is that rather than extrapolating into the future, our brain is interpolating events in the past, assembling a story of what happened retrospectively (Science, vol 287, p 2036). The perception of what is happening at the moment of the flash is determined by what happens to the disc after it. This seems paradoxical, but other tests have confirmed that what is perceived to have occurred at a certain time can be influenced by what happens later.

All of this is slightly worrying if we hold on to the common-sense view that our selves are placed in the present. If the moment in time we are supposed to be inhabiting turns out to be a mere construction, the same is likely to be true of the self existing in that present.

[div class=attrib]Read the entire article after the jump.[end-div]

Map as Illusion

We love maps here at theDiagonal. We also love ideas that challenge the status quo. And this latest Strange Map, courtesy of Frank Jacobs over at Big Think, does both. What we appreciate about his cartographic masterpiece is that it challenges our visual perception and, more importantly, our assumed hemispheric worldview.

[div class=attrib]Read more of this article after the jump.[end-div]

The Beauty of Ugliness

The endless pursuit of beauty in human affairs probably pre-dates our historical record. We certainly know that ancient Egyptians used cosmetics believing them to offer magical and religious powers, in addition to aesthetic value.

Yet, paradoxically, beauty is rather subjective and often fleeting. The French singer, songwriter, composer and bon viveur Serge Gainsbourg once said that “ugliness is superior to beauty because it lasts longer”. Author Stephen Bayley argues in his new book “Ugly: The Aesthetics of Everything” that beauty is downright boring.

[div class=attrib]From the Telegraph:[end-div]

Beauty is boring. And the evidence is piling up. An article in the journal Psychological Science now confirms what partygoers have known forever: that beauty and charm are no more directly linked than a high IQ and a talent for whistling.

A group of scientists set out to discover whether physically attractive people also have appealing character traits and values, and found, according to Lihi Segal-Caspi, who carried out part of the research, that “beautiful people tend to focus more on conformity and self-promotion than independence and tolerance”.

Certainly, while a room full of beautiful people might be impressively stiff with the whiff of Chanel No 5, the intellectual atmosphere will be carrying a very low charge. If positive at all.

The grizzled and gargoyle-like Parisian chanteur, and legendary lover, Serge Gainsbourg always used to pick up the ugliest girls at parties. This was not simply because predatory male folklore insists that ill-favoured women will be more “grateful”, but because Gainsbourg, a stylish contrarian, knew that the conversation would be better, the uglier the girl.

Beauty is a conformist conspiracy. And the conspirators include the fashion, cosmetics and movie businesses: a terrible Greek chorus of brainless idolatry towards abstract form. The conspirators insist that women – and, nowadays, men, too – should be un-creased, smooth, fat-free, tanned and, with the exception of the skull, hairless. Flawlessly dull. Even Hollywood once acknowledged the weakness of this proposition: Marilyn Monroe was made more attractive still by the addition of a “beauty spot”, a blemish turned into an asset.

The red carpet version of beauty is a feeble, temporary construction. Bodies corrode and erode, sag and bulge, just as cars rust and buildings develop a fine patina over time. This is not to be feared, rather to be understood and enjoyed. Anyone wishing to arrest these processes with the aid of surgery, aerosols, paint, glue, drugs, tape and Lycra must be both very stupid and very vain. Hence the problems encountered in conversation with beautiful people: stupidity and vanity rarely contribute much to wit and creativity.

Fine features may be all very well, but the great tragedy of beauty is that it is so ephemeral. Albert Camus said it “drives us to despair, offering for a minute the glimpse of an eternity that we should like to stretch out over the whole of time”. And Gainsbourg agreed when he said: “Ugliness is superior to beauty because it lasts longer.” A hegemony of beautiful perfection would be intolerable: we need a good measure of ugliness to keep our senses keen. If everything were beautiful, nothing would be.

And yet, despite the evidence against, there has been a conviction that beauty and goodness are somehow inextricably and permanently linked. Political propaganda exploited our primitive fear of ugliness, so we had Second World War American posters of Japanese looking like vampire bats. The Greeks believed that beauty had a moral character: beautiful people – discus-throwers and so on – were necessarily good people. Darwin explained our need for “beauty” in saying that breeding attractive children is a survival characteristic: I may feel the need to fuse my premium genetic material with yours, so that humanity continues in the same fine style.

This became a lazy consensus, described as the “beauty premium” by US economists Markus M Mobius and Tanya S Rosenblat. The “beauty premium” insists that as attractive children grow into attractive adults, they may find it easier to develop agreeable interpersonal communications skills because their audience reacts more favourably to them. In this beauty-related employment theory, short people are less likely to get a good job. As Randy Newman sang: “Short people got no reason to live.” So Darwin’s argument that evolutionary forces favour a certain physical type may be proven in the job market as well as the wider world.

But as soon as you try to grasp the concept of beauty, it disappears.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image courtesy of Google search.[end-div]

The Benefits and Beauty of Blue

[div class=attrib]From the New York Times:[end-div]

For the French Fauvist painter and color gourmand Raoul Dufy, blue was the only color with enough strength of character to remain blue “in all its tones.” Darkened red looks brown and whitened red turns pink, Dufy said, while yellow blackens with shading and fades away in the light. But blue can be brightened or dimmed, the artist said, and “it will always stay blue.”

Scientists, too, have lately been bullish on blue, captivated by its optical purity, complexity and metaphorical fluency. They’re exploring the physics and chemistry of blueness in nature, the evolution of blue ornaments and blue come-ons, and the sheer brazenness of being blue when most earthly life forms opt for earthy raiments of beige, ruddy or taupe.

One research team recently reported the structural analysis of a small, dazzlingly blue fruit from the African Pollia condensata plant that may well be the brightest terrestrial object in nature. Another group working in the central Congo basin announced the discovery of a new species of monkey, a rare event in mammalogy. Rarer still is the noteworthiest trait of the monkey, called the lesula: a patch of brilliant blue skin on the male’s buttocks and scrotal area that stands out from the surrounding fur like neon underpants.

Still other researchers are tracing the history of blue pigments in human culture, and the role those pigments have played in shaping our notions of virtue, authority, divinity and social class. “Blue pigments played an outstanding role in human development,” said Heinz Berke, an emeritus professor of chemistry at the University of Zurich. For some cultures, he said, they were as valuable as gold.

As a raft of surveys has shown, blue love is a global affair. Ask people their favorite color, and in most parts of the world roughly half will say blue, a figure three to four times the support accorded common second-place finishers like purple or green. Just one in six Americans is blue-eyed, but nearly one in two consider blue the prettiest eye color, which could be why some 50 percent of tinted contact lenses sold are the kind that make your brown eyes blue.

Sick children like their caretakers in blue: A recent study at the Cleveland Clinic found that young patients preferred nurses wearing blue uniforms to those in white or yellow. And am I the only person in the United States who doesn’t own a single pair of those permanently popular pants formerly known as dungarees?

“For Americans, bluejeans have a special connotation because of their association with the Old West and rugged individualism,” said Steven Bleicher, author of “Contemporary Color: Theory and Use.” The jeans take their John Wayne reputation seriously. “Because the indigo dye fades during washing, everyone’s blue becomes uniquely different,” said Dr. Bleicher, a professor of visual arts at Coastal Carolina University. “They’re your bluejeans.”

According to psychologists who explore the complex interplay of color, mood and behavior, blue’s basic emotional valence is calmness and open-endedness, in contrast to the aggressive specificity associated with red. Blue is sea and sky, a pocket-size vacation.

In a study that appeared in the journal Perceptual & Motor Skills, researchers at Aichi University in Japan found that subjects who performed a lengthy video game exercise while sitting next to a blue partition reported feeling less fatigued and claustrophobic, and displayed a more regular heart beat pattern, than did people who sat by red or yellow partitions.

In the journal Science, researchers at the University of British Columbia described their study of how computer screen color affected participants’ ability to solve either creative problems — for example, determining the word that best unifies the terms “shelf,” “read” and “end” (answer: book) — or detail-oriented tasks like copy editing. The researchers found that blue screens were superior to red or white backgrounds at enhancing creativity, while red screens worked best for accuracy tasks. Interestingly, when participants were asked to predict which screen color would improve performance on the two categories of problems, big majorities deemed blue the ideal desktop setting for both.

But skies have their limits, and blue can also imply coldness, sorrow and death. On learning of a good friend’s suicide in 1901, Pablo Picasso fell into a severe depression, and he began painting images of beggars, drunks, the poor and the halt, all famously rendered in a palette of blue.

The provenance of using “the blues” to mean sadness isn’t clear, but L. Elizabeth Crawford, a professor of psychology at the University of Richmond in Virginia, suggested that the association arose from the look of the body when it’s in a low energy, low oxygen state. “The lips turn blue, there’s a blue pallor to the complexion,” she said. “It’s the opposite of the warm flushing of the skin that we associate with love, kindness and affection.”

Blue is also known to suppress the appetite, possibly as an adaptation against eating rotten meat, which can have a bluish tinge. “If you’re on a diet, my advice is, take the white bulb out of the refrigerator and put in a blue one instead,” Dr. Bleicher said. “A blue glow makes food look very unappetizing.”

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: Morpho didius, dorsal view of male butterfly. Courtesy of Wikipedia.[end-div]

Women See Bodies; Men See Body Parts

Yet another research study of gender differences shows some fascinating variation in the way men and women see and process their perceptions of others. Men tend to be perceived as a whole; women, on the other hand, are more likely to be perceived as parts.

[div class=attrib]From Scientific American:[end-div]

A glimpse at the magazine rack in any supermarket checkout line will tell you that women are frequently the focus of sexual objectification. Now, new research finds that the brain actually processes images of women differently than those of men, contributing to this trend.

Women are more likely to be picked apart by the brain and seen as parts rather than a whole, according to research published online June 29 in the European Journal of Social Psychology. Men, on the other hand, are processed as a whole rather than the sum of their parts.

“Everyday, ordinary women are being reduced to their sexual body parts,” said study author Sarah Gervais, a psychologist at the University of Nebraska, Lincoln. “This isn’t just something that supermodels or porn stars have to deal with.”

Objectification hurts
Numerous studies have found that feeling objectified is bad for women. Being ogled can make women do worse on math tests, and self-sexualization, or scrutiny of one’s own shape, is linked to body shame, eating disorders and poor mood.

But those findings have all focused on the perception of being sexualized or objectified, Gervais told LiveScience. She and her colleagues wondered about the eye of the beholder: Are people really objectifying women more than men?

To find out, the researchers focused on two types of mental processing, global and local. Global processing is how the brain identifies objects as a whole. It tends to be used when recognizing people, where it’s not just important to know the shape of the nose, for example, but also how the nose sits in relation to the eyes and mouth. Local processing focuses more on the individual parts of an object. You might recognize a house by its door alone, for instance, while you’re less likely to recognize a person’s arm without the benefit of seeing the rest of their body.

If women are sexually objectified, people should process their bodies in a more local way, focusing on individual body parts like breasts. To test the idea, Gervais and her colleagues carried out two nearly identical experiments with a total of 227 undergraduate participants. Each person was shown non-sexualized photographs, each of either a young man or young woman, 48 in total. After seeing each original full-body image, the participants saw two side-by-side photographs. One was the original image, while the other was the original with a slight alteration to the chest or waist (chosen because these are sexualized body parts). Participants had to pick which image they’d seen before.

In some cases, the second set of photos zoomed in on the chest or waist only, asking participants to pick the body part they’d seen previously versus the one that had been altered.
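
To make the design concrete, here is a minimal sketch of the two trial types, whole-body versus part-only recognition, with the chest or waist slightly altered in the foil image. The field names and the random assignment are our own illustration, not the authors’ materials.

```python
# Illustrative sketch of the recognition-trial structure described above.
# Field names and random assignment are our own, not the study's materials.
import random

def make_trial(photo_id, target_sex, rng):
    return {
        "photo_id": photo_id,
        "target_sex": target_sex,                        # "woman" or "man"
        "altered_part": rng.choice(["chest", "waist"]),  # part modified in the foil
        "presentation": rng.choice(["whole_body", "part_only"]),
    }

rng = random.Random(1)
trials = [make_trial(i, sex, rng) for i, sex in enumerate(["woman", "man"] * 24)]
print(len(trials), trials[0])   # 48 trials in total
```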

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: People focus on the parts of a woman’s body when processing her image, according to research published in June in the European Journal of Social Psychology. Courtesy of LiveScience / Yuri Arcurs, Shutterstock.[end-div]

Time Flows Uphill

Many people in industrialized countries often describe time as flowing like a river: it flows back into the past, and it flows forward into the future. Of course, for bored workers time sometimes stands still, while for kids on summer vacation time flows all too quickly. And, for many people over, say, the age of forty, days often drag, but the years fly by.

For some, however, time flows uphill into the future and downhill into the past.

[div class=attrib]From New Scientist:[end-div]

“HERE and now”, “Back in the 1950s”, “Going forward”… Western languages are full of spatial metaphors for time, and whether you are, say, British, French or German, you no doubt think of the past as behind you and the future as stretching out ahead. Time is a straight line that runs through your body.

Once thought to be universal, this “embodied cognition of time” is in fact strictly cultural. Over the past decade, encounters with various remote tribal societies have revealed a rich diversity of the ways in which humans relate to time (see “Attitudes across the latitudes”). The latest, coming from the Yupno people of Papua New Guinea, is perhaps the most remarkable. Time for the Yupno flows uphill and is not even linear.

Rafael Núñez of the University of California, San Diego, led his team into the Finisterre mountain range of north-east Papua New Guinea to study the Yupno living in the village of Gua. There are no roads in this remote region. The Yupno have no electricity or even domestic animals to work the land. They live with very little contact with the western world.

Núñez and his colleagues noticed that the tribespeople made spontaneous gestures when speaking about the past, present and future. They filmed and analysed the gestures and found that for the Yupno the past is always downhill, in the direction of the mouth of the local river. The future, meanwhile, is towards the river’s source, which lies uphill from Gua.

This was true regardless of the direction they were facing. For instance, if they were facing downhill when talking about the future, a person would gesture backwards up the slope. But when they turned around to face uphill, they pointed forwards.

Núñez thinks the explanation is historical. The Yupno’s ancestors arrived by sea and climbed up the 2500-metre-high mountain valley, so lowlands may represent the past, and time flows uphill.

But the most unusual aspect of the Yupno timeline is its shape. The village of Gua, the river’s source and its mouth do not lie in a straight line, so the timeline is kinked. “This is the first time ever that a culture has been documented to have everyday notions of time anchored in topographic properties,” says Núñez.

Within the dark confines of their homes, geographical landmarks disappear and the timeline appears to straighten out somewhat. The Yupno always point towards the doorway when talking about the past, and away from the door to indicate the future, regardless of their home’s orientation. That could be because entrances are always raised, says Núñez. You have to climb down – towards the past – to leave the house, so each home has its own timeline.

[div class=attrib]Read the entire article following the jump.[end-div]

[div class=attrib]Image: The Persistence of Memory, by Salvador Dalí. Courtesy of Salvador Dalí, Gala-Salvador Dalí Foundation / Artists Rights Society (ARS), Museum of Modern Art New York / Wikipedia.[end-div]

Our Perception of Time

[div class=attrib]From Evolutionary Philosophy:[end-div]

We have learned to see time as if it appears in chunks – minutes, hours, days, and years. But if time comes in chunks, how do we experience past memories in the present? How does the previous moment’s chunk of time connect to the chunk of the present moment?

Wait a minute. It will take an hour. He is five years old. These are all sentences that contain expressions of units of time. We are all tremendously comfortable with the idea that time comes in discrete units – but does it? William James and Charles Sanders Peirce thought not.

If moments of time were truly discrete, separate units lined up like dominoes in a row, how would it be possible to have a memory of a past event? What connects the present moment to all the past moments that have already gone by?

One answer to the question is to suppose the existence of a transcendental self. That means some self that exists over and above our experience and can connect all the moments together for us. Imagine moments in time that stick together like boxcars of a train. If you are in one boxcar – i.e. inside the present moment – how could you possibly know anything about the boxcar behind you – i.e. the moment past? The only way would be to see from outside of your boxcar – you would at least stick your head out of the window to see the boxcar behind you.

If the boxcar represents your experience of the present moment then we are saying that you would have to leave the present moment at least a little bit to be able to see what happened in the moment behind you. How can you leave the present moment? Where do you go if you leave your experience of the present moment? Where is the space that you exist in when you are outside of your experience? It would have to be a space that transcended your experience – a transcendental space outside of reality as we experience it. It would be a supernatural space and the part of you that existed in that space would be a supernatural extra-experiential you.

For those who had been raised in a Christian context this would not be so hard to accept, because this extra-experiential you would sound a great deal like the soul. In fact, Immanuel Kant, who first articulated the idea of a transcendental self, was through his philosophy actively trying to reserve space for the human soul in an intellectual atmosphere that he saw as excessively materialistic.

William James and Charles Sanders Peirce believed in unity and therefore they could not accept the idea of a transcendental ego that would exist in some transcendent realm. In some of their thinking they were anticipating the later developments of quantum theory and non-locality.

William James described how we appear to travel through a river of time – and, like all rivers, the river ahead of us already exists before we arrive there. In the same way, the future already exists now. Not in a pre-determined sense but at least as some potentiality. As we arrive at the future moment, our arrival marks the passage from the fluid form that we call future to the definitive solid form that we experience as the past. We do not create time by passing through it; we simply freeze it in its tracks.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Google search.[end-div]

Zen and the Art of Meditation Messaging

Quite often you will be skimming a book or leafing through pages of your favorite magazine and you will recall having “seen” a specific word. However, you will not remember having read that page or section or having looked at that particular word. But, without fail, when you retrace your steps and look back you will find that specific word, that word that you did not consciously “see”. So, what’s going on?

[div class=attrib]From the New Scientist:[end-div]

MEDITATION increases our ability to tap into the hidden recesses of our brain that are usually outside the reach of our conscious awareness.

That’s according to Madelijn Strick of Utrecht University in the Netherlands and colleagues, who tested whether meditation has an effect on our ability to pick up subliminal messages.

The brain registers subliminal messages, but we are often unable to recall them consciously. To investigate, the team recruited 34 experienced practitioners of Zen meditation and randomly assigned them to either a meditation group or a control group. The meditation group was asked to meditate for 20 minutes in a session led by a professional Zen master. The control group was asked to merely relax for 20 minutes.

The volunteers were then asked 20 questions, each with three or four correct answers – for instance: “Name one of the four seasons”. Just before the subjects saw the question on a computer screen, one potential answer – such as “spring” – flashed up for a subliminal 16 milliseconds.

The meditation group gave 6.8 answers, on average, that matched the subliminal words, whereas the control group gave just 4.9 (Consciousness and Cognition, DOI: 10.1016/j.concog.2012.02.010).
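
The scoring behind those numbers boils down to counting matches between a participant’s answers and the subliminally flashed words. Here is a minimal sketch of that logic; the words below are made up and stand in for the study’s Dutch materials.

```python
# Minimal sketch of the scoring logic; the words below are made up and stand
# in for the study's Dutch materials.
def count_subliminal_matches(answers, primes):
    """Count answers that match the word flashed subliminally before the question."""
    return sum(a.strip().lower() == p.strip().lower()
               for a, p in zip(answers, primes))

primes = ["spring", "dog", "blue"]      # each flashed for ~16 milliseconds
answers = ["Spring", "cat", "blue"]     # the participant's answers
print(count_subliminal_matches(answers, primes))   # -> 2
```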

Strick thinks that the explanation lies in the difference between what the brain is paying attention to and what we are conscious of. Meditators are potentially accessing more of what the brain has paid attention to than non-meditators, she says.

“It is a truly exciting development that the second wave of rigorous, scientific meditation research is now yielding concrete results,” says Thomas Metzinger, at Johannes Gutenberg University in Mainz, Germany. “Meditation may be best seen as a process that literally expands the space of conscious experience.”

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Yoga.am.[end-div]

Male Brain + Female = Jello

[div class=attrib]From Scientific American:[end-div]

In one experiment, just telling a man he would be observed by a female was enough to hurt his psychological performance.

Movies and television shows are full of scenes where a man tries unsuccessfully to interact with a pretty woman. In many cases, the potential suitor ends up acting foolishly despite his best attempts to impress. It seems like his brain isn’t working quite properly and according to new findings, it may not be.

Researchers have begun to explore the cognitive impairment that men experience before and after interacting with women. A 2009 study demonstrated that after a short interaction with an attractive woman, men experienced a decline in mental performance. A more recent study suggests that this cognitive impairment takes hold even when men simply anticipate interacting with a woman who they know very little about.

Sanne Nauts and her colleagues at Radboud University Nijmegen in the Netherlands ran two experiments using men and women university students as participants. They first collected a baseline measure of cognitive performance by having the students complete a Stroop test. Developed in 1935 by the psychologist John Ridley Stroop, the test is a common way of assessing our ability to process competing information. The test involves showing people a series of words describing different colors that are printed in different colored inks. For example, the word “blue” might be printed in green ink and the word “red” printed in blue ink. Participants are asked to name, as quickly as they can, the color of the ink that the words are written in. The test is cognitively demanding because our brains can’t help but process the meaning of the word along with the color of the ink. When people are mentally tired, they tend to complete the task at a slower rate.
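
For readers who want to see the mechanics, here is a minimal sketch of how a Stroop trial list could be generated, mixing congruent pairs (word and ink match) with the demanding incongruent pairs described above. The colors and proportions are our own choices, not parameters of the original test.

```python
# Illustrative Stroop trial generator; colors and proportions are our own
# choices, not the parameters of the original 1935 test.
import random

COLORS = ["red", "blue", "green", "yellow"]

def make_stroop_trials(n_trials, p_incongruent=0.5, seed=0):
    """Return (word, ink_color, is_congruent) tuples for a Stroop block."""
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        word = rng.choice(COLORS)
        if rng.random() < p_incongruent:
            ink = rng.choice([c for c in COLORS if c != word])  # incongruent pair
        else:
            ink = word                                           # congruent pair
        trials.append((word, ink, word == ink))
    return trials

for word, ink, congruent in make_stroop_trials(5):
    print(f"word={word:<7} ink={ink:<7} congruent={congruent}")
```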

After completing the Stroop Test, participants in Nauts’ study were asked to take part in another supposedly unrelated task. They were asked to read out loud a number of Dutch words while sitting in front of a webcam. The experimenters told them that during this “lip reading task” an observer would watch them over the webcam. The observer was given either a common male or female name. Participants were led to believe that this person would see them over the web cam, but they would not be able to interact with the person. No pictures or other identifying information were provided about the observer—all the participants knew was his or her name. After the lip reading task, the participants took another Stroop test. Women’s performance on the second test did not differ, regardless of the gender of their observer. However men who thought a woman was observing them ended up performing worse on the second Stroop test. This cognitive impairment occurred even though the men had not interacted with the female observer.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Scientific American / iStock/Iconogenic.[end-div]

Mirror, Mirror

A thoughtful question posed below by philosopher Eric Schwitzgebel over at The Splintered Mind. Gazing at a mirror or reflection is something we all do on a frequent basis. In fact, is there any human activity that trumps this in frequency? Yet have we ever given thought to how and why we perceive ourselves in space differently from, say, a car in a rearview mirror? The car in the rearview mirror is quite clearly approaching us from behind as we drive. However, where exactly is our reflection when we cast our eyes at the mirror in the bathroom?

[div class=attrib]From the Splintered Mind:[end-div]

When I gaze into a mirror, does it look like there’s someone a few feet away gazing back at me? (Someone who looks a lot like me, though perhaps a bit older and grumpier.) Or does it look like I’m standing where I in fact am, in the middle of the bathroom? Or does it somehow look both ways? Suppose my son is sneaking up behind me and I see him in the same mirror. Does it look like he is seven feet in front of me, sneaking up behind the dope in the mirror and I only infer that he is actually behind me? Or does he simply look, instead, one foot behind me?

Suppose I’m in a new restaurant and it takes me a moment to notice that one wall is a mirror. Surely, before I notice, the table that I’m looking at in the mirror appears to me to be in a location other than its real location. Right? Now, after I notice that it’s a mirror, does the table look to be in a different place than it looked to be a moment ago? I’m inclined to say that in the dominant sense of “apparent location”, the apparent location of the table is just the same, but now I’m wise to it and I know its apparent location isn’t its real location. On the other hand, though, when I look in the rear-view mirror in my car I want to say that it looks like that Mazda is coming up fast behind me, not that it looks like there is a Mazda up in space somewhere in front of me.

What is the difference between these cases that makes me want to treat them differently? Does it have to do with familiarity and skill? I guess that’s what I’m tempted to say. But then it seems to follow that, with enough skill, things will look veridical through all kinds of reflections, refractions, and distortions. Does the oar angling into water really look straight to the skilled punter? With enough skill, could even the image in a carnival mirror look perfectly veridical? Part of me wants to resist at least that last thought, but I’m not sure how to do so and still say all the other things I want to say.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Adrian Pingstone, Wikipedia / Creative Commons.[end-div]

Why Does Time Fly?

[div class=attrib]From Scientific American:[end-div]

Everybody knows that the passage of time is not constant. Moments of terror or elation can stretch a clock tick to what seems like a lifetime. Yet, we do not know how the brain “constructs” the experience of subjective time. Would it not be important to know, so we can find ways to make moments last longer, or pass by more quickly?

A recent study by van Wassenhove and colleagues is beginning to shed some light on this problem. This group used a simple experimental setup to measure the “subjective” experience of time. They found that people accurately judge whether a dot appears on the screen for a shorter, longer or the same amount of time as another dot. However, when the dot increases in size so as to appear to be moving toward the individual — i.e. the dot is “looming” — something strange happens. People overestimate the time that the dot lasted on the screen. This overestimation does not happen when the dot seems to move away. Thus, the overestimation is not simply a function of motion. Van Wassenhove and colleagues conducted this experiment during functional magnetic resonance imaging, which enabled them to examine how the brain reacted differently to looming and receding.
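
Here is a minimal sketch of the stimulus manipulation, using our own example values rather than the study’s parameters: the dot stays on screen for exactly the same duration in every condition; only how its size changes over that time differs.

```python
# Illustrative sketch of the looming/receding/static dot stimulus; the radii
# and duration are our own example values, not the study's parameters.
def dot_radius(t, duration, condition, r_start=10.0, r_end=40.0):
    """Dot radius (pixels) at time t, for 0 <= t <= duration (seconds)."""
    frac = t / duration
    if condition == "looming":
        return r_start + (r_end - r_start) * frac   # grows, as if approaching
    if condition == "receding":
        return r_end - (r_end - r_start) * frac     # shrinks, as if retreating
    return r_start                                  # static control

for cond in ("looming", "receding", "static"):
    print(cond, [round(dot_radius(t, 1.0, cond)) for t in (0.0, 0.5, 1.0)])
```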

The brain imaging data revealed two main findings. First, structures in the middle of the brain were more active during the looming condition. These brain areas are also known to activate in experiments that involve the comparison of self-judgments to the judgments of others, or when an experimenter does not tell the subject what to do. In both cases, the prevailing idea is that the brain is busy wondering about itself, its ongoing plans and activities, and relating oneself to the rest of the world.

Read more from the original study here.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Sawayasu Tsuji.[end-div]

The Movies in Our Eyes

[div class=attrib]From Scientific American:[end-div]

The retina processes information much more than anyone has ever imagined, sending a dozen different movies to the brain.

We take our astonishing visual capabilities so much for granted that few of us ever stop to consider how we actually see. For decades, scientists have likened our visual-processing machinery to a television camera: the eye’s lens focuses incoming light onto an array of photoreceptors in the retina. These light detectors magically convert those photons into electrical signals that are sent along the optic nerve to the brain for processing. But recent experiments by the two of us and others indicate that this analogy is inadequate. The retina actually performs a significant amount of preprocessing right inside the eye and then sends a series of partial representations to the brain for interpretation.

We came to this surprising conclusion after investigating the retinas of rabbits, which are remarkably similar to those in humans. (Our work with salamanders has led to similar results.) The retina, it appears, is a tiny crescent of brain matter that has been brought out to the periphery to gain more direct access to the world. How does the retina construct the representations it sends? What do they “look” like when they reach the brain’s visual centers? How do they convey the vast richness of the real world? Do they impart meaning, helping the brain to analyze a scene? These are just some of the compelling questions the work has begun to answer.

[div class=attrib]More from theSource here.[end-div]