All posts by Mike

Dark Matter: An Illusion?

Cosmologists and particle physicists have long posited the existence of Dark Matter. It’s so called because it cannot be seen or sensed directly. It is inferred from gravitational effects on visible matter. Together with its theoretical cousin, Dark Energy, the two are hypothesized to make up most of the universe. In fact, the regular star-stuff — matter and energy — of which we, our planet, our solar system and the visible universe are made accounts for a paltry 4 percent of the total.

Dark Matter was originally proposed to account for a discrepancy: the mass of large objects such as galaxies and galaxy clusters, as inferred from their gravitational behavior, far exceeds the mass tallied from the smaller visible objects they contain, such as stars, nebulae and interstellar gas.

The problem with Dark Matter is that it remains elusive and, for the most part, a theoretical construct. And now a new group of theories suggests that the dark stuff may in fact be an illusion.

[div class=attrib]From National Geographic:[end-div]

The mysterious substance known as dark matter may actually be an illusion created by gravitational interactions between short-lived particles of matter and antimatter, a new study says.

Dark matter is thought to be an invisible substance that makes up almost a quarter of the mass in the universe. The concept was first proposed in 1933 to explain why the outer galaxies in galaxy clusters orbit faster than they should, based on the galaxies’ visible mass.

(Related: “Dark-Matter Galaxy Detected: Hidden Dwarf Lurks Nearby?”)

At the observed speeds, the outer galaxies should be flung out into space, since the clusters don’t appear to have enough mass to keep the galaxies at their edges gravitationally bound.

So physicists proposed that the galaxies are surrounded by halos of invisible matter. This dark matter provides the extra mass, which in turn creates gravitational fields strong enough to hold the clusters together.

In the new study, physicist Dragan Hajdukovic at the European Organization for Nuclear Research (CERN) in Switzerland proposes an alternative explanation, based on something he calls the “gravitational polarization of the quantum vacuum.”

(Also see “Einstein’s Gravity Confirmed on a Cosmic Scale.”)

Empty Space Filled With “Virtual” Particles

The quantum vacuum is the name physicists give to what we see as empty space.

According to quantum physics, empty space is not actually barren but is a boiling sea of so-called virtual particles and antiparticles constantly popping in and out of existence.

Antimatter particles are mirror opposites of normal matter particles. For example, an antiproton is a negatively charged version of the positively charged proton, one of the basic constituents of the atom.

When matter and antimatter collide, they annihilate in a flash of energy. The virtual particles spontaneously created in the quantum vacuum appear and then disappear so quickly that they can’t be directly observed.

In his new mathematical model, Hajdukovic investigates what would happen if virtual matter and virtual antimatter were not only electrical opposites but also gravitational opposites—an idea some physicists previously proposed.

“Mainstream physics assumes that there is only one gravitational charge, while I have assumed that there are two gravitational charges,” Hajdukovic said.

According to his idea, outlined in the current issue of the journal Astrophysics and Space Science, matter has a positive gravitational charge and antimatter a negative one.

That would mean matter and antimatter are gravitationally repulsive, so that an object made of antimatter would “fall up” in the gravitational field of Earth, which is composed of normal matter.

Particles and antiparticles could still collide, however, since gravitational repulsion is much weaker than electrical attraction.

How Galaxies Could Get Gravity Boost

While the idea of particle antigravity might seem exotic, Hajdukovic says his theory is based on well-established tenets of quantum physics.

For example, it’s long been known that particles can team up to create a so-called electric dipole, with positively charged particles at one end and negatively charged particles at the other. (See “Universe’s Existence May Be Explained by New Material.”)

According to theory, there are countless electric dipoles created by virtual particles in any given volume of the quantum vacuum.

All of these electric dipoles are randomly oriented—like countless compass needles pointing every which way. But if the dipoles form in the presence of an existing electric field, they immediately align along the same direction as the field.

According to quantum field theory, this sudden snapping to order of electric dipoles, called polarization, generates a secondary electric field that combines with and strengthens the first field.

Hajdukovic suggests that a similar phenomenon happens with gravity. If virtual matter and antimatter particles have different gravitational charges, then randomly oriented gravitational dipoles would be generated in space.
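
To make the dipole analogy concrete, here is a minimal sketch of the relations involved, assuming only the textbook definition of a dipole; the symbols are illustrative and are not taken from Hajdukovic’s paper:

```latex
% Electric dipole: charges +q and -q separated by a displacement d
\vec{p}_e = q\,\vec{d}
% Many aligned dipoles (polarization) add a secondary field that
% reinforces the external one:
\vec{E}_{\mathrm{total}} = \vec{E}_{\mathrm{ext}} + \vec{E}_{\mathrm{pol}}
% Hypothesized gravitational analogue: a virtual pair with gravitational
% charge +m_g (matter) and -m_g (antimatter) forms a gravitational dipole
\vec{p}_g = m_g\,\vec{d}
```

If such dipoles aligned in the gravitational field of a galaxy, the vacuum would contribute a secondary field that strengthens the original one, mimicking the extra pull now attributed to dark matter.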

[div class=attrib]More from theSource here.[end-div]

Improvements to Our Lives Through Science

Ask a hundred people how science can be used for good and you’re likely to get a hundred different answers. Well, Edge Magazine did just that, posing the question “What scientific concept would improve everybody’s cognitive toolkit?” to 159 critical thinkers. Below we excerpt some of our favorites. The thoroughly engrossing, novel-length article can be found here in its entirety.

[div class=attrib]From Edge:[end-div]

Ether
Richard H. Thaler. Father of behavioral economics.

I recently posted a question in this space asking people to name their favorite example of a wrong scientific belief. One of my favorite answers came from Clay Shirky. Here is an excerpt:
The existence of ether, the medium through which light (was thought to) travel. It was believed to be true by analogy — waves propagate through water, and sound waves propagate through air, so light must propagate through X, and the name of this particular X was ether.
It’s also my favorite because it illustrates how hard it is to accumulate evidence for deciding something doesn’t exist. Ether was both required by 19th century theories and undetectable by 19th century apparatus, so it accumulated a raft of negative characteristics: it was odorless, colorless, inert, and so on.

Ecology
Brian Eno. Artist; Composer; Recording Producer: U2, Coldplay, Talking Heads, Paul Simon.

That idea, or bundle of ideas, seems to me the most important revolution in general thinking in the last 150 years. It has given us a whole new sense of who we are, where we fit, and how things work. It has made commonplace and intuitive a type of perception that used to be the province of mystics — the sense of wholeness and interconnectedness.
Beginning with Copernicus, our picture of a semi-divine humankind perfectly located at the centre of The Universe began to falter: we discovered that we live on a small planet circling a medium sized star at the edge of an average galaxy. And then, following Darwin, we stopped being able to locate ourselves at the centre of life. Darwin gave us a matrix upon which we could locate life in all its forms: and the shocking news was that we weren’t at the centre of that either — just another species in the innumerable panoply of species, inseparably woven into the whole fabric (and not an indispensable part of it either). We have been cut down to size, but at the same time we have discovered ourselves to be part of the most unimaginably vast and beautiful drama called Life.

We Are Not Alone In The Universe
J. Craig Venter. Leading scientist of the 21st century.

I cannot imagine any single discovery that would have more impact on humanity than the discovery of life outside of our solar system. There is a human-centric, Earth-centric view of life that permeates most cultural and societal thinking. Finding that there are multiple, perhaps millions of origins of life and that life is ubiquitous throughout the universe will profoundly affect every human.

Correlation is not a cause
Susan Blackmore. Psychologist; Author, Consciousness: An Introduction.

The phrase “correlation is not a cause” (CINAC) may be familiar to every scientist but has not found its way into everyday language, even though critical thinking and scientific understanding would improve if more people had this simple reminder in their mental toolkit.
One reason for this lack is that CINAC can be surprisingly difficult to grasp. I learned just how difficult when teaching experimental design to nurses, physiotherapists and other assorted groups. They usually understood my favourite example: imagine you are watching at a railway station. More and more people arrive until the platform is crowded, and then — hey presto — along comes a train. Did the people cause the train to arrive (A causes B)? Did the train cause the people to arrive (B causes A)? No, they both depended on a railway timetable (C caused both A and B).
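
Blackmore’s railway example is easy to simulate. The sketch below is illustrative only (the variable names and numbers are invented): a hidden common cause C drives both A and B, producing a strong correlation with no causation between them.

```python
import numpy as np

rng = np.random.default_rng(42)

# C: the timetable (a hidden common cause) -- minutes until the next train
timetable = rng.uniform(0, 30, size=1000)

# A: people on the platform; B: how near the train is.
# Each depends only on the timetable, plus independent noise.
people = 5 + 3 * (30 - timetable) + rng.normal(0, 5, size=1000)
train_nearness = (30 - timetable) + rng.normal(0, 2, size=1000)

# Strong correlation, yet neither variable causes the other.
print("corr(A, B) =", np.corrcoef(people, train_nearness)[0, 1])
```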

A Statistically Significant Difference in Understanding the Scientific Process
Diane F. Halpern. Professor, Claremont McKenna College; Past-president, American Psychological Society.

Statistically significant difference — It is a simple phrase that is essential to science and that has become common parlance among educated adults. These three words convey a basic understanding of the scientific process, random events, and the laws of probability. The term appears almost everywhere that research is discussed — in newspaper articles, advertisements for “miracle” diets, research publications, and student laboratory reports, to name just a few of the many diverse contexts where the term is used. It is a shorthand abstraction for a sequence of events that includes an experiment (or other research design), the specification of a null and alternative hypothesis, (numerical) data collection, statistical analysis, and the probability of an unlikely outcome. That is a lot of science conveyed in a few words.
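
As a hedged illustration of that sequence, here is a minimal sketch in Python using a common two-sample t-test; the groups and numbers are invented for demonstration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# A toy experiment: a control group and a treatment group whose true
# means differ slightly. The null hypothesis says the means are equal.
control = rng.normal(loc=100.0, scale=15.0, size=200)
treatment = rng.normal(loc=105.0, scale=15.0, size=200)

# The t-test gives the probability (p-value) of seeing a difference at
# least this large if the null hypothesis were true.
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# By convention, p < 0.05 is called a "statistically significant difference".
print("significant at 0.05?", p_value < 0.05)
```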

Confabulation
Fiery Cushman. Post-doctoral fellow, Mind/Brain/Behavior Interfaculty Initiative, Harvard University.

We are shockingly ignorant of the causes of our own behavior. The explanations that we provide are sometimes wholly fabricated, and certainly never complete. Yet, that is not how it feels. Instead it feels like we know exactly what we’re doing and why. This is confabulation: Guessing at plausible explanations for our behavior, and then regarding those guesses as introspective certainties. Every year psychologists use dramatic examples to entertain their undergraduate audiences. Confabulation is funny, but there is a serious side, too. Understanding it can help us act better and think better in everyday life.

We are Lost in Thought
Sam Harris. Neuroscientist; Chairman, The Reason Project; Author, Letter to a Christian Nation.

I invite you to pay attention to anything — the sight of this text, the sensation of breathing, the feeling of your body resting against your chair — for a mere sixty seconds without getting distracted by discursive thought. It sounds simple enough: Just pay attention. The truth, however, is that you will find the task impossible. If the lives of your children depended on it, you could not focus on anything — even the feeling of a knife at your throat — for more than a few seconds, before your awareness would be submerged again by the flow of thought. This forced plunge into unreality is a problem. In fact, it is the problem from which every other problem in human life appears to be made.
I am by no means denying the importance of thinking. Linguistic thought is indispensable to us. It is the basis for planning, explicit learning, moral reasoning, and many other capacities that make us human. Thinking is the substance of every social relationship and cultural institution we have. It is also the foundation of science. But our habitual identification with the flow of thought — that is, our failure to recognize thoughts as thoughts, as transient appearances in consciousness — is a primary source of human suffering and confusion.

Knowledge
Mark Pagel. Professor of Evolutionary Biology, Reading University, England, and The Santa Fe Institute.

The Oracle of Delphi famously pronounced Socrates to be “the most intelligent man in the world because he knew that he knew nothing”. Over 2000 years later the physicist-turned-historian Jacob Bronowski would emphasize — in the last episode of his landmark 1970s television series “The Ascent of Man” — the danger of our all-too-human conceit of thinking we know something. What Socrates knew and what Bronowski had come to appreciate is that knowledge — true knowledge — is difficult, maybe even impossible, to come by; it is prone to misunderstanding and counterfactuals; and most importantly it can never be acquired with exact precision. There will always be some element of doubt about anything we come to “know” from our observations of the world.

[div class=attrib]More from theSource here.[end-div]

The Business of Making Us Feel Good

Advertisers have long known how to pull at our fickle emotions and inner motivations to sell their products. Further still, many corporations fine-tune their products to the nth degree to ensure we learn to crave more of the same. Whether it’s the comforting feel of an armchair, the soft yet lingering texture of yogurt, the fresh scent of hand soap, or the crunchiness of the perfect potato chip, myriad focus groups, industrial designers and food scientists are hard at work engineering our addictions.

[div class=attrib]From the New York Times:[end-div]

Feeling low? According to a new study in the Journal of Consumer Research, when people feel bad, their sense of touch quickens and they instinctively want to hug something or someone. Tykes cling to a teddy bear or blanket. It’s a mammal thing. If young mammals feel gloomy, it’s usually because they’re hurt, sick, cold, scared or lost. So their brain rewards them with a gust of pleasure if they scamper back to mom for a warm nuzzle and a meal. No need to think it over. All they know is that, when a negative mood hits, a cuddle just feels right; and if they’re upbeat and alert, then their eyes hunger for new sights and they’re itching to explore.

It’s part of evolution’s gold standard, the old carrot-and-stick gambit, an impulse that evades reflection because it evolved to help infants thrive by telling them what to do — not in words but in sequins of taste, heartwarming touches, piquant smells, luscious colors.

Back in the days before our kind knew what berries to eat, let alone which merlot to choose or HD-TV to buy, the question naturally arose: How do you teach a reckless animal to live smart? Some brains endorsed correct, lifesaving behavior by doling out sensory rewards. Healthy food just tasted yummy, which is why we now crave the sweet, salty, fatty foods our ancestors did — except that for them such essentials were rare, needing to be painstakingly gathered or hunted. The seasoned hedonists lived to explore and nuzzle another day — long enough to pass along their snuggly, junk-food-bedeviled genes.

[div class=attrib]More from theSource here.[end-div]

Cities Might Influence Not Just Our Civilizations, but Our Evolution

[div class=attrib]From Scientific American:[end-div]

Cities reverberate through history as centers of civilization. Ur. Babylon. Rome. Baghdad. Tenochtitlan. Beijing. Paris. London. New York. As pivotal as cities have been for our art and culture, our commerce and trade, our science and technology, our wars and peace, it turns out that cities might have been even more important than we had suspected, influencing our very genes and evolution.

Cities have been painted as hives of scum and villainy, dens of filth and squalor, with unsafe water, bad sanitation, industrial pollution and overcrowded neighborhoods. It turns out that by bringing people closer together and spreading disease, cities might increase the chance that, over time, the descendants of survivors could resist infections.

Evolutionary biologist Ian Barnes at the University of London and his colleagues focused on a genetic variant with the alphabet-soup name of SLC11A1 1729+55del4. This variant is linked with natural resistance to germs that dwell within cells, such as tuberculosis and leprosy.

The scientists analyzed DNA samples from 17 modern populations that had occupied their cities for various lengths of time. The cities ranged from Çatalhöyük in Turkey, settled in roughly 6000 B.C., to Juba in Sudan, settled in the 20th century.

The researchers discovered an apparently highly significant link between the occurrence of this genetic variant and the duration of urban settlement. People from a long-populated urban area often seemed better adapted to resisting these specific types of infections — for instance, those in areas settled for more than 5,200 years, such as Susa in Iran, were almost certain to possess this variant, while in cities settled for only a few hundred years, such as Yakutsk in Siberia, only 70 percent to 80 percent of people would have it.
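
For illustration only, a link of this kind might be quantified with a simple correlation test. The numbers below are invented placeholders, not the study’s data:

```python
import numpy as np
from scipy import stats

# Hypothetical pairs: (years of urban settlement, fraction of the
# population carrying the protective SLC11A1 variant).
years_settled = np.array([8000, 5200, 3000, 1500, 800, 300, 100])
variant_freq = np.array([0.98, 0.97, 0.92, 0.88, 0.85, 0.78, 0.72])

# Pearson correlation: r near 1 indicates that variant frequency rises
# with the duration of settlement in this toy dataset.
r, p = stats.pearsonr(years_settled, variant_freq)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```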

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Scientific American.[end-div]

Our Kids’ Glorious New Age of Distraction

[div class=attrib]From Slate:[end-div]

Children are not what they used to be. They tweet and blog and text without batting an eyelash. Whenever they need the answer to a question, they simply log onto their phone and look it up on Google. They live in a state of perpetual, endless distraction, and, for many parents and educators, it’s a source of real concern. Will future generations be able to finish a whole book? Will they be able to sit through an entire movie without checking their phones? Are we raising a generation of impatient brats?

According to Cathy N. Davidson, a professor of interdisciplinary studies at Duke University and the author of the new book “Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn,” much of the panic about children’s shortened attention spans isn’t just misguided, it’s harmful. Younger generations, she argues, don’t just think about technology more casually, they’re actually wired to respond to it in a different manner than we are, and it’s up to us — and our education system — to catch up to them.

Davidson is personally invested in finding a solution to the problem. As vice provost at Duke, she spearheaded a project to hand out a free iPod to every member of the incoming class, and began using wikis and blogs as part of her teaching. In a move that garnered national media attention, she crowd-sourced the grading in her course. In her book, she explains how everything from video gaming to redesigned schools can enhance our children’s education — and ultimately, our future.

Salon spoke to Davidson over the phone about the structure of our brains, the danger of multiple-choice testing, and what the workplace of the future will actually look like.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

A Holiday in Hell

Not quite as poetic and intricate as Dante’s circuitous map of hell, but a fascinating invention by Tom Gauld nonetheless.

[div class=attrib]From Frank Jacobs for Strange Maps:[end-div]

“A perpetual holiday is a good working definition of hell”, said George Bernard Shaw; in fact, just the odd few weeks of summer vacation may be near enough unbearable – what with all the frantic packing and driving, the getting lost and having forgotten, the primitive lodgings, lousy food and miserable weather, not to mention the risk of disease and unfriendly natives.

And yet, even for the bored teenagers forced to join their parents on their annual work detox, the horrors of the summer holiday mix with the chance of thrilling adventures, beckoning beyond the unfamiliar horizon.

Tom Gauld may well have been such a teenager, for this cartoon of his deftly expresses both the eeriness and the allure of obligatory relaxation in a less than opportune location. It evokes both the ennui of being where you don’t want to be, and the exhilarating exoticism of those surroundings – as if they were an endpaper map of a Boys’ Own adventure, waiting for the dotted line of your very own expedition.

[div class=attrib]More from theSource here.[end-div]

People Who Become Nouns

John Montagu, 4th Earl of Sandwich.

The world of science is replete with nouns derived from people. There is the Amp (named after André-Marie Ampère), the Volt (after Alessandro Giuseppe Antonio Anastasio Volta) and the Watt (after the Scottish engineer James Watt). And the list goes on: we have the Kelvin, Ohm, Coulomb, Celsius, Hertz, Joule and Sievert. We also have more commonly used nouns in circulation that derive from people. The mackintosh, cardigan and sandwich are perhaps the most frequently used.

[div class=attrib]From Slate:[end-div]

Before there were silhouettes, there was a French fellow named Silhouette. And before there were Jacuzzi parties there were seven inventive brothers by that name. It’s easy to forget that some of the most common words in the English language came from living, breathing people. Explore these real-life namesakes courtesy of Slate’s partnership with LIFE.com.

Jules Leotard: Tight Fit

French acrobat Jules Leotard didn’t just invent the art of the trapeze; he also lent his name to the skin-tight, one-piece outfit that allowed him to keep his limbs free while performing.

It would be fascinating to see if today’s popular culture might lend surnames with equal staying power to our language.

[div class=attrib]Slate has some more fascinating examples, here.[end-div]

[div class=attrib]Image of John Montagu, 4th Earl of Sandwich, 1783, by Thomas Gainsborough. Courtesy of Wikipedia / Creative Commons.[end-div]

Why Does “Cool” Live On and Not “Groovy”?

Why do some words take hold in the public consciousness and persist through generations while others fall by the wayside after one season?

Despite the fleetingness of many new slang terms, such as txtnesia (“when you forget what you texted someone last”), a visit to the Urban Dictionary will undoubtedly amuse with the inventiveness of our language, though gobsmacked and codswallop may come to mind as well.

[div class=attrib]From Slate:[end-div]

Feeling nostalgic for a journalistic era I never experienced, I recently read Tom Wolfe’s 1968 The Electric Kool-Aid Acid Test. I’d been warned that the New Journalists slathered their prose with slang, so I wasn’t shocked to find nonstandard English on nearly every line: dig, trippy, groovy, grok, heads, hip, mysto and, of course, cool. This psychedelic time capsule led me to wonder about the relative stickiness of all these words—the omnipresence of cool versus the datedness of groovy and the dweeb cachet of grok, a Robert Heinlein coinage from Stranger in a Strange Land literally signifying to drink but implying profound understanding. Mysto, an abbreviation for mystical, seems to have fallen into disuse. It doesn’t even have an Urban Dictionary entry.

There’s no grand unified theory for why some slang terms live and others die. In fact, it’s even worse than that: The very definition of slang is tenuous and clunky. Writing for the journal American Speech, Bethany Dumas and Jonathan Lighter argued in 1978 that slang must meet at least two of the following criteria: It lowers “the dignity of formal or serious speech or writing,” it implies that the user is savvy (he knows what the word means, and knows people who know what it means), it sounds taboo in ordinary discourse (as in with adults or your superiors), and it replaces a conventional synonym. This characterization seems to open the door to words that most would not recognize as slang, including like in the quotative sense: “I was like … and he was like.” It replaces a conventional synonym (said), and certainly lowers seriousness, but is probably better categorized as a tic.
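
The Dumas and Lighter test is simple enough to state as code. A toy sketch of the two-of-four rule described above (the encoding of criteria as indices is purely illustrative):

```python
# The four Dumas & Lighter criteria, indexed 0-3; a term counts as
# slang if it meets at least two of them.
CRITERIA = (
    "lowers the dignity of formal or serious speech or writing",
    "implies the user is savvy",
    "sounds taboo in ordinary discourse",
    "replaces a conventional synonym",
)

def is_slang(met: set) -> bool:
    """`met` holds the indices (0-3) of the criteria a term satisfies."""
    return len(met) >= 2

# Quotative "like": replaces a synonym (3) and lowers seriousness (0),
# so the two-of-four test labels it slang -- which, as the article notes,
# is arguably a misclassification.
print(is_slang({0, 3}))  # True
```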

At least it’s widely agreed that young people, seeking to make a mark, are especially prone to generating such dignity-reducing terms. (The editor of The New Partridge Dictionary of Slang and Unconventional English, Tom Dalzell, told me that “every generation comes up with a new word for a marijuana cigarette.”) Oppressed people, criminals, and sports fans make significant contributions, too. There’s also a consensus that most slang, like mysto, is ephemeral. Connie Eble, a linguist at the University of North Carolina, has been collecting slang from her students since the early 1970s. (She asks them to write down terms heard around campus.) In 1996, when she reviewed all the submissions she’d received, she found that more than half were only turned in once. While many words made it from one year to the next, only a tiny minority lasted a decade.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Slate.[end-div]

MondayPoem: Silence

A poem by Billy Collins ushers in another week. Collins served two terms as the U.S. Poet Laureate, from 2001 to 2003. He is known for poetry imbued with left-field humor and deep insight.

[div]By Billy Collins:[end-div]

Silence —

There is the sudden silence of the crowd
above a player not moving on the field,
and the silence of the orchid.

The silence of the falling vase
before it strikes the floor,
the silence of the belt when it is not striking the child.

The stillness of the cup and the water in it,
the silence of the moon
and the quiet of the day far from the roar of the sun.

The silence when I hold you to my chest,
the silence of the window above us,
and the silence when you rise and turn away.

And there is the silence of this morning
which I have broken with my pen,
a silence that had piled up all night

like snow falling in the darkness of the house—
the silence before I wrote a word
and the poorer silence now.

[div class=attrib]Image courtesy of Poetry Foundation.[end-div]

Undesign

Jonathan Ive, the design brains behind such iconic contraptions as the iMac, iPod and iPhone, discusses his notion of “undesign”. Ive has over 300 patents and is often cited as one of the most influential industrial designers of the last 20 years. Perhaps it’s purely coincidental that Ive’s understated “undesign” comes from his unassuming Britishness.

[div class=attrib]From Slate:[end-div]

Macworld, 1999. That was the year Apple introduced the iMac in five candy colors. The iMac was already a translucent computer that tried its best not to make you nervous. Now it strove to be even more welcoming, almost silly. And here was Apple’s newish head of design, Jonathan Ive, talking about the product in a video—back when he let his hair grow and before he had permanently donned his dark T-shirt uniform. Even then, Ive had the confessional intimacy that makes him the star of Apple promotional videos today. His statement is so ridiculous that he laughs at it himself: “A computer absolutely can be sexy, it’s um … yeah, it can.”

A decade later, no one would laugh (too loudly) if you said that an Apple product was sexy. Look at how we all caress our iPhones. This is not an accident. In interviews, Ive talks intensely about the tactile quality of industrial design. The team he runs at Apple is obsessed with mocking up prototypes. There is a now-legendary story from Ive’s student days of an apartment filled with foam models of his projects. Watch this scene in the documentary Objectified where Ive explains the various processes used to machine a MacBook Air keyboard. He gazes almost longingly upon a titanium blank. This is a man who loves his materials.

Ive’s fixation on how a product feels in your hand, and his micro-focus on aspects like the shininess of the stainless steel, or the exact amount of reflectivity in the screen, were first fully realized with the iPod. From that success, you can see how Ive and Steve Jobs led Apple to glory in the past decade. The iPod begat the iPhone, which in turn inspired the iPad. A new kind of tactile computing was born. Ive’s primary concern for physicality, and his perfectionist desire to think through every aspect of the manufacturing process (even the boring parts), were the exact gifts needed to make a singular product like the iPhone a reality and to guide Apple products through a new era of human-computer interaction. Putting design first has reaped huge financial rewards: Apple is now vying with Exxon to be the world’s most valuable company.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of CNNMoney.[end-div]

Business Transforms Street Art

Street art, once known as graffiti, used to be a derided outlet for social misfits and cultural rebels. Now it is big business. Corporations have embraced the medium, and some street artists have even “sold out” to commercial interests.

Jonathan Jones laments the demise of this art form and its transformation into just another type of corporate advertising.

[div class=attrib]By Jonathan Jones for the Guardian:[end-div]

Street art is so much part of the establishment that when David Cameron spoke about this summer’s riots, he was photographed in front of a bright and bulbous Oxfordshire graffiti painting. Contradiction? Of course not. The efforts of Banksy and all the would-be Banksys have so deeply inscribed the “coolness” of street art into the middle-class mind that it is now as respectable as the Proms, and enjoyed by the same crowd – who can now take a picnic basket down to watch a painting marathon under the railway arches.

No wonder an event described as “the UK’s biggest street art project” (60 artists from all over the world decorating Nelson Street in Bristol last week) went down fairly quietly in the national press. It’s not that new or surprising any more, let alone controversial. Nowadays, doing a bit of street art is as routine as checking your emails. There’s probably an app for it.

Visitors to London buy Banksy prints on canvas from street stalls, while in Tripoli photographers latch on to any bloke with a spray can near any wall that’s still standing. Graffiti and street art have become instant – and slightly lazy – icons of everything our culture lauds, from youth to rebellion to making a fast buck from art.

Is this how street art will die – not with a bang, but with a whimper? Maybe there was a time when painting a wittily satirical or cheekily rude picture or comment on a wall was genuinely disruptive and shocking. That time is gone. Councils still do their bit to keep street art alive by occasionally obliterating it, and so confirming that it has edge. But basically it has been absorbed so deep into the mainstream that old folk who once railed at graffiti in their town are now more likely to have a Banksy book on their shelves than a collection of Giles cartoons.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image of highly decorative graffiti typically found in Olinda, Pernambuco, Brazil. Courtesy of Bjørn Christian Tørrissen.[end-div]

The Culture of Vanity

Alexander Edmonds has a thoroughly engrossing piece on the pursuit of “beauty” and the culture of vanity as commodity. And the role of plastic surgeon as both enabler and arbiter comes under a very necessary microscope.

[div class=attrib]Alexander Edmonds for the New York Times:[end-div]

While living in Rio de Janeiro in 1999, I saw something that caught my attention: a television broadcast of a Carnival parade that paid homage to a plastic surgeon, Dr. Ivo Pitanguy. The doctor led the procession surrounded by samba dancers in feathers and bikinis.  Over a thundering drum section and anarchic screech of a cuica, the singer praised Pitanguy for “awakening the self-esteem in each ego” with a “scalpel guided by heaven.”

It was the height of Rio’s sticky summer and the city had almost slowed to a standstill, as had progress on my anthropology doctorate research on Afro-Brazilian syncretism. After seeing the parade, I began to notice that Rio’s plastic surgery clinics were almost as numerous as beauty parlors (and there are a lot of those).  Newsstands sold magazines with titles like Plástica & Beauty, next to Marie Claire.  I assumed that the popularity of cosmetic surgery in a developing nation was one more example of Brazil’s gaping inequalities.  But Pitanguy had long maintained that plastic surgery was not only for the rich: “The poor have the right to be beautiful, too,” he has said.

The beauty of the human body has raised distinct ethical issues for different epochs.  The literary scholar Elaine Scarry pointed out that in the classical world a glimpse of a beautiful person could imperil an observer. In his “Phaedrus” Plato describes a man who after beholding a beautiful youth begins to spin, shudder, shiver and sweat.   With the rise of mass consumption, ethical discussions have focused on images of female beauty.  Beauty ideals are blamed for eating disorders and body alienation.  But Pitanguy’s remark raises yet another issue: Is beauty a right, which, like education or health care, should be realized with the help of public institutions and expertise?

The question might seem absurd.  Pitanguy’s talk of rights echoes the slogans of make-up marketing (L’Oreal’s “Because you’re worth it.”). Yet his vision of plastic surgery reflects a clinical reality that he helped create.  For years he has performed charity surgeries for the poor. More radically, some of his students offer free cosmetic operations in the nation’s public health system.

[div class=attrib]More from theSource here.[end-div]

There’s Weird and Then There’s WEIRD

[div class=attrib]From Neuroanthropology:[end-div]

The most recent edition of Behavioral and Brain Sciences carries a remarkable review article by Joseph Henrich, Steven J. Heine and Ara Norenzayan, ‘The weirdest people in the world?’ The article outlines two central propositions; first, that most behavioural science theory is built upon research that examines intensely a narrow sample of human variation (disproportionately US university undergraduates who are, as the authors write, Western, Educated, Industrialized, Rich, and Democratic, or ‘WEIRD’).

More controversially, the authors go on to argue that, where there is robust cross-cultural research, WEIRD subjects tend to be outliers on a range of measurable traits that do vary, including visual perception, sense of fairness, cooperation, spatial reasoning, and a host of other basic psychological traits. They don’t ignore universals – discussing them in several places – but they do highlight human variation and its implications for psychological theory.

As is the custom at BBS, the target article is accompanied by a large number of responses from scholars around the world, and then a synthetic reflection from the original target article authors to the many responses (in this case, 28). The total of the discussion weighs in at a hefty 75 pages, so it will take most readers (like me) a couple of days to digest the whole thing.

It’s my second time encountering the article, as I read a pre-print version and contemplated proposing a response, but, sadly, there was just too much I wanted to say, and not enough time in the calendar (conference organizing and the like dominating my life) for me to be able to pull it together. I regret not writing a rejoinder, but I can do so here with no limit on my space and the added advantage of seeing how other scholars responded to the article.

My one word review of the collection of target article and responses: AMEN!

Or maybe that should be, AAAAAAAMEEEEEN! {Sung by angelic voices.}

There’s a short version of the argument in Nature as well, but the longer version is well worth the read.

[div class=attrib]More from theSource here.[end-div]

The Ascent of Scent

Scents are deeply evocative. A faint whiff of a distinct and rare scent can bring back a long-forgotten memory and make it vivid, and do so like no other sense. Smells can make our stomachs churn and make us swoon.

The scent-making industry has been with us for thousands of years. In 2005, archeologists discovered the remains of a perfume factory on the island of Cyprus dating back over 4,000 years. So, it’s no surprise that makers of fragrances, from artificial aromas for foods to complex nasal “notes” for perfumes and deodorants, now comprise a multi-billion dollar global industry. Krystal D’Costa over at Anthropology in Practice takes us on a fine aromatic tour, and concludes her article with a view to which most can surely relate:

My perfume definitely makes me feel better. It wraps me in a protective cocoon that prepares me to face just about any situation. Hopefully, when others encounter a trace of it, they think of me in my most confident and warmest form.

A related article in the Independent describes how an increasing number of retailers are experimenting with scents to entice shoppers to linger and spend more time and money in their stores. We learn that

. . . a study run by Nike showed that adding scents to their stores increased intent to purchase by 80 per cent, while in another experiment at a petrol station with a mini-mart attached to it, pumping around the smell of coffee saw purchases of the drink increase by 300 per cent.

[div class=attrib]More from Anthropology in Practice:[end-div]

At seventeen I discovered the perfume that would become my signature scent. It’s a warm, rich, inviting fragrance that reminds me (and hopefully others) of a rose garden in full bloom. Despite this fullness, it’s light enough to wear all day and it’s been in the background of many of my life experiences. It announces me: the trace that lingers in my wake leaves a subtle reminder of my presence. And I can’t deny that it makes me feel a certain way: as though I could conquer the world. (Perhaps one day, when I do conquer the world, that will be the quirk my biographers note: that I had a bottle of X in my bag at all times.)

Our world is awash in smells—everything has an odor. Some are pleasant, like flowers or baked goods, and some are unpleasant, like exhaust fumes or sweaty socks—and they’re all a subjective experience: The odors that one person finds intoxicating may not have the same effect on another. (Hermione Granger’s fondness for toothpaste is a fantastic example of the personal relationship we can have with the smells that permeate our world.) Nonetheless, they constitute a very important part of our experiences. We use them to make judgments about our environments (and each other), they can trigger memories, and even influence behaviors.

No odors seem to concern us more than our own, however. But you don’t have to take my word for it—the numbers speak for themselves: In 2010, people around the world spent the equivalent of $2.2 billion USD on fragrances, making the sale of essential oils and aroma chemicals a booming business. The history of aromatics sketches our attempts to control and manipulate scents— socially and chemically—illustrating how we’ve carefully constructed the smells in our lives.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image: The Perfume Maker by Ernst, Rodolphe, courtesy of Wikipedia / Creative Commons.[end-div]

Mirror, Mirror

A thoughtful question posed below by philosopher Eric Schwitzgebel over at The Splintered Mind. Gazing at a mirror or reflection is something we all do frequently. In fact, is there any human activity that trumps this in frequency? Yet have we ever given thought to how and why we perceive ourselves in space differently than, say, a car in a rearview mirror? The car in the rearview mirror is quite clearly approaching us from behind as we drive. However, where exactly is our reflection when we cast our eyes at the mirror in the bathroom?

[div class=attrib]From the Splintered Mind:[end-div]

When I gaze into a mirror, does it look like there’s someone a few feet away gazing back at me? (Someone who looks a lot like me, though perhaps a bit older and grumpier.) Or does it look like I’m standing where I in fact am, in the middle of the bathroom? Or does it somehow look both ways? Suppose my son is sneaking up behind me and I see him in the same mirror. Does it look like he is seven feet in front of me, sneaking up behind the dope in the mirror and I only infer that he is actually behind me? Or does he simply look, instead, one foot behind me?

Suppose I’m in a new restaurant and it takes me a moment to notice that one wall is a mirror. Surely, before I notice, the table that I’m looking at in the mirror appears to me to be in a location other than its real location. Right? Now, after I notice that it’s a mirror, does the table look to be in a different place than it looked to be a moment ago? I’m inclined to say that in the dominant sense of “apparent location”, the apparent location of the table is just the same, but now I’m wise to it and I know its apparent location isn’t its real location. On the other hand, though, when I look in the rear-view mirror in my car I want to say that it looks like that Mazda is coming up fast behind me, not that it looks like there is a Mazda up in space somewhere in front of me.

What is the difference between these cases that makes me want to treat them differently? Does it have to do with familiarity and skill? I guess that’s what I’m tempted to say. But then it seems to follow that, with enough skill, things will look veridical through all kinds of reflections, refractions, and distortions. Does the oar angling into water really look straight to the skilled punter? With enough skill, could even the image in a carnival mirror look perfectly veridical? Part of me wants to resist at least that last thought, but I’m not sure how to do so and still say all the other things I want to say.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Adrian Pingstone, Wikipedia / Creative Commons.[end-div]

Book Review: The Believing Brain. Michael Shermer

Skeptic-in-chief Michael Shermer has an important and fascinating new book. The Believing Brain: From Ghosts and Gods to Politics and Conspiracies – How We Construct Beliefs and Reinforce Them as Truths describes how our beliefs arise from patterns, how those beliefs come first, and how explanations for those beliefs come second.

Shermer reviews 30 years of leading research in cognitive science, neurobiology, evolutionary psychology and anthropology and numerous real-world examples to show how the belief mechanism works. This holds for our beliefs in all manner of important spheres: religion, politics, economics, superstition and the supernatural.

Shermer proposes that our brains are “belief engines” that “look for and find patterns” quite naturally, and it is only following this that our brains assign these patterns meaning. It is these meaningful patterns that form what Shermer terms “belief-dependent reality.” Additionally, our brains tend to gravitate towards information that further reinforces our beliefs, and to ignore data that contradicts them. This becomes a self-reinforcing loop: beliefs drive explanation-seeking behavior that confirms those beliefs, which are thereby further reinforced and drive yet more confirmation seeking.

In fact, the human brain is so adept at looking for patterns that it “sees” them in places where none exist. Shermer calls this “illusory correlation”. Birds do it, rats do it; humans are masters at it. B.F. Skinner’s groundbreaking experiments on partial reinforcement in animals show this “patternicity” exquisitely. As Shermer describes:

Skinner discovered that if he randomly delivered the food reinforcement, whatever the pigeon happened to be doing just before the delivery of the food would be repeated the next time, such as spinning around once to the left before pecking at the key. This is pigeon patternicity, or the learning of a superstition.

. . . If you doubt its potency as a force in human behavior, just visit a Las Vegas casino and observe people playing the slots with their varied attempts to find a pattern between (A) pulling the slot machine handle and (B) the payoff.
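
Shermer’s slot-machine example is easy to simulate. A minimal sketch (the ritual names and payoff probability are invented) showing how purely random payoffs still crown a “lucky” behavior:

```python
import random

random.seed(7)

RITUALS = ["pull slowly", "pull fast", "blow on hands", "tap the machine"]
tries = {r: 0 for r in RITUALS}
payoffs = {r: 0 for r in RITUALS}

# 200 spins: the ritual has no effect -- every spin pays off with p = 0.1.
for _ in range(200):
    ritual = random.choice(RITUALS)   # the gambler's pre-spin "ritual"
    tries[ritual] += 1
    if random.random() < 0.1:         # payoff is pure chance
        payoffs[ritual] += 1

# A pattern-seeking gambler compares raw counts and concludes one ritual
# "works" -- an illusory correlation with a purely random payoff.
for r in RITUALS:
    print(f"{r}: {payoffs[r]}/{tries[r]} payoffs")
lucky = max(RITUALS, key=lambda r: payoffs[r])
print("'Lucky' ritual:", lucky)
```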

This goes a long way toward explaining all manner of superstitious behaviors in humans. But Shermer doesn’t stop there. He also describes how and why we look for patterns in the behaviors of others and assign meaning to these as well. Shermer calls this “agenticity”: “the tendency to infuse patterns with meaning, intention and agency”. As he goes on to describe:

… we often impart the patterns we find with agency and intention, and believe that these intentional agents control the world, sometimes invisibly from the top down, instead of bottom-up causal laws and randomness that makes up much of our world. Souls, spirits, ghosts, gods, demons, angels, aliens, intelligent designers, government conspiracists, and all manner of invisible agents with power and intention are believed to haunt our world and control our lives. Combined with our propensity to find meaningful patterns in both meaningful and meaningless noise, patternicity and agenticity form the cognitive basis of shamanism, paganism, animism, polytheism, monotheism, and all modes of Old and New Age spiritualisms.

Backed with the results of numerous cross-disciplinary scientific studies, Shermer’s arguments are thoroughly engrossing and objectively difficult to refute. This is by far Shermer’s best book to date.

(By the way, in the interest of full disclosure this book thoroughly validated the reviewer’s own beliefs.)

Software is Eating the World

[div class=attrib]By Marc Andreessen for the WSJ:[end-div]

This week, Hewlett-Packard (where I am on the board) announced that it is exploring jettisoning its struggling PC business in favor of investing more heavily in software, where it sees better potential for growth. Meanwhile, Google plans to buy up the cellphone handset maker Motorola Mobility. Both moves surprised the tech world. But both moves are also in line with a trend I’ve observed, one that makes me optimistic about the future growth of the American and world economies, despite the recent turmoil in the stock market.

In short, software is eating the world.

More than 10 years after the peak of the 1990s dot-com bubble, a dozen or so new Internet companies like Facebook and Twitter are sparking controversy in Silicon Valley, due to their rapidly growing private market valuations, and even the occasional successful IPO. With scars from the heyday of Webvan and Pets.com still fresh in the investor psyche, people are asking, “Isn’t this just a dangerous new bubble?”

I, along with others, have been arguing the other side of the case. (I am co-founder and general partner of venture capital firm Andreessen-Horowitz, which has invested in Facebook, Groupon, Skype, Twitter, Zynga, and Foursquare, among others. I am also personally an investor in LinkedIn.) We believe that many of the prominent new Internet companies are building real, high-growth, high-margin, highly defensible businesses.

. . .

Why is this happening now?

Six decades into the computer revolution, four decades since the invention of the microprocessor, and two decades into the rise of the modern Internet, all of the technology required to transform industries through software finally works and can be widely delivered at global scale.

. . .

Perhaps the single most dramatic example of this phenomenon of software eating a traditional business is the suicide of Borders and corresponding rise of Amazon. In 2001, Borders agreed to hand over its online business to Amazon under the theory that online book sales were non-strategic and unimportant.

Oops.

Today, the world’s largest bookseller, Amazon, is a software company—its core capability is its amazing software engine for selling virtually everything online, no retail stores necessary. On top of that, while Borders was thrashing in the throes of impending bankruptcy, Amazon rearranged its web site to promote its Kindle digital books over physical books for the first time. Now even the books themselves are software.

[div class=attrib]More from theSource here.[end-div]

Friending the Dead Online

Accumulating likes, collecting followers and quantifying one’s friends online is serious business. If you don’t have more than a couple of hundred professional connections in your LinkedIn profile, at least twice that number of “friends” on Facebook, or ten times that volume of Twitter followers, you’re most likely a corporate wallflower, a social has-been.

Professional connection collectors and others who measure their worth through numbers, such as politicians, can of course purchase “friends” and followers. There are a number of agencies online whose business is to purchase Twitter followers for their clients. Many of these “followers” come from dummy or inactive accounts; others are professional followers who also pay to be followed themselves. If this is not a sign that connections are now a commodity, then what is?

Of course social networks recognize that many of their members place a value on the quantity of connections — the more connections a member has, the more, well, the more something that person has. So many networks proactively and regularly present lists of potential connections to their registered members: “Know this person? Just click here to connect!” It’s become so simple and convenient to collect new relationships online.

So it comes as no surprise that a number of networks recommend friends and colleagues who have since departed, as in “passed away”. Christopher Mims over at Technology Review has a great article on the consequences of being followed by the dead online.

[div class=attrib]From Technology Review:[end-div]

Aside from the feeling that I’m giving up yet more of my privacy out of fear of becoming techno-socially irrelevant, the worst part of signing up for a new social network like Google+ is having the service recommend that I invite or classify a dead friend.

Now, I’m aware that I could prevent this happening by deleting this friend from my email contacts list, because I’m a Reasonably Savvy Geek™ and I’ve intuited that the Gmail contacts list is Google’s central repository of everyone with whom I’d like to pretend I’m more than just acquaintances (by ingesting them into the whirligig of my carefully mediated, frequently updated, lavishly illustrated social networking persona).

But what about the overwhelming majority of people who don’t know this or won’t bother? And what happens when I figure out how to overcome Facebook’s intransigence about being rendered irrelevant and extract my social graph from that site and stuff it into Google+, and this friend is re-imported? Round and round we go.

Even though I know it’s an option, I don’t want to simply erase this friend from my view of the Internet. Even though I know the virtual world, unlike the physical, can be reconfigured to swallow every last unsavory landmark in our past.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Images courtesy of Wikipedia / Creative Commons.[end-div]

Honey? Does this Outfit Look Good?

Regardless of culture, every spouse (most often the male in this case) on the planet knows to tread very carefully when formulating the answer to that question. An answer that’s conclusively negative will consign the outfit to the discard pile and earn a scowl; a response that’s only a little negative will get a scowl; a response that’s ebulliently positive will not be believed; one that’s slightly positive will not be believed and will earn another scowl; and the ambivalent, non-committal answer gets an even bigger scowl. This oft-repeated situation is very much a lose-lose event. That is, until now.

A new mobile app and website, called Go Try It On, aims to give crowdsourced, anonymous feedback in real-time to any of the outfit-challenged amongst us. Spouses can now relax – no more awkward conversations about clothing.

[div class=attrib]From the New York Times:[end-div]

There is a reason that women go shopping in groups — they like to ask their stylish friend, mother or the store’s dressing room attendant whether something looks good.

Go Try It On, a start-up that runs a Web site and mobile app for getting real-time feedback on outfits, believes that with computers and cellphones, fashion consultations should be possible even when people aren’t together.

“It’s crowdsourcing an opinion on an outfit and getting a quick, unbiased second opinion,” said Marissa Evans, Go Try It On’s founder and chief executive.

On Friday, Go Try It On will announce that it has raised $3 million from investors including SPA Investments and Index Ventures. It is also introducing a way to make money, by allowing brands to critique users’ outfits and suggest products, beginning with Gap and Sephora.

Users upload a photo or use a Webcam to show an outfit and solicit advice from other users. The service, which is one of several trying to make online shopping more social, started last year, and so far 250,000 people have downloaded the app and commented on outfits 10 million times. Most of the users are young women, and 30 percent live abroad.

[div class=attrib]More from theSource here.[end-div]

World’s Narrowest House – 4 Feet Wide

[div class=attrib]From TreeHugger:[end-div]

Aristotle said “No great genius was without a mixture of insanity.” Marcel Proust wrote “Everything great in the world is created by neurotics. They have composed our masterpieces, but we don’t consider what they have cost their creators in sleepless nights, and worst of all, fear of death.”

Perhaps that’s why Jakub Szczęsny designed this hermitage, this “studio for invited guests – young creators and intellectualists from all over the world” – it will drive them completely crazy.

Don’t get me wrong, I love the idea of living in small spaces. I write about them all the time. But the Keret House is 122 cm (48.03″) at its widest, 72 cm (28.35″) at its narrowest. I know people wider than that.

[div class=attrib]More from theSource here.[end-div]

The Postcard, Another Victim of Technology

That very quaint form of communication, the printed postcard, reserved for independent children writing to their clingy parents and boastful travelers writing to their (not) distant (enough) family members, may soon become as arcane as the LP or the paper map. Until the late ’90s there were some rather common sights associated with the postcard: the tourist lounging in a cafe, musing with great difficulty over the two or three pithy lines he would write from Paris; the traveler asking for a postcard stamp in broken German; the remaining 3 from a pack of 6 unwritten postcards of the Vatican now used as bookmarks; the oversaturated colors of the sunset.

Technology continues to march on, though some would argue that it may not necessarily be a march forward. Technology is indifferent to romance and historic precedent, and so the lowly postcard finds itself increasingly under threat from Flickr and Twitter and smartphones and Instagram and Facebook.

[div class=attrib]Charles Simic laments over at the New York Review of Books:[end-div]

Here it is already August and I have received only one postcard this summer. It was sent to me by a European friend who was traveling in Mongolia (as far as I could deduce from the postage stamp) and who simply sent me his greetings and signed his name. The picture in color on the other side was of a desert broken up by some parched hills without any hint of vegetation or sign of life, the name of the place in characters I could not read. Even receiving such an enigmatic card pleased me immensely. This piece of snail mail, I thought, left at the reception desk of a hotel, dropped in a mailbox, or taken to the local post office, made its unknown and most likely arduous journey by truck, train, camel, donkey—or whatever it was—and finally by plane to where I live.

Until a few years ago, hardly a day would go by in the summer without the mailman bringing a postcard from a vacationing friend or acquaintance. Nowadays, you’re bound to get an email enclosing a photograph, or, if your grandchildren are the ones doing the traveling, a brief message telling you that their flight has been delayed or that they have arrived. The terrific thing about postcards was their immense variety. It wasn’t just the Eiffel Tower or the Taj Mahal, or some other famous tourist attraction you were likely to receive in the mail, but also a card with a picture of a roadside diner in Iowa, the biggest hog at some state fair in the South, and even a funeral parlor touting the professional excellence that their customers have come to expect over a hundred years. Almost every business in this country, from a dog photographer to a fancy resort and spa, had a card. In my experience, people in the habit of sending cards could be divided into those who go for the conventional images of famous places and those who delight in sending images whose bad taste guarantees a shock or a laugh.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image of Aloha Nui Postcard of Luakaha, Home of C.M. Cooke, courtesy of Wikipedia / Creative Commons.[end-div]

Ravelry 1, Facebook 0

Facebook, with its estimated 600-700 million users, multi-billion-dollar valuation, and 2,500 or so employees in 15 countries, is an internet juggernaut by most measures. But measure a social network by the loyalty and adoration of its users, and Facebook is likely to be eclipsed by a social network of knitters and crocheters.

The online community is known as Ravelry. It was created by a wife-and-husband team, has four employees (including the founders), and boasts around 1.5 million members.

[div class=attrib]From Slate:[end-div]

The best social network you’ve (probably) never heard of is one-five-hundredth the size of Facebook. It has no video chat feature, it doesn’t let you check in to your favorite restaurant, and there are no games. The company that runs it has just four employees, one of whom is responsible for programming the entire operation. It has never taken any venture capital money and has no plans to go public. Despite these apparent shortcomings, the site’s members absolutely adore it. They consider it a key part of their social lives, and they use it to forge deeper connections with strangers—and share more about themselves—than you’re likely to see elsewhere online. There’s a good chance this site isn’t for you, but after you see how much fun people have there, you’ll wish you had a similar online haunt. The social network is called Ravelry. It’s for knitters (and crocheters).

Ravelry’s success is evidence in favor of an argument that you often hear from Facebook’s critics: A single giant social network is no fun. Social sites work better when they’re smaller and bespoke, created to cater to a specific group. What makes Ravelry work so well is that, in addition to being a place to catch up with friends, it is also a boon to its users’ favorite hobby—it helps people catalog their yarn, their favorite patterns, and the stuff they’ve made or plan on making. In other words, there is something to do there. And having something to do turns out to make an enormous difference in the way people interact with one another on the Web.

[div class=attrib]More from theSource here.[end-div]

Shnakule, Ishabor and Cinbric: The Biggest Networks You’ve Never Heard Of

Shnakule, Ishabor, Cinbric, Naargo and Vidzeban are not five fictional colleagues of Lord Voldemort from the mind of JK Rowling. They are indeed bad guys, but they live in our real world, online. Shnakule and its peers are the top 5 malware delivery networks. That is, they host a range of diverse and sophisticated malicious software, or malware, on ever-changing computer networks that seek to avoid detection. Malware on these networks includes: fake anti-virus software, fake software updates, drive-by downloads, suspicious link farming, ransomware, pharmacy spam, malvertising, work-at-home scams and unsolicited pornography. Other malware includes: computer viruses, worms, trojan horses, spyware, dishonest adware, and other unwanted software.

Malware researcher Chris Larsen, with Blue Coat, derived this malware infographic from the company’s Mid-Year Security Report. Interestingly, search engine poisoning is the most prevalent point of entry for the delivery of malware to a user’s computer. As the New York Times reports:

Search engine poisoning (SEP) makes up 40% of malware delivery vectors on the Web. It is easy to see why. People want to be able to trust that what they search for in Google, Bing or Yahoo is safe to click on. Users are not conditioned to think that search results could be harmful to the health of their computers. The other leading attack vectors on the Web all pale in comparison to SEP, with malvertising, email, porn and social networking all 10% of malware delivery.

[div class=attrib]Infographic courtesy of Blue Coat:[end-div]

MondayPoem: Three Six Five Zero

A poignant, poetic view of our relationships, increasingly mediated and recalled for us through technology. Conor O’Callaghan’s poem ushers in this week’s collection of articles at theDiagonal focused on technology.

Conor O’Callaghan is an Irish poet. He teaches at Wake Forest University and Sheffield Hallam University in the United Kingdom.

[div class=attrib]By Conor O’Callaghan, courtesy of Poetry Foundation:[end-div]

Three Six Five Zero

I called up tech and got the voicemail code.
It’s taken me this long to find my feet.
Since last we spoke that evening it has snowed.

Fifty-four new messages. Most are old
and blinking into a future months complete.
I contacted tech to get my voicemail code

to hear your voice, not some bozo on the road
the week of Thanksgiving dubbing me his sweet
and breaking up and bleating how it snowed

the Nashville side of Chattanooga and slowed
the beltway to a standstill. The radio said sleet.
The kid in tech sent on my voicemail code.

I blew a night on lightening the system’s load,
woke to white enveloping the trees, the street
that’s blanked out by my leaving. It had snowed.

Lately others’ pasts will turn me cold.
I heard out every message, pressed delete.
I’d happily forget my voice, the mail, its code.
We spoke at last that evening. Then it snowed.

Automating the Art Critic

Where evaluating artistic style was once the exclusive domain of seasoned art historians and art critics with many decades of experience, a computer armed with sophisticated image processing software is making a stir in art circles.

Computer scientist Dr. Lior Shamir of Lawrence Technological University in Michigan authored a recent paper suggesting that computers may be just as adept as human art experts at evaluating the similarities, and differences, of artistic styles.

Dr. Shamir’s breakthrough was to decompose the task of evaluating a painting into discrete, quantifiable components that could each be assigned a numeric value and thus made available for computation. These components, or descriptors, included surface texture, color intensity and type, distribution of lines and edges, and the number and types of shapes used in the painting.
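
To make the idea concrete, here is a minimal sketch in Python (using NumPy and Pillow) of how a painting can be reduced to a vector of numerical descriptors. To be clear, this is not Dr. Shamir’s actual feature set, which runs to thousands of descriptors; the color statistics and edge histogram below are simplified stand-ins chosen for illustration.

```python
# A toy descriptor extractor: reduces an image to a short numeric vector.
# Illustrative only; not the actual pipeline used in Dr. Shamir's study.
import numpy as np
from PIL import Image

def describe(path: str) -> np.ndarray:
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64) / 255.0
    gray = img.mean(axis=2)

    # Color descriptors: the mean and spread of each RGB channel.
    color = np.concatenate([img.mean(axis=(0, 1)), img.std(axis=(0, 1))])

    # Edge descriptors: a histogram of gradient magnitudes across the
    # canvas, a crude proxy for the distribution of lines and edges.
    gy, gx = np.gradient(gray)
    magnitude = np.hypot(gx, gy)
    edges, _ = np.histogram(magnitude, bins=8, range=(0.0, 1.0), density=True)

    return np.concatenate([color, edges])

# Style comparison then reduces to arithmetic, e.g. Euclidean distance:
# np.linalg.norm(describe("monet.jpg") - describe("pollock.jpg"))
```

Paintings whose descriptor vectors sit close together are, by this crude measure, similar in style; training a classifier on such vectors is the spirit of what the study describes.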

[div class=attrib]From the Economist:[end-div]

Dr Shamir, a computer scientist, presented 57 images by each of nine painters—Salvador Dalí, Giorgio de Chirico, Max Ernst, Vasily Kandinsky, Claude Monet, Jackson Pollock, Pierre-Auguste Renoir, Mark Rothko and Vincent van Gogh—to a computer, to see what it made of them. The computer broke the images into a number of so-called numerical descriptors. These descriptors quantified textures and colours, the statistical distribution of edges across a canvas, the distributions of particular types of shape, the intensity of the colour of individual points on a painting, and also the nature of any fractal-like patterns within it (fractals are features that reproduce similar shapes at different scales; the edges of snowflakes, for example).

All told, the computer identified 4,027 different numerical descriptors. Once their values had been established for each of the 513 artworks that had been fed into it, it was ready to do the analysis.

Dr Shamir’s aim was to look for quantifiable ways of distinguishing between the work of different artists. If such things could be established, it might make the task of deciding who painted what a little easier. Such decisions matter because, even excluding deliberate forgeries, there are many paintings in existence that cannot conclusively be attributed to a master rather than his pupils, or that may be honestly made copies whose provenance is now lost.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Persistence of Memory by Salvador Dalí, courtesy of Salvador Dalí, Gala-Salvador Dalí Foundation/Artists Rights Society.[end-div]

[div class=attrib]Love Song by Giorgio de Chirico, courtesy of Wikipedia / Creative Commons.[end-div]

The Right of Not Turning Left

In 2007 UPS made headlines by declaring left-hand turns undesirable for its army of delivery truck drivers. Of course, we left-handers have always known that our left, or “sinister,” side is fated to be seen as less attractive, still branded as unlucky or evil. Chinese culture brands left-handedness as improper as well.

UPS had other motives for pooh-poohing left-hand turns. For a company that runs over 95,000 big brown delivery trucks, optimizing delivery routes could result in tremendous savings. In fact, careful research showed that the company could reduce its annual delivery routes by 28.5 million miles, save around 3 million gallons of fuel and cut CO2 emissions by over 30,000 metric tons. And eliminating or reducing left-hand turns would be safer as well: of the 2.4 million crashes at intersections in the United States in 2007, most involved left-hand turns, according to the U.S. Federal Highway Administration.

Now urban planners and highway designers in the United States are evaluating the same question: how to reduce the need for left-hand turns. Drivers in Europe, especially in the United Kingdom, will be all too familiar with the roundabout as a technique for eliminating cross-traffic turns on many A and B roads. Roundabouts have yet to gain significant traction in the United States, so now comes the Diverging Diamond Interchange.
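
The logic behind UPS’s policy is easy to sketch in code. The toy Python planner below (an illustration only; UPS’s real routing software is proprietary and vastly more sophisticated) runs Dijkstra’s algorithm over (intersection, heading) states on a square grid and adds a surcharge to every left turn. Raise the surcharge and the cheapest route naturally avoids turning left.

```python
# Toy route planner: Dijkstra over (x, y, heading) states on a grid of
# city blocks, where every left turn incurs an extra cost to reflect
# idling time and crash risk. Illustrative only.
import heapq

MOVES = {"N": (0, 1), "E": (1, 0), "S": (0, -1), "W": (-1, 0)}
LEFT_OF = {"N": "W", "W": "S", "S": "E", "E": "N"}  # heading after a left turn

def route_cost(start, goal, size=10, left_penalty=5.0):
    # Start facing any direction; each block costs 1, a left turn costs more.
    pq = [(0.0, start[0], start[1], h) for h in MOVES]
    heapq.heapify(pq)
    settled = set()
    while pq:
        cost, x, y, h = heapq.heappop(pq)
        if (x, y) == goal:
            return cost  # cheapest total cost, left-turn surcharges included
        if (x, y, h) in settled:
            continue
        settled.add((x, y, h))
        for nh, (dx, dy) in MOVES.items():
            nx, ny = x + dx, y + dy
            if 0 <= nx < size and 0 <= ny < size:
                extra = left_penalty if nh == LEFT_OF[h] else 0.0
                heapq.heappush(pq, (cost + 1.0 + extra, nx, ny, nh))
    return float("inf")

# With left_penalty=0 the planner happily zig-zags; with a high penalty the
# cheapest path sticks to right turns and straight-aheads.
print(route_cost((0, 0), (9, 9)))
```

The same trick, penalizing or forbidding one kind of turn inside an otherwise standard shortest-path search, is broadly how turn restrictions are handled in real routing engines.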

[div class=attrib]From Slate:[end-div]

. . . Left turns are the bane of traffic engineers. Their idea of utopia runs clockwise. (UPS’ routing software famously has drivers turn right whenever possible, to save money and time.) The left-turning vehicle presents not only the aforementioned safety hazard, but a coagulation in the smooth flow of traffic. It’s either a car stopped in an active traffic lane, waiting to turn; or, even worse, it’s cars in a dedicated left-turn lane that, when traffic is heavy enough, requires its own “dedicated signal phase,” lengthening the delay for through traffic as well as cross traffic. And when traffic volumes really increase, as in the junction of two suburban arterials, multiple left-turn lanes are required, costing even more in space and money.

And, increasingly, because of shifting demographics and “lollipop” development patterns, suburban arterials are where the action is: They represent, according to one report, less than 10 percent of the nation’s road mileage, but account for 48 percent of its vehicle-miles traveled.

. . . What can you do when you’ve tinkered all you can with the traffic signals, added as many left-turn lanes as you can, rerouted as much traffic as you can, in areas that have already been built to a sprawling standard? Welcome to the world of the “unconventional intersection,” where left turns are engineered out of existence.

. . . “Grade separation” is the most extreme way to eliminate traffic conflicts. But it’s not only aesthetically unappealing in many environments, it’s expensive. There is, however, a cheaper, less disruptive approach, one that promises its own safety and efficiency gains, that has become recently popular in the United States: the diverging diamond interchange. There’s just one catch: You briefly have to drive the wrong way. But more on that in a bit.

The “DDI” is the brainchild of Gilbert Chlewicki, who first theorized what he called the “criss-cross interchange” as an engineering student at the University of Maryland in 2000.

The DDI is the sort of thing that is easier to visualize than describe (this simulation may help), but here, roughly, is how a DDI built under a highway overpass works: As the eastbound driver approaches the highway interchange (whose lanes run north-south), traffic lanes “criss cross” at a traffic signal. The driver will now find himself on the “left” side of the road, where he can either make an unimpeded left turn onto the highway ramp, or cross over again to the right once he has gone under the highway overpass.

[div class=attrib]More from theSource here.[end-div]

So the Universe is Flat?

Having just posted an article that described the universe in terms of holographic principles (a 3-D projection on a two-dimensional surface), it’s timely to put that theory in context, alongside other theories of course. There’s a theory that posits that the universe is a bubble wrought from the collision of high-dimensional branes (membranes, that is). There’s a theory that suggests that our universe is one of many in a soup of multiverses. Other theories suggest that the universe is made up of 9, 10 or 11 dimensions.

Then there’s the theory that the universe is flat, and that’s where Davide Castelvecchi (mathematician, science editor at Scientific American and blogger) comes in, describing the current thinking over at Degrees of Freedom.
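
For readers who want the formalism behind the word “flat,” here is a brief sketch using standard textbook cosmology; nothing below is specific to Castelvecchi’s post. In the Friedmann-Lemaître-Robertson-Walker (FLRW) model, the curvature of space enters through a single parameter k:

```latex
% FLRW metric: spatial curvature is captured by k = +1 (closed),
% k = 0 (flat), or k = -1 (open).
ds^2 = -c^2\,dt^2 + a(t)^2\left[\frac{dr^2}{1 - kr^2} + r^2\,d\Omega^2\right]

% Space is flat (k = 0) exactly when the total density \rho equals the
% critical density \rho_c:
\Omega \equiv \frac{\rho}{\rho_c}, \qquad \rho_c = \frac{3H^2}{8\pi G},
\qquad k = 0 \iff \Omega = 1
```

Measurements of the cosmic microwave background, including the WMAP data credited below, find Ω equal to 1 to within about one percent, which is the precise content of the claim that space is flat, or at least very nearly so.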

[div class=attrib]What Do You Mean, The Universe Is Flat? (Part I), from Degrees of Freedom:[end-div]

In the last decade—you may have read this news countless times—cosmologists have found what they say is rather convincing evidence that the universe (meaning 3-D space) is flat, or at least very close to being flat.

The exact meaning of flat, versus curved, space deserves a post of its own, and that is what Part II of this series will be about. For the time being, it is convenient to just visualize a plane as our archetype of flat object, and the surface of the Earth as our archetype of a curved one. Both are two-dimensional, but as I will describe in the next installment, flatness and curviness make sense in any number of dimensions.

What I do want to talk about here is what it is that is supposed to be flat.

When cosmologists say that the universe is flat they are referring to space—the nowverse and its parallel siblings of time past. Spacetime is not flat. It can’t be: Einstein’s general theory of relativity says that matter and energy curve spacetime, and there is enough matter and energy lying around to provide for curvature. Besides, if spacetime were flat I wouldn’t be sitting here because there would be no gravity to keep me on the chair. To put it succinctly: space can be flat even if spacetime isn’t.

Moreover, when they talk about the flatness of space cosmologists are referring to the large-scale appearance of the universe. When you “zoom in” and look at something of less-than-cosmic scale, such as the solar system, space—not just spacetime—is definitely not flat. Remarkable fresh evidence for this fact was obtained recently by the longest-running experiment in NASA history, Gravity Probe B, which took a direct measurement of the curvature of space around Earth. (And the most extreme case of non-flatness of space is thought to occur inside the event horizon of a black hole, but that’s another story.)

On a cosmic scale, the curvature created in space by the countless stars, black holes, dust clouds, galaxies, and so on constitutes just a bunch of little bumps on a space that is, overall, boringly flat.

Thus the seeming contradiction:

Matter curves spacetime. The universe is flat

is easily explained, too: spacetime is curved, and so is space; but on a large scale, space is overall flat.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image of Cosmic Microwave Background temperature fluctuations from the 7-year Wilkinson Microwave Anisotropy Probe data seen over the full sky. Courtesy of NASA.[end-div]

The Plastic Bag Wars

[div class=attrib]From Rolling Stone:[end-div]

American shoppers use an estimated 102 billion plastic shopping bags each year — more than 500 per consumer. Named by Guinness World Records as “the most ubiquitous consumer item in the world,” the ultrathin bags have become a leading source of pollution worldwide. They litter the world’s beaches, clog city sewers, contribute to floods in developing countries and fuel a massive flow of plastic waste that is killing wildlife from sea turtles to camels. “The plastic bag has come to represent the collective sins of the age of plastic,” says Susan Freinkel, author of Plastic: A Toxic Love Story.

Many countries have instituted tough new rules to curb the use of plastic bags. Some, like China, have issued outright bans. Others, including many European nations, have imposed stiff fees to pay for the mess created by all the plastic trash. “There is simply zero justification for manufacturing them anymore, anywhere,” the United Nations Environment Programme recently declared. But in the United States, the plastics industry has launched a concerted campaign to derail and defeat anti-bag measures nationwide. The effort includes well-placed political donations, intensive lobbying at both the state and national levels, and a pervasive PR campaign designed to shift the focus away from plastic bags to the supposed threat of canvas and paper bags — including misleading claims that reusable bags “could” contain bacteria and unsafe levels of lead.

“It’s just like Big Tobacco,” says Amy Westervelt, founding editor of Plastic Free Times, a website sponsored by the nonprofit Plastic Pollution Coalition. “They’re using the same underhanded tactics — and even using the same lobbying firm that Philip Morris started and bankrolled in the Nineties. Their sole aim is to maintain the status quo and protect their profits. They will stop at nothing to suppress or discredit science that clearly links chemicals in plastic to negative impacts on human, animal and environmental health.”

Made from high-density polyethylene — a byproduct of oil and natural gas — the single-use shopping bag was invented by a Swedish company in the mid-Sixties and brought to the U.S. by ExxonMobil. Introduced to grocery-store checkout lines in 1976, the “T-shirt bag,” as it is known in the industry, can now be found literally everywhere on the planet, from the bottom of the ocean to the peaks of Mount Everest. The bags are durable, waterproof, cheaper to produce than paper bags and able to carry 1,000 times their own weight. They are also a nightmare to recycle: The flimsy bags, many thinner than a strand of human hair, gum up the sorting equipment used by most recycling facilities. “Plastic bags and other thin-film plastic is the number-one enemy of the equipment we use,” says Jeff Murray, vice president of Far West Fibers, the largest recycler in Oregon. “More than 300,000 plastic bags are removed from our machines every day — and since most of the removal has to be done by hand, that means more than 25 percent of our labor costs involves plastic-bag removal.”

[div class=attrib]More from theSource here.[end-div]