The Right of Not Turning Left

In 2007 UPS made headlines by declaring left-hand turns undesirable for its army of delivery truck drivers. Of course, we left-handers have always known that our left, or “sinister”, side gets less respect, still branded as unlucky or evil; Chinese culture has long considered left-handedness improper as well.

UPS had other motives for pooh-poohing left-hand turns. For a company that runs over 95,000 big brown delivery trucks, optimizing delivery routes could yield tremendous savings. In fact, careful research showed that the company could shorten its annual delivery routes by 28.5 million miles, save around 3 million gallons of fuel and cut CO2 emissions by over 30,000 metric tons. Eliminating or reducing left-hand turns would be safer as well: of the 2.4 million crashes at intersections in the United States in 2007, most involved left-hand turns, according to the U.S. Federal Highway Administration.

Now urban planners and highway designers in the United States are evaluating the same thing: how to reduce the need for left-hand turns. Drivers in Europe, especially the United Kingdom, will be all too familiar with the roundabout, which eliminates turns across oncoming traffic on many A and B roads. Roundabouts have yet to gain significant traction in the United States, so now comes the Diverging Diamond Interchange.

[div class=attrib]From Slate:[end-div]

. . . Left turns are the bane of traffic engineers. Their idea of utopia runs clockwise. (UPS’ routing software famously has drivers turn right whenever possible, to save money and time.) The left-turning vehicle presents not only the aforementioned safety hazard, but a coagulation in the smooth flow of traffic. It’s either a car stopped in an active traffic lane, waiting to turn; or, even worse, it’s cars in a dedicated left-turn lane that, when traffic is heavy enough, requires its own “dedicated signal phase,” lengthening the delay for through traffic as well as cross traffic. And when traffic volumes really increase, as in the junction of two suburban arterials, multiple left-turn lanes are required, costing even more in space and money.

And, increasingly, because of shifting demographics and “lollipop” development patterns, suburban arterials are where the action is: They represent, according to one report, less than 10 percent of the nation’s road mileage, but account for 48 percent of its vehicle-miles traveled.

. . . What can you do when you’ve tinkered all you can with the traffic signals, added as many left-turn lanes as you can, rerouted as much traffic as you can, in areas that have already been built to a sprawling standard? Welcome to the world of the “unconventional intersection,” where left turns are engineered out of existence.

. . . “Grade separation” is the most extreme way to eliminate traffic conflicts. But it’s not only aesthetically unappealing in many environments, it’s expensive. There is, however, a cheaper, less disruptive approach, one that promises its own safety and efficiency gains, that has become recently popular in the United States: the diverging diamond interchange. There’s just one catch: You briefly have to drive the wrong way. But more on that in a bit.

The “DDI” is the brainchild of Gilbert Chlewicki, who first theorized what he called the “criss-cross interchange” as an engineering student at the University of Maryland in 2000.

The DDI is the sort of thing that is easier to visualize than describe (this simulation may help), but here, roughly, is how a DDI built under a highway overpass works: As the eastbound driver approaches the highway interchange (whose lanes run north-south), traffic lanes “criss cross” at a traffic signal. The driver will now find himself on the “left” side of the road, where he can either make an unimpeded left turn onto the highway ramp, or cross over again to the right once he has gone under the highway overpass.

[div class=attrib]More from theSource here.[end-div]

So the Universe is Flat?


Having just posted an article that described the universe in terms of holographic principles, a 3-D projection on a two-dimensional surface, it’s timely to put that theory in context with some of its rivals. One theory posits that the universe is a bubble wrought from the collision of higher-dimensional branes (short for membranes). Another suggests that our universe is just one of many in a vast multiverse. Still others hold that the universe is made up of 9, 10 or 11 dimensions.

There’s another theory that the universe is flat, and that’s where Davide Castelvecchi (mathematician, science editor at Scientific American and blogger) comes in: over at Degrees of Freedom he describes the current thinking.

[div class=attrib]What Do You Mean, The Universe Is Flat? (Part I), from Degrees of Freedom:[end-div]

In the last decade—you may have read this news countless times—cosmologists have found what they say is rather convincing evidence that the universe (meaning 3-D space) is flat, or at least very close to being flat.

The exact meaning of flat, versus curved, space deserves a post of its own, and that is what Part II of this series will be about. For the time being, it is convenient to just visualize a plane as our archetype of flat object, and the surface of the Earth as our archetype of a curved one. Both are two-dimensional, but as I will describe in the next installment, flatness and curviness make sense in any number of dimensions.

What I do want to talk about here is what it is that is supposed to be flat.

When cosmologists say that the universe is flat they are referring to space—the nowverse and its parallel siblings of time past. Spacetime is not flat. It can’t be: Einstein’s general theory of relativity says that matter and energy curve spacetime, and there is enough matter and energy lying around to provide for curvature. Besides, if spacetime were flat I wouldn’t be sitting here because there would be no gravity to keep me on the chair. To put it succinctly: space can be flat even if spacetime isn’t.

Moreover, when they talk about the flatness of space cosmologists are referring to the large-scale appearance of the universe. When you “zoom in” and look at something of less-than-cosmic scale, such as the solar system, space—not just spacetime—is definitely not flat. Remarkable fresh evidence for this fact was obtained recently by the longest-running experiment in NASA history, Gravity Probe B, which took a direct measurement of the curvature of space around Earth. (And the most extreme case of non-flatness of space is thought to occur inside the event horizon of a black hole, but that’s another story.)

On a cosmic scale, the curvature created in space by the countless stars, black holes, dust clouds, galaxies, and so on constitutes just a bunch of little bumps on a space that is, overall, boringly flat.

Thus the seeming contradiction:

Matter curves spacetime. The universe is flat

is easily explained, too: spacetime is curved, and so is space; but on a large scale, space is overall flat.
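
A supplementary note from us, not part of Castelvecchi’s post: in the standard cosmological model “flat” has a precise meaning. The Friedmann equation ties the curvature of space to how the total density of matter and energy compares with a critical value, and measurements such as those of the cosmic microwave background find that ratio indistinguishable from one.

```latex
% Friedmann equation: the spatial curvature k vanishes exactly when the
% total density \rho equals the critical density \rho_c = 3H^2/(8\pi G).
\left(\frac{\dot{a}}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^{2}}{a^{2}},
\qquad
\Omega \equiv \frac{\rho}{\rho_c} = 1 \;\Longleftrightarrow\; k = 0 \quad (\text{flat space})
```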

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image of Cosmic Microwave Background temperature fluctuations from the 7-year Wilkinson Microwave Anisotropy Probe data seen over the full sky. Courtesy of NASA.[end-div]

The Plastic Bag Wars

[div class=attrib]From Rolling Stone:[end-div]

American shoppers use an estimated 102 billion plastic shopping bags each year — more than 500 per consumer. Named by Guinness World Records as “the most ubiquitous consumer item in the world,” the ultrathin bags have become a leading source of pollution worldwide. They litter the world’s beaches, clog city sewers, contribute to floods in developing countries and fuel a massive flow of plastic waste that is killing wildlife from sea turtles to camels. “The plastic bag has come to represent the collective sins of the age of plastic,” says Susan Freinkel, author of Plastic: A Toxic Love Story.

Many countries have instituted tough new rules to curb the use of plastic bags. Some, like China, have issued outright bans. Others, including many European nations, have imposed stiff fees to pay for the mess created by all the plastic trash. “There is simply zero justification for manufacturing them anymore, anywhere,” the United Nations Environment Programme recently declared. But in the United States, the plastics industry has launched a concerted campaign to derail and defeat anti-bag measures nationwide. The effort includes well-placed political donations, intensive lobbying at both the state and national levels, and a pervasive PR campaign designed to shift the focus away from plastic bags to the supposed threat of canvas and paper bags — including misleading claims that reusable bags “could” contain bacteria and unsafe levels of lead.

“It’s just like Big Tobacco,” says Amy Westervelt, founding editor of Plastic Free Times, a website sponsored by the nonprofit Plastic Pollution Coalition. “They’re using the same underhanded tactics — and even using the same lobbying firm that Philip Morris started and bankrolled in the Nineties. Their sole aim is to maintain the status quo and protect their profits. They will stop at nothing to suppress or discredit science that clearly links chemicals in plastic to negative impacts on human, animal and environmental health.”

Made from high-density polyethylene — a byproduct of oil and natural gas — the single-use shopping bag was invented by a Swedish company in the mid-Sixties and brought to the U.S. by ExxonMobil. Introduced to grocery-store checkout lines in 1976, the “T-shirt bag,” as it is known in the industry, can now be found literally everywhere on the planet, from the bottom of the ocean to the peaks of Mount Everest. The bags are durable, waterproof, cheaper to produce than paper bags and able to carry 1,000 times their own weight. They are also a nightmare to recycle: The flimsy bags, many thinner than a strand of human hair, gum up the sorting equipment used by most recycling facilities. “Plastic bags and other thin-film plastic is the number-one enemy of the equipment we use,” says Jeff Murray, vice president of Far West Fibers, the largest recycler in Oregon. “More than 300,000 plastic bags are removed from our machines every day — and since most of the removal has to be done by hand, that means more than 25 percent of our labor costs involves plastic-bag removal.”

[div class=attrib]More from theSource here.[end-div]

Using An Antimagnet to Build an Invisibility Cloak

The invisibility cloak of science fiction takes another step toward science fact this week. Researchers report on the physics arXiv a practical method for building a device that cloaks magnetic fields. Alvaro Sanchez and colleagues at Spain’s Universitat Autonoma de Barcelona describe the design of such a device, which exploits the bizarre properties of metamaterials.

[div class=attrib]From Technology Review:[end-div]

A metamaterial is a bizarre substance with properties that physicists can fine tune as they wish. Tuned in a certain way, a metamaterial can make light perform all kinds of gymnastics, steering it round objects to make them seem invisible.

This phenomenon, known as cloaking, is set to revolutionise various areas of electromagnetic science.

But metamaterials can do more. One idea is that as well as electromagnetic fields, metamaterials ought to be able to manipulate plain old magnetic fields too. After all, a static magnetic field is merely an electromagnetic wave with a frequency of zero.

So creating a magnetic invisibility cloak isn’t such a crazy idea.

Today, Alvaro Sanchez and friends at Universitat Autonoma de Barcelona in Spain reveal the design of a cloak that can do just this.

The basic ingredients are two materials: one with a permeability smaller than 1 in one direction, and one with a permeability greater than 1 in a perpendicular direction.

Materials with these permeabilities are easy to find. Superconductors have a permeability of 0 and ordinary ferromagnets have a permeability greater than 1.

The difficulty is creating a material with both these properties at the same time. Sanchez and co solve the problem with a design consisting of ferromagnetic shells coated with a superconducting layer.

The result is a device that can completely shield the outside world from a magnet inside it.
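
A quick aside from us, not part of the Technology Review piece: in a linear magnetic material the flux density scales with the relative permeability, so a superconducting layer (permeability near zero) expels field lines while a ferromagnetic layer (permeability greater than one) draws them in. The commonly quoted condition for an ideal cylindrical magnetic cloak, which we recall from the antimagnet literature and offer here as an assumption rather than a quotation, is that the two anisotropic permeabilities multiply to one.

```latex
% Constitutive relation, plus the (assumed) ideal-cloak condition for a
% homogeneous anisotropic cylindrical shell: radial and angular
% permeabilities on opposite sides of 1, with product equal to 1.
\mathbf{B} = \mu_0 \mu_r \mathbf{H},
\qquad
\mu_\rho\,\mu_\theta = 1, \quad \mu_\rho < 1 < \mu_\theta
```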

[div class=attrib]More from theSource here.[end-div]

Nuclear Fission in the Kitchen

theDiagonal usually does not report on the news, though we do make a few worthy exceptions based on the import or surreal nature of an event. A case in point below.

Humans do have a curious way of repeating history. In a less meticulous attempt to re-enact the late-90s true story, which eventually led to the book “The Radioactive Boy Scout”, a Swedish man was recently arrested for trying to set up a nuclear reactor in his kitchen.

[div class=attrib]From the AP:[end-div]

A Swedish man who was arrested after trying to split atoms in his kitchen said Wednesday he was only doing it as a hobby.

Richard Handl told The Associated Press that he had the radioactive elements radium, americium and uranium in his apartment in southern Sweden when police showed up and arrested him on charges of unauthorized possession of nuclear material.

The 31-year-old Handl said he had tried for months to set up a nuclear reactor at home and kept a blog about his experiments, describing how he created a small meltdown on his stove.

Only later did he realize it might not be legal and sent a question to Sweden’s Radiation Authority, which answered by sending the police.

“I have always been interested in physics and chemistry,” Handl said, adding he just wanted to “see if it’s possible to split atoms at home.”

[div class=attrib]More from theSource here.[end-div]

A Reason for Reason

[div class=attrib]From Wilson Quarterly:[end-div]

For all its stellar achievements, human reason seems particularly ill suited to, well, reasoning. Study after study demonstrates reason’s deficiencies, such as the oft-noted confirmation bias (the tendency to recall, select, or interpret evidence in a way that supports one’s preexisting beliefs) and people’s poor performance on straightforward logic puzzles. Why is reason so defective?

To the contrary, reason isn’t defective in the least, argue cognitive scientists Hugo Mercier of the University of Pennsylvania and Dan Sperber of the Jean Nicod Institute in Paris. The problem is that we’ve misunderstood why reason exists and measured its strengths and weaknesses against the wrong standards.

Mercier and Sperber argue that reason did not evolve to allow individuals to think through problems and make brilliant decisions on their own. Rather, it serves a fundamentally social purpose: It promotes argument. Research shows that people solve problems more effectively when they debate them in groups—and the interchange also allows people to hone essential social skills. Supposed defects such as the confirmation bias are well fitted to this purpose because they enable people to efficiently marshal the evidence they need in arguing with others.

[div class=attrib]More from theSource here.[end-div]

Ultimate logic: To infinity and beyond

[div class=attrib]From the New Scientist:[end-div]

WHEN David Hilbert left the podium at the Sorbonne in Paris, France, on 8 August 1900, few of the assembled delegates seemed overly impressed. According to one contemporary report, the discussion following his address to the second International Congress of Mathematicians was “rather desultory”. Passions seem to have been more inflamed by a subsequent debate on whether Esperanto should be adopted as mathematics’ working language.

Yet Hilbert’s address set the mathematical agenda for the 20th century. It crystallised into a list of 23 crucial unanswered questions, including how to pack spheres to make best use of the available space, and whether the Riemann hypothesis, which concerns how the prime numbers are distributed, is true.

Today many of these problems have been resolved, sphere-packing among them. Others, such as the Riemann hypothesis, have seen little or no progress. But the first item on Hilbert’s list stands out for the sheer oddness of the answer supplied by generations of mathematicians since: that mathematics is simply not equipped to provide an answer.

This curiously intractable riddle is known as the continuum hypothesis, and it concerns that most enigmatic quantity, infinity. Now, 140 years after the problem was formulated, a respected US mathematician believes he has cracked it. What’s more, he claims to have arrived at the solution not by using mathematics as we know it, but by building a new, radically stronger logical structure: a structure he dubs “ultimate L”.

The journey to this point began in the early 1870s, when the German Georg Cantor was laying the foundations of set theory. Set theory deals with the counting and manipulation of collections of objects, and provides the crucial logical underpinnings of mathematics: because numbers can be associated with the size of sets, the rules for manipulating sets also determine the logic of arithmetic and everything that builds on it.
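
For readers who want the formal statement (standard set theory, not part of the New Scientist excerpt), the continuum hypothesis asserts that no infinity sits strictly between the size of the natural numbers and the size of the real numbers:

```latex
% Continuum hypothesis: no cardinality lies strictly between the
% countable infinity \aleph_0 and the continuum 2^{\aleph_0}.
\neg\,\exists\, S :\ \aleph_0 < |S| < 2^{\aleph_0}
\qquad\Longleftrightarrow\qquad
2^{\aleph_0} = \aleph_1
```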

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

Are You Real, Or Are You a Hologram?

The principle of a holographic universe, not to be confused with Holographic Universe, an album by Swedish death metal band Scar Symmetry, continues to hold serious sway among a not insignificant group of even more serious cosmologists.

Originally proposed by noted physicists Gerard ‘t Hooft and Leonard Susskind in the mid-1990s, the holographic theory of the universe suggests that our entire universe can be described as an informational 3-D projection painted in two dimensions on a cosmological boundary. This is analogous to the flat hologram printed on a credit card creating the illusion of a 3-D object.

While current mathematical theory and experimental verification are lagging, the theory has garnered much interest and forward momentum — so this area warrants a brief status check, courtesy of the New Scientist.

[div class=attrib]From the New Scientist:[end-div]

TAKE a look around you. The walls, the chair you’re sitting in, your own body – they all seem real and solid. Yet there is a possibility that everything we see in the universe – including you and me – may be nothing more than a hologram.

It sounds preposterous, yet there is already some evidence that it may be true, and we could know for sure within a couple of years. If it does turn out to be the case, it would turn our common-sense conception of reality inside out.

The idea has a long history, stemming from an apparent paradox posed by Stephen Hawking’s work in the 1970s. He discovered that black holes slowly radiate their mass away. This Hawking radiation appears to carry no information, however, raising the question of what happens to the information that described the original star once the black hole evaporates. It is a cornerstone of physics that information cannot be destroyed.

In 1972 Jacob Bekenstein at the Hebrew University of Jerusalem, Israel, showed that the information content of a black hole is proportional to the two-dimensional surface area of its event horizon – the point-of-no-return for in-falling light or matter. Later, string theorists managed to show how the original star’s information could be encoded in tiny lumps and bumps on the event horizon, which would then imprint it on the Hawking radiation departing the black hole.

This solved the paradox, but theoretical physicists Leonard Susskind and Gerard ‘t Hooft decided to take the idea a step further: if a three-dimensional star could be encoded on a black hole’s 2D event horizon, maybe the same could be true of the whole universe. The universe does, after all, have a horizon 42 billion light years away, beyond which point light would not have had time to reach us since the big bang. Susskind and ‘t Hooft suggested that this 2D “surface” may encode the entire 3D universe that we experience – much like the 3D hologram that is projected from your credit card.
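
An aside for the formula-minded, courtesy of textbook physics rather than the New Scientist article: the Bekenstein-Hawking result makes the area scaling explicit, with a black hole’s entropy, and hence its maximum information content, proportional to the horizon area measured in Planck units.

```latex
% Bekenstein-Hawking entropy: S grows with the horizon area A,
% not with the enclosed volume; \ell_P is the Planck length.
S_{BH} = \frac{k_B\, c^{3}}{4\, G\, \hbar}\, A = \frac{k_B\, A}{4\,\ell_P^{\,2}},
\qquad
\ell_P = \sqrt{\frac{G \hbar}{c^{3}}}
```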

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Computerarts.[end-div]

Flowing Water on Mars?

NASA’s latest spacecraft to visit Mars, the Mars Reconnaissance Orbiter, has made some stunning observations that show the possibility of flowing water on the red planet. Intriguingly,  repeated observations of the same regions over several Martian seasons show visible changes attributable to some kind of dynamic flow.

[div class=attrib]From NASA / JPL:[end-div]

Observations from NASA’s Mars Reconnaissance Orbiter have revealed possible flowing water during the warmest months on Mars.

“NASA’s Mars Exploration Program keeps bringing us closer to determining whether the Red Planet could harbor life in some form,” NASA Administrator Charles Bolden said, “and it reaffirms Mars as an important future destination for human exploration.”

Dark, finger-like features appear and extend down some Martian slopes during late spring through summer, fade in winter, and return during the next spring. Repeated observations have tracked the seasonal changes in these recurring features on several steep slopes in the middle latitudes of Mars’ southern hemisphere.

“The best explanation for these observations so far is the flow of briny water,” said Alfred McEwen of the University of Arizona, Tucson. McEwen is the principal investigator for the orbiter’s High Resolution Imaging Science Experiment (HiRISE) and lead author of a report about the recurring flows published in Thursday’s edition of the journal Science.

Some aspects of the observations still puzzle researchers, but flows of liquid brine fit the features’ characteristics better than alternate hypotheses. Saltiness lowers the freezing temperature of water. Sites with active flows get warm enough, even in the shallow subsurface, to sustain liquid water that is about as salty as Earth’s oceans, while pure water would freeze at the observed temperatures.
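
A brief aside from us, not part of the NASA release: the chemistry at work is ordinary freezing-point depression. In the dilute-solution approximation the shift scales with the concentration of dissolved particles; Martian perchlorate brines are far from dilute, so the real depression is larger, but the formula conveys the principle.

```latex
% Freezing-point depression (dilute-solution approximation):
% i = van 't Hoff factor (about 2 for NaCl), K_f = 1.86 K·kg/mol for
% water, m = molality of the dissolved salt.
\Delta T_f = i\, K_f\, m
```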

[div class=attrib]More from theSource here.[end-div]

Tim Berners-Lee’s “Baby” Hits 20 – Happy Birthday World Wide Web

In early 1990 at CERN headquarters in Geneva, Switzerland, Tim Berners-Lee and Robert Cailliau published a formal proposal to build a “Hypertext project” called “WorldWideWeb” as a “web” of “hypertext documents” to be viewed by “browsers”.

Following development work the pair introduced the proposal to a wider audience in December, and on August 6, 1991, 20 years ago, the World Wide Web officially opened for business on the internet. On that day Berners-Lee posted the first web page — a short summary of the World Wide Web project on the alt.hypertext newsgroup.

The page authored by Tim Berners-Lee was http://info.cern.ch/hypertext/WWW/TheProject.html. A later version of the page can be found here. The page summarized Berners-Lee’s project for organizing information on a computer network using a web of links. In fact, the effort was originally dubbed “Mesh”, but later became the “World Wide Web”.

The first photograph on the web was uploaded by Berners-Lee in 1992, an image of the CERN house band Les Horribles Cernettes. Twenty years on, one website alone, Flickr, hosts around 5.75 billion images.

[div class=attrib]Photograph of Les Horribles Cernettes, the very first photo to be published on the world wide web in 1992. Image courtesy of Cernettes / Silvano de Gennaro. Granted under fair use.[end-div]

The End of 140

Five years in internet time is analogous to several entire human lifespans. So, it’s no surprise that Twitter seems to have been with us forever. Despite the near ubiquity of the little blue bird, most of the service’s tweeters have no idea why they are constrained to using a mere 140 characters to express themselves.

Farhad Manjoo over at Slate has a well-reasoned plea to increase this upper character limit for the more garrulous amongst us.

Perhaps more important, though, is the effect of this truncated form of messaging on our broader mechanisms of expression and communication. Time will tell whether our patterns of speech and the written word will adjust accordingly.

[div class=attrib]From Slate:[end-div]

Five years ago this month, Twitter opened itself up to the public. The new service, initially called Twttr, was born out of software engineer Jack Dorsey’s fascination with an overlooked corner of the modern metropolis—the central dispatch systems that track delivery trucks, taxis, emergency vehicles, and bike messengers as they’re moving about town. As Dorsey once told the Los Angeles Times, the logs of central dispatchers contained “this very rich sense of what’s happening right now in the city.” For a long time, Dorsey tried to build a public version of that log. It was only around 2005, when text messaging began to take off in America, that his dream became technically feasible. There was only one problem with building Twttr on mobile carriers’ SMS system, though—texts were limited to 160 characters, and if you included space for a user’s handle, that left only 140 characters per message.

What could you say in 140 characters? Not a whole lot—and that was the point. Dorsey believed that Twitter would be used for status updates—his prototypical tweets were “in bed” and “going to park,” and his first real tweet was “inviting coworkers.” That’s not how we use Twitter nowadays. In 2009, the company acknowledged that its service had “outgrown the concept of personal status updates,” and it changed its home-screen prompt from “What are you doing?” to the more open-ended “What’s happening?”

As far as I can tell, though, Twitter has never considered removing the 140-character limit, and Twitter’s embrace of this constraint has been held up as one of the key reasons for the service’s success. But I’m hoping Twitter celebrates its fifth birthday by rethinking this stubborn stance. The 140-character limit now feels less like a feature than a big, obvious bug. I don’t want Twitter to allow messages of unlimited length, as that would encourage people to drone on interminably. But since very few Twitter users now access the system through SMS, it’s technically possible for the network to accommodate longer tweets. I suggest doubling the ceiling—give me 280 characters, Jack, and I’ll give you the best tweets you’ve ever seen!

[div class=attrib]More from theSource here.[end-div]

MondayPoem: Life Cycle of Common Man

Twice Poet Laureate of the United States, Howard Nemerov catalogs the human condition in his work “Life Cycle of Common Man”.

[div class=attrib]By Howard Nemerov, courtesy of Poetry Foundation:[end-div]

Life Cycle of Common Man

Roughly figured, this man of moderate habits,
This average consumer of the middle class,
Consumed in the course of his average life span
Just under half a million cigarettes,
Four thousand fifths of gin and about
A quarter as much vermouth; he drank
Maybe a hundred thousand cups of coffee,
And counting his parents’ share it cost
Something like half a million dollars
To put him through life. How many beasts
Died to provide him with meat, belt and shoes
Cannot be certainly said.
But anyhow,
It is in this way that a man travels through time,
Leaving behind him a lengthening trail
Of empty bottles and bones, of broken shoes,
Frayed collars and worn out or outgrown
Diapers and dinnerjackets, silk ties and slickers.

Given the energy and security thus achieved,
He did . . . ? What? The usual things, of course,
The eating, dreaming, drinking and begetting,
And he worked for the money which was to pay
For the eating, et cetera, which were necessary
If he were to go on working for the money, et cetera,
But chiefly he talked. As the bottles and bones
Accumulated behind him, the words proceeded
Steadily from the front of his face as he
Advanced into the silence and made it verbal.
Who can tally the tale of his words? A lifetime
Would barely suffice for their repetition;
If you merely printed all his commas the result
Would be a very large volume, and the number of times
He said “thank you” or “very little sugar, please,”
Would stagger the imagination. There were also
Witticisms, platitudes, and statements beginning
“It seems to me” or “As I always say.”
Consider the courage in all that, and behold the man
Walking into deep silence, with the ectoplastic
Cartoon’s balloon of speech proceeding
Steadily out of the front of his face, the words
Borne along on the breath which is his spirit
Telling the numberless tale of his untold Word
Which makes the world his apple, and forces him to eat.

[div class=attrib]Source: The Collected Poems of Howard Nemerov (The University of Chicago Press, 1977).[end-div]

The Prospect of Immortality

A recently opened solo art show takes a fascinating inside peek at the cryonics industry. Entitled “The Prospect of Immortality”, the show features photography by Murray Ballard. Ballard’s collection of images follows a five-year investigation of cryonics in England, the United States and Russia. Cryonics is the practice of freezing the human body just after death in the hope that future science will one day have the capability of restoring it to life.

Ballard presents the topic in a fair and balanced way, leaving viewers to question and weigh the process of cryonics for themselves.

[div class=attrib]From Impressions Gallery:[end-div]

The result of five years’ unprecedented access and international investigation, Murray Ballard offers an amazing photographic insight into the practice of cryonics: the process of freezing a human body after death in the hope that scientific advances may one day bring it back to life. Premiering at Impressions Gallery, this is Murray Ballard’s first major solo show.

Ballard’s images take the viewer on a journey through the tiny but dedicated international cryonics community, from the English seaside retirement town of Peacehaven; to the high-tech laboratories of Arizona; to the rudimentary facilities of Kriorus, just outside Moscow.  Worldwide there are approximately 200 ‘patients’ stored permanently in liquid nitrogen, with a further thousand people signed up for cryonics after death.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Images courtesy of Impressions Gallery / Murray Ballard.[end-div]

The Science Behind Dreaming

[div class=attrib]From Scientific American:[end-div]

For centuries people have pondered the meaning of dreams. Early civilizations thought of dreams as a medium between our earthly world and that of the gods. In fact, the Greeks and Romans were convinced that dreams had certain prophetic powers. While there has always been a great interest in the interpretation of human dreams, it wasn’t until the end of the nineteenth century that Sigmund Freud and Carl Jung put forth some of the most widely-known modern theories of dreaming. Freud’s theory centred around the notion of repressed longing — the idea that dreaming allows us to sort through unresolved, repressed wishes. Carl Jung (who studied under Freud) also believed that dreams had psychological importance, but proposed different theories about their meaning.

Since then, technological advancements have allowed for the development of other theories. One prominent neurobiological theory of dreaming is the “activation-synthesis hypothesis,” which states that dreams don’t actually mean anything: they are merely electrical brain impulses that pull random thoughts and imagery from our memories. Humans, the theory goes, construct dream stories after they wake up, in a natural attempt to make sense of it all. Yet, given the vast documentation of realistic aspects to human dreaming as well as indirect experimental evidence that other mammals such as cats also dream, evolutionary psychologists have theorized that dreaming really does serve a purpose. In particular, the “threat simulation theory” suggests that dreaming should be seen as an ancient biological defence mechanism that provided an evolutionary advantage because of  its capacity to repeatedly simulate potential threatening events – enhancing the neuro-cognitive mechanisms required for efficient threat perception and avoidance.

So, over the years, numerous theories have been put forth in an attempt to illuminate the mystery behind human dreams, but, until recently, strong tangible evidence has remained largely elusive.

Yet, new research published in the Journal of Neuroscience provides compelling insights into the mechanisms that underlie dreaming and the strong relationship our dreams have with our memories. Cristina Marzano and her colleagues at the University of Rome have succeeded, for the first time, in explaining how humans remember their dreams. The scientists predicted the likelihood of successful dream recall based on a signature pattern of brain waves. In order to do this, the Italian research team invited 65 students to spend two consecutive nights in their research laboratory.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image: The Knight’s Dream by Antonio de Pereda. Courtesy of Wikipedia / Creative Commons.[end-div]

Rate This Article: What’s Wrong with the Culture of Critique

[div class=attrib]From Wired:[end-div]

You don’t have to read this essay to know whether you’ll like it. Just go online and assess how provocative it is by the number of comments at the bottom of the web version. (If you’re already reading the web version, done and done.) To find out whether it has gone viral, check how many people have hit the little thumbs-up, or tweeted about it, or liked it on Facebook, or dug it on Digg. These increasingly ubiquitous mechanisms of assessment have some real advantages: In this case, you could save 10 minutes’ reading time. Unfortunately, life is also getting a little ruined in the process.

A funny thing has quietly accompanied our era’s eye-gouging proliferation of information, and by funny I mean not very funny. For every ocean of new data we generate each hour—videos, blog posts, VRBO listings, MP3s, ebooks, tweets—an attendant ocean’s worth of reviewage follows. The Internet-begotten abundance of absolutely everything has given rise to a parallel universe of stars, rankings, most-recommended lists, and other valuations designed to help us sort the wheat from all the chaff we’re drowning in. I’ve never been to Massimo’s pizzeria in Princeton, New Jersey, but thanks to the Yelpers I can already describe the personality of Big Vince, a man I’ve never met. (And why would I want to? He’s surly and drums his fingers while you order, apparently.) Everything exists to be charted and evaluated, and the charts and evaluations themselves grow more baroque by the day. Was this review helpful to you? We even review our reviews.

Technoculture critic and former Wired contributor Erik Davis is concerned about the proliferation of reviews, too. “Our culture is afflicted with knowingness,” he says. “We exalt in being able to know as much as possible. And that’s great on many levels. But we’re forgetting the pleasures of not knowing. I’m no Luddite, but we’ve started replacing actual experience with someone else’s already digested knowledge.”

Of course, Yelpification of the universe is so thorough as to be invisible. I scarcely blinked the other day when, after a Skype chat with my mother, I was asked to rate the call. (I assumed they were talking about connection quality, but if they want to hear about how Mom still pronounces it noo-cu-lar, I’m happy to share.) That same afternoon, the UPS guy delivered a guitar stand I’d ordered. Even before I could weigh in on the product, or on the seller’s expeditiousness, I was presented with a third assessment opportunity. It was emblazoned on the cardboard box: “Rate this packaging.”

[div class=attrib]More from theSource here.[end-div]

Communicating Meaning in Cyberspace

Clarifying intent, emotion, wishes and meaning is a rather tricky and cumbersome process that we all navigate each day. Online in the digital world this is even more challenging, if not sometimes impossible. The pre-digital method of exchanging information in a social context would have been face-to-face. Such a method provides the full gamut of verbal and non-verbal dialogue between two or more parties. Importantly, it also provides a channel for the exchange of unconscious cues between people, which researchers are increasingly finding to be of critical importance during communication.

So, now replace the face-to-face interaction with email, texting, instant messaging, video chat, and other forms of digital communication and you have a new playground for researchers in cognitive and social sciences. The intriguing question for researchers, and all of us for that matter, is: how do we ensure our meaning, motivations and intent are expressed clearly through digital communications?

There are some partial answers over at Anthropology in Practice, which looks at how users of digital media express emotion, resolve ambiguity and communicate cross-culturally.

[div class=attrib]Anthropology in Practice:[end-div]

The ability to interpret social data is rooted in our theory of mind—our capacity to attribute mental states (beliefs, intents, desires, knowledge, etc.) to the self and to others. This cognitive development reflects some understanding of how other individuals relate to the world, allowing for the prediction of behaviors.1 As social beings we require consistent and frequent confirmation of our social placement. This confirmation is vital to the preservation of our networks—we need to be able to gauge the state of our relationships with others.

Research has shown that children whose capacity to mentalize is diminished find other ways to successfully interpret nonverbal social and visual cues 2-6, suggesting that the capacity to mentalize is necessary to social life. Digitally-mediated communication, such as text messaging and instant messaging, does not readily permit social biofeedback. However cyber communicators still find ways of conveying beliefs, desires, intent, deceit, and knowledge online, which may reflect an effort to preserve the capacity to mentalize in digital media.

The Challenges of Digitally-Mediated Communication

In its most basic form DMC is text-based, although the growth of video conferencing technology indicates DMC is still evolving. One of the biggest criticisms of DMC has been the lack of nonverbal cues which are an important indicator to the speaker’s meaning, particularly when the message is ambiguous.

Email communicators are all too familiar with this issue. After all, in speech the same statement can have multiple meanings depending on tone, expression, emphasis, inflection, and gesture. Speech conveys not only what is said, but how it is said—and consequently, reveals a bit of the speaker’s mind to interested parties. In a plain-text environment like email only the typist knows whether a statement should be read with sarcasm.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

The Slow Food – Fast Food Debate

For watchers of the human condition, dissecting and analyzing our food culture is both fascinating and troubling. The global agricultural-industrial complex, with its enormous efficiencies and finely engineered end-products, churns out mountains of foodstuffs that help feed a significant proportion of the world. And yet, many argue that the same over-refined, highly-processed, preservative-doped, high-fructose enriched, sugar and salt laden, color saturated foods are to blame for many of our modern ills. The catalog of dangers from that box of “fish” sticks, orange “cheese” and Twinkies goes something like this: heart disease, cancer, diabetes, and obesity.

To counterbalance the fast/processed food juggernaut, the grassroots International Slow Food movement established its manifesto in 1989. Its stated vision is:

We envision a world in which all people can access and enjoy food that is good for them, good for those who grow it and good for the planet.

They go on to say:

We believe that everyone has a fundamental right to the pleasure of good food and consequently the responsibility to protect the heritage of food, tradition and culture that make this pleasure possible. Our association believes in the concept of neo-gastronomy – recognition of the strong connections between plate, planet, people and culture.

These are lofty ideals. Many would argue that the goals of the Slow Food movement, while worthy, are somewhat elitist and totally impractical in current times on our over-crowded, resource constrained little blue planet.

Krystal D’Costa over at Anthropology in Practice has a fascinating analysis and takes a more pragmatic view.

[div class=attrib]From Krystal D’Costa over at Anthropology in Practice:[end-div]

There’s a sign hanging in my local deli that offers customers some tips on what to expect in terms of quality and service. It reads:

Your order:

Can be fast and good, but it won’t be cheap.
Can be fast and cheap, but it won’t be good.
Can be good and cheap, but it won’t be fast.
Pick two—because you aren’t going to get it good, cheap, and fast.

The Good/Fast/Cheap Model is certainly not new. It’s been a longstanding principle in design, and has been applied to many other things. The idea is a simple one: we can’t have our cake and eat it too. But that doesn’t mean we can’t or won’t try—and nowhere does this battle rage more fiercely than when it comes to fast food.

In a landscape dominated by golden arches, dollar menus, and value meals serving up to 2,150 calories, fast food has been much maligned. It’s fast, it’s cheap, but we know it’s generally not good for us. And yet, well-touted statistics report that Americans are spending more than ever on fast food:

In 1970, Americans spent about $6 billion on fast food; in 2000, they spent more than $110 billion. Americans now spend more money on fast food than on higher education, personal computers, computer software, or new cars. They spend more on fast food than on movies, books, magazines, newspapers, videos, and recorded music—combined.[i]

With waistlines growing at an alarming rate, fast food has become an easy target. Concern has spurred the emergence of healthier chains (where it’s good and fast, but not cheap), half servings, and posted calorie counts. We talk about awareness and “food prints” enthusiastically, aspire to incorporate more organic produce in our diets, and struggle to encourage others to do the same even while we acknowledge that differing economic means may be a limiting factor.

In short, we long to return to a simpler food time—when local harvests were common and more than adequately provided the sustenance we needed, and we relied less on processed, industrialized foods. We long for a time when home-cooked meals, from scratch, were the norm—and any number of cooking shows on the American airways today work to convince us that it’s easy to do. We’re told to shun fast food, and while it’s true that modern, fast, processed foods represent an extreme in portion size and nutrition, it is also true that our nostalgia is misguided: raw, unprocessed foods—the “natural” that we yearn for—were a challenge for our ancestors. In fact, these foods were downright dangerous.

Step back in time to when fresh meat rotted before it could be consumed and you still consumed it, to when fresh fruits were sour, vegetables were bitter, and when roots and tubers were poisonous. Nature, ever fickle, could withhold her bounty as easily as she could share it: droughts wreaked havoc on produce, storms hampered fishing, cows stopped giving milk, and hens stopped laying.[ii] What would you do then?

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Images courtesy of International Slow Food Movement / Fred Meyer store by lyzadanger.[end-div]

Graduate Job Picture

Encouraging news for the class of 2011. The National Association of Colleges and Employers (NACE) released results from a recent survey showing a slightly improved job picture for 2011 college graduates.

[div class=attrib]From Course Hero:[end-div]

[div class=attrib]More from theSource here.[end-div]

QR Codes as Art

It’s only a matter of time before someone has a cool looking QR code tattooed to their eyelid.

A QR or Quick Response code is a two-dimensional matrix that looks like a scrambled barcode, and behaves much like one, with one important difference. The QR code exhibits a rather high level of tolerance for errors. Some have reported that up to 20-30 percent of the QR code can be selectively altered without affecting its ability to be scanned correctly. Try scanning a regular barcode that has some lines missing or has been altered and your scanner is likely to give you a warning beep. The QR code, however, still scans correctly even if specific areas are missing or changed. This is important because a QR code does not require a high-end, dedicated barcode scanner to be read, which also makes it suitable for outdoor use.

A QR code can be scanned, actually photographed, with a regular smartphone (or other device) equipped with a camera and a QR code reading app. This makes it possible for QR codes to take up residence anywhere, not just on product packages, and to be scanned by anyone with a smartphone. In fact you may have seen QR codes displayed on street corners, posters, doors, billboards, websites, vehicles and magazines.

Of course, once you snap a picture of a code, your smartphone app will deliver more details about the object on which the QR code resides. For instance, take a picture of a code placed on a billboard advertising a new BMW model, and you’ll be linked to the BMW website with special promotions for your region. QR codes not only link to websites, but also can be used to send pre-defined text messages, provide further textual information, and deliver location maps.

Since parts of a QR code can be changed without reducing its ability to be scanned correctly, artists and designers now have the leeway to customize the matrix with some creative results.
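
As a rough illustration of how that headroom arises, here is a minimal sketch, not taken from any of the posts above, that generates a QR code at the standard’s highest error-correction level (“H”, roughly 30 percent recoverable). It assumes the third-party Python qrcode package; the URL is a made-up placeholder.

```python
# Minimal sketch: generate a QR code with the highest error-correction
# level ("H"), which lets roughly 30% of the modules be damaged or
# redrawn (by an artist, say) and still scan correctly.
# Assumes the third-party "qrcode" package: pip install qrcode[pil]
import qrcode

qr = qrcode.QRCode(
    error_correction=qrcode.constants.ERROR_CORRECT_H,  # ~30% recovery
    box_size=10,  # pixels per module
    border=4,     # quiet zone width, in modules
)
qr.add_data("https://example.com/promo")  # hypothetical target URL
qr.make(fit=True)

img = qr.make_image(fill_color="black", back_color="white")
img.save("qr_art_base.png")  # a starting point for artistic customization
```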

Some favorites below.

[div]Images courtesy of Duncan Robertson, BBC; Louis Vuitton, SET; Ayara Thai Cuisine Restaurant.[end-div]

The science behind disgust

[div class=attrib]From Salon:[end-div]

We all have things that disgust us irrationally, whether it be cockroaches or chitterlings or cotton balls. For me, it’s fruit soda. It started when I was 3; my mom offered me a can of Sunkist after inner ear surgery. Still woozy from the anesthesia, I gulped it down, and by the time we made it to the cashier, all of it managed to come back up. Although it is nearly 30 years later, just the smell of this “fun, sun and the beach” drink is enough to turn my stomach.

But what, exactly, happens when we feel disgust? As Daniel Kelly, an assistant professor of philosophy at Purdue University, explains in his new book, “Yuck!: The Nature and Moral Significance of Disgust,” it’s not just a physical sensation, it’s a powerful emotional warning sign. Although disgust initially helped keep us away from rotting food and contagious disease, the defense mechanism changed over time to affect the distance we keep from one another. When allowed to play a role in the creation of social policy, Kelly argues, disgust might actually cause more harm than good.

Salon spoke with Kelly about the science behind disgust, why we’re captivated by things we find revolting, and how it can be a very dangerous thing.

What exactly is disgust?

Simply speaking, disgust is the response we have to things we find repulsive. Some of the things that trigger disgust are innate, like the smell of sewage on a hot summer day. No one has to teach you to feel disgusted by garbage, you just are. Other things that are automatically disgusting are rotting food and visible cues of infection or illness. We have this base layer of core disgusting things, and a lot of them don’t seem like they’re learned.

[div class=attrib]More from theSource here.[end-div]

If Televisions Could See Us

A fascinating and disturbing series of still photographs from Andris Feldmanis shows us what the television “sees” as its viewers stare seemingly mindlessly at the box. As Feldmanis describes,

An average person in Estonia spends several hours a day watching the television. This is the situation reversed, the people portrayed here are posing for their television sets. It is not a critique of mass media and its influence, it is a fictional document of what the TV sees.

Makes one wonder what the viewers were watching. Or does it even matter? More of the series courtesy of Art Fag City, here. All the images show the one-sidedness of the human-television relationship.

[div class=attrib]Image courtesy of Andris Feldmanis.[end-div]

Dawn Over Vesta

More precisely, NASA’s Dawn spacecraft entered orbit around the asteroid Vesta on July 15, 2011. Vesta is the second largest of our solar system’s asteroids and is located in the asteroid belt between Mars and Jupiter.

Now that Dawn is safely in orbit, the spacecraft will circle about 10,000 miles above Vesta’s surface for a year and use two different cameras, a gamma-ray detector and a neutron detector, to study the asteroid.

Then in July 2012, Dawn will depart for a visit to Vesta’s close neighbor and largest object in the asteroid belt, Ceres.

The image of Vesta above was taken from a distance of about 9,500 miles (15,000 kilometers) away.

[div class=attrib]Image courtesy of NASA/JPL-Caltech/UCLA/MPS/DLR/IDA.[end-div]

Mr. Carrier, Thanks for Inventing the Air Conditioner

It’s #$% hot in the southern plains of the United States, with high temperatures constantly above 100 degrees F, and lows never dipping below 80. For that matter, it’s hotter than average this year in most parts of the country. So, a timely article over at Slate gives a great overview of the history of the air conditioning system, courtesy of inventor Willis Carrier.

[div class=attrib]From Slate:[end-div]

Anyone tempted to yearn for a simpler time must reckon with a few undeniable unpleasantries of life before modern technology: abscessed teeth, chamber pots, the bubonic plague—and a lack of air conditioning in late July. As temperatures rise into the triple digits across the eastern United States, it’s worth remembering how we arrived at the climate-controlled summer environments we have today.

Until the 20th century, Americans dealt with the hot weather as many still do around the world: They sweated and fanned themselves. Primitive air-conditioning systems have existed since ancient times, but in most cases, these were so costly and inefficient as to preclude their use by any but the wealthiest people. In the United States, things began to change in the early 1900s, when the first electric fans appeared in homes. But cooling units have only spread beyond American borders in the last couple of decades, with the confluence of a rising global middle class and breakthroughs in energy-efficient technology. . . .

The big breakthrough, of course, was electricity. Nikola Tesla’s development of alternating current motors made possible the invention of oscillating fans in the early 20th century. And in 1902, a 25-year-old engineer from New York named Willis Carrier invented the first modern air-conditioning system. The mechanical unit, which sent air through water-cooled coils, was not aimed at human comfort, however; it was designed to control humidity in the printing plant where he worked.

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image of Willis Carrier courtesy of Wikipedia / Creative Commons.[end-div]

Seven Sisters Star Cluster

The Seven Sisters star cluster, also known as the Pleiades, consists of many young, bright, hot stars. While the cluster contains hundreds of stars, it is so named because only seven are typically visible to the naked eye. The Seven Sisters is visible from the northern hemisphere and resides in the constellation Taurus.

[div class=attrib]Image and supporting text courtesy of Davide De Martin over at Skyfactory.[end-div]

This image is a composite of black and white images taken with the Palomar Observatory’s 48-inch (1.2-meter) Samuel Oschin Telescope as a part of the second National Geographic Palomar Observatory Sky Survey (POSS II). The images were recorded on two types of glass photographic plates – one sensitive to red light and the other to blue – and later they were digitized. Credit: Caltech, Palomar Observatory, Digitized Sky Survey.

In order to produce the color image seen here, I worked with data coming from 2 different photographic plates taken in 1986 and 1989. The original file is 10,252 x 9,735 pixels with a resolution of about 1 arcsec per pixel. The image covers an area of sky 2.7° x 2.7° (for comparison, the full Moon is about 0.5° in diameter).

[div class=attrib]More from theSource here.[end-div]

MondayPoem: Starlight

Monday’s poem, authored by William Meredith, was selected because it is in keeping with our cosmology theme this week.

William Meredith was born in New York City in 1919. He studied English at Princeton University where he graduated Magna Cum Laude. His senior thesis focused on the poetry of Robert Frost, a major influence for Meredith throughout his career.
[div class=attrib]By William Meredith, courtesy of Poets.org:[end-div]

Going abruptly into a starry night
It is ignorance we blink from, dark, unhoused;
There is a gaze of animal delight
Before the human vision. Then, aroused
To nebulous danger, we may look for easy stars,
Orion and the Dipper; but they are not ours,

These learned fields. Dark and ignorant,
Unable to see here what our forebears saw,
We keep some fear of random firmament
Vestigial in us. And we think, Ah,
If I had lived then, when these stories were made up, I
Could have found more likely pictures in haphazard sky.

But this is not so. Indeed, we have proved fools
When it comes to myths and images. A few
Old bestiaries, pantheons and tools
Translated to the heavens years ago—
Scales and hunter, goat and horologe—are all
That save us when, time and again, our systems fall.

And what would we do, given a fresh sky
And our dearth of image? Our fears, our few beliefs
Do not have shapes. They are like that astral way
We have called milky, vague stars and star-reefs
That were shapeless even to the fecund eye of myth—
Surely these are no forms to start a zodiac with.

To keep the sky free of luxurious shapes
Is an occupation for most of us, the mind
Free of luxurious thoughts. If we choose to escape,
What venial constellations will unwind
Around a point of light, and then cannot be found
Another night or by another man or from other ground.

As for me, I would find faces there,
Or perhaps one face I have long taken for guide;
Far-fetched, maybe, like Cygnus, but as fair,
And a constellation anyone could read
Once it was pointed out; an enlightenment of night,
The way the pronoun you will turn dark verses bright.

And You Thought Being Direct and Precise Was Good

A new psychological study upends our understanding of the benefits of direct and precise information as a motivational tool. Results from the study by Himanshu Mishra and Baba Shiv describe the cognitive benefits of vague and inarticulate feedback over precise information. At first glance this seems counter-intuitive. After all, fuzzy math, blurred reasoning and unclear directives would seem to be the banes of current societal norms that value data in as precise a form as possible. We measure, calibrate, verify, re-measure and report information to the nth degree.

[div class=attrib]Stanford Business:[end-div]

Want to lose weight in 2011? You’ve got a better chance of pulling it off if you tell yourself, “I’d like to slim down and maybe lose somewhere between 5 and 15 pounds this year” instead of, “I’d like to lose 12 pounds by July 4.”

In a paper to be published in an upcoming issue of the journal Psychological Science, business school Professor Baba Shiv concludes that people are more likely to stay motivated and achieve a goal if it’s sketched out in vague terms than if it’s set in stone as a rigid or precise plan.

“For one to be successful, one needs to be motivated,” says Shiv, the Stanford Graduate School of Business Sanwa Bank, Limited, Professor of Marketing. He is coauthor of the paper “In Praise of Vagueness: Malleability of Vague Information as a Performance Booster” with Himanshu Mishra and Arul Mishra, both of the University of Utah. Presenting information in a vague way — for instance using numerical ranges or qualitative descriptions — “allows you to sample from the information that’s in your favor,” says Shiv, whose research includes studying people’s responses to incentives. “You’re sampling and can pick the part you want,” the part that seems achievable or encourages you to keep your expectations upbeat to stay on track, says Shiv.

By comparison, information presented in a more-precise form doesn’t let you view it in a rosy light and so can be discouraging. For instance, Shiv says, a coach could try to motivate a sprinter by reviewing all her past times, recorded down to the thousandths of a second. That would remind her of her good times but also the poor ones, potentially de-motivating her. Or, the coach could give the athlete less-precise but still-accurate qualitative information. “Good coaches get people not to focus on the times but on a dimension that is malleable,” says Shiv. “They’ll say, ‘You’re mentally tough.’ You can’t measure that.” The runner can then zero in on her mental strength to help her concentrate on her best past performances, boosting her motivation and ultimately improving her times. “She’s cherry-picking her memories, and that’s okay, because that’s allowing her to get motivated,” says Shiv.

Of course, Shiv isn’t saying there’s no place for precise information. A pilot needs exact data to monitor a plane’s location, direction, and fuel levels, for instance. But information meant to motivate is different, and people seeking motivation need the chance to focus on just the positive. When it comes to motivation, Shiv said, “negative information outweighs positive. If I give you five pieces of negative information and five pieces of positive information, the brain weighs the negative far more than the positive … It’s a survival mechanism. The brain weighs the negative to keep us secure.”

[div class=attrib]More from theSource here.[end-div]

Answers to Life’s Big Questions

Do you gulp Pepsi or Coke? Are you a Mac or a PC? Do you side with MSNBC or Fox News? Do you sip tea or coffee? Do you prefer thin crust or deep pan pizza?

Hunch has compiled a telling infographic from millions of answers gathered via its online Teach Hunch About You (THAY) questions. Interestingly, it looks like 61 percent of respondents are “dog people” and 31 percent “cat people” (with 8 percent neither).

[div class=attrib]From Hunch:[end-div]

[div class=attrib]More from theSource here.[end-div]

Morality 1: Good without gods

[div class=attrib]From QualiaSoup:[end-div]

Some people claim that morality is dependent upon religion, and that atheists cannot possibly be moral since god and morality are intertwined (well, in their minds). Unfortunately, this is one way that religious people dehumanise atheists who have a logical way of thinking about what constitutes moral social behaviour. More than simply being an (incorrect) definition in the Oxford dictionary, morality is actually the main subject of many philosophers’ intellectual lives. This video, the first of a multi-part series, begins the discussion by defining morality and then looking at six hypothetical cultures and their beliefs.

[tube]T7xt5LtgsxQ[/tube]

Art Makes Your Body Tingle

The next time you wander through an art gallery and feel lightheaded after seeing a Monroe silkscreen by Warhol, or feel reflective and soothed by a scene from Monet’s garden, you’ll be in good company. New research shows that our bodies react to art, not just our grey matter.

The study by Wolfgang Tschacher and colleagues, published by the American Psychological Association, found that:

. . . physiological responses during perception of an artwork were significantly related to aesthetic-emotional experiencing. The dimensions “Aesthetic Quality,” “Surprise/Humor,” “Dominance,” and “Curatorial Quality” were associated with cardiac measures (heart rate variability, heart rate level) and skin conductance variability.

In other words, art makes your pulse race, your skin perspire and your body tingle.

[div class=attrib]From Miller-McCune:[end-div]

Art exhibits are not generally thought of as opportunities to get our pulses racing and skin tingling. But newly published research suggests aesthetic appreciation is, in fact, a full-body experience.

Three hundred and seventy-three visitors to a Swiss museum agreed to wear special gloves measuring four physiological responses as they strolled through an art exhibit. Researchers found an association between the gallery-goers’ reported responses to the artworks and three of the four measurements of bodily stimulation.

“Our findings suggest that an idiosyncratically human property — finding aesthetic pleasure in viewing artistic artifacts — is linked to biological markers,” researchers led by psychologist Wolfgang Tschacher of the University of Bern, Switzerland, write in the journal Psychology of Aesthetics, Creativity and the Arts.

Their study, the first of its kind conducted in an actual art gallery, provides evidence for what Tschacher and his colleagues call “the embodiment of aesthetics.”

[div class=attrib]More from theSource here.[end-div]

[div class=attrib]Image courtesy of Wikipedia / Creative Commons.[end-div]

Favela Futurism, Very Chic

[div class=attrib]From BigThink:[end-div]

The future of global innovation is the Brazilian favela, the Mumbai slum and the Nairobi shanty-town. At a time when countries across the world, from Latin America to Africa to Asia, are producing new mega-slums on an epic scale, when emerging mega-cities in China are pushing the limits of urban infrastructure by adding millions of new inhabitants each year, it is becoming increasingly likely that the lowly favela, slum or ghetto may hold the key to the future of human development.

Back in 2009, futurist and science fiction writer Bruce Sterling first introduced Favela Chic as a way of thinking about our modern world. What is favela chic? It’s what happens “when you’ve lost everything materially… but are wired to the gills and are big on Facebook.” Favela chic doesn’t have to be exclusively an emerging market notion, either. As Sterling has noted, it can be a hastily thrown-together high-rise in downtown Miami, covered over with weeds, without any indoor plumbing, filled with squatters.

Flash forward to the end of 2010, when the World Future Society named favela innovation one of the Top 10 trends to watch in 2011: “Dwellers of slums, favelas, and ghettos have learned to use and reuse resources and commodities more efficiently than their wealthier counterparts. The neighborhoods are high-density and walkable, mixing commercial and residential areas rather than segregating these functions. In many of these informal cities, participants play a role in communal commercial endeavors such as growing food or raising livestock.”

What’s fascinating is that the online digital communities we are busy creating in “developed” nations more closely resemble favelas than they do carefully planned cities. They are messy, emergent and always in beta. With few exceptions, there are no civil rights and no effective ways to organize. When asked how to define favela chic at this year’s SXSW event in Austin, Sterling referred to Facebook as the poster child of a digital favela. It’s hastily thrown up, in permanent beta, and easily disposed of. Apps and social games are the corrugated steel of our digital shanty-towns.

[div class=attrib]More from theSource here.[end-div]