Tag Archives: creativity

Computational Folkloristics

What do you get when you set AI (artificial intelligence) the task of reading through 30,000 Danish folk and fairy tales? Well, you get a host of fascinating, newly discovered insights into Scandinavian witches and trolls.

More importantly, you hammer another nail into the coffin of literary criticism and set AI on a collision course with yet another preserve of once-exclusive human endeavor. It’s probably safe to assume that creative writing will fall to intelligent machines in the not too distant future as well — certainly human-powered investigative journalism seemed to become extinct in 2016, replaced by algorithmic aggregation, social bots and fake-mongers.

From aeon:

Where do witches come from, and what do those places have in common? While browsing a large collection of traditional Danish folktales, the folklorist Timothy Tangherlini and his colleague Peter Broadwell, both at the University of California, Los Angeles, decided to find out. Armed with a geographical index and some 30,000 stories, they developed WitchHunter, an interactive ‘geo-semantic’ map of Denmark that highlights the hotspots for witchcraft.

The system used artificial intelligence (AI) techniques to unearth a trove of surprising insights. For example, they found that evil sorcery often took place close to Catholic monasteries. This made a certain amount of sense, since Catholic sites in Denmark were tarred with diabolical associations after the Protestant Reformation in the 16th century. By plotting the distance and direction of witchcraft relative to the storyteller’s location, WitchHunter also showed that enchantresses tend to be found within the local community, much closer to home than other kinds of threats. ‘Witches and robbers are human threats to the economic stability of the community,’ the researchers write. ‘Yet, while witches threaten from within, robbers are generally situated at a remove from the well-described village, often living in woods, forests, or the heath … it seems that no matter how far one goes, nor where one turns, one is in danger of encountering a witch.’
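As an aside, the “distance and direction of witchcraft relative to the storyteller’s location” that WitchHunter plots boils down to standard great-circle geometry. Here is a minimal sketch in Python (an editorial illustration with made-up coordinates, not the researchers’ actual code):

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) and initial bearing (degrees, 0 = north)
    from point 1 (e.g. the storyteller's home) to point 2 (e.g. where the
    witchcraft is said to occur)."""
    R = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # haversine formula for the distance
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    dist = 2 * R * math.asin(math.sqrt(a))
    # initial bearing, measured clockwise from north
    y = math.sin(dlmb) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb))
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

# Illustrative pair: roughly Copenhagen to roughly Odense
d, b = distance_and_bearing(55.676, 12.568, 55.396, 10.389)
```

Aggregating many such (distance, bearing) pairs, centered on each storyteller, is essentially what a geo-semantic heat map summarizes.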

Such ‘computational folkloristics’ raise a big question: what can algorithms tell us about the stories we love to read? Any proposed answer seems to point to as many uncertainties as it resolves, especially as AI technologies grow in power. Can literature really be sliced up into computable bits of ‘information’, or is there something about the experience of reading that is irreducible? Could AI enhance literary interpretation, or will it alter the field of literary criticism beyond recognition? And could algorithms ever derive meaning from books in the way humans do, or even produce literature themselves?

Author and computational linguist Inderjeet Mani concludes his essay thus:

Computational analysis and ‘traditional’ literary interpretation need not be a winner-takes-all scenario. Digital technology has already started to blur the line between creators and critics. In a similar way, literary critics should start combining their deep expertise with ingenuity in their use of AI tools, as Broadwell and Tangherlini did with WitchHunter. Without algorithmic assistance, researchers would be hard-pressed to make such supernaturally intriguing findings, especially as the quantity and diversity of writing proliferates online.

In the future, scholars who lean on digital helpmates are likely to dominate the rest, enriching our literary culture and changing the kinds of questions that can be explored. Those who resist the temptation to unleash the capabilities of machines will have to content themselves with the pleasures afforded by smaller-scale, and fewer, discoveries. While critics and book reviewers may continue to be an essential part of public cultural life, literary theorists who do not embrace AI will be at risk of becoming an exotic species – like the librarians who once used index cards to search for information.

Read the entire tale here.

Image: Portrait of the Danish writer Hans Christian Andersen. Courtesy: Thora Hallager, 10/16 October 1869. Wikipedia. Public Domain.

Five Tips For Re-Learning How to Walk


It seems that the aimless walk to clear one’s mind has become a rarity. So too the gentle stroll to ponder and think. Purposeless walking is a dying art. Indeed, many in the West are so spoiled for transportation alternatives and so (self-)limited in time that walking has become an indulgence — who can afford to walk any more when driving or taking the bus or the train can save so much time (and energy)? Moreover, when we do walk, we’re firmly hunched over our smartphones, entranced by cyberspace and its virtual acknowledgments and affirmations, and thoroughly unaware of our surroundings.


Yet keep in mind that many of our revered artists, photographers, authors and philosophers were great walkers. They used the walk to sense and think. In fact, studies find a link between walking and creativity.

So, without further ado I present 5 tips to help you revive an endangered pastime:

#1. Ditch the smartphone and any other mobile device.

#2. Find a treasured place to walk. Stomping to the nearest pub or 7-Eleven does not count.

#3. Pay attention to your surroundings and walk mindfully. Observe the world around you. This goes back to #1.

#4. Take off the headphones, take out the earbuds and leave your soundtrack at home. Listen to the world around you.

#5. Leave the partner, friend and dog (or other walking companion) at home. Walk alone.

From the BBC:

A number of recent books have lauded the connection between walking – just for its own sake – and thinking. But are people losing their love of the purposeless walk?

Walking is a luxury in the West. Very few people, particularly in cities, are obliged to do much of it at all. Cars, bicycles, buses, trams, and trains all beckon.

Instead, walking for any distance is usually a planned leisure activity. Or a health aid. Something to help people lose weight. Or keep their fitness. But there’s something else people get from choosing to walk. A place to think.

Wordsworth was a walker. His work is inextricably bound up with tramping in the Lake District. Drinking in the stark beauty. Getting lost in his thoughts.

Charles Dickens was a walker. He could easily rack up 20 miles, often at night. You can almost smell London’s atmosphere in his prose. Virginia Woolf walked for inspiration. She walked out from her home at Rodmell in the South Downs. She wandered through London’s parks.

Henry David Thoreau, who was both author and naturalist, walked and walked and walked. But even he couldn’t match the feat of someone like Constantin Brancusi, the sculptor who walked much of the way between his home village in Romania and Paris. Or indeed Patrick Leigh Fermor, whose walk from the Hook of Holland to Istanbul at the age of 18 inspired several volumes of travel writing. George Orwell, Thomas De Quincey, Nassim Nicholas Taleb, Friedrich Nietzsche, Bruce Chatwin, WG Sebald and Vladimir Nabokov are just some of the others who have written about it.

Read the entire article here.

Images courtesy of Google Search: Walking with smartphone. Walking in nature (my preference).

Practice May Make You Perfect, But Not Creative

Practice will help you improve in a field with well-defined and well-developed tasks, processes and rules. This includes areas like sports and musicianship. Though, keep in mind that it may indeed take some accident of genetics to be really good at one of these disciplines in the first place.

But don’t expect practice to make you better in all areas of life, particularly in creative endeavors. Creativity stems from original thought, not replicable behavior. Scott Kaufman, director of the Imagination Institute at the University of Pennsylvania, reminds us of this in a recent book review. The authors of Peak: Secrets from the New Science of Expertise, psychologist Anders Ericsson and journalist Robert Pool, review a swath of research on human learning and skill acquisition and conclude that deliberate, well-structured practice can help anyone master new skills. I think we can all agree with this conclusion.

But like Kaufman I believe that many creative “skills” lie in an area of human endeavor that is firmly beyond the assistance of practice. Most certainly practice will help an artist hone and improve her brushstrokes; but practice alone will not bring forth her masterpiece. So, here is a brief summary of 12 key elements that Kaufman distilled from over 50 years of research studies into creativity:

Excerpts from Creativity Is Much More Than 10,000 Hours of Deliberate Practice by Scott Kaufman:

  1. Creativity is often blind. If only creativity was all about deliberate practice… in reality, it’s impossible for creators to know completely whether their new idea or product will be well received.
  2. Creative people often have messy processes. While expertise is characterized by consistency and reliability, creativity is characterized by many false starts and lots and lots of trial-and-error.
  3. Creators rarely receive helpful feedback. When creators put something novel out into the world, the reactions are typically either acclaim or rejection.
  4. The “10-Year Rule” is not a rule. The idea that it takes 10 years to become a world-class expert in any domain is not a rule. [This is the so-called Ericsson rule from his original paper on deliberate practice amongst musicians.]
  5. Talent is relevant to creative accomplishment. If we define talent as simply the rate at which a person acquires expertise, then talent undeniably matters for creativity.
  6. Personality is relevant. Not only does the speed of expertise acquisition matter, but so do a whole host of other traits. People differ from one another in a multitude of ways… At the very least, research has shown that creative people do tend to have a greater inclination toward nonconformity, unconventionality, independence, openness to experience, ego strength, risk taking, and even mild forms of psychopathology.
  7. Genes are relevant. [M]odern behavioral genetics has discovered that virtually every single psychological trait — including the inclination and willingness to practice — is influenced by innate genetic endowment.
  8. Environmental experiences also matter. [R]esearchers have found that many other environmental experiences substantially affect creativity – including socioeconomic origins, and the sociocultural, political, and economic context in which one is raised.
  9. Creative people have broad interests. While the deliberate practice approach tends to focus on highly specialized training… creative experts tend to have broader interests and greater versatility compared to their less creative expert colleagues.
  10. Too much expertise can be detrimental to creative greatness. The deliberate practice approach assumes that performance is a linear function of practice. Some knowledge is good, but too much knowledge can impair flexibility.
  11. Outsiders often have a creative advantage. If creativity were all about deliberate practice, then outsiders who lack the requisite expertise shouldn’t be very creative. But many highly innovative individuals were outsiders to the field in which they contributed. Many marginalized people throughout history — including immigrants — came up with highly creative ideas not in spite of their experiences as an outsider, but because of their experiences as an outsider.
  12. Sometimes the creator needs to create a new path for others to deliberately practice. Creative people are not just good at solving problems, however. They are also good at finding problems.

In my view the most salient of Kaufman’s dozen ingredients for creativity are #11 and #12 — and I can personally attest to their importance: fresh ideas are more likely to come from outsiders; and creativity in one domain often stems from experience in another, unrelated realm.

Read Kaufman’s enlightening article in full here.

The Rembrandt Algorithm


Over the last few decades robots have been steadily replacing humans in industrial and manufacturing sectors. Increasingly, robots are appearing in a broader array of service sectors; they’re stocking shelves, cleaning hotels, buffing windows, tending bar, dispensing cash.

Nowadays you’re likely to be the recipient of news articles filtered, and in some cases written, by pieces of code and business algorithms. Indeed, many boilerplate financial reports are now “written” by “analysts” who reside, not as flesh-and-bones, but virtually, inside server-farms. Just recently a collection of circuitry and software trounced a human being at the strategic board game, Go.

So, can computers progress from repetitive, mechanical and programmatic roles to more creative, free-wheeling vocations? Can computers become artists?

A group of data scientists, computer engineers, software developers and art historians set out to answer the question.

Jonathan Jones over at the Guardian has a few choice words on the result:

I’ve been away for a few days and missed the April Fool stories in Friday’s papers – until I spotted the one about a team of Dutch “data analysts, developers, engineers and art historians” creating a new painting using digital technology: a virtual Rembrandt painted by a Rembrandt app. Hilarious! But wait, this was too late to be an April Fool’s joke. This is a real thing that is actually happening.

What a horrible, tasteless, insensitive and soulless travesty of all that is creative in human nature. What a vile product of our strange time when the best brains dedicate themselves to the stupidest “challenges”, when technology is used for things it should never be used for and everybody feels obliged to applaud the heartless results because we so revere everything digital.

Hey, they’ve replaced the most poetic and searching portrait painter in history with a machine. When are we going to get Shakespeare’s plays and Bach’s St Matthew Passion rebooted by computers? I cannot wait for Love’s Labours Have Been Successfully Functionalised by William Shakesbot.

You cannot, I repeat, cannot, replicate the genius of Rembrandt van Rijn. His art is not a set of algorithms or stylistic tics that can be recreated by a human or mechanical imitator. He can only be faked – and a fake is a dead, dull thing with none of the life of the original. What these silly people have done is to invent a new way to mock art. Bravo to them! But the Dutch art historians and museums who appear to have lent their authority to such a venture are fools.

Rembrandt lived from 1606 to 1669. His art only has meaning as a historical record of his encounters with the people, beliefs and anguishes of his time. Its universality is the consequence of the depth and profundity with which it does so. Looking into the eyes of Rembrandt’s Self-Portrait at the Age of 63, I am looking at time itself: the time he has lived, and the time since he lived. A man who stared, hard, at himself in his 17th-century mirror now looks back at me, at you, his gaze so deep his mottled flesh is just the surface of what we see.

We glimpse his very soul. It’s not style and surface effects that make his paintings so great but the artist’s capacity to reveal his inner life and make us aware in turn of our own interiority – to experience an uncanny contact, soul to soul. Let’s call it the Rembrandt Shudder, that feeling I long for – and get – in front of every true Rembrandt masterpiece.

Is that a mystical claim? The implication of the digital Rembrandt is that we get too sentimental and moist-eyed about art, that great art is just a set of mannerisms that can be digitised. I disagree. If it’s mystical to see Rembrandt as a special and unique human being who created unrepeatable, inexhaustible masterpieces of perception and intuition, then count me a mystic.

Read the entire story here.

Image: The Next Rembrandt (based on 168,263 Rembrandt painting fragments). Courtesy: Microsoft, Delft University of Technology, Mauritshuis (Hague), Rembrandt House Museum (Amsterdam).

You Could Be Galactic Viceroy

Many corporations, by necessity, are not the most innovative of human aggregations. Most are conservative by nature — making money today based on what worked yesterday. So, to maintain some degree of creative spirit and keep workers loyal, they allow (some) employees to adopt titles that are, by corporate standards, rather wacky and left-field.

My favorite of this bunch: Digital Prophet, which I much prefer over iCup Technician, Wizard of Lightbulb Moments, and Wet Leisure Attendant.

Read more oddball titles here.

Creativity and Mental Illness


The creative genius — oft misunderstood, outcast, tortured, misanthropic, fueled by demon spirits. Yet this same description would seem equally apt for many of those unfortunate enough to suffer from mental illness. So, could creativity and mental illness be high-level symptoms of a broader underlying spectrum “disorder”? After all, a not insignificant number of people and businesses tend to regard creativity as a behavioral problem — best left outside the front door of the office. Time to check out the results of the latest psychological study.

From the Guardian:

The ancient Greeks were first to make the point. Shakespeare raised the prospect too. But Lord Byron was, perhaps, the most direct of them all: “We of the craft are all crazy,” he told the Countess of Blessington, casting a wary eye over his fellow poets.

The notion of the tortured artist is a stubborn meme. Creativity, it states, is fuelled by the demons that artists wrestle in their darkest hours. The idea is fanciful to many scientists. But a new study claims the link may be well-founded after all, and written into the twisted molecules of our DNA.

In a large study published on Monday, scientists in Iceland report that genetic factors that raise the risk of bipolar disorder and schizophrenia are found more often in people in creative professions. Painters, musicians, writers and dancers were, on average, 25% more likely to carry the gene variants than professions the scientists judged to be less creative, among which were farmers, manual labourers and salespeople.

Kari Stefansson, founder and CEO of deCODE, a genetics company based in Reykjavik, said the findings, described in the journal Nature Neuroscience, point to a common biology for some mental disorders and creativity. “To be creative, you have to think differently,” he told the Guardian. “And when we are different, we have a tendency to be labelled strange, crazy and even insane.”

The scientists drew on genetic and medical information from 86,000 Icelanders to find genetic variants that doubled the average risk of schizophrenia, and raised the risk of bipolar disorder by more than a third. When they looked at how common these variants were in members of national arts societies, they found a 17% increase compared with non-members.

The researchers went on to check their findings in large medical databases held in the Netherlands and Sweden. Among these 35,000 people, those deemed to be creative (by profession or through answers to a questionnaire) were nearly 25% more likely to carry the mental disorder variants.

Stefansson believes that scores of genes increase the risk of schizophrenia and bipolar disorder. These may alter the ways in which many people think, but in most people do nothing very harmful. But for 1% of the population, genetic factors, life experiences and other influences can culminate in problems, and a diagnosis of mental illness.

“Often, when people are creating something new, they end up straddling between sanity and insanity,” said Stefansson. “I think these results support the old concept of the mad genius. Creativity is a quality that has given us Mozart, Bach, Van Gogh. It’s a quality that is very important for our society. But it comes at a risk to the individual, and 1% of the population pays the price for it.”

Stefansson concedes that his study found only a weak link between the genetic variants for mental illness and creativity. And it is this that other scientists pick up on. The genetic factors that raise the risk of mental problems explained only about 0.25% of the variation in people’s artistic ability, the study found. David Cutler, a geneticist at Emory University in Atlanta, puts that number in perspective: “If the distance between me, the least artistic person you are going to meet, and an actual artist is one mile, these variants appear to collectively explain 13 feet of the distance,” he said.

Most of the artist’s creative flair, then, is down to different genetic factors, or to other influences altogether, such as life experiences, that set them on their creative journey.
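Cutler’s mile analogy, by the way, is straightforward arithmetic: 0.25% of the 5,280 feet in a mile. A quick check:

```python
# Cutler's analogy: the variants explain ~0.25% of the variation in
# artistic ability, scaled onto a one-mile "distance" between the
# least artistic person and an actual artist.
variance_explained = 0.0025  # 0.25%
mile_in_feet = 5280
feet_explained = variance_explained * mile_in_feet  # about 13.2 feet
```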

For Stefansson, even a small overlap between the biology of mental illness and creativity is fascinating. “It means that a lot of the good things we get in life, through creativity, come at a price. It tells me that when it comes to our biology, we have to understand that everything is in some way good and in some way bad,” he said.

Read the entire article here.

Image: Vincent van Gogh, self-portrait, 1889. Courtesy of Courtauld Institute Galleries, London. Wikipaintings.org. Public Domain.

A New Mobile App or Genomic Understanding?


Silicon Valley has been a tremendous incubator for some of our most notable recent inventions: the first integrated transistor chip, which led to Intel; the first true personal computer, which led to Apple. Yet this esteemed venture capital (VC) community now seems to need a dose of innovation medicine itself. Aren’t we all getting a little jaded from yet another “new, great mobile app” — worth tens of billions (but having no revenue model) — courtesy of a bright, young group of 20-somethings?

It is indeed gratifying to see innovators, young and old, rewarded for their creativity and perseverance. Yet we should be encouraging more of our pioneers to look beyond the next cool smartphone invention. Perhaps our technological and industrial luminaries and their retinues of futurists could do us all a favor if they channeled more of their speculative funds toward longer-term and more significant endeavors: cost-effective desalination; cheaper medications; understanding and curing our insidious diseases; antibiotic replacements; more effective recycling; cleaner power; cheaper and stronger infrastructure; more effective education. These are all difficult problems. But therein lies the reward.

Clearly some pioneering businesses are investing in these areas. But isn’t it time we insisted that the majority of our private and public intellectual (and financial) capital be invested in truly meaningful ways? Here’s an example from Iceland — with its national human genome project.

From ars technica:

An Icelandic genetics firm has sequenced the genomes of 2,636 of its countrymen and women, finding genetic markers for a variety of diseases, as well as a new timeline for the paternal ancestor of all humans.

Iceland is, in many ways, perfectly suited to being a genetic case study. It has a small population with limited genetic diversity, a result of the population descending from a small number of settlers—between 8 and 20 thousand, who arrived just 1100 years ago. It also has an unusually well-documented genealogical history, with information sometimes stretching all the way back to the initial settlement of the country. Combined with excellent medical records, it’s a veritable treasure trove for genetic researchers.

The researchers at genetics firm deCODE compared the complete genomes of participants with historical and medical records, publishing their findings in a series of four papers in Nature Genetics last Wednesday. The wealth of data allowed them to track down genetic mutations that are related to a number of diseases, some of them rare. Although few diseases are caused by a single genetic mutation, a combination of mutations can increase the risk for certain diseases. Having access to a large genetic sample with corresponding medical data can help to pinpoint certain risk-increasing mutations.

Among their headline findings was the identification of the gene ABCA7 as a risk factor for Alzheimer’s disease. Although previous research had established that a gene in this region was involved in Alzheimer’s, this result delivers a new level of precision. The researchers replicated their results in further groups in Europe and the United States.

Also identified was a genetic mutation that causes early-onset atrial fibrillation, a heart condition causing an irregular and often very fast heart rate. It’s the most common cardiac arrhythmia condition, and it’s considered early-onset if it’s diagnosed before the age of 60. The researchers found eight Icelanders diagnosed with the condition, all carrying a mutation in the same gene, MYL4.

The studies also turned up a gene with an unusual pattern of inheritance. It causes increased levels of thyroid stimulation when it’s passed down from the mother, but decreased levels when inherited from the father.

Genetic research in mice often involves “knocking out” or switching off a particular gene to explore the effects. However, mouse genetics aren’t a perfect approximation of human genetics. Obviously, doing this in humans presents all sorts of ethical problems, but a population such as Iceland provides the perfect natural laboratory to explore how knockouts affect human health.

The data showed that eight percent of people in Iceland have the equivalent of a knockout, one gene that isn’t working. This provides an opportunity to look at the data in a different way: rather than only looking for people with a particular diagnosis and finding out what they have in common genetically, the researchers can look for people who have genetic knockouts, and then examine their medical records to see how their missing genes affect their health. It’s then possible to start piecing together the story of how certain genes affect physiology.

Finally, the researchers used the data to explore human history, using Y chromosome data from 753 Icelandic males. Based on knowledge about mutation rates, Y chromosomes can be used to trace the male lineage of human groups, establishing dates of events like migrations. This technique has also been used to work out when the common ancestor of all humans was alive. The maternal ancestor, known as “Mitochondrial Eve,” is thought to have lived 170,000 to 180,000 years ago, while the paternal ancestor had previously been estimated to have lived around 338,000 years ago.

The Icelandic data allowed the researchers to calculate what they suggest is a more accurate mutation rate, placing the father of all humans at around 239,000 years ago. This is the estimate with the greatest likelihood, but the full range falls between 174,000 and 321,000 years ago. This estimate places the paternal ancestor closer in time to the maternal ancestor.
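The dating logic underneath such estimates is, at its core, a division: the expected number of mutational differences between two lineages is twice the time since their common ancestor, multiplied by the per-site, per-year mutation rate and the number of sites compared (twice, because both lineages accumulate mutations independently). A toy rearrangement in Python, using hypothetical placeholder numbers (not the study’s data) chosen to land near the 239,000-year figure:

```python
# Toy TMRCA (time to most recent common ancestor) estimate.
# All three inputs are illustrative placeholders, not deCODE's figures.
diff_count = 4300             # hypothetical mutational differences between two Y lineages
sites = 10_000_000            # hypothetical number of Y-chromosome sites compared
rate_per_site_year = 9.0e-10  # hypothetical mutation rate per site per year

# Each lineage accumulates mutations independently, hence the factor of 2.
tmrca_years = diff_count / (2 * sites * rate_per_site_year)
```

The real analysis is far more involved — the mutation rate itself is what the researchers recalibrated — but this proportionality is the core of the method, which is why a revised rate shifts the ancestor’s date.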

Read the entire story here.

Image: Gígjökull, an outlet glacier extending from Eyjafjallajökull, Iceland. Courtesy of Andreas Tille / Wikipedia.

PowerPoint Karaoke Olympics

It may not be beyond the realm of fantasy to imagine a day in the not too distant future when PowerPoint Karaoke features as an Olympic sport. Ugh!

Without a doubt, karaoke has set human culture back at least a thousand years (thanks, Japan). And PowerPoint has singlehandedly dealt killer blows to creativity, deep thought and literary progress (thanks, Microsoft). Surely combining these two banes of modern society into a competitive event is the stuff of true horror. But this hasn’t stopped the activity from becoming a burgeoning improv phenomenon for corporate hacks — validating the trend in which humans continue making fools of themselves. After all, it must be big — and there’s probably money in it — if the WSJ is reporting on it.

Nonetheless,

  • Count
  • me
  • out!

From the WSJ:

On a sunny Friday afternoon earlier this month, about 100 employees of Adobe Systems Inc. filed expectantly into an auditorium to watch PowerPoint presentations.

“I am really thrilled to be here today,” began Kimberley Chambers, a 37-year-old communications manager for the software company, as she nervously clutched a microphone. “I want to talk you through…my experience with whales, in both my personal and professional life.”

Co-workers giggled. Ms. Chambers glanced behind her, where a PowerPoint slide displayed four ink sketches of bare-chested male torsos, each with a distinct pattern of chest hair. The giggles became guffaws. “What you might not know,” she continued, “is that whales can be uniquely identified by a number of different characteristics, not the least of which is body hair.”

Ms. Chambers, sporting a black blazer and her employee ID badge, hadn’t seen this slide in advance, nor the five others that popped up as she clicked her remote control. To accompany the slides, she gave a nine-minute impromptu talk about whales, a topic she was handed 30 seconds earlier.

Forums like this at Adobe, called “PowerPoint karaoke” or “battle decks,” are cropping up as a way for office workers of the world to mock an oppressor, the ubiquitous PowerPoint presentation. The mix of improvised comedy and corporate-culture takedown is based on a simple notion: Many PowerPoint presentations are unintentional parody already, so why not go all the way?

Library associations in Texas and California held PowerPoint karaoke sessions at their annual conferences. At a Wal-Mart Stores Inc. event last year, workers gave fake talks based on real slides from a meatpacking supplier. Twitter Inc. Chief Executive Dick Costolo, armed with his training from comedy troupe Second City, has faced off with employees at “battle decks” contests during company meetings.

One veteran corporate satirist gives these events a thumbs up. “Riffing off of PowerPoints without knowing what your next slide is going to be? The humorist in me says it’s kinda brilliant,” said “Dilbert” cartoonist Scott Adams, who has spent 26 years training his jaundiced eye on office work. “I assume this game requires drinking?” he asked. (Drinking is technically not required, but it is common.)

Mr. Adams, who worked for years at a bank and at a telephone company, said PowerPoint is popular because it offers a rare dose of autonomy in cubicle culture. But it often bores, because creators lose sight of their mission. “If you just look at a page and drag things around and play with fonts, you think you’re a genius and you’re in full control of your world,” he said.

At a February PowerPoint karaoke show in San Francisco, contestants were given pairings of topics and slides ranging from a self-help seminar for people who abuse Amazon Prime, with slides including a dog balancing a stack of pancakes on its nose, to a sermon on “Fifty Shades of Grey,” with slides including a pyramid dotted with blocks of numbers. Another had to explain the dating app Tinder to aliens invading the Earth, accompanied by a slide of old floppy disk drives, among other things.

Read and sing-a-long to the entire article here.

Cross-Connection Requires a Certain Daring

A previously unpublished essay by Isaac Asimov on the creative process shows us his well-reasoned thinking on the subject. While he believed that deriving new ideas could be done productively in a group, he seemed to gravitate toward the notion of the lone creative genius. Both approaches, however, require the innovator(s) to cross-connect thoughts, often from disparate sources.

From Technology Review:

How do people get new ideas?

Presumably, the process of creativity, whatever it is, is essentially the same in all its branches and varieties, so that the evolution of a new art form, a new gadget, a new scientific principle, all involve common factors. We are most interested in the “creation” of a new scientific principle or a new application of an old one, but we can be general here.

One way of investigating the problem is to consider the great ideas of the past and see just how they were generated. Unfortunately, the method of generation is never clear even to the “generators” themselves.

But what if the same earth-shaking idea occurred to two men, simultaneously and independently? Perhaps, the common factors involved would be illuminating. Consider the theory of evolution by natural selection, independently created by Charles Darwin and Alfred Wallace.

There is a great deal in common there. Both traveled to far places, observing strange species of plants and animals and the manner in which they varied from place to place. Both were keenly interested in finding an explanation for this, and both failed until each happened to read Malthus’s “Essay on Population.”

Both then saw how the notion of overpopulation and weeding out (which Malthus had applied to human beings) would fit into the doctrine of evolution by natural selection (if applied to species generally).

Obviously, then, what is needed is not only people with a good background in a particular field, but also people capable of making a connection between item 1 and item 2 which might not ordinarily seem connected.

Undoubtedly in the first half of the 19th century, a great many naturalists had studied the manner in which species were differentiated among themselves. A great many people had read Malthus. Perhaps some both studied species and read Malthus. But what you needed was someone who studied species, read Malthus, and had the ability to make a cross-connection.

That is the crucial point that is the rare characteristic that must be found. Once the cross-connection is made, it becomes obvious. Thomas H. Huxley is supposed to have exclaimed after reading On the Origin of Species, “How stupid of me not to have thought of this.”

But why didn’t he think of it? The history of human thought would make it seem that there is difficulty in thinking of an idea even when all the facts are on the table. Making the cross-connection requires a certain daring. It must, for any cross-connection that does not require daring is performed at once by many and develops not as a “new idea,” but as a mere “corollary of an old idea.”

It is only afterward that a new idea seems reasonable. To begin with, it usually seems unreasonable. It seems the height of unreason to suppose the earth was round instead of flat, or that it moved instead of the sun, or that objects required a force to stop them when in motion, instead of a force to keep them moving, and so on.

A person willing to fly in the face of reason, authority, and common sense must be a person of considerable self-assurance. Since he occurs only rarely, he must seem eccentric (in at least that respect) to the rest of us. A person eccentric in one respect is often eccentric in others.

Consequently, the person who is most likely to get new ideas is a person of good background in the field of interest and one who is unconventional in his habits. (To be a crackpot is not, however, enough in itself.)

Once you have the people you want, the next question is: Do you want to bring them together so that they may discuss the problem mutually, or should you inform each of the problem and allow them to work in isolation?

My feeling is that as far as creativity is concerned, isolation is required. The creative person is, in any case, continually working at it. His mind is shuffling his information at all times, even when he is not conscious of it. (The famous example of Kekule working out the structure of benzene in his sleep is well-known.)

The presence of others can only inhibit this process, since creation is embarrassing. For every new good idea you have, there are a hundred, ten thousand foolish ones, which you naturally do not care to display.

Nevertheless, a meeting of such people may be desirable for reasons other than the act of creation itself.

Read the entire article here.

Innovation in Education?

While many aspects of our lives have changed, mostly for the better, over the last two to three hundred years, one area remains relatively untouched: education. Since the industrial revolution that began in Western Europe and then swept the world, not much has changed in the way we educate our children. It is still very much an industrial, factory-oriented process.

You could argue that technology has altered how we learn, and you would be partly correct. You could also argue that our children are much more informed and learned compared with their peers in Victorian England, and, again, you would be partly correct. But the most critical element remains the same — the process. The regimented approach, the rote teaching, the focus on tests and testing, and the systematic squelching of creativity all remain solidly in place.

William Torrey Harris, one of the founders of the U.S. public school system in the late-1800s, once said,

“Ninety-nine [students] out of a hundred are automata, careful to walk in prescribed paths, careful to follow the prescribed custom. This is not an accident but the result of substantial education, which, scientifically defined, is the subsumption of the individual.”

And, in testament to his enduring legacy, much of this can still be seen in action today in most Western public schools.

Yet, in some pockets of the world there is hope. Sir Ken Robinson’s vision may yet come to fruition.

From Wired:

José Urbina López Primary School sits next to a dump just across the US border in Mexico. The school serves residents of Matamoros, a dusty, sunbaked city of 489,000 that is a flash point in the war on drugs. There are regular shoot-outs, and it’s not uncommon for locals to find bodies scattered in the street in the morning. To get to the school, students walk along a white dirt road that parallels a fetid canal. On a recent morning there was a 1940s-era tractor, a decaying boat in a ditch, and a herd of goats nibbling gray strands of grass. A cinder-block barrier separates the school from a wasteland—the far end of which is a mound of trash that grew so big, it was finally closed down. On most days, a rotten smell drifts through the cement-walled classrooms. Some people here call the school un lugar de castigo—”a place of punishment.”

For 12-year-old Paloma Noyola Bueno, it was a bright spot. More than 25 years ago, her family moved to the border from central Mexico in search of a better life. Instead, they got stuck living beside the dump. Her father spent all day scavenging for scrap, digging for pieces of aluminum, glass, and plastic in the muck. Recently, he had developed nosebleeds, but he didn’t want Paloma to worry. She was his little angel—the youngest of eight children.

After school, Paloma would come home and sit with her father in the main room of their cement-and-wood home. Her father was a weather-beaten, gaunt man who always wore a cowboy hat. Paloma would recite the day’s lessons for him in her crisp uniform—gray polo, blue-and-white skirt—and try to cheer him up. She had long black hair, a high forehead, and a thoughtful, measured way of talking. School had never been challenging for her. She sat in rows with the other students while teachers told the kids what they needed to know. It wasn’t hard to repeat it back, and she got good grades without thinking too much. As she headed into fifth grade, she assumed she was in for more of the same—lectures, memorization, and busy work.

Sergio Juárez Correa was used to teaching that kind of class. For five years, he had stood in front of students and worked his way through the government-mandated curriculum. It was mind-numbingly boring for him and the students, and he’d come to the conclusion that it was a waste of time. Test scores were poor, and even the students who did well weren’t truly engaged. Something had to change.

He too had grown up beside a garbage dump in Matamoros, and he had become a teacher to help kids learn enough to make something more of their lives. So in 2011—when Paloma entered his class—Juárez Correa decided to start experimenting. He began reading books and searching for ideas online. Soon he stumbled on a video describing the work of Sugata Mitra, a professor of educational technology at Newcastle University in the UK. In the late 1990s and throughout the 2000s, Mitra conducted experiments in which he gave children in India access to computers. Without any instruction, they were able to teach themselves a surprising variety of things, from DNA replication to English.

Juárez Correa didn’t know it yet, but he had happened on an emerging educational philosophy, one that applies the logic of the digital age to the classroom. That logic is inexorable: Access to a world of infinite information has changed how we communicate, process information, and think. Decentralized systems have proven to be more productive and agile than rigid, top-down ones. Innovation, creativity, and independent thinking are increasingly crucial to the global economy.

And yet the dominant model of public education is still fundamentally rooted in the industrial revolution that spawned it, when workplaces valued punctuality, regularity, attention, and silence above all else. (In 1899, William T. Harris, the US commissioner of education, celebrated the fact that US schools had developed the “appearance of a machine,” one that teaches the student “to behave in an orderly manner, to stay in his own place, and not get in the way of others.”) We don’t openly profess those values nowadays, but our educational system—which routinely tests kids on their ability to recall information and demonstrate mastery of a narrow set of skills—doubles down on the view that students are material to be processed, programmed, and quality-tested. School administrators prepare curriculum standards and “pacing guides” that tell teachers what to teach each day. Legions of managers supervise everything that happens in the classroom; in 2010 only 50 percent of public school staff members in the US were teachers.

The results speak for themselves: Hundreds of thousands of kids drop out of public high school every year. Of those who do graduate from high school, almost a third are “not prepared academically for first-year college courses,” according to a 2013 report from the testing service ACT. The World Economic Forum ranks the US just 49th out of 148 developed and developing nations in quality of math and science instruction. “The fundamental basis of the system is fatally flawed,” says Linda Darling-Hammond, a professor of education at Stanford and founding director of the National Commission on Teaching and America’s Future. “In 1970 the top three skills required by the Fortune 500 were the three Rs: reading, writing, and arithmetic. In 1999 the top three skills in demand were teamwork, problem-solving, and interpersonal skills. We need schools that are developing these skills.”

That’s why a new breed of educators, inspired by everything from the Internet to evolutionary psychology, neuroscience, and AI, are inventing radical new ways for children to learn, grow, and thrive. To them, knowledge isn’t a commodity that’s delivered from teacher to student but something that emerges from the students’ own curiosity-fueled exploration. Teachers provide prompts, not answers, and then they step aside so students can teach themselves and one another. They are creating ways for children to discover their passion—and uncovering a generation of geniuses in the process.

At home in Matamoros, Juárez Correa found himself utterly absorbed by these ideas. And the more he learned, the more excited he became. On August 21, 2011—the start of the school year — he walked into his classroom and pulled the battered wooden desks into small groups. When Paloma and the other students filed in, they looked confused. Juárez Correa invited them to take a seat and then sat down with them.

He started by telling them that there were kids in other parts of the world who could memorize pi to hundreds of decimal points. They could write symphonies and build robots and airplanes. Most people wouldn’t think that the students at José Urbina López could do those kinds of things. Kids just across the border in Brownsville, Texas, had laptops, high-speed Internet, and tutoring, while in Matamoros the students had intermittent electricity, few computers, limited Internet, and sometimes not enough to eat.

“But you do have one thing that makes you the equal of any kid in the world,” Juárez Correa said. “Potential.”

He looked around the room. “And from now on,” he told them, “we’re going to use that potential to make you the best students in the world.”

Paloma was silent, waiting to be told what to do. She didn’t realize that over the next nine months, her experience of school would be rewritten, tapping into an array of educational innovations from around the world and vaulting her and some of her classmates to the top of the math and language rankings in Mexico.

“So,” Juárez Correa said, “what do you want to learn?”

In 1999, Sugata Mitra was chief scientist at a company in New Delhi that trains software developers. His office was on the edge of a slum, and on a hunch one day, he decided to put a computer into a nook in a wall separating his building from the slum. He was curious to see what the kids would do, particularly if he said nothing. He simply powered the computer on and watched from a distance. To his surprise, the children quickly figured out how to use the machine.

Over the years, Mitra got more ambitious. For a study published in 2010, he loaded a computer with molecular biology materials and set it up in Kalikuppam, a village in southern India. He selected a small group of 10- to 14-year-olds and told them there was some interesting stuff on the computer, and might they take a look? Then he applied his new pedagogical method: He said no more and left.

Over the next 75 days, the children worked out how to use the computer and began to learn. When Mitra returned, he administered a written test on molecular biology. The kids answered about one of four questions correctly. After another 75 days, with the encouragement of a friendly local, they were getting every other question right. “If you put a computer in front of children and remove all other adult restrictions, they will self-organize around it,” Mitra says, “like bees around a flower.”

A charismatic and convincing proselytizer, Mitra has become a darling in the tech world. In early 2013 he won a $1 million grant from TED, the global ideas conference, to pursue his work. He’s now in the process of establishing seven “schools in the cloud,” five in India and two in the UK. In India, most of his schools are single-room buildings. There will be no teachers, curriculum, or separation into age groups—just six or so computers and a woman to look after the kids’ safety. His defining principle: “The children are completely in charge.”

Read the entire article here.

Image: William Torrey Harris, (September 10, 1835 – November 5, 1909), American educator, philosopher, and lexicographer. Courtesy of Wikipedia.

Six Rules to Super-Charge Your Creativity

Creative minds by their very nature are all different. Yet upon further examination it seems that there are some key elements and common routines that underlie many of the great, innovative thinkers. First and foremost, of course, is to be an early-bird.

From the Guardian:

One morning this summer, I got up at first light – I’d left the blinds open the night before – then drank a strong cup of coffee, sat near-naked by an open window for an hour, worked all morning, then had a martini with lunch. I took a long afternoon walk, and for the rest of the week experimented with never working for more than three hours at a stretch.

This was all in an effort to adopt the rituals of some great artists and thinkers: the rising-at-dawn bit came from Ernest Hemingway, who was up at around 5.30am, even if he’d been drinking the night before; the strong coffee was borrowed from Beethoven, who personally counted out the 60 beans his morning cup required. Benjamin Franklin swore by “air baths”, which was his term for sitting around naked in the morning, whatever the weather. And the midday cocktail was a favourite of VS Pritchett (among many others). I couldn’t try every trick I discovered in a new book, Daily Rituals: How Great Minds Make Time, Find Inspiration And Get To Work; oddly, my girlfriend was unwilling to play the role of Freud’s wife, who put toothpaste on his toothbrush each day to save him time. Still, I learned a lot. For example: did you know that lunchtime martinis aren’t conducive to productivity?

As a writer working from home, of course, I have an unusual degree of control over my schedule – not everyone could run such an experiment. But for anyone who thinks of their work as creative, or who pursues creative projects in their spare time, reading about the habits of the successful, can be addictive. Partly, that’s because it’s comforting to learn that even Franz Kafka struggled with the demands of his day job, or that Franklin was chronically disorganised. But it’s also because of a covert thought that sounds delusionally arrogant if expressed out loud: just maybe, if I took very hot baths like Flaubert, or amphetamines like Auden, I might inch closer to their genius.

Several weeks later, I’m no longer taking “air baths”, while the lunchtime martini didn’t last more than a day (I mean, come on). But I’m still rising early and, when time allows, taking long walks. Two big insights have emerged. One is how ill-suited the nine-to-five routine is to most desk-based jobs involving mental focus; it turns out I get far more done when I start earlier, end a little later, and don’t even pretend to do brain work for several hours in the middle. The other is the importance of momentum. When I get straight down to something really important early in the morning, before checking email, before interruptions from others, it beneficially alters the feel of the whole day: once interruptions do arise, they’re never quite so problematic. Another technique I couldn’t manage without comes from the writer and consultant Tony Schwartz: use a timer to work in 90-minute “sprints”, interspersed with significant breaks. (Thanks to this, I’m far better than I used to be at separating work from faffing around, rather than spending half the day flailing around in a mixture of the two.)

The one true lesson of the book, says its author, Mason Currey, is that “there’s no one way to get things done”. For every Joyce Carol Oates, industriously plugging away from 8am to 1pm and again from 4pm to 7pm, or Anthony Trollope, timing himself typing 250 words per quarter-hour, there’s a Sylvia Plath, unable to stick to a schedule. (Or a Friedrich Schiller, who could only write in the presence of the smell of rotting apples.) Still, some patterns do emerge. Here, then, are six lessons from history’s most creative minds.

1. Be a morning person

It’s not that there aren’t successful night owls: Marcel Proust, for one, rose sometime between 3pm and 6pm, immediately smoked opium powders to relieve his asthma, then rang for his coffee and croissant. But very early risers form a clear majority, including everyone from Mozart to Georgia O’Keeffe to Frank Lloyd Wright. (The 18th-century theologian Jonathan Edwards, Currey tells us, went so far as to argue that Jesus had endorsed early rising “by his rising from the grave very early”.) For some, waking at 5am or 6am is a necessity, the only way to combine their writing or painting with the demands of a job, raising children, or both. For others, it’s a way to avoid interruption: at that hour, as Hemingway wrote, “There is no one to disturb you and it is cool or cold and you come to your work and warm as you write.” There’s another, surprising argument in favour of rising early, which might persuade sceptics: that early-morning drowsiness might actually be helpful. At one point in his career, the novelist Nicholson Baker took to getting up at 4.30am, and he liked what it did to his brain: “The mind is newly cleansed, but it’s also befuddled… I found that I wrote differently then.”

Psychologists categorise people by what they call, rather charmingly, “morningness” and “eveningness”, but it’s not clear that either is objectively superior. There is evidence that morning people are happier and more conscientious, but also that night owls might be more intelligent. If you’re determined to join the ranks of the early risers, the crucial trick is to start getting up at the same time daily, but to go to bed only when you’re truly tired. You might sacrifice a day or two to exhaustion, but you’ll adjust to your new schedule more rapidly.

2. Don’t give up the day job

“Time is short, my strength is limited, the office is a horror, the apartment is noisy,” Franz Kafka complained to his fiancee, “and if a pleasant, straightforward life is not possible, then one must try to wriggle through by subtle manoeuvres.” He crammed in his writing between 10.30pm and the small hours of the morning. But in truth, a “pleasant, straightforward life” might not have been preferable, artistically speaking: Kafka, who worked in an insurance office, was one of many artists who have thrived on fitting creative activities around the edges of a busy life. William Faulkner wrote As I Lay Dying in the afternoons, before commencing his night shift at a power plant; TS Eliot’s day job at Lloyds bank gave him crucial financial security; William Carlos Williams, a paediatrician, scribbled poetry on the backs of his prescription pads. Limited time focuses the mind, and the self-discipline required to show up for a job seeps back into the processes of art. “I find that having a job is one of the best things in the world that could happen to me,” wrote Wallace Stevens, an insurance executive and poet. “It introduces discipline and regularity into one’s life.” Indeed, one obvious explanation for the alcoholism that pervades the lives of full-time authors is that it’s impossible to focus on writing for more than a few hours a day, and, well, you’ve got to make those other hours pass somehow.

3. Take lots of walks

There’s no shortage of evidence to suggest that walking – especially walking in natural settings, or just lingering amid greenery, even if you don’t actually walk much – is associated with increased productivity and proficiency at creative tasks. But Currey was surprised, in researching his book, by the sheer ubiquity of walking, especially in the daily routines of composers, including Beethoven, Mahler, Erik Satie and Tchaikovsky, “who believed he had to take a walk of exactly two hours a day and that if he returned even a few minutes early, great misfortunes would befall him”. It’s long been observed that doing almost anything other than sitting at a desk can be the best route to novel insights. These days, there’s surely an additional factor at play: when you’re on a walk, you’re physically removed from many of the sources of distraction – televisions, computer screens – that might otherwise interfere with deep thought.

Read the entire article here.

Image: Frank Lloyd Wright, architect, c. March 1, 1926. Courtesy of U.S. Library of Congress.

Lego Expressionism: But is it Art?

Lego as we know it — think brightly colored, interlinking, metamorphic bricks — has been around for over 60 years. Even in this high-tech, electronic age it is still likely that most kids around the world have made a little house or a robot with Lego bricks. It satisfies our need to create and to build (and of course, to destroy). But is it art? Jonathan Jones has some ideas.

From the Guardian:

Lego is the clay of the modern world, the stuff of creativity. You can shape it, unshape it, make worlds and smash them up to be replaced by new ideas. It’s a perpetual-motion machine of kids’ imaginations.

Today’s Lego is very different from the Lego I played with when I was eight. For adults like me who grew up with simple Lego bricks and no instructions, just a free-for-all, the kits that now dazzle in their bright impressive boxes take some adjusting to. A puritan might well be troubled that this year’s new Christmas Lego recreates the film The Hobbit in yet another addition to a popular culture repertoire that includes Marvel Superheroes Lego and the ever-popular Star Wars range.

The Danish toymaker is ruthless in its pursuit of mass entertainment. Harry Potter Lego was a major product – until the film series finished. This summer, it suddenly vanished from shops. I had to go to the Harry Potter Studios to get a Knight Bus.

Cool bus, though. Purple Lego! And it fits together in such a way that, when dropped or otherwise subjected to the rigours of play, the three floors of the bus neatly separate and can easily be reconnected. It is a kit, a toy, and a stimulus to story-telling.

Do not doubt the creative value of modern Lego. Making these kits isn’t a fetishistic, sterile enterprise – children don’t think like that. Rather, the ambition of the kits inspires children to aim high with their own crazy designs – the scenarios Lego provides stimulate inventive play. Children can tell stories with Lego, invest the fantastic mini-figures with names and characters, and build what they like after the models disintegrate. Above all, there is something innately humorous about Lego.

But is it art? It definitely teaches something about art. Like a three-dimensional sketchpad, Lego allows you to doodle in bright colours. It is “virtual”, but real and solid. It has practical limits and potentials that have to be respected, while teaching that anyone can create anything. You can be a representational Lego artist, meticulously following instructions and making accurate models, or an abstract one. It really is liberating stuff: shapeshifting, metamorphic. And now I am off to play with it.

Read the entire article here.

Image courtesy of Nathan Sawaya, the Lego brick artist.

Light From Gravity

Often the best creative ideas and the most elegant solutions are the simplest. GravityLight is an example of this type of innovation. Here’s the problem: replace damaging and expensive kerosene fuel lamps in Africa with a less harmful and cheaper alternative. And the solution:

Video: youtube.com/watch?v=1dd9NIlhvlI

From ars technica:

A London design consultancy has developed a cheap, clean, and safer alternative to the kerosene lamp. Kerosene burning lamps are thought to be used by over a billion people in developing nations, often in remote rural parts where electricity is either prohibitively expensive or simply unavailable. Kerosene’s potential replacement, GravityLight, is powered by gravity without the need of a battery—it’s also seen by its creators as a superior alternative to solar-powered lamps.

Kerosene lamps are problematic in three ways: they release pollutants which can contribute to respiratory disease; they pose a fire risk; and, thanks to the ongoing need to buy kerosene fuel, they are expensive to run. Research out of Brown University from July of last year called kerosene lamps a “significant contributor to respiratory diseases, which kill over 1.5 million people every year” in developing countries. The same paper found that kerosene lamps were responsible for 70 percent of fires (which cause 300,000 deaths every year) and 80 percent of burns. The World Bank has compared the indoor use of a kerosene lamp with smoking two packs of cigarettes per day.

The economics of the kerosene lamps are nearly as problematic, with the fuel costing many rural families a significant proportion of their money. The designers of the GravityLight say 10 to 20 percent of household income is typical, and they describe kerosene as a poverty trap, locking people into a “permanent state of subsistence living.” Considering that the median rural price of kerosene in Tanzania, Mali, Ghana, Kenya, and Senegal is $1.30 per liter, and the average rural income in Tanzania is under $9 per month, the designers’ figures seem depressingly plausible.
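A quick back-of-envelope check, using only the figures quoted above (the 10 to 20 percent income share is the designers' own estimate), shows just how little fuel that budget buys:

```python
# Rough check of the kerosene-economics figures quoted in the article.
# All numbers come from the excerpt above; nothing else is assumed.
income_per_month = 9.00   # USD, upper bound on average rural Tanzanian income
price_per_liter = 1.30    # USD, median rural kerosene price

for share in (0.10, 0.20):
    spend = income_per_month * share
    liters = spend / price_per_liter
    print(f"{share:.0%} of income -> ${spend:.2f}/month -> {liters:.2f} L of kerosene")
```

Even at the top of the range, under two dollars a month buys less than a liter and a half of fuel, which is consistent with the "poverty trap" description.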

Approached by the charity Solar Aid to design a solar-powered LED alternative, London design consultancy Therefore shifted the emphasis away from solar, which requires expensive batteries that degrade over time. The company’s answer is both more simple and more radical: an LED lamp driven by a bag of sand, earth, or stones, pulled toward the Earth by gravity.

It takes only seconds to hoist the bag into place, after which the lamp provides up to half an hour of ambient light, or about 18 minutes of brighter task lighting. Though it isn’t clear quite how much light the GravityLight emits, its makers insist it is more than a kerosene lamp. Also unclear are the precise inner workings of the device, though clearly the weighted bag pulls a cord, driving an inner mechanism with a low-powered dynamo, with the aid of some robust plastic gearing. Talking to Ars by telephone, Therefore’s Jim Fullalove was loath to divulge details, but did reveal the gearing took the kinetic energy from a weighted bag descending at a rate of a millimeter per second to power a dynamo spinning at 2000rpm.
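Those figures also allow a rough sanity check on the physics. Assuming a bag mass of around 12 kg (the article does not give one; this number is purely illustrative), the mechanical power available works out to roughly a tenth of a watt — small, but plausible for a low-power LED:

```python
# Back-of-envelope power estimate for a gravity-driven lamp.
# Descent rate comes from the article; the bag's mass is an ASSUMPTION
# chosen only for illustration.
G = 9.81                  # gravitational acceleration, m/s^2
mass_kg = 12.0            # assumed mass of the bag of sand/earth/stones
descent_m_per_s = 0.001   # "a millimeter per second", per the article

# Mechanical power before gearing and dynamo losses: P = m * g * v
power_watts = mass_kg * G * descent_m_per_s
print(f"Available power: {power_watts * 1000:.0f} mW")  # ~118 mW
```

The interesting design problem, then, is the one Fullalove hints at: gearing a 1 mm/s descent up to a 2000 rpm dynamo while wasting as little of that tenth of a watt as possible.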

Read more about GravityLight here.

Video courtesy of GravityLight.

Remembering the Future

Memory is a very useful cognitive tool. After all, where would we be if we had no recall of our family, friends, foods, words, tasks and dangers?

But it turns out that memory may also help us imagine the future — another very important human trait.

From the New Scientist:

WHEN thinking about the workings of the mind, it is easy to imagine memory as a kind of mental autobiography – the private book of you. To relive the trepidation of your first day at school, say, you simply dust off the cover and turn to the relevant pages. But there is a problem with this idea. Why are the contents of that book so unreliable? It is not simply our tendency to forget key details. We are also prone to “remember” events that never actually took place, almost as if a chapter from another book has somehow slipped into our autobiography. Such flaws are puzzling if you believe that the purpose of memory is to record your past – but they begin to make sense if it is for something else entirely.

That is exactly what memory researchers are now starting to realise. They believe that human memory didn’t evolve so that we could remember but to allow us to imagine what might be. This idea began with the work of Endel Tulving, now at the Rotman Research Institute in Toronto, Canada, who discovered a person with amnesia who could remember facts but not episodic memories relating to past events in his life. Crucially, whenever Tulving asked him about his plans for that evening, the next day or the summer, his mind went blank – leading Tulving to suspect that foresight was the flipside of episodic memory.

Subsequent brain scans supported the idea, suggesting that every time we think about a possible future, we tear up the pages of our autobiographies and stitch together the fragments into a montage that represents the new scenario. This process is the key to foresight and ingenuity, but it comes at the cost of accuracy, as our recollections become frayed and shuffled along the way. “It’s not surprising that we confuse memories and imagination, considering that they share so many processes,” says Daniel Schacter, a psychologist at Harvard University.

Over the next 10 pages, we will show how this theory has brought about a revolution in our understanding of memory. Given the many survival benefits of being able to imagine the future, for instance, it is not surprising that other creatures show a rudimentary ability to think in this way (“Do animals ever forget?”). Memory’s role in planning and problem solving, meanwhile, suggests that problems accessing the past may lie behind mental illnesses like depression and post-traumatic stress disorder, offering a new approach to treating these conditions (“Boosting your mental fortress”). Equally, a growing understanding of our sense of self can explain why we are so selective in the events that we weave into our life story – again showing definite parallels with the way we imagine the future (“How the brain spins your life story”). The work might even suggest some dieting tips (“Lost in the here and now”).

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image: The Persistence of Memory, 1931. Salvador Dalí. Courtesy of Salvador Dalí, Gala-Salvador Dalí Foundation/Artists Rights Society.[end-div]

Social Outcast = Creative Wunderkind

A recent study published in the Journal of Experimental Psychology correlates social ostracization and rejection with creativity. Businesses seeking creative individuals take note: perhaps your next great hire is a social misfit.

[div class=attrib]From Fast Company:[end-div]

Are you a recovering high school geek who still can’t get the girl? Are you always the last person picked for your company’s softball team? When you watched Office Space, did you feel a special kinship to the stapler-obsessed Milton Waddams? If you answered yes to any of these questions, do not despair. Researchers at Johns Hopkins and Cornell have recently found that the socially rejected might also be society’s most creatively powerful people.

The study, which is forthcoming in the Journal of Experimental Psychology, is called “Outside Advantage: Can Social Rejection Fuel Creative Thought?” It found that people who already have a strong “self-concept” — that is, who are independently minded — become creatively fecund in the face of rejection. “We were inspired by the stories of highly creative individuals like Steve Jobs and Lady Gaga,” says the study’s lead author, Hopkins professor Sharon Kim. “And we wanted to find a silver lining in all the popular press about bullying. There are benefits to being different.”

The study consisted of 200 Cornell students and set out to identify the relationship between the strength of an individual’s self-concept and their level of creativity. First, Kim tested the strength of each student’s self-concept by assessing his or her “need for uniqueness.” In other words, how important it is for each individual to feel separate from the crowd. Next, students were told that they’d either been included in or rejected from a hypothetical group project. Finally, they were given a simple, but creatively demanding, task: Draw an alien from a planet unlike earth.

If you’re curious about your own general creativity level (at least by the standards of Kim’s study), go ahead and sketch an alien right now…Okay, got your alien? Now give yourself a point for every non-human characteristic you’ve included in the drawing. If your alien has two eyes between the nose and forehead, you don’t get any points. If your alien has two eyes below the mouth, or three eyes that breathe fire, you get a point. If your alien doesn’t even have eyes or a mouth, give yourself a bunch of points. In short, the more dissimilar your alien is to a human, the higher your creativity score.
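
The scoring rubric described above can be sketched in a few lines of code. This is purely illustrative: the feature names and the human "baseline" here are hypothetical examples, not the coding scheme the study actually used.

```python
# Hypothetical set of features considered "human-like"; drawings earn
# points only for features outside this baseline.
HUMAN_BASELINE = {"two_eyes_above_nose", "one_mouth", "two_arms", "two_legs"}

def creativity_score(alien_features):
    """One point for each feature that departs from the human baseline."""
    return sum(1 for f in alien_features if f not in HUMAN_BASELINE)

# A human-like alien scores zero; a stranger one scores higher.
print(creativity_score(["two_eyes_above_nose", "one_mouth"]))                    # 0
print(creativity_score(["three_fire_breathing_eyes", "eyes_below_mouth"]))       # 2
```

The point of the rubric, captured in the code, is that dissimilarity from the human template is what counts, not drawing skill or detail.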

Kim found that people with a strong self-concept who were rejected produced more creative aliens than people from any other group, including people with a strong self-concept who were accepted. “If you’re in a mindset where you don’t care what others think,” she explained, “you’re open to ideas that you may not be open to if you’re concerned about what other people are thinking.”

This may seem like an obvious conclusion, but Kim pointed out that most companies don’t encourage the kind of freedom and independence that readers of Fast Company probably expect. “The benefits of being different is not a message everyone is getting,” she said.

But Kim also discovered something unexpected. People with a weak self-concept could be influenced toward a stronger one and, thus, toward a more creative mindset. In one part of the study, students were asked to read a short story in which all the pronouns were either singular (I/me) or plural (we/us) and then to circle all the pronouns. They were then “accepted” or “rejected” and asked to draw their aliens.

Kim found that all of the students who read stories with singular pronouns and were rejected produced more creative aliens, even the students who originally had a weaker self-concept. Once these group-oriented individuals focused on individual-centric prose, they became more individualized themselves. And that made them more creative.

This finding doesn’t prove that you can teach someone to have a strong self-concept but it suggests that you can create a professional environment that facilitates independent and creative thought.

[div class=attrib]Read the entire article after the jump.[end-div]

Creativity and Immorality

[div class=attrib]From Scientific American:[end-div]

In the mid-1990s, Apple Computer was a dying company.  Microsoft’s Windows operating system was overwhelmingly favored by consumers, and Apple’s attempts to win back market share by improving the Macintosh operating system were unsuccessful.  After several years of debilitating financial losses, the company chose to purchase a fledgling software company called NeXT.  Along with purchasing the rights to NeXT’s software, this move allowed Apple to regain the services of one of the company’s founders, the late Steve Jobs.  Under the guidance of Jobs, Apple returned to profitability and is now the largest technology company in the world, with the creativity of Steve Jobs receiving much of the credit.

However, despite the widespread positive image of Jobs as a creative genius, he also has a dark reputation for encouraging censorship, “losing sight of honesty and integrity,” belittling employees, and engaging in other morally questionable actions. These harshly contrasting images of Jobs raise the question of how a CEO held in such near-universal positive regard could also be accused of such contemptible behavior.  The answer, it turns out, may have something to do with the very aspect of Jobs that is so admired by so many.

In a recent paper published in the Journal of Personality and Social Psychology, researchers at Harvard and Duke Universities demonstrate that creativity can lead people to behave unethically.  In five studies, the authors show that creative individuals are more likely to be dishonest, and that individuals induced to think creatively were more likely to be dishonest. Importantly, they showed that this effect is not explained by any tendency for creative people to be more intelligent, but rather that creativity leads people to more easily come up with justifications for their unscrupulous actions.

In one study, the authors administered a survey to employees at an advertising agency.  The survey asked the employees how likely they were to engage in various kinds of unethical behaviors, such as taking office supplies home or inflating business expense reports.  The employees were also asked to report how much creativity was required for their job.  Further, the authors asked the executives of the company to provide creativity ratings for each department within the company.

Those who said that their jobs required more creativity also tended to self-report a greater likelihood of unethical behavior.  And if the executives said that a particular department required more creativity, the individuals in that department tended to report greater likelihoods of unethical behavior.

The authors hypothesized that it is creativity which causes unethical behavior by allowing people the means to justify their misdeeds, but it is hard to say for certain whether this is correct given the correlational nature of the study.  It could just as easily be true, after all, that unethical behavior leads people to be more creative, or that there is something else which causes both creativity and dishonesty, such as intelligence.  To explore this, the authors set up an experiment in which participants were induced into a creative mindset and then given the opportunity to cheat.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of Scientific American / iStock.[end-div]

Creativity and Failure at School

[div class=attrib]From the Wall Street Journal:[end-div]

Most of our high schools and colleges are not preparing students to become innovators. To succeed in the 21st-century economy, students must learn to analyze and solve problems, collaborate, persevere, take calculated risks and learn from failure. To find out how to encourage these skills, I interviewed scores of innovators and their parents, teachers and employers. What I learned is that young Americans learn how to innovate most often despite their schooling—not because of it.

Though few young people will become brilliant innovators like Steve Jobs, most can be taught the skills needed to become more innovative in whatever they do. A handful of high schools, colleges and graduate schools are teaching young people these skills—places like High Tech High in San Diego, the New Tech high schools (a network of 86 schools in 16 states), Olin College in Massachusetts, the Institute of Design (d.school) at Stanford and the MIT Media Lab. The culture of learning in these programs is radically at odds with the culture of schooling in most classrooms.

In most high-school and college classes, failure is penalized. But without trial and error, there is no innovation. Amanda Alonzo, a 32-year-old teacher at Lynbrook High School in San Jose, Calif., who has mentored two Intel Science Prize finalists and 10 semifinalists in the last two years—more than any other public school science teacher in the U.S.—told me, “One of the most important things I have to teach my students is that when you fail, you are learning.” Students gain lasting self-confidence not by being protected from failure but by learning that they can survive it.

The university system today demands and rewards specialization. Professors earn tenure based on research in narrow academic fields, and students are required to declare a major in a subject area. Though expertise is important, Google’s director of talent, Judy Gilbert, told me that the most important thing educators can do to prepare students for work in companies like hers is to teach them that problems can never be understood or solved in the context of a single academic discipline. At Stanford’s d.school and MIT’s Media Lab, all courses are interdisciplinary and based on the exploration of a problem or new opportunity. At Olin College, half the students create interdisciplinary majors like “Design for Sustainable Development” or “Mathematical Biology.”

Learning in most conventional education settings is a passive experience: The students listen. But at the most innovative schools, classes are “hands-on,” and students are creators, not mere consumers. They acquire skills and knowledge while solving a problem, creating a product or generating a new understanding. At High Tech High, ninth graders must develop a new business concept—imagining a new product or service, writing a business and marketing plan, and developing a budget. The teams present their plans to a panel of business leaders who assess their work. At Olin College, seniors take part in a yearlong project in which students work in teams on a real engineering problem supplied by one of the college’s corporate partners.

[div class=attrib]Read the entire article after the jump.[end-div]

[div class=attrib]Image courtesy of NY Daily News.[end-div]

Creativity: Insight, Shower, Wine, Perspiration? Yes

Some believe creativity stems from a sudden insightful realization, a bolt from the blue that awakens the imagination. Others believe creativity comes from years of discipline and hard work. Well, both groups are correct, but the answer is a little more complex.

[div class=attrib]From the Wall Street Journal:[end-div]

Creativity can seem like magic. We look at people like Steve Jobs and Bob Dylan, and we conclude that they must possess supernatural powers denied to mere mortals like us, gifts that allow them to imagine what has never existed before. They’re “creative types.” We’re not.

But creativity is not magic, and there’s no such thing as a creative type. Creativity is not a trait that we inherit in our genes or a blessing bestowed by the angels. It’s a skill. Anyone can learn to be creative and to get better at it. New research is shedding light on what allows people to develop world-changing products and to solve the toughest problems. A surprisingly concrete set of lessons has emerged about what creativity is and how to spark it in ourselves and our work.

The science of creativity is relatively new. Until the Enlightenment, acts of imagination were always equated with higher powers. Being creative meant channeling the muses, giving voice to the gods. (“Inspiration” literally means “breathed upon.”) Even in modern times, scientists have paid little attention to the sources of creativity.

But over the past decade, that has begun to change. Imagination was once thought to be a single thing, separate from other kinds of cognition. The latest research suggests that this assumption is false. It turns out that we use “creativity” as a catchall term for a variety of cognitive tools, each of which applies to particular sorts of problems and is coaxed to action in a particular way.

Does the challenge that we’re facing require a moment of insight, a sudden leap in consciousness? Or can it be solved gradually, one piece at a time? The answer often determines whether we should drink a beer to relax or hop ourselves up on Red Bull, whether we take a long shower or stay late at the office.

The new research also suggests how best to approach the thorniest problems. We tend to assume that experts are the creative geniuses in their own fields. But big breakthroughs often depend on the naive daring of outsiders. For prompting creativity, few things are as important as time devoted to cross-pollination with fields outside our areas of expertise.

Let’s start with the hardest problems, those challenges that at first blush seem impossible. Such problems are typically solved (if they are solved at all) in a moment of insight.

Consider the case of Arthur Fry, an engineer at 3M in the paper products division. In the winter of 1974, Mr. Fry attended a presentation by Spencer Silver, an engineer working on adhesives. Mr. Silver had developed an extremely weak glue, a paste so feeble it could barely hold two pieces of paper together. Like everyone else in the room, Mr. Fry patiently listened to the presentation and then failed to come up with any practical applications for the compound. What good, after all, is a glue that doesn’t stick?

On a frigid Sunday morning, however, the paste would re-enter Mr. Fry’s thoughts, albeit in a rather unlikely context. He sang in the church choir and liked to put little pieces of paper in the hymnal to mark the songs he was supposed to sing. Unfortunately, the little pieces of paper often fell out, forcing Mr. Fry to spend the service frantically thumbing through the book, looking for the right page. It seemed like an unfixable problem, one of those ordinary hassles that we’re forced to live with.

But then, during a particularly tedious sermon, Mr. Fry had an epiphany. He suddenly realized how he might make use of that weak glue: It could be applied to paper to create a reusable bookmark! Because the adhesive was barely sticky, it would adhere to the page but wouldn’t tear it when removed. That revelation in the church would eventually result in one of the most widely used office products in the world: the Post-it Note.

Mr. Fry’s invention was a classic moment of insight. Though such events seem to spring from nowhere, as if the cortex is surprising us with a breakthrough, scientists have begun studying how they occur. They do this by giving people “insight” puzzles, like the one that follows, and watching what happens in the brain:

A man has married 20 women in a small town. All of the women are still alive, and none of them is divorced. The man has broken no laws. Who is the man?

[div class=attrib]Read the entire article here.[end-div]

[div class=attrib]Image courtesy of Google search.[end-div]

Need Creative Inspiration? Take a New Route to Work

[div class=attrib]From Miller-McCune:[end-div]

Want to boost your creativity? Tomorrow morning, pour some milk into an empty bowl, and then add the cereal.

That may sound, well, flaky. But according to a newly published study, preparing a common meal in reverse order may stimulate innovative thinking.

Avoiding conventional behavior at the breakfast table “can help people break their cognitive patterns, and thus lead them to think more flexibly and creatively,” according to a research team led by psychologist Simone Ritter of Radboud University Nijmegen in the Netherlands.

She and her colleagues, including Rodica Ioana Damian of the University of California, Davis, argue that “active involvement in an unusual event” can trigger higher levels of creativity. They note this activity can take many forms, from studying abroad for a semester to coping with the unexpected death of a loved one.

But, writing in the Journal of Experimental Social Psychology, they provide evidence that something simpler will suffice.

The researchers describe an experiment in which Dutch university students were asked to prepare a breakfast sandwich popular in the Netherlands.

Half of them did so in the conventional manner: They put a slice of bread on a plate, buttered the bread and then placed chocolate chips on top. The others — prompted by a script on a computer screen — first put chocolate chips on a plate, then buttered a slice of bread and finally “placed the bread butter-side-down on the dish with the chocolate chips.”

After completing their culinary assignment, they turned their attention to the “Unusual Uses Task,” a widely used measure of creativity. They were given two minutes to generate uses for a brick and another two minutes to come up with as many answers as they could to the question: “What makes sound?”

“Cognitive flexibility” was scored not by counting how many answers they came up with, but rather by the number of categories those answers fell into. For the “What makes sound?” test, a participant whose answers were all animals or machines received a score of one, while someone whose list included “dog,” “car” and “ocean” received a three.
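
The scoring method just described reduces to counting the distinct categories a participant's answers span. The sketch below illustrates the idea; the category assignments are hypothetical examples, not the raters' actual coding manual.

```python
# Hypothetical mapping from an answer on the "What makes sound?" task
# to its semantic category.
CATEGORIES = {
    "dog": "animal",
    "cat": "animal",
    "car": "machine",
    "drill": "machine",
    "ocean": "nature",
    "thunder": "nature",
}

def flexibility_score(answers):
    """Score = number of distinct categories the answers fall into."""
    return len({CATEGORIES[a] for a in answers if a in CATEGORIES})

# Answers drawn from one category score 1; a varied list scores higher.
print(flexibility_score(["dog", "cat"]))           # 1
print(flexibility_score(["dog", "car", "ocean"]))  # 3
```

Note that the score deliberately ignores the raw number of answers: ten animals still score 1, which is exactly the "fixedness" the measure is designed to detect.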

“A high cognitive flexibility score indicates an ability to switch between categories, overcome fixedness, and thus think more creatively,” Ritter and her colleagues write.

On both tests, those who made their breakfast treat backwards had higher scores. Breaking their normal sandwich-making pattern apparently opened them up; their minds wandered more freely, allowing for more innovative thought.

[div class=attrib]Read the entire article here.[end-div]

The Unconscious Mind Boosts Creativity

[div class=attrib]From Miller-McCune:[end-div]

New research finds we’re better able to identify genuinely creative ideas when they’ve emerged from the unconscious mind.

Truly creative ideas are both highly prized and, for most of us, maddeningly elusive. If our best efforts produce nothing brilliant, we’re often advised to put aside the issue at hand and give our unconscious minds a chance to work.

Newly published research suggests that is indeed a good idea — but not for the reason you might think.

A study from the Netherlands finds allowing ideas to incubate in the back of the mind is, in a narrow sense, overrated. People who let their unconscious minds take a crack at a problem were no more adept at coming up with innovative solutions than those who consciously deliberated over the dilemma.

But they did perform better on the vital second step of this process: determining which of their ideas was the most creative. That realization provides essential information; without it, how do you decide which solution you should actually try to implement?

Given the value of discerning truly fresh ideas, “we can conclude that the unconscious mind plays a vital role in creative performance,” a research team led by Simone Ritter of the Radboud University Behavioral Science Institute writes in the journal Thinking Skills and Creativity.

In the first of two experiments, 112 university students were given two minutes to come up with creative ideas to an everyday problem: how to make the time spent waiting in line at a cash register more bearable. Half the participants went at it immediately, while the others first spent two minutes performing a distracting task — clicking on circles that appeared on a computer screen. This allowed time for ideas to percolate outside their conscious awareness.

After writing down as many ideas as they could think of, they were asked to choose which of their notions was the most creative.  Participants were scored by the number of ideas they came up with, the creativity level of those ideas (as measured by trained raters), and whether their perception of their most innovative idea coincided with that of the raters.

The two groups scored evenly on both the number of ideas generated and the average creativity of those ideas. But those who had been distracted, and thus had ideas spring from their unconscious minds, were better at selecting their most creative concept.

[div class=attrib]Read the entire article here.[end-div]

The U.S. Education System Through the Eyes of a Student and Sir Ken

[tube]iG9CE55wbtY[/tube]

Nikhil Goyal is an observant 16-year-old junior at a New York high school. He ponders the state of the U.S. educational system, which he finds sadly wanting. Sir Ken Robinson has a young standard-bearer. Adults take note:

[div class=attrib]From the Huffington Post:[end-div]

The United States education system really sucks. We continue to toil in a 19th century factory-based model of education, stressing conformity and standardization. This is all true even though globalization has transformed the world we live in, flipping the status quo of the labor market upside down. The education system has miserably failed to create students who have the dexterity to think creatively and critically, work collaboratively, and communicate their thoughts.

Over the past decade, when government has tried to muddle its way through education, it has gotten fairly ugly. President Bush passed No Child Left Behind and President Obama passed Race to the Top, saturating our schools with a culture of fill-in-the-bubble tests and drill-and-kill teaching methods. Schools were transformed into test-preparation factories, and the process of memorization and regurgitation hijacked classroom learning.

Our society has failed to understand what’s at stake. For the 21st century American economy, all economic value will derive from entrepreneurship and innovation. Low-cost manufacturing will essentially be wiped out of this country and shipped to China, India, and other nations. While we may have the top companies in the world, such as Apple and Google, our competitive edge is at risk. The education system was designed to create well-disciplined employees, not entrepreneurs and innovators. According to Cathy N. Davidson, co-director of the annual MacArthur Foundation Digital Media and Learning Competitions, 65 percent of today’s grade-school kids may end up doing work that hasn’t been invented yet.

I propose that we institute a 21st century model of education, rooted in 21st century learning skills and creativity, imagination, discovery, and project-based learning. We need to stop telling kids to shut up, sit down, and listen to the teacher passively. As Sir Ken Robinson said in his well-acclaimed TED talk, “Schools kill creativity.”

[div class=attrib]Read more of the article here.[end-div]

Creativity and Anger

It turns out that creativity gets a boost from anger. While anger is certainly not beneficial in many contexts, researchers have found that angry people are more likely to be creative.

[div class=attrib]From Scientific American:[end-div]

This counterintuitive idea was pursued by researchers Matthijs Baas, Carsten De Dreu, and Bernard Nijstad in a series of studies  recently published in The Journal of Experimental Social Psychology. They found that angry people were more likely to be creative – though this advantage didn’t last for long, as the taxing nature of anger eventually leveled out creativity. This study joins several recent lines of research exploring the relative upside to anger – the ways in which anger is not only less harmful than typically assumed, but may even be helpful (though perhaps in small doses).

In an initial study, the researchers found that feeling angry was indeed associated with brainstorming in a more unstructured manner, consistent with “creative” problem solving. In a second study, the researchers first elicited anger from the study participants (or sadness, or a non-emotional state) and then asked them to engage in a brainstorming session in which they generated ideas to preserve and improve the environment. In the beginning of this task, angry participants generated more ideas (by volume) and generated more original ideas (those thought of by 1 percent or fewer of the other participants), compared to the sad or non-emotional participants. However, this benefit was only present in the beginning of the task, and eventually, the angry participants generated only as many ideas as the other participants.

These findings reported by Baas and colleagues make sense, given what we already know about anger. Though anger may be unpleasant to feel, it is associated with a variety of attributes that may facilitate creativity. First, anger is an energizing feeling, important for the sustained attention needed to solve problems creatively. Second, anger leads to more flexible, unstructured thought processes.

Anecdotal evidence from internal meetings at Apple certainly reinforces the notion that creativity may benefit from well-channeled anger. Apple is often cited as one of the world’s most creative companies.

[div class=attrib]From Jonah Lehrer over at Wired:[end-div]

Many of my favorite Steve Jobs stories feature his anger, as he unleashes his incisive temper on those who fail to meet his incredibly high standards. A few months ago, Adam Lashinsky had a fascinating article in Fortune describing life inside the sanctum of 1 Infinite Loop. The article begins with the following scene:

In the summer of 2008, when Apple launched the first version of its iPhone that worked on third-generation mobile networks, it also debuted MobileMe, an e-mail system that was supposed to provide the seamless synchronization features that corporate users love about their BlackBerry smartphones. MobileMe was a dud. Users complained about lost e-mails, and syncing was spotty at best. Though reviewers gushed over the new iPhone, they panned the MobileMe service.

Steve Jobs doesn’t tolerate duds. Shortly after the launch event, he summoned the MobileMe team, gathering them in the Town Hall auditorium in Building 4 of Apple’s campus, the venue the company uses for intimate product unveilings for journalists. According to a participant in the meeting, Jobs walked in, clad in his trademark black mock turtleneck and blue jeans, clasped his hands together, and asked a simple question:

“Can anyone tell me what MobileMe is supposed to do?” Having received a satisfactory answer, he continued, “So why the fuck doesn’t it do that?”

For the next half-hour Jobs berated the group. “You’ve tarnished Apple’s reputation,” he told them. “You should hate each other for having let each other down.” The public humiliation particularly infuriated Jobs. Walt Mossberg, the influential Wall Street Journal gadget columnist, had panned MobileMe. “Mossberg, our friend, is no longer writing good things about us,” Jobs said. On the spot, Jobs named a new executive to run the group.

Brutal, right? But those flashes of intolerant anger have always been an important part of Jobs’ management approach. He isn’t shy about confronting failure, and he doesn’t hold back negative feedback. He is blunt at all costs, a cultural habit that has permeated the company. Jonathan Ive, the lead designer at Apple, describes the tenor of group meetings as “brutally critical.”

[div class=attrib]More from theSource here and here.[end-div]

[div class=attrib]Image of Brandy Norwood, courtesy of Wikipedia / Creative Commons.[end-div]

A New Tool for Creative Thinking: Mind-Body Dissonance

[div class=attrib]From Scientific American:[end-div]

Did you ever get the giggles during a religious service or some other serious occasion?  Did you ever have to smile politely when you felt like screaming?  In these situations, the emotions that we are required to express differ from the ones we are feeling inside.  That can be stressful, unpleasant, and exhausting.  Normally our minds and our bodies are in harmony.  When facial expressions or posture depart from how we feel, we experience what two psychologists at Northwestern University, Li Huang and Adam Galinsky, call mind–body dissonance.  And in a fascinating new paper, they show that such awkward clashes between mind and body can actually be useful: they help us think more expansively.

Ask yourself, would you say that a camel is a vehicle?  Would you describe a handbag as an item of clothing?  Your default answer might be negative, but there’s a way in which a camel can be regarded as a form of transport, and handbags can certainly be said to dress up an outfit.  When we think expansively, we think about categories more inclusively, we stop privileging the average cases, and extend our horizons to the atypical or exotic.  Expansive thought can be regarded as a kind of creativity, and an opportunity for new insights.

Huang and Galinsky have shown that mind–body dissonance can make us think expansively.  In a clever series of studies, they developed a way to get people’s facial expressions to depart from their emotional experiences.  Participants were asked to either hold a pen between their teeth, forcing an unwitting smile, or to affix two golf tees in a particular position on their foreheads, unwittingly forcing an expression of sadness.  While in these facial configurations subjects were asked to recall happy and sad events or listen to happy and sad music.

[div class=attrib]More from theSource here.[end-div]