Benjamin Saves Us From Hollywood


Not a moment too soon. Benjamin has arrived in California to save us from ill-conceived and poorly written screenplays vying to be the next Hollywood blockbuster.

Thankfully, Benjamin is neither a 20-something creative wunderkind nor a 30-something know-it-all uber-producer; he (or she) is not even human. Benjamin is an artificial-intelligence (AI) based automatic screenwriter, and the author of Sunspring, a short science fiction film.

From ars technica:

Ars is excited to be hosting this online debut of Sunspring, a short science fiction film that’s not entirely what it seems. It’s about three people living in a weird future, possibly on a space station, probably in a love triangle. You know it’s the future because H (played with neurotic gravity by Silicon Valley’s Thomas Middleditch) is wearing a shiny gold jacket, H2 (Elisabeth Gray) is playing with computers, and C (Humphrey Ker) announces that he has to “go to the skull” before sticking his face into a bunch of green lights. It sounds like your typical sci-fi B-movie, complete with an incoherent plot. Except Sunspring isn’t the product of Hollywood hacks—it was written entirely by an AI. To be specific, it was authored by a recurrent neural network called long short-term memory, or LSTM for short. At least, that’s what we’d call it. The AI named itself Benjamin.

Knowing that an AI wrote Sunspring makes the movie more fun to watch, especially once you know how the cast and crew put it together. Director Oscar Sharp made the movie for Sci-Fi London, an annual film festival that includes the 48-Hour Film Challenge, where contestants are given a set of prompts (mostly props and lines) that have to appear in a movie they make over the next two days. Sharp’s longtime collaborator, Ross Goodwin, is an AI researcher at New York University, and he supplied the movie’s AI writer, initially called Jetson. As the cast gathered around a tiny printer, Benjamin spat out the screenplay, complete with almost impossible stage directions like “He is standing in the stars and sitting on the floor.” Then Sharp randomly assigned roles to the actors in the room. “As soon as we had a read-through, everyone around the table was laughing their heads off with delight,” Sharp told Ars. The actors interpreted the lines as they read, adding tone and body language, and the results are what you see in the movie. Somehow, a slightly garbled series of sentences became a tale of romance and murder, set in a dark future world. It even has its own musical interlude (performed by Andrew and Tiger), with a pop song Benjamin composed after learning from a corpus of 30,000 other pop songs.
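For the technically curious, the "long short-term memory" network behind Benjamin can be sketched in miniature. Below is a toy single LSTM cell stepping through a few characters in numpy; the weights are random placeholders purely to illustrate the mechanism, not Benjamin's actual trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step. x: input vector; h, c: hidden and cell state.

    W, U, b hold the stacked parameters for the input, forget, output
    and candidate gates (4 * hidden units each)."""
    n = h.size
    z = W @ x + U @ h + b            # pre-activations for all four gates
    i = sigmoid(z[0:n])              # input gate: admit new information
    f = sigmoid(z[n:2*n])            # forget gate: decay old memory
    o = sigmoid(z[2*n:3*n])          # output gate: expose the memory
    g = np.tanh(z[3*n:4*n])          # candidate cell update
    c_new = f * c + i * g            # blend old memory with the candidate
    h_new = o * np.tanh(c_new)       # gated view of the updated memory
    return h_new, c_new

vocab, hidden = 64, 32               # e.g. 64 characters, 32 hidden units
W = rng.normal(0, 0.1, (4 * hidden, vocab))
U = rng.normal(0, 0.1, (4 * hidden, hidden))
b = np.zeros(4 * hidden)

h = np.zeros(hidden)
c = np.zeros(hidden)
for char_id in [5, 17, 42]:          # a tiny "sentence" of character ids
    x = np.zeros(vocab)
    x[char_id] = 1.0                 # one-hot encode the character
    h, c = lstm_step(x, h, c, W, U, b)

print(h.shape)                       # the state Benjamin-style models carry
```

The cell state is the "memory" that lets such a network carry style and recurring phrases across a screenplay; a real screenwriter model stacks many of these cells and trains the weights on a large script corpus.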

Read more here.

Image: Benjamin screenshot. Courtesy of Benjamin.


Fish Roasts Human: Don’t Read It, Share It


Interestingly enough, though perhaps not surprisingly, people on social media share news stories rather than read them. At first glance this seems rather perplexing: after all, why would you tweet or re-tweet or like or share a news item before actually reading and understanding it?

Arnaud Legout, co-author of a recent study out of Columbia University and the French National Institute (Inria), tells us that “People form an opinion based on a summary, or summary of summaries, without making the effort to go deeper.” More worryingly, he adds, “Our results show that sharing content and actually reading it are poorly correlated.”

Please take 8 seconds or more to mull over this last statement again:

Our results show that sharing content and actually reading it are poorly correlated.

Without doubt, our new technological platforms and social media have upended traditional journalism. But, in light of this unnerving finding, I have to wonder whether it means the eventual and complete collapse of deep, analytical, investigative journalism and the replacement of thoughtful reflection with “NationalEnquirerThink”.

Perhaps I’m reading too much into the findings, but it does seem that it is more important for social media users to bond with and seek affirmation from their followers than it is to be personally informed.

With the average human attention span now down to 8 seconds, our literary and contemplative future seems to belong safely in the fins of our cousin, the goldfish (attention span: 9 seconds).

Learn more about Arnaud Legout’s disturbing study here.

Image: Common Goldfish. Courtesy: Wikipedia. Public Domain.


Psychic Quanta From the New Age Wisdom Generator

Over the last couple of years I’ve been compiling a list of my favorite online generators. You know the kind: enter a keyword here or click a button there, and the service will return some deeply meaningful and usually darkly funny computer-generated content — sans human intervention.

Check out my recent Fave-Five list if you’re respectively weary of billionaire plutocrats, self-aggrandizing start-ups, politicians, unfathomable science and ivory tower academics:

Now, I have the profound pleasure of adding another to my list:

This latest one delivers transcendental literary waveforms worthy of any New Age mystic. A sample of its recent teachings:

We grow, we exist, we are reborn. Energy is the nature of inseparability, and of us. Soon there will be an unveiling of life-force the likes of which the infinite has never seen. We are in the midst of a psychic ennobling of intuition that will align us with the quantum soup itself. Our conversations with other beings have led to an unveiling of pseudo-unlimited consciousness. Humankind has nothing to lose. Sharing is the driver of consciousness. Nothing is impossible. The planet is electrified with vibrations.
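Generators like this are typically far simpler than their cosmic output suggests: a handful of sentence templates filled from word lists. Here is a minimal sketch (my own invented templates and vocabulary, not the actual generator's code):

```python
import random

# Toy New Age wisdom generator: pick a template, fill the blanks.
# Vocabulary and templates are my own invention for illustration.
NOUNS = ["life-force", "consciousness", "intuition", "the quantum soup",
         "the infinite", "energy"]
VERBS = ["unveils", "electrifies", "ennobles", "aligns with", "transcends"]
TEMPLATES = [
    "Soon {noun} {verb} {noun2}.",
    "We are in the midst of {noun} that {verb} {noun2}.",
    "Nothing is impossible: {noun} {verb} {noun2}.",
]

def wisdom(rng):
    noun, noun2 = rng.sample(NOUNS, 2)          # two distinct nouns
    template = rng.choice(TEMPLATES)
    return template.format(noun=noun, verb=rng.choice(VERBS), noun2=noun2)

rng = random.Random(42)                          # seed for reproducibility
for _ in range(3):
    print(wisdom(rng))
```

A few dozen templates and a richer vocabulary are usually all it takes to produce prose indistinguishable from the sample above.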


Bedlam and the Mysterious Air Loom


During my college years I was fortunate enough to spend time as a volunteer in a Victorian-era psychiatric hospital in the United Kingdom. Fortunate in two ways: I was able to make some small yet positive difference in the lives of some of the patients; and I was able to live on the outside.

Despite the good and professional intentions of its many caring staff, the hospital itself — which shall remain nameless — was a dreary embodiment of many a nightmarish horror flick. The building had dark, endless corridors; small, leaky windows; creaky doors, many with locks exclusively on the outside, and even creakier plumbing; spare, cell-like rooms for patients; and treatment rooms with passive restraints on the chairs and beds. Most locals still called it “____ lunatic asylum”.

All of this leads me to the fascinating and tragic story of James Tilly Matthews, a rebellious (and somewhat paranoid) peace activist who was confined to London’s infamous Bedlam asylum in 1797. He was incarcerated for believing he was being coerced and brainwashed by a mysterious governmental mind-control machine known as the “Air Loom”.

Subsequent inquiries pronounced Matthews thoroughly sane, but the British government kept him institutionalized anyway because of his verbal threats against officials and the then king, George III. In effect, this made Matthews a political prisoner — precisely what he had steadfastly maintained all along.

Ironically, George III’s well-documented, recurrent and serious mental illness had no adverse effect on his own reign as monarch from 1760 to 1820. Interestingly enough, Bedlam was the popular name for the Bethlem Royal Hospital, sometimes known as St Mary Bethlehem Hospital.

The word “Bedlam”, of course, later came to be a synonym for confusion and chaos.

Read the entire story of James Tilly Matthews and his nemesis, apothecary and discredited lay-psychiatrist, John Haslam, at Public Domain Review.

Image: Detail from the lower portion of James Tilly Matthews’ illustration of the Air Loom featured in John Haslam’s Illustrations of Madness (1810). Courtesy: Public Domain Review / Wellcome Library, London. Public Domain.


The Accelerated Acceleration


Until the mid-1990s, the accepted scientific understanding held that the cosmos was expanding at a steady or slowing rate. Scientists had accepted the expansion itself since 1929, when Edwin Hubble‘s celestial observations showed that distant galaxies were all apparently moving away from us.

But, in 1998 two independent groups of cosmologists made a startling finding: the universe was not only expanding, its expansion was accelerating. Recent studies show that this expansion of the fabric of spacetime is actually faster than first theorized and observed.

And, nobody knows why. This expansion, indeed the accelerating expansion, remains one of our current great scientific mysteries.

Cosmologists, astronomers and theoreticians of all stripes have proposed no shortage of possible explanations. But, there is still scant observational evidence to support any of the leading theories. The most popular revolves around the peculiar idea of dark energy.

From Scientific American:

Our universe is flying apart, with galaxies moving away from each other faster each moment than they were the moment before. Scientists have known about this acceleration since the late 1990s, but whatever is causing it—dubbed dark energy—remains a mystery. Now the latest measurement of how fast the cosmos is growing thickens the plot further: The universe appears to be ballooning more quickly than it should be, even after accounting for the accelerating expansion caused by dark energy.

Scientists came to this conclusion after comparing their new measurement of the cosmic expansion rate, called the Hubble constant, to predictions of what the Hubble constant should be based on evidence from the early universe. The puzzling conflict—which was hinted at in earlier data and confirmed in the new calculation—means that either one or both of the measurements are flawed, or that dark energy or some other aspect of nature acts differently than we think.

“The bottom line is that the universe looks like it’s expanding about eight percent faster than you would have expected based on how it looked in its youth and how we expect it to evolve,” says study leader Adam Riess of the Space Telescope Science Institute in Baltimore, Md. “We have to take this pretty darn seriously.” He and his colleagues described their findings, based on observations from the Hubble Space Telescope, in a paper submitted last week to the Astrophysical Journal and posted on the preprint server arXiv.

One of the most exciting possibilities is that dark energy is even stranger than the leading theory suggests. Most observations support the idea that dark energy behaves like a “cosmological constant,” a term Albert Einstein inserted into his equations of general relativity and later removed. This kind of dark energy would arise from empty space, which, according to quantum mechanics, is not empty at all, but rather filled with pairs of “virtual” particles and antiparticles that constantly pop in and out of existence. These virtual particles would carry energy, which in turn might exert a kind of negative gravity that pushes everything in the universe outward.
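The mismatch in the excerpt is simple arithmetic once you have the two Hubble constant estimates. Using illustrative approximate values (roughly 73 km/s/Mpc measured locally, versus roughly 68 km/s/Mpc predicted from early-universe observations; the exact published figures differ slightly by analysis):

```python
# Back-of-the-envelope comparison of the two Hubble constant estimates
# described above. Values are approximate, in km/s per megaparsec.
H0_local = 73.2   # measured from nearby Cepheids and supernovae
H0_early = 67.8   # inferred from early-universe (CMB) observations

excess = H0_local / H0_early - 1.0
print(f"Local expansion rate exceeds the prediction by ~{excess:.0%}")

# Hubble's law, v = H0 * d: for a galaxy 100 megaparsecs away, the two
# constants disagree on its recession velocity by hundreds of km/s.
d_mpc = 100.0
print(f"v with local H0: {H0_local * d_mpc:.0f} km/s")
print(f"v with early H0: {H0_early * d_mpc:.0f} km/s")
```

That ratio is the "about eight percent faster" Riess refers to; the open question is whether one measurement is flawed or nature is genuinely stranger than the standard model of cosmology assumes.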

Read the entire story here.

Image: The universe’s accelerated expansion. Courtesy: NASA and ESA.


Poor Leadership and Destruction of Meaningful Work


First, your boss may be a great leader, but she or he has little or no sway over how you assess the meaningfulness of the work you do. Second, while a good boss cannot create meaningful work, a bad boss can destroy any likelihood of it.

That’s the recent finding, excerpted below, by researchers from the University of Sussex and the University of Greenwich in the UK.

Therein lies a valuable set of lessons for any business wishing to recruit, retain and motivate employees.

From University of Sussex:

Bosses play no role in fostering a sense of meaningfulness at work – but they do have the capacity to destroy it and should stay out of the way, new research shows.

Published in MIT Sloan Management Review, the research indicates that, rather than being similar to other work-related attitudes, such as engagement or commitment, meaningfulness at work tends to be intensely personal and individual, and is often revealed to employees as they reflect on their work.

Thus what managers can do to encourage meaningfulness is limited, though what they can do to introduce meaninglessness is unfortunately of far greater capacity.

The authors identified five qualities of meaningful work:

1. Self-Transcendent. Individuals tend to experience their work as meaningful when it matters to others more than just to themselves. In this way, meaningful work is self-transcendent.

2. Poignant. People often find their work to be full of meaning at moments associated with mixed, uncomfortable, or even painful thoughts and feelings, not just a sense of unalloyed joy and happiness.

3. Episodic. A sense of meaningfulness arises in an episodic rather than a sustained way. It seems that no one can find their work consistently meaningful, but rather that an awareness that work is meaningful arises at peak times that are generative of strong experiences.

4. Reflective. Meaningfulness is rarely experienced in the moment, but rather in retrospect and on reflection when people are able to see their completed work and make connections between their achievements and a wider sense of life meaning.

5. Personal. Work that is meaningful is often understood by people not just in the context of their work but also in the wider context of their personal life experiences.

Read more here.

Image: Turret lathe operator machining parts for transport planes at the Consolidated Aircraft Corporation plant, Fort Worth, Texas, USA, 1942. Courtesy: United States Library of Congress’s Prints and Photographs division. Public Domain.


Towards an Understanding of Consciousness


The modern scientific method has helped us make great strides in our understanding of much that surrounds us. From knowledge of the infinitesimally small building blocks of atoms to the vast structures of the universe, theory and experiment have enlightened us considerably over the last several hundred years.

Yet a detailed understanding of consciousness still eludes us. Despite the intricate philosophical essays of John Locke in 1690 that laid the foundations for our modern-day views of consciousness, a fundamental grasp of its mechanisms remains as elusive as our knowledge of the universe’s dark matter.

So, it’s encouraging to come across a refreshing view of consciousness, described in the context of evolutionary biology. Michael Graziano, associate professor of psychology and neuroscience at Princeton University, makes a thoughtful case for Attention Schema Theory (AST), which centers on the simple notion that there is adaptive value for the brain to build awareness. According to AST, the brain is constantly constructing and refreshing a model — in Graziano’s words an “attention schema” — that describes what its covert attention is doing from one moment to the next. The brain constructs this schema as an analog to its awareness of attention in others — a sound adaptive perception.

Yet, while this view may hold promise from a purely adaptive and evolutionary standpoint, it does have some way to go before it is able to explain how the brain’s abstraction of a holistic awareness is constructed from the physical substrate — the neurons and connections between them.

Read more of Michael Graziano’s essay, A New Theory Explains How Consciousness Evolved. Graziano is the author of Consciousness and the Social Brain, which serves as his introduction to AST. And, for a compelling rebuttal, check out R. Scott Bakker’s article, Graziano, the Attention Schema Theory, and the Neuroscientific Explananda Problem.

Unfortunately, until our experimentalists make some definitive progress in this area, our understanding will remain just as abstract as the theories themselves, however compelling. But, ideas such as these inch us towards a deeper understanding.

Image: Representation of consciousness from the seventeenth century. Robert Fludd, Utriusque cosmi maioris scilicet et minoris […] historia, tomus II (1619), tractatus I, sectio I, liber X, De triplici animae in corpore visione. Courtesy: Wikipedia. Public Domain.


Five Tips For Re-Learning How to Walk


It seems that the aimless walk to clear one’s mind has become a rarity. So too the gentle stroll to ponder and think. Purposeless walking, it seems, is a dying art. Indeed many in the West are so pampered with transportation alternatives, and so (self-)limited in time, that walking has become an indulgence — who can afford to walk any more when driving or taking the bus or the train can save so much time (and energy)? Moreover, when we do walk, we’re firmly hunched over our smartphones, entranced by cyberspace and its virtual acknowledgments and affirmations, and thoroughly unaware of our surroundings.


Yet keep in mind that many of our revered artists, photographers, authors and philosophers were great walkers. They used the walk to sense and think. In fact, studies find a link between walking and creativity.

So, without further ado, I present 5 tips to help you revive an endangered pastime:

#1. Ditch the smartphone and any other mobile device.

#2. Find a treasured place to walk. Stomping to the nearest pub or 7-Eleven does not count.

#3. Pay attention to your surroundings and walk mindfully. Observe the world around you. This goes back to #1.

#4. Take off the headphones, take out the earbuds and leave your soundtrack at home. Listen to the world around you.

#5. Leave the partner, friend and dog (or other walking companion) at home. Walk alone.

From the BBC:

A number of recent books have lauded the connection between walking – just for its own sake – and thinking. But are people losing their love of the purposeless walk?

Walking is a luxury in the West. Very few people, particularly in cities, are obliged to do much of it at all. Cars, bicycles, buses, trams, and trains all beckon.

Instead, walking for any distance is usually a planned leisure activity. Or a health aid. Something to help people lose weight. Or keep their fitness. But there’s something else people get from choosing to walk. A place to think.

Wordsworth was a walker. His work is inextricably bound up with tramping in the Lake District. Drinking in the stark beauty. Getting lost in his thoughts.

Charles Dickens was a walker. He could easily rack up 20 miles, often at night. You can almost smell London’s atmosphere in his prose. Virginia Woolf walked for inspiration. She walked out from her home at Rodmell in the South Downs. She wandered through London’s parks.

Henry David Thoreau, who was both author and naturalist, walked and walked and walked. But even he couldn’t match the feat of someone like Constantin Brancusi, the sculptor who walked much of the way between his home village in Romania and Paris. Or indeed Patrick Leigh Fermor, whose walk from the Hook of Holland to Istanbul at the age of 18 inspired several volumes of travel writing. George Orwell, Thomas De Quincey, Nassim Nicholas Taleb, Friedrich Nietzsche, Bruce Chatwin, WG Sebald and Vladimir Nabokov are just some of the others who have written about it.

Read the entire article here.

Images courtesy of Google Search: Walking with smartphone. Walking in nature (my preference).


Search and the Invisible Hand of Bias


I’ve written about the online filter bubble for a while now. It’s an insidious and disturbing consequence of our online world. It refers to the phenomenon whereby our profile, personal preferences, history and connections pre-select and filter the type of content that reaches us, eliminating things we don’t need to see. The filter bubble reduces our exposure to the wider world of information and serendipitous discovery.

If this were not bad enough, the online world enables a much more dangerous threat — one of hidden bias through explicit manipulation. We’re all familiar with the pull and push exerted by the constant bombardment of overt advertising. We’re also familiar with more subtle techniques of ambient and subliminal control, which aim to sway our minds without our conscious awareness — think mood music in your grocery store (it really does work).

So, now comes another more subtle form of manipulation, but with more powerful results, and it’s tied to search engines and the central role these tools play in our daily lives.

Online search engines, such as Google, know you. They know your eye movements and your click habits; they know your proclivity to select a search result near the top of the first search engine results page (SERP). Advertisers part with a fortune each day with the goal of appearing in this sweet spot on a SERP. This is a tried and tested process — higher ranking on a SERP leads to more clicks and shifts more product.

Google and many other search engines will list a handful of sponsored results at the top of a SERP, followed by the organic results, listed in the order that best fits your search query. Your expectation is that these results are tailored to your query but are unbiased. That’s the key.

New research shows that users believe these SERP results to be unbiased, even when they have been manipulated behind the scenes. Moreover, the manipulated results can greatly sway opinion. The phenomenon now comes with a name: the search engine manipulation effect, or SEME (pronounced “seem”).
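The leverage here comes from how steeply click-through rates fall with rank. A toy model makes the point (the 1/rank decay curve is a crude Zipf-like assumption of mine, not real SERP data or Epstein's methodology):

```python
# Toy model of rank bias on a search results page. The 1/rank
# click-through curve is a rough illustrative assumption.
ctr = [1.0 / rank for rank in range(1, 11)]   # ranks 1..10
total = sum(ctr)
share = [c / total for c in ctr]              # fraction of clicks per rank

top3 = sum(share[:3])
print(f"Top three results capture ~{top3:.0%} of clicks")

# Quietly promoting a favored result from rank 8 to rank 1 multiplies
# its exposure without the user noticing anything amiss.
boost = share[0] / share[7]
print(f"Rank 8 -> rank 1: ~{boost:.0f}x more clicks")
```

Under this curve the top three slots soak up well over half of all clicks, which is exactly why a silent reordering of "unbiased" results is such a powerful lever on opinion.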

In the wrong hands — government overlords or technology oligarchs — this heralds a disturbing possible (and probable) future, already underway in countries with tightly controlled media and flows of information.

Check out a detailed essay on SEME by Robert Epstein here. Epstein is an author and research psychologist at the American Institute for Behavioral Research and Technology in California.

Finally, if you’re interested in using an alternative search engine that’s less interested in taking over the world, check out DuckDuckGo.

Image courtesy of DuckDuckGo.


Pokémon Go and the Post-Apocalyptic Future Is Nigh


Some have lauded Pokémon Go as the next great health and fitness enabler since the “invention” of running. After all, over the span of just a few days it has forced half of Western civilization to unplug from Netflix, get off the couch and move around, and to do so outside!

The cynic in me perceives deeper, darker motives at play: a plot by North Korea to distract the West while it prepares a preemptive nuclear strike; a corporate sponsored youth brain-washing program; an exquisitely orchestrated, self-perpetuated genocidal time-bomb wrought by shady political operatives; a Google inspired initiative to tackle the obesity epidemic.

While the true nature of this elegantly devious phenomenon unfolds over the long-term — and maintains the collective attention of tens of millions of teens and millennials in the process — I will make a dozen bold, short-term predictions:

  1. A legendary Pokémon, such as Mewtwo, will show up at the Republican National Convention in Cleveland, and it will be promptly shot by open carry fanatics.
  2. The first Pokémon Go fatality will occur by July 31, 2016 — a player will inadvertently step into traffic while trying to throw a Poké Ball.
  3. The hundredth Pokémon Go fatality will occur on August 1, 2016 — the 49th player to fall into a sewer and drown.
  4. Sales of comfortable running shoes will skyrocket over the next 3 days, as the West discovers walking.
  5. Evangelical mega-churches in the US will hack the game to ensure Pokémon characters appear during revivals to draw more potential customers.
  6. Pokémon characters will begin showing up on Fox News and the Supreme Court.
  7. Tinder will file for chapter 11 bankruptcy and emerge as a Pokémon dating site.
  8. Gyms and stadia around the country will ditch all sporting events to make way for mass Pokémon hunts; NFL’s next expansion team will be virtual and led by Pikachu as quarterback.
  9. The Pokémon Company, Nintendo and Niantic Labs will join forces to purchase Japan by year’s end.
  10. Google and Tesla will team up to deliver Poké Spot in-car navigation allowing players to automatically drive to Pokémon locations.
  11. Donald Trump will assume the office of Pokémon President of the United States on January 20, 2017; 18-35-year-olds forgot to vote.
  12. World ends, January 21, 2017.

If you’re one of the few earthlings wondering what Pokémon Go is all about, and how in the space of just a few days our neighborhoods have become overrun by zombie-like players, look no further than the WSJ. Rupert Murdoch must be a fan.

Image courtesy of Google Search.


Steps of Life


Are you adolescent or middle-aged? Are you on life’s upwardly mobile journey towards the peak years (whatever these may be) or are you spiraling downwards in terminal decline?

The stages of life — from childhood to death — may be the simplistic invention of ancient scholars who sought a way to classify and explain the human condition, but over hundreds of years authors and artists have continued to be drawn to the subject. Our contemporary demographers and market researchers are just the latest in a long line of those who seek to explain, and now monetize, particular groups by age.

So, if you’re fascinated by this somewhat arbitrary chronological classification system the Public Domain Review has a treat. They’ve assembled a fine collection of images from the last five hundred years that depict the different ages of man and woman.

A common representation shows the ages ascending a series of steps from infancy to a peak, and then descending towards old age, senility and death. The image above is a particularly wonderful example of the genre, and while the ages are noted in French, the categories are not difficult to decipher:

20 years: “Jeunesse”

40 years: “Age de discretion”

50 years: “Age de Maturité”

90 years: “Age de decrépitude”

Image: “Le cours de la vie de l’homme dans ses différents âges”. Early 19th-century print showing stages of life at ten year intervals from 10-90 years as ascending and then descending steps. Courtesy: Wikipedia. Public Domain.


The Transit of Dione


This gorgeous image of Saturn’s moon Dione as it transits the gas giant was snapped by the Cassini spacecraft on May 27, 2015. For more beautiful views of the stunning ringed planet and its curious moons visit NASA’s Cassini mission site.

Image: Saturn’s moon Dione transits the giant ringed planet. Courtesy: NASA/JPL-Caltech/Space Science Institute in Boulder, Colorado.


Hoverboard or Jet Pack With That Martini?

Personally, I’m still waiting for the advent of Star Trek-like teleportation to get me from point A to point B.

But, in the meantime, and Hyperloop notwithstanding, I’ll go for the hoverboard. It looks like a much more technically finessed product than James Bond’s jet pack.

Check out this Wired report on a recent record-setting hoverboard adventure around and over Sausset-les-Pins, near Marseille, France.

Video: Franky Zapata set a new record for the farthest hoverboard flight. Courtesy: Guinness World Records.


As Clear As Black and White


The terrible tragedy that is wrought by guns in the United States continues unabated. And it’s even more tragic when elements of our police forces fuel the unending violence, more often than not enabled by racism. The governor of Minnesota, Mark Dayton, put it quite starkly yesterday, following the fatal shooting on July 6, 2016 of Philando Castile, a resident of Falcon Heights who had been pulled over for a broken tail-light.

Just one day earlier, police officers in Baton Rouge, Louisiana shot and killed Alton Sterling.


And, today we hear that the cycle of mistrust, hatred and deadly violence — courtesy of guns — has come full circle: a racist sniper (or snipers) apparently targeted and murdered five white police officers in Dallas, Texas on July 7, 2016.

Images: Screenshots courtesy of Washington Post and WSJ, respectively.


Are You Monotasking or Just Paying Attention?

We have indeed reached the era of peak multi-tasking. It’s time to select a different corporate meme.

Study after recent study shows that multi-tasking is an illusion — we can’t perform two or more cognitive tasks in parallel, at the same time. Rather, we timeshare: shifting our attention from one task to another sequentially. These studies also show that dividing our attention in this way tends to have a deleterious effect on all of the tasks. I say cognitive tasks because it’s rather obvious that we can all perform some tasks at the same time: walk and chew gum (or thumb a smartphone); drive and sing; shower and think; read and eat. But all of these combinations require that one of the tasks is mostly autonomic. That is, we perform one of them without conscious effort.
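The timesharing point can be made concrete with a toy cost model (every number below is invented purely for illustration):

```python
# Toy model of "multitasking" as sequential timesharing with a small
# refocusing penalty paid on every switch between tasks.
task_minutes = 30        # two tasks needing 30 focused minutes each
switch_cost = 0.5        # minutes lost regaining focus after a switch
slice_minutes = 5        # interleaved work happens in 5-minute slices

# Monotasking: finish task A, switch once, finish task B.
mono_total = 2 * task_minutes + 1 * switch_cost

# "Multitasking": alternate 5-minute slices between the two tasks,
# paying the refocusing cost at every one of the intermediate switches.
slices = (2 * task_minutes) // slice_minutes   # 12 slices in total
multi_total = 2 * task_minutes + (slices - 1) * switch_cost

print(f"monotasking : {mono_total} min")
print(f"multitasking: {multi_total} min")
```

The same hour of work simply takes longer when it is sliced up, and real switch costs compound further because errors rise with each interruption.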

Yet more social scientists have determined that multi-tasking is a fraud — perhaps perpetuated by corporate industrial engineers convinced that they can wring more hours of work from you.

What are we to do now, having learned that our super-efficient world of juggling numerous tasks at the “same time” is nothing but a mirage?

Well, observers of the fragile human condition have not rested. This time social scientists have discovered an amazing human talent. And they’ve coined a mesmerizing new term, known as monotasking. In some circles it’s called uni-tasking or single-tasking.

When I was growing up this was called “paying attention”.

But, this being the era of self-help-life-experience-consulting gone mad and sub-minute attention spans (fueled by multi-tasking), we can now all eagerly await the rise of an entirely new industry dedicated to this wonderful monotasking breakthrough. Expect a whole host of monotasking books, buzzworthy news articles, daytime TV shows with monotasking tips, and personal coaching experts at TED events armed with “look what monotasking can do for you” PowerPoint decks.

Personally, I will quietly retreat, and return to old-school staying focused, and remind my kids to do the same.

From NYT:

Stop what you’re doing.

Well, keep reading. Just stop everything else that you’re doing.

Mute your music. Turn off your television. Put down your sandwich and ignore that text message. While you’re at it, put your phone away entirely. (Unless you’re reading this on your phone. In which case, don’t. But the other rules still apply.)

Just read.

You are now monotasking.

Maybe this doesn’t feel like a big deal. Doing one thing at a time isn’t a new idea.

Indeed, multitasking, that bulwark of anemic résumés everywhere, has come under fire in recent years. A 2014 study in the Journal of Experimental Psychology found that interruptions as brief as two to three seconds — which is to say, less than the amount of time it would take you to toggle from this article to your email and back again — were enough to double the number of errors participants made in an assigned task.

Earlier research out of Stanford revealed that self-identified “high media multitaskers” are actually more easily distracted than those who limit their time toggling.

So, in layman’s terms, by doing more you’re getting less done.

But monotasking, also referred to as single-tasking or unitasking, isn’t just about getting things done.

Not the same as mindfulness, which focuses on emotional awareness, monotasking is a 21st-century term for what your high school English teacher probably just called “paying attention.”

“It’s a digital literacy skill,” said Manoush Zomorodi, the host and managing editor of WNYC Studios’ “Note to Self” podcast, which recently offered a weeklong interactive series called Infomagical, addressing the effects of information overload. “Our gadgets and all the things we look at on them are designed to not let us single-task. We weren’t talking about this before because we simply weren’t as distracted.”

Ms. Zomorodi prefers the term “single-tasking”: “ ‘Monotasking’ seemed boring to me. It sounds like ‘monotonous.’ ”

Kelly McGonigal, a psychologist, lecturer at Stanford and the author of “The Willpower Instinct,” believes that monotasking is “something that needs to be practiced.” She said: “It’s an important ability and a form of self-awareness as opposed to a cognitive limitation.”

Read the entire article here.

Image courtesy of Google Search.

Send to Kindle

The Collapsing Wave Function

Schrodinger-equation

Once in a while I have to delve into the esoteric world of quantum mechanics. So, you will have to forgive me.

Since it was formalized in the mid-1920s, QM has been extremely successful at describing the behavior of systems at the atomic scale. Two giants of the field, Niels Bohr and Werner Heisenberg, hammered out its conceptual framework in 1927. Their view has since become known as the Copenhagen interpretation, and has been widely and accurately used to predict and describe the workings of elementary particles and the forces between them.

Yet recent theoretical stirrings in the field threaten to turn this widely held and accepted framework on its head. The Copenhagen Interpretation holds that particles do not have definitive locations until they are observed. Rather, their positions and movements are defined by a wave function that describes a spectrum of probabilities, but no certainties.

Rather understandably, this probabilistic description of our microscopic world tends to unnerve those who seek a more solid view of what we actually observe. Enter Bohmian mechanics or, more correctly, the de Broglie-Bohm theory of quantum mechanics. A growing number of present-day researchers and theorists are revisiting this theory, which may yet hold some promise.

From Wired:

Of the many counterintuitive features of quantum mechanics, perhaps the most challenging to our notions of common sense is that particles do not have locations until they are observed. This is exactly what the standard view of quantum mechanics, often called the Copenhagen interpretation, asks us to believe.

But there’s another view—one that’s been around for almost a century—in which particles really do have precise positions at all times. This alternative view, known as pilot-wave theory or Bohmian mechanics, never became as popular as the Copenhagen view, in part because Bohmian mechanics implies that the world must be strange in other ways. In particular, a 1992 study claimed to crystalize certain bizarre consequences of Bohmian mechanics and in doing so deal it a fatal conceptual blow. The authors of that paper concluded that a particle following the laws of Bohmian mechanics would end up taking a trajectory that was so unphysical—even by the warped standards of quantum theory—that they described it as “surreal.”

Nearly a quarter-century later, a group of scientists has carried out an experiment in a Toronto laboratory that aims to test this idea. And if their results, first reported earlier this year, hold up to scrutiny, the Bohmian view of quantum mechanics—less fuzzy but in some ways more strange than the traditional view—may be poised for a comeback.

As with the Copenhagen view, there’s a wave function governed by the Schrödinger equation. In addition, every particle has an actual, definite location, even when it’s not being observed. Changes in the positions of the particles are given by another equation, known as the “pilot wave” equation (or “guiding equation”). The theory is fully deterministic; if you know the initial state of a system, and you’ve got the wave function, you can calculate where each particle will end up.

That may sound like a throwback to classical mechanics, but there’s a crucial difference. Classical mechanics is purely “local”—stuff can affect other stuff only if it is adjacent to it (or via the influence of some kind of field, like an electric field, which can send impulses no faster than the speed of light). Quantum mechanics, in contrast, is inherently nonlocal. The best-known example of a nonlocal effect—one that Einstein himself considered, back in the 1930s—is when a pair of particles are connected in such a way that a measurement of one particle appears to affect the state of another, distant particle. The idea was ridiculed by Einstein as “spooky action at a distance.” But hundreds of experiments, beginning in the 1980s, have confirmed that this spooky action is a very real characteristic of our universe.
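For readers who want the mathematics behind the excerpt: in Bohmian mechanics the wave function still evolves by the Schrödinger equation, while the particle positions follow a separate guiding equation. A standard textbook sketch (my summary, not from the Wired article) looks like this:

```latex
% Schrödinger equation for the wave function \psi of N particles
i\hbar \frac{\partial \psi}{\partial t}
  = -\sum_{k=1}^{N} \frac{\hbar^2}{2m_k} \nabla_k^2 \psi + V\psi

% Guiding ("pilot wave") equation for the actual particle positions Q_k
\frac{dQ_k}{dt}
  = \frac{\hbar}{m_k}\,
    \mathrm{Im}\!\left(\frac{\nabla_k \psi}{\psi}\right)
    \Bigg|_{(Q_1,\dots,Q_N)}
```

Given an initial wave function and initial positions, both equations evolve deterministically; the familiar quantum probabilities arise only from our uncertainty about those initial positions.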

Read the entire article here.

Image: Schrödinger’s time-dependent equation. Courtesy: Wikipedia.


Send to Kindle

Juno on the 4th of July

Jupiter and Ganymede

Perhaps not coincidentally, NASA’s latest foray into the great beyond reached a key milestone today. The Juno spacecraft entered orbit around the gas giant Jupiter on the 4th of July, 2016.

NASA is still awaiting all the cool science (and image-capture) to begin. So, in the meantime I'm posting a gorgeous picture of Jupiter taken by the Hubble Space Telescope.

Image: Jupiter and Ganymede, taken April 9, 2007. Courtesy: NASA, ESA, and E. Karkoschka (University of Arizona).

Send to Kindle

Achieving Failure

Our society values success.

Our work environments value triumphing over the competition. We look to our investments to beat the market. We support our favorite teams, but adore them when they trounce their rivals. Our schools and colleges (mostly) help educate our children, but do so in a way that rewards success — good grades, good test scores and good behavior (as in, same as everyone else). We continually reward our kids for success on a task, at school, with a team.

Yet all of us know, in our hearts and in the backs of our minds, that the most important lessons and trials stem from failure, not success. From failure we learn to persevere, we learn to change and adapt, we learn to overcome. From failure we learn to avoid obstacles or tackle them head-on; we learn to reassess and reevaluate. We evolve from our failures.

So this raises the question: why are so many of our processes and systems geared solely to rewarding and reinforcing success?

From NPR:

Is failure a positive opportunity to learn and grow, or is it a negative experience that hinders success? How parents answer that question has a big influence on how much children think they can improve their intelligence through hard work, a study says.

“Parents are a really critical force in child development when you think about how motivation and mindsets develop,” says Kyla Haimovitz, a professor of psychology at Stanford University. She coauthored the study, published in Psychological Science with colleague Carol Dweck, who pioneered research on mindsets. “Parents have this powerful effect really early on and throughout childhood to send messages about what is failure, how to respond to it.”

Although there’s been a lot of research on how these forces play out, relatively little looks at what parents can do to motivate their kids in school, Haimovitz says. This study begins filling that gap.

“There is a fair amount of evidence showing that when children view their abilities as more malleable and something they can change over time, then they deal with obstacles in a more constructive way,” says Gail Heyman, a professor of psychology at the University of California at San Diego who was not involved in this study.

But communicating that message to children is not simple.

“Parents need to represent this to their kids in the ways they react about their kids’ failures and setbacks,” Haimovitz says. “We need to really think about what’s visible to the other person, what message I’m sending in terms of my words and my deeds.”

In other words, if a child comes home with a D on a math test, how a parent responds will influence how the child perceives their own ability to learn math. Even a well-intentioned, comforting response of “It’s OK, you’re still a great writer” may send the message that it’s time to give up on math rather than learn from the problems they got wrong, Haimovitz explains.

Read the entire story here.

Send to Kindle

What Keeps NASA Going?

Apollo 17 Commander Gene Cernan on lunar rover

Apollo astronaut Eugene Cernan is the last human to have set foot on a world other than Earth. It's been 44 years since he stepped off the moon. In fact, in 1972 he drove around using the lunar rover and found time to scribble his daughter's initials on the dusty lunar surface. So, other than forays to the International Space Station (ISS) and trips to service the Hubble Space Telescope (HST), NASA has kept humans firmly rooted to the home planet.

Of course, in the intervening decades the space agency has not rested on its laurels. NASA has sent probes and robots all over the Solar System and beyond: Voyager to the gas giants and on to interstellar space; Dawn to visit asteroids; Rosetta (in concert with the European Space Agency) to visit a comet; SOHO and its countless cousins to keep an eye on our home star; Galileo and Pioneer to Jupiter; countless spacecraft, including the Curiosity rover, to Mars; Messenger to map Mercury; Magellan to probe the clouds of Venus; Cassini to survey Saturn and its fascinating moons; and of course, New Horizons to Pluto and beyond.

Hubble image of spiral galaxy NGC 6814, a Seyfert galaxy whose luminous, highly variable nucleus is suspected to host a supermassive black hole of about 18 million solar masses

Our mechanical human proxies reach out a little farther each day to learn more about our universe and our place in it. Exploration and discovery is part of our human DNA; it’s what we do. NASA is our vehicle. So, it’s good to see what NASA is planning. The agency just funded eight advanced-technology programs that officials believe may help transform space exploration. The grants are part of the NASA Innovative Advanced Concepts (NIAC) program. The most interesting, perhaps, are a program to evaluate inducing hibernation in Mars-bound astronauts, and an assessment of directed energy propulsion for interstellar travel.

Our science and technology look more and more like science fiction each day.

Read more about NIAC programs here.

Image 1: Apollo 17 mission commander Eugene A. Cernan makes a short checkout of the Lunar Roving Vehicle during the early part of the first Apollo 17 extravehicular activity at the Taurus-Littrow landing site. Courtesy: NASA.

Image 2: Hubble Spies a Spiral Snowflake, galaxy NGC 6814. Courtesy: NASA/ESA Hubble Space Telescope.

Send to Kindle