
Post*Factua!ly Speaking – Inauguration Day

postfactually-screenshot

In keeping with today’s historic (and peaceful) transition of power in the United States — I’m taking time to celebrate the inauguration of… Post*factua!ly.

Post*factua!ly is my new social art project aimed at collecting lies, sharing misquotes and debunking facts. How timely, right?

We’ve entered a new age where lies matter and fact is meaningless. As a result Post*factua!ly aims to become a community focal point — with an artistic slant — for fibs, lies, falsehoods, deceit, half-truths, fabrications, bluffing, disinformation, misinformation, untruth, truthiness, post-truth, post-fact, and other stuff that’s just not real (or perhaps it is).

Post*factua!ly will formally open its doors by early February. So, in the meantime, if you wish to join the community please visit this link. And thanks for giving the world of post-fact and truthiness a chance.

Send to Kindle

What Up With That: Nationalism

The recent political earthquake in the US is just one example of a nationalistic wave that swept across Western democracies in 2015-2016. The election in the US seemed to surprise many political talking heads, since the nation was, and still is, on a continuing path toward greater liberalism (mostly due to demographics).

So, what exactly is up with that? Can American liberals enter a coma for the next 4 years, sure to awaken refreshed and ready for a new left-of-center regime? Or, is the current nationalistic mood — albeit courtesy of a large minority — likely to prevail for a while longer? Well, there’s no clear answer, and political scientists and researchers are baffled.

Care to learn more about theories of nationalism and the historical underpinnings of nationalism? Visit my reading list over at Goodreads. But make sure you start with: Imagined Communities: Reflections on the Origin and Spread of Nationalism by Benedict Anderson. It’s been the global masterwork on the analysis of nationalism since it was first published in 1983.

I tend to agree with Anderson’s thesis, that a nation is mostly a collective figment of people’s imagination facilitated by modern communications networks. So, I have to believe that eventually our networks will help us overcome the false strictures of our many national walls and borders.

From Scientific American:

Waves of nationalist sentiment are reshaping the politics of Western democracies in unexpected ways — carrying Donald Trump to a surprise victory last month in the US presidential election, and pushing the United Kingdom to vote in June to exit the European Union. And nationalist parties are rising in popularity across Europe.

Many economists see this political shift as a consequence of globalization and technological innovation over the past quarter of a century, which have eliminated many jobs in the West. And political scientists are tracing the influence of cultural tensions arising from immigration and from ethnic, racial and sexual diversity. But researchers are struggling to understand why these disparate forces have combined to drive an unpredictable brand of populist politics.

“We have to start worrying about the stability of our democracies,” says Yascha Mounk, a political scientist at Harvard University in Cambridge, Massachusetts. He notes that the long-running World Values Survey shows that people are increasingly disaffected with their governments — and more willing to support authoritarian leaders.

Some academics have explored potential parallels between the roots of the current global political shift and the rise of populism during the Great Depression, including in Nazi Germany. But Helmut Anheier, president of the Hertie School of Governance in Berlin, cautions that the economic struggles of middle-class citizens across the West today are very different, particularly in mainland Europe.

The Nazis took advantage of the extreme economic hardship that followed the First World War and a global depression, but today’s populist movements are growing powerful in wealthy European countries with strong social programmes. “What brings about a right-wing movement when there are no good reasons for it?” Anheier asks.

In the United States, some have suggested that racism motivated a significant number of Trump voters. But that is too simplistic an explanation, says Theda Skocpol, a sociologist at Harvard University. “Trump dominated the news for more than a year, and did so with provocative statements that were meant to exacerbate every tension in the US,” she says.

Read the entire story here.

p.s. What Up With That is my homage to the recurring Saturday Night Live (SNL) sketch of the same name.

Send to Kindle

Reliving the Titan Descent

A couple of days ago NASA released this gorgeous video constructed from real images taken by the Huygens lander. It revisits Huygens’ successful landing on Titan, Saturn’s largest moon, just over 12 years ago, on January 14, 2005.

Huygens made up half of Cassini-Huygens, the joint NASA-ESA (European Space Agency) mission to investigate Saturn and its strange moons. Cassini is still in close orbit around Saturn. To date the mission remains the first to successfully land on a moon beyond Earth’s own.

Video: This movie was built thanks to the data collected by ESA’s Huygens Descent Imager/Spectral Radiometer (DISR) on 14 January 2005, during the 147-minute plunge through Titan’s thick orange-brown atmosphere to a soft sandy riverbed. In 4 minutes 40 seconds, the movie shows what the probe ‘saw’ within the few hours of the descent and the eventual landing. Courtesy: NASA/ESA.

Send to Kindle

MondayMap: The Feds Own 84.5 Percent of Nevada

map-federal_lands

The United States government owns just over one-quarter (28 percent) of the entire nation’s land. Through various agencies that include the United States Forest Service, the National Park Service, the Bureau of Land Management, and the Fish and Wildlife Service, the total owned “by the people, for the people” comes to a staggering 640 million acres of land.

Perhaps not surprisingly, most of the federally owned land lies in the Rocky Mountains and further west. In fact, the US government owns 47 percent of the land in the western states, versus just 4 percent in states east of the Rockies.

More from Frank Jacobs over at Strange Maps:

The rough beauty of the American West seems as far as you can get from the polished corridors of power in Washington DC. Until you look at the title to the land. The federal government owns large tracts of the western states: from a low of 29.9% in Montana, already more than the national average, up to a whopping 84.5% in Nevada.

What is all that federal land for? And exactly who is in charge? According to the Congressional Research Service, a total area of just under 610 million acres – more than twice the size of Namibia – is administered by no more than 4 federal government agencies:

* The United States Forest Service (USFS), which oversees timber harvesting, recreation, wildlife habitat protection and other sustainable uses on a total of 193 million acres – almost the size of Turkey – mainly designated as National Forests.

* The National Park Service (NPS) conserves lands and resources on 80 million acres – a Norway-sized area – in order to preserve them for the public. Any harvesting or resource removal is generally prohibited.

* The Bureau of Land Management (BLM), managing 248 million acres – an area the size of Egypt – has a multiple-use, sustained-yield mandate, supporting energy development, recreation, grazing, conservation, and other uses.

* The Fish and Wildlife Service (FWS) manages 89 million acres – an area slightly bigger than Germany – to conserve and protect animal and plant species.

Check out the entire story here.

Image: Federal Real Property Profile. Courtesy: U.S. General Services Administration / ‘Can the West Lead Us To A Better Place?’, an article in Stanford Magazine.

Send to Kindle

Consumerism Gone Utterly Utterly Mad

amazon-patent-afc

I’m not sure whether to love or hate Amazon (the online retailer). I love the one-click convenience and the mall-less shopping experience. But, Amazon’s lengthy tentacles are increasingly encroaching into every aspect of our lives. Its avaricious quest to “serve the customer” has me scared.

I don’t want Amazon to be the sole source for everything that I eat, wear and use. I don’t want Amazon to run the world’s computing infrastructure. I don’t want Amazon making and peddling movies. I don’t want Amazon tech eavesdropping on my household conversations. I don’t want Amazon owning telecommunications and fiber infrastructure, nor do I want it making phones. I don’t wish to live in a nation that has to all intents and purposes become a giant, nationwide Amazon warehouse. And this leads me to the company’s latest crazy idea.

The company was granted patent #9,305,280 in April 2016 for an “airborne fulfillment center utilizing unmanned aerial vehicles for item delivery“. You got it: a flying warehouse stocked full of goodies hovering over your neighborhood, armed and ready to launch your favorite washing detergent, a pair of Zappos shoes, diapers and a salami to your doorstep via missile drone.

Apparently the proposed airborne fulfillment center (AFC) “may be an airship that remains at a high altitude (e.g., 45,000 feet)”. Not surprisingly, the AFC mothership will use unmanned aerial vehicles (UAV) — drones — “to deliver ordered items to user designated delivery locations”. But, in addition, the patent filing suggests that “shuttles (smaller airships) may be used to replenish the AFC with inventory, UAVs, supplies, fuel, etc. Likewise, the shuttles may be utilized to transport workers to and from the AFC”. The proposed airship will also deliver customized airborne advertising tied to its inventory enabling on-the-fly (pun intended) product promotions and fulfillment.

As Annalee Newitz, Tech Culture Editor, over at ars technica remarks, “sounds like something out of a Philip K. Dick novel“. Yes, and while Dick’s many novels were gloriously imagined, we don’t necessarily need them to enter the real world. Please let our androids continue dreaming (of electric sheep).

Image: Figure 2 from Amazon’s patent for an airborne fulfillment center utilizing unmanned aerial vehicles for item delivery. US patent #9305280. Courtesy: USPTO. Public Domain.

Send to Kindle

Heroes Only Die at the Top of Hills

google-search-heroes-comic

We all need heroes. So, if you wish to become one, you would stand a better chance if you took your dying breaths atop a hill. Also, it would really help your cause if you arrived via virgin birth.

Accordingly, please refer to the Rank-Raglan Mythotype — a list of 22 universal archetypes that are prerequisites to becoming a hero of mythological proportions (far beyond being a YouTube sensation):

  1. Hero’s mother is a royal virgin;
  2. His father is a king, and
  3. Often a near relative of his mother, but
  4. The circumstances of his conception are unusual, and
  5. He is also reputed to be the son of a god.
  6. At birth an attempt is made, usually by his father or his maternal grandfather, to kill him, but
  7. He is spirited away, and
  8. Reared by foster-parents in a far country.
  9. We are told nothing of his childhood, but
  10. On reaching manhood he returns or goes to his future kingdom.
  11. After a victory over the king and/or a giant, dragon, or wild beast,
  12. He marries a princess, often the daughter of his predecessor and
  13. Becomes king.
  14. For a time he reigns uneventfully and
  15. Prescribes laws, but
  16. Later he loses favor with the gods and/or his subjects, and
  17. Is driven from the throne and city, after which
  18. He meets with a mysterious death,
  19. Often at the top of a hill,
  20. His children, if any, do not succeed him.
  21. His body is not buried, but nevertheless
  22. He has one or more holy sepulchres.

By far the most heroic fit to date is Mithradates the Great, with 22 out of a possible 22 cross-cultural traits. Jesus comes in with a score of 18-20 (depending on interpretation) out of 22, beaten by Krishna with 21, while Robin Hood only manages a paltry 13. Interestingly, Buddha collects 15 points, followed closely by Czar Nicholas II with 14.
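For the curious, the scoring above is just trait-counting, and can be sketched in a few lines of Python (the example trait assignment below is hypothetical, not a scholarly tally):

```python
# A minimal sketch of Rank-Raglan scoring: a figure's hero score is simply
# the number of the 22 archetypal traits they satisfy.

RAGLAN_TRAITS = 22

def raglan_score(traits_satisfied):
    """Count how many of the 22 traits a figure meets."""
    return sum(1 for satisfied in traits_satisfied if satisfied)

# Hypothetical figure meeting only traits 1, 5, 18, 19 and 22:
example = [n in {1, 5, 18, 19, 22} for n in range(1, RAGLAN_TRAITS + 1)]
print(raglan_score(example))  # 5
```

A perfect Mithradates-style fit would score 22; real assessments, of course, hinge on how generously each trait is interpreted.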

The mythotype comes from the book The Hero: A Study in Tradition, Myth and Drama by Lord Raglan.

List courtesy of Professor Thomas J. Sienkewicz, Monmouth College, Monmouth, Illinois. It is based upon material used in his mythology classes for many years, first at Howard University in Washington, D.C., and then at Monmouth College in Monmouth, Illinois.

Image courtesy of Google Search.

Send to Kindle

The Golden Age of TV: Trailer Park Boys

tpb-screenshot

I have noticed that critics of our pop culture seem to agree that we are in a second golden age of television in the United States (and elsewhere). It’s a period beginning in the late 1990s, and stretching to the present day, marked by the production of a significant number of critically and internationally acclaimed programs. The original golden age of television spanned the late 1940s and early 1950s (e.g., Kraft Television Theater, Four Star Playhouse, The Clock, Alfred Hitchcock Presents).

I’m not much of a TV watcher, so my credentials are somewhat dubious. But I must weigh in to set the record straight on our current golden age. To be precise, it began in Canada on April 22, 2001, and, after a fashion, continues to this day.

You see, on April 22, 2001, the CBC (the Canadian Broadcasting Corporation) aired “Take Your Little Gun and Get Out of My Trailer Park“, the first episode of the first season of Trailer Park Boys.

google-search-tpb-memes

I first stumbled across Trailer Park Boys on BBC America while channel surfing in 2004 (I know, 3 years late!). Unfamiliar? Trailer Park Boys (TPB) is a now-legendary Canadian mockumentary comedy chronicling the (mis-)adventures of Julian, Ricky and Bubbles, and other colorful residents of the fictitious Sunnyvale Trailer Park in Nova Scotia. The show, now in its 11th season, is a booze- and pot-fueled catalog of vulgar, outrageous, hare-brained silliness.

I love it. To date I have never laughed so much while watching TV. Luckily for me, and other fans, the show and related movies are now available on Netflix.

So, long may the real golden age of TV continue, complete with Bubbles’ kitties, Julian and Ricky’s get-rich-quick schemes, Randy’s stomach, Mr. Lahey, Cyrus the nutter, J-Roc, rum-and-coke, Tyrone, Lucy, Officer Green, Trinity, shopping carts and the rest of the madcap bunch.

Image 1: Trailer Park Boys screenshot. Courtesy of Swearnet.

Image 2 courtesy of Google Search.

Send to Kindle

Hate Work Email? Become a French Citizen

google-search-work-stress

Many non-French cultures admire the French. They live in a gorgeous country with a rich history, and, besides, it’s crammed with sumptuous food and wine. Perhaps as a result, the French seem to have a very firm understanding of the so-called work-life balance. They’re often characterized as a people who work to live, rather than their earnest Anglo-Saxon cousins who generally live to work. While these may be over-generalized aphorisms, a new French law highlights the gulf between the employee rights of the French and those of other, more corporate-friendly nations.

Yes, as of January 1, 2017, an employee of a French company with more than 50 staff has the legal right to ignore work-related emails, and other communications, outside of regular working hours.

Vive la France! More on this “right to disconnect” law here.

From the Guardian:

From Sunday [January 1, 2017], French companies will be required to guarantee their employees a “right to disconnect” from technology as the country seeks to tackle the modern-day scourge of compulsive out-of-hours email checking.

On 1 January, an employment law will enter into force that obliges organisations with more than 50 workers to start negotiations to define the rights of employees to ignore their smartphones.

Overuse of digital devices has been blamed for everything from burnout to sleeplessness as well as relationship problems, with many employees uncertain of when they can switch off.

The measure is intended to tackle the so-called “always-on” work culture that has led to a surge in usually unpaid overtime – while also giving employees flexibility to work outside the office.

“There’s a real expectation that companies will seize on the ‘right to disconnect’ as a protective measure,” said Xavier Zunigo, a French workplace expert, as a new survey on the subject was published in October.

“At the same time, workers don’t want to lose the autonomy and flexibility that digital devices give them,” added Zunigo, who is an academic and director of research group Aristat.

The measure was introduced by labour minister Myriam El Khomri, who commissioned a report submitted in September 2015 which warned about the health impact of “info-obesity” which afflicts many workplaces.

Under the new law, companies will be obliged to negotiate with employees to agree on their rights to switch off and ways they can reduce the intrusion of work into their private lives.

Read the entire article here.

Image courtesy of Google Search.

Send to Kindle

The Messiah Myth

merryoldsanta

Now that almost two weeks have gone by since Christmas, it’s time to reflect on its (historical) meaning beyond the shopping discounts, Santa hats and incessant cheesy music.

We know that Christmas falls on two different dates depending on whether you follow the Gregorian or Julian (orthodox) calendars.

We know that many Christmas traditions were poached and re-purposed from rituals and traditions that predate the birth of Jesus, regardless of which calendar you adhere to: the 12 days of Christmas (Christmastide) originated in the ancient Germanic, and more recently Norse, midwinter festival of Yule; the tradition of gift giving and partying came from the ancient Roman festival of Saturnalia; the Western Christian church settled on December 25 based on the ancient Roman date of the winter solstice; holiday lights came from the ancient pagans who lit bonfires and candles on the winter solstice to celebrate the return of the light.

And, let’s not forget the now ubiquitous westernized Santa Claus. We know that Santa has evolved over the centuries from a melting pot of European traditions, including those surrounding Saint Nicholas, who was born to a Greek family in Asia Minor (Greek Anatolia in present-day Turkey), and the white-bearded Norse god, Odin.

So, what of Jesus? We know that the gospels describing him are contradictory, written by different, anonymous and usually biased authors at different times, often decades after the reported events. We have no eye-witness accounts. We lack a complete record — there is no account of Jesus’ years 12-30. Indeed, religion aside, many scholars now question the historical existence of Jesus the man.

From Big Think:

Today, several books approach the subject, including Zealot by Reza Aslan, Nailed: Ten Christian Myths That Show Jesus Never Existed at All by David Fitzgerald, and How Jesus Became God by Bart Ehrman. Historian Richard Carrier, in his 600-page monograph On the Historicity of Jesus, writes that the story may have derived from earlier semi-divine beings of Near East myth, who were murdered by demons in the celestial realm. This would develop over time into the gospels, he said. Another theory is that Jesus was a historical figure who became mythicized later on.

Carrier believes the pieces added to the work of Josephus were done by Christian scribes. In one particular passage, Carrier says that the execution by Pilate of Jesus was obviously lifted from the Gospel of Luke. Similar problems such as miscopying and misrepresentations are found throughout Tacitus. So where do all the stories in the New Testament derive? According to Carrier, Jesus may be as much a mythical figure as Hercules or Oedipus.

Ehrman focuses on the lack of witnesses. “What sorts of things do pagan authors from the time of Jesus have to say about him? Nothing. As odd as it may seem, there is no mention of Jesus at all by any of his pagan contemporaries. There are no birth records, no trial transcripts, no death certificates; there are no expressions of interest, no heated slanders, no passing references – nothing.”

One biblical scholar holds an even more radical idea: that the Jesus story was an early form of psychological warfare to help quell a violent insurgency. The Great Revolt against Rome occurred in 66 CE. Fierce Jewish warriors known as the Zealots won two decisive victories early on. But Rome returned with 60,000 heavily armed troops. What resulted was a bloody war of attrition that raged for three decades.

Atwill contends that the Zealots were awaiting the arrival of a warrior messiah to throw off the interlopers. Knowing this, the Roman court under Titus Flavius decided to create their own, competing messiah who promoted pacifism among the populace. According to Atwill, the story of Jesus was taken from many sources, including the campaigns of a previous Caesar.

Of course, there may very well have been a Rabbi Yeshua ben Yosef (as Jesus’ real name would have been) who gathered a flock around his teachings in the first century. Most antiquarians believe a real man existed and became mythicized. But the historical record itself is thin.

Read the entire article here.

Image:  “Merry Old Santa Claus”, by Thomas Nast,  January 1, 1881 edition of Harper’s Weekly. Public Domain.

Send to Kindle

We Live in a Flat Universe

universe-shape

Cosmologists generally agree that our universe is flat. But how exactly can that be for our 3-dimensional selves and everything else, for that matter? Well, first it’s useful to note that flatness is a property of geometry, not topology. So, even though it’s flat, the universe could be folded and/or twisted in any number of different, esoteric ways.

From Space:

The universe is flat. But there’s a lot of subtlety packed into that innocent-looking statement. What does it mean for a 3D object to be “flat”? How do we measure the shape of the universe anyway? Since the universe is flat, is that…it? Is there anything else interesting to say?

Oh yes, there is.

First, we need to define what we mean by flat. The screen you’re reading this on is obviously flat (I hope), and you know that the Earth is curved (I hope). But how can we quantify that mathematically? Such an exercise might be useful if we want to go around measuring the shape of the whole entire universe. [The History & Structure of the Universe (Infographic)]

One answer lies in parallel lines. If you start drawing two parallel lines on your paper and let them continue on, they’ll stay perfectly parallel forever (or at least until you run out of paper). That was essentially the definition of a parallel line for a couple thousand years, so we should be good.

Let’s repeat the exercise on the surface of the Earth. Start at the equator and draw a couple parallel lines, each pointing directly north. As the lines continue, they never turn left or right but still end up intersecting at the North Pole. The curvature of the Earth itself caused these initially parallel lines to end up not-so-parallel. Ergo, the Earth is curved.

The opposite of the Earth’s curved shape is a saddle: on that surface, lines that start out parallel end up spreading apart from each other (in swanky mathematical circles this is known as “ultraparallel”).

Read the entire article here.
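The parallel-meridians exercise quoted above can be made quantitative. On a sphere, the east-west distance between two meridians a fixed longitude apart shrinks with the cosine of the latitude, reaching zero at the pole, which is exactly why the “parallel” lines meet. A minimal Python sketch, assuming a unit sphere (the function name is my own):

```python
import math

def meridian_separation(delta_lon_deg, lat_deg, radius=1.0):
    """East-west distance between two meridians delta_lon_deg apart,
    measured along the circle of latitude lat_deg (unit sphere by default)."""
    return radius * math.cos(math.radians(lat_deg)) * math.radians(delta_lon_deg)

# Two meridians 1 degree apart: full separation at the equator,
# vanishing as they approach the North Pole.
for lat in (0, 45, 89, 90):
    print(lat, round(meridian_separation(1.0, lat), 6))
```

On a flat plane the separation would never change; on a saddle it would grow. Measuring how initially parallel paths converge or diverge is, in essence, how cosmologists probe the universe’s curvature.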

Image: The shape of the universe depends on its density. If the density is more than the critical density, the universe is closed and curves like a sphere; if less, it will curve like a saddle. But if the actual density of the universe is equal to the critical density, as scientists think it is, then it will extend forever like a flat piece of paper. Courtesy: NASA/WMAP Science team.

Send to Kindle

MondayMap: Food Rhythms

rhythm-of-food-screenshot

OK, I admit it. Today’s article is not strictly about a map, but I couldn’t resist these fascinating data visualizations. The graphics show some of the patterns and trends that can be derived from the vast mountains of data gathered from Google searches. A group of designers and data scientists from Truth & Beauty teamed up with Google News Labs to produce a portfolio of charts showing food- and drink-related searches over the last 12 years.

The visual above shows a clear spike in cocktail-related searches in December (for entertaining during the holiday season). Interestingly, searches for a “Tom Collins” have increased since 2004, whereas those for “Martini” have decreased. A more recent arrival on the cocktail scene seems to be the “Moscow Mule”.

Since most of the searches originated in the United States, the resulting charts show some fascinating changes in the nation’s collective nutritional mood. While some visualizations confirm the obvious — fruit searches peak when in season; pizza is popular year-round — some specific insights are more curious:

  • Orange Jell-O [“jelly” for my British readers] is popular for US Thanksgiving.
  • Tamale searches peak around Christmas.
  • Pumpkin spice latte searches increase in the fall, but searches are peaking earlier each year.
  • Superfood searches are up; fat-free searches are down.
  • Nacho searches peak around Super Bowl Sunday.
  • Cauliflower may be the new kale.

You can check out much more from this gorgeous data visualization project at The Rhythm of Food.

Image: Screenshot from Rhythm of Food. Courtesy: Rhythm of Food.

Send to Kindle

Vera Rubin: Astronomy Pioneer

Vera Rubin passed away on December 26, 2016, aged 88. She was a pioneer in the male-dominated world of astronomy, notable for her original work on dark matter, galaxy rotation and galaxy clumping.

From Popular Science:

Vera Rubin, who essentially created a new field of astronomy by discovering dark matter, was a favorite to win the Nobel Prize in physics for years. But she never received her early-morning call from Stockholm. On Sunday, she died at the age of 88.

Rubin’s death would sadden the scientific community under the best of circumstances. Countless scientists were inspired by her work. Countless scientists are researching questions that wouldn’t exist if not for her work. But her passing brings another blow: The Nobel Prize cannot be awarded posthumously. The most prestigious award in physics will never be bestowed upon a woman who was inarguably deserving.

In the 1960s and ’70s, Rubin and her colleague Kent Ford found that the stars within spiral galaxies weren’t behaving as the laws of physics dictated that they should. This strange spinning led her and others to conclude that some unseen mass must be influencing the galactic rotation. This unknown matter—now dubbed dark matter—outnumbers the traditional stuff by at least five to one. This is a big deal.

Read more here.
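The rotation anomaly Rubin found can be illustrated with back-of-the-envelope numbers. If nearly all of a galaxy’s mass were the visible matter concentrated toward its center, circular speeds should fall off as 1/sqrt(r); observed speeds instead stay roughly flat out to large radii, the signature of unseen mass. A rough Python sketch (the mass and radii below are illustrative orders of magnitude, not measurements):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def keplerian_speed(r, enclosed_mass):
    """Circular orbital speed if (nearly) all mass lies inside radius r:
    v = sqrt(G*M/r), so v falls off as 1/sqrt(r)."""
    return math.sqrt(G * enclosed_mass / r)

kpc = 3.086e19       # one kiloparsec, in meters
M_luminous = 1e41    # kg; rough order of magnitude for a galaxy's visible matter

# Expected speeds at increasing galactocentric radii, in km/s:
for r_kpc in (10, 20, 40):
    v = keplerian_speed(r_kpc * kpc, M_luminous) / 1000
    print(f"{r_kpc} kpc: {v:.0f} km/s")
```

Quadrupling the radius should halve the speed, yet Rubin and Ford measured curves that barely dropped at all; reconciling the two requires roughly five times more matter than we can see, hence dark matter.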

Send to Kindle

Spacetime Without the Time

anti-de-sitter-space

Since they were first dreamed up, explanations of the very small (quantum mechanics) and the very large (general relativity) have both been highly successful at describing their respective spheres of influence. Yet these two descriptions of our physical universe are not compatible, particularly when it comes to describing gravity. Indeed, physicists and theorists have struggled for decades to unite the two frameworks. Many agree that we need a new theory (of everything).

One new idea, from theorist Erik Verlinde of the University of Amsterdam, proposes that time is an emergent construct (it’s not a fundamental building block) and that dark matter is an illusion.

From Quanta:

Theoretical physicists striving to unify quantum mechanics and general relativity into an all-encompassing theory of quantum gravity face what’s called the “problem of time.”

In quantum mechanics, time is universal and absolute; its steady ticks dictate the evolving entanglements between particles. But in general relativity (Albert Einstein’s theory of gravity), time is relative and dynamical, a dimension that’s inextricably interwoven with directions x, y and z into a four-dimensional “space-time” fabric. The fabric warps under the weight of matter, causing nearby stuff to fall toward it (this is gravity), and slowing the passage of time relative to clocks far away. Or hop in a rocket and use fuel rather than gravity to accelerate through space, and time dilates; you age less than someone who stayed at home.

Unifying quantum mechanics and general relativity requires reconciling their absolute and relative notions of time. Recently, a promising burst of research on quantum gravity has provided an outline of what the reconciliation might look like — as well as insights on the true nature of time.

As I described in an article this week on a new theoretical attempt to explain away dark matter, many leading physicists now consider space-time and gravity to be “emergent” phenomena: Bendy, curvy space-time and the matter within it are a hologram that arises out of a network of entangled qubits (quantum bits of information), much as the three-dimensional environment of a computer game is encoded in the classical bits on a silicon chip. “I think we now understand that space-time really is just a geometrical representation of the entanglement structure of these underlying quantum systems,” said Mark Van Raamsdonk, a theoretical physicist at the University of British Columbia.

Researchers have worked out the math showing how the hologram arises in toy universes that possess a fisheye space-time geometry known as “anti-de Sitter” (AdS) space. In these warped worlds, spatial increments get shorter and shorter as you move out from the center. Eventually, the spatial dimension extending from the center shrinks to nothing, hitting a boundary. The existence of this boundary — which has one fewer spatial dimension than the interior space-time, or “bulk” — aids calculations by providing a rigid stage on which to model the entangled qubits that project the hologram within. “Inside the bulk, time starts bending and curving with the space in dramatic ways,” said Brian Swingle of Harvard and Brandeis universities. “We have an understanding of how to describe that in terms of the ‘sludge’ on the boundary,” he added, referring to the entangled qubits.

The states of the qubits evolve according to universal time as if executing steps in a computer code, giving rise to warped, relativistic time in the bulk of the AdS space. The only thing is, that’s not quite how it works in our universe.

Here, the space-time fabric has a “de Sitter” geometry, stretching as you look into the distance. The fabric stretches until the universe hits a very different sort of boundary from the one in AdS space: the end of time. At that point, in an event known as “heat death,” space-time will have stretched so much that everything in it will become causally disconnected from everything else, such that no signals can ever again travel between them. The familiar notion of time breaks down. From then on, nothing happens.

On the timeless boundary of our space-time bubble, the entanglements linking together qubits (and encoding the universe’s dynamical interior) would presumably remain intact, since these quantum correlations do not require that signals be sent back and forth. But the state of the qubits must be static and timeless. This line of reasoning suggests that somehow, just as the qubits on the boundary of AdS space give rise to an interior with one extra spatial dimension, qubits on the timeless boundary of de Sitter space must give rise to a universe with time — dynamical time, in particular. Researchers haven’t yet figured out how to do these calculations. “In de Sitter space,” Swingle said, “we don’t have a good idea for how to understand the emergence of time.”

Read the entire article here.

Image: Image of (1 + 1)-dimensional anti-de Sitter space embedded in flat (1 + 2)-dimensional space. The t1- and t2-axes lie in the plane of rotational symmetry, and the x1-axis is normal to that plane. The embedded surface contains closed timelike curves circling the x1 axis, though these can be eliminated by “unrolling” the embedding (more precisely, by taking the universal cover). Courtesy: Krishnavedala. Wikipedia. Creative Commons Attribution-Share Alike 3.0.

Send to Kindle

MondayMap: A Global Radio Roadtrip

radio-garden-screenshot1

As a kid my radio allowed me to travel the world. I could use the dial to transport myself over border walls and across oceans to visit new cultures and discover new sounds. I’d always eagerly anticipate the next discovery as I carefully moved the dial around the Short Wave, Long Wave (and later the FM) spectrum, waiting for new music and voices to replace the soothing crackle and hiss of the intervening static.

So, what a revelation it is to stumble across Radio.Garden. It’s a glorious app that combines the now-arcane radio dial with the power of the internet, enabling you to journey around the globe on a virtual radio roadtrip.

Trek to Tromsø north of the Arctic Circle in Norway, then hop over to Omsk in central Russia. Check out the meditative tunes in Kathmandu before heading southwest to Ruwi, Oman on the Persian Gulf. Stopover in Kuching, Malaysia, then visit Nhulunbuy in Australia’s Northern Territory. Take in a mid-Pacific talk radio show in Bairiki, in the Republic of Kiribati, then some salsa-inspired tunes in Tacna, Peru, followed by pounding Brazilian Euro-techno in João Pessoa. Journey to Kinshasa in the DRC for some refreshing African beats, then rest for the day with some lively conversation in the Italian Apennine Mountains in Parma, Italy.

radio-garden-screenshot2

During this wonderful border-free journey one thing becomes crystal clear: we are part of one global community with much in common. History will eventually prove the racists and xenophobes among us wrong.

Images: Screenshots of Radio.Garden. Courtesy of Radio.Garden.

Send to Kindle

Computational Folkloristics

hca_by_thora_hallager_1869

What do you get when you set AI (artificial intelligence) the task of reading through 30,000 Danish folk and fairy tales? Well, you get a host of fascinating, newly discovered insights into Scandinavian witches and trolls.

More importantly, you hammer another nail into the coffin of literary criticism and set AI on a collision course with yet another preserve of once-exclusive human endeavor. It’s probably safe to assume that creative writing will fall to intelligent machines in the not-too-distant future (as well); certainly, human-powered investigative journalism seemed to become extinct in 2016, replaced by algorithmic aggregation, social bots and fake-mongers.

From aeon:

Where do witches come from, and what do those places have in common? While browsing a large collection of traditional Danish folktales, the folklorist Timothy Tangherlini and his colleague Peter Broadwell, both at the University of California, Los Angeles, decided to find out. Armed with a geographical index and some 30,000 stories, they developed WitchHunter, an interactive ‘geo-semantic’ map of Denmark that highlights the hotspots for witchcraft.

The system used artificial intelligence (AI) techniques to unearth a trove of surprising insights. For example, they found that evil sorcery often took place close to Catholic monasteries. This made a certain amount of sense, since Catholic sites in Denmark were tarred with diabolical associations after the Protestant Reformation in the 16th century. By plotting the distance and direction of witchcraft relative to the storyteller’s location, WitchHunter also showed that enchantresses tend to be found within the local community, much closer to home than other kinds of threats. ‘Witches and robbers are human threats to the economic stability of the community,’ the researchers write. ‘Yet, while witches threaten from within, robbers are generally situated at a remove from the well-described village, often living in woods, forests, or the heath … it seems that no matter how far one goes, nor where one turns, one is in danger of encountering a witch.’

Such ‘computational folkloristics’ raise a big question: what can algorithms tell us about the stories we love to read? Any proposed answer seems to point to as many uncertainties as it resolves, especially as AI technologies grow in power. Can literature really be sliced up into computable bits of ‘information’, or is there something about the experience of reading that is irreducible? Could AI enhance literary interpretation, or will it alter the field of literary criticism beyond recognition? And could algorithms ever derive meaning from books in the way humans do, or even produce literature themselves?
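The kind of geo-semantic tally WitchHunter performs can be caricatured in a few lines of Python. Everything below is invented for illustration (toy coordinates and story records, not Tangherlini and Broadwell’s actual data or code):

```python
from math import hypot

# Hypothetical story records: (motif, storyteller location, event location).
stories = [
    ("witch",  (0, 0), (1, 1)),
    ("witch",  (0, 0), (0, 2)),
    ("robber", (0, 0), (9, 7)),
    ("robber", (0, 0), (8, 9)),
]

def mean_distance(records, motif):
    """Average distance from the storyteller to where the motif occurs."""
    d = [hypot(ex - sx, ey - sy)
         for m, (sx, sy), (ex, ey) in records if m == motif]
    return sum(d) / len(d)

# In this toy data, witches threaten from close to home while robbers
# sit at a remove, mirroring the pattern the researchers describe.
print(mean_distance(stories, "witch") < mean_distance(stories, "robber"))  # → True
```

Scale this idea up to 30,000 geographically indexed stories and you have the skeleton of a geo-semantic map.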

Author and computational linguist Inderjeet Mani concludes his essay thus:

Computational analysis and ‘traditional’ literary interpretation need not be a winner-takes-all scenario. Digital technology has already started to blur the line between creators and critics. In a similar way, literary critics should start combining their deep expertise with ingenuity in their use of AI tools, as Broadwell and Tangherlini did with WitchHunter. Without algorithmic assistance, researchers would be hard-pressed to make such supernaturally intriguing findings, especially as the quantity and diversity of writing proliferates online.

In the future, scholars who lean on digital helpmates are likely to dominate the rest, enriching our literary culture and changing the kinds of questions that can be explored. Those who resist the temptation to unleash the capabilities of machines will have to content themselves with the pleasures afforded by smaller-scale, and fewer, discoveries. While critics and book reviewers may continue to be an essential part of public cultural life, literary theorists who do not embrace AI will be at risk of becoming an exotic species – like the librarians who once used index cards to search for information.

Read the entire tale here.

Image: Portrait of the Danish writer Hans Christian Andersen. Courtesy: Thora Hallager, 10/16 October 1869. Wikipedia. Public Domain.

Send to Kindle

Wound Man

wound-man-wellcome-library-ms-49

No, the image is not a still from a forthcoming episode of Law & Order or Criminal Minds. Nor is it a nightmarish Hieronymus Bosch artwork.

Rather, “Wound Man”, as he was known, is a visual table of contents to a medieval manuscript of medical cures, treatments and surgeries. Wound Man first appeared in German surgical texts in the early 15th century. Arranged around each of his various wounds and ailments are references to further details on appropriate treatments. For instance, reference number 38 alongside an arrow penetrating Wound Man’s thigh, “An arrow whose shaft is still in place”, leads to details on how to address the wound — presumably a relatively common occurrence in the Middle Ages.

From Public Domain Review:

Staring impassively out of the page, he bears a multitude of graphic wounds. His skin is covered in bleeding cuts and lesions, stabbed and sliced by knives, spears and swords of varying sizes, many of which remain in the skin, protruding porcupine-like from his body. Another dagger pierces his side, and through his strangely transparent chest we see its tip puncture his heart. His thighs are pierced with arrows, some intact, some snapped down to just their heads or shafts. A club slams into his shoulder, another into the side of his face.

His neck, armpits and groin sport rounded blue buboes, swollen glands suggesting that the figure has contracted plague. His shins and feet are pockmarked with clustered lacerations and thorn scratches, and he is beset by rabid animals. A dog, snake and scorpion bite at his ankles, a bee stings his elbow, and even inside the cavity of his stomach a toad aggravates his innards.

Despite this horrendous cumulative barrage of injuries, however, the Wound Man is very much alive. For the purpose of this image was not to threaten or inspire fear, but to herald potential cures for all of the depicted maladies. He contrarily represented something altogether more hopeful than his battered body: an arresting reminder of the powerful knowledge that could be channelled and dispensed in the practice of late medieval medicine.

The earliest known versions of the Wound Man appeared at the turn of the fifteenth century in books on the surgical craft, particularly works from southern Germany associated with the renowned Würzburg surgeon Ortolf von Baierland (died before 1339). Accompanying a text known as the “Wundarznei” (The Surgery), these first Wound Men effectively functioned as a human table of contents for the cures contained within the relevant treatise. Look closely at the remarkable Wound Man shown above from the Wellcome Library’s MS. 49 – a miscellany including medical material produced in Germany in about 1420 – and you see that the figure is penetrated not only by weapons but also by text.

Read the entire article here.

Image: The Wound Man. Courtesy: Wellcome Library’s MS. 49 — Source (CC BY 4.0). Public Domain Review.

Send to Kindle

Fake News: Who’s To Blame?

alien-abduction-walton

Should we blame the creative originators of fake news, conspiracy theories, disinformation and click-bait hype? Or, should we blame the media for disseminating, spinning and aggrandizing these stories for their own profit or political motives? Or, should we blame us, the witless consumers?

I subscribe to the opinion that all three constituencies share responsibility — it’s very much a symbiotic relationship.

James Warren, chief media writer for Poynter, has a different opinion; he lays the blame squarely at the feet of gullible and unquestioning citizens. He makes a very compelling argument.

Perhaps, if any educated political scholars remain several hundred years from now, they’ll hold up the US presidential election of 2016 as the culmination of a process in which lazy stupidity triumphed over healthy skepticism and reason.

From Hive:

The rise of “fake news” inspires the press to uncover its many practitioners worldwide, discern its economics and herald the alleged guilt-ridden soul-searching by its greatest enablers, Facebook and Google.

But the media dances around another reality with the dexterity of Beyonce, Usher and septuagenarian Mick Jagger: the stupidity of a growing number of Americans.

So thanks to Neal Gabler for taking to Bill Moyers’ website to pen, “Who’s Really to Blame for Fake News.” (Moyers)

Fake news, of course, “is an assault on the very principle of truth itself: a way to upend the reference points by which mankind has long operated. You could say, without exaggeration, that fake news is actually an attempt to reverse the Enlightenment. And because a democracy relies on truth — which is why dystopian writers have always described how future oligarchs need to undermine it — fake news is an assault on democracy as well.”

Gabler is identified here as the author of five books, without mentioning any. Well, one is 1995’s Winchell: Gossip, Power and the Culture of Celebrity. It’s a superb look at Walter Winchell, the man who really invented the gossip column and wound up with a readership and radio audience of 50 million, or two-thirds of the then-population, as he helped create our modern media world of privacy-invading gossip and personal destruction as entertainment.

“What is truly horrifying is that fake news is not the manipulation of an unsuspecting public,” Gabler writes of our current mess. “Quite the opposite. It is willful belief by the public. In effect, the American people are accessories in their own disinformation campaign. That is our current situation, and it is no sure thing that either truth or democracy survives.”

Think of it. The goofy stories, the lies, the conspiracy theories that now routinely gain credibility among millions who can’t be bothered to read a newspaper or decent digital site and can’t differentiate between Breitbart and The New York Times. Ask all those pissed-off Trump loyalists in rural towns to name their two U.S. senators.

We love convincing ourselves of the strengths of democracy, including the inevitable collective wisdom setting us back on a right track if ever we go astray. And while the media may hold itself out as cultural anthropologists in explaining the “anger” or “frustration” of “real people,” as is the case after Donald Trump’s election victory, we won’t really underscore rampant illiteracy and incomprehension.

So read Gabler. “Above all else, fake news is a lazy person’s news. It provides passive entertainment, demanding nothing of us. And that is a major reason we now have a fake news president.”

Read the entire essay here.

Image: Artist’s conception of an alien spacecraft tractor-beaming a human victim. Courtesy: unknown artist, Wikipedia. Public Domain.

Send to Kindle

Uber For…

google-search-uber

There’s an Uber for pet-sitters (Rover). There’s an Uber for dog walkers (Wag). There’s an Uber for private jets (JetMe). There are several Ubers for alcohol (Minibar, Saucey, Drizly, Thirstie). In fact, enter the keywords “Uber for…” into Google and the search engine will return “Uber for kids, Uber for icecream, Uber for news, Uber for seniors, Uber for trucks, Uber for haircuts, Uber for iPads (?), Uber for food, Uber for undertakers (??)…” and thousands of other results.

The list of Uber-like copycats, startups and ideas is seemingly endless — a sign, without doubt, that we have indeed reached peak-Uber. Perhaps VCs in the valley should move on to some more meaningful investments, before the Uber bubble bursts.

From Wired:

“Uber for X” has been the headline of more than four hundred news articles. Thousands of would-be entrepreneurs used the phrase to describe their companies in their pitch decks. On one site alone—AngelList, where startups can court angel investors and employees—526 companies included “Uber for” in their listings. As a judge for various emerging technology startup competitions, I saw “Uber for” so many times that at some point, I developed perceptual blindness.

Nearly all the organizations I advised at that time wanted to know about the “Uber for” of their respective industries. A university wanted to develop an “Uber for tutoring”; a government agency was hoping to solve an impending transit issue with an “Uber for parking.” I knew that “Uber for” had reached critical mass when one large media organization, in need of a sustainable profit center, pitched me their “Uber for news strategy.”

“We’re going to be the Uber for news,” the news exec told me. Confused, I asked what, exactly, he meant by that.

“Three years from now, we’ll have an on-demand news platform for Millennials. They tap a button on their phones and they get the news delivered right to them, wherever they are,” the editor said enthusiastically. “This is the future of news!”

“Is it an app?” I asked, trying to understand.

“Maybe. The point is that you get the news right away, when you want it, wherever you are,” the exec said.

“So you mean an app,” I pressed. “Yes!” he said. “But more like Uber.”

The mass “Uber for X” excitement is a good example of what happens when we don’t stop to investigate a trend, asking difficult questions and challenging our cherished beliefs. We need to first understand what, exactly, Uber is and what led to entrepreneurs coining that catchphrase.

Read the entire story here.

Image courtesy of Google Search.

Send to Kindle

The Anomaly

Is the smallest, lightest, most ghostly particle about to upend our understanding of the universe? Recently, the ephemeral neutrino has begun to give up some of its secrets. Beginning in 1998, experiments at Super-Kamiokande and the Sudbury Neutrino Observatory showed for the first time that neutrinos oscillate between three flavors. In 2015, two physicists were awarded the Nobel Prize for this discovery, which also proved that neutrinos must have mass. More recently, a small anomaly has surfaced at the Super-Kamiokande detector which, it is hoped, could shed light on why the universe is constructed primarily from matter and not antimatter.

From Quanta:

The anomaly, detected by the T2K experiment, is not yet pronounced enough to be sure of, but it and the findings of two related experiments “are all pointing in the same direction,” said Hirohisa Tanaka of the University of Toronto, a member of the T2K team who presented the result to a packed audience in London earlier this month.

“A full proof will take more time,” said Werner Rodejohann, a neutrino specialist at the Max Planck Institute for Nuclear Physics in Heidelberg who was not involved in the experiments, “but my and many others’ feeling is that there is something real here.”

The long-standing puzzle to be solved is why we and everything we see is matter-made. More to the point, why does anything — matter or antimatter — exist at all? The reigning laws of particle physics, known as the Standard Model, treat matter and antimatter nearly equivalently, respecting (with one known exception) so-called charge-parity, or “CP,” symmetry: For every particle decay that produces, say, a negatively charged electron, the mirror-image decay yielding a positively charged antielectron occurs at the same rate. But this cannot be the whole story. If equal amounts of matter and antimatter were produced during the Big Bang, equal amounts should have existed shortly thereafter. And since matter and antimatter annihilate upon contact, such a situation would have led to the wholesale destruction of both, resulting in an empty cosmos.

Somehow, significantly more matter than antimatter must have been created, such that a matter surplus survived the annihilation and now holds sway. The question is, what CP-violating process beyond the Standard Model favored the production of matter over antimatter?

Many physicists suspect that the answer lies with neutrinos — ultra-elusive, omnipresent particles that pass unfelt through your body by the trillions each second.

Read the entire article here.

Send to Kindle

Robots Beware. Humans Are Still (Sort of) Smarter Than You

So, it looks like we humans may have a few more years to go as the smartest beings on the planet, before being overrun by ubiquitous sentient robots. Some may question my assertion based on recent election results in the UK and the US, but I digress.

A recent experiment featuring some of our best-loved voice-activated assistants, such as Apple’s Siri, Amazon’s Alexa and Google’s Home, clearly shows that our digital brethren have some learning to do. A conversation between two of them rapidly enters an infinite loop.
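The failure mode is easy to reproduce in miniature. Here’s a toy Python sketch (the bots, names and canned replies are all invented; no real assistant APIs are involved) showing how two canned-response agents feed each other indefinitely:

```python
def make_bot(name, reply):
    """A stand-in for a voice assistant: it always answers with the same
    canned prompt, which the other bot then treats as a new query."""
    def respond(heard):
        return f"{name} heard {heard!r} and says: {reply}"
    return respond

bot_a = make_bot("BotA", "Hey, what did you say?")
bot_b = make_bot("BotB", "Hey, what did you say?")

# Each bot's output becomes the other's input: a closed loop with no
# terminating condition, so the "conversation" would never end.
utterance = "Hello there"
transcript = []
for turn in range(6):  # cap the demo at six turns
    bot = bot_a if turn % 2 == 0 else bot_b
    utterance = bot(utterance)
    transcript.append(utterance)
```

Real assistants add speech recognition and wake words, but the underlying loop (hear, respond, be overheard) is the same.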

Read more about this here.

Video: Echo/Google Home infinite loop. Courtesy: Adam Jakowenko.

Send to Kindle

The Existential Dangers of the Online Echo Chamber

google-search-fake-news

The online filter bubble is a natural extension of our preexisting biases, particularly evident in our media consumption. Those of us of a certain age (above 30 years) once purchased, and maybe still do, our favorite paper-based newspapers and glued ourselves to our favorite TV news channels. These sources mirrored, for the most part, our cultural and political preferences. The internet took this a step further by building a tightly wound, self-reinforcing feedback loop: we consume our favorite online media, which prompts the recommendation algorithms to deliver more of the same. I’ve written about the filter bubble for years (here, here and here).

The online filter bubble in which each of us lives (those of us online, at least) may seem no more dangerous than its offline predecessor. After all, the online version of the NYT delivers left-of-center news, just like its printed cousin. So what’s the big deal? Well, the pervasiveness of our technology has now enabled these filters to creep insidiously into many aspects of our lives, from news consumption and entertainment programming to shopping and even dating. And, since we now spend growing swathes of our time online, our serendipitous exposure to varied content that lies outside this bubble in the real, offline world is diminishing. Consequently, the online filter bubble is taking on a much more critical role and having a greater effect in maintaining our tunnel vision.

However, that’s not all. Over the last few years we have become exposed to yet another dangerous phenomenon to have made the jump from the offline world to online — the echo chamber. The online echo chamber is enabled by our like-minded online communities and catalyzed by the tools of social media. And, it turns our filter bubble into a self-reinforcing, exclusionary community that is harmful to varied, reasoned opinion and healthy skepticism.

Those of us who reside on Facebook are likely to be part of a very homogeneous social circle, which trusts, shares and reinforces information accepted by the group and discards information that does not match the group’s social norms. This makes the spread of misinformation — fake stories, conspiracy theories, hoaxes, rumors — so very effective. Importantly, this is increasingly to the exclusion of all else, including real news and accepted scientific fact.

Why embrace objective journalism, trusted science and thoughtful political dialogue when you can get a juicy, emotive meme from a friend of a friend on Facebook? Why trust a story from Reuters or science from Scientific American when you get your “news” via a friend’s link from Alex Jones and the Breitbart News Network?

And, there’s no simple solution, which puts many of our once trusted institutions in severe jeopardy. Those of us who care have a duty to ensure these issues are in the minds of our public officials and the guardians of our technology and media networks.

From Scientific American:

If you get your news from social media, as most Americans do, you are exposed to a daily dose of hoaxes, rumors, conspiracy theories and misleading news. When it’s all mixed in with reliable information from honest sources, the truth can be very hard to discern.

In fact, my research team’s analysis of data from Columbia University’s Emergent rumor tracker suggests that this misinformation is just as likely to go viral as reliable information.

Many are asking whether this onslaught of digital misinformation affected the outcome of the 2016 U.S. election. The truth is we do not know, although there are reasons to believe it is entirely possible, based on past analysis and accounts from other countries. Each piece of misinformation contributes to the shaping of our opinions. Overall, the harm can be very real: If people can be conned into jeopardizing our children’s lives, as they do when they opt out of immunizations, why not our democracy?

As a researcher on the spread of misinformation through social media, I know that limiting news fakers’ ability to sell ads, as recently announced by Google and Facebook, is a step in the right direction. But it will not curb abuses driven by political motives.

Read the entire article here.

Image courtesy of Google Search.

Send to Kindle

The Birthday Problem

birthday_paradox

I first came across the Birthday Problem in my first few days of secondary school in London [that would be 6th grade for my US readers]. My mathematics teacher at the time realized the need to discuss abstract problems in concrete terms, especially statistics and probability. So, he wowed many of us, in a class of close to 30 kids, by firmly stating that there was a better than even chance that two of us shared the same birthday. In a class of 30, the actual probability is roughly 70 percent, and it rises to over 99 percent in a group of only 60.

Startlingly, two in our class did indeed share the same birthday. How could that be possible, I wondered?

Well, the answer is grounded in the simple probability of large populations. But, it is also colored by our selective biases to remember “remarkable” coincidences and to ignore the much, much larger number of instances where there is no coincidence at all.
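The arithmetic is easy to check for yourself. A short Python sketch (standard library only) computes the probability that at least two of n people share a birthday, assuming 365 equally likely birthdays and ignoring leap years:

```python
from math import prod

def birthday_probability(n: int) -> float:
    """Probability that at least two of n people share a birthday,
    assuming 365 equally likely days and ignoring leap years."""
    if n > 365:
        return 1.0  # pigeonhole principle: a shared birthday is certain
    # P(all n birthdays distinct) = 365/365 * 364/365 * ... * (365-n+1)/365
    p_distinct = prod((365 - k) / 365 for k in range(n))
    return 1 - p_distinct

if __name__ == "__main__":
    for n in (23, 30, 60):
        print(n, round(birthday_probability(n), 3))
```

Running it shows the crossover at 23 people (about 50.7 percent), roughly 70 percent for a class of 30, and over 99 percent for a group of 60.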

From the Washington Post:

Mathematician Joseph Mazur was in the back of a van snaking through the mountains of Sardinia when he heard one of his favorite coincidence stories. The driver, an Italian language teacher named Francesco, told of meeting a woman named Manuela who had come to study at his school. Francesco and Manuela met for the first time in a hotel lobby, and then went to have coffee.

They spoke for an hour, getting acquainted, before the uncomfortable truth came out. Noting Manuela’s nearly perfect Italian, Francesco finally asked why she decided to come to his school.

“She said, ‘Italian? What are you talking about? I’m not here to learn Italian,’” Mazur relates. “And then it dawned on both of them that she was the wrong Manuela and he was the wrong Francesco.” They returned to the hotel lobby where they had met to find a different Francesco offering a different Manuela a job she didn’t want or expect.

The tale is one of the many stories that populate Mazur’s new book, “Fluke,” in which he explores the probability of coincidences.

Read the entire article here.

Image: The computed probability of at least two people sharing a birthday versus the number of people. Courtesy: Rajkiran g / Wikipedia. CC BY-SA 3.0.

Send to Kindle

Surplus Humans and the Death of Work

detroit-industry-north-wall-diego-rivera

It’s a simple equation: too many humans, not enough work. Low-paying, physical jobs continue to disappear, replaced by mechanization. Even cognitive work, characterized by the need to think, is increasingly likely to be automated and robotized. This has complex and dire consequences, not just economic ramifications but moral ones. What are we to make of ourselves, and of a culture that has intimately linked work with meaning, when the work is outsourced or eliminated entirely?

A striking example comes from the richest country in the world, the United States. Recently, and anomalously, life expectancy has decreased among white people in economically depressed areas of the nation. Many economists suggest that the quest for ever-increasing productivity, usually delivered through automation, is chipping away at the very essence of what it means to be human: finding value and purpose through work.

James Livingston, professor of history at Rutgers University, summarizes the existential dilemma, excerpted below, in his latest book No More Work: Why Full Employment Is a Bad Idea.

From aeon:

Work means everything to us Americans. For centuries – since, say, 1650 – we’ve believed that it builds character (punctuality, initiative, honesty, self-discipline, and so forth). We’ve also believed that the market in labour, where we go to find work, has been relatively efficient in allocating opportunities and incomes. And we’ve believed that, even if it sucks, a job gives meaning, purpose and structure to our everyday lives – at any rate, we’re pretty sure that it gets us out of bed, pays the bills, makes us feel responsible, and keeps us away from daytime TV.

These beliefs are no longer plausible. In fact, they’ve become ridiculous, because there’s not enough work to go around, and what there is of it won’t pay the bills – unless of course you’ve landed a job as a drug dealer or a Wall Street banker, becoming a gangster either way.

These days, everybody from Left to Right – from the economist Dean Baker to the social scientist Arthur C Brooks, from Bernie Sanders to Donald Trump – addresses this breakdown of the labour market by advocating ‘full employment’, as if having a job is self-evidently a good thing, no matter how dangerous, demanding or demeaning it is. But ‘full employment’ is not the way to restore our faith in hard work, or in playing by the rules, or in whatever else sounds good. The official unemployment rate in the United States is already below 6 per cent, which is pretty close to what economists used to call ‘full employment’, but income inequality hasn’t changed a bit. Shitty jobs for everyone won’t solve any social problems we now face.

Don’t take my word for it, look at the numbers. Already a fourth of the adults actually employed in the US are paid wages lower than would lift them above the official poverty line – and so a fifth of American children live in poverty. Almost half of employed adults in this country are eligible for food stamps (most of those who are eligible don’t apply). The market in labour has broken down, along with most others.

Those jobs that disappeared in the Great Recession just aren’t coming back, regardless of what the unemployment rate tells you – the net gain in jobs since 2000 still stands at zero – and if they do return from the dead, they’ll be zombies, those contingent, part-time or minimum-wage jobs where the bosses shuffle your shift from week to week: welcome to Wal-Mart, where food stamps are a benefit.

Read the entire essay here.

Image: Detroit Industry North Wall, Diego Rivera. Courtesy: Detroit Institute of Arts. Wikipedia.

Send to Kindle

Breathe, You’re On Vacation

google-search-vacation

I’m lucky enough to be able to take a couple of vacations [holidays for my British readers] each year. Over the decades my vacations have generally tended to fall into two categories. First, there is the inactive, relaxing vacation of nothingness. This usually involves lounging and listening to ocean waves break along a beautiful beach, reading some choice literature and just, well, relaxing — when without kids in tow. Second, there is the active vacation spent trekking in the wilderness or discovering a far-flung natural or cultural wonder of the world.

However, even though I began these vacation rituals with my parents when I was a child, and have kept them up for decades, I may have had the idea of a vacation completely wrong. Apparently, the ideal vacation must involve breathing, mindfulness and self-improvement. So, forget the relaxation.

Ironically, it seems that Google has yet to learn about our active needs for vacation wellness and enrichment. Search for “vacation” online and Google will first deliver many thousands of images of people relaxing at the beach under a deep blue sky.

From NYT:

When I was 22, I used to have a fantasy about going away to a sanitarium, like in “The Magic Mountain.” I would do nothing but sit on balconies, wrapped in steamer rugs, and go to the doctor, avoiding the rigors of the real world and emerging after a short period brighter, happier, better.

I’m beginning to think this was a prescient impulse. Over the decades we have embraced a widening and diverse array of practices and traditions, but the idea that we can be improved — in mind, body or spirit — has remained a constant. That this could be accomplished with money and in an allotted parcel of time has become increasingly popular with a generation reared in a maximalist minimalist moment that, as with fashion and interior design, demands grandiose, well-documented freedom from the world. If stuff was once an indicator of security, now the very lack of it — of dust, of furniture, of body fat, of errant thoughts — defines aspiration. A glamorous back-to-nature exercise in pricey self-abnegation has become the logical way to spend one’s leisure time.

We live in a golden age of the “wellness vacation,” a sort of hybrid retreat, boot camp, spa and roving therapy session that, for the cost of room and board, promises to refresh body and mind and send you back to your life more whole. Pravassa, a “wellness travel company,” summarizes its (trademarked) philosophy as “Breathe. Experience. Move. Mindfulness. Nourish.” (The Kripalu Center for Yoga & Health, a wellness retreat in New England, boasts the eerily similar tagline: “Breathe. Connect. Move. Discover. Shine.”) A 10-day trip to Thailand with Pravassa includes a travel guide — who works, in her day job, as a “mindfulness-based psychotherapist” in Atlanta — as well as temple pilgrimages at dawn and, more abstractly, the potential to bring all that mindfulness back home with you. Selfies are not only allowed but encouraged.

Read the entire article here.

Image courtesy of Google Search.

Send to Kindle

MondayMap: National Superlatives

international-number-ones-2016

OK, I must admit that some maps can be somewhat dubious. Or, is it all maps?

Despite their shaky foundations, some maps form the basis for many centuries of human (mis-)understanding, only to be subsequently overturned by a new (and improved) chart. For instance, the geocentric models of our cosmos, courtesy of Aristotle and Ptolemy, were not replaced for around 1,400 years, until Nicolaus Copernicus proposed a heliocentric view of the solar system.

Thus, keep in mind the latest view of our globe, courtesy of David McCandless. He compiled this esoteric worldview, in which every nation is the best at something, from a collection of global data sources.

Looks like the US is “best” at spam (not the luncheon meat), while Russia, of course, leads in dash cams.

Send to Kindle

Beware. Economic Growth May Kill You

There is a long-held belief that economic growth and prosperity make for a happier, healthier populace. Most economists and social scientists, and indeed laypeople, have subscribed to this idea for many decades.

But, this may be completely wrong.

A handful of contrarian economists began noticing a strange paradox in research studies dating back to 2000. Evidence suggests that rising incomes and personal well-being are linked in the opposite way. It seems that when the US economy is improving, people suffer more medical problems and die faster.

How could this be? Well, put simply, there are three main factors: increased pollution from increased industrial activity; greater occupational hazards from increased work; and, higher exposure to risky behaviors from greater income.

From the Washington Post:

Yet in recent years, accumulating evidence suggests that rising incomes and personal well-being are linked in the opposite way. It seems that economic growth actually kills people.

Christopher Ruhm, an economics professor at the University of Virginia, was one of the first to notice this paradox. In a 2000 paper, he showed that when the American economy is on an upswing, people suffer more medical problems and die faster; when the economy falters, people tend to live longer.

“It’s very puzzling,” says Adriana Lleras-Muney, an economics professor at the University of California, Los Angeles. “We know that people in rich countries live longer than people in poor countries. There’s a strong relationship between GDP and life expectancy, suggesting that more money is better. And yet, when the economy is doing well, when it’s growing faster than average, we find that more people are dying.”

In other words, there are great benefits to being wealthy. But the process of becoming wealthy — well, that seems to be dangerous.

Lleras-Muney and her colleagues, David Cutler of Harvard and Wei Huang of the National Bureau of Economic Research, believe they can explain why. They have conducted one of the most comprehensive investigations yet of this phenomenon, analyzing over 200 years of data from 32 countries. In a draft of their research, released last week, they lay out something of a grand unified theory of life, death and economic growth.

To start, the economists confirm that when a country’s economic output — its GDP — is higher than expected, mortality rates are also higher than expected.

The data show that when economies are growing particularly fast, emissions and pollution are also on the rise. After controlling for changes in air quality, the economists find that economic growth doesn’t seem to impact death rates as much. “As much as two-thirds of the adverse effect of booms may be the result of increased pollution,” they write.

A booming economy spurs death in other ways too. People start to spend more time at their jobs, exposing them to occupational hazards, as well as the stress of overwork. People drive more, leading to an increase in traffic-related fatalities. People also drink more, causing health problems and accidents. In particular, the economists’ data suggest that alcohol-related mortality is the second-most important explanation, after pollution, for the connection between economic growth and death rates.

This is consistent with other studies finding that people are more likely to die right after they receive their tax rebates. More income makes it easier for people to pay for health care and other basic necessities, but it also makes it easier for people to engage in risky activities and hurt themselves.

Read the entire story here.

Send to Kindle

You’re Not In Control

dual_elevator_door_buttons

Press a button, then something happens. Eat too much chocolate, then you feel great (and then put on weight). Step into the middle of a busy road, then you get hit by an oncoming car. Walk in the rain, then you get wet. Watch your favorite comedy show, then you laugh.

Every moment of our lives is filled with actions and consequences, causes and effects. Usually we have a good sense of what is likely to happen when we take a specific action. This sense of predictability smooths our lives and makes us feel in control.

But sometimes all is not what it seems. Take the buttons on some of the most actively used objects in our daily lives. Press the “close door” button on the elevator [or “lift” for my British readers], then the door closes, right? Press the “pedestrian crossing” button at the crosswalk [or “zebra crossing”], then the safe-to-cross signal blinks to life, right? Adjust the office thermostat, then you feel more comfortable, right?

Well, if you think that by pressing a button you are commanding the elevator door to close, or the crosswalk signal to flash, or the thermostat to change the office temperature, you’re probably wrong. You may feel in control, but actually you’re not. In many cases the button may serve no functional purpose; the systems just work automatically. But the button still offers a psychological purpose — a placebo-like effect. We are so conditioned to the notion that pressing a button yields an action, that we still feel in control even when the button does nothing beyond making an audible click.

From the NYT:

Pressing the door-close button on an elevator might make you feel better, but it will do nothing to hasten your trip.

Karen W. Penafiel, executive director of National Elevator Industry Inc., a trade group, said the close-door feature faded into obsolescence a few years after the enactment of the Americans With Disabilities Act in 1990.

The legislation required that elevator doors remain open long enough for anyone who uses crutches, a cane or wheelchair to get on board, Ms. Penafiel said in an interview on Tuesday. “The riding public would not be able to make those doors close any faster,” she said.

The buttons can be operated by firefighters and maintenance workers who have the proper keys or codes.

No figures were available for the number of elevators still in operation with functioning door-close buttons. Given that the estimated useful life of an elevator is 25 years, it is likely that most elevators in service today have been modernized or refurbished, rendering the door-close buttons a thing of the past for riders, Ms. Penafiel said.

Read the entire story here.

Image: Elevator control panel, cropped to show only dual “door open” and “door close” buttons. Courtesy: Nils R. Barth. Wikipedia. Creative Commons CC0 1.0 Universal Public Domain Dedication.

Send to Kindle

How and Why Did Metamorphosis Evolve?

papilio_machaon

Evolution is a truly wondrous thing. It has given us eyes and lots of grey matter [which we still don’t use very well]. It has given us the beautiful tiger, and the shimmering hues and soaring songs of our birds. It has given us the towering sequoias, creepy insects, gorgeous ocean-dwelling creatures, and invisible bacteria and viruses. Yet for all its wondrous adaptations, one evolutionary invention still seems mysteriously supernatural — metamorphosis.

So, how and why did it evolve? A compelling new theory on the origins of insect metamorphosis by James W. Truman and Lynn M. Riddiford is excerpted below (from a detailed article in Scientific American).

The theory posits that a beneficial mutation around 300 million years ago led to the emergence of metamorphosis in insects:

By combining evidence from the fossil record with studies on insect anatomy and development, biologists have established a plausible narrative about the origin of insect metamorphosis, which they continue to revise as new information surfaces. The earliest insects in Earth’s history did not metamorphose; they hatched from eggs, essentially as miniature adults. Between 280 million and 300 million years ago, however, some insects began to mature a little differently—they hatched in forms that neither looked nor behaved like their adult versions. This shift proved remarkably beneficial: young and old insects were no longer competing for the same resources. Metamorphosis was so successful that, today, as many as 65 percent of all animal species on the planet are metamorphosing insects.

And, there are essentially three types of metamorphosis:

Wingless ametabolous insects, such as silverfish and bristletails, undergo little or no metamorphosis. When they hatch from eggs, they already look like adults, albeit tiny ones, and simply grow larger over time through a series of molts in which they shed their exoskeletons. Hemimetaboly, or incomplete metamorphosis, describes insects such as cockroaches, grasshoppers and dragonflies that hatch as nymphs—miniature versions of their adult forms that gradually develop wings and functional genitals as they molt and grow. Holometaboly, or complete metamorphosis, refers to insects such as beetles, flies, butterflies, moths and bees, which hatch as wormlike larvae that eventually enter a quiescent pupal stage before emerging as adults that look nothing like the larvae.

And, it’s backed by a concrete survival and reproductive advantage:

[T]he enormous numbers of metamorphosing insects on the planet speak for its success as a reproductive strategy. The primary advantage of complete metamorphosis is eliminating competition between the young and old. Larval insects and adult insects occupy very different ecological niches. Whereas caterpillars are busy gorging themselves on leaves, completely disinterested in reproduction, butterflies are flitting from flower to flower in search of nectar and mates. Because larvas and adults do not compete with one another for space or resources, more of each can coexist relative to species in which the young and old live in the same places and eat the same things.

Read the entire article here.

Image: Old World Swallowtail (Papilio machaon). Courtesy: fesoj – Otakárek fenyklový [Papilio machaon]. CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=7263187

Send to Kindle

Nightmare Machine

mit-nightmare-machine

Now that the abject terror of the US presidential election is over — at least for a while — we have to turn our minds to new forms of pain and horror.

In recent years a growing number of illustrious scientists and technologists have described artificial intelligence (AI) as the greatest existential threat to humanity. They worry, rightfully, that a well-drilled, unfettered AI could eventually out-think and out-smart us at every level. Eventually, a super-intelligent AI would determine that humans were either peripheral or superfluous to its needs and goals, and then either enslave or extinguish us. This is the stuff of real nightmares.

Yet, at a more playful level, AI can also learn to deliver imagined nightmares. This Halloween, researchers at MIT used AI techniques to create and optimize horrifying images of human faces and places. They called their AI the Nightmare Machine.

For the first step, researchers fed hundreds of thousands of celebrity photos into their AI algorithm, known as a deep convolutional generative adversarial network. This allowed the AI to learn about faces and how to create new ones. Next, they flavored the results with a second learning algorithm that had been trained on images of zombies. The combination allowed the AI to learn the critical factors that make for scary images and to selectively improve upon them. It turns out that blood on the face, empty eyeball sockets, and missing or misshapen teeth tend to elicit the greatest horror and fear.

While the results are not quite as scary as Stephen Hawking’s warning of AI-led human extinction, the images are terrifying nonetheless.

Learn more about the MIT Media Lab’s Nightmare Machine here.

Image: Horror imagery generated by artificial intelligence. Courtesy: MIT Media Lab.

Send to Kindle