Vera Rubin: Astronomy Pioneer

Vera Rubin passed away on December 26, 2016, aged 88. She was a pioneer in the male-dominated world of astronomy, notable for her original work on dark matter, galaxy rotation and galaxy clumping.

From Popular Science:

Vera Rubin, who essentially created a new field of astronomy by discovering dark matter, was a favorite to win the Nobel Prize in physics for years. But she never received her early-morning call from Stockholm. On Sunday, she died at the age of 88.

Rubin’s death would sadden the scientific community under the best of circumstances. Countless scientists were inspired by her work. Countless scientists are researching questions that wouldn’t exist if not for her work. But her passing brings another blow: The Nobel Prize cannot be awarded posthumously. The most prestigious award in physics will never be bestowed upon a woman who was inarguably deserving.

In the 1960s and ’70s, Rubin and her colleague Kent Ford found that the stars within spiral galaxies weren’t behaving as the laws of physics dictated that they should. This strange spinning led her and others to conclude that some unseen mass must be influencing the galactic rotation. This unknown matter—now dubbed dark matter—outnumbers the traditional stuff by at least five to one. This is a big deal.
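
To see why this was such a surprise, here's a minimal Python sketch with invented numbers (the galaxy mass and the observed speed are illustrative, not measurements). If the visible matter were all there is, orbital speeds should fall off with distance from the galactic center; Rubin and Ford instead saw them stay roughly flat, which is the signature of a massive unseen halo:

```python
import numpy as np

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_visible = 1e41  # illustrative visible mass of a galaxy, kg (~5e10 suns)
kpc = 3.086e19    # one kiloparsec in meters

radii = np.linspace(2, 30, 8) * kpc  # sample radii from 2 to 30 kpc

# Keplerian prediction: with most luminous mass near the center,
# orbital speed should fall off as 1/sqrt(r).
v_kepler = np.sqrt(G * M_visible / radii)

# What the rotation-curve measurements showed instead: speeds stay
# roughly constant far out, implying enclosed mass grows with radius.
v_flat = np.full_like(radii, 2.2e5)  # ~220 km/s, illustrative

for r, vk, vf in zip(radii / kpc, v_kepler / 1e3, v_flat / 1e3):
    print(f"r = {r:5.1f} kpc   Keplerian {vk:6.1f} km/s   observed ~{vf:5.1f} km/s")
```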

Read more here.

Spacetime Without the Time

anti-de-sitter-space

Since they were first dreamed up, explanations of the very small (quantum mechanics) and the very large (general relativity) have both been highly successful at describing their respective spheres of influence. Yet these two descriptions of our physical universe are not compatible, particularly when it comes to describing gravity. Indeed, physicists and theorists have struggled for decades to unite these two frameworks. Many agree that we need a new theory (of everything).

One new idea, from theorist Erik Verlinde of the University of Amsterdam, proposes that time is an emergent construct (it’s not a fundamental building block) and that dark matter is an illusion.

From Quanta:

Theoretical physicists striving to unify quantum mechanics and general relativity into an all-encompassing theory of quantum gravity face what’s called the “problem of time.”

In quantum mechanics, time is universal and absolute; its steady ticks dictate the evolving entanglements between particles. But in general relativity (Albert Einstein’s theory of gravity), time is relative and dynamical, a dimension that’s inextricably interwoven with directions x, y and z into a four-dimensional “space-time” fabric. The fabric warps under the weight of matter, causing nearby stuff to fall toward it (this is gravity), and slowing the passage of time relative to clocks far away. Or hop in a rocket and use fuel rather than gravity to accelerate through space, and time dilates; you age less than someone who stayed at home.
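
That last point is easy to put numbers on. Here's a quick Python illustration of the special-relativistic case, with an arbitrary ten-year trip at a few illustrative speeds:

```python
import math

c = 299_792_458.0  # speed of light, m/s

def proper_time(coordinate_time: float, speed: float) -> float:
    """Time elapsed on a clock moving at constant speed:
    tau = t * sqrt(1 - v^2/c^2) (special relativity)."""
    return coordinate_time * math.sqrt(1.0 - (speed / c) ** 2)

year = 365.25 * 24 * 3600.0
t_home = 10 * year  # ten years pass for the twin who stays home
for fraction in (0.5, 0.9, 0.99):
    tau = proper_time(t_home, fraction * c)
    print(f"at {fraction:.2f}c the traveler ages {tau / year:5.2f} years")
```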

Unifying quantum mechanics and general relativity requires reconciling their absolute and relative notions of time. Recently, a promising burst of research on quantum gravity has provided an outline of what the reconciliation might look like — as well as insights on the true nature of time.

As I described in an article this week on a new theoretical attempt to explain away dark matter, many leading physicists now consider space-time and gravity to be “emergent” phenomena: Bendy, curvy space-time and the matter within it are a hologram that arises out of a network of entangled qubits (quantum bits of information), much as the three-dimensional environment of a computer game is encoded in the classical bits on a silicon chip. “I think we now understand that space-time really is just a geometrical representation of the entanglement structure of these underlying quantum systems,” said Mark Van Raamsdonk, a theoretical physicist at the University of British Columbia.

Researchers have worked out the math showing how the hologram arises in toy universes that possess a fisheye space-time geometry known as “anti-de Sitter” (AdS) space. In these warped worlds, spatial increments get shorter and shorter as you move out from the center. Eventually, the spatial dimension extending from the center shrinks to nothing, hitting a boundary. The existence of this boundary — which has one fewer spatial dimension than the interior space-time, or “bulk” — aids calculations by providing a rigid stage on which to model the entangled qubits that project the hologram within. “Inside the bulk, time starts bending and curving with the space in dramatic ways,” said Brian Swingle of Harvard and Brandeis universities. “We have an understanding of how to describe that in terms of the ‘sludge’ on the boundary,” he added, referring to the entangled qubits.

The states of the qubits evolve according to universal time as if executing steps in a computer code, giving rise to warped, relativistic time in the bulk of the AdS space. The only thing is, that’s not quite how it works in our universe.

Here, the space-time fabric has a “de Sitter” geometry, stretching as you look into the distance. The fabric stretches until the universe hits a very different sort of boundary from the one in AdS space: the end of time. At that point, in an event known as “heat death,” space-time will have stretched so much that everything in it will become causally disconnected from everything else, such that no signals can ever again travel between them. The familiar notion of time breaks down. From then on, nothing happens.

On the timeless boundary of our space-time bubble, the entanglements linking together qubits (and encoding the universe’s dynamical interior) would presumably remain intact, since these quantum correlations do not require that signals be sent back and forth. But the state of the qubits must be static and timeless. This line of reasoning suggests that somehow, just as the qubits on the boundary of AdS space give rise to an interior with one extra spatial dimension, qubits on the timeless boundary of de Sitter space must give rise to a universe with time — dynamical time, in particular. Researchers haven’t yet figured out how to do these calculations. “In de Sitter space,” Swingle said, “we don’t have a good idea for how to understand the emergence of time.”

Read the entire article here.

Image: Image of (1 + 1)-dimensional anti-de Sitter space embedded in flat (1 + 2)-dimensional space. The t1- and t2-axes lie in the plane of rotational symmetry, and the x1-axis is normal to that plane. The embedded surface contains closed timelike curves circling the x1 axis, though these can be eliminated by “unrolling” the embedding (more precisely, by taking the universal cover). Courtesy: Krishnavedala. Wikipedia. Creative Commons Attribution-Share Alike 3.0.
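
For readers who want the caption's geometry in symbols, here is the standard hyperboloid model of (1 + 1)-dimensional anti-de Sitter space (with α the curvature radius); this is the textbook construction the caption describes:

```latex
% AdS_2 as a hyperboloid in flat (1+2)-dimensional space R^{2,1},
% which has two time directions (t_1, t_2) and one space direction (x_1):
ds^2_{\text{flat}} = -dt_1^2 - dt_2^2 + dx_1^2, \qquad
\text{AdS}_2 = \{\, -t_1^2 - t_2^2 + x_1^2 = -\alpha^2 \,\}.
% At fixed x_1, the circle t_1^2 + t_2^2 = \alpha^2 + x_1^2 is a closed
% timelike curve; passing to the universal cover ("unrolling" the angle
% around the x_1-axis) removes these curves.
```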

MondayMap: A Global Radio Roadtrip

radio-garden-screenshot1

As a kid my radio allowed me to travel the world. I could use the dial to transport myself over border walls and across oceans to visit new cultures and discover new sounds. I’d always eagerly anticipate the next discovery as I carefully moved the dial around the Short Wave, Long Wave (and later the FM) spectrum, waiting for new music and voices to replace the soothing crackle and hiss of the intervening static.

So, what a revelation it is to stumble across Radio.Garden. It’s a glorious app that combines the now-archaic radio dial with the power of the internet, enabling you to journey around the globe on a virtual radio roadtrip.

Trek to Tromsø north of the Arctic Circle in Norway, then hop over to Omsk in central Russia. Check out the meditative tunes in Kathmandu before heading southwest to Ruwi, Oman, on the Persian Gulf. Stop over in Kuching, Malaysia, then visit Nhulunbuy in Australia’s Northern Territory. Take in a mid-Pacific talk radio show in Bairiki, in the Republic of Kiribati, then some salsa-inspired tunes in Tacna, Peru, followed by pounding Brazilian Euro-techno in João Pessoa. Journey to Kinshasa in the DRC for some refreshing African beats, then rest for the day with some lively conversation from Parma, Italy, in the Apennine Mountains.

radio-garden-screenshot2

During this wonderful border-free journey one thing becomes crystal clear: we are part of one global community with much in common. History will eventually prove the racists and xenophobes among us wrong.

Images: Screenshots of Radio.Garden. Courtesy of Radio.Garden.

Computational Folkloristics

hca_by_thora_hallager_1869

What do you get when you set AI (artificial intelligence) the task of reading through 30,000 Danish folk and fairy tales? Well, you get a host of fascinating, newly discovered insights into Scandinavian witches and trolls.

More importantly, you hammer another nail into the coffin of literary criticism and set AI on a collision course with yet another preserve of once-exclusive human endeavor. It’s probably safe to assume that creative writing will fall to intelligent machines in the not-too-distant future as well — certainly, human-powered investigative journalism seemed to become extinct in 2016, replaced by algorithmic aggregation, social bots and fake-mongers.

From aeon:

Where do witches come from, and what do those places have in common? While browsing a large collection of traditional Danish folktales, the folklorist Timothy Tangherlini and his colleague Peter Broadwell, both at the University of California, Los Angeles, decided to find out. Armed with a geographical index and some 30,000 stories, they developed WitchHunter, an interactive ‘geo-semantic’ map of Denmark that highlights the hotspots for witchcraft.

The system used artificial intelligence (AI) techniques to unearth a trove of surprising insights. For example, they found that evil sorcery often took place close to Catholic monasteries. This made a certain amount of sense, since Catholic sites in Denmark were tarred with diabolical associations after the Protestant Reformation in the 16th century. By plotting the distance and direction of witchcraft relative to the storyteller’s location, WitchHunter also showed that enchantresses tend to be found within the local community, much closer to home than other kinds of threats. ‘Witches and robbers are human threats to the economic stability of the community,’ the researchers write. ‘Yet, while witches threaten from within, robbers are generally situated at a remove from the well-described village, often living in woods, forests, or the heath … it seems that no matter how far one goes, nor where one turns, one is in danger of encountering a witch.’
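
To make the “distance and direction relative to the storyteller” idea concrete, here is a minimal Python sketch of that kind of geo-semantic calculation. The coordinates and story tags below are hypothetical; this is not Tangherlini and Broadwell’s actual pipeline:

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) and initial bearing (degrees)
    from a storyteller's location to a story's location."""
    R = 6371.0  # Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    d = R * math.acos(
        min(1.0, math.sin(p1) * math.sin(p2)
            + math.cos(p1) * math.cos(p2) * math.cos(dlon)))
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return d, bearing

# Hypothetical records: (threat type, storyteller lat/lon, story lat/lon).
stories = [
    ("witch",  55.40, 10.39, 55.42, 10.44),   # close to home
    ("robber", 55.40, 10.39, 55.95, 11.80),   # out on the heath
]

for threat, *coords in stories:
    km, brg = distance_and_bearing(*coords)
    print(f"{threat:>6}: {km:6.1f} km away, bearing {brg:5.1f} degrees")
```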

Such ‘computational folkloristics’ raise a big question: what can algorithms tell us about the stories we love to read? Any proposed answer seems to point to as many uncertainties as it resolves, especially as AI technologies grow in power. Can literature really be sliced up into computable bits of ‘information’, or is there something about the experience of reading that is irreducible? Could AI enhance literary interpretation, or will it alter the field of literary criticism beyond recognition? And could algorithms ever derive meaning from books in the way humans do, or even produce literature themselves?

Author and computational linguist Inderjeet Mani concludes his essay thus:

Computational analysis and ‘traditional’ literary interpretation need not be a winner-takes-all scenario. Digital technology has already started to blur the line between creators and critics. In a similar way, literary critics should start combining their deep expertise with ingenuity in their use of AI tools, as Broadwell and Tangherlini did with WitchHunter. Without algorithmic assistance, researchers would be hard-pressed to make such supernaturally intriguing findings, especially as the quantity and diversity of writing proliferates online.

In the future, scholars who lean on digital helpmates are likely to dominate the rest, enriching our literary culture and changing the kinds of questions that can be explored. Those who resist the temptation to unleash the capabilities of machines will have to content themselves with the pleasures afforded by smaller-scale, and fewer, discoveries. While critics and book reviewers may continue to be an essential part of public cultural life, literary theorists who do not embrace AI will be at risk of becoming an exotic species – like the librarians who once used index cards to search for information.

Read the entire tale here.

Image: Portrait of the Danish writer Hans Christian Andersen. Courtesy: Thora Hallager, 10/16 October 1869. Wikipedia. Public Domain.

Wound Man

wound-man-wellcome-library-ms-49

No, the image is not a still from a forthcoming episode of Law & Order or Criminal Minds. Nor is it a nightmarish Hieronymus Bosch artwork.

Rather, “Wound Man”, as he was known, is a visual table of contents to a medieval manuscript of medical cures, treatments and surgeries. Wound Man first appeared in German surgical texts in the early 15th century. Arranged around each of his various wounds and ailments are references to further details on appropriate treatments. For instance, reference number 38 alongside an arrow penetrating Wound Man’s thigh, “An arrow whose shaft is still in place”, leads to details on how to address the wound — presumably a relatively common occurrence in the Middle Ages.

From Public Domain Review:

Staring impassively out of the page, he bears a multitude of graphic wounds. His skin is covered in bleeding cuts and lesions, stabbed and sliced by knives, spears and swords of varying sizes, many of which remain in the skin, protruding porcupine-like from his body. Another dagger pierces his side, and through his strangely transparent chest we see its tip puncture his heart. His thighs are pierced with arrows, some intact, some snapped down to just their heads or shafts. A club slams into his shoulder, another into the side of his face.

His neck, armpits and groin sport rounded blue buboes, swollen glands suggesting that the figure has contracted plague. His shins and feet are pockmarked with clustered lacerations and thorn scratches, and he is beset by rabid animals. A dog, snake and scorpion bite at his ankles, a bee stings his elbow, and even inside the cavity of his stomach a toad aggravates his innards.

Despite this horrendous cumulative barrage of injuries, however, the Wound Man is very much alive. For the purpose of this image was not to threaten or inspire fear, but to herald potential cures for all of the depicted maladies. He contrarily represented something altogether more hopeful than his battered body: an arresting reminder of the powerful knowledge that could be channelled and dispensed in the practice of late medieval medicine.

The earliest known versions of the Wound Man appeared at the turn of the fifteenth century in books on the surgical craft, particularly works from southern Germany associated with the renowned Würzburg surgeon Ortolf von Baierland (died before 1339). Accompanying a text known as the “Wundarznei” (The Surgery), these first Wound Men effectively functioned as a human table of contents for the cures contained within the relevant treatise. Look closely at the remarkable Wound Man shown above from the Wellcome Library’s MS. 49 – a miscellany including medical material produced in Germany in about 1420 – and you see that the figure is penetrated not only by weapons but also by text.

Read the entire article here.

Image: The Wound Man. Courtesy: Wellcome Library, MS. 49 (CC BY 4.0). Via Public Domain Review.

Fake News: Who’s to Blame?

alien-abduction-walton

Should we blame the creative originators of fake news, conspiracy theories, disinformation and click-bait hype? Or should we blame the media for disseminating, spinning and aggrandizing these stories for their own profit or political motives? Or should we blame us, the witless consumers?

I subscribe to the opinion that all three constituencies share responsibility — it’s very much a symbiotic relationship.

James Warren, chief media writer for Poynter, has a different opinion; he lays the blame squarely at the feet of gullible and unquestioning citizens. He makes a very compelling argument.

Perhaps, if any educated political scholars remain several hundred years from now, they’ll regard the US presidential election of 2016 as the culmination of a process in which lazy stupidity triumphed over healthy skepticism and reason.

From Hive:

The rise of “fake news” inspires the press to uncover its many practitioners worldwide, discern its economics and herald the alleged guilt-ridden soul-searching by its greatest enablers, Facebook and Google.

But the media dances around another reality with the dexterity of Beyonce, Usher and septuagenarian Mick Jagger: the stupidity of a growing number of Americans.

So thanks to Neal Gabler for taking to Bill Moyers’ website to pen, “Who’s Really to Blame for Fake News.” (Moyers)

Fake news, of course, “is an assault on the very principle of truth itself: a way to upend the reference points by which mankind has long operated. You could say, without exaggeration, that fake news is actually an attempt to reverse the Enlightenment. And because a democracy relies on truth — which is why dystopian writers have always described how future oligarchs need to undermine it — fake news is an assault on democracy as well.”

Gabler is identified here as the author of five books, without mentioning any. Well, one is 1995’s Winchell: Gossip, Power and the Culture of Celebrity. It’s a superb look at Walter Winchell, the man who really invented the gossip column and wound up with a readership and radio audience of 50 million, or two-thirds of the then-population, as he helped create our modern media world of privacy-invading gossip and personal destruction as entertainment.

“What is truly horrifying is that fake news is not the manipulation of an unsuspecting public,” Gabler writes of our current mess. “Quite the opposite. It is willful belief by the public. In effect, the American people are accessories in their own disinformation campaign. That is our current situation, and it is no sure thing that either truth or democracy survives.”

Think of it. The goofy stories, the lies, the conspiracy theories that now routinely gain credibility among millions who can’t be bothered to read a newspaper or decent digital site and can’t differentiate between Breitbart and The New York Times. Ask all those pissed-off Trump loyalists in rural towns to name their two U.S. senators.

We love convincing ourselves of the strengths of democracy, including the inevitable collective wisdom setting us back on a right track if ever we go astray. And while the media may hold itself out as cultural anthropologists in explaining the “anger” or “frustration” of “real people,” as is the case after Donald Trump’s election victory, we won’t really underscore rampant illiteracy and incomprehension.

So read Gabler. “Above all else, fake news is a lazy person’s news. It provides passive entertainment, demanding nothing of us. And that is a major reason we now have a fake news president.”

Read the entire essay here.

Image: Artist’s conception of an alien spacecraft tractor-beaming a human victim. Courtesy: unknown artist, Wikipedia. Public Domain.

Uber For…

google-search-uber

There’s an Uber for pet-sitters (Rover). There’s an Uber for dog walkers (Wag). There’s an Uber for private jets (JetMe). There are several Ubers for alcohol (Minibar, Saucey, Drizly, Thirstie). In fact, enter the keywords “Uber for…” into Google and the search engine will return “Uber for kids, Uber for icecream, Uber for news, Uber for seniors, Uber for trucks, Uber for haircuts, Uber for iPads (?), Uber for food, Uber for undertakers (??)…” and thousands of other results.

The list of Uber-like copycats, startups and ideas is seemingly endless — a sign, without doubt, that we have indeed reached peak Uber. Perhaps VCs in the valley should move on to some more meaningful investments before the Uber bubble bursts.

From Wired:

“Uber for X” has been the headline of more than four hundred news articles. Thousands of would-be entrepreneurs used the phrase to describe their companies in their pitch decks. On one site alone—AngelList, where startups can court angel investors and employees—526 companies included “Uber for” in their listings. As a judge for various emerging technology startup competitions, I saw “Uber for” so many times that at some point, I developed perceptual blindness.

Nearly all the organizations I advised at that time wanted to know about the “Uber for” of their respective industries. A university wanted to develop an “Uber for tutoring”; a government agency was hoping to solve an impending transit issue with an “Uber for parking.” I knew that “Uber for” had reached critical mass when one large media organization, in need of a sustainable profit center, pitched me their “Uber for news strategy.”

“We’re going to be the Uber for news,” the news exec told me. Confused, I asked what, exactly, he meant by that.

“Three years from now, we’ll have an on-demand news platform for Millennials. They tap a button on their phones and they get the news delivered right to them, wherever they are,” the editor said enthusiastically. “This is the future of news!”

“Is it an app?” I asked, trying to understand.

“Maybe. The point is that you get the news right away, when you want it, wherever you are,” the exec said.

“So you mean an app,” I pressed. “Yes!” he said. “But more like Uber.”

The mass “Uber for X” excitement is a good example of what happens when we don’t stop to investigate a trend, asking difficult questions and challenging our cherished beliefs. We need to first understand what, exactly, Uber is and what led to entrepreneurs coining that catchphrase.

Read the entire story here.

Image courtesy of Google Search.

The Anomaly

Is the smallest, lightest, most ghostly particle about to upend our understanding of the universe? Recently, the ephemeral neutrino has begun to give up some of its secrets. Beginning in 1998, the neutrino experiments at Super-Kamiokande and the Sudbury Neutrino Observatory showed for the first time that neutrinos oscillate between three flavors. In 2015, two physicists were awarded the Nobel Prize for this discovery, which also proved that neutrinos must have mass. More recently, a small anomaly has surfaced at the Super-Kamiokande detector which, it is hoped, could shed light on why the universe is constructed primarily from matter and not antimatter.
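
The oscillation the experiments measure follows a compact formula. Here’s a minimal sketch of the standard two-flavor approximation; the baseline matches T2K’s roughly 295 km Tokai-to-Kamioka distance, but the mixing parameters are rough textbook-scale values, not the experiment’s fitted numbers:

```python
import math

def oscillation_probability(L_km: float, E_GeV: float,
                            sin2_2theta: float, dm2_eV2: float) -> float:
    """Two-flavor neutrino appearance probability:
    P = sin^2(2*theta) * sin^2(1.267 * dm^2[eV^2] * L[km] / E[GeV])."""
    return sin2_2theta * math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2

# Illustrative numbers only: a ~295 km baseline and a sub-GeV beam.
for E in (0.4, 0.6, 0.8):
    p = oscillation_probability(L_km=295, E_GeV=E,
                                sin2_2theta=0.09, dm2_eV2=2.5e-3)
    print(f"E = {E:.1f} GeV -> P(nu_mu -> nu_e) ~ {p:.3f}")
```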

From Quanta:

The anomaly, detected by the T2K experiment, is not yet pronounced enough to be sure of, but it and the findings of two related experiments “are all pointing in the same direction,” said Hirohisa Tanaka of the University of Toronto, a member of the T2K team who presented the result to a packed audience in London earlier this month.

“A full proof will take more time,” said Werner Rodejohann, a neutrino specialist at the Max Planck Institute for Nuclear Physics in Heidelberg who was not involved in the experiments, “but my and many others’ feeling is that there is something real here.”

The long-standing puzzle to be solved is why we and everything we see is matter-made. More to the point, why does anything — matter or antimatter — exist at all? The reigning laws of particle physics, known as the Standard Model, treat matter and antimatter nearly equivalently, respecting (with one known exception) so-called charge-parity, or “CP,” symmetry: For every particle decay that produces, say, a negatively charged electron, the mirror-image decay yielding a positively charged antielectron occurs at the same rate. But this cannot be the whole story. If equal amounts of matter and antimatter were produced during the Big Bang, equal amounts should have existed shortly thereafter. And since matter and antimatter annihilate upon contact, such a situation would have led to the wholesale destruction of both, resulting in an empty cosmos.

Somehow, significantly more matter than antimatter must have been created, such that a matter surplus survived the annihilation and now holds sway. The question is, what CP-violating process beyond the Standard Model favored the production of matter over antimatter?

Many physicists suspect that the answer lies with neutrinos — ultra-elusive, omnipresent particles that pass unfelt through your body by the trillions each second.

Read the entire article here.

Robots Beware. Humans Are Still (Sort of) Smarter Than You

So, it looks like we humans may have a few more years to go as the smartest beings on the planet, before being overrun by ubiquitous sentient robots. Some may question my assertion based on recent election results in the UK and the US, but I digress.

A recent experiment featuring some of our best-loved voice-activated assistants, such as Apple’s Siri, Amazon’s Alexa and Google’s Home, clearly shows our digital brethren have some learning to do. A conversation between two of these assistants rapidly enters an infinite loop.
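
The mechanism is easy to caricature: each device treats the other’s answer as a fresh command, so neither ever stops. A toy Python sketch (the canned replies are invented and bear no relation to any vendor’s actual logic):

```python
def make_assistant(name: str, reply_template: str):
    """A toy 'voice assistant' that answers every prompt with a canned
    response -- which the other assistant hears as a new prompt."""
    def respond(prompt: str) -> str:
        return reply_template.format(prompt=prompt)
    respond.name = name
    return respond

echo = make_assistant("Echo", "Here is what I found for: {prompt}")
home = make_assistant("Home", "Did you mean: {prompt}?")

message = "What's the weather?"
for turn in range(6):  # capped here; the real exchange has no such guard
    speaker = echo if turn % 2 == 0 else home
    message = speaker(message)
    print(f"{speaker.name}: {message[:60]}")
```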

Read more about this here.

Video: Echo/Google Home infinite loop. Courtesy: Adam Jakowenko.

The Existential Dangers of the Online Echo Chamber

google-search-fake-news

The online filter bubble is a natural extension of our preexisting biases, particularly evident in our media consumption. Those of us of a certain age — above 30 years — once purchased (and maybe still do) our favorite paper-based newspapers and glued ourselves to our favorite TV news channels. These sources mirrored, for the most part, our cultural and political preferences. The internet took this a step further by building a tightly wound, self-reinforcing feedback loop: we consume our favorite online media, which prompts the algorithms to deliver more of the same. I’ve written about the filter bubble for years (here, here and here).
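
That self-reinforcing loop can be simulated in a few lines. A deliberately crude sketch (uniform topics and a naive click model, nothing like any real platform’s ranking system) showing how recommendations collapse onto a user’s slight initial bias:

```python
import random

random.seed(1)
topics = ["left", "right", "sports", "science", "celebrity"]
weights = {t: 1.0 for t in topics}  # the recommender's model of the user
preferred = "left"                  # the user's slight preexisting bias

for step in range(200):
    # Recommend a topic in proportion to the learned weights.
    shown = random.choices(topics, [weights[t] for t in topics])[0]
    # The user clicks their preferred topic more often (60% vs 10%).
    clicked = random.random() < (0.6 if shown == preferred else 0.1)
    if clicked:
        weights[shown] += 1.0       # reinforce whatever got engagement

total = sum(weights.values())
for t in topics:
    print(f"{t:>9}: {100 * weights[t] / total:5.1f}% of recommendations")
```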

The online filter bubble in which each of us lives — those of us online — may seem no more dangerous than its offline predecessor. After all, the online version of the NYT delivers left-of-center news, just like its printed cousin. So what’s the big deal? Well, the pervasiveness of our technology has now enabled these filters to creep insidiously into many aspects of our lives, from news consumption and entertainment programming to shopping and even dating. And, since we now spend growing swathes of our time online, our serendipitous exposure to varied content that typically lies outside this bubble in the real, offline world is diminishing. Consequently, the online filter bubble is taking on a much more critical role and having a greater effect in maintaining our tunnel vision.

However, that’s not all. Over the last few years we have been exposed to yet another dangerous phenomenon that has made the jump from the offline world to online: the echo chamber. The online echo chamber is enabled by our like-minded online communities and catalyzed by the tools of social media. And it turns our filter bubble into a self-reinforcing, exclusionary community that is harmful to varied, reasoned opinion and healthy skepticism.

Those of us who reside on Facebook are likely to be part of a very homogeneous social circle, which trusts, shares and reinforces information accepted by the group and discards information that does not match the group’s social norms. This makes the spread of misinformation — fake stories, conspiracy theories, hoaxes, rumors — so very effective. Importantly, this is increasingly to the exclusion of all else, including real news and accepted scientific fact.

Why embrace objective journalism, trusted science and thoughtful political dialogue when you can get a juicy, emotive meme from a friend of a friend on Facebook? Why trust a story from Reuters or science from Scientific American when you get your “news” via a friend’s link from Alex Jones and the Breitbart News Network?

And there’s no simple solution, which puts many of our once-trusted institutions in severe jeopardy. Those of us who care have a duty to ensure these issues are in the minds of our public officials and the guardians of our technology and media networks.

From Scientific American:

If you get your news from social media, as most Americans do, you are exposed to a daily dose of hoaxes, rumors, conspiracy theories and misleading news. When it’s all mixed in with reliable information from honest sources, the truth can be very hard to discern.

In fact, my research team’s analysis of data from Columbia University’s Emergent rumor tracker suggests that this misinformation is just as likely to go viral as reliable information.

Many are asking whether this onslaught of digital misinformation affected the outcome of the 2016 U.S. election. The truth is we do not know, although there are reasons to believe it is entirely possible, based on past analysis and accounts from other countries. Each piece of misinformation contributes to the shaping of our opinions. Overall, the harm can be very real: If people can be conned into jeopardizing our children’s lives, as they do when they opt out of immunizations, why not our democracy?

As a researcher on the spread of misinformation through social media, I know that limiting news fakers’ ability to sell ads, as recently announced by Google and Facebook, is a step in the right direction. But it will not curb abuses driven by political motives.

Read the entire article here.

Image courtesy of Google Search.

The Birthday Problem

birthday_paradox

I first came across the Birthday Problem during my first few days of my first year of secondary school in London [that would be 6th grade for my US readers]. My mathematics teacher at the time realized the need to discuss abstract problems in concrete terms, especially in statistics and probability. So, he wowed many of us — in a class of close to 30 kids — by firmly stating that there was a better than even chance that two of us shared the same birthday. In a class of 30, the actual probability is about 70 percent, and it rises to nearly 100 percent in a group of only 60.
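
If you’d like to check the arithmetic, here is a short Python sketch computing the exact probability (assuming 365 equally likely birthdays and ignoring leap years):

```python
def shared_birthday_probability(n: int) -> float:
    """P(at least one shared birthday among n people) =
    1 - (365/365) * (364/365) * ... * ((365 - n + 1)/365)."""
    p_all_distinct = 1.0
    for k in range(n):
        p_all_distinct *= (365 - k) / 365
    return 1.0 - p_all_distinct

for n in (10, 23, 30, 60):
    print(f"n = {n:2d}: {shared_birthday_probability(n):.1%}")
# n = 23 is the famous threshold where the odds pass 50%;
# at n = 30 the probability is already about 71%, and at 60 about 99.4%.
```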

Startlingly, two in our class did indeed share the same birthday. How could that be possible, I wondered?

Well, the answer is grounded in the simple probability of large populations. But, it is also colored by our selective biases to remember “remarkable” coincidences and to ignore the much, much larger number of instances where there is no coincidence at all.

From the Washington Post.

Mathematician Joseph Mazur was in the back of a van snaking through the mountains of Sardinia when he heard one of his favorite coincidence stories. The driver, an Italian language teacher named Francesco, told of meeting a woman named Manuela who had come to study at his school. Francesco and Manuela met for the first time in a hotel lobby, and then went to have coffee.

They spoke for an hour, getting acquainted, before the uncomfortable truth came out. Noting Manuela’s nearly perfect Italian, Francesco finally asked why she decided to come to his school.

“She said, ‘Italian? What are you talk about? I’m not here to learn Italian,’” Mazur relates. “And then it dawned on both of them that she was the wrong Manuela and he was the wrong Francesco.” They returned to the hotel lobby where they had met to find a different Francesco offering a different Manuela a job she didn’t want or expect.

The tale is one of the many stories that populate Mazur’s new book, “Fluke,” in which he explores the probability of coincidences.

Read the entire article here.

Image: The computed probability of at least two people sharing a birthday versus the number of people. Courtesy: Rajkiran g / Wikipedia. CC BY-SA 3.0.

Surplus Humans and the Death of Work

detroit-industry-north-wall-diego-rivera

It’s a simple equation: too many humans, not enough work. Low-paying physical jobs continue to disappear, replaced by mechanization. And cognitive work, the kind characterized by the need to think, is increasingly likely to be automated and robotized as well. This has complex and dire consequences: not just global economic ramifications, but moral ones. What are we to make of ourselves, and of a culture that has intimately linked work with meaning, when the work is outsourced or eliminated entirely?

A striking example comes from the richest country in the world — the United States. Recently, and anomalously, life expectancy has declined among white people in economically depressed areas of the nation. Many economists suggest that the quest for ever-increasing productivity — usually delivered through automation — is chipping away at the very essence of what it means to be human: finding value and purpose through work.

James Livingston, professor of history at Rutgers University, summarizes the existential dilemma, excerpted below, in his latest book No More Work: Why Full Employment is a Bad Idea.

From aeon:

Work means everything to us Americans. For centuries – since, say, 1650 – we’ve believed that it builds character (punctuality, initiative, honesty, self-discipline, and so forth). We’ve also believed that the market in labour, where we go to find work, has been relatively efficient in allocating opportunities and incomes. And we’ve believed that, even if it sucks, a job gives meaning, purpose and structure to our everyday lives – at any rate, we’re pretty sure that it gets us out of bed, pays the bills, makes us feel responsible, and keeps us away from daytime TV.

These beliefs are no longer plausible. In fact, they’ve become ridiculous, because there’s not enough work to go around, and what there is of it won’t pay the bills – unless of course you’ve landed a job as a drug dealer or a Wall Street banker, becoming a gangster either way.

These days, everybody from Left to Right – from the economist Dean Baker to the social scientist Arthur C Brooks, from Bernie Sanders to Donald Trump – addresses this breakdown of the labour market by advocating ‘full employment’, as if having a job is self-evidently a good thing, no matter how dangerous, demanding or demeaning it is. But ‘full employment’ is not the way to restore our faith in hard work, or in playing by the rules, or in whatever else sounds good. The official unemployment rate in the United States is already below 6 per cent, which is pretty close to what economists used to call ‘full employment’, but income inequality hasn’t changed a bit. Shitty jobs for everyone won’t solve any social problems we now face.

Don’t take my word for it, look at the numbers. Already a fourth of the adults actually employed in the US are paid wages lower than would lift them above the official poverty line – and so a fifth of American children live in poverty. Almost half of employed adults in this country are eligible for food stamps (most of those who are eligible don’t apply). The market in labour has broken down, along with most others.

Those jobs that disappeared in the Great Recession just aren’t coming back, regardless of what the unemployment rate tells you – the net gain in jobs since 2000 still stands at zero – and if they do return from the dead, they’ll be zombies, those contingent, part-time or minimum-wage jobs where the bosses shuffle your shift from week to week: welcome to Wal-Mart, where food stamps are a benefit.

Read the entire essay here.

Image: Detroit Industry North Wall, Diego Rivera. Courtesy: Detroit Institute of Arts. Wikipedia.

Breathe, You’re On Vacation

google-search-vacation

I’m lucky enough to be able to take a couple of vacations [holidays for my British readers] each year. Over the decades my vacations have generally tended to fall into two categories. First, there is the inactive, relaxing vacation of nothingness. This usually involves lounging and listening to ocean waves break along a beautiful beach, reading some choice literature and just, well, relaxing — when without kids in tow. Second, there is the active vacation spent trekking in the wilderness or discovering a far-flung natural or cultural wonder of the world.

However, even though I began these vacation rituals with my parents when I was a child myself, and have now done this for decades, I may have had the idea of a vacation completely wrong. Apparently, the ideal vacation must involve breathing, mindfulness, and self-improvement. So, forget the relaxation.

Ironically, it seems that Google has yet to learn about our active needs for vacation wellness and enrichment. Search for “vacation” online and Google will first deliver many thousands of images of people relaxing at the beach under a deep blue sky.

From NYT:

When I was 22, I used to have a fantasy about going away to a sanitarium, like in “The Magic Mountain.” I would do nothing but sit on balconies, wrapped in steamer rugs, and go to the doctor, avoiding the rigors of the real world and emerging after a short period brighter, happier, better.

I’m beginning to think this was a prescient impulse. Over the decades we have embraced a widening and diverse array of practices and traditions, but the idea that we can be improved — in mind, body or spirit — has remained a constant. That this could be accomplished with money and in an allotted parcel of time has become increasingly popular with a generation reared in a maximalist minimalist moment that, as with fashion and interior design, demands grandiose, well-documented freedom from the world. If stuff was once an indicator of security, now the very lack of it — of dust, of furniture, of body fat, of errant thoughts — defines aspiration. A glamorous back-to-nature exercise in pricey self-abnegation has become the logical way to spend one’s leisure time.

We live in a golden age of the “wellness vacation,” a sort of hybrid retreat, boot camp, spa and roving therapy session that, for the cost of room and board, promises to refresh body and mind and send you back to your life more whole. Pravassa, a “wellness travel company,” summarizes its (trademarked) philosophy as “Breathe. Experience. Move. Mindfulness. Nourish.” (The Kripalu Center for Yoga & Health, a wellness retreat in New England, boasts the eerily similar tagline: “Breathe. Connect. Move. Discover. Shine.”) A 10-day trip to Thailand with Pravassa includes a travel guide — who works, in her day job, as a “mindfulness-based psychotherapist” in Atlanta — as well as temple pilgrimages at dawn and, more abstractly, the potential to bring all that mindfulness back home with you. Selfies are not only allowed but encouraged.

Read the entire article here.

Image courtesy of Google Search.

MondayMap: National Superlatives

international-number-ones-2016

OK, I must admit that some maps can be somewhat dubious. Or, is it all maps?

Despite their shaky foundations, some maps have formed the basis for many centuries of human (mis-)understanding, only to be subsequently overturned by a new (and improved) chart. For instance, the geocentric models of our cosmos, courtesy of Aristotle and Ptolemy, were not replaced for around 1,400 years, until Nicolaus Copernicus proposed a heliocentric view of the solar system.

So, take with a grain of salt the latest view of our globe, courtesy of David McCandless. He compiled this esoteric worldview from a collection of global data sources, on the premise that every nation is the best at something.

Looks like the US is “best” at spam (not the luncheon meat), while Russia leads in, of course, dash cams.

Beware. Economic Growth May Kill You

There is a long-held belief that economic growth and prosperity make for a happier, healthier populace. Most economists and social scientists, and indeed laypeople, have subscribed to this idea for many decades.

But, this may be completely wrong.

A handful of contrarian economists began noticing a strange paradox in their research, starting with studies published in 2000. The evidence suggests that rising incomes and personal well-being are linked in the opposite way: when the US economy is improving, people suffer more medical problems and die faster.

How could this be? Well, put simply, there are three main factors: increased pollution from increased industrial activity; greater occupational hazards from increased work; and higher exposure to risky behaviors from greater income.

From the Washington Post:

Yet in recent years, accumulating evidence suggests that rising incomes and personal well-being are linked in the opposite way. It seems that economic growth actually kills people.

Christopher Ruhm, an economics professor at the University of Virginia, was one of the first to notice this paradox. In a 2000 paper, he showed that when the American economy is on an upswing, people suffer more medical problems and die faster; when the economy falters, people tend to live longer.

“It’s very puzzling,” says Adriana Lleras-Muney, an economics professor at the University of California, Los Angeles. “We know that people in rich countries live longer than people in poor countries. There’s a strong relationship between GDP and life expectancy, suggesting that more money is better. And yet, when the economy is doing well, when it’s growing faster than average, we find that more people are dying.”

In other words, there are great benefits to being wealthy. But the process of becoming wealthy — well, that seems to be dangerous.

Lleras-Muney and her colleagues, David Cutler of Harvard and Wei Huang of the National Bureau of Economic Research, believe they can explain why. They have conducted one of the most comprehensive investigations yet of this phenomenon, analyzing over 200 years of data from 32 countries. In a draft of their research, released last week, they lay out something of a grand unified theory of life, death and economic growth.

To start, the economists confirm that when a country’s economic output — its GDP — is higher than expected, mortality rates are also higher than expected.

The data show that when economies are growing particularly fast, emissions and pollution are also on the rise. After controlling for changes in air quality, the economists find that economic growth doesn’t seem to impact death rates as much. “As much as two-thirds of the adverse effect of booms may be the result of increased pollution,” they write.

A booming economy spurs death in other ways too. People start to spend more time at their jobs, exposing them to occupational hazards, as well as the stress of overwork. People drive more, leading to an increase in traffic-related fatalities. People also drink more, causing health problems and accidents. In particular, the economists’ data suggest that alcohol-related mortality is the second-most important explanation, after pollution, for the connection between economic growth and death rates.

This is consistent with other studies finding that people are more likely to die right after they receive their tax rebates. More income makes it easier for people to pay for health care and other basic necessities, but it also makes it easier for people to engage in risky activities and hurt themselves.
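
For the statistically curious, “controlling for changes in air quality” means adding pollution as a regressor and seeing how much of the growth coefficient remains. A toy sketch with synthetic data and invented coefficients (plain least squares, far simpler than the economists’ actual panel models):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

growth = rng.normal(0, 1, n)                      # GDP growth above trend
pollution = 0.8 * growth + rng.normal(0, 0.5, n)  # booms raise emissions
mortality = 0.5 * pollution + 0.1 * growth + rng.normal(0, 1, n)

def ols(y, *regressors):
    """Least-squares coefficients (intercept first)."""
    X = np.column_stack([np.ones(len(y)), *regressors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

naive = ols(mortality, growth)
controlled = ols(mortality, growth, pollution)
print(f"growth coefficient, no control:     {naive[1]:.2f}")
print(f"growth coefficient, with pollution: {controlled[1]:.2f}")
# Most of the naive 'growth kills' effect loads onto pollution once it
# enters the regression, mirroring the finding described above.
```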

Read the entire story here.
