Vera Rubin: Astronomy Pioneer

Vera Rubin passed away on December 26, 2016, aged 88. She was a pioneer in the male-dominated world of astronomy, notable for her original work on dark matter, galaxy rotation and galaxy clumping.

From Popular Science:

Vera Rubin, who essentially created a new field of astronomy by discovering dark matter, was a favorite to win the Nobel Prize in physics for years. But she never received her early-morning call from Stockholm. On Sunday, she died at the age of 88.

Rubin’s death would sadden the scientific community under the best of circumstances. Countless scientists were inspired by her work. Countless scientists are researching questions that wouldn’t exist if not for her work. But her passing brings another blow: The Nobel Prize cannot be awarded posthumously. The most prestigious award in physics will never be bestowed upon a woman who was inarguably deserving.

In the 1960s and ’70s, Rubin and her colleague Kent Ford found that the stars within spiral galaxies weren’t behaving as the laws of physics dictated that they should. This strange spinning led her and others to conclude that some unseen mass must be influencing the galactic rotation. This unknown matter—now dubbed dark matter—outnumbers the traditional stuff by at least five to one. This is a big deal.

Read more here.

Spacetime Without the Time

Since they were first dreamed up, explanations of the very small (quantum mechanics) and the very large (general relativity) have both been highly successful at describing their respective spheres of influence. Yet these two descriptions of our physical universe are not compatible, particularly when it comes to describing gravity. Indeed, physicists and theorists have struggled for decades to unite the two frameworks. Many agree that we need a new theory (of everything).

One new idea, from theorist Erik Verlinde of the University of Amsterdam, proposes that time is an emergent construct (it’s not a fundamental building block) and that dark matter is an illusion.

From Quanta:

Theoretical physicists striving to unify quantum mechanics and general relativity into an all-encompassing theory of quantum gravity face what’s called the “problem of time.”

In quantum mechanics, time is universal and absolute; its steady ticks dictate the evolving entanglements between particles. But in general relativity (Albert Einstein’s theory of gravity), time is relative and dynamical, a dimension that’s inextricably interwoven with directions x, y and z into a four-dimensional “space-time” fabric. The fabric warps under the weight of matter, causing nearby stuff to fall toward it (this is gravity), and slowing the passage of time relative to clocks far away. Or hop in a rocket and use fuel rather than gravity to accelerate through space, and time dilates; you age less than someone who stayed at home.

Unifying quantum mechanics and general relativity requires reconciling their absolute and relative notions of time. Recently, a promising burst of research on quantum gravity has provided an outline of what the reconciliation might look like — as well as insights on the true nature of time.

As I described in an article this week on a new theoretical attempt to explain away dark matter, many leading physicists now consider space-time and gravity to be “emergent” phenomena: Bendy, curvy space-time and the matter within it are a hologram that arises out of a network of entangled qubits (quantum bits of information), much as the three-dimensional environment of a computer game is encoded in the classical bits on a silicon chip. “I think we now understand that space-time really is just a geometrical representation of the entanglement structure of these underlying quantum systems,” said Mark Van Raamsdonk, a theoretical physicist at the University of British Columbia.

Researchers have worked out the math showing how the hologram arises in toy universes that possess a fisheye space-time geometry known as “anti-de Sitter” (AdS) space. In these warped worlds, spatial increments get shorter and shorter as you move out from the center. Eventually, the spatial dimension extending from the center shrinks to nothing, hitting a boundary. The existence of this boundary — which has one fewer spatial dimension than the interior space-time, or “bulk” — aids calculations by providing a rigid stage on which to model the entangled qubits that project the hologram within. “Inside the bulk, time starts bending and curving with the space in dramatic ways,” said Brian Swingle of Harvard and Brandeis universities. “We have an understanding of how to describe that in terms of the ‘sludge’ on the boundary,” he added, referring to the entangled qubits.

The states of the qubits evolve according to universal time as if executing steps in a computer code, giving rise to warped, relativistic time in the bulk of the AdS space. The only thing is, that’s not quite how it works in our universe.

Here, the space-time fabric has a “de Sitter” geometry, stretching as you look into the distance. The fabric stretches until the universe hits a very different sort of boundary from the one in AdS space: the end of time. At that point, in an event known as “heat death,” space-time will have stretched so much that everything in it will become causally disconnected from everything else, such that no signals can ever again travel between them. The familiar notion of time breaks down. From then on, nothing happens.

On the timeless boundary of our space-time bubble, the entanglements linking together qubits (and encoding the universe’s dynamical interior) would presumably remain intact, since these quantum correlations do not require that signals be sent back and forth. But the state of the qubits must be static and timeless. This line of reasoning suggests that somehow, just as the qubits on the boundary of AdS space give rise to an interior with one extra spatial dimension, qubits on the timeless boundary of de Sitter space must give rise to a universe with time — dynamical time, in particular. Researchers haven’t yet figured out how to do these calculations. “In de Sitter space,” Swingle said, “we don’t have a good idea for how to understand the emergence of time.”

Read the entire article here.

Image: (1 + 1)-dimensional anti-de Sitter space embedded in flat (1 + 2)-dimensional space. The t1- and t2-axes lie in the plane of rotational symmetry, and the x1-axis is normal to that plane. The embedded surface contains closed timelike curves circling the x1 axis, though these can be eliminated by “unrolling” the embedding (more precisely, by taking the universal cover). Courtesy: Krishnavedala. Wikipedia. Creative Commons Attribution-Share Alike 3.0.

MondayMap: A Global Radio Roadtrip

radio-garden-screenshot1

As a kid, my radio allowed me to travel the world. I could use the dial to transport myself over border walls and across oceans to visit new cultures and discover new sounds. I’d always eagerly anticipate the next discovery as I carefully moved the dial around the shortwave and longwave (and later FM) bands, waiting for new music and voices to replace the soothing crackle and hiss of the intervening static.

So, what a revelation it is to stumble across Radio.Garden. It’s a glorious app that combines the now-arcane radio dial with the power of the internet, enabling you to journey around the globe on a virtual radio roadtrip.

Trek to Tromsø, north of the Arctic Circle in Norway, then hop over to Omsk in central Russia. Check out the meditative tunes in Kathmandu before heading southwest to Ruwi, Oman, on the Persian Gulf. Stop over in Kuching, Malaysia, then visit Nhulunbuy in Australia’s Northern Territory. Take in a mid-Pacific talk radio show in Bairiki, in the Republic of Kiribati, then some salsa-inspired tunes in Tacna, Peru, followed by pounding Brazilian Euro-techno in João Pessoa. Journey to Kinshasa in the DRC for some refreshing African beats, then rest for the day with some lively conversation from Parma, in the Italian Apennines.

radio-garden-screenshot2

During this wonderful border-free journey one thing becomes crystal clear: we are part of one global community with much in common. History will eventually prove the racists and xenophobes among us wrong.

Images: Screenshots of Radio.Garden. Courtesy of Radio.Garden.

Computational Folkloristics

What do you get when you set AI (artificial intelligence) the task of reading through 30,000 Danish folk and fairy tales? Well, you get a host of fascinating, newly discovered insights into Scandinavian witches and trolls.

More importantly, you hammer another nail into the coffin of literary criticism and set AI on a collision course with yet another preserve of once-exclusive human endeavor. It’s probably safe to assume that creative writing will fall to intelligent machines in the not-too-distant future (as well) — certainly, human-powered investigative journalism seemed to become extinct in 2016, replaced by algorithmic aggregation, social bots and fake-mongers.

From aeon:

Where do witches come from, and what do those places have in common? While browsing a large collection of traditional Danish folktales, the folklorist Timothy Tangherlini and his colleague Peter Broadwell, both at the University of California, Los Angeles, decided to find out. Armed with a geographical index and some 30,000 stories, they developed WitchHunter, an interactive ‘geo-semantic’ map of Denmark that highlights the hotspots for witchcraft.

The system used artificial intelligence (AI) techniques to unearth a trove of surprising insights. For example, they found that evil sorcery often took place close to Catholic monasteries. This made a certain amount of sense, since Catholic sites in Denmark were tarred with diabolical associations after the Protestant Reformation in the 16th century. By plotting the distance and direction of witchcraft relative to the storyteller’s location, WitchHunter also showed that enchantresses tend to be found within the local community, much closer to home than other kinds of threats. ‘Witches and robbers are human threats to the economic stability of the community,’ the researchers write. ‘Yet, while witches threaten from within, robbers are generally situated at a remove from the well-described village, often living in woods, forests, or the heath … it seems that no matter how far one goes, nor where one turns, one is in danger of encountering a witch.’

Such ‘computational folkloristics’ raise a big question: what can algorithms tell us about the stories we love to read? Any proposed answer seems to point to as many uncertainties as it resolves, especially as AI technologies grow in power. Can literature really be sliced up into computable bits of ‘information’, or is there something about the experience of reading that is irreducible? Could AI enhance literary interpretation, or will it alter the field of literary criticism beyond recognition? And could algorithms ever derive meaning from books in the way humans do, or even produce literature themselves?

Author and computational linguist Inderjeet Mani concludes his essay thus:

Computational analysis and ‘traditional’ literary interpretation need not be a winner-takes-all scenario. Digital technology has already started to blur the line between creators and critics. In a similar way, literary critics should start combining their deep expertise with ingenuity in their use of AI tools, as Broadwell and Tangherlini did with WitchHunter. Without algorithmic assistance, researchers would be hard-pressed to make such supernaturally intriguing findings, especially as the quantity and diversity of writing proliferates online.

In the future, scholars who lean on digital helpmates are likely to dominate the rest, enriching our literary culture and changing the kinds of questions that can be explored. Those who resist the temptation to unleash the capabilities of machines will have to content themselves with the pleasures afforded by smaller-scale, and fewer, discoveries. While critics and book reviewers may continue to be an essential part of public cultural life, literary theorists who do not embrace AI will be at risk of becoming an exotic species – like the librarians who once used index cards to search for information.

Read the entire tale here.

Image: Portrait of the Danish writer Hans Christian Andersen. Courtesy: Thora Hallager, 10/16 October 1869. Wikipedia. Public Domain.

Wound Man

wound-man-wellcome-library-ms-49

No, the image is not a still from a forthcoming episode of Law & Order or Criminal Minds. Nor is it a nightmarish Hieronymus Bosch artwork.

Rather, “Wound Man”, as he was known, is a visual table of contents to a medieval manuscript of medical cures, treatments and surgeries. Wound Man first appeared in German surgical texts in the early 15th century. Arranged around each of his various wounds and ailments are references to further details on appropriate treatments. For instance, reference number 38 alongside an arrow penetrating Wound Man’s thigh, “An arrow whose shaft is still in place”, leads to details on how to address the wound — presumably a relatively common occurrence in the Middle Ages.

From Public Domain Review:

Staring impassively out of the page, he bears a multitude of graphic wounds. His skin is covered in bleeding cuts and lesions, stabbed and sliced by knives, spears and swords of varying sizes, many of which remain in the skin, protruding porcupine-like from his body. Another dagger pierces his side, and through his strangely transparent chest we see its tip puncture his heart. His thighs are pierced with arrows, some intact, some snapped down to just their heads or shafts. A club slams into his shoulder, another into the side of his face.

His neck, armpits and groin sport rounded blue buboes, swollen glands suggesting that the figure has contracted plague. His shins and feet are pockmarked with clustered lacerations and thorn scratches, and he is beset by rabid animals. A dog, snake and scorpion bite at his ankles, a bee stings his elbow, and even inside the cavity of his stomach a toad aggravates his innards.

Despite this horrendous cumulative barrage of injuries, however, the Wound Man is very much alive. For the purpose of this image was not to threaten or inspire fear, but to herald potential cures for all of the depicted maladies. He contrarily represented something altogether more hopeful than his battered body: an arresting reminder of the powerful knowledge that could be channelled and dispensed in the practice of late medieval medicine.

The earliest known versions of the Wound Man appeared at the turn of the fifteenth century in books on the surgical craft, particularly works from southern Germany associated with the renowned Würzburg surgeon Ortolf von Baierland (died before 1339). Accompanying a text known as the “Wundarznei” (The Surgery), these first Wound Men effectively functioned as a human table of contents for the cures contained within the relevant treatise. Look closely at the remarkable Wound Man shown above from the Wellcome Library’s MS. 49 – a miscellany including medical material produced in Germany in about 1420 – and you see that the figure is penetrated not only by weapons but also by text.

Read the entire article here.

Image: The Wound Man. Courtesy: Wellcome Library’s MS. 49 — Source (CC BY 4.0). Public Domain Review.

Fake News: Who’s To Blame?

Should we blame the creative originators of fake news, conspiracy theories, disinformation and click-bait hype? Or should we blame the media for disseminating, spinning and aggrandizing these stories for their own profit or political motives? Or should we blame us, the witless consumers?

I subscribe to the opinion that all three constituencies share responsibility — it’s very much a symbiotic relationship.

James Warren, chief media writer for Poynter, has a different opinion; he lays the blame squarely at the feet of gullible and unquestioning citizens. He makes a very compelling argument.

Perhaps, if any educated political scholars remain several hundred years from now, they’ll hold up the US presidential election of 2016 as the culmination of a process in which lazy stupidity triumphed over healthy skepticism and reason.

From Hive:

The rise of “fake news” inspires the press to uncover its many practitioners worldwide, discern its economics and herald the alleged guilt-ridden soul-searching by its greatest enablers, Facebook and Google.

But the media dances around another reality with the dexterity of Beyonce, Usher and septuagenarian Mick Jagger: the stupidity of a growing number of Americans.

So thanks to Neal Gabler for taking to Bill Moyers’ website to pen, “Who’s Really to Blame for Fake News.” (Moyers)

Fake news, of course, “is an assault on the very principle of truth itself: a way to upend the reference points by which mankind has long operated. You could say, without exaggeration, that fake news is actually an attempt to reverse the Enlightenment. And because a democracy relies on truth — which is why dystopian writers have always described how future oligarchs need to undermine it — fake news is an assault on democracy as well.”

Gabler is identified here as the author of five books, without mentioning any. Well, one is 1995’s Winchell: Gossip, Power and the Culture of Celebrity. It’s a superb look at Walter Winchell, the man who really invented the gossip column and wound up with a readership and radio audience of 50 million, or two-thirds of the then-population, as he helped create our modern media world of privacy-invading gossip and personal destruction as entertainment.

“What is truly horrifying is that fake news is not the manipulation of an unsuspecting public,” Gabler writes of our current mess. “Quite the opposite. It is willful belief by the public. In effect, the American people are accessories in their own disinformation campaign. That is our current situation, and it is no sure thing that either truth or democracy survives.”

Think of it. The goofy stories, the lies, the conspiracy theories that now routinely gain credibility among millions who can’t be bothered to read a newspaper or decent digital site and can’t differentiate between Breitbart and The New York Times. Ask all those pissed-off Trump loyalists in rural towns to name their two U.S. senators.

We love convincing ourselves of the strengths of democracy, including the inevitable collective wisdom setting us back on a right track if ever we go astray. And while the media may hold itself out as cultural anthropologists in explaining the “anger” or “frustration” of “real people,” as is the case after Donald Trump’s election victory, we won’t really underscore rampant illiteracy and incomprehension.

So read Gabler. “Above all else, fake news is a lazy person’s news. It provides passive entertainment, demanding nothing of us. And that is a major reason we now have a fake news president.”

Read the entire essay here.

Image: Artist’s conception of an alien spacecraft tractor-beaming a human victim. Courtesy: unknown artist, Wikipedia. Public Domain.

Uber For…

google-search-uber

There’s an Uber for pet-sitters (Rover). There’s an Uber for dog walkers (Wag). There’s an Uber for private jets (JetMe). There are several Ubers for alcohol (Minibar, Saucey, Drizly, Thirstie). In fact, enter the keywords “Uber for…” into Google and the search engine will return “Uber for kids, Uber for icecream, Uber for news, Uber for seniors, Uber for trucks, Uber for haircuts, Uber for iPads (?), Uber for food, Uber for undertakers (??)…” and thousands of other results.

The list of Uber-like copycats, startups and ideas is seemingly endless — a sign, without doubt, that we have indeed reached peak Uber. Perhaps VCs in the Valley should move on to some more meaningful investments before the Uber bubble bursts.

From Wired:

“Uber for X” has been the headline of more than four hundred news articles. Thousands of would-be entrepreneurs used the phrase to describe their companies in their pitch decks. On one site alone—AngelList, where startups can court angel investors and employees—526 companies included “Uber for” in their listings. As a judge for various emerging technology startup competitions, I saw “Uber for” so many times that at some point, I developed perceptual blindness.

Nearly all the organizations I advised at that time wanted to know about the “Uber for” of their respective industries. A university wanted to develop an “Uber for tutoring”; a government agency was hoping to solve an impending transit issue with an “Uber for parking.” I knew that “Uber for” had reached critical mass when one large media organization, in need of a sustainable profit center, pitched me their “Uber for news strategy.”

“We’re going to be the Uber for news,” the news exec told me. Confused, I asked what, exactly, he meant by that.

“Three years from now, we’ll have an on-demand news platform for Millennials. They tap a button on their phones and they get the news delivered right to them, wherever they are,” the editor said enthusiastically. “This is the future of news!”

“Is it an app?” I asked, trying to understand.

“Maybe. The point is that you get the news right away, when you want it, wherever you are,” the exec said.

“So you mean an app,” I pressed. “Yes!” he said. “But more like Uber.”

The mass “Uber for X” excitement is a good example of what happens when we don’t stop to investigate a trend, asking difficult questions and challenging our cherished beliefs. We need to first understand what, exactly, Uber is and what led to entrepreneurs coining that catchphrase.

Read the entire story here.

Image courtesy of Google Search.

The Anomaly

Is the smallest, lightest, most ghostly particle about to upend our understanding of the universe? Recently, the ephemeral neutrino has begun to give up some of its secrets. Beginning in 1998, the neutrino experiments at Super-Kamiokande and the Sudbury Neutrino Observatory showed for the first time that neutrinos oscillate among three flavors. In 2015, two physicists were awarded the Nobel Prize for this discovery, which also proved that neutrinos must have mass. More recently, a small anomaly has surfaced at the Super-Kamiokande detector which, it is hoped, could shed light on why the universe is constructed primarily from matter and not antimatter.

From Quanta:

The anomaly, detected by the T2K experiment, is not yet pronounced enough to be sure of, but it and the findings of two related experiments “are all pointing in the same direction,” said Hirohisa Tanaka of the University of Toronto, a member of the T2K team who presented the result to a packed audience in London earlier this month.

“A full proof will take more time,” said Werner Rodejohann, a neutrino specialist at the Max Planck Institute for Nuclear Physics in Heidelberg who was not involved in the experiments, “but my and many others’ feeling is that there is something real here.”

The long-standing puzzle to be solved is why we and everything we see is matter-made. More to the point, why does anything — matter or antimatter — exist at all? The reigning laws of particle physics, known as the Standard Model, treat matter and antimatter nearly equivalently, respecting (with one known exception) so-called charge-parity, or “CP,” symmetry: For every particle decay that produces, say, a negatively charged electron, the mirror-image decay yielding a positively charged antielectron occurs at the same rate. But this cannot be the whole story. If equal amounts of matter and antimatter were produced during the Big Bang, equal amounts should have existed shortly thereafter. And since matter and antimatter annihilate upon contact, such a situation would have led to the wholesale destruction of both, resulting in an empty cosmos.

Somehow, significantly more matter than antimatter must have been created, such that a matter surplus survived the annihilation and now holds sway. The question is, what CP-violating process beyond the Standard Model favored the production of matter over antimatter?

Many physicists suspect that the answer lies with neutrinos — ultra-elusive, omnipresent particles that pass unfelt through your body by the trillions each second.

Read the entire article here.

Robots Beware. Humans Are Still (Sort of) Smarter Than You

[tube]ZfCfTYZJWtI[/tube]

So, it looks like we humans may have a few more years to go as the smartest beings on the planet, before being overrun by ubiquitous sentient robots. Some may question my assertion based on recent election results in the UK and the US, but I digress.

A recent experiment featuring some of our best-loved voice-activated assistants, such as Apple’s Siri, Amazon’s Alexa and Google Home, clearly shows our digital brethren have some learning to do. A conversation between two of these assistants rapidly enters an infinite loop.

Read more about this here.

Video: Echo/Google Home infinite loop. Courtesy: Adam Jakowenko.

The Existential Dangers of the Online Echo Chamber

google-search-fake-news

The online filter bubble is a natural extension of our preexisting biases, particularly evident in our media consumption. Those of us of a certain age — above 30 years — once purchased (and maybe still do) our favorite paper-based newspapers and glued ourselves to our favorite TV news channels. These sources mirrored, for the most part, our cultural and political preferences. The internet took this a step further by building a tightly wound, self-reinforcing feedback loop: we consume our favorite online media, which prompts recommendation algorithms to deliver more of the same. I’ve written about the filter bubble for years (here, here and here).

The online filter bubble in which each of us lives — those of us online — may seem no more dangerous than its offline predecessor. After all, the online version of the NYT delivers left-of-center news, just like its printed cousin. So what’s the big deal? Well, the pervasiveness of our technology has now enabled these filters to creep insidiously into many aspects of our lives, from news consumption and entertainment programming to shopping and even dating. And, since we now spend growing swathes of our time online, our serendipitous exposure to the varied content that lies outside this bubble in the real, offline world is diminishing. Consequently, the online filter bubble is taking on a much more critical role and having a greater effect in maintaining our tunnel vision.

However, that’s not all. Over the last few years we have become exposed to yet another dangerous phenomenon to have made the jump from the offline world to online — the echo chamber. The online echo chamber is enabled by our like-minded online communities and catalyzed by the tools of social media. And, it turns our filter bubble into a self-reinforcing, exclusionary community that is harmful to varied, reasoned opinion and healthy skepticism.

Those of us who reside on Facebook are likely to be part of a very homogeneous social circle, one that trusts, shares and reinforces information accepted by the group and discards information that does not match the group’s social norms. This makes the spread of misinformation — fake stories, conspiracy theories, hoaxes, rumors — so very effective. Importantly, this increasingly happens to the exclusion of all else, including real news and accepted scientific fact.

Why embrace objective journalism, trusted science and thoughtful political dialogue when you can get a juicy, emotive meme from a friend of a friend on Facebook? Why trust a story from Reuters or science from Scientific American when you get your “news” via a friend’s link from Alex Jones and the Brietbart News Network?

And, there’s no simple solution, which puts many of our once trusted institutions in severe jeopardy. Those of us who care have a duty to ensure these issues are in the minds of our public officials and the guardians of our technology and media networks.

From Scientific American:

If you get your news from social media, as most Americans do, you are exposed to a daily dose of hoaxes, rumors, conspiracy theories and misleading news. When it’s all mixed in with reliable information from honest sources, the truth can be very hard to discern.

In fact, my research team’s analysis of data from Columbia University’s Emergent rumor tracker suggests that this misinformation is just as likely to go viral as reliable information.

Many are asking whether this onslaught of digital misinformation affected the outcome of the 2016 U.S. election. The truth is we do not know, although there are reasons to believe it is entirely possible, based on past analysis and accounts from other countries. Each piece of misinformation contributes to the shaping of our opinions. Overall, the harm can be very real: If people can be conned into jeopardizing our children’s lives, as they do when they opt out of immunizations, why not our democracy?

As a researcher on the spread of misinformation through social media, I know that limiting news fakers’ ability to sell ads, as recently announced by Google and Facebook, is a step in the right direction. But it will not curb abuses driven by political motives.

Read the entire article here.

Image courtesy of Google Search.

The Birthday Problem

birthday_paradox

I first came across the Birthday Problem in my first few days of my first year of secondary school in London [that would be 6th grade for my US readers]. My mathematics teacher at the time realized the need to discuss abstract problems in concrete terms, especially in statistics and probability. So, he wowed many of us — in a class of close to 30 kids — by firmly stating that there was a better than even chance that two of us shared the same birthday. In a class of 30, the actual probability is about 70 percent, and it rises to nearly 100 percent in a group of only 60.

Startlingly, two in our class did indeed share the same birthday. How could that be possible, I wondered?

Well, the answer is grounded in the simple probability of large populations. But, it is also colored by our selective biases to remember “remarkable” coincidences and to ignore the much, much larger number of instances where there is no coincidence at all.

From the Washington Post:

Mathematician Joseph Mazur was in the back of a van snaking through the mountains of Sardinia when he heard one of his favorite coincidence stories. The driver, an Italian language teacher named Francesco, told of meeting a woman named Manuela who had come to study at his school. Francesco and Manuela met for the first time in a hotel lobby, and then went to have coffee.

They spoke for an hour, getting acquainted, before the uncomfortable truth came out. Noting Manuela’s nearly perfect Italian, Francesco finally asked why she decided to come to his school.

“She said, ‘Italian? What are you talking about? I’m not here to learn Italian,’” Mazur relates. “And then it dawned on both of them that she was the wrong Manuela and he was the wrong Francesco.” They returned to the hotel lobby where they had met to find a different Francesco offering a different Manuela a job she didn’t want or expect.

The tale is one of the many stories that populate Mazur’s new book, “Fluke,” in which he explores the probability of coincidences.

Read the entire article here.

Image: The computed probability of at least two people sharing a birthday versus the number of people. Courtesy: Rajkiran g / Wikipedia. CC BY-SA 3.0.

Surplus Humans and the Death of Work

detroit-industry-north-wall-diego-rivera

It’s a simple equation: too many humans, not enough work. Low-paying physical jobs continue to disappear, replaced by mechanization. Cognitive work, the kind characterized by the need to think, is increasingly likely to be automated and robotized too. This has complex and dire consequences: not just global economic ramifications, but moral ones. What are we to make of ourselves, and of a culture that has intimately linked work with meaning, when the work is outsourced or eliminated entirely?

A striking example comes from the richest country in the world: the United States. Recently, and anomalously, life expectancy has decreased among white people in economically depressed areas of the nation. Many economists suggest that the quest for ever-increasing productivity, usually delivered through automation, is chipping away at the very essence of what it means to be human: finding value and purpose through work.

James Livingston, professor of history at Rutgers University, summarizes the existential dilemma in his latest book, No More Work: Why Full Employment Is a Bad Idea, excerpted below.

From aeon:

Work means everything to us Americans. For centuries – since, say, 1650 – we’ve believed that it builds character (punctuality, initiative, honesty, self-discipline, and so forth). We’ve also believed that the market in labour, where we go to find work, has been relatively efficient in allocating opportunities and incomes. And we’ve believed that, even if it sucks, a job gives meaning, purpose and structure to our everyday lives – at any rate, we’re pretty sure that it gets us out of bed, pays the bills, makes us feel responsible, and keeps us away from daytime TV.

These beliefs are no longer plausible. In fact, they’ve become ridiculous, because there’s not enough work to go around, and what there is of it won’t pay the bills – unless of course you’ve landed a job as a drug dealer or a Wall Street banker, becoming a gangster either way.

These days, everybody from Left to Right – from the economist Dean Baker to the social scientist Arthur C Brooks, from Bernie Sanders to Donald Trump – addresses this breakdown of the labour market by advocating ‘full employment’, as if having a job is self-evidently a good thing, no matter how dangerous, demanding or demeaning it is. But ‘full employment’ is not the way to restore our faith in hard work, or in playing by the rules, or in whatever else sounds good. The official unemployment rate in the United States is already below 6 per cent, which is pretty close to what economists used to call ‘full employment’, but income inequality hasn’t changed a bit. Shitty jobs for everyone won’t solve any social problems we now face.

Don’t take my word for it, look at the numbers. Already a fourth of the adults actually employed in the US are paid wages lower than would lift them above the official poverty line – and so a fifth of American children live in poverty. Almost half of employed adults in this country are eligible for food stamps (most of those who are eligible don’t apply). The market in labour has broken down, along with most others.

Those jobs that disappeared in the Great Recession just aren’t coming back, regardless of what the unemployment rate tells you – the net gain in jobs since 2000 still stands at zero – and if they do return from the dead, they’ll be zombies, those contingent, part-time or minimum-wage jobs where the bosses shuffle your shift from week to week: welcome to Wal-Mart, where food stamps are a benefit.

Read the entire essay here.

Image: Detroit Industry North Wall, Diego Rivera. Courtesy: Detroit Institute of Arts. Wikipedia.

Breathe, You’re On Vacation

I’m lucky enough to be able to take a couple of vacations [holidays for my British readers] each year. Over the decades my vacations have generally tended to fall into two categories. First, there is the inactive, relaxing vacation of nothingness. This usually involves lounging and listening to ocean waves break along a beautiful beach, reading some choice literature and just, well, relaxing — when without kids in tow. Second, there is the active vacation spent trekking in the wilderness or discovering a far-flung natural or cultural wonder of the world.

However, even though I began these vacation rituals with my parents when I was a child, and have kept them up for decades since, I may have had the idea of a vacation completely wrong. Apparently, the ideal vacation must involve breathing, mindfulness, and self-improvement. So, forget the relaxation.

Ironically, it seems that Google has yet to learn about our active needs for vacation wellness and enrichment. Search for “vacation” online and Google will first deliver many thousands of images of people relaxing at the beach under a deep blue sky.

From NYT:

When I was 22, I used to have a fantasy about going away to a sanitarium, like in “The Magic Mountain.” I would do nothing but sit on balconies, wrapped in steamer rugs, and go to the doctor, avoiding the rigors of the real world and emerging after a short period brighter, happier, better.

I’m beginning to think this was a prescient impulse. Over the decades we have embraced a widening and diverse array of practices and traditions, but the idea that we can be improved — in mind, body or spirit — has remained a constant. That this could be accomplished with money and in an allotted parcel of time has become increasingly popular with a generation reared in a minimalist moment that, as with fashion and interior design, demands grandiose, well-documented freedom from the world. If stuff was once an indicator of security, now the very lack of it — of dust, of furniture, of body fat, of errant thoughts — defines aspiration. A glamorous back-to-nature exercise in pricey self-abnegation has become the logical way to spend one’s leisure time.

We live in a golden age of the “wellness vacation,” a sort of hybrid retreat, boot camp, spa and roving therapy session that, for the cost of room and board, promises to refresh body and mind and send you back to your life more whole. Pravassa, a “wellness travel company,” summarizes its (trademarked) philosophy as “Breathe. Experience. Move. Mindfulness. Nourish.” (The Kripalu Center for Yoga & Health, a wellness retreat in New England, boasts the eerily similar tagline: “Breathe. Connect. Move. Discover. Shine.”) A 10-day trip to Thailand with Pravassa includes a travel guide — who works, in her day job, as a “mindfulness-based psychotherapist” in Atlanta — as well as temple pilgrimages at dawn and, more abstractly, the potential to bring all that mindfulness back home with you. Selfies are not only allowed but encouraged.

Read the entire article here.

Image courtesy of Google Search.

MondayMap: National Superlatives

OK, I must admit that some maps can be somewhat dubious. Or, is it all maps?

Despite their shaky foundations, some maps form the basis for many centuries of human (mis-)understanding, only to be subsequently overturned by a new (and improved) chart. For instance, the geocentric models of our cosmos, courtesy of Aristotle and Ptolemy, were not replaced for around 1,400 years, until Nicolaus Copernicus proposed a heliocentric view of the solar system.

Thus, keep in mind the latest view of our globe, courtesy of David McCandless. He compiled this esoteric worldview (every nation, it turns out, is the best at something) from a collection of global data sources.

Looks like the US is “best” at spam (not the luncheon meat), while Russia leads in, of course, dash cams.

Beware. Economic Growth May Kill You

There is a long-held belief that economic growth and prosperity make for a happier, healthier populace. Most economists and social scientists, and indeed laypeople, have subscribed to this idea for many decades.

But, this may be completely wrong.

A handful of contrarian economists began noticing a strange paradox, starting with research published in 2000. Evidence suggests that rising incomes and personal well-being are linked in the opposite way: it seems that when the US economy is improving, people suffer more medical problems and die faster.

How could this be? Well, put simply, there are three main factors: increased pollution from increased industrial activity; greater occupational hazards from increased work; and, higher exposure to risky behaviors from greater income.

From the Washington Post:

Yet in recent years, accumulating evidence suggests that rising incomes and personal well-being are linked in the opposite way. It seems that economic growth actually kills people.

Christopher Ruhm, an economics professor at the University of Virginia, was one of the first to notice this paradox. In a 2000 paper, he showed that when the American economy is on an upswing, people suffer more medical problems and die faster; when the economy falters, people tend to live longer.

“It’s very puzzling,” says Adriana Lleras-Muney, an economics professor at the University of California, Los Angeles. “We know that people in rich countries live longer than people in poor countries. There’s a strong relationship between GDP and life expectancy, suggesting that more money is better. And yet, when the economy is doing well, when it’s growing faster than average, we find that more people are dying.”

In other words, there are great benefits to being wealthy. But the process of becoming wealthy — well, that seems to be dangerous.

Lleras-Muney and her colleagues, David Cutler of Harvard and Wei Huang of the National Bureau of Economic Research, believe they can explain why. They have conducted one of the most comprehensive investigations yet of this phenomenon, analyzing over 200 years of data from 32 countries. In a draft of their research, released last week, they lay out something of a grand unified theory of life, death and economic growth.

To start, the economists confirm that when a country’s economic output — its GDP — is higher than expected, mortality rates are also higher than expected.

The data show that when economies are growing particularly fast, emissions and pollution are also on the rise. After controlling for changes in air quality, the economists find that economic growth doesn’t seem to impact death rates as much. “As much as two-thirds of the adverse effect of booms may be the result of increased pollution,” they write.

A booming economy spurs death in other ways too. People start to spend more time at their jobs, exposing them to occupational hazards, as well as the stress of overwork. People drive more, leading to an increase in traffic-related fatalities. People also drink more, causing health problems and accidents. In particular, the economists’ data suggest that alcohol-related mortality is the second-most important explanation, after pollution, for the connection between economic growth and death rates.

This is consistent with other studies finding that people are more likely to die right after they receive their tax rebates. More income makes it easier for people to pay for health care and other basic necessities, but it also makes it easier for people to engage in risky activities and hurt themselves.

Read the entire story here.

You’re Not In Control

Press a button, then something happens. Eat too much chocolate, then you feel great (and then put on weight). Step into the middle of a busy road, then you get hit by an oncoming car. Walk in the rain, then you get wet. Watch your favorite comedy show, then you laugh.

Every moment of our lives is filled with actions and consequences, causes and effects. Usually we have a good sense of what is likely to happen when we take a specific action. This sense of predictability smooths our lives and makes us feel in control.

But sometimes all is not what it seems. Take the buttons on some of the most actively used objects in our daily lives. Press the “close door” button on the elevator [or “lift” for my British readers], then the door closes, right? Press the “pedestrian crossing” button at the crosswalk [or “zebra crossing”], then the safe-to-cross signal blinks to life, right? Adjust the office thermostat, then you feel more comfortable, right?

Well, if you think that by pressing a button you are commanding the elevator door to close, or the crosswalk signal to flash, or the thermostat to change the office temperature, you’re probably wrong. You may feel in control, but actually you’re not. In many cases the button may serve no functional purpose; the systems just work automatically. But the button still serves a psychological purpose — a placebo-like effect. We are so conditioned to the notion that pressing a button yields an action that we still feel in control even when the button does nothing beyond making an audible click.

From the NYT:

Pressing the door-close button on an elevator might make you feel better, but it will do nothing to hasten your trip.

Karen W. Penafiel, executive director of National Elevator Industry Inc., a trade group, said the close-door feature faded into obsolescence a few years after the enactment of the Americans With Disabilities Act in 1990.

The legislation required that elevator doors remain open long enough for anyone who uses crutches, a cane or wheelchair to get on board, Ms. Penafiel said in an interview on Tuesday. “The riding public would not be able to make those doors close any faster,” she said.

The buttons can be operated by firefighters and maintenance workers who have the proper keys or codes.

No figures were available for the number of elevators still in operation with functioning door-close buttons. Given that the estimated useful life of an elevator is 25 years, it is likely that most elevators in service today have been modernized or refurbished, rendering the door-close buttons a thing of the past for riders, Ms. Penafiel said.

Read the entire story here.

Image: Elevator control panel, cropped to show only dual “door open” and “door close” buttons. Courtesy: Nils R. Barth. Wikipedia. Creative Commons CC0 1.0 Universal Public Domain Dedication.

How and Why Did Metamorphosis Evolve?

Evolution is a truly wondrous thing. It has given us eyes and lots of grey matter [which we still don’t use very well]. It has given us the beautiful tiger, and the shimmering hues and soaring songs of our birds. It has given us the towering Sequoias, creepy insects, gorgeous ocean-bound creatures, and invisible bacteria and viruses. Yet for all its wondrous adaptations, one evolutionary invention still seems mysteriously supernatural — metamorphosis.

So, how and why did it evolve? A compelling new theory on the origins of insect metamorphosis by James W. Truman and Lynn M. Riddiford is excerpted below (from a detailed article in Scientific American).

The theory posits that a beneficial mutation around 300 million years ago led to the emergence of metamorphosis in insects:

By combining evidence from the fossil record with studies on insect anatomy and development, biologists have established a plausible narrative about the origin of insect metamorphosis, which they continue to revise as new information surfaces. The earliest insects in Earth’s history did not metamorphose; they hatched from eggs, essentially as miniature adults. Between 280 million and 300 million years ago, however, some insects began to mature a little differently—they hatched in forms that neither looked nor behaved like their adult versions. This shift proved remarkably beneficial: young and old insects were no longer competing for the same resources. Metamorphosis was so successful that, today, as many as 65 percent of all animal species on the planet are metamorphosing insects.

And, there are essentially three types of metamorphosis:

Wingless ametabolous insects, such as silverfish and bristletails, undergo little or no metamorphosis. When they hatch from eggs, they already look like adults, albeit tiny ones, and simply grow larger over time through a series of molts in which they shed their exoskeletons. Hemimetaboly, or incomplete metamorphosis, describes insects such as cockroaches, grasshoppers and dragonflies that hatch as nymphs—miniature versions of their adult forms that gradually develop wings and functional genitals as they molt and grow. Holometaboly, or complete metamorphosis, refers to insects such as beetles, flies, butterflies, moths and bees, which hatch as wormlike larvae that eventually enter a quiescent pupal stage before emerging as adults that look nothing like the larvae.

And, it’s backed by a concrete survival and reproductive advantage:

[T]he enormous numbers of metamorphosing insects on the planet speak for its success as a reproductive strategy. The primary advantage of complete metamorphosis is eliminating competition between the young and old. Larval insects and adult insects occupy very different ecological niches. Whereas caterpillars are busy gorging themselves on leaves, completely disinterested in reproduction, butterflies are flitting from flower to flower in search of nectar and mates. Because larvas and adults do not compete with one another for space or resources, more of each can coexist relative to species in which the young and old live in the same places and eat the same things.

Read the entire article here.

Image: Old World Swallowtail (Papilio machaon). Courtesy: fesoj – Otakárek fenyklový [Papilio machaon]. CC BY 2.0, https://commons.wikimedia.org/w/index.php?curid=7263187

Nightmare Machine

Now that the abject terror of the US presidential election is over — at least for a while — we have to turn our minds to new forms of pain and horror.

In recent years a growing number of illustrious scientists and technologists have described artificial intelligence (AI) as the greatest existential threat to humanity. They worry, rightfully, that a well-drilled, unfettered AI could eventually out-think and out-smart us at every level. Eventually, a super-intelligent AI would determine that humans were either peripheral or superfluous to its needs and goals, and would then either enslave or extinguish us. This is the stuff of real nightmares.

Yet, at a more playful level, AI can also learn to deliver imagined nightmares. This Halloween researchers at MIT used AI techniques to create and optimize horrifying images of human faces and places. They called their AI the Nightmare Machine.

For the first step, researchers fed hundreds of thousands of celebrity photos into their AI algorithm, known as a deep convolutional generative adversarial network. This allowed the AI to learn about faces and how to create new ones. Second, they flavored the results with a second learning algorithm that had been trained on images of zombies. The combination allowed the AI to learn the critical factors that make for scary images and to selectively improve upon them. It turns out that blood on the face, empty eye sockets, and missing or misshapen teeth tend to elicit the greatest horror and fear.

While the results are not quite as scary as Stephen Hawking’s warnings of AI-led human extinction, the images are terrifying nonetheless.

Learn more about the MIT Media Lab’s Nightmare Machine here.

Image: Horror imagery generated by artificial intelligence. Courtesy: MIT Media Lab.

Who Needs Education?

Here’s a great example of the value that some citizens place on education in the United States. It’s one more instance of a distorted system that ranks sporting success, or just the dream of it, over learning, teaching and intellectual accomplishment.

McKinney Independent School District (MISD), part of the Dallas-Fort Worth metropolitan area, approved a $70 million bond package to finance a new 12,000-seat stadium and other district improvements. By Texas standards this is small potatoes: nearby Allen ISD completed an 18,000-capacity high school stadium in 2012.

Put that into perspective: most professional sports teams in Europe, outside the premier leagues, play in lower-capacity stadiums [stadia, for my British readers].

From Guardian:

In the middle of the change from small town to booming Dallas suburb is football. Celina could end up with more than one high school and therefore more than one football team, a division of the local talent pool that would vex some. But a more immediate question is over the future need for a new stadium to house the existing team and its swelling fanbase. The current 3,800-capacity Bobcat Stadium, regularly packed, might soon be unable to cope with demand.

These are interesting times for high school football stadiums in Texas. Nearby McKinney recently approved the construction of a new $70m, 12,000-seat stadium to be shared by the city’s three high schools. That followed hard on the heels of a $60m, 18,000-capacity venue for neighboring Allen – which has one high school – completed in 2012. Local media have called the sprouting of expensive stadiums among rival school districts in affluent suburbs an arms race. The adjacent Frisco, meanwhile, entered a partnership with the Dallas Cowboys for its schools to play in the NFL team’s new indoor practice facility built in the city. The Frisco independent school district is chipping in $30m so area kids can run out at The Ford Center at The Star, capacity 12,000.

Critics argue the money could be better spent elsewhere in the education system.

Read the entire article here.

Image: Proposed McKinney High School Stadium. Courtesy McKinney Independent School District (MISD) press handout.

MondayMap: The Architecture of Music

A couple of years ago I wrote about Every Noise At Once, a visualization, with samples, of (almost) every musical genre. At last count Glenn McDonald’s brainchild had algorithmically generated and scatter-plotted 1,496 genres.

Now, courtesy of Belgian architect Kwinten Crauwels, we have the next gorgeous visual iteration of the music universe — Musicmap. It took Crauwels seven years to construct this astounding, comprehensive, interactive map of music genres, sub-genres and their relationships. It traces the genealogy of around 150 years of popular music.

Crauwels color-coded each of the major genres and devised different types of lines to show different relationships across the hundreds of genres and sub-genres. You can fly around the map to follow the links and drill-down to learn more about each musical style.

Now you can visually trace how Garage Rock is related to Detroit’s Motown and Doo Wop, or how present-day Industrial Synth evolved from the Krautrock of the 1970s.

It’s a visual, and musical, masterpiece. Read more about Musicmap here.

Image: Musicmap screenshots. Courtesy of Kwinten Crauwels, Musicmap.

The Only Gettysburg Address

One hundred and fifty-three years ago today, during the American Civil War, President Abraham Lincoln delivered one of the most memorable speeches in US history. His resonant words will continue to be taught, studied and remembered.

Four score and seven years ago our fathers brought forth on this continent, a new nation, conceived in Liberty, and dedicated to the proposition that all men are created equal.

Now we are engaged in a great civil war, testing whether that nation, or any nation so conceived and so dedicated, can long endure. We are met on a great battle-field of that war. We have come to dedicate a portion of that field, as a final resting place for those who here gave their lives that that nation might live. It is altogether fitting and proper that we should do this.

But, in a larger sense, we can not dedicate — we can not consecrate — we can not hallow — this ground. The brave men, living and dead, who struggled here, have consecrated it, far above our poor power to add or detract. The world will little note, nor long remember what we say here, but it can never forget what they did here. It is for us the living, rather, to be dedicated here to the unfinished work which they who fought here have thus far so nobly advanced. It is rather for us to be here dedicated to the great task remaining before us — that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion — that we here highly resolve that these dead shall not have died in vain — that this nation, under God, shall have a new birth of freedom — and that government of the people, by the people, for the people, shall not perish from the earth.

Others have delivered words on the hallowed grounds of Gettysburg. One recent example treated us, not to heartfelt oratory, but to whining about a rigged election, railing against the disgusting media, and regurgitating personal grievances and attacks. That train-of-thought nonsense will be discarded and forgotten, unless future scholars return to dissect the most spectacular campaign failure — and the most disgusting individual — in modern US politics.

Image: The only confirmed photo of Abraham Lincoln at Gettysburg, some three hours before the speech, 19 November 1863. Courtesy: United States Library of Congress. Public Domain.

Speaking in (Alien) Tongues

Considering that we humans cannot clearly communicate with any other living species on the planet, it seems rather fanciful that we might be able to chat with an extraterrestrial intelligence.

But some linguists have a plan should we ever come across an alien civilization or, more likely, should they ever pay Earth a visit. The idea is to develop a communication process using monolingual fieldwork.

From Scientific American:

In the upcoming sci-fi drama “Arrival,” several mysterious spacecraft touch down around the planet, and humanity is faced with how to approach—and eventually communicate—with these extraterrestrial visitors.

In the film, a team of experts is assembled to investigate, and among the chosen individuals is a linguist, played by actress Amy Adams. Though the story is rooted in science fiction, it does tackle a very real challenge: How do you communicate with someone—or how do you learn that individual’s language—when you have no intermediary language in common?

The film is based on “Story of Your Life,” a short story by Ted Chiang. It taps into the common science-fiction theme of alien tongues; not only the communication barrier they might present, but the unusual ways they could differ from human language. “There’s a long tradition of science fiction that deals with language and communication,” Chiang told Live Science in an email.

And in both the short story and film, linguists play a key role in bridging the gap between humans and aliens—something that isn’t entirely farfetched, according to Daniel Everett, a linguist at Bentley University in Massachusetts. “Linguists who’ve had extensive field experience can do this. That’s what they do,” Everett told Live Science.

Everett spent more than 30 years working with the Pirahã people of the Brazilian Amazon, learning and studying their language, which was poorly documented prior to his work. Pirahã is what’s called a language isolate, a linguistic orphan of sorts, and is the last surviving member of its language family. It is also well-known for some of its atypical qualities, such as a lack of counting numbers or relative directions, such as “left” and “right,” qualities which Everett worked out over years of study.

The people were similarly isolated, and were entirely monolingual, he said. So it didn’t matter that Everett didn’t know Portuguese. Rather than asking questions about the Pirahã language in a shared second language, he conducted his research in a style known as monolingual fieldwork.

Pointing to a nearby object, like a stick, and asking (even in English) what it’s called is typically interpreted as a cue to name it, Everett said. From the names of things, a linguist can then work their way towards actions, and how to express relationships between objects, Everett said. All the while, linguists typically transcribe the statements, paying attention to the sounds, the grammar and the way meanings are combined, building a working theory of the language, he said.

Read the entire article here.

Image: Reprint of The War of the Worlds cover-featured on the July 1951 issue of Famous Fantastic Mysteries. Public Domain.

The Next 4 Years

It’s taken me a week to recover from the visceral shock of the US Presidential election. A vile process that continued for 18 months finally culminated in the election of, quite simply, a neo-fascist-lite for our Twitter age.

Like many other so-called elitists — if we should equate elitism with a higher education — I had hoped for a different outcome. Well, it wasn’t to be. So, it’s time to accept the result and move on, right?

Not quite, since this is an existential threat to my children and our democracy like no other.

Thus, I will begin the next four years by reminding myself, and you, dear reader, of the President-elect’s vulgarities, bigotry, hypocrisy, contempt, mendacity and other dangerously ignorant, poisonous nonsense and complete bullshit from the depraved, despotic, shameless, shallow, deceitful, volatile, puerile, vindictive, noxious, boastful, misogynistic, racist, corrupt, thuggish, insensitive, naive, irrational, petulant, solipsistic, authoritarian, vengeful, disgraceful, abusive, irresponsible, narcissistic, pompous, vacuous, cowardly, amoral, self-aggrandizing, unprincipled, pathologically deranged, completely detached-from-reality (crazy), unapologetically fraudulent, chronically repulsive, thoroughly sleazy and incoherent mind and mouth of the President-elect (think about that very carefully for several minutes each day over the next 1,450 or so days).

Open-Office or Home-Based?

Enough with the open office. Despite claims that it democratizes the workspace, improves employee camaraderie and boosts interactions, the open-office layout reduces productivity.

Employers, here’s a better idea: let your employees work from home. It really works: it cuts corporate costs, increases productivity and morale, and reduces greenhouse emissions (from less commuting). Everybody wins — except, perhaps, for those who thrive on office gossip or require an in situ foosball table.

From the Washington Post:

A year ago, my boss announced that our large New York ad agency would be moving to an open office. After nine years as a senior writer, I was forced to trade in my private office for a seat at a long, shared table. It felt like my boss had ripped off my clothes and left me standing in my skivvies.

Our new, modern Tribeca office was beautifully airy, and yet remarkably oppressive. Nothing was private. On the first day, I took my seat at the table assigned to our creative department, next to a nice woman who I suspect was an air horn in a former life.  All day, there was constant shuffling, yelling, and laughing, along with loud music piped through a PA system.  As an excessive water drinker, I feared my co-workers were tallying my frequent bathroom trips.  At day’s end, I bid adieu to the 12 pairs of eyes I felt judging my 5:04 p.m. departure time. I beelined to the Beats store to purchase their best noise-cancelling headphones in an unmistakably visible neon blue.

Despite its obvious problems, the open-office model has continued to encroach on workers across the country. Now, about 70 percent of U.S. offices have no or low partitions, according to the International Facility Management Association. Silicon Valley has been the leader in bringing down the dividers. Google, Yahoo, eBay, Goldman Sachs and American Express are all adherents.  Facebook CEO Mark Zuckerberg enlisted famed architect Frank Gehry to design the largest open floor plan in the world, housing nearly 3,000 engineers. And as a businessman, Michael Bloomberg was an early adopter of the open-space trend, saying it promoted transparency and fairness. He famously carried the model into city hall when he became mayor of New York,  making “the Bullpen” a symbol of open communication and accessibility to the city’s chief.

Read the entire story here.

Image courtesy of Google Search.

Quiet Please

Our world is a noisy place. And, for all our technological progress, it is becoming increasingly noisy. Many who can afford to do so spend a significant slice of their incomes seeking the elusive places or moments that bring peace and quiet. So, it’s no surprise to see an uptick in demand for all things quiet — silent reading, silent dining, silent hiking, silent meditation.

From the Guardian:

Once the preserve of monastic retreats and hardcore meditators, simply being quiet is growing in appeal. Whole businesses have sprung up to meet a rising demand for quiet time, from silent weekend getaways to silent dining, silent reading parties and even silent dating. This month sees the release of documentary In Pursuit of Silence, a “meditative film” about our relationship with noise, promoted with a delicate two-minute trailer in which not a word is uttered.

Silence can, as the film attests, mean different things to different people. It can be a space for quiet reflection or a state fraught with discomfort. There is a certain intimacy inherent in being silent with other people – we usually do so only with those closest to us. So there is something almost radical about the recent trend for enjoying silence with strangers.

Mariel Symeonidou started a regular silent reading party in Dundee just under a year ago, in a moment of “uncharacteristic extroversion”. Readers bring their books and meet in a bar, where they read together in silence for an hour or sometimes two, then put the books away to chat and have a drink.

Read the entire article, in silence, here.

Image: Early winter, Dakota Ridge. Courtesy of the author.

Asgardia


With all this earthbound turmoil around us, perhaps it's time to move elsewhere. Asgardia? Well, almost. You may soon be able to become an Asgardian citizen. First, the project leaders must convince the United Nations that a satellite to be launched in 2017 merits legal, sovereign status. One catch, though: you'll still have to reside on Earth.

From the Guardian:

Proposals for the “first nation state in space” have been unveiled by a team of scientists and legal experts, who say the move will foster peace, open up access to space technologies and offer protection for citizens of planet Earth.

Dubbed “Asgardia” after one of the mythical worlds inhabited by the Norse gods, the team say the “new nation” will eventually become a member of the United Nations, with its own flag and anthem devised by members of the public through a series of competitions.

According to the project website, Asgardia “will offer an independent platform free from the constraint of a land-based country’s laws. It will become a place in orbit which is truly ‘no man’s land’”.

Initially, it would seem, this new nation will consist of a single satellite, scheduled to be launched next year, with its citizens residing firmly on terra firma.

Speaking to the Guardian through an interpreter, the project lead, Igor Ashurbeyli, said: “Physically the citizens of that nation state will be on Earth; they will be living in different countries on Earth, so they will be a citizen of their own country and at the same time they will be citizens of Asgardia.”

“When the number of those applications goes above 100,000 we can officially apply to the UN for the status of state,” he added.

Read the story here.

Image: Screenshot from Asgardia website.

MondayMap: Red Versus Blue


You may believe that colorful, graphical electoral analysis is a relatively recent phenomenon. You know, those cool red and blue maps (and now sometimes green or purple) of each state and county.

But our present-day news networks and the internet did not invent this type of infographic map.

Susan Schulten, chair of the history department at the University of Denver, discovered what may be the earliest example of a US county-level electoral map. Published in 1883, it shows results from the 1880 presidential election between Republican James Garfield and Democrat Winfield Hancock. Garfield won.

Two notable reversals in the 1880 map versus today’s counterpart: First, Democrats are in red; Republicans in blue. Second, Democrats make up the majority in much of the South and Midwest; Republicans rule in the Northeast. Interestingly, the color scheme switched numerous times over the last hundred years and did not formally become Democrat=Blue, Republican=Red until the 2000 election cycle.

For more fascinating details of our electoral maps, past and present, check out this article by Lazaro Gamio over at the Washington Post.

Image: Plate 11 from Scribner’s Statistical Atlas of the United States, published in 1883. Courtesy: Library of Congress. Public Domain.

Fear the First 100 Days

Imagine, in your rose-colored dreams or your darkest nightmares, what the first one hundred days of a Republican presidency would look like.

Actually, you don’t need to do much imagining since you can for the most part piece together what would become of the United States based on the daily flow of Trumpian vulgarities, bigotry, hypocrisy, contempt and other dangerously ignorant, poisonous nonsense and complete bullshit from the depraved, despotic, shameless, shallow, deceitful, volatile, puerile, vindictive, noxious, misogynistic, racist, corrupt, thuggish, insensitive, naive, irrational, petulant, authoritarian, vengeful, disgraceful, abusive, irresponsible, narcissistic, vacuous, cowardly, self-aggrandizing, unprincipled, pathologically deranged, completely detached-from-reality (crazy), unapologetically fraudulent, chronically repulsive, thoroughly sleazy and incoherent mind and mouth of the “Republican” nominee for President (think about that very carefully for several minutes).

That said, Dana Milbank over at the Washington Post reminds us of the stakes, just a couple of days away; he couldn’t have put it more clearly and succinctly:

Among things you can expect: a trade war with China and Mexico, a restarting of Iran’s nuclear program, millions losing their health insurance, the start of mass deportations, a possible military standoff with China in the South China Sea and North Korea, the resumption of waterboarding, the use of federal agencies to go after Hillary Clinton and other Trump critics, the spectacle of the commander in chief suing women who have accused him of sexual misconduct and a constitutional crisis as the president of the United States attempts to disqualify the federal judge in a fraud suit against him because the judge is Latino.

He’s not joking. Read the entire article here.