MondayMap: National Business Emotional Intelligence

A recent article in the Harvard Business Review (HBR) gives would-be business negotiators some general tips on how best to deal with counterparts from other regions of the world. After all, getting to yes and reaching a mutually beneficial agreement across all parties does require a good degree of cultural sensitivity and emotional intelligence.

map-Emotional-Counterpart

While there is no substitute for understanding other nations through travel and cultural immersion, the HBR article describes some interesting nuances to help those lacking in geographic awareness, international business experience, and cross-cultural wisdom. The first step in this exotic journey is, rather appropriately, a map.

No surprise, the Japanese and Filipinos shun business confrontation, whereas the Russians and French savor it. Northern Europeans are less emotional, while Southern Europeans and Latin Americans are much more emotionally expressive.

From Frank Jacobs over at Strange Maps:

Negotiating with Filipinos? Be warm and personal, but stay polite. Cutting das Deal with Germans? Stay cool as ice, and be tough as nails. So what happens if you’re a German doing business in the Philippines?

That’s not the question this map was designed to answer. This map — actually, a diagram — shows differences in attitudes to business negotiations in a number of countries. Familiarise yourself with them, then burn the drawing. From now on, you’ll be a master international dealmaker.

Vertically, the map distinguishes between countries where it is highly haram to show emotions during business proceedings (Japan being the prime example) and countries where emotions are an accepted part of il commercio (yes, Italians are emotional extroverts — also in business).

The horizontal axis differentiates countries with a very confrontational negotiating style — think heated arguments and slammed doors — from places where decorum is the alpha and omega of commercial dealings. For an extreme example of the former, try trading with an Israeli company. For the latter, I refer you to those personable but (apparently also) persnickety Filipinos.

Read the entire article here.

Map courtesy of Erin Meyer, professor and program director for Managing Global Virtual Teams at INSEAD, via HBR / Strange Maps.


Colonizing the Milky Way 101

ESO-The_Milky_Way_panorama

The human race is likely to spend many future generations grappling with the aftermath of its colonial sojourns across the globe. Almost every race and creed in our documented history has actively encroached upon and displaced others. By our very nature we are territorial animals, and very good ones at that.

Yet despite the untold volumes of suffering, pain and death wrought on those we colonize, our small blue planet is not enough for our fantasies and follies. We send our space probes throughout the solar system to test for habitability. We dream of human outposts on the Moon and on Mars. But even our solar system is too minuscule for our expansive, acquisitive ambitions. Why not colonize our entire galaxy? Now we’re talking!

Kim Stanley Robinson, author extraordinaire of numerous speculative and science fiction novels, gives us an idea of what it may take to spread our wings across the Milky Way in a recent article for Scientific American, excerpted here.

It will be many centuries before humans move beyond our solar system. But before we do so, I’d propose that we get our own house in order. That will be our biggest challenge, not the invention of yet-to-be-imagined technologies.

From Scientific American:

The idea that humans will eventually travel to and inhabit other parts of our galaxy was well expressed by the early Russian rocket scientist Konstantin Tsiolkovsky, who wrote, “Earth is humanity’s cradle, but you’re not meant to stay in your cradle forever.” Since then the idea has been a staple of science fiction, and thus become part of a consensus image of humanity’s future. Going to the stars is often regarded as humanity’s destiny, even a measure of its success as a species. But in the century since this vision was proposed, things we have learned about the universe and ourselves combine to suggest that moving out into the galaxy may not be humanity’s destiny after all.

The problem that tends to underlie all the other problems with the idea is the sheer size of the universe, which was not known when people first imagined we would go to the stars. Tau Ceti, one of the closest stars to us at around 12 light-years away, is 100 billion times farther from Earth than our moon. A quantitative difference that large turns into a qualitative difference; we can’t simply send people over such immense distances in a spaceship, because a spaceship is too impoverished an environment to support humans for the time it would take, which is on the order of centuries. Instead of a spaceship, we would have to create some kind of space-traveling ark, big enough to support a community of humans and other plants and animals in a fully recycling ecological system.

On the other hand it would have to be small enough to accelerate to a fairly high speed, to shorten the voyagers’ time of exposure to cosmic radiation, and to breakdowns in the ark. Regarded from some angles bigger is better, but the bigger the ark is, the proportionally more fuel it would have to carry along to slow itself down on reaching its destination; this is a vicious circle that can’t be squared. For that reason and others, smaller is better, but smallness creates problems for resource metabolic flow and ecologic balance. Island biogeography suggests the kinds of problems that would result from this miniaturization, but a space ark’s isolation would be far more complete than that of any island on Earth. The design imperatives for bigness and smallness may cross each other, leaving any viable craft in a non-existent middle.

The biological problems that could result from the radical miniaturization, simplification and isolation of an ark, no matter what size it is, now must include possible impacts on our microbiomes. We are not autonomous units; about eighty percent of the DNA in our bodies is not human DNA, but the DNA of a vast array of smaller creatures. That array of living beings has to function in a dynamic balance for us to be healthy, and the entire complex system co-evolved on this planet’s surface in a particular set of physical influences, including Earth’s gravity, magnetic field, chemical make-up, atmosphere, insolation, and bacterial load. Traveling to the stars means leaving all these influences, and trying to replace them artificially. What the viable parameters are on the replacements would be impossible to be sure of in advance, as the situation is too complex to model. Any starfaring ark would therefore be an experiment, its inhabitants lab animals. The first generation of the humans aboard might have volunteered to be experimental subjects, but their descendants would not have. These generations of descendants would be born into a set of rooms a trillion times smaller than Earth, with no chance of escape.

In this radically diminished environment, rules would have to be enforced to keep all aspects of the experiment functioning. Reproduction would not be a matter of free choice, as the population in the ark would have to maintain minimum and maximum numbers. Many jobs would be mandatory to keep the ark functioning, so work too would not be a matter of choices freely made. In the end, sharp constraints would force the social structure in the ark to enforce various norms and behaviors. The situation itself would require the establishment of something like a totalitarian state.

Read the entire article here.
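Robinson’s fuel “vicious circle” is the Tsiolkovsky rocket equation at work: propellant needs grow exponentially with total velocity change, and an ark that must also brake at its destination pays the toll twice. A back-of-envelope sketch in Python, ignoring relativity and assuming a very optimistic 50 km/s fusion-drive exhaust velocity (my number, not the article’s):

    import math

    C = 3.0e8          # speed of light, m/s
    EXHAUST_V = 5.0e4  # m/s; assumed, very optimistic fusion-drive exhaust velocity

    def log10_mass_ratio(cruise_fraction_of_c):
        """Tsiolkovsky rocket equation, m0/mf = exp(dv/ve), with the
        delta-v spent twice: once to reach cruise speed, once to brake."""
        dv = 2 * cruise_fraction_of_c * C
        return (dv / EXHAUST_V) / math.log(10)

    for f in (0.001, 0.01, 0.1):
        print(f"cruise at {f}c -> initial/final mass ratio ~ 10^{log10_mass_ratio(f):.0f}")
    # cruise at 0.001c -> ~10^5; at 0.01c -> ~10^52; at 0.1c -> ~10^521

Even at a hundredth of light speed (a roughly 1,200-year trip to Tau Ceti) this drive would need an initial-to-final mass ratio of about 10^52. Hence the bind Robinson describes: speed demands smallness, while ecology demands size.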

Image: The Milky Way panorama. Courtesy: ESO/S. Brunier – Licensed under Creative Commons.


Another Glorious Hubble Image

From the ESA/Hubble release that accompanies the image:

This NASA/ESA Hubble Space Telescope image shows the spiral galaxy NGC 4845, located over 65 million light-years away in the constellation of Virgo (The Virgin). The galaxy’s orientation clearly reveals its striking spiral structure: a flat and dust-mottled disc surrounding a bright galactic bulge.

NGC 4845’s glowing centre hosts a gigantic version of a black hole, known as a supermassive black hole. The presence of a black hole in a distant galaxy like NGC 4845 can be inferred from its effect on the galaxy’s innermost stars; these stars experience a strong gravitational pull from the black hole and whizz around the galaxy’s centre much faster than otherwise. From investigating the motion of these central stars, astronomers can estimate the mass of the central black hole — for NGC 4845 this is estimated to be hundreds of thousands of times heavier than the Sun. This same technique was also used to discover the supermassive black hole at the centre of our own Milky Way — Sagittarius A* — which weighs in at some four million times the mass of the Sun.

The galactic core of NGC 4845 is not just supermassive, but also super-hungry. In 2013 researchers were observing another galaxy when they noticed a violent flare at the centre of NGC 4845. The flare came from the central black hole tearing up and feeding off an object many times more massive than Jupiter. A brown dwarf or a large planet simply strayed too close and was devoured by the hungry core of NGC 4845.

The Hubble Space Telescope captured this recent image of spiral galaxy NGC 4845. The galaxy lies around 65 million light-years from Earth, but it still presents a gorgeous sight. NGC 4845’s glowing center hosts a supermassive, and super hungry, black hole.

Thanks NASA, but I just wish you would give these galaxies more memorable names.
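As the caption notes, the black-hole masses come from watching how fast stars orbit the core; the estimate is essentially Kepler’s third law. A back-of-envelope version in Python, using rough published figures for S2, a star that orbits Sagittarius A* (values are approximate and not from this article):

    # Kepler's third law in convenient units: M (solar masses) = a^3 / T^2,
    # with the orbit's semi-major axis a in AU and its period T in years.
    # Rough published figures for the star S2 orbiting Sagittarius A*:
    a_au, t_yr = 1000.0, 16.0
    mass_solar = a_au**3 / t_yr**2
    print(f"~{mass_solar:.1e} solar masses")  # ~3.9e+06 -- "some four million"

The same kind of estimate applied to NGC 4845’s innermost stars yields the hundreds of thousands of solar masses quoted above.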

Image: NASA/ESA Hubble Space Telescope image shows the spiral galaxy NGC 4845, located over 65 million light-years away in the constellation of Virgo. Courtesy: ESA/Hubble & NASA and S. Smartt (Queen’s University Belfast).


SPLAAT! Holy Onomatopoeia, Batman!

Batman fans rejoice. Here it is, a compendium of every ZWAPP! KAPOW! BLOOP! and THUNK! from every fight scene in the original 1960s series.

I think we can all agree that the campy caped crusaders, dastardly villains and limp fight scenes, accompanied by bright onomatopoeiac graphics, guaranteed the show would become an enduring cult classic. Check out the full list here, compiled by the forces for good over at Fast Company.

My favorites:

FLRBBBBB! GLURPP! KAPOW! KER-SPLOOSH! KLONK! OOOOFF! POWIE! QUNCKKK! URKK! ZLONK!


Video: Batman (1966): Fight Scenes, Season 1 (Pt. 1). Courtesy of corijei v2 / YouTube.


Another Corporate Empire Bites the Dust

Motorola-DynaTAC

Businesses and brands come and go. Seemingly unassailable corporations, often valued in the tens of billions of dollars (and sometimes more), fall to the incessant march of technological change and, increasingly, to the ever-fickle desires of the consumer.

And, these monoliths of business last but blinks of an eye when compared with our vast social empires, such as the Roman, Han, Ottoman, Venetian, Sudanese and Portuguese, which persisted for many hundreds — sometimes thousands — of years.

Yet even a few years ago, who would have predicted the demise of the Motorola empire, the company mostly responsible for the advent of the handheld mobile phone? Motorola had been on a downward spiral in recent years, failing in part to capitalize on the shift to smartphones, mobile operating systems and apps. Now its brand is dust. RIP brick!

From the Guardian:

Motorola, the brand which invented the mobile phone, brought us the iconic “Motorola brick”, and gave us both the first flip-phone and the iconic Razr, is to cease to exist.

Bought from Google by the Chinese smartphone and laptop powerhouse Lenovo in January 2014, Motorola had found success over the past two years. It launched the Moto G in early 2014, which propelled the brand, which had all but disappeared after the Razr, from a near-0% market share to 6% of sales in the UK.

The Moto G kickstarted the reinvigoration of the brand, which saw Motorola ship more than 10m smartphones in the third quarter of 2014, up 118% year-on-year.

But now Lenovo has announced that it will kill off the US mobile phone pioneer’s name. It will keep Moto, the part of Motorola’s product naming that has gained traction in recent years, but Moto smartphones will be branded under Lenovo.

Motorola chief operating officer Rick Osterloh told Cnet that “we’ll slowly phase out Motorola and focus on Moto”.

The Moto line will be joined by Lenovo’s Vibe line in the low end, leaving the fate of the Moto E and G uncertain. The Motorola Mobility division of Lenovo will take over responsibility for the Chinese manufacturer’s entire smartphone range.

Read the entire story here.

Image: Motorola DynaTAC 8000X commercial portable cellular phone, 1983. Courtesy of Motorola.


Meet the Broadband Preacher

Welcome-to-mississippi_I20

This fascinating article follows Roberto Gallardo, an extension professor at Mississippi State University, as he works to bring digital literacy, the internet and other services of our 21st-century electronic age to rural communities across the South. It’s an uphill struggle.

From Wired:

For a guy born and raised in Mexico, Roberto Gallardo has an exquisite knack for Southern manners. That’s one of the first things I notice about him when we meet up one recent morning at a deli in Starkville, Mississippi. Mostly it’s the way he punctuates his answers to my questions with a decorous “Yes sir” or “No sir”—a verbal tic I associate with my own Mississippi upbringing in the 1960s.

Gallardo is 36 years old, with a salt-and-pepper beard, oval glasses, and the faint remnant of a Latino accent. He came to Mississippi from Mexico a little more than a decade ago for a doctorate in public policy. Then he never left.

I’m here in Starkville, sitting in this booth, to learn about the work that has kept Gallardo in Mississippi all these years—work that seems increasingly vital to the future of my home state. I’m also here because Gallardo reminds me of my father.

Gallardo is affiliated with something called the Extension Service, an institution that dates back to the days when America was a nation of farmers. Its original purpose was to disseminate the latest agricultural know-how to all the homesteads scattered across the interior. Using land grant universities as bases of operations, each state’s extension service would deploy a network of experts and “county agents” to set up 4-H Clubs or instruct farmers in cultivation science or demonstrate how to can and freeze vegetables without poisoning yourself in your own kitchen.

State extension services still do all this, but Gallardo’s mission is a bit of an update. Rather than teach modern techniques of crop rotation, his job—as an extension professor at Mississippi State University—is to drive around the state in his silver 2013 Nissan Sentra and teach rural Mississippians the value of the Internet.

In sleepy public libraries, at Rotary breakfasts, and in town halls, he gives PowerPoint presentations that seem calculated to fill rural audiences with healthy awe for the technological sublime. Rather than go easy, he starts with a rapid-fire primer on heady concepts like the Internet of Things, the mobile revolution, cloud computing, digital disruption, and the perpetual increase of processing power. (“It’s exponential, folks. It’s just growing and growing.”) The upshot: If you don’t at least try to think digitally, the digital economy will disrupt you. It will drain your town of young people and leave your business in the dust.

Then he switches gears and tries to stiffen their spines with confidence. Start a website, he’ll say. Get on social media. See if the place where you live can finally get a high-speed broadband connection—a baseline point of entry into modern economic and civic life.

Even when he’s talking to me, Gallardo delivers this message with the straitlaced intensity of a traveling preacher. “Broadband is as essential to this country’s infrastructure as electricity was 110 years ago or the Interstate Highway System 50 years ago,” he says from his side of our booth at the deli, his voice rising high enough above the lunch-hour din that a man at a nearby table starts paying attention. “If you don’t have access to the technology, or if you don’t know how to use it, it’s similar to not being able to read and write.”

These issues of digital literacy, access, and isolation are especially pronounced here in the Magnolia State. Mississippi today ranks around the bottom of nearly every national tally of health and economic well-being. It has the lowest median household income and the highest rate of child mortality. It also ranks last in high-speed household Internet access. In human terms, that means more than a million Mississippians—over a third of the state’s population—lack access to fast wired broadband at home.

Gallardo doesn’t talk much about race or history, but that’s the broader context for his work in a state whose population has the largest percentage of African-Americans (38 percent) of any in the union. The most Gallardo will say on the subject is that he sees the Internet as a natural way to level out some of the persistent inequalities—between black and white, urban and rural—that threaten to turn parts of Mississippi into places of exile, left further and further behind the rest of the country.

And yet I can’t help but wonder how Gallardo’s work figures into the sweep of Mississippi’s history, which includes—looking back over just the past century—decades of lynchings, huge outward migrations, the fierce, sustained defense of Jim Crow, and now a period of unprecedented mass incarceration. My curiosity on this point is not merely journalistic. During the lead-up to the civil rights era, my father worked with the Extension Service in southern Mississippi as well. Because the service was segregated at the time, his title was “negro county agent.” As a very young child, I would travel from farm to farm with him. Now I’m here to travel around Mississippi with Gallardo, much as I did with my father. I want to see whether the deliberate isolation of the Jim Crow era—when Mississippi actively fought to keep itself apart from the main currents of American life—has any echoes in today’s digital divide.

Read the entire article here.

Image: Welcome to Mississippi. Courtesy of WebTV3.


Neck Tingling and ASMR

Google-search-asmr

Ever had that curious tingling sensation at the back and base of your neck? Of course you have. Perhaps you’ve felt this sensation during a particular piece of music, or from watching a key scene in a movie, or when taking in a panorama from the top of a mountain, or from smelling a childhood aroma again. In fact, most people report having felt this sensation, albeit rather infrequently.

But, despite its commonality, very little research exists to help us understand how and why it happens. Psychologists tend to agree that the highly personal and often private nature of the neck-tingling experience makes it difficult to study and hence generalize. This means, of course, that the internet is rife with hypotheses and pseudo-science. Just try searching for ASMR videos and be (not) surprised by the 2 million+ results.

From the Guardian:

Autonomous Sensory Meridian Response, or ASMR, is a curious phenomenon. Those who experience it often characterise it as a tingling sensation in the back of the head or neck, or another part of the body, in response to some sort of sensory stimulus. That stimulus could be anything, but over the past few years, a subculture has developed around YouTube videos, and their growing popularity was the focus of a video posted on the Guardian this last week. It’s well worth a watch, but I couldn’t help but feel it would have been a bit more interesting if there had been some scientific background in it. The trouble is, there isn’t actually much research on ASMR out there.

To date, only one research paper has been published on the phenomenon. In March last year, Emma Barratt, a graduate student at Swansea University, and Dr Nick Davis, then a lecturer at the same institution, published the results of a survey of some 500 ASMR enthusiasts. “ASMR is interesting to me as a psychologist because it’s a bit ‘weird’,” says Davis, now at Manchester Metropolitan University. “The sensations people describe are quite hard to describe, and that’s odd because people are usually quite good at describing bodily sensation. So we wanted to know if everybody’s ASMR experience is the same, and if people tend to be triggered by the same sorts of things.”

Read the entire story here.

Image courtesy of Google Search.


Please Laugh While You Can

Rationality requires us to laugh at the current state of the U.S. political “conversation”, as Jonathan Jones so rightly reminds us. I put “conversation” in quotes because it’s no longer a dialog, not even a heated debate or argument. Politicians have replaced rational dialog and disagreement over policy with hate speech, fear-mongering, bullying, venom, bigotry and character assassination. And, it’s all to the detriment of our democracy.

Those of us who crave a well-reasoned discussion about substantive issues and direction for our country have to gasp with utter incredulity — and then we must laugh.

From Jonathan Jones over at the Guardian:

When a man hoping to be president of the United States can sum up his own country with a photograph of a monogrammed gun and the single-word caption “America”, it may be time for the rest of the world to worry.

Instead they are laughing. Since the Republican nomination hopeful (although not very hopeful) Jeb Bush tweeted a picture of his handgun he has been mocked around the world with images that comically replace that violent symbol with the gentler images that sum up less trigger-happy places – a cup of tea for the UK, a bike for the Netherlands, a curry for Bradford.

The joke’s a bit thin, because what is currently happening in US politics is only funny if you are an alien watching from a spaceship and the fate of the entire planet is just one big laugh to you. For what is Bush trying to achieve with this picture? He’s trying to appeal to the rage and irrationality that have made Donald Trump’s bombastical assault on the White House look increasingly plausible while Bush languishes, a conventional politician swamped by unconventional times.

The centre cannot hold, WB Yeats wrote nearly a century ago, and this photograph shows exactly how off centre things are getting. When Jeb Bush – brother of one warmongering president, son of another, and a governor who sanctioned 21 executions during his tenure in Florida – embodies the centre ground, you know things have got strange. Compared with the strongman politics, explicit bigotry and perversion that a Trump presidency threatens, mere conservatism would be sweet sanity.

But this photograph reveals that that is not on offer. America, says Bush’s Twitter account, is a gun with your name on it. The candidate has his name inscribed on his weapon – Gov Jeb Bush, it says on the barrel. This man is a gun. He’s primed and loaded. You think Trump talks tough? Well, talk is cheap. “Speak softly, and carry a big stick,” said Theodore Roosevelt. Bush has got this gun, see, and he knows how to use it.

Read the entire article here.


Human Bloatware

Most software engineers and IT people are familiar with the term “bloatware”. The word is usually applied to a software application that takes up so much disk space and/or memory that its functional benefits are greatly diminished or rendered useless. Operating systems such as Windows and OS X are often characterized as bloatware — each newer version seems to require ever more disk space (and memory) to accommodate an expanding array of new (often trivial) features with marginal added benefit.

DNA_Structure

But it seems that humans did not invent such obesity through our technology. Rather, a new genetic analysis suggests that humans (and other animals) actually consist of biological bloatware, through a process that began when molecules of DNA first assembled the genes of the earliest living organisms.

From ars technica:

Eukaryotes like us are more complex than prokaryotes. We have cells with lots of internal structures, larger genomes with more genes, and our genes are more complex. Since there seems to be no apparent evolutionary advantage to this complexity—evolutionary advantage being defined as fitness, not as things like consciousness or sex—evolutionary biologists have spent much time and energy puzzling over how it came to be.

In 2010, Nick Lane and William Martin suggested that because they don’t have mitochondria, prokaryotes just can’t generate enough energy to maintain large genomes. Thus it was the acquisition of mitochondria and their ability to generate cellular energy that allowed eukaryotic genomes to expand. And with the expansion came the many different types of genes that render us so complex and diverse.

Michael Lynch and Georgi Marinov are now proposing a counter offer. They analyzed the bioenergetic costs of a gene and concluded that there is in fact no energetic barrier to genetic complexity. Rather, eukaryotes can afford bigger genomes simply because they have bigger cells.

First they looked at the lifetime energetic requirements of a cell, defined as the number of times that cell hydrolyzes ATP into ADP, a reaction that powers most cellular processes. This energy requirement rose linearly and smoothly with cell size from bacteria to eukaryotes with no break between them, suggesting that complexity alone, independently of cell volume, requires no more energy.

Then they calculated the cumulative cost of a gene—how much energy it takes to replicate it once per cell cycle, how much energy it takes to transcribe it into mRNA, and how much energy it takes to then translate that mRNA transcript into a functional protein. Genes may provide selective advantages, but those must be sufficient to overcome and justify these energetic costs.

At the levels of replication (copying the DNA) and transcription (making an RNA copy), eukaryotic genes are more costly than prokaryotic genes because they’re bigger and require more processing. But even though these costs are higher, they take up proportionally less of the total energy budget of the cell. That’s because bigger cells take more energy to operate in general (as we saw just above), while things like copying DNA only happen once per cell division. Bigger cells help here, too, as they divide less often.

Read the entire article here.
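The accounting in that last paragraph is easy to sketch. Every number below is an illustrative placeholder (real per-reaction costs and lifetime budgets span orders of magnitude and are not taken from Lynch and Marinov’s paper); the point is the structure of the calculation:

    # Toy version of the per-gene energy ledger: replication + transcription
    # + translation, as a fraction of a cell's lifetime ATP budget.
    # All numbers are illustrative placeholders, not the paper's estimates.

    def gene_cost_fraction(gene_bp, transcripts, proteins, budget_atp,
                           atp_per_bp_replicated=2,
                           atp_per_nt_transcribed=2,
                           atp_per_aa_translated=5):
        replication = gene_bp * atp_per_bp_replicated    # once per division
        transcription = transcripts * gene_bp * atp_per_nt_transcribed
        translation = proteins * (gene_bp // 3) * atp_per_aa_translated
        return (replication + transcription + translation) / budget_atp

    # The same 1 kb gene against a small and a large lifetime energy budget:
    for label, budget in (("small cell", 1e10), ("big cell", 1e13)):
        frac = gene_cost_fraction(1000, transcripts=100, proteins=10_000,
                                  budget_atp=budget)
        print(f"{label}: {frac:.1e} of the lifetime ATP budget")

The toy makes the argument in miniature: the same gene claims a far smaller slice of a large cell’s lifetime energy budget, so bigger cells can afford more, and bigger, genes.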


Documenting the Self

Samuel_Pepys

Is Nicolas Felton the Samuel Pepys of our digital age?

They both chronicled their observations over a period of 10 years, but separated by 345 years. However, that’s where the similarity between the two men ends.

Samuel Pepys was a 17th-century member of the British Parliament and naval bureaucrat, famous for his decade-long private diary. Pepys kept detailed personal notes from 1660 to 1669. The diary was subsequently published in the 19th century, and is now regarded as one of the principal sources of information on the Restoration period (the return of the monarchy under Charles II). Many a British school kid [myself included] has been exposed to Pepys’ observations of momentous events, including his tales of the plague and the Great Fire of London.

Nicolas Felton, a graphic designer and ex-Facebook employee, cataloged his life from 2005 to 2015. Based in New York, Felton began obsessively recording the minutiae of his life in 2005. He first tracked his locations and the time spent in each, followed by his music-listening habits. Then he began counting his emails, correspondence, calendar entries and photos. Felton eventually compiled his detailed digital tracks into a visually fascinating annual Feltron Report.

So, Felton is certainly no Pepys, but his data trove remains interesting nonetheless — for different reasons. Pepys recorded history during a tumultuous time in England; his very rare, detailed first-person account across an entire decade has no parallel. His diary is now an invaluable literary chronicle for scholars and history buffs.

Our world is rather different today. Our technologies now enable institutions and individuals to record and relate their observations ad nauseam. Thus Felton’s data is not unique per se, though his decade-long obsession certainly provides a quantitative trove, one not necessarily useful for historical reasons but rather to those who study our tracks and needs, and market to us.

Read Samuel Pepys’ diary here. Read more about Nicolas Felton here.

Image: Samuel Pepys by John Hayls, oil on canvas, 1666. National Portrait Gallery. Public Domain.


MondayMap: Search by State

This treasure of a map shows the most popular Google search terms by state in 2015.

Google-search-by-state-2015

The vastly different searches show how the United States really is a collection of very diverse and loosely federated communities. The US may be a great melting pot, but down at the state level its residents seem to care about very different things.

For instance, while Floridians’ favorite search was “concealed weapons permit”, residents of Mississippi went rather dubiously for “Ashley Madison”, and Oklahoma’s top search was “Caitlyn Jenner”. Kudos to my home state: residents there put aside politics, reality TV, guns and other inanities by searching most for “water on mars”. Similarly, citizens of New Mexico looked far beyond their borders by searching most for “Pluto”.

And, I have to scratch my head over New York State’s preoccupation with “Charlie Sheen HIV”, and over why Kentucky prefers “Dusty Rhodes” to Washington State’s choice, “Leonard Nimoy”.

The map was put together by the kind people at Estately. You can read more fascinating state-by-state search rankings here.


The Internet of Flow

Time-based structures of information and flowing data — on a global scale — will increasingly dominate the Web. Eventually, this flow is likely to transform how we organize, consume and disseminate our digital knowledge. While we see evidence of this today (in blogs, Facebook’s wall and timeline, and, most basically, Twitter), the long-term implications of this fundamentally new organizing principle have yet to be fully understood — especially in business.

For a brief snapshot of a possible, and likely, future of the Internet I turn to David Gelernter. He is Professor of Computer Science at Yale University, an important thinker and author who has helped shape the fields of parallel computing, artificial intelligence (AI) and networking. Many of Gelernter’s works, some written over 20 years ago, offer a remarkably prescient view, most notably: Mirror Worlds (1991), The Muse in the Machine (1994) and The Second Coming – A Manifesto (1999).

From WSJ:

People ask where the Web is going; it’s going nowhere. The Web was a brilliant first shot at making the Internet usable, but it backed the wrong horse. It chose space over time. The conventional website is “space-organized,” like a patterned beach towel—pineapples upper left, mermaids lower right. Instead it might have been “time-organized,” like a parade—first this band, three minutes later this float, 40 seconds later that band.

We go to the Internet for many reasons, but most often to discover what’s new. We have had libraries for millennia, but never before have we had a crystal ball that can tell us what is happening everywhere right now. Nor have we ever had screens, from room-sized to wrist-sized, that can show us high-resolution, constantly flowing streams of information.

Today, time-based structures, flowing data—in streams, feeds, blogs—increasingly dominate the Web. Flow has become the basic organizing principle of the cybersphere. The trend is widely understood, but its implications aren’t.

Working together at Yale in the mid-1990s, we forecast the coming dominance of time-based structures and invented software called the “lifestream.” We had been losing track of our digital stuff, which was scattered everywhere, across proliferating laptops and desktops. Lifestream unified our digital life: Each new document, email, bookmark or video became a bead threaded onto a single wire in the Cloud, in order of arrival.

To find a bead, you search, as on the Web. Or you can watch the wire and see each new bead as it arrives. Whenever you add a bead to the lifestream, you specify who may see it: everyone, my friends, me. Each post is as private as you make it.

Where do these ideas lead? Your future home page—the screen you go to first on your phone, laptop or TV—is a bouquet of your favorite streams from all over. News streams are blended with shopping streams, blogs, your friends’ streams, each running at its own speed.

This home stream includes your personal stream as part of the blend—emails, documents and so on. Your home stream is just one tiny part of the world stream. You can see your home stream in 3-D on your laptop or desktop, in constant motion on your phone or as a crawl on your big TV.

By watching one stream, you watch the whole world—all the public and private events you care about. To keep from being overwhelmed, you adjust each stream’s flow rate when you add it to your collection. The system slows a stream down by replacing many entries with one that lists short summaries—10, 100 or more.

An all-inclusive home stream creates new possibilities. You could build a smartwatch to display the stream as it flows past. It could tap you on the wrist when there’s something really important onstream. You can set something aside or rewind if necessary. Just speak up to respond to messages or add comments. True in-car computing becomes easy. Because your home stream gathers everything into one line, your car can read it to you as you drive.

Read the entire article here.
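Gelernter’s lifestream is concrete enough to sketch as a data structure: beads appended in arrival order, each carrying a visibility level, with search over the wire and a crude flow-rate control that collapses runs of beads into summaries. A minimal sketch in Python; the class and method names are my own invention, not anything from the essay:

    import time
    from dataclasses import dataclass, field

    @dataclass
    class Bead:
        content: str
        visibility: str = "me"  # "everyone" | "friends" | "me"
        stamp: float = field(default_factory=time.time)

    class Lifestream:
        def __init__(self):
            self.wire = []  # beads kept in order of arrival

        def add(self, content, visibility="me"):
            self.wire.append(Bead(content, visibility))

        def search(self, term, viewer="me"):
            allowed = {"me": {"everyone", "friends", "me"},
                       "friend": {"everyone", "friends"},
                       "public": {"everyone"}}[viewer]
            return [b for b in self.wire
                    if b.visibility in allowed and term in b.content]

        def slowed(self, n=10):
            """Flow-rate control: replace each run of n beads with one summary."""
            chunks = [self.wire[i:i + n] for i in range(0, len(self.wire), n)]
            return [" | ".join(b.content[:20] for b in chunk) for chunk in chunks]

    stream = Lifestream()
    stream.add("email: lunch on Friday?", "friends")
    stream.add("bookmark: WSJ on streams", "everyone")
    print(stream.search("streams", viewer="public"))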


A Gravitational Wave Comes Ashore

ligo-gravitational-waves-detection

February 11, 2016 was a historic day for astronomers the world over: scientists announced a monumental discovery, one actually made on September 14, 2015! Thank you, LIGO: the era of gravitational-wave (G-wave) astronomy has begun.

One hundred years after a prediction from Einstein’s theory of general relativity, scientists have their first direct evidence of gravitational waves. These waves are ripples in the fabric of spacetime itself, rather than the movement of fields and particles, as with electromagnetic radiation. The ripples show up when gravitationally immense bodies warp the structure of space in which they sit, such as through collisions or acceleration.

ligo-hanford-aerial

As you might imagine, observing such disturbances here on Earth, from sources tens to hundreds of millions of light-years away, requires not only vastly powerful forces at one end but immensely sensitive instruments at the other. The detector credited with the discovery is the Laser Interferometer Gravitational-Wave Observatory, or LIGO. It is so sensitive that it can detect a change in the length of its 4 km measurement arms — probed by infrared laser beams — 10,000 times smaller than the width of a proton. LIGO is operated by Caltech and MIT and supported by the U.S. National Science Foundation.
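That sensitivity claim is easy to sanity-check. Each LIGO arm is 4 km long and a proton is roughly 1.7 x 10^-15 m across, so the quoted displacement corresponds to a strain of a few parts in 10^23:

    # Back-of-envelope check on the sensitivity claim.
    arm_length = 4.0e3      # m; each LIGO arm is 4 km long
    proton_width = 1.7e-15  # m; approximate diameter of a proton

    delta_l = proton_width / 10_000  # "10,000 times smaller than a proton"
    strain = delta_l / arm_length    # dimensionless strain h = dL / L
    print(f"dL ~ {delta_l:.1e} m, strain ~ {strain:.1e}")
    # dL ~ 1.7e-19 m, strain ~ 4.2e-23

That lands in the 10^-22 to 10^-23 strain range usually quoted for Advanced LIGO’s most sensitive band.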

Prof Kip Thorne, one of the founders of LIGO, said that until now, astronomers had looked at the universe as if on a calm sea. That has now changed. He added:

“The colliding black holes that produced these gravitational waves created a violent storm in the fabric of space and time, a storm in which time speeded up and slowed down, and speeded up again, a storm in which the shape of space was bent in this way and that way.”

And, as Prof Stephen Hawking remarked:

“Gravitational waves provide a completely new way of looking at the universe. The ability to detect them has the potential to revolutionise astronomy. This discovery is the first detection of a black hole binary system and the first observation of black holes merging.”

Congratulations to the many hundreds of engineers, technicians, researchers and theoreticians who have collaborated on this ground-breaking experiment. Particular congratulations go to LIGO’s three principal instigators: Rainer Weiss, Kip Thorne and Ronald Drever.

This discovery paves the way for deeper understanding of our cosmos and lays the foundation for a new and rich form of astronomy through gravitational observations.

Galileo’s first telescopes opened our eyes to the visual splendor of our solar system and its immediate neighborhood. More recently, radio-wave, x-ray and gamma-ray astronomy have allowed us to discover wonders further afield: star-forming nebulae, neutron stars, black holes, active galactic nuclei, the Cosmic Microwave Background (CMB). Now, through LIGO and its increasingly sensitive descendants we are likely to make even more breathtaking discoveries, some of which, courtesy of gravitational waves, may let us peer at the very origin of the universe itself — the Big Bang.

How brilliant is that!

Image 1: The historic detection of gravitational waves by the Laser Interferometer Gravitational-Wave Observatory (LIGO), shown in a plot presented at a press conference in Washington, D.C. on Feb. 11, 2016. Courtesy: National Science Foundation.

Image 2: LIGO Laboratory operates two detector sites 1,800 miles apart: one near Hanford in eastern Washington, and another near Livingston, Louisiana. This photo shows the Hanford detector. Courtesy of LIGO Caltech.


Pass the Nicotinamide Adenine Dinucleotide

NAD-molecule

For those of us seeking to live another 100 years or more, the news and/or hype over the last decade belonged to resveratrol. The molecule is believed to improve the functioning of specific biochemical pathways in the cell, which may enhance cell repair and hinder the aging process. Resveratrol is found — in trace amounts — in grape skin (and hence wine), blueberries and raspberries. While proof remains scarce, this has not stopped the public from consuming large quantities of wine and berries.

Ironically, one would need to ingest such large amounts of resveratrol to replicate the benefits found in mouse studies that the wine alone would probably cause irreversible liver damage before any health benefits appeared. Oh well.

So, on to the next big thing, since aging cannot wait. It’s called NAD, or nicotinamide adenine dinucleotide. NAD performs several critical roles in the cell, one of which is energy metabolism. As we age, our cells show diminishing levels of NAD, and this is possibly linked to mitochondrial deterioration. Mitochondria are the cells’ energy factories, so keeping our mitochondria humming along is critical. Thus, hordes of researchers are now experimenting with NAD and related substances to see if they hold promise in postponing cellular demise.

From Scientific American:

Whenever I see my 10-year-old daughter brimming over with so much energy that she jumps up in the middle of supper to run around the table, I think to myself, “those young mitochondria.”

Mitochondria are our cells’ energy dynamos. Descended from bacteria that colonized other cells about 2 billion years ago, they get flaky as we age. A prominent theory of aging holds that decaying of mitochondria is a key driver of aging. While it’s not clear why our mitochondria fade as we age, evidence suggests that it leads to everything from heart failure to neurodegeneration, as well as the complete absence of zipping around the supper table.

Recent research suggests it may be possible to reverse mitochondrial decay with dietary supplements that increase cellular levels of a molecule called NAD (nicotinamide adenine dinucleotide). But caution is due: While there’s promising test-tube data and animal research regarding NAD boosters, no human clinical results on them have been published.

NAD is a linchpin of energy metabolism, among other roles, and its diminishing level with age has been implicated in mitochondrial deterioration. Supplements containing nicotinamide riboside, or NR, a precursor to NAD that’s found in trace amounts in milk, might be able to boost NAD levels. In support of that idea, half a dozen Nobel laureates and other prominent scientists are working with two small companies offering NR supplements.

The NAD story took off toward the end of 2013 with a high-profile paper by Harvard’s David Sinclair and colleagues. Sinclair, recall, achieved fame in the mid-2000s for research on yeast and mice that suggested the red wine ingredient resveratrol mimics anti-aging effects of calorie restriction. This time his lab made headlines by reporting that the mitochondria in muscles of elderly mice were restored to a youthful state after just a week of injections with NMN (nicotinamide mononucleotide), a molecule that naturally occurs in cells and, like NR, boosts levels of NAD.

It should be noted, however, that muscle strength was not improved in the NMN-treated mice — the researchers speculated that one week of treatment wasn’t enough to do that, despite signs that their age-related mitochondrial deterioration was reversed.

NMN isn’t available as a consumer product. But Sinclair’s report sparked excitement about NR, which was already on the market as a supplement called Niagen. Niagen’s maker, ChromaDex, a publicly traded Irvine, Calif., company, sells it to various retailers, which market it under their own brand names. In the wake of Sinclair’s paper, Niagen was hailed in the media as a potential blockbuster.

In early February, Elysium Health, a startup cofounded by Sinclair’s former mentor, MIT biologist Lenny Guarente, jumped into the NAD game by unveiling another supplement with NR. Dubbed Basis, it’s only offered online by the company. Elysium is taking no chances when it comes to scientific credibility. Its website lists a dream team of advising scientists, including five Nobel laureates and other big names such as the Mayo Clinic’s Jim Kirkland, a leader in geroscience, and biotech pioneer Lee Hood. I can’t remember a startup with more stars in its firmament.

A few days later, ChromaDex reasserted its first-comer status in the NAD game by announcing that it had conducted a clinical trial demonstrating that a single dose of NR resulted in statistically significant increases in NAD in humans — the first evidence that supplements could really boost NAD levels in people. Details of the study won’t be out until it’s reported in a peer-reviewed journal, the company said. (ChromaDex also brandishes Nobel credentials: Roger Kornberg, a Stanford professor who won the Chemistry prize in 2006, chairs its scientific advisory board. He’s the son of Nobel laureate Arthur Kornberg, who, ChromaDex proudly notes, was among the first scientists to study NR some 60 years ago.)

The NAD findings tie into the ongoing story about enzymes called sirtuins, which Guarente, Sinclair and other researchers have implicated as key players in conferring the longevity and health benefits of calorie restriction. Resveratrol, the wine ingredient, is thought to rev up one of the sirtuins, SIRT1, which appears to help protect mice on high doses of resveratrol from the ill effects of high-fat diets. A slew of other health benefits have been attributed to SIRT1 activation in hundreds of studies, including several small human trials.

Here’s the NAD connection: In 2000, Guarente’s lab reported that NAD fuels the activity of sirtuins, including SIRT1 — the more NAD there is in cells, the more SIRT1 does beneficial things. One of those things is to induce formation of new mitochondria. NAD can also activate another sirtuin, SIRT3, which is thought to keep mitochondria running smoothly.

Read the entire article here.

Image: Structure of nicotinamide adenine dinucleotide, oxidized (NAD+). Courtesy of Wikipedia. Public Domain.


Streaming is So 2015

Led Zeppelin-IV

Fellow music enthusiasts and technology early adopters: ditch the streaming sounds right now. And, if you still have an iPod, or worse an MP3 or CD player, trash it; trash them all.

The future of music is coming, and it’s beamed and implanted directly into your grey matter. I’m not sure I like the idea of Taylor Swift inside my head — I’m more of a Pink Floyd and Led Zeppelin person — or the idea of not having a filter for certain genres (i.e., country music). However, some might like the notion of a digital-DJ brain implant that lays down tracks based on your mood by monitoring your neurochemical mix. It’s only a matter of time.

Thanks, but I’ll stick to vinyl, crackles and all.

From WSJ:

The year is 2040, and as you wait for a drone to deliver your pizza, you decide to throw on some tunes. Once a commodity bought and sold in stores, music is now an omnipresent utility invoked via spoken-word commands. In response to a simple “play,” an algorithmic DJ opens a blended set of songs, incorporating information about your location, your recent activities and your historical preferences—complemented by biofeedback from your implanted SmartChip. A calming set of lo-fi indie hits streams forth, while the algorithm adjusts the beats per minute and acoustic profile to the rain outside and the fact that you haven’t eaten for six hours.

The rise of such dynamically generated music is the story of the age. The album, that relic of the 20th century, is long dead. Even the concept of a “song” is starting to blur. Instead there are hooks, choruses, catchphrases and beats—a palette of musical elements that are mixed and matched on the fly by the computer, with occasional human assistance. Your life is scored like a movie, with swelling crescendos for the good parts, plaintive, atonal plunks for the bad, and fuzz-pedal guitar for the erotic. The DJ’s ability to read your emotional state approaches clairvoyance. But the developers discourage the name “artificial intelligence” to describe such technology. They prefer the term “mood-affiliated procedural remixing.”

Right now, the mood is hunger. You’ve put on weight lately, as your refrigerator keeps reminding you. With its assistance—and the collaboration of your DJ—you’ve come up with a comprehensive plan for diet and exercise, along with the attendant soundtrack. Already, you’ve lost six pounds. Although you sometimes worry that the machines are running your life, it’s not exactly a dystopian experience—the other day, after a fast-paced dubstep remix spurred you to a personal best on your daily run through the park, you burst into tears of joy.

Cultural production was long thought to be an impregnable stronghold of human intelligence, the one thing the machines could never do better than humans. But a few maverick researchers persisted, and—aided by startling, asymptotic advances in other areas of machine learning—suddenly, one day, they could. To be a musician now is to be an arranger. To be a songwriter is to code. Atlanta, the birthplace of “trap” music, is now a locus of brogrammer culture. Nashville is a leading technology incubator. The Capitol Records tower was converted to condos after the label uploaded its executive suite to the cloud.

Read the entire story here.

Image: Led Zeppelin IV album cover. Courtesy of the author.


Who Needs a Self-Driving Car?

Self-driving vehicles have been very much in the news over the last couple of years. Google’s autonomous car project is perhaps the most notable recent example — its latest road-worthy prototype is the culmination of a project out of Stanford, which garnered an innovation prize from DARPA (Defense Advanced Research Projects Agency) back in 2005. And, numerous companies are in various stages of experimenting, planning, prototyping and developing, including GM, Apple, Mercedes-Benz, Nissan, BMW, Tesla, to name but a few.

Ehang-184-AAV

That said, even though it may still be a few years yet before we see traffic jams of driverless cars clogging the Interstate Highway system, some forward-thinkers are not resting on their laurels. EHang, a Chinese drone manufacturer, is leapfrogging the car entirely and pursuing an autonomous drone — actually an autonomous aerial vehicle (AAV), known as the EHang 184 — capable of flying one passenger. Cooler still, the only onboard control is a Google-map interface that allows the passenger to select a destination. The AAV and ground-based command centers take care of the rest.

I have to wonder if EHang’s command centers will be able to use the drone to shoot missiles at militants as well as delivering a passenger, or better still, targeting missiles at rogue drivers.

Wired has more about this fascinating new toy — probably aimed at Russian oligarchs and Silicon Valley billionaires.

Image: Ehang 184 — Autonomous Aerial Vehicle. Courtesy of EHang.


iScoliosis

Google-search-neck-xray

Industrial and occupational illnesses have followed humans since the advent of industry. Obvious ones include lung diseases from mining and a variety of skin diseases from exposure to agricultural and factory chemicals.

The late 20th century saw us succumb to carpal tunnel and other repetitive stress injuries from laboring over our desks and computers. Now, in the 21st, we are becoming hosts to the smartphone pathogen.

In addition to the spectrum of social and cultural disorders wrought by our constantly chattering mobile devices, we are at increased psychological and physical risk. But, let’s leave aside the two obvious ones: risk from vehicle injury due to texting while driving, and risk from injury due to texting while walking. More commonly, we are at increased risk of back and other chronic physical problems resulting from poor posture. This in turn leads to mood disorders, memory problems and depression. Some have termed this condition “text-neck”, “iHunch”, or “iPosture”; I’ll go with “iScoliosis™”.

From NYT:

THERE are plenty of reasons to put our cellphones down now and then, not least the fact that incessantly checking them takes us out of the present moment and disrupts family dinners around the globe. But here’s one you might not have considered: Smartphones are ruining our posture. And bad posture doesn’t just mean a stiff neck. It can hurt us in insidious psychological ways.

If you’re in a public place, look around: How many people are hunching over a phone? Technology is transforming how we hold ourselves, contorting our bodies into what the New Zealand physiotherapist Steve August calls the iHunch. I’ve also heard people call it text neck, and in my work I sometimes refer to it as iPosture.

The average head weighs about 10 to 12 pounds. When we bend our necks forward 60 degrees, as we do to use our phones, the effective stress on our neck increases to 60 pounds — the weight of about five gallons of paint. When Mr. August started treating patients more than 30 years ago, he says he saw plenty of “dowagers’ humps, where the upper back had frozen into a forward curve, in grandmothers and great-grandmothers.” Now he says he’s seeing the same stoop in teenagers.

When we’re sad, we slouch. We also slouch when we feel scared or powerless. Studies have shown that people with clinical depression adopt a posture that eerily resembles the iHunch. One, published in 2010 in the official journal of the Brazilian Psychiatric Association, found that depressed patients were more likely to stand with their necks bent forward, shoulders collapsed and arms drawn in toward the body.

Posture doesn’t just reflect our emotional states; it can also cause them. In a study published in Health Psychology earlier this year, Shwetha Nair and her colleagues assigned non-depressed participants to sit in an upright or slouched posture and then had them answer a mock job-interview question, a well-established experimental stress inducer, followed by a series of questionnaires. Compared with upright sitters, the slouchers reported significantly lower self-esteem and mood, and much greater fear. Posture affected even the contents of their interview answers: Linguistic analyses revealed that slouchers were much more negative in what they had to say. The researchers concluded, “Sitting upright may be a simple behavioral strategy to help build resilience to stress.”

Slouching can also affect our memory: In a study published last year in Clinical Psychology and Psychotherapy of people with clinical depression, participants were randomly assigned to sit in either a slouched or an upright position and then presented with a list of positive and negative words. When they were later asked to recall those words, the slouchers showed a negative recall bias (remembering the bad stuff more than the good stuff), while those who sat upright showed no such bias. And in a 2009 study of Japanese schoolchildren, those who were trained to sit with upright posture were more productive than their classmates in writing assignments.

Read the entire article here, preferably not via your smartphone.
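The numbers in that excerpt are usually attributed to a modeling study by the spine surgeon Kenneth Hansraj, but even a crude cantilever model reproduces them fairly well. A toy sketch in Python; the lever ratio is an assumed round number, not a measured anatomical value:

    import math

    HEAD_WEIGHT = 12.0  # pounds; the article's upper figure
    LEVER_RATIO = 5.0   # assumed ratio of head-offset arm to muscle moment arm

    def neck_load(flexion_deg):
        """Compression at the base of the neck: the axial share of the head's
        weight plus the extensor-muscle tension balancing its forward torque."""
        t = math.radians(flexion_deg)
        muscle_tension = HEAD_WEIGHT * LEVER_RATIO * math.sin(t)
        return HEAD_WEIGHT * math.cos(t) + muscle_tension

    for angle in (0, 15, 30, 45, 60):
        print(f"{angle:2d} deg -> ~{neck_load(angle):.0f} lb")
    # 0 -> 12, 15 -> 27, 30 -> 40, 45 -> 51, 60 -> 58 lb

At 60 degrees of flexion the toy model gives roughly 58 pounds, close to the quoted 60. The sine term does the damage: the muscle force needed to hold up the head’s forward torque quickly dwarfs the head’s own weight.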

Image courtesy of Google Search.


RIP: Maurice White

Maurice_White_1982

We’ve lost another great musical innovator. I’m sick and tired of my artistic heroes dying. But, at the very least, I still have the sounds and the visions.

More on the sad passing of Maurice White from Rolling Stone, NYT, USA Today, BBC News, and CNN.

Image: Maurice White performing with Earth, Wind & Fire at Ahoy Rotterdam, 1982. Courtesy: Chris Hakkens – http://www.flickr.com/photos/chris_hakkens/4638840128/in/photostream/


A Painful End

This should come as no surprise — advances in our understanding of biochemical and genetic processes seem to make the news with ever-increasing regularity. Researchers appear to have found the mechanism for switching physical pain on and off in mammals. They recently succeeded in blocking and restoring pain signals in mice. And, through the same discovery, they have been able to restore sensation to a woman who has an extremely rare condition that leaves her unable to feel any pain. It’s all in the Nav1.7 sodium ion channel and in its regulation of opioid peptides.

Fascinating, but where will this lead us? And, more to the point, will there ever be a pill to end the interminable pain of the US political process?

From ars technica:

Physical pain is a near universal problem, whether it’s sudden pangs or chronic aches. Yet, researchers’ efforts to quash it completely have fallen short—possibly due to a moonlighting channel in nerve cells. But that may be about to change.

The sodium ion channel, called Nav1.7, helps generate the electrical signals that surge through pain-related nerve cells. It’s known to play a key role in pain, but researchers’ past attempts to power-down its charged activities did little to soothe suffering. In a bit of a shocking twist, researchers figured out why; the channel has a second, un-channel-like function—regulating painkilling molecules called opioid peptides. That revelation, published in Nature Communications, provided researchers with the know-how to reverse painlessness in a woman with a rare condition, plus make mice completely pain free.

The link between Nav1.7 and opioid painkillers is “fascinating,” Claire Gaveriaux-Ruff, a pain researcher and professor at the University of Strasbourg, told Ars. And, she added, “this discovery brings hope to the many patients suffering from pain that are not yet adequately treated with the available pain medications.”

That source of hope has been a long time coming, John N. Wood, lead author of the study and a neuroscientist at University College London, told Ars. Researchers have been interested in Nav1.7 for years, he said. Excitement peaked in 2006 when scientists reported finding a family who lacked the channel and could feel no pain at all. After that, researchers excitedly scrambled to relieve pain with Nav1.7-blocking drugs. But the drugs inexplicably failed, Wood said. “So we thought, well maybe this channel isn’t just a channel, maybe it’s got some other activities as well.”

Using genetically engineered mice, Wood and colleagues found that completely shutting off Nav1.7 not only made mice pain-free, it cranked up their amount of opioid peptides in nerve cells. These molecules are natural painkillers that help the body moderate pain responses. In these Nav1.7-lacking mice, opioid levels were extremely high, blunting all twinges and throbs. When the researchers gave the mice a drug that blocks those opioids, the animals could feel pain normally. (The opioid-blocking drug, naloxone, treats overdoses of opioid drugs, such as morphine and codeine.)

Even more promising, Wood and colleagues saw the same result in a person. The test subject, a 39-year-old woman with a rare mutation that shuts off Nav1.7, had been pain-free all her life. But when the researchers gave her a dose of the opioid-blocking naloxone, she felt pain for the first time—the sting of a tiny laser. She was happy to go back to her normal, painless state after the drug wore off, Wood reported. But she hopes that the drug treatment can be used in children with the pain-free condition to keep them from unknowingly injuring themselves.

Read the entire article here.

Send to Kindle

Hate Crimes and the Google Correlation

Google-search-hate-speech

It had never occurred to me, but it makes perfect sense: there’s a direct correlation between anti-Muslim hate crimes and anti-Muslim hate searches on Google. For that matter, there is probably a correlation between other types of hate speech and hate crimes — women, gays, lesbians, bosses, blacks, whites, bad drivers, religion X. But it is certainly the case that Muslims and the Islamic religion are currently bearing the brunt, both online and in the real world.

Clearly, we have a long way to go in learning that entire populations are not to blame for the criminal acts of a few. However, back to the correlations.

Mining of Google search data shows a striking correlation. As the researchers point out, “When Islamophobic searches are at their highest levels, such as during the controversy over the ‘ground zero mosque’ in 2010 or around the anniversary of 9/11, hate crimes tend to be at their highest levels, too.” Interestingly enough, there are currently just over 50 daily searches for “I hate my boss” in the US; in November there were 120 searches per day for “I hate Muslims”.
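
For the data-curious, the analysis behind claims like these is conceptually simple: line up weekly search volumes against weekly hate crime counts and measure the association. Here is a minimal sketch in Python; the CSV file and column names are hypothetical stand-ins of my own, not the researchers’ actual data or code.

```python
# A minimal sketch of correlating weekly search volume with weekly hate
# crime counts. The file "weekly_islamophobia.csv" and its columns are
# hypothetical placeholders, not the researchers' actual data.
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

# One row per week, 2004-2013: an Islamophobic search-volume index
# (e.g., from Google Trends) and the reported hate crime count.
df = pd.read_csv("weekly_islamophobia.csv", parse_dates=["week"])

r, p = pearsonr(df["search_index"], df["hate_crimes"])
print(f"Pearson r = {r:.2f} (p = {p:.3g})")

# A simple linear fit is enough to project expected crime counts from
# search data alone, ahead of the official FBI figures.
slope, intercept = np.polyfit(df["search_index"], df["hate_crimes"], 1)
print(f"Expected weekly crimes at search index 80: {slope * 80 + intercept:.1f}")
```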

So, here’s an idea. Let’s get Google to replace the “I’m Feeling Lucky” button on the search page (who uses that anyway?) with “I’m Feeling Hateful”. This would make the search more productive for those needing to vent their hatred.

More from NYT:

Hours after the massacre in San Bernardino, Calif., on Dec. 2, and minutes after the media first reported that at least one of the shooters had a Muslim-sounding name, a disturbing number of Californians had decided what they wanted to do with Muslims: kill them.

The top Google search in California with the word “Muslims” in it was “kill Muslims.” And the rest of America searched for the phrase “kill Muslims” with about the same frequency that they searched for “martini recipe,” “migraine symptoms” and “Cowboys roster.”

People often have vicious thoughts. Sometimes they share them on Google. Do these thoughts matter?

Yes. Using weekly data from 2004 to 2013, we found a direct correlation between anti-Muslim searches and anti-Muslim hate crimes.

We measured Islamophobic sentiment by using common Google searches that imply hateful attitudes toward Muslims. A search for “are all Muslims terrorists?”, for example, leaves little to the imagination about what the searcher really thinks. Searches for “I hate Muslims” are even clearer.

When Islamophobic searches are at their highest levels, such as during the controversy over the “ground zero mosque” in 2010 or around the anniversary of 9/11, hate crimes tend to be at their highest levels, too.

In 2014, according to the F.B.I., anti-Muslim hate crimes represented 16.3 percent of the total of 1,092 reported offenses. Anti-Semitism still led the way as a motive for hate crimes, at 58.2 percent.

Hate crimes may seem chaotic and unpredictable, a consequence of random neurons that happen to fire in the brains of a few angry young men. But we can explain some of the rise and fall of anti-Muslim hate crimes just based on what people are Googling about Muslims.

The frightening thing is this: If our model is right, Islamophobia and thus anti-Muslim hate crimes are currently higher than at any time since the immediate aftermath of the Sept. 11 attacks. Although it will take a while for the F.B.I. to collect and analyze the data before we know whether anti-Muslim hate crimes are in fact rising spectacularly now, Islamophobic searches in the United States were 10 times higher the week after the Paris attacks than the week before. They have been elevated since then and rose again after the San Bernardino attack.

According to our model, when all the data is analyzed by the F.B.I., there will have been more than 200 anti-Muslim attacks in 2015, making it the worst year since 2001.

How can these Google searches track Islamophobia so well? Who searches for “I hate Muslims” anyway?

We often think of Google as a source from which we seek information directly, on topics like the weather, who won last night’s game or how to make apple pie. But sometimes we type our uncensored thoughts into Google, without much hope that Google will be able to help us. The search window can serve as a kind of confessional.

There are thousands of searches every year, for example, for “I hate my boss,” “people are annoying” and “I am drunk.” Google searches expressing moods, rather than looking for information, represent a tiny sample of everyone who is actually thinking those thoughts.

There are about 1,600 searches for “I hate my boss” every month in the United States. In a survey of American workers, half of the respondents said that they had left a job because they hated their boss; there are about 150 million workers in America.

In November, there were about 3,600 searches in the United States for “I hate Muslims” and about 2,400 for “kill Muslims.” We suspect these Islamophobic searches represent a similarly tiny fraction of those who had the same thoughts but didn’t drop them into Google.

“If someone is willing to say ‘I hate them’ or ‘they disgust me,’ we know that those emotions are as good a predictor of behavior as actual intent,” said Susan Fiske, a social psychologist at Princeton, pointing to 50 years of psychology research on anti-black bias. “If people are making expressive searches about Muslims, it’s likely to be tied to anti-Muslim hate crime.”

Google searches seem to suffer from selection bias: Instead of asking a random sample of Americans how they feel, you just get information from those who are motivated to search. But this restriction may actually help search data predict hate crimes.
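
The arithmetic behind those “tiny fraction” figures is easy to check. Here is a quick back-of-the-envelope pass in Python, using only the numbers quoted above:

```python
# Back-of-the-envelope check of the search figures quoted above.
boss_monthly = 1_600      # US searches per month for "I hate my boss"
muslims_monthly = 3_600   # US searches in November for "I hate Muslims"

print(f"{boss_monthly / 30:.0f} per day")     # ~53: "just over 50 daily searches"
print(f"{muslims_monthly / 30:.0f} per day")  # 120, the November figure

# Why such searches are a "tiny sample" of people thinking the thought:
# ~150 million US workers, half of whom say they've left a job over a
# hated boss, versus roughly 19,200 such searches per year.
ever_hated_boss = 150_000_000 * 0.5
annual_boss_searches = boss_monthly * 12
print(f"{annual_boss_searches / ever_hated_boss:.3%}")  # about 0.026 percent
```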

Read more here.

Image courtesy of Google Search.

Send to Kindle

Robotic Stock Keeping

Tally-robot-simbe

Meet Tally; it may soon be coming to a store near you. Tally is an autonomous robot that patrols store aisles and scans shelves to ensure items are correctly stocked. While the robot doesn’t do the restocking itself — beware, stock clerk, this is probably only a matter of time — it audits shelves for out-of-stock items, low-stock items, misplaced items, and pricing errors. The robot was developed by the start-up Simbe Robotics.
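
For a rough sense of what that audit amounts to in software terms, here is a toy sketch in Python. It is my own illustration of the general idea, not Simbe’s actual code; the planogram data and field names are invented.

```python
# A toy illustration (not Simbe's software) of a shelf audit: compare
# what the robot's cameras observed against the store's planogram.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    slot_id: str                 # shelf position the camera looked at
    product_seen: Optional[str]  # None means the slot appeared empty
    price_seen: Optional[float]  # price read off the shelf label

# Planogram: slot -> (expected product, expected price). Invented data.
PLANOGRAM = {"A1": ("cereal", 3.99), "A2": ("coffee", 7.49)}

def audit(observations):
    """Return (slot, issue) pairs: out-of-stock, misplaced, price errors."""
    issues = []
    for obs in observations:
        product, price = PLANOGRAM[obs.slot_id]
        if obs.product_seen is None:
            issues.append((obs.slot_id, "out of stock"))
        elif obs.product_seen != product:
            issues.append((obs.slot_id, f"misplaced: found {obs.product_seen}"))
        if obs.price_seen is not None and obs.price_seen != price:
            issues.append((obs.slot_id, f"price error: label shows {obs.price_seen}"))
    return issues

print(audit([Observation("A1", None, None),     # empty slot
             Observation("A2", "tea", 7.49)]))  # wrong product on shelf
```

The real system adds the hard parts, of course — computer vision to recognize products and read labels, plus navigation to cover every aisle; the bookkeeping above is the easy bit.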

From Technology Review:

When customers can’t find a product on a shelf, it’s an inconvenience. But by some estimates, it adds up to billions of dollars of lost revenue each year for retailers around the world.

A new shelf-scanning robot called Tally could help ensure that customers never leave a store empty-handed. It roams the aisles and automatically records which shelves need to be restocked.

The robot, developed by a startup called Simbe Robotics, is the latest effort to automate some of the more routine work done in millions of warehouses and retail stores. It is also an example of the way robots and AI will increasingly take over parts of people’s jobs rather than replacing them.

Restocking shelves is simple but hugely important for retailers. Billions of dollars may be lost each year because products are missing, misplaced, or poorly arranged, according to a report from the analyst firm IHL Services. In a large store it can take hundreds of hours to inspect shelves manually each week.

Brad Bogolea, CEO and cofounder of Simbe Robotics, says his company’s robot can scan the shelves of a small store, like a modest CVS or Walgreens, in about an hour. A very large retailer might need several robots to patrol its premises. He says the robot will be offered on a subscription basis but did not provide pricing. Bogolea adds that one large retailer is already testing the machine.

Tally automatically roams a store, checking whether a shelf needs restocking; whether a product has been misplaced or poorly arranged; and whether the prices shown on shelves are correct. The robot consists of a wheeled platform with four cameras that scan the shelves on either side from the floor up to a height of eight feet.

Read the entire article here.

Image: Tally. Courtesy of Simbe Robotics.

Send to Kindle

PhotoMash: Honey Boo Boo and Trump’s Jihadists

Oh, the Washington Post is the source that keeps on giving. We’re only a few days into 2016, and the newspaper’s online editors continue to deliver wonderfully juxtaposed stories that highlight the peculiar absurdity of contemporary (American) “news”.

Photomash-honey-booboo-vs-donald-for-isis

This photomash (or more appropriately “storymash”) comes to us from the Washington Post, January 2, 2016. Both subjects are courtesy of our odd fascination with the hideous monsters created by reality TV.

The first story describes Discovery Communications’ re-awakening as it aims to move away from the reality trash TV of Honey Boo Boo. The second highlights our move toward the new phenomenon of reality trash politics, spearheaded by the comb-overed one.

Send to Kindle