Category Archives: Idea Soup

iScoliosis


Industrial and occupational illnesses have followed humans since the advent of industry. Obvious examples include lung disease from mining and a variety of skin diseases from exposure to agricultural and factory chemicals.

The late 20th century saw us succumb to carpal tunnel syndrome and other repetitive stress injuries from laboring over our desks and computers. Now, in the 21st, we are becoming hosts to the smartphone pathogen.

In addition to the spectrum of social and cultural disorders wrought by our constantly chattering mobile devices, we are at increased psychological and physical risk. Let’s leave aside the two obvious ones: injury from texting while driving, and injury from texting while walking. More commonly, we are at increased risk of back problems and other chronic physical ailments resulting from poor posture. These in turn lead to mood disorders, memory problems and depression. Some have termed this condition “text-neck”, “iHunch”, or “iPosture”; I’ll go with “iScoliosis™”.

From NYT:

THERE are plenty of reasons to put our cellphones down now and then, not least the fact that incessantly checking them takes us out of the present moment and disrupts family dinners around the globe. But here’s one you might not have considered: Smartphones are ruining our posture. And bad posture doesn’t just mean a stiff neck. It can hurt us in insidious psychological ways.

If you’re in a public place, look around: How many people are hunching over a phone? Technology is transforming how we hold ourselves, contorting our bodies into what the New Zealand physiotherapist Steve August calls the iHunch. I’ve also heard people call it text neck, and in my work I sometimes refer to it as iPosture.

The average head weighs about 10 to 12 pounds. When we bend our necks forward 60 degrees, as we do to use our phones, the effective stress on our neck increases to 60 pounds — the weight of about five gallons of paint. When Mr. August started treating patients more than 30 years ago, he says he saw plenty of “dowagers’ humps, where the upper back had frozen into a forward curve, in grandmothers and great-grandmothers.” Now he says he’s seeing the same stoop in teenagers.

When we’re sad, we slouch. We also slouch when we feel scared or powerless. Studies have shown that people with clinical depression adopt a posture that eerily resembles the iHunch. One, published in 2010 in the official journal of the Brazilian Psychiatric Association, found that depressed patients were more likely to stand with their necks bent forward, shoulders collapsed and arms drawn in toward the body.

Posture doesn’t just reflect our emotional states; it can also cause them. In a study published in Health Psychology earlier this year, Shwetha Nair and her colleagues assigned non-depressed participants to sit in an upright or slouched posture and then had them answer a mock job-interview question, a well-established experimental stress inducer, followed by a series of questionnaires. Compared with upright sitters, the slouchers reported significantly lower self-esteem and mood, and much greater fear. Posture affected even the contents of their interview answers: Linguistic analyses revealed that slouchers were much more negative in what they had to say. The researchers concluded, “Sitting upright may be a simple behavioral strategy to help build resilience to stress.”

Slouching can also affect our memory: In a study published last year in Clinical Psychology and Psychotherapy of people with clinical depression, participants were randomly assigned to sit in either a slouched or an upright position and then presented with a list of positive and negative words. When they were later asked to recall those words, the slouchers showed a negative recall bias (remembering the bad stuff more than the good stuff), while those who sat upright showed no such bias. And in a 2009 study of Japanese schoolchildren, those who were trained to sit with upright posture were more productive than their classmates in writing assignments.

Read the entire article here, preferably not via your smartphone.
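As a sanity check on those numbers: the 60-pound figure can be roughly reproduced with a toy moment-arm model. A minimal sketch in Python; the lever-arm lengths are assumptions chosen for illustration, not values taken from the study behind the article’s figures:

```python
import math

# Toy moment-arm model of the flexed neck: the posterior neck muscles must
# balance the torque gravity exerts on the head. Lever arms are assumed,
# illustrative values -- not from the published biomechanical model.
HEAD_WEIGHT_LB = 12.0   # upper end of the 10-12 lb head weight cited above
GRAVITY_ARM_IN = 6.0    # assumed horizontal lever arm of the head's center of mass
MUSCLE_ARM_IN = 1.0     # assumed lever arm of the posterior neck muscles

def neck_muscle_load(flexion_deg: float) -> float:
    """Muscle force (lb) needed to hold the head at a given flexion angle."""
    gravity_torque = HEAD_WEIGHT_LB * GRAVITY_ARM_IN * math.sin(math.radians(flexion_deg))
    return gravity_torque / MUSCLE_ARM_IN

for angle in (15, 30, 45, 60):
    print(f"{angle:2d} degrees of flexion -> ~{neck_muscle_load(angle):.0f} lb on the neck")
```

With these assumed lever arms, 60 degrees of flexion works out to roughly 60 pounds, in line with the figure quoted above. At zero flexion the model reports zero muscle force because the skeleton bears the load, so treat it purely as an order-of-magnitude illustration.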

Image courtesy of Google Search.


Hate Crimes and the Google Correlation

It had never occurred to me, but it makes perfect sense: there’s a direct correlation between anti-Muslim hate crimes and anti-Muslim searches on Google. For that matter, there is probably a correlation between other types of hate speech and hate crimes — against women, gays, lesbians, bosses, blacks, whites, bad drivers, religion X. But it is certainly the case that Muslims and the Islamic religion are currently bearing the brunt, both online and in the real world.

Clearly, we have a long way to go in learning that entire populations are not to blame for the criminal acts of a few. However, back to the correlations.

Mining of Google search data reveals a striking relationship. As the researchers point out, “When Islamophobic searches are at their highest levels, such as during the controversy over the ‘ground zero mosque’ in 2010 or around the anniversary of 9/11, hate crimes tend to be at their highest levels, too.” Interestingly enough, there are currently just over 50 daily searches for “I hate my boss” in the US. In November there were 120 searches per day for “I hate Muslims”.
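For the statistically minded, the relationship the researchers describe boils down to the correlation between two weekly time series. A minimal sketch in Python, with invented numbers standing in for the real search and FBI hate-crime data:

```python
import numpy as np

# Hypothetical weekly counts, invented for illustration -- the study used
# weekly Google search data and FBI hate-crime data from 2004 to 2013.
searches = np.array([120, 95, 310, 150, 880, 460, 200, 175])  # hateful searches per week
crimes = np.array([3, 2, 7, 4, 15, 9, 5, 4])                  # reported hate crimes per week

r = np.corrcoef(searches, crimes)[0, 1]  # Pearson correlation coefficient
print(f"correlation r = {r:.2f}")
```

A value of r near 1 means the two series rise and fall together; it says nothing, of course, about which drives which.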

So, here’s an idea. Let’s get Google to replace the “I’m Feeling Lucky” button on the search page (who uses that anyway?) with “I’m Feeling Hateful”. This would make the search more productive for those needing to vent their hatred.

More from NYT:

HOURS after the massacre in San Bernardino, Calif., on Dec. 2, and minutes after the media first reported that at least one of the shooters had a Muslim-sounding name, a disturbing number of Californians had decided what they wanted to do with Muslims: kill them.

The top Google search in California with the word “Muslims” in it was “kill Muslims.” And the rest of America searched for the phrase “kill Muslims” with about the same frequency that they searched for “martini recipe,” “migraine symptoms” and “Cowboys roster.”

People often have vicious thoughts. Sometimes they share them on Google. Do these thoughts matter?

Yes. Using weekly data from 2004 to 2013, we found a direct correlation between anti-Muslim searches and anti-Muslim hate crimes.

We measured Islamophobic sentiment by using common Google searches that imply hateful attitudes toward Muslims. A search for “are all Muslims terrorists?” for example leaves little to the imagination about what the searcher really thinks. Searches for “I hate Muslims” are even clearer.

When Islamophobic searches are at their highest levels, such as during the controversy over the “ground zero mosque” in 2010 or around the anniversary of 9/11, hate crimes tend to be at their highest levels, too.

In 2014, according to the F.B.I., anti-Muslim hate crimes represented 16.3 percent of the total of 1,092 reported offenses. Anti-Semitism still led the way as a motive for hate crimes, at 58.2 percent.

Hate crimes may seem chaotic and unpredictable, a consequence of random neurons that happen to fire in the brains of a few angry young men. But we can explain some of the rise and fall of anti-Muslim hate crimes just based on what people are Googling about Muslims.

The frightening thing is this: If our model is right, Islamophobia and thus anti-Muslim hate crimes are currently higher than at any time since the immediate aftermath of the Sept. 11 attacks. Although it will take awhile for the F.B.I. to collect and analyze the data before we know whether anti-Muslim hate crimes are in fact rising spectacularly now, Islamophobic searches in the United States were 10 times higher the week after the Paris attacks than the week before. They have been elevated since then and rose again after the San Bernardino attack.

According to our model, when all the data is analyzed by the F.B.I., there will have been more than 200 anti-Muslim attacks in 2015, making it the worst year since 2001.

How can these Google searches track Islamophobia so well? Who searches for “I hate Muslims” anyway?

We often think of Google as a source from which we seek information directly, on topics like the weather, who won last night’s game or how to make apple pie. But sometimes we type our uncensored thoughts into Google, without much hope that Google will be able to help us. The search window can serve as a kind of confessional.

There are thousands of searches every year, for example, for “I hate my boss,” “people are annoying” and “I am drunk.” Google searches expressing moods, rather than looking for information, represent a tiny sample of everyone who is actually thinking those thoughts.

There are about 1,600 searches for “I hate my boss” every month in the United States. In a survey of American workers, half of the respondents said that they had left a job because they hated their boss; there are about 150 million workers in America.

In November, there were about 3,600 searches in the United States for “I hate Muslims” and about 2,400 for “kill Muslims.” We suspect these Islamophobic searches represent a similarly tiny fraction of those who had the same thoughts but didn’t drop them into Google.

“If someone is willing to say ‘I hate them’ or ‘they disgust me,’ we know that those emotions are as good a predictor of behavior as actual intent,” said Susan Fiske, a social psychologist at Princeton, pointing to 50 years of psychology research on anti-black bias. “If people are making expressive searches about Muslims, it’s likely to be tied to anti-Muslim hate crime.”

Google searches seem to suffer from selection bias: Instead of asking a random sample of Americans how they feel, you just get information from those who are motivated to search. But this restriction may actually help search data predict hate crimes.

Read more here.

Image courtesy of Google Search.


PhotoMash: Honey Boo-Boo and Trump’s Jihadists

Oh, the Washington Post is the source that keeps on giving. We’re only a few days into 2016, and the newspaper’s online editors continue to deliver wonderfully juxtaposed stories that highlight the peculiar absurdity of contemporary (American) “news”.

Photomash-honey-booboo-vs-donald-for-isis

This photomash (or more appropriately “storymash”) comes to us from the Washington Post, January 2, 2016. Both subjects are courtesy of our odd fascination with the hideous monsters created by reality TV.

The first story describes Discovery Communications’ re-awakening, as it aims to move away from the reality trash TV of Honey Boo Boo. The second highlights our move toward the new phenomenon of reality trash politics, spearheaded by the comb-overed one.


Fight or Flight (or Record?)


Psychologists, social scientists and researchers of the human brain have long maintained that we have three typical responses to an existential, usually physical, threat. First, we may stand our ground to tackle and fight the threat. Second, we may turn and run from danger. Third, we may simply freeze with indecision and inaction. These responses have been studied, documented and confirmed over the decades. Further, they tend to mirror those of other animals when faced with a life-threatening situation.

But, now that humans have entered the smartphone age, it appears there is a fourth response — to film or record the threat. This may seem hard to believe, and foolhardy, but quite disturbingly it is a growing trend, especially among younger people.

From the Telegraph:

If you witnessed a violent attack on an innocent victim, would you:

a) help
b) run
c) freeze

Until now, that was the hypothetical question we all asked ourselves when reading about horrific events such as terror attacks.

What survival instinct would come most naturally? Fight or flight?

No longer. Over the last couple of years it’s become very obvious that there’s a fourth option:

d) record it all on your smartphone.

This reaction of filming traumatic events has become more prolific in recent weeks. Last month’s terror attacks in Paris saw mobile phone footage of people being shot, photos of bodies lying in the street, and perhaps most memorably, a pregnant woman clinging onto a window ledge.

Saturday [December 5, 2015] night saw another example when a terror suspect started attacking passengers on the Tube at Leytonstone Station. Most of the horrific incident was captured on video, as people stood filming him.

One brave man, 33-year-old engineer David Pethers, tried to fight the attacker. He ended up with a cut to the neck as he tried to protect passing children. But while he was intervening, others just held up their phones.

“There were so many opportunities where someone could have grabbed him,” he told the Daily Mail. “One guy came up to me afterwards and said ‘well done, I want to shake your hand, you are the only one who did anything, I got the whole thing on film.’

“I was so angry, I nearly turned on him but I walked away. I thought, ‘Are you crazy? You are standing there filming and did nothing.’ I was really angry afterwards.”

It’s hard to disagree. Most of us know heroism is rare and admirable. We can easily understand people trying to escape and save themselves, or even freezing in the face of terror.

But deliberately doing nothing and choosing to film the whole thing? That’s a lot harder to sympathise with.

Psychotherapist Richard Reid agrees – “the sensible option would be to think about your own safety and get out, or think about helping people” – but he says it’s important we understand this new reaction.

“Because events like terror attacks are so outside our experience, people don’t fully connect with it,” he explains.

“It’s like they’re watching a film. It doesn’t occur to them they could be in danger or they could be helping. The reality only sinks in after the event. It’s a natural phenomenon. It’s not necessarily the most useful response, but we have to accept it.”

Read the entire story here.

Image courtesy of Google Search.


Now We Can All Be Michael Scott And Number 6

Or, if you are from the UK, you can be David Brent. That is, we can all aspire to be a terrible boss. And it’s all courtesy of the techno-enabled, Uberified gig economy.

Those of us who have a boss will identify with the mostly excruciating ritual that is the annual performance review: your work, your attitude, your personality are dissected, sliced and diced, scored, rated and ranked. However, as traumatic as this may be, remember that at least your boss actually interacts with you (usually), and may have come to know you (somewhat) over a period of years.

But how would it feel to be evaluated in this way — scored and rated — by complete strangers during a fleeting interaction lasting only minutes? Online social media tools make this scoring wonderfully easy and convenient: just check a box, select 1-5 stars or give a thumbs up/down. Add to this the sharing/gig economy, and we now have millions of people ready (and eager) to score millions of others for waiting tables, chauffeuring a car, delivering pizza, writing an app, cleaning a house, walking your dog, mowing your lawn. And the list grows each day. Thus, you may answer to any number of managers throughout each day — it’s just that each manager is actually one of your customers, and each customer is armed with your score.

Where will this lead us? Should we rank our partners and spouses each day, indeed, several times each day? Will we score our kids for table etiquette, manners, talk-back? Should we score the check-out employee, the bank clerk, the bus driver, barista, nurse practitioner, car mechanic, surgeon? Ugh.

But you can certainly see why corporate executives are falling over themselves to have customers anonymously score their customer-facing employees. The process devolves power to the customer and spares management the once-tough personnel decisions. So, why not have hordes of anonymous reviews and aggregated scores from customers determine the fate of low-level service employees? This would seem to be the ultimate in customer service.

Yet, by replacing the human connection between employer/customer and employee/service worker with scores and algorithms, we further commoditize ourselves. We erode our humanity by allowing ourselves to be quantified and enumerated, and by doing the same to others, known and unknown. Having the power to score and rate another person at the press of a finger — anonymously — may make for savvy 21st-century management, but it makes for a colder, crueler world, one that increasingly reads like a dystopian novel.

From the Verge:

Soon, you’ll be able to go to the Olive Garden and order your fettuccine alfredo from a tablet mounted to the table. After paying, you’ll rate the server.

Then you can use that tablet to hail an Uber driver, whom you’ll also rate, from one to five stars. You can take it to your Airbnb, which you’ll award one to five stars across several categories, and get a TaskRabbit or Postmates worker to pick up groceries — rate them too. Maybe you’ll check on the web developer you’ve hired through Upwork, perusing the screenshots taken automatically from her computer, and think about how you’ll rate her when the job is done. You could hire someone from Handy to clean the place before you leave. More stars.

The on-demand economy has scrambled the roles of employer and employee in ways that courts and regulators are just beginning to parse. So far, the debate has focused on whether workers should be contractors or employees, a question sometimes distilled into an argument about who’s the boss: are workers their own bosses, as the companies often claim, or is the platform their boss, policing their work through algorithms and rules?

But there’s a third party that’s often glossed over: the customer. The rating systems used by these companies have turned customers into unwitting and sometimes unwittingly ruthless middle managers, more efficient than any boss a company could hope to hire. They’re always there, working for free, hypersensitive to the smallest error. All the algorithm has to do is tally up their judgments and deactivate accordingly.

Ratings help these companies to achieve enormous scale, managing large pools of untrained contract workers without having to hire supervisors. It’s a nice arrangement for customers too, who get cheap service with a smile — even if it’s an anxious one. But for the workers, already in the precarious position of contract labor, making every customer a boss is a terrifying prospect. After all, they — we — can be entitled jerks.

“You get pretty good at kissing ass just because you have to,” an Uber driver told me. “Uber and Lyft have created this monstrous brand of customer where they expect Ritz Carlton service at McDonald’s prices.”

In March, when Judge Edward Chen denied Uber’s motion for summary judgement on the California drivers’ class action suit, he seized on the idea that ratings aren’t just a customer feedback tool — they represent a new level of monitoring, far more pervasive than any watchful boss. Customer ratings, Chen wrote, give Uber an “arguably tremendous amount of control over the ‘manner and means’ of its drivers’ performance.” Quoting from Michel Foucault’s Discipline and Punish, he wrote that a “state of conscious and permanent visibility assures the automatic functioning of power.”

Starting with eBay, rating systems have typically been described as a way of establishing trust between strangers. Some commentators go so far as to say ratings are more effective than government regulation. “Uber and Airbnb are in fact some of the most regulated ecosystems in the world,” said Joshua Gans, an economist at the University of Toronto, at an FTC workshop earlier this year. Rather than a single certification before you can begin work, everyone is regulated constantly through a system of mutually assured judgment.

Certainly customers sometimes have awful experiences — reckless driving, creepy comments — and the rating system can help report them. But when it comes to policing dangerous behavior, most of these platforms have come to rely not on ratings but on traditional safety measures — identity verification, background checks, and the knowledge that any illegal actions can be investigated and enforced through the tracking devices every worker carries. We can’t rate for criminal histories, poor training, or negligent car maintenance.

So what do we rate for? We rate for the routes drivers take, for price fluctuations beyond their control, for slow traffic, for refusing to speed, for talking too much or too little, for failing to perform large tasks unrealistically quickly, for the food being cold when they delivered it, for telling us that, No, we can’t bring beer in the car and put our friend in the trunk — really, for any reason at all, including subconscious biases about race or gender, a proven problem on many crowdsourced platforms. This would be a nuisance if feedback were just feedback, but ratings have become the primary metric in automated systems determining employment. If you imagine the things customers rate down for as firing decisions in a traditional workplace, they look capricious and harsh. It’s a strange amount of power for customers to hold, all the more so considering that many don’t know they wield it.

Sometimes, as in Uber’s system, workers have the opportunity to rate customers back. An Uber spokesperson told me: “Uber’s priority is to connect you with a safe, reliable ride — no matter who you are, where you’re coming from, or where you’re going. Achieving that goal for our community means maintaining an environment of mutual accountability and respect. We want everyone to have a great ride, every time, and two-way feedback is one of the many ways we work to make that possible.”
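The tallying-and-deactivating the article describes needs nothing more exotic than a running average and a cutoff. A hypothetical sketch of such a policy (the window size and threshold are invented; no platform publishes its actual rules):

```python
from collections import deque

class RatedWorker:
    """Toy version of an automated rating policy: average the most recent
    customer scores and deactivate below a cutoff. Parameters invented."""

    def __init__(self, window: int = 100, cutoff: float = 4.6):
        self.scores = deque(maxlen=window)  # only recent jobs count
        self.cutoff = cutoff

    def rate(self, stars: int) -> None:
        self.scores.append(stars)

    @property
    def average(self) -> float:
        return sum(self.scores) / len(self.scores)

    @property
    def active(self) -> bool:
        return self.average >= self.cutoff

worker = RatedWorker()
for stars in [5, 5, 4, 5, 3, 5, 5, 2, 5, 5]:  # mostly five-star service
    worker.rate(stars)
print(f"average {worker.average:.2f} -> {'active' if worker.active else 'deactivated'}")
```

Note how unforgiving the arithmetic is: seven five-star jobs out of ten, nothing below a two, and the account still falls under a 4.6 cutoff. Which is exactly the article’s point about customers unknowingly making firing decisions.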

Read more here.

Video: The Prisoner — “I’m not a number, I’m a free man!”, 1967. Courtesy: Patrick McGoohan / ITC Entertainment.


Rudeness Goes Viral

We know intuitively, anecdotally and through scientific study that aggressive behavior can be transmitted to others through imitation. The famous Bobo doll experiment devised by researchers at Stanford University in the early 1960s, and numerous precursors, showed that subjects given an opportunity to observe aggressive models later reproduced a good deal of physical and verbal aggression substantially identical with that of the model. In these studies the model was usually someone with a higher social status or with greater authority (e.g., an adult) than the observer (e.g., a child).

Recent updates to these studies now show that low-intensity behaviors such as rudeness can be every bit as contagious as more intense behaviors like violence. Fascinatingly, the contagion works equally well even when the model and observer are peers.

So, keep this in mind: watching rude behaviors leads us to be rude to others.

From Scientific American:

Flu season is nearly upon us, and in an effort to limit contagion and spare ourselves misery, many of us will get vaccinated. The work of Jonas Salk and Thomas Francis has helped restrict the spread of the nasty bug for generations, and the influenza vaccine is credited with saving tens of thousands of lives. But before the vaccine could be developed, scientists first had to identify the cause of influenza — and, importantly, recognize that it was contagious.

New research by Trevor Foulk, Andrew Woolum, and Amir Erez at the University of Florida takes that same first step in identifying a different kind of contagious menace: rudeness. In a series of studies, Foulk and colleagues demonstrate that being the target of rude behavior, or even simply witnessing rude behavior, induces rudeness. People exposed to rude behavior tend to have concepts associated with rudeness activated in their minds, and consequently may interpret ambiguous but benign behaviors as rude. More significantly, they themselves are more likely to behave rudely toward others, and to evoke hostility, negative affect, and even revenge from others.

The finding that negative behavior can beget negative behavior is not exactly new, as researchers demonstrated decades ago that individuals learn vicariously and will repeat destructive actions.  In the now infamous Bobo doll experiment, for example, children who watched an adult strike a Bobo doll with a mallet or yell at it were themselves abusive toward the doll.  Similarly, supervisors who believe they are mistreated by managers tend to pass on this mistreatment to their employees.

Previous work on the negative contagion effect, however, has focused primarily on high-intensity behaviors like hitting or abusive supervision that are (thankfully) relatively infrequent in everyday life.  In addition, in most previous studies the destructive behavior was modeled by someone with a higher status than the observer. These extreme negative behaviors may thus get repeated because (a) they are quite salient and (b) the observer is consciously and intentionally trying to emulate the behavior of someone with an elevated social status.

To examine whether this sensitivity impacts social behavior, Foulk’s team conducted another study in which participants were asked to play the part of an employee at a local bookstore. Participants first observed a video showing either a polite or a rude interaction among coworkers. They were then asked to respond to an email from a customer. The email was either neutral (e.g., “I am writing to check on an order I placed a few weeks ago.”), highly aggressive (e.g., “I guess you or one of your incompetent staff must have lost my order.”), or moderately rude (e.g., “I’m really surprised by this as EVERYBODY said you guys give really good customer service???”).

Foulk and colleagues again found that prior exposure to rude behavior creates a specific sensitivity to rudeness. Notably, the type of video participants observed did not affect their responses to the neutral or aggressive emails; instead, the nature of those emails drove the response.  That is, all participants were more likely to send a hostile response to the aggressive email than to neutral email, regardless of whether they had previously observed a polite or rude employee interaction.  However, the type of video participants observed early in the study did affect their interpretation of and response to the rude email.  Those who had seen the polite video adopted a benign interpretation of the moderately rude email and delivered a neutral response, while those who had seen the rude video adopted a malevolent interpretation and delivered a hostile response.  Thus, observing rude behaviors, even those committed by coworkers or peers, resulted in greater sensitivity and heightened response to rudeness.

Read the entire article here.


Clowns, Ducks and Dancing Girls

OK, OK. I’ve had to break my own rule (again). You know, the one that states that I’m not supposed to write about politics. The subject is far too divisive, I’m told. However, as a US-based Brit, and hence a somewhat removed observer — though I can actually vote — I cannot stay on the sidelines.


For US politics, with its never-ending election season, is a process that must be observed, studied, dissected and savored. After all, it’s not really politics — it’s a hysterically entertaining reality TV show, complete with dancing girls, duck hunting, character assassination, clowns, demagogues, guns, hypocrisy, plaid shirts, lies and so much more. Best of all, there are no policies or substantive ideas of any kind; just pure entertainment. Netflix should buy the exclusive rights!


Image, top: Phil Robertson, star of the Duck Dynasty reality TV show, says Cruz is the man for the job because he is godly, loves America, and is willing to kill a duck to make gumbo soup. Courtesy of the Guardian.

Image, bottom: Political rally for Donald Trump featuring gyrating dancing girls and warnings to the “enemy”. Courtesy of Fox News.


Design Thinking Versus Product Development

Out with product managers; in with design thinkers. Time for some corporate creativity. Think user journeys and empathy maps.

A different corporate mantra is beginning to take hold at some large companies like IBM. It’s called design thinking, and while it’s not necessarily new, it holds promise for companies seeking to meet the needs of their customers at a fundamental level. Where design is often thought of in terms of defining and constructing cool-looking products, design thinking is used to capture a business problem at a broader level, shape business strategy and deliver a more holistic, deeper solution to customers. And, importantly, to do so more quickly than through a typical product development life-cycle.

From NYT:

Phil Gilbert is a tall man with a shaved head and wire-rimmed glasses. He typically wears cowboy boots and bluejeans to work — hardly unusual these days, except he’s an executive at IBM, a company that still has a button-down suit-and-tie reputation. And in case you don’t get the message from his wardrobe, there’s a huge black-and-white photograph hanging in his office of a young Bob Dylan, hunched over sheet music, making changes to songs in the “Highway 61 Revisited” album. It’s an image, Mr. Gilbert will tell you, that conveys both a rebel spirit and hard work.

Let’s not get carried away. Mr. Gilbert, who is 59 years old, is not trying to redefine an entire generation. On the other hand, he wants to change the habits of a huge company as it tries to adjust to a new era, and that is no small task.

IBM, like many established companies, is confronting the relentless advance of digital technology. For these companies, the question is: Can you grow in the new businesses faster than your older, lucrative businesses decline?

Mr. Gilbert answers that question with something called design thinking. (His title is general manager of design.) Among other things, design thinking flips traditional technology product development on its head. The old way is that you come up with a new product idea and then try to sell it to customers. In the design thinking way, the idea is to identify users’ needs as a starting point.

Mr. Gilbert and his team talk a lot about “iteration cycles,” “lateral thinking,” “user journeys” and “empathy maps.” To the uninitiated, the canons of design thinking can sound mushy and self-evident. But across corporate America, there is a rising enthusiasm for design thinking not only to develop products but also to guide strategy and shape decisions of all kinds. The September cover article of the Harvard Business Review was “The Evolution of Design Thinking.” Venture capital firms are hiring design experts, and so are companies in many industries.

Still, the IBM initiative stands out. The company is well on its way to hiring more than 1,000 professional designers, and much of its management work force is being trained in design thinking. “I’ve never seen any company implement it on the scale of IBM,” said William Burnett, executive director of the design program at Stanford University. “To try to change a culture in a company that size is a daunting task.”

Daunting seems an understatement. IBM has more than 370,000 employees. While its revenues are huge, the company’s quarterly reports have shown them steadily declining in the last two years. The falloff in revenue is partly intentional, as the company sold off less profitable operations, but the sometimes disappointing profits are not, and they reflect IBM’s struggle with its transition. Last month, the company shaved its profit target for 2015.

In recent years, the company has invested heavily in new fields, including data analytics, cloud computing, mobile technology, security, social media software for business and its Watson artificial intelligence technology. Those businesses are growing rapidly, generating revenue of $25 billion last year, and IBM forecasts that they will contribute $40 billion by 2018, through internal growth and acquisitions. Just recently, for example, IBM agreed to pay $2 billion for the Weather Company (not including its television channel), gaining its real-time and historical weather data to feed into Watson and analytics software.

But IBM’s biggest businesses are still the traditional ones — conventional hardware, software and services — which contribute 60 percent of its revenue and most of its profit. And these IBM mainstays are vulnerable, as customers increasingly prefer to buy software as a service, delivered over the Internet from remote data centers.

Recognizing the importance of design is not new, certainly not at IBM. In the 1950s, Thomas J. Watson Jr., then the company’s chief executive, brought on Eliot Noyes, a distinguished architect and industrial designer, to guide a design program at IBM. And Noyes, in turn, tapped others including Paul Rand, Charles Eames and Eero Saarinen in helping design everything from corporate buildings to the eight-bar corporate logo to the IBM Selectric typewriter with its golf-ball-shaped head.

At that time, and for many years, design meant creating eye-pleasing, functional products. Now design thinking has broader aims, as a faster, more productive way of organizing work: Look at problems first through the prism of users’ needs, research those needs with real people and then build prototype products quickly.

Defining problems more expansively is part of the design-thinking ethos. At a course in New York recently, a group of IBM managers were given pads and felt-tip pens and told to sketch designs for “the thing that holds flowers on a table” in two minutes. The results, predictably, were vases of different sizes and shapes.

Next, they were given two minutes to design “a better way for people to enjoy flowers in their home.” In Round 2, the ideas included wall placements, a rotating flower pot run by solar power and a software app for displaying images of flowers on a home TV screen.

Read the entire story here.


PhotoMash: A Blind Girl Sees; A Sighted Man is Blind

Today’s juxtaposition of images and stories comes courtesy of the Independent, from December 15, 2015. One is literally blind, the other figuratively.

The girl on the left is a 14-year-old from Malawi. Her name is Rose. As a result of severe cataracts, she had been blind since birth. A recent operation restored her sight.

The man on the right can see, and according to his doctors is in excellent health. But he remains blind to everything around him, except his own reflection.


Images courtesy of Independent, UK.


On the Joys of Not Being Twenty Again

I’m not twenty, and am constantly reminded that I’m not — both by internal alerts and external messages. Would I like to be younger? Of course. But it would certainly come at a price. So, after reading the exploits of a 20-something forced to live without her smartphone for a week, I realize it’s not all that bad being a cranky old Luddite.

I hope that the ordeal, excerpted below, is tongue-very-much-in-cheek, but I suspect it’s not: constant status refreshes, morning selfies, instant content gratification, nano-scale attention span, over-stimulation, life-stream documentation, peer ranking, group-think, interrupted interruptions. Thus, I realize I’m rather content not to be twenty after all.

From the Telegraph:

I have a confession to make: I am addicted to my smartphone. I use it as an alarm clock, map, notepad, mirror and camera.

I spend far too much time on Twitter and Instagram and have this week realised I have a nervous tic where I repeatedly unlock my smartphone.

And because of my phone’s many apps which organise my life and help me navigate the world, like many people my age, I am quite literally lost without it.

I am constantly told off by friends and family for using my phone during conversations, and I recently found out (to my horror) that I have taken over 5,000 selfies.

So when my phone broke I seized the opportunity to spend an entire week without it, and kept a diary each day.

Day One: Thursday

Frazzled, I reached to my bedside table, so I could take a morning selfie and send it to my friends.

Realising why that could not happen, my hand and my heart both felt empty. I knew at this point it was going to be a long week.

Day Two: Friday

I basked in the fact my colleagues could not contact me – and if I did not reply to their emails straight away it would not be the end of the world.

I then took the train home to see my parents outside London.

I couldn’t text my mother about any delays which may have happened (they didn’t), and she couldn’t tell me if she was going to be late to the station (she wasn’t). The lack of phone did nothing but make me feel anxious and prevent me from being able to tweet about the irritating children screaming on the train.

Day Three: Saturday

It is a bit weird feeling completely cut off from the outside world; I am not chained to my computer like I am at work and I am not allowed to constantly be on my laptop like a teen hacker.

It was nice though – a real detox. We went on a walk with our spaniel in the countryside near the Chiltern Hills. I had to properly talk to everyone, instead of constantly refreshing Twitter, which was novel.

I do feel like my attention span is improving every day, but I equally feel anchorless and lost without having any way of contacting anyone, or documenting my life.

….

Day Seven: Wednesday

My attention span and patience have grown somewhat, and I have noticed I daydream and have thoughts independent of Twitter far more often than usual.

Read the entire account here.


Back to the Future

Just over a hundred years ago, at the turn of the 20th century, Jean-Marc Côté and some of his fellow French artists were commissioned to imagine what the world would look like in 2000. Their colorful sketches and paintings portrayed some interesting inventions, though all seem to be grounded in familiar principles and incremental innovations — mechanical helpers, ubiquitous propellers and wings. Interestingly, none of these artist-futurists imagined a world beyond Victorian dress, gender inequality and wars. But these are gems nonetheless.

Some of their works found their way into cigar boxes and cigarette cases; others were exhibited at the 1900 World Exhibition in Paris. My three favorites: the Tailor of the Latest Fashion, the Aero-cab Station and the Whale Bus. See the full complement of these remarkable futuristic visions at the Public Domain Review, and check out the House Rolling Through the Countryside and At School.

I suspect our contemporary futurists — born in the late 20th or early 21st century — will fall prey to the same narrow visions when asked to sketch our planet in 3000. But despite the undoubted wealth of new gadgets and gizmos a thousand years from now, the real test will be whether their imagined worlds might be at peace, with equality for all.
Images courtesy of the Public Domain Review, a project of the Open Knowledge Foundation. Public domain.


Barbie the Surveillance Officer


There are probably any number of reasons that you, and your kids, may choose to steer clear of Barbie (the Mattel doll, that is). Detractors will point to a growing list of problems for which Barbie is blamed, including gender stereotyping, body-image distortion, vacuum cleaner accidents with her fake hair, eating disorders and poor self-esteem. However, it may not have occurred to you that the latest incarnation of the doll — the interactive Hello Barbie — could also be spying on you and your family. Could the CIA, NSA or MI5 be keeping tabs on you through your kid’s doll? Creepy. And oh, she’s still far too thin.

From the Guardian:

Mattel’s latest Wi-Fi enabled Barbie doll can easily be hacked to turn it into a surveillance device for spying on children and listening into conversations without the owner’s knowledge.

The Hello Barbie doll is billed as the world’s first “interactive doll” capable of listening to a child and responding via voice, in a similar way to Apple’s Siri, Google’s Now and Microsoft’s Cortana.

It connects to the internet via Wi-Fi and has a microphone to record children and send that information off to third-parties for processing before responding with natural language responses.

But US security researcher Matt Jakubowski discovered that when connected to Wi-Fi the doll was vulnerable to hacking, allowing him easy access to the doll’s system information, account information, stored audio files and direct access to the microphone.

Jakubowski told NBC: “You can take that information and find out a person’s house or business. It’s just a matter of time until we are able to replace their servers with ours and have her say anything we want.”

Once Jakubowski took control of where the data was sent the snooping possibilities were apparent. The doll only listens in on a conversation when a button is pressed and the recorded audio is encrypted before being sent over the internet, but once a hacker has control of the doll the privacy features could be overridden.

It was the ease with which the doll was compromised that was most concerning. The information stored by the doll could allow hackers to take over a home Wi-Fi network and from there gain access to other internet connected devices, steal personal information and cause other problems for the owners, potentially without their knowledge.

Read the entire story here.

Image courtesy of Google Search.


Forget The Millennials — It’s Time For Generation K

Blame fickle social scientists. After the baby-boomers, the most researched generation has been the millennials — so-called because they came of age at the turn of the century. We know what millennials like to eat and drink, how they dress, their politics; we know about their proclivity for sharing, their need for meaning and fun at work; we know they need attention and constant feedback. In fact, we have learned so much — and perhaps so little — from the thousands of often-conflicting research studies of millennials that some researchers have decided to move on to new blood. Yes, it’s time to tap another rich vein of research material — Generation K. But I’ll stop after relating what the “K” in Generation K means, and let you form your own conclusions.

Generation K is named for Katniss, as in the Hunger Games’ hero Katniss Everdeen. That’s right: if you were born between 1995 and 2002 then, according to economist Noreena Hertz, you are Gen-Katniss.

From the Guardian:

The brutal, bleak series that has captured the hearts of a generation will come to a brutal, bleak end in November when The Hunger Games: Mockingjay – Part 2 arrives in cinemas. It is the conclusion of the Hunger Games saga, which has immersed the young in a cleverly realised world of trauma, violence, mayhem and death.

For fans of Suzanne Collins’s trilogy about a young girl, Katniss Everdeen, forced to fight for survival in a country ruled by fear and fuelled by televised gladiatorial combat, this is the moment they have been waiting for.

Since the first book in the trilogy was published in 2008, Collins’s tale has sold more than 65 million copies in the US alone. The films, the first of which was released in 2012, have raked in more than $2bn worldwide at the box office and made a global star of their leading lady, Jennifer Lawrence, who plays the increasingly traumatised Katniss with a perfect mix of fury and resignation. For the huge appeal of The Hunger Games goes deeper than the fact that it’s an exciting tale well told. The generation who came to Katniss as young teens and have grown up ploughing through the books and queuing for the movies respond to her story in a particularly personal way.

As to why that might be, the economist and academic Noreena Hertz, who coined the term Generation K (after Katniss) for those born between 1995 and 2002, says that this is a generation riddled with anxiety, distrustful of traditional institutions from government to marriage, and, “like their heroine Katniss Everdeen, [imbued with] a strong sense of what is right and fair”.

“I think The Hunger Games resonates with them so much because they are Katniss navigating a dark and difficult world,” says Hertz, who interviewed 2,000 teenagers from the UK and the US about their hopes, fears and beliefs, concluding that today’s teens are shaped by three factors: technology, recession and coming of age in a time of great unease.

“This is a generation who grew up through 9/11, the Madrid bombings, the London bombings and Islamic State terrors. They see danger piped down their smartphones and beheadings on their Facebook page,” she says. “My data showed very clearly how anxious they are about everything from getting into debt or not getting a job, to wider issues such as climate change and war – 79% of those who took part in my survey worried about getting a job, 72% worried about debt, and you have to remember these are teenagers.

“In previous generations teenagers did not think in this way. Unlike the first-era millennials [who Hertz classes as those aged between 20 and 30] who grew up believing that the world was their oyster and ‘Yes we can’, this new generation knows the world is an unequal and harsh place.”

Writer and activist Laurie Penny, herself a first-era millennial at the age of 29, agrees. “I think what today’s young people have grasped that my generation didn’t get until our early 20s, is that adults don’t know everything,” she says. “They might be trying their best but they don’t always have your best interests at heart. The current generation really understands that – they’re more politically engaged and they have more sense of community because they’re able to find each other easily thanks to their use of technology.”

One of the primary appeals of the Hunger Games trilogy is its refusal to sugarcoat the scenarios Katniss finds herself in. In contrast to JK Rowling’s Harry Potter series, there are no reliable adult figures to dispense helpful advice and no one in authority she can truly trust (notably even the most likeable adult figures in the books tend to be flawed at best and fraudulent at worst). Even her friends may not always have her back, hard as they try – Dumbledore’s Army would probably find themselves taken out before they’d uttered a single counter-curse in the battlegrounds of Panem. At the end of the day, Katniss can only rely on one person, herself.

“Ultimately, the message of the Hunger Games is that everything’s not going to be OK,” says Penny. “One of the reasons Jennifer Lawrence is so good is because she lets you see that while Katniss is heroic, she’s also frightened all of the time. She spends the whole story being forced into situations she doesn’t want to be in. Kids respond because they can imagine what it’s like to be terrified but know that you have to carry on.”

It’s incontestable that we live in difficult times and that younger generations in particular may be more acutely aware that things aren’t improving any time soon, but is it a reach to say that fans of the Hunger Games are responding as much to the world around them as to the books?

Read the entire story here.

Video: The Hunger Games: Mockingjay Part 2 Official Trailer – “We March Together”. Courtesy of the Hunger Games franchise.


Perchance Art Thou Smitten by Dapper Hipsters? Verily Methinks

As the (mostly) unidirectional tide of cultural influence flows from the U.S. to the United Kingdom, the English mother tongue is becoming increasingly (and distressingly, I might add) populated by Americanisms: trash instead of rubbish, fries not chips, deplane instead of disembark, shopping cart instead of trolley, bangs rather than fringe, period instead of full stop. And there’s more: 24/7, heads-up, left-field, normalcy, a savings of, deliverable, the ask, winningest.

All, might I say, utterly cringeworthy.

Yet, there may be a slight glimmer of hope, all courtesy of the hipster generation. Hipsters, you see, crave an authentic, artisanal experience — think goat cheese and bespoke hats — and that craving seems to extend to language. So, in 2015, compared with a mere decade earlier, you’re more likely to hear some of the following words, which would normally be attributable to an archaic, even Shakespearean, era:

perchance, mayhaps, parlor, amidst, amongst, whilst, unbeknownst, thou, thee, ere, hath

I’m all for it. My only hope now is that these words will flow against the tide and into the U.S. to repair some of the previous linguistic deforestation. Methinks I’ll put some of these to immediate, good use.

From the Independent:

Hipsters are famous for their love of all things old-fashioned: 19th Century beards, pickle-making, Amish outerwear, naming their kids Clementine or Atticus. Now, they may be excavating archaic language, too.

As Chi Luu points out at JSTOR Daily — the blog of a database of academic journals, what could be more hipster than that? — old-timey words like bespoke, peruse, smitten and dapper appear to be creeping back into the lexicon.

This data comes from Google’s Ngram viewer, which charts the frequencies of words appearing in printed sources between 1800 and 2012.

Google’s Ngram shows that lots of archaic words appear to be resurfacing — including gems like perchance, mayhaps and parlor.

The same trend is visible for words like amongst, amidst, whilst and unbeknownst, which are archaic forms of among, amid, while and unknown.
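Under the hood, the Ngram viewer computes nothing more mysterious than each word’s share of all the words printed in a given year. A minimal sketch of that bookkeeping over an invented two-year corpus (the texts below are made up; Google does the same arithmetic over millions of digitized books):

```python
from collections import Counter

# Hypothetical mini-corpus, keyed by year of publication.
corpus = {
    1995: ["while we waited among the crowd",
           "the author wrote while traveling, unknown to all"],
    2015: ["whilst we waited amongst the bearded crowd",
           "unbeknownst to the barista, whilst he poured"],
}

def frequency(word: str, year: int) -> float:
    """Share of all words printed in `year` that are `word`."""
    words = [w.strip(",.").lower() for text in corpus[year] for w in text.split()]
    return Counter(words)[word] / len(words)

for year in sorted(corpus):
    print(year, f"whilst: {frequency('whilst', year):.3f}")
```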

Read the story in its entirety here.

Image courtesy of Google’s Ngram viewer / Independent.


Your Job is Killing You


Many of us complain about the daily stresses of our jobs, our bosses, even our coworkers. We bemoan the morning commute and the work we increasingly bring home to finish in the evening. Many of us can be heard to say, “This job is killing me!” Metaphorically, of course.

Well, researchers at Stanford and Harvard now find that in some cases your job is, quite literally, killing you. This may seem self-evident, but the data shows that workers with less education are significantly more likely to be employed in jobs that are more stressful and dangerous, and that have less healthy workplace practices. This, in turn, leads to a significantly lower average life span than that of workers with higher educational attainment. The researchers measured typical employment-related stressors: unemployment, layoffs, absence of employer-subsidized health insurance, shift work, long working hours, job insecurity and work-family conflict. The less education a worker has, the more likely he or she is to suffer a greater burden from one or more of these stressors.

Looks like we’re gradually reverting to well-tested principles of Victorian worker exploitation. Check out more details from the study here.

From Washington Post:

People often like to groan about how their job is “killing” them. Tragically, for some groups of people in the U.S., that statement appears to be true.

A new study by researchers at Harvard and Stanford has quantified just how much a stressful workplace may be shaving off of Americans’ life spans. It suggests that the amount of life lost to stress varies significantly for people of different races, educational levels and genders, and ranges up to nearly three years of life lost for some groups.

Past research has shown an incredible variation in life expectancy around the United States, depending on who you are and where you live. Mapping life expectancy around the nation by both county of residence and race, you can see that people in some parts of the U.S. live as many as 33 years longer on average than people in other parts of the country, the researchers say.

Those gaps appear to be getting worse, as the wealthy extend their life spans and other groups are stagnant. One study found that men and women with fewer than 12 years of education had life expectancies that were still on par with most adults in the 1950s and 1960s — suggesting the economic gains of the last few decades have gone mostly to more educated people. The financial crisis and subsequent recession, which put many people in economic jeopardy, may have worsened this effect.

There are lots of reasons that people with lower incomes and educations tend to have lower life expectancies: differences in access to health care, in exposure to air and water pollution, in nutrition and health care early in life, and in behaviors, such as smoking, exercise and diet. Past research has also shown that job insecurity, long hours, heavy demands at work and other stresses can also cut down on a worker’s life expectancy by taking a heavy toll on a worker’s health. (If you work in an office, here are some exercises you might try to prevent this.)

But researchers say this is the first study to look at the ways that a workplace’s influence on life expectancy specifically break down by racial and educational lines.

To do their analysis, they divided people into 18 different groups by race, education and sex. They then looked at 10 different workplace factors — including unemployment and layoffs, the absence of health insurance, shift work, long working hours, job insecurity and work-family conflict — and estimated the effect that each would have on annual mortality and life expectancy.

The data show that people with less education are much more likely to end up in jobs with more unhealthy workplace practices that cut down on one’s life span. People with the highest educational attainment were less affected by workplace stress than people with the least education, the study says.
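The core of that estimation is a weighted sum: for each demographic group, multiply each stressor’s prevalence by its estimated cost in years of life, then add. A back-of-envelope sketch, with every number invented for illustration (the study’s actual estimates come from a meta-analysis and a far richer model):

```python
# Hypothetical stressor -> (prevalence in one group, years of life lost if
# exposed). All values invented; see the study itself for real estimates.
stressors = {
    "unemployment and layoffs": (0.25, 1.5),
    "no employer health insurance": (0.30, 0.9),
    "shift work": (0.35, 0.6),
    "job insecurity": (0.40, 0.8),
    "work-family conflict": (0.30, 0.5),
}

years_lost = sum(p * yrs for p, yrs in stressors.values())
print(f"expected life expectancy lost: {years_lost:.2f} years")  # ~1.3 years
```

The study’s headline numbers — up to nearly three years of life lost for some groups — come from the same kind of sum, just with estimates grounded in real data.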

Read the entire story here.

Image: Women mealtime at St Pancras workhouse, London. Courtesy: Peter Higginbotham. Licensed under Public Domain via Commons.


The Vicious Cycle of Stuff


Many of us in the West, and now increasingly in developing nations, are guilty perpetrators of the seemingly never-ending cycle of consumption and accumulation. Yet for all the talk of sustainability, downsizing and responsible consumption, we continue to gather, hoard and surround ourselves with more and more stuff.

From the Guardian:

The personal storage industry rakes in $22bn each year, and it’s only getting bigger. Why?

I’ll give you a hint: it’s not because vast nations of hoarders have finally decided to get their acts together and clean out the hall closet.

It’s also not because we’re short on space. In 1950 the average size of a home in the US was 983 square feet. Compare that to 2011, when American houses ballooned to an average size of 2,480 square feet – almost triple the size.

And finally, it’s not because of our growing families. This will no doubt come as a great relief to our helpful commenters who each week kindly suggest that for maximum environmental impact we simply stop procreating altogether: family sizes in the western world are steadily shrinking, from an average of 3.37 people in 1950 to just 2.6 today.

So, if our houses have tripled in size while the number of people living in them has shrunk, what, exactly, are we doing with all of this extra space? And why the billions of dollars tossed to an industry that was virtually nonexistent a generation or two ago?

Well, friends, it’s because of our stuff. What kind of stuff? Who cares! Whatever fits! Furniture, clothing, children’s toys (for those not fans of deprivation, that is), games, kitchen gadgets and darling tchotchkes that don’t do anything but take up space and look pretty for a season or two before being replaced by other, newer things – equally pretty and equally useless.

The simple truth is this: you can read all the books and buy all the cute cubbies and baskets and chalkboard labels, even master the life-changing magic of cleaning up – but if you have more stuff than you do space to easily store it, your life will be spent a slave to your possessions.

We shop because we’re bored, anxious, depressed or angry, and we make the mistake of buying material goods and thinking they are treats which will fill the hole, soothe the wound, make us feel better. The problem is, they’re not treats, they’re responsibilities and what we own very quickly begins to own us.

The second you open your wallet to buy something, it costs you – and in more ways than you might think. Yes, of course there’s the price tag and the corresponding amount of time it took you to earn that amount of money, but possessions also cost you space in your home and time spent cleaning and maintaining them. And as the token environmentalist in the room, I’d be remiss if I didn’t remind you that when you buy something, you’re also taking on the task of disposing of it (responsibly or not) when you’re done with it. Our addiction to consumption is a vicious one, and it’s stressing us out.

I know this because I’ve experienced it, having lived in everything from a four-bedroom house to the one-bedroom flat I currently share with my daughter – but I’m also bringing some cold, hard science to the table.

A study published by UCLA showed that women’s stress hormones peaked during the times they were dealing with their possessions and material goods. Anyone who parks on the street because they can’t fit their car into the garage, or has stared down a crammed closet, can relate.

Our addiction to consuming is a vicious one, and it’s having a markedly negative impact on virtually every aspect of our lives.

Read the entire story here.

Image courtesy of Google Search.

Time for the Bucket List to Kick the Bucket

For the same reasons that New Year’s resolutions are daft, it’s time to ditch the bucket list. Columnist Steven Thrasher rightly argues that your decision to get something done or to try something new should be driven by gusto for life — passion, curiosity, wonder, joy — rather than dictated by a checkbox, ticked because you’re one step closer to death. Signs that it’s time to ditch the bucket list: when the idea is co-opted by corporations, advertisers and Hollywood; when motivational posters appear in hallways; and when physical bucket-list buckets and notepads go on sale at Pottery Barn or Walmart.

From the Guardian:

Before each one of us dies, let’s wipe the “bucket list” from our collective vocabulary.

I hate the term “the bucket list.” The phrase, a list of things one wants to do in life before one dies or “kicks the bucket”, is the kind of hackneyed, clichéd, stupid and insipid term only we Americans can come up with.

Even worse, “the bucket list” has become an excuse for people to couch things they actually desire to try as socially acceptable only when framed in the face of their death. It’s as if pleasure, curiosity and fun weren’t reasons enough for action.

If you want to try doing something others might find strange or unorthodox – write a novel, learn to tap dance, engage in a rim job, field dress a deer, climb Everest, go out in drag for a night – why do you need any justification at all? And certainly, why would you need an explanation that is only justifiable in terms of kicking the bucket?

According to the Wall Street Journal, the phrase “bucket list” comes to us from the banal mind of screenwriter Justin Zackham, who developed a list of things he wanted to do before he died. Years later, his “bucket list” became the title of his corny 2007 film starring Jack Nicholson and Morgan Freeman. It’s about two old men with terminal cancer who want to live it up before they die. That, if anyone at all, is who should be using the term “bucket list”. They want to do something with the finite time they know they have left? Fine.

But bucket list has trickled down to everyday use by the perfectly healthy, the exceptionally young, and most of all, to douche bags. I realized this at Burning Man last week. Often, when I asked exceptionally boring people what had drawn them to Black Rock City, they’d say: “It was on my bucket list!”

Really? You wanted to schlep out to the desert and face freezing lows, scorching highs and soul crushing techno simply because you’re going to die someday?

There’s a funny dynamic sometimes when I go on a long trip while I’m out of work. When I backpacked through Asia and Europe in 2013, people (usually friends chained to a spouse, children and a mortgage) would sometimes awkwardly say to me: “Well, it will be the trip of a lifetime!” It was a good trip, but just one of many great journeys I’ve taken in my life so far. My adventures might interrupt someone else’s idea of what’s “normal.” But travel isn’t something I do to fulfil my “bucket list”; travel is a way of life for me. I do not rush into a trip thinking: “Good Christ, I could die tomorrow!” I don’t travel in place of the stable job or partner or kids I may or may not ever have. I do it as often as I can because it brings me joy.

Read the entire column here.

The 75 Percent Versus 1 Percent

Stop the presses! Hold your horses! There seems to be some hope for humanity after all — and here I was, just about to seek out a misanthrope-approved cave in which to hide.

A recent study by the Common Cause Foundation shows that three-quarters of the one thousand people it surveyed identify more closely with unselfish values (altruism, forgiveness, honesty) than selfish ones (money, fame, power). But, as George Monbiot points out, those in the 1 percent who run the globe tend to be the selfish ones. He’s also quite right to propose that we’d all be better served if the media apparatchiks who fawn upon the 1 percent spent more time delving into the stories of those who give, rather than take.

From the Guardian:

Do you find yourself thrashing against the tide of human indifference and selfishness? Are you oppressed by the sense that while you care, others don’t? That, because of humankind’s callousness, civilisation and the rest of life on Earth are basically stuffed? If so, you are not alone. But neither are you right.

A study by the Common Cause Foundation, due to be published next month, reveals two transformative findings. The first is that a large majority of the 1,000 people they surveyed – 74% – identifies more strongly with unselfish values than with selfish values. This means that they are more interested in helpfulness, honesty, forgiveness and justice than in money, fame, status and power. The second is that a similar majority – 78% – believes others to be more selfish than they really are. In other words, we have made a terrible mistake about other people’s minds.

The revelation that humanity’s dominant characteristic is, er, humanity will come as no surprise to those who have followed recent developments in behavioural and social sciences. People, these findings suggest, are basically and inherently nice.

A review article in the journal Frontiers in Psychology points out that our behaviour towards unrelated members of our species is “spectacularly unusual when compared to other animals”. While chimpanzees might share food with members of their own group, though usually only after being plagued by aggressive begging, they tend to react violently towards strangers. Chimpanzees, the authors note, behave more like the homo economicus of neoliberal mythology than people do.

Humans, by contrast, are ultrasocial: possessed of an enhanced capacity for empathy, an unparalleled sensitivity to the needs of others, a unique level of concern about their welfare, and an ability to create moral norms that generalise and enforce these tendencies.

Such traits emerge so early in our lives that they appear to be innate. In other words, it seems that we have evolved to be this way. By the age of 14 months, children begin to help each other, for example by handing over objects another child can’t reach. By the time they are two, they start sharing things they value. By the age of three, they start to protest against other people’s violation of moral norms.

A fascinating paper in the journal Infancy reveals that reward has nothing to do with it. Three- to five-year-olds are less likely to help someone a second time if they have been rewarded for doing it the first time. In other words, extrinsic rewards appear to undermine the intrinsic desire to help. (Parents, economists and government ministers, please note.) The study also discovered that children of this age are more inclined to help people if they perceive them to be suffering, and that they want to see someone helped whether or not they do it themselves. This suggests that they are motivated by a genuine concern for other people’s welfare, rather than by a desire to look good.

Why? How would the hard logic of evolution produce such outcomes? This is the subject of heated debate. One school of thought contends that altruism is a logical response to living in small groups of closely related people, and evolution has failed to catch up with the fact that we now live in large groups, mostly composed of strangers.

Another argues that large groups containing high numbers of altruists will outcompete large groups which contain high numbers of selfish people. A third hypothesis insists that a tendency towards collaboration enhances your own survival, regardless of the group in which you might find yourself. Whatever the mechanism might be, the outcome should be a cause of celebration.

So why do we retain such a dim view of human nature? Partly, perhaps, for historical reasons. Philosophers from Hobbes to Rousseau, Malthus to Schopenhauer, whose understanding of human evolution was limited to the Book of Genesis, produced persuasive, influential and catastrophically mistaken accounts of “the state of nature” (our innate, ancestral characteristics). Their speculations on this subject should long ago have been parked on a high shelf marked “historical curiosities”. But somehow they still seem to exert a grip on our minds.

Another problem is that – almost by definition – many of those who dominate public life have a peculiar fixation on fame, money and power. Their extreme self-centredness places them in a small minority, but, because we see them everywhere, we assume that they are representative of humanity.

The media worships wealth and power, and sometimes launches furious attacks on people who behave altruistically. In the Daily Mail last month, Richard Littlejohn described Yvette Cooper’s decision to open her home to refugees as proof that “noisy emoting has replaced quiet intelligence” (quiet intelligence being one of his defining qualities). “It’s all about political opportunism and humanitarian posturing,” he theorised, before boasting that he doesn’t “give a damn” about the suffering of people fleeing Syria. I note with interest the platform given to people who speak and write as if they are psychopaths.

Read the entire story here.

Wot! Proper Grammar?

It seems that there are several ways to turn off a potential dating connection online: a picture of your bad teeth, tales of your poor hygiene, political posturing, and now, a poorly written profile or introductory email. Is our children learning?

Seriously, can it be that the younger generation is finally rebelling against the tyranny of Twitteresque lowercase, incorrect punctuation, nonsensical grammar, fatuous emoticons and facile abbreviations? If so, this is wonderful news for those who care about our language. Now, perhaps, these same people can turn their talents to educating the barely literate generations holding jobs in corporate America. After decades of subservience to fractured PowerPoint haiku, many can no longer string together a coherent paragraph.

From the WSJ:

When Jeff Cohen was getting ready to meet his OkCupid date for drinks in Manhattan, he started to have second thoughts as he reread the glaring grammatical error in her last message: “I will see you their.”

The date flopped for a couple of reasons, but bad grammar bothers Mr. Cohen. Learning a potential mate doesn’t know the difference between “there,” “they’re” and “their” is like discovering she loves cats, he says. Mr. Cohen is allergic to cats. “It’s like learning I’m going to sneeze every time I see her,” he says.

With crimes against grammar rising in the age of social media, some people are beginning to take action. The online dating world is a prime battleground.

Mr. Cohen joins a number of singles picky about the grammar gaffes they’re seeing on dating sites. For love, these folks say written communications matter, from the correct use of semicolons, to understanding the difference between its and it’s, and sentences built on proper parallel construction.

“Grammar snobbery is one of the last permissible prejudices,” says John McWhorter, a linguistics professor at Columbia University. “The energy that used to go into open classism and racism now goes into disparaging people’s grammar.”

Mr. Cohen now uses an app that ranks the message quality of prospective dates. Called the Grade, the app checks messages for typos and grammar errors and assigns each user a letter grade from A+ to F.

The Grade demotes people whose messages contain certain abbreviations, like “wassup” and “YOLO,” short for “You Only Live Once,” popular among young people who want to justify doing something risky or indulgent. Clifford Lerner, chief executive of SNAP Interactive Inc., the company that makes the Grade, says the app downgrades these types of phrases in an effort to promote “meaningful conversations.”

Dating site Match asked more than 5,000 singles in the U.S. what criteria they used most in assessing dates. Beyond personal hygiene—which 96% of women valued most, as compared with 91% of men—singles said they judged a date foremost by the person’s grammar. The survey found 88% of women and 75% of men said they cared about grammar most, putting it ahead of a person’s confidence and teeth.

“When you get a message that is grammatically correct and has a voice and is put together, it is very attractive, it definitely adds hotness points,” says New Yorker Grace Gold. “People who send me text-type messages, and horrific grammatical errors? I just delete them.” She recalls the red flag raised by one potential suitor who had written his entire dating profile in lowercase.

Language has always played a part in how people judge others, but it has become amplified in recent years with increasingly informal and colloquial usage, says Ben Zimmer, a lexicographer and chair of the New Words Committee of the American Dialect Society.

Read the entire story here.

PhotoMash: Two Types of Radical

Photomash-Radical-1-vs-Radical-2

Meet two faces of radicalism: one is the face of radical Islam; the second is the face of radical nationalism. Different, but similar, and both morally bankrupt.

Both have ideas that resonate with a very limited few (luckily for the rest of us); both inflame our discourse; both fuel hatred, distrust and intolerance; both project fear, racism, xenophobia and misogyny. Welcome to the new faces of fascism.

As a Londoner recently said of an attacker (reportedly belonging to the first type of radical group): #YouAintNoMuslimBruv.

I’d suggest to our second radical: #YouAintNoAmericanBro.

Both of these nightmarish visions seek a place on the world stage — both should and will rightly fail.

Image courtesy of the Washington Post, December 7, 2015.

PhotoMash: Two Kinds of Monster, One Real

I couldn’t resist this week’s photo mash-up. This one comes courtesy of the Guardian on December 3, 2015. It features two types of monster very aptly placed alongside each other by a kindly newspaper editor.

Photomash-Trump-vs-Monsters

The first monster happens to want to be President of the United States. He seems to be a racist, misogynist and raving bigot, and unfortunately (for some), he’s very real. The second is the story of photographic artist Flora Borsi. She’s tired of perfect models with perfect hair in perfect fashion photographs. So she retouches them, or in her words “detouches” the images into her “little monsters”. These are not real.

Our real world can be rather surreal.

Images courtesy of the Guardian.

The US and the UK: A Stark Difference

Terrorism-US-3Dec2015

Within the space of a few days we’ve witnessed two more acts of atrocious violence and murder: one in San Bernardino, California, the other in London, England.

In California, 14 innocent people lost their lives and, by some accounts, 21 people were injured; of course, many hundreds of police officers and first responders put their lives at risk in searching for and confronting the murderers.

In London, 3 people were injured, one seriously, by an attacker on the London Underground (subway).

Terrorism-UK-6Dec2015

Label these attacks acts of terrorism, or acts of deranged minds. But, whether driven by warped ideologies or mental health issues, the murder and violence in California and London show one very stark difference.

Guns. Lots of guns.

The attackers in California were armed to the teeth: handguns, semi-automatic weapons and thousands of rounds of ammunition. The attacker in London was wielding a knife. You see, terrorism, violent radicalism and mental health problems exist — to much the same extent — in both the US and the UK (and across the globe, for that matter). But more often than not the outcome will be rather different — that is, more bloody and deadly — in the US because of access to weapons that conveniently facilitate mass murder.

And, sadly, until a significant proportion of the US population comes to terms with this fact, rather than hiding behind a distorted interpretation of the 2nd Amendment, the carnage and mass murder — in the US — will continue.

Monarchy: Bad. Corporations and Oligarchs: Good

Google-search-GOP-candidates

The Founders of the United States had an inkling that federated democracy could not belong to all the people — hence they inserted the Electoral College. Yet they tried hard to design a system that improved upon the injustice and corruption of hereditary power. But while they understood the dangers of autocratic British monarchy, they utterly failed to foresee the role of corporations and vast sums of money in delivering much the same experience a couple of centuries later.

Ironically enough, all of Europe’s monarchies have given way to parliamentary democracies, which are less likely to be ruled or controlled through financial puppeteering. In the United States, on the other hand, the once-shining beacon of democracy is firmly in the grip of corporations, political action committees (PACs) and a handful of oligarchs awash in money, and lots of it. They control the discourse. They filter the news. They vet and anoint candidates, and destroy their foes. They shape and make policy. They lobby and “pay” lawmakers. They buy and aggregate votes. They now define and run the system.

But, of course, our corporations and billionaires are not hereditary aristocrats — they’re ordinary people with our interests at heart — according to the U.S. Supreme Court. So, all must be perfect and good, especially for those who subscribe to the constructionist view of the US Constitution.

From the Guardian:

To watch American politics today is to watch money speaking. The 2016 US elections will almost certainly be the most expensive in recent history, with total campaign expenditure exceeding the estimated $7bn (£4.6bn) splurged on the 2012 presidential and congressional contests. Donald Trump is at once the personification of this and the exception that proves the rule because – as he keeps trumpeting – at least it’s his own money. Everyone else depends on other people’s, most of it now channelled through outside groups such as “Super PACs” – political action committees – which are allowed to raise unlimited amounts from individuals and corporations.

The sums involved dwarf those in any other mature democracy. Already, during the first half of 2015, $400m has been raised, although the elections are not till next autumn. Spending on television advertising is currently projected to reach $4.4bn over the whole campaign. For comparison, all candidates and parties in Britain’s 2010 election spent less than £46m. In Canada’s recent general election the law allowed parties to lay out a maximum of about C$25m (£12.5m) for the first 37 days of an election campaign, plus an extra C$685,185 (to be precise) for each subsequent day.

Rejecting a challenge to such campaign finance regulation back in 2004, the Canadian supreme court argued that “individuals should have an equal opportunity to participate in the electoral process”, and that “wealth is the main obstacle to equal participation”. “Where those having access to the most resources monopolise the election discourse,” it explained, “their opponents will be deprived of a reasonable opportunity to speak and be heard.”

The US supreme court has taken a very different view. In its 2010 Citizens United judgment it said, in effect, that money has a right to speak. Specifically, it affirmed that a “prohibition on corporate independent expenditures is … a ban on speech”. As the legal scholar Robert Post writes, in a persuasive demolition of the court’s reasoning, “this passage flatly equates the first amendment rights of ordinary commercial corporations with those of natural persons”. (Or, as the former presidential candidate Mitt Romney put it in response to a heckler: “Corporations are people, my friend.”)

In a book entitled Citizens Divided, Post demonstrates how the Citizens United judgment misunderstands the spirit and deeper purpose of the first amendment: for people to be best equipped to govern themselves they need not just the freedom of political speech, but also the “representative integrity” of the electoral process.

Of course, an outsize role for money in US politics is nothing new. Henry George, one of the most popular political economists of his day, wrote in 1883 that “popular government must be a sham and a fraud” so long as “elections are to be gained by the use of money, and cannot be gained without it”. Whether today’s elections are so easily to be gained by the use of money is doubtful, when so much of it is sloshing about behind so many candidates, but does anyone doubt the “cannot be gained without it”?

Money may have been shaping US politics for some time, but what is new is the scale and unconstrained character of the spending, since the 2010 Citizens United decision and the Super PACs that it (and a subsequent case in a lower court) enabled. Figures from the Center for Responsive Politics show outside spending in presidential campaign years rising significantly in 2004 and 2008 but then nearly trebling in 2012 – and, current trends suggest, we ain’t seen nothing yet.

The American political historian Doris Kearns Goodwin argues that the proliferation of Republican presidential candidates, so many that they won’t even fit on the stage for one television debate, is at least partly a result of the ease with which wealthy individuals and businesses can take a punt on their own man – or Carly Fiorina. A New York Times analysis found that around 130 families and their businesses accounted for more than half the money raised by Republican candidates and their Super PACs up to the middle of this year. (Things aren’t much better on the Democrat side.) And Goodwin urges her fellow citizens to “fight for an amendment to undo Citizens United”.

The Harvard law professor and internet guru Larry Lessig has gone a step further, himself standing for president on the single issue of cleaning up US politics, with a draft citizen equality act covering voter registration, gerrymandering, changing the voting system and reforming campaign finance. That modest goal achieved, he will resign and hand over the reins to his vice-president. Earlier this year he said he would proceed if he managed to crowdfund more than $1m, which he has done. Not peanuts for you or me, but Jeb Bush’s Super PAC, Right to Rise, is planning to spend $37m on television ads before the end of February next year. So one of the problems of the campaign for campaign finance reform is … how to finance its campaign.

Read the entire story here.

Image courtesy of Google Search.

H2O and IQ

There is great irony in NASA’s recent discovery of water flowing on Mars.

First, the gift of our intelligence allows us to make such amazing findings on other worlds while we use the same brain cells to enable the rape and pillage of our own.

CADrought-LakeOroville

Second, the meager seasonal trickles of liquid on the Martian surface show us a dire possible future for our own planet.

Mars-Recurring-Slope-Lineae

From the Guardian:

Evidence for flowing water on Mars: this opens up the possibility of life, of wonders we cannot begin to imagine. Its discovery is an astonishing achievement. Meanwhile, Martian scientists continue their search for intelligent life on Earth.

We may be captivated by the thought of organisms on another planet, but we seem to have lost interest in our own. The Oxford Junior Dictionary has been excising the waymarks of the living world. Adders, blackberries, bluebells, conkers, holly, magpies, minnows, otters, primroses, thrushes, weasels and wrens are now surplus to requirements.

In the past four decades, the world has lost 50% of its vertebrate wildlife. But across the latter half of this period, there has been a steep decline in media coverage. In 2014, according to a study at Cardiff University, there were as many news stories broadcast by the BBC and ITV about Madeleine McCann (who went missing in 2007) as there were about the entire range of environmental issues.

Think of what would change if we valued terrestrial water as much as we value the possibility of water on Mars. Only 3% of the water on this planet is fresh; and of that, two-thirds is frozen. Yet we lay waste to the accessible portion. Sixty per cent of the water used in farming is needlessly piddled away by careless irrigation. Rivers, lakes and aquifers are sucked dry, while what remains is often so contaminated that it threatens the lives of those who drink it. In the UK, domestic demand is such that the upper reaches of many rivers disappear during the summer. Yet still we install clunky old toilets and showers that gush like waterfalls.

As for salty water, of the kind that so enthrals us when apparently detected on Mars, on Earth we express our appreciation with a frenzy of destruction. A new report suggests fish numbers have halved since 1970. Pacific bluefin tuna, which once roamed the seas in untold millions, have been reduced to an estimated 40,000, yet still they are pursued. Coral reefs are under such pressure that most could be gone by 2050. And in our own deep space, our desire for exotic fish rips through a world scarcely better known to us than the red planet’s surface. Trawlers are now working at depths of 2,000 metres. We can only guess at what they could be destroying.

A few hours before the Martian discovery was announced, Shell terminated its Arctic oil prospecting in the Chukchi Sea. For the company’s shareholders, it’s a minor disaster: the loss of $4bn; for those who love the planet and the life it sustains, it is a stroke of great fortune. It happened only because the company failed to find sufficient reserves. Had Shell succeeded, it would have exposed one of the most vulnerable places on Earth to spills, which are almost inevitable where containment is almost impossible. Are we to leave such matters to chance?

At the beginning of September, two weeks after he granted Shell permission to drill in the Chukchi Sea, Barack Obama travelled to Alaska to warn Americans about the devastating effects that climate change caused by the burning of fossil fuels could catalyse in the Arctic. “It’s not enough just to talk the talk”, he told them. “We’ve got to walk the walk.” We should “embrace the human ingenuity that can do something about it”. Human ingenuity is on abundant display at Nasa, which released those astounding images. But not when it comes to policy.

Let the market decide: this is the way in which governments seek to resolve planetary destruction. Leave it to the conscience of consumers, while that conscience is muted and confused by advertising and corporate lies. In a near-vacuum of information, we are each left to decide what we should take from other species and other people, what we should allocate to ourselves or leave to succeeding generations. Surely there are some resources and some places – such as the Arctic and the deep sea – whose exploitation should simply stop?

Read the entire article here.

Images: Lake Oroville, California, Earth, courtesy of U.S. Drought Portal. Recurring slope lineae, Mars, courtesy of NASA/JPL.

Green Friday

South Arapahoe Peak

To my US readers… Happy Thanksgiving. By this time you will no doubt have been bombarded by countless commercials, online ads, billboards, flyers and messages to your inbox, social media accounts, etc., espousing the wonders of the so-called Black Friday shopping orgy.

My advice: boycott the shopping mall and the stores — both online and brick-and-mortar — go outside, breathe some fresh air, and join Green Friday. It’s infinitely better for the heart and the soul (and your bank account). My home state of Colorado has joined the bandwagon this year by opening up all state parks for free on Fresh Air Friday.

Image: South Arapahoe Peak, looking East, Indian Peaks Wilderness, Colorado. Courtesy of the author.

Crony Capitalism Rules

The self-righteous preachers on both sides of the political aisle in the U.S. are constantly decrying corruption across the globe; one day the target may be a central African nation, the next it’s China, then a country in Latin America. Of course, this wouldn’t be so ****ing hypocritical if those in positions of power opened their eyes — and closed their wallets — to the rampant cash-fueled cronyism in their own backyards.

The threat to this democracy from those with hoards of money is greater than any real or imagined hostility from terrorism. Money greases and fuels the well-oiled machine in Washington, D.C.; it catalyses those who peddle influence; it brokers power and it curries favor. The influence of money is insidious and pervasive, and it is eating away at the promise of democracy for all.

Our politicians pay homage to the bundlers; they crave endorsement from the millionaires; and, increasingly, they need anointment from the billionaires. And Rome burns. Then, when our so-called representatives have had their turn in the public limelight and in power, they retreat to the shadows, where, as lobbyists and brokers, they wield even greater power for the moneyed few. And Rome continues to burn.

So you know things must be rather dire if even huge swathes of capitalist corporate America want some form of significant campaign finance reform. You can read for yourself what the Committee for Economic Development of the Conference Board has to say in its scathing report, Crony Capitalism: Unhealthy Relations Between Business and Government.

From the Guardian:

Political corruption is eating our democracy out from the inside. Most Americans know that. But democratic and economic health can’t be easily disentangled. As it diminishes our public sphere and drowns out the myriad of citizen voices, it also sucks the energy and vitality from our economy. This causes pain to business owners.

According to a recent report from the Committee for Economic Development, an old, white-shoe non-partisan organization that came out of the aftermath of World War II (and was a booster for the Marshall Plan), the United States economy is increasingly represented by crony capitalism, not competitive capitalism.

Lobbyists and privately funded elections have, according to the CED: “exerted an important toll on the US economy”. They propose banning registered lobbyists from raising money for federal candidates and officeholders, and implementing strict revolving door policies.

Crony capitalism, the report details, leads to “rent-seeking through subsidies or taxes that benefit vested interests at the expense of others, rather than the pursuit of profit through socially and economically productive behavior”.

What is most striking about the report is who is behind it. The CEO of CED is former Romney supporter Steve Odland. A former top lobbyist for PepsiCo, a Republican called Larry Thompson – someone I never thought I’d agree with – is endorsing the single most important structural reform in America: publicly financed elections.

Thompson is the Co-Chair of CED’s Sustainable Capitalism Subcommittee, a driver in the release of the report. Paul Atkins, another member of the CED board (and the sustainable capitalism subcommittee) was a Bush-appointed SEC Commissioner who opposed rules constraining hedge funds.

“Campaign finance reform could free elected officials from their dependence on private campaign funding. Such funding is seen as an important reason why elected officials might bend their views on policy issues away from the public interest,” the report said.

I disagree with a big part of the report. I don’t think we should reduce the corporate tax rate. But the crony capitalism argument is right on point, and the most striking thing about the report is its full-throated endorsement of a public financing model. And, the report persuasively shows how our current model reduces competitiveness of the economy “by favoring insiders over outsiders” and “continues to sap vitality” out of our economic life.

We haven’t always had this problem. Until the 1980s, candidates spent a fraction of their time talking to donors; just a few weeks a year, a little more right before an election. True, they’d fund raise from the wealthy interests, as they do now, but it was a minuscule part of their job: policy and constituent services were the heart of the work.

Read the entire story here.

Video: Money, money, money. ABBA. Courtesy of AbbaEVEO.

PhotoMash: Tax Credit Cuts and Robots

This week’s PhotoMash comes courtesy of the Guardian online news site. Its front page carried a photo of George Osborne, UK Chancellor of the Exchequer (Treasury Secretary), wondering how to cut tax credits, next to a story concluding that robots will take over a third of all UK manual jobs by 2030.

Photomash-Osborne_Cuts-Robot_Jobs

I dare you to find the real human above.

Images courtesy of the Guardian.

Social Media Lice

google-search-group-selfie

We know that social media helps us stay superficially connected to others. We also know many of the drawbacks — an over-inflated and skewed sense of self; poor understanding and reduced thoughtfulness; neurotic fear of missing out (FOMO); public shaming, online bullying and trolling.

But now we hear that one of the key foundations of social media — the taking and sharing of selfies — has more serious consequences. Social media has caused an explosion in head lice, especially among teenage girls. Call it: social media head lice syndrome. While this may cause you to scratch your head, whether in disbelief or for psychosomatic reasons, the mechanism of the outbreak is rather obvious. It goes like this: a group of teens needs a quick selfie fix; the teens crowd around the smartphone and pose; they lean in, heads together; and the head lice crawl from one scalp to the next.

From the Independent:

Selfies have sparked an explosion in the number of head lice cases among teenagers, a group of US paediatricians has warned.

The group said there is a growing trend of “social media lice”, in which lice spread when teenagers cram their heads together to take a selfie.

Lice cannot jump, so they are less common in older children, who do not tend to swap hats or headgear.

A Wisconsin paediatrician, Dr Sharon Rink, told local news channel WBAY2 she has seen a surge of teenagers coming to see her for treatment, something which was unheard of five years ago.

Dr Rink said: “People are doing selfies like every day, as opposed to going to photo booths years and years ago.

“So you’re probably having much more contact with other people’s heads.

“If you have an extremely itchy scalp and you’re a teenager, you might want to get checked out for lice instead of chalking it up to dandruff.”

In its official online guide to preventing the spread of head lice, the Centers for Disease Control recommends avoiding head-to-head contact where possible and suggests girls are more likely to get the parasite than boys because they tend to have “more frequent head-to-head contact”.

Read (and scratch) more here.

Image courtesy of Google Search.
