Back to the Future

Just over a hundred years ago, at the turn of the 20th century, Jean-Marc Côté and some of his fellow French artists were commissioned to imagine what the world would look like in 2000. Their colorful sketches and paintings portrayed some interesting inventions, though all seem grounded in familiar principles and incremental innovations — mechanical helpers, ubiquitous propellers and wings. Interestingly, none of these artist-futurists imagined a world beyond Victorian dress, gender inequality and wars. But these are gems nonetheless.

Some of their works found their way into cigar boxes and cigarette cases, others were exhibited at the 1900 World Exhibition in Paris. My three favorites: a Tailor of the Latest Fashion, the Aero-cab Station and the Whale Bus. See the full complement of these remarkable futuristic visions at the Public Domain Review, and check out the House Rolling Through the Countryside and At School.

I suspect our contemporary futurists — born in the late 20th or early 21st century — will fall prey to the same narrow visions when asked to sketch our planet in 3000. But despite the undoubted wealth of new gadgets and gizmos a thousand years from now, the real challenge will be to see whether their imagined worlds might be at peace, with equality for all.
Images courtesy of the Public Domain Review, a project of the Open Knowledge Foundation. Public Domain.

 

The American Dream: Socialism for the Rich Or Capitalism For All?

You know that something’s up when the Wall Street Journal begins running op-ed columns that question capitalism. Has even the WSJ now realized that American capitalism thrives by two sets of rules: one for the rich socialists, the crony capitalists who manipulate markets (and politics), invent loopholes, skirt regulation, and place enormous bets with others’ wealth; the other for the poor capitalists, who innovate, work hard and create tangible value?

Now even Bill Gates — the world’s richest citizen — tells us that only socialism can address climate change! It’s clear that the continued appeal of Bernie Sanders to those on the political left, and the likes of Ben Carson and that-other-guy-with-the-strange-hair-and-big-mouth-and-even-bigger-ego to those on the right, highlights significant public distaste for our societal inequality and political morass. At times I feel as if I’ve been transported to a parallel universe, a la 1Q84, where the 99 percent will rise and finally realize meaningful change through social and economic justice. Can it really happen?

Nah! It’ll never happen. The tentacles that connect politicians and their donors are too intertwined; the pathways that connect the billionaires, oligarchs, plutocrats and corporations to lobbyists to regulators to lawmakers are too well-protected, too ingrained. Until these links are broken the rich will continue to get richer and the poor will continue to dream. So, for the time being remember: the rich are just too big to fail.

From the WSJ:

If you want to find people who still believe in “the American dream”—the magnetic idea that anyone can build a better life for themselves and their families, regardless of circumstance—you might be best advised to travel to Mumbai. Half of the Indians in a recent poll agreed that “the next generation will probably be richer, safer and healthier than the last.”

The Indians are the most sanguine of the more than 1,000 adults in each of seven nations surveyed in early September by the market-research firm YouGov for the London-based Legatum Institute (with which I am affiliated). The percentage of optimists drops to 42 in Thailand, 39 in Indonesia, 29 in Brazil, 19 in the U.K. and 15 in Germany. But it isn’t old-world Britain or Germany that is gloomiest about the future. It is new-world America, where only 14% of those surveyed think that life will be better for their children, and 52% disagree.

The trajectory of the world doesn’t justify this pessimism. People are living longer on every continent. They’re doing less arduous, backbreaking work. Natural disasters are killing fewer people. Fewer crops are failing. Some 100,000 people are being lifted out of poverty every day, according to World Bank data.

Life is also getting better in the U.S., on multiple measures, but the survey found that 55% of Americans think the “rich get richer” and the “poor get poorer” under capitalism. Sixty-five percent agree that most big businesses have “dodged taxes, damaged the environment or bought special favors from politicians,” and 58% want restrictions on the import of manufactured goods.

Friends of capitalism cannot be complacent, however. The findings of the survey underline the extent to which people think that wealth creation is a dirty business. When big majorities in so many major nations think that big corporations behave unethically and even illegally, it is a system that is always vulnerable to attack from populist politicians.

John Mackey, the CEO of Whole Foods, has long worried about the sustainability of the free enterprise system if large numbers of voters come to think of businesses as “basically a bunch of psychopaths running around trying to line their own pockets.” If the public doesn’t think business is fundamentally good, he has argued, then business is inviting destructive regulation. If, by contrast, business shows responsibility to all its stakeholders—customers, employees, investors, suppliers and the wider community—“the impulse to regulate and control would be lessened.”

Read the entire column here.

Barbie the Surveillance Officer


There are probably any number of reasons that you, and your kids, may choose to steer clear of Barbie (the Mattel doll, that is). Detractors will point to a growing list of problems for which Barbie is to blame, including: gender stereotyping, body image distortion, vacuum cleaner accidents with her fake hair, eating disorders, and poor self-esteem. However, it may not have occurred to you that the latest incarnation of the doll — interactive Hello Barbie — could also be spying on you and your family. Could the CIA, NSA or MI5 be keeping tabs on you through your kid’s doll? Creepy, and oh, she’s still far too thin.

From the Guardian:

Mattel’s latest Wi-Fi enabled Barbie doll can easily be hacked to turn it into a surveillance device for spying on children and listening into conversations without the owner’s knowledge.

The Hello Barbie doll is billed as the world’s first “interactive doll” capable of listening to a child and responding via voice, in a similar way to Apple’s Siri, Google’s Now and Microsoft’s Cortana.

It connects to the internet via Wi-Fi and has a microphone to record children and send that information off to third parties for processing before responding in natural language.

But US security researcher Matt Jakubowski discovered that when connected to Wi-Fi the doll was vulnerable to hacking, allowing him easy access to the doll’s system information, account information, stored audio files and direct access to the microphone.

Jakubowski told NBC: “You can take that information and find out a person’s house or business. It’s just a matter of time until we are able to replace their servers with ours and have her say anything we want.”

Once Jakubowski took control of where the data was sent the snooping possibilities were apparent. The doll only listens in on a conversation when a button is pressed and the recorded audio is encrypted before being sent over the internet, but once a hacker has control of the doll the privacy features could be overridden.

It was the ease with which the doll was compromised that was most concerning. The information stored by the doll could allow hackers to take over a home Wi-Fi network and from there gain access to other internet connected devices, steal personal information and cause other problems for the owners, potentially without their knowledge.

Read the entire story here.

Image courtesy of Google Search.

A Positive Female Role Model


Our society does a better, but still poor, job of promoting positive female role models. Most of our — let’s face it — male-designed images of women fall into rather narrowly defined stereotypical categories: nurturing care-giver, stay-at-home soccer mom, matriarchal office admin, overly bossy middle-manager, vacuous reality-TV spouse or scantily clad vixen.

But every now and then the media seems to discover another unsung female who made significant contributions in a male-dominated and male-overshadowed world. Take the case of computer scientist Margaret Hamilton — she developed on-board flight software for the Apollo space program while director of the Software Engineering Division of the MIT Instrumentation Laboratory. Aside from developing technology that put people on the Moon, she helped NASA understand the true power of software and the consequences of software-driven technology.

From Wired:

Margaret Hamilton wasn’t supposed to invent the modern concept of software and land men on the moon. It was 1960, not a time when women were encouraged to seek out high-powered technical work. Hamilton, a 24-year-old with an undergrad degree in mathematics, had gotten a job as a programmer at MIT, and the plan was for her to support her husband through his three-year stint at Harvard Law. After that, it would be her turn—she wanted a graduate degree in math.

But the Apollo space program came along. And Hamilton stayed in the lab to lead an epic feat of engineering that would help change the future of what was humanly—and digitally—possible.

As a working mother in the 1960s, Hamilton was unusual; but as a spaceship programmer, Hamilton was positively radical. Hamilton would bring her daughter Lauren by the lab on weekends and evenings. While 4-year-old Lauren slept on the floor of the office overlooking the Charles River, her mother programmed away, creating routines that would ultimately be added to the Apollo’s command module computer.

“People used to say to me, ‘How can you leave your daughter? How can you do this?’” Hamilton remembers. But she loved the arcane novelty of her job. She liked the camaraderie—the after-work drinks at the MIT faculty club; the geek jokes, like saying she was “going to branch left minus” around the hallway. Outsiders didn’t have a clue. But at the lab, she says, “I was one of the guys.”

Then, as now, “the guys” dominated tech and engineering. Like female coders in today’s diversity-challenged tech industry, Hamilton was an outlier. It might surprise today’s software makers that one of the founding fathers of their boys’ club was, in fact, a mother—and that should give them pause as they consider why the gender inequality of the Mad Men era persists to this day.

As Hamilton’s career got under way, the software world was on the verge of a giant leap, thanks to the Apollo program launched by John F. Kennedy in 1961. At the MIT Instrumentation Lab where Hamilton worked, she and her colleagues were inventing core ideas in computer programming as they wrote the code for the world’s first portable computer. She became an expert in systems programming and won important technical arguments. “When I first got into it, nobody knew what it was that we were doing. It was like the Wild West. There was no course in it. They didn’t teach it,” Hamilton says.

This was a decade before Microsoft and nearly 50 years before Marc Andreessen would observe that software is, in fact, “eating the world.” The world didn’t think much at all about software back in the early Apollo days. The original document laying out the engineering requirements of the Apollo mission didn’t even mention the word software, MIT aeronautics professor David Mindell writes in his book Digital Apollo. “Software was not included in the schedule, and it was not included in the budget.” Not at first, anyhow.

Read the entire story here.

Image: Margaret Hamilton during her time as lead Apollo flight software designer. Courtesy NASA. Public Domain.

Forget The Millennials — It’s Time For Generation K

Blame fickle social scientists. After the baby-boomers the most researched generation has been that of the millennials — so-called due to their coming of age at the turn of the century. We know what millennials like to eat and drink, how they dress, their politics; we know about their proclivity for sharing, their need for meaning and fun at work; we know they need attention and constant feedback. In fact, we have learned so much — and perhaps so little — from the thousands of often-conflicting research studies of millennials that some researchers have decided to move on to new blood. Yes, it’s time to tap another rich vein of research material — Generation K. But I’ll stop after relating what the “K” means in Generation K, and let you form your own conclusions.

[tube]n-7K_OjsDCQ[/tube]

Generation K is named for Katniss, as in the Hunger Games‘ heroine Katniss Everdeen. That’s right: if you were born between 1995 and 2002, according to economist Noreena Hertz you are Gen-Katniss.

From the Guardian:

The brutal, bleak series that has captured the hearts of a generation will come to a brutal, bleak end in November when The Hunger Games: Mockingjay – Part 2 arrives in cinemas. It is the conclusion of the Hunger Games saga, which has immersed the young in a cleverly realised world of trauma, violence, mayhem and death.

For fans of Suzanne Collins’s trilogy about a young girl, Katniss Everdeen, forced to fight for survival in a country ruled by fear and fuelled by televised gladiatorial combat, this is the moment they have been waiting for.

Since the first book in the trilogy was published in 2008, Collins’s tale has sold more than 65 million copies in the US alone. The films, the first of which was released in 2012, have raked in more than $2bn worldwide at the box office and made a global star of their leading lady, Jennifer Lawrence, who plays the increasingly traumatised Katniss with a perfect mix of fury and resignation. For the huge appeal of The Hunger Games goes deeper than the fact that it’s an exciting tale well told. The generation who came to Katniss as young teens and have grown up ploughing through the books and queuing for the movies respond to her story in a particularly personal way.

As to why that might be, the economist and academic Noreena Hertz, who coined the term Generation K (after Katniss) for those born between 1995 and 2002, says that this is a generation riddled with anxiety, distrustful of traditional institutions from government to marriage, and, “like their heroine Katniss Everdeen, [imbued with] a strong sense of what is right and fair”.

“I think The Hunger Games resonates with them so much because they are Katniss navigating a dark and difficult world,” says Hertz, who interviewed 2,000 teenagers from the UK and the US about their hopes, fears and beliefs, concluding that today’s teens are shaped by three factors: technology, recession and coming of age in a time of great unease.

“This is a generation who grew up through 9/11, the Madrid bombings, the London bombings and Islamic State terrors. They see danger piped down their smartphones and beheadings on their Facebook page,” she says. “My data showed very clearly how anxious they are about everything from getting into debt or not getting a job, to wider issues such as climate change and war – 79% of those who took part in my survey worried about getting a job, 72% worried about debt, and you have to remember these are teenagers.

“In previous generations teenagers did not think in this way. Unlike the first-era millennials [who Hertz classes as those aged between 20 and 30] who grew up believing that the world was their oyster and ‘Yes we can’, this new generation knows the world is an unequal and harsh place.”

Writer and activist Laurie Penny, herself a first-era millennial at the age of 29, agrees. “I think what today’s young people have grasped that my generation didn’t get until our early 20s, is that adults don’t know everything,” she says. “They might be trying their best but they don’t always have your best interests at heart. The current generation really understands that – they’re more politically engaged and they have more sense of community because they’re able to find each other easily thanks to their use of technology.”

One of the primary appeals of the Hunger Games trilogy is its refusal to sugarcoat the scenarios Katniss finds herself in. In contrast to JK Rowling’s Harry Potter series, there are no reliable adult figures to dispense helpful advice and no one in authority she can truly trust (notably even the most likeable adult figures in the books tend to be flawed at best and fraudulent at worst). Even her friends may not always have her back, hard as they try – Dumbledore’s Army would probably find themselves taken out before they’d uttered a single counter-curse in the battlegrounds of Panem. At the end of the day, Katniss can only rely on one person, herself.

“Ultimately, the message of the Hunger Games is that everything’s not going to be OK,” says Penny. “One of the reasons Jennifer Lawrence is so good is because she lets you see that while Katniss is heroic, she’s also frightened all of the time. She spends the whole story being forced into situations she doesn’t want to be in. Kids respond because they can imagine what it’s like to be terrified but know that you have to carry on.”

It’s incontestable that we live in difficult times and that younger generations in particular may be more acutely aware that things aren’t improving any time soon, but is it a reach to say that fans of the Hunger Games are responding as much to the world around them as to the books?

Read the entire story here.

Video: The Hunger Games: Mockingjay Part 2 Official Trailer – “We March Together”. Courtesy of the Hunger Games franchise.

Perchance Art Thou Smitten by Dapper Hipsters? Verily Methinks

As the (mostly) unidirectional tide of cultural influence flows from the U.S. to the United Kingdom, the English mother tongue is becoming increasingly (and distressingly, I might add) populated by Americanisms: trash instead of rubbish, fries not chips, deplane instead of disembark, shopping cart instead of trolley, bangs rather than fringe, period instead of full stop. And there’s more: 24/7, heads-up, left-field, normalcy, a savings of, deliverable, the ask, winningest.

All, might I say, utterly cringeworthy.

Yet, there may be a slight glimmer of hope, and all courtesy of the hipster generation. Hipsters, you see, crave an authentic, artisanal experience — think goat cheese and bespoke hats — that also seems to embrace language. So, in 2015, compared with a mere decade earlier, you’re more likely to hear some of the following words, which would normally be more attributable to an archaic, even Shakespearean, era:

perchance, mayhaps, parlor, amidst, amongst, whilst, unbeknownst, thou, thee, ere, hath

I’m all for it. My only hope now is that these words will flow against the tide and into the U.S. to repair some of the previous linguistic deforestation. Methinks I’ll put some of these to immediate, good use.

From the Independent:

Hipsters are famous for their love of all things old-fashioned: 19th Century beards, pickle-making, Amish outerwear, naming their kids Clementine or Atticus. Now, they may be excavating archaic language, too.

As Chi Luu points out at JSTOR Daily — the blog of a database of academic journals, what could be more hipster than that? — old-timey words like bespoke, peruse, smitten and dapper appear to be creeping back into the lexicon.

This data comes from Google’s Ngram viewer, which charts the frequencies of words appearing in printed sources between 1800 and 2012.

Google’s Ngram shows that lots of archaic words appear to be resurfacing — including gems like perchance, mayhaps and parlor.

The same trend is visible for words like amongst, amidst, whilst and unbeknownst, which are archaic forms of among, amid, while and unknown.
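Under the hood, the quantity the Ngram viewer charts is simple: occurrences of a word in a given year, divided by the total number of tokens printed that year. Here is a minimal sketch of that calculation over a toy corpus (the function name and the tiny two-entry corpus are my own illustrative assumptions; the real viewer runs over the Google Books corpus):

```python
from collections import defaultdict

def ngram_frequency(corpus, word):
    """Relative frequency of `word` per year: occurrences divided by
    the total tokens seen that year -- the same normalization the
    Ngram viewer applies to its book corpus."""
    totals = defaultdict(int)  # tokens printed per year
    hits = defaultdict(int)    # occurrences of `word` per year
    for year, text in corpus:
        tokens = text.lower().split()
        totals[year] += len(tokens)
        hits[year] += tokens.count(word.lower())
    return {year: hits[year] / totals[year] for year in sorted(totals)}

# Toy two-"year" corpus showing 'whilst' displacing 'while'.
corpus = [
    (2005, "while the trend held while prices rose"),
    (2015, "whilst the trend held whilst prices rose"),
]
print(ngram_frequency(corpus, "whilst"))
# → {2005: 0.0, 2015: 0.2857142857142857}
```

Plot those per-year ratios for a word like whilst across two centuries of books and you have, in essence, the chart the Ngram viewer draws.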

Read the story in its entirety here.

Image courtesy of Google’s Ngram viewer / Independent.

Science, Politics and Experts


Nowhere is the prickly relationship between science and politics more evident than in the climate change debate. The skeptics, many of whom seem to reside right of center in Congress, disbelieve any and all causal links between human activity and global warming. The fossil-fuel-burning truckloads of data continue to show upward trends in all measures, from mean sea level and average temperature to more frequent severe weather and longer droughts. Yet the self-proclaimed, non-expert policy-makers in Congress continue to disbelieve the science, the data, the analysis and the experts.

But, could the tide be turning? The Republican Chair of the House Committee on Science, Space, and Technology, Texas Congressman Lamar Smith, wants to see the detail behind the ongoing analysis that shows an ever-warming planet; he’s actually interested in seeing the raw climate data. Joy, at last! Representative Smith has decided to become an expert, right? Wrong. He’s trawling around for evidence that might show tampering of data and biased peer-reviewed analysis — science, after all, is just one great, elitist conspiracy theory.

One has to admire the Congressman’s tenacity. He and his herd of climate-skeptic apologists will continue to fiddle while Rome ignites and burns. But I suppose the warming of our planet is a good thing for Congressman Smith and his disbelieving (in science) followers, for it may well portend the End of Days that they believe (in biblical prophecy) and desire so passionately.

Oh, and the fact that Congressman Lamar Smith is Chair of the House Committee on Science, Space, and Technology?! Well, that will have to remain the subject of another post. What next, Donald Trump as head of the ACLU?

From ars technica:

In his position as Chair of the House Committee on Science, Space, and Technology, Texas Congressman Lamar Smith has spent much of the last few years pressuring the National Science Foundation to ensure that it only funds science he thinks is worthwhile and “in the national interest.” His views on what’s in the national interest may not include the earth sciences, as Smith rejects the conclusions of climate science—as we saw first hand when we saw him speak at the Heartland Institute’s climate “skeptic” conference earlier this year.

So when National Oceanic and Atmospheric Administration (NOAA) scientists published an update to the agency’s global surface temperature dataset that slightly increased the short-term warming trend since 1998, Rep. Smith was suspicious. The armada of contrarian blog posts that quickly alleged fraud may have stoked these suspicions. But since, again, he’s the chair of the House Committee on Science, Space, and Technology, Rep. Smith was able to take action. He’s sent a series of requests to NOAA, which Ars obtained from Committee staff.

The requests started on July 14 when Smith wrote to the NOAA about the paper published in Science by Thomas Karl and his NOAA colleagues. The letter read, in part, “When corrections to scientific data are made, the quality of the analysis and decision-making is brought into question. The conclusions brought forth in this new study have lasting impacts and provide the basis for further action through regulations. With such broad implications, it is imperative that the underlying data and the analysis are made publicly available to ensure that the conclusions found and methods used are of the highest quality.”

Rep. Smith requested that the NOAA provide his office with “[a]ll data related to this study and the updated global datasets” along with the details of the analysis and “all documents and communications” related to part of that analysis.

In the publication at issue, the NOAA researchers had pulled in a new, larger database of weather station measurements and updated to the latest version of an ocean surface measurement dataset. The ocean data had new corrections for the different methods ships have used over the years to make temperature measurements. Most significantly, they estimated the difference between modern buoy stations and older thermometer-in-a-bucket measurements.

All the major temperature datasets go through revisions like these, as researchers are able to pull in more data and standardize disparate methods more effectively. Since the NOAA’s update, for example, NASA has pulled the same ocean temperature database into its dataset and updated its weather station database. The changes are always quite small, but they can sometimes alter estimates of some very short-term trends.

The NOAA responded to Rep. Smith’s request by pointing him to the relevant data and methods, all of which had already been publicly available. But on September 10, Smith sent another letter. “After review, I have additional questions related to the datasets used to adjust historical temperature records, as well as NOAA’s practices surrounding its use of climate data,” he wrote. The available data wasn’t enough, and he requested various subsets of the data—buoy readings separated out, for example, with both the raw and corrected data provided.

Read the entire story here.

Image: NOAA temperature record. Courtesy of NOAA.

Your Job is Killing You


Many of us complain about the daily stresses from our jobs and our bosses, even our coworkers. We even bemoan the morning commute and the work we increasingly bring back home to complete in the evening. Many of us can be heard to say, “This job is killing me!” Metaphorically, of course.

Well, researchers at Stanford and Harvard now find that in some cases your job is actually, quite literally, killing you. This may seem self-evident, but the data show that workers with less education are significantly more likely to be employed in jobs that are more stressful and dangerous, and have less healthy workplace practices. This, in turn, leads to a significantly lower average life span than that of workers with higher educational attainment. Researchers measured typical employment-related stressors such as unemployment, layoffs, absence of employer-subsidized health insurance, shift work, long working hours, job insecurity and work-family conflict. The less education a worker has, the more likely that she or he will suffer a greater burden from one or more of these stressors.

Looks like we’re gradually reverting to well-tested principles of Victorian worker exploitation. Check out more details from the study here.

From the Washington Post:

People often like to groan about how their job is “killing” them. Tragically, for some groups of people in the U.S., that statement appears to be true.

A new study by researchers at Harvard and Stanford has quantified just how much a stressful workplace may be shaving off of Americans’ life spans. It suggests that the amount of life lost to stress varies significantly for people of different races, educational levels and genders, and ranges up to nearly three years of life lost for some groups.

Past research has shown an incredible variation in life expectancy around the United States, depending on who you are and where you live. Mapping life expectancy around the nation by both county of residence and race, you can see that people in some parts of the U.S. live as many as 33 years longer on average than people in other parts of the country, the researchers say.

Those gaps appear to be getting worse, as the wealthy extend their life spans and other groups are stagnant. One study found that men and women with fewer than 12 years of education had life expectancies that were still on par with most adults in the 1950s and 1960s — suggesting the economic gains of the last few decades have gone mostly to more educated people. The financial crisis and subsequent recession, which put many people in economic jeopardy, may have worsened this effect.

There are lots of reasons that people with lower incomes and educations tend to have lower life expectancies: differences in access to health care, in exposure to air and water pollution, in nutrition and health care early in life, and in behaviors, such as smoking, exercise and diet. Past research has also shown that job insecurity, long hours, heavy demands at work and other stresses can also cut down on a worker’s life expectancy by taking a heavy toll on a worker’s health. (If you work in an office, here are some exercises you might try to prevent this.)

But researchers say this is the first study to look at the ways that a workplace’s influence on life expectancy specifically break down by racial and educational lines.

To do their analysis, they divided people into 18 different groups by race, education and sex. They then looked at 10 different workplace factors — including unemployment and layoffs, the absence of health insurance, shift work, long working hours, job insecurity and work-family conflict — and estimated the effect that each would have on annual mortality and life expectancy.

The data show that people with less education are much more likely to end up in jobs with more unhealthy workplace practices that cut down on one’s life span. People with the highest educational attainment were less affected by workplace stress than people with the least education, the study says.

Read the entire story here.

Image: Women mealtime at St Pancras workhouse, London. Courtesy: Peter Higginbothom. Licensed under Public Domain via Commons.

AIs and ICBMs

You know something very creepy is going on when robots armed with artificial intelligence (AI) engage in conversations about nuclear war and intercontinental ballistic missiles (ICBMs). This scene could be straight out of a William Gibson novel.

[tube]mfcyq7uGbZg[/tube]

Video: The BINA48 robot, created by Martine Rothblatt and Hanson Robotics, has a conversation with Siri. Courtesy of ars technica.

The Vicious Cycle of Stuff


Many of us in the West, and now increasingly in developing nations, are the guilty perpetrators of the seemingly never-ending cycle of consumption and accumulation. Yet for all the talk of sustainability, down-sizing, and responsible consumption we continue to gather, hoard and surround ourselves with more and more stuff.

From the Guardian:

The personal storage industry rakes in $22bn each year, and it’s only getting bigger. Why?

I’ll give you a hint: it’s not because vast nations of hoarders have finally decided to get their acts together and clean out the hall closet.

It’s also not because we’re short on space. In 1950 the average size of a home in the US was 983 square feet. Compare that to 2011, when American houses ballooned to an average size of 2,480 square feet – almost triple the size.

And finally, it’s not because of our growing families. This will no doubt come as a great relief to our helpful commenters who each week kindly suggest that for maximum environmental impact we simply stop procreating altogether: family sizes in the western world are steadily shrinking, from an average of 3.37 people in 1950 to just 2.6 today.

So, if our houses have tripled in size while the number of people living in them has shrunk, what, exactly, are we doing with all of this extra space? And why the billions of dollars tossed to an industry that was virtually nonexistent a generation or two ago?

Well, friends, it’s because of our stuff. What kind of stuff? Who cares! Whatever fits! Furniture, clothing, children’s toys (for those not fans of deprivation, that is), games, kitchen gadgets and darling tchotchkes that don’t do anything but take up space and look pretty for a season or two before being replaced by other, newer things – equally pretty and equally useless.

The simple truth is this: you can read all the books and buy all the cute cubbies and baskets and chalkboard labels, even master the life-changing magic of cleaning up – but if you have more stuff than you do space to easily store it, your life will be spent a slave to your possessions.

We shop because we’re bored, anxious, depressed or angry, and we make the mistake of buying material goods and thinking they are treats which will fill the hole, soothe the wound, make us feel better. The problem is, they’re not treats, they’re responsibilities and what we own very quickly begins to own us.

The second you open your wallet to buy something, it costs you – and in more ways than you might think. Yes, of course there’s the price tag and the corresponding amount of time it took you to earn that amount of money, but possessions also cost you space in your home and time spent cleaning and maintaining them. And as the token environmentalist in the room, I’d be remiss if I didn’t remind you that when you buy something, you’re also taking on the task of disposing of it (responsibly or not) when you’re done with it. Our addiction to consumption is a vicious one, and it’s stressing us out.

I know this because I’ve experienced it, having lived in everything from a four-bedroom house to my current one-bedroom flat I share with my daughter – but I’m also bringing some cold, hard science to the table.

A study published by UCLA showed that women’s stress hormones peaked during the times they were dealing with their possessions and material goods. Anyone who parks on the street because they can’t fit their car into the garage, or has stared down a crammed closet, can relate.

Our addiction to consuming is a vicious one, and it’s having a markedly negative impact on virtually every aspect of our lives.

Read the entire story here.

Image courtesy of Google Search.

Time for the Bucket List to Kick the Bucket

For the same reasons that New Year’s resolutions are daft, it’s time to ditch the bucket list. Columnist Steven Thrasher rightly argues that your actions to get something done or try something new should be driven by your gusto for life — passion, curiosity, wonder, joy — rather than dictated by a check box because you’re one step closer to death. Signs that it’s time to ditch the bucket list: when the idea is co-opted by corporations, advertisers and Hollywood; when motivational posters appear in hallways; and when physical bucket list buckets and notepads go on sale at Pottery Barn or Walmart.

From the Guardian:

Before each one of us dies, let’s wipe the “bucket list” from our collective vocabulary.

I hate the term “the bucket list.” The phrase, a list of things one wants to do in life before one dies or “kicks the bucket”, is the kind of hackneyed, clichéd, stupid and insipid term only we Americans can come up with.

Even worse, “the bucket list” has become an excuse for people to couch things they actually desire to try doing as only socially acceptable if framed in the face of their death. It’s as if pleasure, curiosity and fun weren’t reasons enough for action.

If you want to try doing something others might find strange or unorthodox – write a novel, learn to tap dance, engage in a rim job, field dress a deer, climb Everest, go out in drag for a night – why do you need any justification at all? And certainly, why would you need an explanation that is only justifiable in terms of kicking the bucket?

According to the Wall Street Journal, the phrase “bucket list” comes to us from the banal mind of screenwriter Justin Zackham, who developed a list of things he wanted to do before he died. Years later, his “bucket list” became the title of his corny 2007 film starring Jack Nicholson and Morgan Freeman. It’s about two old men with terminal cancer who want to live it up before they die. That, if anyone at all, is who should be using the term “bucket list”. They want to do something with the finite time they know they have left? Fine.

But bucket list has trickled down to everyday use by the perfectly healthy, the exceptionally young, and most of all, to douche bags. I realized this at Burning Man last week. Often, when I asked exceptionally boring people what had drawn them to Black Rock City, they’d say: “It was on my bucket list!”

Really? You wanted to schlep out to the desert and face freezing lows, scorching highs and soul crushing techno simply because you’re going to die someday?

There’s a funny dynamic sometimes when I go on a long trip while I’m out of work. When I backpacked through Asia and Europe in 2013, people (usually friends chained to a spouse, children and a mortgage) would sometimes awkwardly say to me: “Well, it will be the trip of a lifetime!” It was a good trip, but just one of many great journeys I’ve taken in my life so far. My adventures might interrupt someone else’s idea of what’s “normal.” But travel isn’t something I do to fulfil my “bucket list”; travel is a way of life for me. I do not rush into a trip thinking: “Good Christ, I could die tomorrow!” I don’t travel in place of the stable job or partner or kids I may or may not ever have. I do it as often as I can because it brings me joy.

Read the entire column here.

Titan Close Up

You could be forgiven for thinking the image below is of Earth. Rather, it is Saturn’s largest moon, Titan, as imaged in infra-red by NASA’s Cassini spacecraft on November 13, 2015. Gorgeous.

Titan-Cassini- flyby-13Nov2015

Image: Composite image shows an infrared view of Saturn’s moon Titan from NASA’s Cassini spacecraft, acquired during the mission’s “T-114” flyby on Nov. 13, 2015. Courtesy NASA.

The 75 Percent Versus 1 Percent

Stop the presses! Hold your horses! There seems to be some hope for humanity after all — and I was just about to seek out a misanthrope-approved cave in which to hide.

A recent study by Common Cause shows that three-quarters of one thousand people surveyed identify more closely with unselfish values (altruism, forgiveness, honesty) than selfish ones (money, fame, power). But, as George Monbiot points out, those in the 1 percent who run the globe tend to be the selfish ones. Also, he’s quite right to propose that we’d all be better served if the media apparatchiks who fawn upon the 1 percent spent more time delving into the stories of those who give, rather than take.

From the Guardian:

Do you find yourself thrashing against the tide of human indifference and selfishness? Are you oppressed by the sense that while you care, others don’t? That, because of humankind’s callousness, civilisation and the rest of life on Earth are basically stuffed? If so, you are not alone. But neither are you right.

A study by the Common Cause Foundation, due to be published next month, reveals two transformative findings. The first is that a large majority of the 1,000 people they surveyed – 74% – identifies more strongly with unselfish values than with selfish values. This means that they are more interested in helpfulness, honesty, forgiveness and justice than in money, fame, status and power. The second is that a similar majority – 78% – believes others to be more selfish than they really are. In other words, we have made a terrible mistake about other people’s minds.

The revelation that humanity’s dominant characteristic is, er, humanity will come as no surprise to those who have followed recent developments in behavioural and social sciences. People, these findings suggest, are basically and inherently nice.

A review article in the journal Frontiers in Psychology points out that our behaviour towards unrelated members of our species is “spectacularly unusual when compared to other animals”. While chimpanzees might share food with members of their own group, though usually only after being plagued by aggressive begging, they tend to react violently towards strangers. Chimpanzees, the authors note, behave more like the homo economicus of neoliberal mythology than people do.

Humans, by contrast, are ultrasocial: possessed of an enhanced capacity for empathy, an unparalleled sensitivity to the needs of others, a unique level of concern about their welfare, and an ability to create moral norms that generalise and enforce these tendencies.

Such traits emerge so early in our lives that they appear to be innate. In other words, it seems that we have evolved to be this way. By the age of 14 months, children begin to help each other, for example by handing over objects another child can’t reach. By the time they are two, they start sharing things they value. By the age of three, they start to protest against other people’s violation of moral norms.

A fascinating paper in the journal Infancy reveals that reward has nothing to do with it. Three- to five-year-olds are less likely to help someone a second time if they have been rewarded for doing it the first time. In other words, extrinsic rewards appear to undermine the intrinsic desire to help. (Parents, economists and government ministers, please note.) The study also discovered that children of this age are more inclined to help people if they perceive them to be suffering, and that they want to see someone helped whether or not they do it themselves. This suggests that they are motivated by a genuine concern for other people’s welfare, rather than by a desire to look good.

Why? How would the hard logic of evolution produce such outcomes? This is the subject of heated debate. One school of thought contends that altruism is a logical response to living in small groups of closely related people, and evolution has failed to catch up with the fact that we now live in large groups, mostly composed of strangers.

Another argues that large groups containing high numbers of altruists will outcompete large groups which contain high numbers of selfish people. A third hypothesis insists that a tendency towards collaboration enhances your own survival, regardless of the group in which you might find yourself. Whatever the mechanism might be, the outcome should be a cause of celebration.

So why do we retain such a dim view of human nature? Partly, perhaps, for historical reasons. Philosophers from Hobbes to Rousseau, Malthus to Schopenhauer, whose understanding of human evolution was limited to the Book of Genesis, produced persuasive, influential and catastrophically mistaken accounts of “the state of nature” (our innate, ancestral characteristics). Their speculations on this subject should long ago have been parked on a high shelf marked “historical curiosities”. But somehow they still seem to exert a grip on our minds.

Another problem is that – almost by definition – many of those who dominate public life have a peculiar fixation on fame, money and power. Their extreme self-centredness places them in a small minority, but, because we see them everywhere, we assume that they are representative of humanity.

The media worships wealth and power, and sometimes launches furious attacks on people who behave altruistically. In the Daily Mail last month, Richard Littlejohn described Yvette Cooper’s decision to open her home to refugees as proof that “noisy emoting has replaced quiet intelligence” (quiet intelligence being one of his defining qualities). “It’s all about political opportunism and humanitarian posturing,” he theorised, before boasting that he doesn’t “give a damn” about the suffering of people fleeing Syria. I note with interest the platform given to people who speak and write as if they are psychopaths.

Read the entire story here.

Wot! Proper Grammar?

It seems that there are several ways to turn off a potential dating connection online: a picture of your bad teeth, tales of your poor hygiene, political posturing, and now, a poorly written profile or introductory email. Is our children learning?

Seriously, can it be that the younger generation is finally rebelling against the tyranny of lowercase Twitteresque, incorrect punctuation, nonsensical grammar, fatuous emoticons and facile abbreviations? If so, this is wonderful news for those who care about our language. Now, perhaps, these same people can turn their talents to educating the barely literate generations holding jobs in corporate America. After decades of subservience to fractured PowerPoint haiku, many can no longer string together a coherent paragraph.

From the WSJ:

When Jeff Cohen was getting ready to meet his OkCupid date for drinks in Manhattan, he started to have second thoughts as he reread the glaring grammatical error in her last message: “I will see you their.”

The date flopped for a couple of reasons, but bad grammar bothers Mr. Cohen. Learning a potential mate doesn’t know the difference between “there,” “they’re” and “their” is like discovering she loves cats, he says. Mr. Cohen is allergic to cats. “It’s like learning I’m going to sneeze every time I see her,” he says.

With crimes against grammar rising in the age of social media, some people are beginning to take action. The online dating world is a prime battleground.

Mr. Cohen joins a number of singles picky about the grammar gaffes they’re seeing on dating sites. For love, these folks say written communications matter, from the correct use of semicolons, to understanding the difference between its and it’s, and sentences built on proper parallel construction.

“Grammar snobbery is one of the last permissible prejudices,” says John McWhorter, a linguistics professor at Columbia University. “The energy that used to go into open classism and racism now goes into disparaging people’s grammar.”

Mr. Cohen now uses an app that ranks the message quality of prospective dates. Called the Grade, the app checks messages for typos and grammar errors and assigns each user a letter grade from A+ to F.

The Grade demotes people whose messages contain certain abbreviations, like “wassup” and “YOLO,” short for “You Only Live Once,” popular among young people who want to justify doing something risky or indulgent. Clifford Lerner, chief executive of SNAP Interactive Inc., the company that makes the Grade, says the app downgrades these types of phrases in an effort to promote “meaningful conversations.”

Dating site Match asked more than 5,000 singles in the U.S. what criteria they used most in assessing dates. Beyond personal hygiene—which 96% of women valued most, as compared with 91% of men—singles said they judged a date foremost by the person’s grammar. The survey found 88% of women and 75% of men said they cared about grammar most, putting it ahead of a person’s confidence and teeth.

“When you get a message that is grammatically correct and has a voice and is put together, it is very attractive, it definitely adds hotness points,” says New Yorker Grace Gold. “People who send me text-type messages, and horrific grammatical errors? I just delete them.” She recalls the red flag raised by one potential suitor who had written his entire dating profile in lowercase.

Language has always played a part in how people judge others, but it has become amplified in recent years with increasing informal and colloquial usage, says Ben Zimmer, a lexicographer and chair of the New Words Committee of the American Dialect Society.

Read the entire story here.

PhotoMash: Two Types of Radical

Photomash-Radical-1-vs-Radical-2

Meet two faces of radicalism: one is the face of radical Islam; the second is the face of radical nationalism. Different, but similar, and both morally bankrupt.

Both have ideas that resonate with a very limited few (luckily for the rest of us); both inflame our discourse; both fuel hatred, distrust and intolerance; both project fear, racism, xenophobia and misogyny. Welcome to the new faces of fascism.

As a Londoner recently said of an attacker (reportedly belonging to the first type of radical group): #YouAintNoMuslimBruv.

I’d suggest to our second radical: #YouAintNoAmericanBro.

Both of these nightmarish visions seek a place on the world stage — both should and will rightly fail.

Image courtesy of the Washington Post, December 7, 2015.

Neutrinos in the News

Something’s up. Perhaps there’s some degree of hope that we may be reversing the tide of “dumbeddownness” in the stories that the media pumps through its many tubes to reach us. So, it comes as a welcome surprise to see articles about the very, very small making big news in publications like the New Yorker. Stories about neutrinos, no less. Thank you, New Yorker, for dumbing us up. And, kudos to the latest Nobel laureates — Takaaki Kajita and Arthur B. McDonald — for helping us understand just a little bit more about our world.

From the New Yorker:

This week the 2015 Nobel Prize in Physics was awarded jointly to Takaaki Kajita and Arthur B. McDonald for their discovery that elementary particles called neutrinos have mass. This is, remarkably, the fourth Nobel Prize associated with the experimental measurement of neutrinos. One might wonder why we should care so much about these ghostly particles, which barely interact with normal matter.

Even though the existence of neutrinos was predicted in 1930, by Wolfgang Pauli, none were experimentally observed until 1956. That’s because neutrinos almost always pass through matter without stopping. Every second of every day, more than six trillion neutrinos stream through your body, coming directly from the fiery core of the sun—but most of them go right through our bodies, and the Earth, without interacting with the particles out of which those objects are made. In fact, on average, those neutrinos would be able to traverse more than one thousand light-years of lead before interacting with it even once.

The very fact that we can detect these ephemeral particles is a testament to human ingenuity. Because the rules of quantum mechanics are probabilistic, we know that, even though almost all neutrinos will pass right through the Earth, a few will interact with it. A big enough detector can observe such an interaction. The first detector of neutrinos from the sun was built in the nineteen-sixties, deep within a mine in South Dakota. An area of the mine was filled with a hundred thousand gallons of cleaning fluid. On average, one neutrino each day would interact with an atom of chlorine in the fluid, turning it into an atom of argon. Almost unfathomably, the physicist in charge of the detector, Raymond Davis, Jr., figured out how to detect these few atoms of argon, and, four decades later, in 2002, he was awarded the Nobel Prize in Physics for this amazing technical feat.

Because neutrinos interact so weakly, they can travel immense distances. They provide us with a window into places we would never otherwise be able to see. The neutrinos that Davis detected were emitted by nuclear reactions at the very center of the sun, escaping this incredibly dense, hot place only because they so rarely interact with other matter. We have been able to detect neutrinos emerging from the center of an exploding star more than a hundred thousand light-years away.

But neutrinos also allow us to observe the universe at its very smallest scales—far smaller than those that can be probed even at the Large Hadron Collider, in Geneva, which, three years ago, discovered the Higgs boson. It is for this reason that the Nobel Committee decided to award this year’s Nobel Prize for yet another neutrino discovery.

Read the entire story here.

PhotoMash: Two Kinds of Monster, One Real

I couldn’t resist this week’s photo mash-up. This one comes courtesy of the Guardian on December 3, 2015. It features two types of monster very aptly placed alongside each other by a kindly newspaper editor.

Photomash-Trump-vs-Monsters

The first monster happens to want to be President of the United States. He seems to be a racist, misogynist and raving bigot, and unfortunately (for some), he’s very real. The second is the story of photographic artist Flora Borsi. She’s tired of perfect models with perfect hair in perfect fashion photographs. So, she retouches them, or in her words “detouches” the images into her “little monsters”. These are not real.

Our real world can be rather surreal.

Images courtesy of the Guardian.

The US and the UK: A Stark Difference

Terrorism-US-3Dec2015

Within the space of a few days we’ve witnessed two more acts of atrocious violence and murder. One in San Bernardino, California, the other in London, England.

In California, 14 innocent people lost their lives and, by some accounts, 21 people were injured, and of course many hundreds of police officers and first-responders put their lives at risk in searching for and confronting the murderers.

In London, 3 people were injured, one seriously, by an attacker on the London Underground (subway).

Terrorism-UK-6Dec2015

 

Label these attacks acts of terrorism; acts of deranged minds. But, whether driven by warped ideologies or mental health issues, the murder and violence in California and London show one very stark difference.

Guns. Lots of guns.

The attackers in California were armed to the teeth: handguns, semi-automatic weapons and thousands of rounds of ammunition. The attacker in London was wielding a knife. You see, terrorism, violent radicalism and mental health problems exist — to much the same extent — in both the US and the UK (and across the globe, for that matter). But more often than not the outcome will be rather different — that is, more bloody and deadly — in the US because of access to weapons that conveniently facilitate mass murder.

And, sadly, until a significant proportion of the US population comes to terms with this fact, rather than hiding behind a distorted interpretation of the 2nd Amendment, the carnage and mass murder — in the US — will continue.

 

Monarchy: Bad. Corporations and Oligarchs: Good

Google-search-GOP-candidates

The Founders of the United States had an inkling that federated democracy could not belong to all the people — hence they inserted the Electoral College. Yet they tried hard to design a system that improved upon the injustice and corruption of hereditary power. But while they understood the dangers of autocratic British monarchy, they utterly failed to anticipate the role of corporations and vast sums of money in delivering much the same experience a couple of centuries later.

Ironically enough, all of Europe’s monarchies have given way to parliamentary democracies, which are less likely to be ruled or controlled through financial puppeteering. In the United States, on the other hand, the once shining beacon of democracy is firmly in the grip of corporations, political action committees (PACs) and a handful of oligarchs awash in money, and lots of it. They control the discourse. They filter the news. They vet and anoint candidates; and destroy their foes. They shape and make policy. They lobby and “pay” lawmakers. They buy and aggregate votes. They now define and run the system.

But, of course, our corporations and billionaires are not hereditary aristocrats — they’re ordinary people with our interests at heart — according to the U.S. Supreme Court. So, all must be perfect and good, especially for those who subscribe to the constructionist view of the US Constitution.

From the Guardian:

To watch American politics today is to watch money speaking. The 2016 US elections will almost certainly be the most expensive in recent history, with total campaign expenditure exceeding the estimated $7bn (£4.6bn) splurged on the 2012 presidential and congressional contests. Donald Trump is at once the personification of this and the exception that proves the rule because – as he keeps trumpeting – at least it’s his own money. Everyone else depends on other people’s, most of it now channelled through outside groups such as “Super PACs” – political action committees – which are allowed to raise unlimited amounts from individuals and corporations.

The sums involved dwarf those in any other mature democracy. Already, during the first half of 2015, $400m has been raised, although the elections are not till next autumn. Spending on television advertising is currently projected to reach $4.4bn over the whole campaign. For comparison, all candidates and parties in Britain’s 2010 election spent less than £46m. In Canada’s recent general election the law allowed parties to lay out a maximum of about C$25m (£12.5m) for the first 37 days of an election campaign, plus an extra C$685,185 (to be precise) for each subsequent day.

Rejecting a challenge to such campaign finance regulation back in 2004, the Canadian supreme court argued that “individuals should have an equal opportunity to participate in the electoral process”, and that “wealth is the main obstacle to equal participation”. “Where those having access to the most resources monopolise the election discourse,” it explained, “their opponents will be deprived of a reasonable opportunity to speak and be heard.”

The US supreme court has taken a very different view. In its 2010 Citizens United judgment it said, in effect, that money has a right to speak. Specifically, it affirmed that a “prohibition on corporate independent expenditures is … a ban on speech”. As the legal scholar Robert Post writes, in a persuasive demolition of the court’s reasoning, “this passage flatly equates the first amendment rights of ordinary commercial corporations with those of natural persons”. (Or, as the former presidential candidate Mitt Romney put it in response to a heckler: “Corporations are people, my friend.”)

In a book entitled Citizens Divided, Post demonstrates how the Citizens United judgment misunderstands the spirit and deeper purpose of the first amendment: for people to be best equipped to govern themselves they need not just the freedom of political speech, but also the “representative integrity” of the electoral process.

Of course, an outsize role for money in US politics is nothing new. Henry George, one of the most popular political economists of his day, wrote in 1883 that “popular government must be a sham and a fraud” so long as “elections are to be gained by the use of money, and cannot be gained without it”. Whether today’s elections are so easily to be gained by the use of money is doubtful, when so much of it is sloshing about behind so many candidates, but does anyone doubt the “cannot be gained without it”?

Money may have been shaping US politics for some time, but what is new is the scale and unconstrained character of the spending, since the 2010 Citizens United decision and the Super PACs that it (and a subsequent case in a lower court) enabled. Figures from the Center for Responsive Politics show outside spending in presidential campaign years rising significantly in 2004 and 2008 but then nearly trebling in 2012 – and, current trends suggest, we ain’t seen nothing yet.

The American political historian Doris Kearns Goodwin argues that the proliferation of Republican presidential candidates, so many that they won’t even fit on the stage for one television debate, is at least partly a result of the ease with which wealthy individuals and businesses can take a punt on their own man – or Carly Fiorina. A New York Times analysis found that around 130 families and their businesses accounted for more than half the money raised by Republican candidates and their Super PACs up to the middle of this year. (Things aren’t much better on the Democrat side.) And Goodwin urges her fellow citizens to “fight for an amendment to undo Citizens United”.

The Harvard law professor and internet guru Larry Lessig has gone a step further, himself standing for president on the single issue of cleaning up US politics, with a draft citizen equality act covering voter registration, gerrymandering, changing the voting system and reforming campaign finance. That modest goal achieved, he will resign and hand over the reins to his vice-president. Earlier this year he said he would proceed if he managed to crowdfund more than $1m, which he has done. Not peanuts for you or me, but Jeb Bush’s Super PAC, Right to Rise, is planning to spend $37m on television ads before the end of February next year. So one of the problems of the campaign for campaign finance reform is … how to finance its campaign.

Read the entire story here.

Image courtesy of Google Search.

H2O and IQ

There is great irony in NASA’s recent discovery of water flowing on Mars.

First, that the gift of our intelligence allows us to make such amazing findings on other worlds while we use the same brain cells to enable the rape and pillage of our own.

CADrought-LakeOroville

Second, the meager seasonal trickles of liquid on the martian surface show us a dire possible future for our own planet.

Mars-Recurring-Slope-Lineae

From the Guardian:

Evidence for flowing water on Mars: this opens up the possibility of life, of wonders we cannot begin to imagine. Its discovery is an astonishing achievement. Meanwhile, Martian scientists continue their search for intelligent life on Earth.

We may be captivated by the thought of organisms on another planet, but we seem to have lost interest in our own. The Oxford Junior Dictionary has been excising the waymarks of the living world. Adders, blackberries, bluebells, conkers, holly, magpies, minnows, otters, primroses, thrushes, weasels and wrens are now surplus to requirements.

In the past four decades, the world has lost 50% of its vertebrate wildlife. But across the latter half of this period, there has been a steep decline in media coverage. In 2014, according to a study at Cardiff University, there were as many news stories broadcast by the BBC and ITV about Madeleine McCann (who went missing in 2007) as there were about the entire range of environmental issues.

Think of what would change if we valued terrestrial water as much as we value the possibility of water on Mars. Only 3% of the water on this planet is fresh; and of that, two-thirds is frozen. Yet we lay waste to the accessible portion. Sixty per cent of the water used in farming is needlessly piddled away by careless irrigation. Rivers, lakes and aquifers are sucked dry, while what remains is often so contaminated that it threatens the lives of those who drink it. In the UK, domestic demand is such that the upper reaches of many rivers disappear during the summer. Yet still we install clunky old toilets and showers that gush like waterfalls.

As for salty water, of the kind that so enthrals us when apparently detected on Mars, on Earth we express our appreciation with a frenzy of destruction. A new report suggests fish numbers have halved since 1970. Pacific bluefin tuna, which once roamed the seas in untold millions, have been reduced to an estimated 40,000, yet still they are pursued. Coral reefs are under such pressure that most could be gone by 2050. And in our own deep space, our desire for exotic fish rips through a world scarcely better known to us than the red planet’s surface. Trawlers are now working at depths of 2,000 metres. We can only guess at what they could be destroying.

A few hours before the Martian discovery was announced, Shell terminated its Arctic oil prospecting in the Chukchi Sea. For the company’s shareholders, it’s a minor disaster: the loss of $4bn; for those who love the planet and the life it sustains, it is a stroke of great fortune. It happened only because the company failed to find sufficient reserves. Had Shell succeeded, it would have exposed one of the most vulnerable places on Earth to spills, which are almost inevitable where containment is almost impossible. Are we to leave such matters to chance?

At the beginning of September, two weeks after he granted Shell permission to drill in the Chukchi Sea, Barack Obama travelled to Alaska to warn Americans about the devastating effects that climate change caused by the burning of fossil fuels could catalyse in the Arctic. “It’s not enough just to talk the talk”, he told them. “We’ve got to walk the walk.” We should “embrace the human ingenuity that can do something about it”. Human ingenuity is on abundant display at Nasa, which released those astounding images. But not when it comes to policy.

Let the market decide: this is the way in which governments seek to resolve planetary destruction. Leave it to the conscience of consumers, while that conscience is muted and confused by advertising and corporate lies. In a near-vacuum of information, we are each left to decide what we should take from other species and other people, what we should allocate to ourselves or leave to succeeding generations. Surely there are some resources and some places – such as the Arctic and the deep sea – whose exploitation should simply stop?

Read the entire article here.

Images: Lake Oroville, California, Earth, courtesy of U.S. Drought Portal. Recurring slope lineae, Mars, courtesy of NASA/JPL.

Just Another Ordinary Day

A headline from December 2, 2015, this one courtesy of the Washington Post, says it all.

mass-shooting-headline-2Dec2015

How many US citizens will be murdered using a gun this year? 32,000? 33,000?

At some point we — the US citizens — will become refugees from this incessant and senseless slaughter. And our so-called leaders will continue to cower and fiddle, and abrogate one of the most fundamental responsibilities of government — to keep citizens safe.

Politicians who refuse to address this issue with meaningful background checks, meaningful control of assault weapons, and meaningful research into gun violence should be thoroughly ashamed. They do a disservice to the public, but especially to the police and other first responders who have to place themselves between us and the constant hail of gunfire.

 

PhotoMash: Climate Skeptic and Climate Science

Aptly, today’s juxtaposition of stories comes from the Washington Post. One day into the COP21 UN climate change conference in Paris, France, US House of Representatives’ science committee chair Lamar Smith is still at it. He’s a leading climate change skeptic, an avid opponent of NOAA (the National Oceanic and Atmospheric Administration) and self-styled overlord of the National Science Foundation (NSF). While Representative Smith seeks to politicize and skewer science, intimidate scientists and trample on funding for climate science research (and other types of basic science funding), our planet continues to warm.

Photomash-Climate-Skeptic-Climate-Facts

If you’re an open-minded scientist, or just concerned about our planet, this is not good.

So, it’s rather refreshing to see Representative Smith alongside a story showing that the month of December could be another temperature record breaker — the warmest on record for the northern tier of the continental US.

Images courtesy of the Washington Post, November 30, 2015.