Baroness Thatcher and the Media Baron

The cozy yet fraught relationship between politicians and powerful figures in the media has been with us since the first days of newsprint. It’s a delicate symbiosis of sorts — the politician needs the media magnate to help acquire and retain power; the media baron needs the politician to shape and centralize it. The underlying motivations seem similar for both parties, hence the symbiosis — self-absorption, power, vanity.

So, it comes as no surprise to read intimate details of the symbiotic Rupert Murdoch / Margaret Thatcher years. Prime Minister Thatcher would sometimes actively, but often surreptitiously, support Murdoch’s megalomaniacal desire to corner the UK (and global) media, while Murdoch would ensure his media outlets channeled appropriately Thatcher-friendly news, spin and op-ed. But the Thatcher-Murdoch story is just one of the latest in a long line of business deals between puppet and puppet-master [you may decide which is which, dear reader]. Over the last hundred years we’ve had William Randolph Hearst and Roosevelt, Lloyd George and Northcliffe, Harold Wilson and Robert Maxwell, Baldwin and Beaverbrook.

Thomas Jefferson deplored newspapers — seeing them as vulgar and cancerous. His prescient analysis of the troubling and complex relationship between the news and politics is just as valid today: “an evil for which there is no remedy; our liberty depends on the freedom of the press, and this cannot be limited without being lost”.

Yet for all the grievous faults and dubious shenanigans of the brutish media barons and their fickle political spouses, the Thatcher-Murdoch story is perhaps not as sinister as one might first think. We now live in an age where faceless corporations and billionaires broker political power and shape policy behind mountains of money, obfuscated institutions and closed doors. This is far more troubling for our democracies. I would rather fight an evil that has a face.

From the Guardian:

The coup that transformed the relationship between British politics and journalism began at a quiet Sunday lunch at Chequers, the official country retreat of the prime minister, Margaret Thatcher. She was trailing in the polls, caught in a recession she had inherited, eager for an assured cheerleader at a difficult time. Her guest had an agenda too. He was Rupert Murdoch, eager to secure her help in acquiring control of nearly 40% of the British press.

Both parties got what they wanted.

The fact that they met at all, on 4 January 1981, was vehemently denied for 30 years. Since their lie was revealed, it has been possible to uncover how the greatest extension of monopoly power in modern press history was planned and executed with such furtive brilliance.

All the wretches in the subsequent hacking sagas – the predators in the red-tops, the scavengers and sleaze merchants, the blackmailers and bribers, the liars, the bullies, the cowed politicians and the bent coppers – were but the detritus of a collapse of integrity in British journalism and political life. At the root of the cruelties and extortions exposed in the recent criminal trials at the Old Bailey, was Margaret Thatcher’s reckless engorgement of the media power of her guest that January Sunday. The simple genesis of the hacking outrages is that Murdoch’s News International came to think it was above the law, because it was.

Thatcher achieved much as a radical prime minister confronted by political turmoil and economic torpor. So did Murdoch, in his liberation of British newspapers from war with the pressroom unions, and by wresting away the print unions’ monopoly of access to computer technology. I applauded his achievements, and still do, as I applauded many of Thatcher’s initiatives when I chaired the editorial boards of the Sunday Times (1967-81) and then the Times (1981-2). It is sad that her successes are stained by recent evidence of her readiness to ensure sunshine headlines for herself in the Murdoch press (especially when it was raining), at a heavy cost to the country. She enabled her guest to avoid a reference to the Monopolies and Mergers Commission, even though he already owned the biggest-selling daily newspaper, the Sun, and the biggest selling Sunday newspaper, the News of the World, and was intent on acquiring the biggest-selling quality weekly, the Sunday Times, and its stablemate, the Times. 

 Times Newspapers had long cherished their independence. In 1966, when the Times was in financial difficulty, the new owner who came to the rescue, Lord Roy Thomson of Fleet, promised to sustain it as an independent non-partisan newspaper – precisely how he had conducted the profitable Sunday Times. Murdoch was able to acquire both publications in 1981 only because he began making solemn pledges that he would maintain the tradition of independence. He broke every one of those promises in the first years. His breach of the undertakings freely made for Times Newspapers was a marked contrast with the independent journalism we at the Sunday Times (and William Rees-Mogg at the Times) had enjoyed under the principled ownership of the Thomson family. Thatcher was a vital force in reviving British competitiveness, but she abetted a concentration of press power that became increasingly arrogant and careless of human dignity in ways that would have appalled her, had she remained in good health long enough to understand what her actions had wrought.

Documents released by the Thatcher Archive Trust, now housed at Churchill College, Cambridge, give the lie to a litany of Murdoch-Thatcher denials about collusion during the bidding for Times Newspapers. They also expose a crucial falsehood in the seventh volume of The History of the Times: The Murdoch Years – the official story of the newspaper from 1981-2002, published in 2005 by the Murdoch-owned HarperCollins. In it Graham Stewart wrote, in all innocence, that Murdoch and Thatcher “had no communication whatsoever during the period in which the Times bid and presumed referral to the Monopolies and Mergers Commission was up for discussion”.

Read the entire story here.

 

Marketing of McGod

[Image: Google search results for church logos.]

Many churches now have their own cool logos. All of the large churches and mega-churches have their own well-defined brands and well-oiled marketing departments. Clearly, God is not doing enough to disseminate his (or her) message — God needs help from ad agencies and marketing departments. Modern-day evangelism is not only a big business, it’s now a formalized business process, with key objectives, market-share drivers, growth strategies, metrics and key performance indicators (KPIs) — just like any other corporate franchise.

But some Christians believe that there is more (or, actually, less) to their faith than neo-evangelical brands like Vine, Gather, Vertical or Prime. So, some are shunning these houses of “worshipfotainment” [my invention, dear reader] with their high production values and edgy programming; they are forgoing mega-screens with Jesus-PowerPoint and heavenly lasers, lattes in the lobby and hip Christian metal. A millennial tells of disillusionment with the McChurch — its evangelical shallowness and exclusiveness.

From the Washington Post:

Bass reverberates through the auditorium floor as a heavily bearded worship leader pauses to invite the congregation, bathed in the light of two giant screens, to tweet using #JesusLives. The scent of freshly brewed coffee wafts in from the lobby, where you can order macchiatos and purchase mugs boasting a sleek church logo. The chairs are comfortable, and the music sounds like something from the top of the charts. At the end of the service, someone will win an iPad.

This, in the view of many churches, is what millennials like me want. And no wonder pastors think so. Church attendance has plummeted among young adults. In the United States, 59 percent of people ages 18 to 29 with a Christian background have, at some point, dropped out. According to the Pew Forum on Religion & Public Life, among those of us who came of age around the year 2000, a solid quarter claim no religious affiliation at all, making my generation significantly more disconnected from faith than members of Generation X were at a comparable point in their lives and twice as detached as baby boomers were as young adults.

In response, many churches have sought to lure millennials back by focusing on style points: cooler bands, hipper worship, edgier programming, impressive technology. Yet while these aren’t inherently bad ideas and might in some cases be effective, they are not the key to drawing millennials back to God in a lasting and meaningful way. Young people don’t simply want a better show. And trying to be cool might be making things worse.

 You’re just as likely to hear the words “market share” and “branding” in church staff meetings these days as you are in any corporate office. Megachurches such as Saddleback in Lake Forest, Calif., and Lakewood in Houston have entire marketing departments devoted to enticing new members. Kent Shaffer of ChurchRelevance.com routinely ranks the best logos and Web sites and offers strategic counsel to organizations like Saddleback and LifeChurch.tv.

Increasingly, churches offer sermon series on iTunes and concert-style worship services with names like “Vine” or “Gather.” The young-adult group at Ed Young’s Dallas-based Fellowship Church is called Prime, and one of the singles groups at his father’s congregation in Houston is called Vertical. Churches have made news in recent years for giving away tablet computers, TVs and even cars at Easter. Still, attendance among young people remains flat.

Recent research from Barna Group and the Cornerstone Knowledge Network found that 67 percent of millennials prefer a “classic” church over a “trendy” one, and 77 percent would choose a “sanctuary” over an “auditorium.” While we have yet to warm to the word “traditional” (only 40 percent favor it over “modern”), millennials exhibit an increasing aversion to exclusive, closed-minded religious communities masquerading as the hip new places in town. For a generation bombarded with advertising and sales pitches, and for whom the charge of “inauthentic” is as cutting an insult as any, church rebranding efforts can actually backfire, especially when young people sense that there is more emphasis on marketing Jesus than actually following Him. Millennials “are not disillusioned with tradition; they are frustrated with slick or shallow expressions of religion,” argues David Kinnaman, who interviewed hundreds of them for Barna Group and compiled his research in “You Lost Me: Why Young Christians Are Leaving Church … and Rethinking Faith.”

My friend and blogger Amy Peterson put it this way: “I want a service that is not sensational, flashy, or particularly ‘relevant.’ I can be entertained anywhere. At church, I do not want to be entertained. I do not want to be the target of anyone’s marketing. I want to be asked to participate in the life of an ancient-future community.”

Millennial blogger Ben Irwin wrote: “When a church tells me how I should feel (‘Clap if you’re excited about Jesus!’), it smacks of inauthenticity. Sometimes I don’t feel like clapping. Sometimes I need to worship in the midst of my brokenness and confusion — not in spite of it and certainly not in denial of it.”

When I left church at age 29, full of doubt and disillusionment, I wasn’t looking for a better-produced Christianity. I was looking for a truer Christianity, a more authentic Christianity: I didn’t like how gay, lesbian, bisexual and transgender people were being treated by my evangelical faith community. I had questions about science and faith, biblical interpretation and theology. I felt lonely in my doubts. And, contrary to popular belief, the fog machines and light shows at those slick evangelical conferences didn’t make things better for me. They made the whole endeavor feel shallow, forced and fake.

Read the entire story here.

Spam, Spam, Spam: All Natural

[Image: Google search results for “natural” junk food.]

Parents through the ages have often decried the mangling of their mother tongue by subsequent generations. Language is fluid after all, particularly English, and our youth constantly add their own revisions to carve a divergent path from their elders. But, the focus of our disdain for the ongoing destruction of our linguistic heritage should really be corporations and their hordes of marketeers and lawyers. Take the once simple and meaningful word “natural”. You’ll see its oxymoronic application each time you stroll along the aisle at your grocery store: one hundred percent natural fruit roll-ups; all natural chicken rings; completely natural corn-dogs; totally naturally flavored cheese puffs. The word — natural — has become meaningless.

From NYT:

It isn’t every day that the definition of a common English word that is ubiquitous in common parlance is challenged in federal court, but that is precisely what has happened with the word “natural.” During the past few years, some 200 class-action suits have been filed against food manufacturers, charging them with misuse of the adjective in marketing such edible oxymorons as “natural” Cheetos Puffs, “all-natural” Sun Chips, “all-natural” Naked Juice, “100 percent all-natural” Tyson chicken nuggets and so forth. The plaintiffs argue that many of these products contain ingredients — high-fructose corn syrup, artificial flavors and colorings, chemical preservatives and genetically modified organisms — that the typical consumer wouldn’t think of as “natural.”

Judges hearing these cases — many of them in the Northern District of California — have sought a standard definition of the adjective that they could cite to adjudicate these claims, only to discover that no such thing exists.

Something in the human mind, or heart, seems to need a word of praise for all that humanity hasn’t contaminated, and for us that word now is “natural.” Such an ideal can be put to all sorts of rhetorical uses. Among the antivaccination crowd, for example, it’s not uncommon to read about the superiority of something called “natural immunity,” brought about by exposure to the pathogen in question rather than to the deactivated (and therefore harmless) version of it made by humans in laboratories. “When you inject a vaccine into the body,” reads a post on an antivaxxer website, Campaign for Truth in Medicine, “you’re actually performing an unnatural act.” This, of course, is the very same term once used to decry homosexuality and, more recently, same-sex marriage, which the Family Research Council has taken to comparing unfavorably to what it calls “natural marriage.”

So what are we really talking about when we talk about natural? It depends; the adjective is impressively slippery, its use steeped in dubious assumptions that are easy to overlook. Perhaps the most incoherent of these is the notion that nature consists of everything in the world except us and all that we have done or made. In our heart of hearts, it seems, we are all creationists.

In the case of “natural immunity,” the modifier implies the absence of human intervention, allowing for a process to unfold as it would if we did nothing, as in “letting nature take its course.” In fact, most of medicine sets itself against nature’s course, which is precisely what we like about it — at least when it’s saving us from dying, an eventuality that is perhaps more natural than it is desirable.

Yet sometimes medicine’s interventions are unwelcome or go overboard, and nature’s way of doing things can serve as a useful corrective. This seems to be especially true at the beginning and end of life, where we’ve seen a backlash against humanity’s technological ingenuity that has given us both “natural childbirth” and, more recently, “natural death.”

This last phrase, which I expect will soon be on many doctors’ lips, indicates the enduring power of the adjective to improve just about anything you attach it to, from cereal bars all the way on up to dying. It seems that getting end-of-life patients and their families to endorse “do not resuscitate” orders has been challenging. To many ears, “D.N.R.” sounds a little too much like throwing Grandpa under the bus. But according to a paper in The Journal of Medical Ethics, when the orders are reworded to say “allow natural death,” patients and family members and even medical professionals are much more likely to give their consent to what amounts to exactly the same protocols.

The word means something a little different when applied to human behavior rather than biology (let alone snack foods). When marriage or certain sexual practices are described as “natural,” the word is being strategically deployed as a synonym for “normal” or “traditional,” neither of which carries nearly as much rhetorical weight. “Normal” is by now too obviously soaked in moral bigotry; by comparison, “natural” seems to float high above human squabbling, offering a kind of secular version of what used to be called divine law. Of course, that’s exactly the role that “natural law” played for America’s founding fathers, who invoked nature rather than God as the granter of rights and the arbiter of right and wrong.

Read the entire article here.

Image courtesy of Google Search.

 

The Rich and Powerful Live by Different Rules

[Image: Private Chelsea Manning.]

Never has there been such a wonderful example of blatant, utter hypocrisy. This time from the United States Department of Justice. It would be refreshing to convey to our leaders that not only do “Black Lives Matter”, but “Less Privileged Lives Matter” as well.

David Petraeus, former director of the CIA no less and an ex-four-star general, copped a mere two years of probation and a $100,000 fine for leaking classified information to his biographer. Chelsea Manning, formerly Bradley Manning, an intelligence analyst and ex-army private, was sentenced to 35 years in prison in 2013 for disclosing classified documents to WikiLeaks.

And, there are many other similar examples.

[Image: Four-Star General David Petraeus.]

We wince when hearing of oligarchic corruption and favoritism in other nations, such as Russia and China. But, in this country it goes by the euphemism known as “justice”, so it must be OK.

From arstechnica:

Yesterday [April 23, 2015], former CIA Director David Petraeus was handed two years of probation and a $100,000 fine after agreeing to a plea deal that ends in no jail time for leaking classified information to Paula Broadwell, his biographer and lover.

“I now look forward to moving on with the next phase of my life and continuing to serve our great nation as a private citizen,” Petraeus said outside the federal courthouse in Charlotte, North Carolina on Thursday.

Lower-level government leakers have not, however, been as likely to walk out of a courthouse applauding the US as Petraeus did. Trevor Timm, executive director of the Freedom of the Press Foundation, called the Petraeus plea deal a “gross hypocrisy.”

“At the same time as Petraeus got off virtually scot-free, the Justice Department has been bringing the hammer down upon other leakers who talk to journalists—sometimes for disclosing information much less sensitive than Petraeus did,” he said.

The Petraeus sentencing came days after the Justice Department demanded (PDF) up to a 24-year-term for Jeffrey Sterling, a former CIA agent who leaked information to a Pulitzer Prize-winning writer about a botched mission to sell nuclear plans to Iran in order to hinder its nuclear-weapons progress.

“A substantial sentence in this case would send an appropriate and much needed message to all persons entrusted with the handling of classified information, i.e., that intentional breaches of the laws governing the safeguarding of national defense information will be pursued aggressively, and those who violate the law in this manner will be tried, convicted, and punished accordingly,” the Justice Department argued in Sterling’s case this week.

The Daily Beast sums up the argument that the Petraeus deal involves a double standard by noting other recent penalties for lower-level leakers:

“Chelsea Manning, formerly Bradley Manning, was sentenced to 35 years in prison in 2013 for disclosing classified documents to WikiLeaks. Stephen Jin-Woo Kim, a former State Department contractor, entered a guilty plea last year to one felony count of disclosing classified information to a Fox News reporter in February 2014. He was sentenced to 13 months in prison. On Monday, prosecutors urged a judge to sentence Jeffrey Sterling, a former CIA officer, to at least 20 years in prison for leaking classified plans to sabotage Iran’s nuclear-weapons program to a New York Times reporter. Sterling will be sentenced next month. And former CIA officer John C. Kiriakou served 30 months in federal prison after he disclosed the name of a covert operative to a reporter. He was released in February and is finishing up three months of house arrest.”

The information Petraeus was accused of leaking, according to the original indictment, contained “classified information regarding the identities of covert officers, war strategy, intelligence capabilities and mechanisms, diplomatic discussions, quotes and deliberative discussions from high-level National Security Council meetings.” The leak also included “discussions with the president of the United States.”

The judge presiding over the case, US Magistrate Judge David Keesler, increased the government’s recommended fine of $40,000 to $100,000 because of Petraeus’ “grave but uncharacteristic error in judgement.”

Read the entire story here.

Images: Four-Star General David Petraeus; Private Chelsea Manning. Courtesy of Wikipedia.

Belief and the Falling Light

[tube]dpmXyJrs7iU[/tube]

Many of us now accept that lights falling from the sky are rocky interlopers from the asteroid belt within our solar system, rather than visiting angels or signs from an angry (or mysteriously benevolent) God. New analysis of the meteor that overflew Chelyabinsk in Russia in 2013 suggests that one of the key founders of Christianity may have witnessed a similar natural phenomenon around two thousand years ago. However, at the time, Saul (later to become Paul the evangelist) interpreted the dazzling light on the road to Damascus — Acts of the Apostles, New Testament — as a message from a Christian God. The rest, as they say, is history. Luckily, recent scientific progress now means that most of us no longer establish new religious movements based on fireballs in the sky. But, we are awed nonetheless.

From the New Scientist:

Nearly two thousand years ago, a man named Saul had an experience that changed his life, and possibly yours as well. According to Acts of the Apostles, the fifth book of the biblical New Testament, Saul was on the road to Damascus, Syria, when he saw a bright light in the sky, was blinded and heard the voice of Jesus. Changing his name to Paul, he became a major figure in the spread of Christianity.

William Hartmann, co-founder of the Planetary Science Institute in Tucson, Arizona, has a different explanation for what happened to Paul. He says the biblical descriptions of Paul’s experience closely match accounts of the fireball meteor seen above Chelyabinsk, Russia, in 2013.

Hartmann has detailed his argument in the journal Meteoritics & Planetary Science (doi.org/3vn). He analyses three accounts of Paul’s journey, thought to have taken place around AD 35. The first is a third-person description of the event, thought to be the work of one of Jesus’s disciples, Luke. The other two quote what Paul is said to have subsequently told others.

“Everything they are describing in those three accounts in the book of Acts are exactly the sequence you see with a fireball,” Hartmann says. “If that first-century document had been anything other than part of the Bible, that would have been a straightforward story.”

But the Bible is not just any ancient text. Paul’s Damascene conversion and subsequent missionary journeys around the Mediterranean helped build Christianity into the religion it is today. If his conversion was indeed as Hartmann explains it, then a random space rock has played a major role in determining the course of history (see “Christianity minus Paul”).

That’s not as strange as it sounds. A large asteroid impact helped kill off the dinosaurs, paving the way for mammals to dominate the Earth. So why couldn’t a meteor influence the evolution of our beliefs?

“It’s well recorded that extraterrestrial impacts have helped to shape the evolution of life on this planet,” says Bill Cooke, head of NASA’s Meteoroid Environment Office in Huntsville, Alabama. “If it was a Chelyabinsk fireball that was responsible for Paul’s conversion, then obviously that had a great impact on the growth of Christianity.”

Hartmann’s argument is possible now because of the quality of observations of the Chelyabinsk incident. The 2013 meteor is the most well-documented example of larger impacts that occur perhaps only once in 100 years. Before 2013, the 1908 blast in Tunguska, also in Russia, was the best example, but it left just a scattering of seismic data, millions of flattened trees and some eyewitness accounts. With Chelyabinsk, there is a clear scientific argument to be made, says Hartmann. “We have observational data that match what we see in this first-century account.”

Read the entire article here.

Video: Meteor above Chelyabinsk, Russia in 2013. Courtesy of Tuvix72.

Endless Political Campaigning

[Image: US politicians.]

The great capitalist market has decided — endless political campaigning in the United States is beneficial. If you think the presidential campaign to elect the next leader in 2016 began sometime last year you are not mistaken. In fact, it really does seem that political posturing for the next election often begins before the current one is even decided. We all complain: too many ads, too much negativity, far too much inanity and little substance. Yet, we allow the process to continue, and to grow in scale. Would you put up with a political campaign that lasts a mere 38 days? The British seem to do it. But, then again, the United States is so much more advanced, right?

From WSJ:

On March 23, Ted Cruz announced he is running for president in a packed auditorium at Liberty University in Lynchburg, Va. On April 7, Rand Paul announced he is running for president amid the riverboat décor of the Galt House hotel in Louisville, Ky. On April 12, Hillary Clinton announced she is running for president in a brief segment of a two-minute video. On April 13, Marco Rubio announced he is running before a cheering crowd at the Freedom Tower in Miami. And these are just the official announcements.

Jeb Bush made it known in December that he is interested in running. Scott Walker’s rousing speech at the Freedom Summit in Des Moines, Iowa, on Jan. 24 left no doubt that he will enter the race. Chris Christie’s appearance in New Hampshire last week strongly suggests the same. Previous presidential candidates Mike Huckabee, Rick Perry and Rick Santorum seem almost certain to run. Pediatric surgeon Ben Carson is reportedly ready to announce his run on May 4 at the Detroit Music Hall.

With some 570 days left until Election Day 2016, the race for president is very much under way—to the dismay of a great many Americans. They find the news coverage of the candidates tiresome (what did Hillary order at Chipotle?), are depressed by the negative campaigning that is inevitable in an adversarial process, and dread the onslaught of political TV ads. Too much too soon!

They also note that other countries somehow manage to select their heads of government much more quickly. The U.K. has a general election campaign going on right now. It began on March 30, when the queen, on the advice of the prime minister, dissolved Parliament, and voting will take place on May 7. That’s 38 days later. Britons are complaining that the electioneering goes on too long.

American presidential campaigns did not always begin so soon, but they have for more than a generation now. As a young journalist, Sidney Blumenthal (in recent decades a consigliere to the Clintons) wrote quite a good book titled “The Permanent Campaign.” It was published in 1980. Mr. Blumenthal described what was then a relatively new phenomenon.

When Jimmy Carter announced his candidacy for president in January 1975, he was not taken particularly seriously. But his perseverance paid off, and he took the oath of office two years later. His successors—Ronald Reagan, George H.W. Bush and Bill Clinton—announced their runs in the fall before their election years, although they had all been busy assembling campaigns before that. George W. Bush announced in June 1999, after the adjournment of the Texas legislature. Barack Obama announced in February 2007, two days before Lincoln’s birthday, in Lincoln’s Springfield, Ill. By that standard, declared candidates Mr. Cruz, Mr. Paul, Mrs. Clinton and Mr. Rubio got a bit of a late start.

Why are American presidential campaigns so lengthy? And is there anything that can be done to compress them to a bearable timetable?

One clue to the answers: The presidential nominating process, the weakest part of our political system, is also the one part that was not envisioned by the Founding Fathers. The framers of the Constitution created a powerful presidency, confident (justifiably, as it turned out) that its first incumbent, George Washington, would set precedents that would guide the republic for years to come.

But they did not foresee that even in Washington’s presidency, Americans would develop political parties, which they abhorred. The Founders expected that later presidents would be chosen, usually by the House of Representatives, from local notables promoted by different states in the Electoral College. They did not expect that the Federalist and Republican parties would coalesce around two national leaders—Washington’s vice president, John Adams, and Washington’s first secretary of state, Thomas Jefferson—in the close elections of 1796 and 1800.

The issue then became: When a president followed George Washington’s precedent and retired after two terms, how would the parties choose nominees, in a republic that, from the start, was regionally, ethnically and religiously diverse?

Read the entire story here.

Image courtesy of Google Search.

Religious Dogma and DNA

Despite ongoing conflicts around the globe that are fueled or governed by religious fanaticism, it is entirely plausible that our general tendency toward supernatural belief is encoded in our DNA. Of course this does not mean that a God or that various gods exist; it merely implies that over time natural selection generally favored those who believed in deities over those who did not. We are such complex and contradictory animals.

From NYT:

Most of us find it mind-boggling that some people seem willing to ignore the facts — on climate change, on vaccines, on health care — if the facts conflict with their sense of what someone like them believes. “But those are the facts,” you want to say. “It seems weird to deny them.”

And yet a broad group of scholars is beginning to demonstrate that religious belief and factual belief are indeed different kinds of mental creatures. People process evidence differently when they think with a factual mind-set rather than with a religious mind-set. Even what they count as evidence is different. And they are motivated differently, based on what they conclude. On what grounds do scholars make such claims?

First of all, they have noticed that the very language people use changes when they talk about religious beings, and the changes mean that they think about their realness differently. You do not say, “I believe that my dog is alive.” The fact is so obvious it is not worth stating. You simply talk in ways that presume the dog’s aliveness — you say she’s adorable or hungry or in need of a walk. But to say, “I believe that Jesus Christ is alive” signals that you know that other people might not think so. It also asserts reverence and piety. We seem to regard religious beliefs and factual beliefs with what the philosopher Neil Van Leeuwen calls different “cognitive attitudes.”

Second, these scholars have remarked that when people consider the truth of a religious belief, what the belief does for their lives matters more than, well, the facts. We evaluate factual beliefs often with perceptual evidence. If I believe that the dog is in the study but I find her in the kitchen, I change my belief. We evaluate religious beliefs more with our sense of destiny, purpose and the way we think the world should be. One study found that over 70 percent of people who left a religious cult did so because of a conflict of values. They did not complain that the leader’s views were mistaken. They believed that he was a bad person.

Third, these scholars have found that religious and factual beliefs play different roles in interpreting the same events. Religious beliefs explain why, rather than how. People who understand readily that diseases are caused by natural processes might still attribute sickness at a particular time to demons, or healing to an act of God. The psychologist Cristine H. Legare and her colleagues recently demonstrated that people use both natural and supernatural explanations in this interdependent way across many cultures. They tell a story, as recounted by Tracy Kidder’s book on the anthropologist and physician Paul Farmer, about a woman who had taken her tuberculosis medication and been cured — and who then told Dr. Farmer that she was going to get back at the person who had used sorcery to make her ill. “But if you believe that,” he cried, “why did you take your medicines?” In response to the great doctor she replied, in essence, “Honey, are you incapable of complexity?”

Moreover, people’s reliance on supernatural explanations increases as they age. It may be tempting to think that children are more likely than adults to reach out to magic to explain something, and that they increasingly put that mind-set to the side as they grow up, but the reverse is true. It’s the young kids who seem skeptical when researchers ask them about gods and ancestors, and the adults who seem clear and firm. It seems that supernatural ideas do things for adults they do not yet do for children.

Finally, scholars have determined that people don’t use rational, instrumental reasoning when they deal with religious beliefs. The anthropologist Scott Atran and his colleagues have shown that sacred values are immune to the normal cost-benefit trade-offs that govern other dimensions of our lives. Sacred values are insensitive to quantity (one cartoon can be a profound insult). They don’t respond to material incentives (if you offer people money to give up something that represents their sacred value, they often become more intractable in their refusal). Sacred values may even have different neural signatures in the brain.

The danger point seems to be when people feel themselves to be completely fused with a group defined by its sacred value. When Mr. Atran and his colleagues surveyed young men in two Moroccan neighborhoods associated with militant jihad (one of them home to five men who helped plot the 2004 Madrid train bombings, and then blew themselves up), they found that those who described themselves as closest to their friends and who upheld Shariah law were also more likely to say that they would suffer grievous harm to defend Shariah law. These people become what Mr. Atran calls “devoted actors” who are unconditionally committed to their sacred value, and they are willing to die for it.

Read the entire article here.

Dark Matter May Cause Cancer and Earthquakes

[Image: Abell 1689 galaxy cluster.]

Leave aside the fact that there is no direct evidence for the existence of dark matter. In fact, theories that indirectly point to its existence seem rather questionable as well. That said, cosmologists are increasingly convinced that dark matter’s gravitational effects can be inferred from recent observations of gravitationally lensed galaxy clusters. Some researchers postulate that this eerily murky non-substance — it doesn’t interact with anything in our visible universe except, perhaps, gravity — may be behind events much closer to home. All very interesting.

From NYT:

Earlier this year, Dr. Sabine Hossenfelder, a theoretical physicist in Stockholm, made the jarring suggestion that dark matter might cause cancer. She was not talking about the “dark matter” of the genome (another term for junk DNA) but about the hypothetical, lightless particles that cosmologists believe pervade the universe and hold the galaxies together.

Though it has yet to be directly detected, dark matter is presumed to exist because we can see the effects of its gravity. As its invisible particles pass through our bodies, they could be mutating DNA, the theory goes, adding at an extremely low level to the overall rate of cancer.

It was unsettling to see two such seemingly different realms, cosmology and oncology, suddenly juxtaposed. But that was just the beginning. Shortly after Dr. Hossenfelder broached her idea in an online essay, Michael Rampino, a professor at New York University, added geology and paleontology to the picture.

Dark matter, he proposed in an article for the Royal Astronomical Society, is responsible for the mass extinctions that have periodically swept Earth, including the one that killed the dinosaurs.

His idea is based on speculations by other scientists that the Milky Way is sliced horizontally through its center by a thin disk of dark matter. As the sun, traveling around the galaxy, bobs up and down through this darkling plane, it generates gravitational ripples strong enough to dislodge distant comets from their orbits, sending them hurtling toward Earth.

An earlier version of this hypothesis was put forth last year by the Harvard physicists Lisa Randall and Matthew Reece. But Dr. Rampino has added another twist: During Earth’s galactic voyage, dark matter accumulates in its core. There the particles self-destruct, generating enough heat to cause deadly volcanic eruptions. Struck from above and below, the dinosaurs succumbed.

It is surprising to see something as abstract as dark matter take on so much solidity, at least in the human mind. The idea was invented in the early 1930s as a theoretical contrivance — a means of explaining observations that otherwise didn’t make sense.

Galaxies appear to be rotating so fast that they should have spun apart long ago, throwing off stars like sparks from a Fourth of July pinwheel. There just isn’t enough gravity to hold a galaxy together, unless you assume that it hides a huge amount of unseen matter — particles that neither emit nor absorb light.

Some mavericks propose alternatives, attempting to tweak the equations of gravity to account for what seems like missing mass. But for most cosmologists, the idea of unseeable matter has become so deeply ingrained that it has become almost impossible to do without it.

Said to be five times more abundant than the stuff we can see, dark matter is a crucial component of the theory behind gravitational lensing, in which large masses like galaxies can bend light beams and cause stars to appear in unexpected parts of the sky.

That was the explanation for the spectacular observation of an “Einstein Cross” reported last month. Acting like an enormous lens, a cluster of galaxies deflected the light of a supernova into four images — a cosmological mirage. The light for each reflection followed a different path, providing glimpses of four different moments of the explosion.

But not even a galactic cluster exerts enough gravity to bend light so severely unless you postulate that most of its mass consists of hypothetical dark matter. In fact, astronomers are so sure that dark matter exists that they have embraced gravitational lensing as a tool to map its extent.

Dark matter, in other words, is used to explain gravitational lensing, and gravitational lensing is taken as more evidence for dark matter.

Some skeptics have wondered if this is a modern-day version of what ancient astronomers called “saving the phenomena.” With enough elaborations, a theory can account for what we see without necessarily describing reality. The classic example is the geocentric model of the heavens that Ptolemy laid out in the Almagest, with the planets orbiting Earth along paths of complex curlicues.

Ptolemy apparently didn’t care whether his filigrees were real. What was important to him was that his model worked, predicting planetary movements with great precision.

Modern scientists are not ready to settle for such subterfuge. To show that dark matter resides in the world and not just in their equations, they are trying to detect it directly.

Though its identity remains unknown, most theorists are betting that dark matter consists of WIMPs — weakly interacting massive particles. If they really exist, it might be possible to glimpse them when they interact with ordinary matter.

Read the entire article here.

Image: Abell 1689 galaxy cluster. Courtesy of NASA, ESA, and D. Coe (NASA JPL/Caltech and STScI).

MondayMap: Imagining a Post-Post-Ottoman World

[Image: Map of the Sykes–Picot Agreement, signed 8 May 1916.]

The United States is often portrayed as the world’s bully and nefarious geo-political schemer — a nation responsible for many of the world’s current political ills. However, it is the French and British who should be called to account for much of the globe’s ongoing turmoil, particularly in the Middle East. After the end of WWI the victors expeditiously carved up the spoils of the vanquished Austro-Hungarian and Ottoman Empires. Much of Eastern Europe and the Middle East was divvied up and traded just as kids might swap baseball or football (soccer) cards today. Then French Prime Minister Georges Clemenceau and British Prime Minister David Lloyd George famously bartered and gifted — amongst themselves and their friends — entire regions and cities without thought to historical precedent, geographic and ethnic boundaries, or even the basic needs of entire populations. Their decisions were merely lines to be drawn and re-drawn on a map.

So, it would be a fascinating — though rather naive — exercise to re-draw many of today’s arbitrary and contrived boundaries, and to revert regions to their more appropriate owners. Of course, where and when should this thought experiment begin and end? Pre-Roman Empire, post-Normans, before the Prussians, prior to the Austro-Hungarian Empire, or after the Ottomans, post-Soviets, or after Tito, or way before the Huns, the Vandals, the Barbarians and any number of the Germanic tribes?

Nevertheless, essayist Yaroslav Trofimov takes a stab at re-districting to pre-Ottoman boundaries and imagines a world with less bloodshed. A worthy dream.

From WSJ:

Shortly after the end of World War I, the French and British prime ministers took a break from the hard business of redrawing the map of Europe to discuss the easier matter of where frontiers would run in the newly conquered Middle East.

Two years earlier, in 1916, the two allies had agreed on their respective zones of influence in a secret pact—known as the Sykes-Picot agreement—for divvying up the region. But now the Ottoman Empire lay defeated, and the United Kingdom, having done most of the fighting against the Turks, felt that it had earned a juicier reward.

“Tell me what you want,” France’s Georges Clemenceau said to Britain’s David Lloyd George as they strolled in the French embassy in London.

“I want Mosul,” the British prime minister replied.

“You shall have it. Anything else?” Clemenceau asked.

In a few seconds, it was done. The huge Ottoman imperial province of Mosul, home to Sunni Arabs and Kurds and to plentiful oil, ended up as part of the newly created country of Iraq, not the newly created country of Syria.

The Ottomans ran a multilingual, multireligious empire, ruled by a sultan who also bore the title of caliph—commander of all the world’s Muslims. Having joined the losing side in the Great War, however, the Ottomans saw their empire summarily dismantled by European statesmen who knew little about the region’s people, geography and customs.

The resulting Middle Eastern states were often artificial creations, sometimes with implausibly straight lines for borders. They have kept going since then, by and large, remaining within their colonial-era frontiers despite repeated attempts at pan-Arab unification.

The built-in imbalances in some of these newly carved-out states—particularly Syria and Iraq—spawned brutal dictatorships that succeeded for decades in suppressing restive majorities and perpetuating the rule of minority groups.

But now it may all be coming to an end. Syria and Iraq have effectively ceased to function as states. Large parts of both countries lie beyond central government control, and the very meaning of Syrian and Iraqi nationhood has been hollowed out by the dominance of sectarian and ethnic identities.

The rise of Islamic State is the direct result of this meltdown. The Sunni extremist group’s leader, Abu Bakr al-Baghdadi, has proclaimed himself the new caliph and vowed to erase the shame of the “Sykes-Picot conspiracy.” After his men surged from their stronghold in Syria last summer and captured Mosul, now one of Iraq’s largest cities, he promised to destroy the old borders. In that offensive, one of the first actions taken by ISIS (as his group is also known) was to blow up the customs checkpoints between Syria and Iraq.

“What we are witnessing is the demise of the post-Ottoman order, the demise of the legitimate states,” says Francis Ricciardone, a former U.S. ambassador to Turkey and Egypt who is now at the Atlantic Council, a Washington think tank. “ISIS is a piece of that, and it is filling in a vacuum of the collapse of that order.”

In the mayhem now engulfing the Middle East, it is mostly the countries created a century ago by European colonialists that are coming apart. In the region’s more “natural” nations, a much stronger sense of shared history and tradition has, so far, prevented a similar implosion.

“Much of the conflict in the Middle East is the result of insecurity of contrived states,” says Husain Haqqani, an author and a former Pakistani ambassador to the U.S. “Contrived states need state ideologies to make up for lack of history and often flex muscles against their own people or against neighbors to consolidate their identity.”

In Egypt, with its millennial history and strong sense of identity, almost nobody questioned the country’s basic “Egyptian-ness” throughout the upheaval that has followed President Hosni Mubarak’s ouster in a 2011 revolution. As a result, most of Egypt’s institutions have survived the turbulence relatively intact, and violence has stopped well short of outright civil war.

Turkey and Iran—both of them, in bygone eras, the center of vast empires—have also gone largely unscathed in recent years, even though both have large ethnic minorities of their own, including Arabs and Kurds.

The Middle East’s “contrived” countries weren’t necessarily doomed to failure, and some of them—notably Jordan—aren’t collapsing, at least not yet. The world, after all, is full of multiethnic and multiconfessional states that are successful and prosperous, from Switzerland to Singapore to the U.S., which remains a relative newcomer as a nation compared with, say, Iran.

Read the entire article here.

Image: Map of Sykes–Picot Agreement showing Eastern Turkey in Asia, Syria and Western Persia, and areas of control and influence agreed between the British and the French. Royal Geographical Society, 1910-15. Signed by Mark Sykes and François Georges-Picot, 8 May 1916. Courtesy of Wikipedia.

 

Yes M’Lady

[Image: Google search results for Thunderbirds.]

Beneath the shell that envelops us as adults lies the child. We all have one inside — that vulnerable being who dreams, plays and improvises. Sadly, our contemporary society does a wonderful job of selectively numbing these traits, usually as soon as we enter school; our work finishes the process by quashing all remnants of our once colorful and unbounded imaginations. OK, I’m exaggerating a little to make my point. But I’m certain this strikes a chord.

Keeping this in mind, it’s awesomely brilliant to see Thunderbirds making a comeback. You may recall the original Thunderbirds TV show from the mid-sixties. Created by Gerry and Sylvia Anderson, the marionette puppets and their International Rescue science-fiction machines would save us weekly from the forces of evil, destruction and chaos. The child who lurks within me utterly loved this show — everything would come to a halt to make way for this event on Saturday mornings. Now I have a chance of reliving it with my kids, and maintaining some degree of childhood wonder in the process. Thunderbirds are go…

From the Guardian:

5, 4, 3, 2, 1 … Thunderbirds are go – but not quite how older viewers will remember. International Rescue has been given a makeover for the modern age, with the Tracy brothers, Brains, Lady Penelope and Parker smarter, fitter and with better gadgets than they ever had when the “supermarionation” show began on ITV half a century ago.

But fans fearful that its return, complete with Hollywood star Rosamund Pike voicing Lady Penelope, will trample all over their childhood memories can rest easy.

Unlike the 2004 live action film which Thunderbirds creator, the late Gerry Anderson, described as the “biggest load of crap I have ever seen in my life”, the new take on the children’s favourite, called Thunderbirds Are Go, remains remarkably true to the spirit of the 50-year-old original.

Gone are the puppet strings – audience research found that younger viewers wanted something more dynamic – but along with computer generated effects are models and miniature sets (“actually rather huge” said executive producer Estelle Hughes) that faithfully recall the original Thunderbirds.

Speaking after the first screening of the new ITV series on Tuesday, executive producer Giles Ridge said: “We felt we should pay tribute to all those elements that made it special but at the same time update it so it’s suitable and compelling for a modern audience.

“The basic DNA of the show – five young brothers on a secret hideaway island with the most fantastic craft you could imagine, helping people around the world who are in trouble, that’s not a bad place to start.”

The theme music is intact, albeit given a 21st century makeover, as is the Tracy Island setting – complete with the avenue of palm trees that makes way for Thunderbird 2 and the swimming pool that slides into the mountain for the launch of Thunderbird 1.

Lady Penelope – as voiced by Pike – still has a cut-glass accent and is entirely unflappable. When she is not saving the world she is visiting Buckingham Palace or attending receptions at 10 Downing Street. There is also a nod – blink and you miss it – to another Anderson puppet series, Stingray.

David Graham, who voiced Parker in the original series, returns in the same role. “I think they were checking me out to see if I was still in one piece,” said Graham, now 89, of the meeting when he was first approached to appear in the new series.

“I was absolutely thrilled to repeat the voice and character of Parker. Although I am older my voice hasn’t changed too much over the years.”

He said the voice of Parker had come from a wine waiter who used to work in the royal household, whom Anderson had taken him to see in a pub in Cookham, Berkshire.

“He came over and said, ‘Would you like to see the wine list, sir?’ And Parker was born. Thank you, old mate.”

Brains, as voiced by Fonejacker star Kayvan Novak, now has an Indian accent.

Sylvia Anderson, Anderson’s widow, who co-created the show, will make a guest appearance as Lady Penelope’s “crazy aunt”.

Read the entire story here.

Image courtesy of Google Search.

 

Your Current Dystopian Nightmare: In Just One Click

Amazon was supposed to give you back precious time by making shopping and spending painlessly simple. Apps on your smartphone were supposed to do the same for all manner of re-tooled on-demand services. What wonderful time-saving inventions! So, now you can live in the moment and make use of all this extra free time. It’s your time now. You’ve won it back and no one can take it away.

And, what do you spend this newly earned free time doing? Well, you sit at home in your isolated cocoon, you shop for more things online, you download some more great apps that promise to bring even greater convenience, you interact less with real humans, and, best of all, you spend more time working. Welcome to your new dystopian nightmare, and it’s happening right now. Click.

From Medium:

Angel the concierge stands behind a lobby desk at a luxe apartment building in downtown San Francisco, and describes the residents of this imperial, 37-story tower. “Ubers, Squares, a few Twitters,” she says. “A lot of work-from-homers.”

And by late afternoon on a Tuesday, they’re striding into the lobby at a just-get-me-home-goddammit clip, some with laptop bags slung over their shoulders, others carrying swank leather satchels. At the same time a second, temporary population streams into the building: the app-based meal delivery people hoisting thermal carrier bags and sacks. Green means Sprig. A huge M means Munchery. Down in the basement, Amazon Prime delivery people check in packages with the porter. The Instacart groceries are plunked straight into a walk-in fridge.

This is a familiar scene. Five months ago I moved into a spartan apartment a few blocks away, where dozens of startups and thousands of tech workers live. Outside my building there’s always a phalanx of befuddled delivery guys who seem relieved when you walk out, so they can get in. Inside, the place is stuffed with the goodies they bring: Amazon Prime boxes sitting outside doors, evidence of the tangible, quotidian needs that are being serviced by the web. The humans who live there, though, I mostly never see. And even when I do, there seems to be a tacit agreement among residents to not talk to one another. I floated a few “hi’s” in the elevator when I first moved in, but in return I got the monosyllabic, no-eye-contact mumble. It was clear: Lady, this is not that kind of building.

Back in the elevator in the 37-story tower, the messengers do talk, one tells me. They end up asking each other which apps they work for: Postmates. Seamless. EAT24. GrubHub. Safeway.com. A woman hauling two Whole Foods sacks reads the concierge an apartment number off her smartphone, along with the resident’s directions: “Please deliver to my door.”

“They have a nice kitchen up there,” Angel says. The apartments rent for as much as $5,000 a month for a one-bedroom. “But so much, so much food comes in. Between 4 and 8 o’clock, they’re on fire.”

I start to walk toward home. En route, I pass an EAT24 ad on a bus stop shelter, and a little further down the street, a Dungeons & Dragons–type dude opens the locked lobby door of yet another glass-box residential building for a Sprig deliveryman:

“You’re…”

“Jonathan?”

“Sweet,” Dungeons & Dragons says, grabbing the bag of food. The door clanks behind him.

And that’s when I realized: the on-demand world isn’t about sharing at all. It’s about being served. This is an economy of shut-ins.

In 1998, Carnegie Mellon researchers warned that the internet could make us into hermits. They released a study monitoring the social behavior of 169 people making their first forays online. The web-surfers started talking less with family and friends, and grew more isolated and depressed. “We were surprised to find that what is a social technology has such anti-social consequences,” said one of the researchers at the time. “And these are the same people who, when asked, describe the Internet as a positive thing.”

We’re now deep into the bombastic buildout of the on-demand economy — with investment in the apps, platforms and services surging exponentially. Right now Americans buy nearly eight percent of all their retail goods online, though that seems a wild underestimate in the most congested, wired, time-strapped urban centers.

Many services promote themselves as life-expanding — there to free up your time so you can spend it connecting with the people you care about, not standing at the post office with strangers. Rinse’s ad shows a couple chilling at a park, their laundry being washed by someone, somewhere beyond the picture’s frame. But plenty of the delivery companies are brutally honest that, actually, they never want you to leave home at all.

GrubHub’s advertising banks on us secretly never wanting to talk to a human again: “Everything great about eating, combined with everything great about not talking to people.” DoorDash, another food delivery service, goes for the all-caps, batshit extreme:

“NEVER LEAVE HOME AGAIN.”

Katherine van Ekert isn’t a shut-in, exactly, but there are only two things she ever has to run errands for any more: trash bags and saline solution. For those, she must leave her San Francisco apartment and walk two blocks to the drug store, “so woe is my life,” she tells me. (She realizes her dry humor about #firstworldproblems may not translate, and clarifies later: “Honestly, this is all tongue in cheek. We’re not spoiled brats.”) Everything else is done by app. Her husband’s office contracts with Washio. Groceries come from Instacart. “I live on Amazon,” she says, buying everything from curry leaves to a jogging suit for her dog, complete with hoodie.

She’s so partial to these services, in fact, that she’s running one of her own: A veterinarian by trade, she’s a co-founder of VetPronto, which sends an on-call vet to your house. It’s one of a half-dozen on-demand services in the current batch at Y Combinator, the startup factory, including a marijuana delivery app called Meadow (“You laugh, but they’re going to be rich,” she says). She took a look at her current clients — they skew late 20s to late 30s, and work in high-paying jobs: “The kinds of people who use a lot of on-demand services and hang out on Yelp a lot.”

Basically, people a lot like herself. That’s the common wisdom: the apps are created by the urban young for the needs of urban young. The potential of delivery with a swipe of the finger is exciting for van Ekert, who grew up without such services in Sydney and recently arrived in wired San Francisco. “I’m just milking this city for all it’s worth,” she says. “I was talking to my father on Skype the other day. He asked, ‘Don’t you miss a casual stroll to the shop?’ Everything we do now is time-limited, and you do everything with intention. There’s not time to stroll anywhere.”

Suddenly, for people like van Ekert, the end of chores is here. After hours, you’re free from dirty laundry and dishes. (TaskRabbit’s ad rolls by me on a bus: “Buy yourself time — literally.”)

So here’s the big question. What does she, or you, or any of us do with all this time we’re buying? Binge on Netflix shows? Go for a run? Van Ekert’s answer: “It’s more to dedicate more time to working.”

Read the entire story here.

The Me-Useum

art-in-island-museum

The smartphone and its partner in crime, the online social network, begat the ubiquitous selfie. The selfie begat the selfie stick. And now we have the selfie museum. This is not an April Fool’s prank. Quite the contrary.

The Art in Island museum in Manila is making the selfie part of the visitor experience. Despite the obvious crassness, it may usher in a way for this and other museums to engage with their visitors more personally, and for visitors to connect with art more intimately. Let’s face it: if you ever tried to pull a selfie-like stunt, or even take a photo, in the galleries of the Louvre or the Prado, you would be escorted rather promptly to the nearest padded cell.

From the Guardian:

Selfiemania in art galleries has reached new heights of surreal comedy at a museum in Manila. Art in Island is a museum specifically designed for taking selfies, with “paintings” you can touch, or even step inside, and unlimited, unhindered photo opportunities. It is full of 3D reproductions of famous paintings that are designed to offer the wackiest possible selfie poses.

Meanwhile, traditional museums are adopting diverse approaches to the mania for narcissistic photography. I have recently visited museums with wildly contrasting policies on picture taking. At the Prado in Madrid, all photography is banned. Anything goes? No, nothing goes. Guards leap on anyone wielding a camera.

At the Musée d’Orsay in Paris photography is a free-for-all. Even selfie sticks are allowed. I watched a woman elaborately pose in front of Manet’s Le Déjeuner sur l’herbe so she could photograph herself with her daft selfie stick. This ostentatious technology turns holiday snaps into a kind of performance art. That is what the Manila museum indulges.

My instincts are to ban selfie sticks, selfies, cameras and phones from museums. But my instincts are almost certainly wrong.

Surely the bizarre selfie museum in Manila is a warning to museums, such as New York’s MoMA, that seek to ban, at the very least, selfie sticks – let alone photography itself. If you frustrate selfie enthusiasts, they may just create their own simulated galleries with phoney art that’s “fun” – or stop going to art galleries entirely.

It is better for photo fans to be inside real art museums, looking – however briefly – at actual art than to create elitist barriers between museums and the children of the digital age.

The lure of the selfie stick, which has caused such a flurry of anxiety at museums, is exaggerated. It really is a specialist device for the hardcore selfie lover. At the Musée d’Orsay there are no prohibitions, but only that one visitor, in front of the Manet, out of all the thousands was actually using a selfie stick.

And there’s another reason to go easy on selfies in museums, however irritating such low-attention-span, superficial behaviour in front of masterpieces may be.

Read the entire story here.

Image: Jean-François Millet’s gleaners break out of his canvas. The original, The Gleaners (Des glaneuses), was completed in 1857. Courtesy of Art in Island Museum, Manila, Philippines.

Electric Sheep?

[tube]NoAzpa1x7jU[/tube]

I couldn’t agree more with Michael Newton’s analysis — Blade Runner remains a dystopian masterpiece, thirty-three years on. Long may it reign and rain.

And, here’s another toast to the brilliant mind of Philip K Dick. The author’s work Do Androids Dream of Electric Sheep?, published in 1968, led to this noir science-fiction classic.

From the Guardian:

It’s entirely apt that a film dedicated to replication should exist in multiple versions; there is not one Blade Runner, but seven. Though opinions on which is best vary and every edition has its partisans, the definitive rendering of Ridley Scott’s 1982 dystopian film is most likely The Final Cut (2007), about to play out once more in cinemas across the UK. Aptly, too, repetition is written into the movie’s plot (there are spoilers coming), which sees Deckard (played by Harrison Ford) as an official bounty hunter (or “Blade Runner”) consigned to hunt down, one after the other, four Nexus-6 replicants (genetically-designed artificial human beings, intended as slaves for Earth’s off-world colonies). One by one, our equivocal hero seeks out the runaways: worldly-wise Zhora (Joanna Cassidy); stolid Leon (Brion James); the “pleasure-model” Pris (Daryl Hannah); and the group’s apparent leader, the ultimate Nietzschean blond beast, Roy Batty (the wonderful Rutger Hauer). Along the way, Deckard meets and falls in love with another replicant, Rachael (Sean Young), as beautiful and cold as a porcelain doll.

In Blade Runner, as in all science-fiction, the “future” is a style. Here that style is part film noir and part Gary Numan. The 40s influence is everywhere: in Rachael’s Joan-Crawford shoulder pads, the striped shadows cast by Venetian blinds, the atmosphere of defeat. It’s not just noir, Ridley Scott also taps into 70s cop shows and movies that themselves tapped into nostalgic style, with their yearning jazz and their sad apartments; Deckard even visits a strip joint as all TV detectives must. The movie remains one of the most visually stunning in cinema history. It plots a planet of perpetual night, a landscape of shadows, rain and reflected neon (shone on windows or the eye) in a world not built to a human scale; there, the skyscrapers dwarf us like the pyramids. High above the Philip Marlowe world, hover cars swoop and dirigible billboards float by. More dated now than its hard-boiled lustre is the movie’s equal and opposite involvement in modish early 80s dreams; the soundtrack by Vangelis was up-to-the-minute, while the replicants dress like extras in a Billy Idol video, a post-punk, synth-pop costume party. However, it is noir romanticism that wins out, gifting the film with its forlorn Californian loneliness.

It is a starkly empty film, preoccupied as it is with the thought that people themselves might be hollow. The plot depends on the notion that the replicants must be allowed to live no longer than four years, because as time passes they begin to develop raw emotions. Why emotion should be a capital offence is never sufficiently explained; but it is of a piece with the film’s investigation of a flight from feeling – what psychologist Ian D Suttie once named the “taboo on tenderness”. Intimacy here is frightful (everyone appears to live alone), especially that closeness that suggests that the replicants might be indistinguishable from us.

This anxiety may originally have had tacit political resonances. In the novel that the film is based on, Philip K Dick’s thoughtful Do Androids Dream of Electric Sheep? (1968), the dilemma of the foot soldier plays out, commanded to kill an adversary considered less human than ourselves, yet troubled by the possibility that the enemy are in fact no different. Shades of Vietnam darken the story, as well as memories of America’s slave-owning past. We are told that the replicants can do everything a human being can do, except feel empathy. Yet how much empathy do we feel for faraway victims or inconvenient others?

Ford’s Deckard may or may not be as gripped by uncertainty about his job as Dick’s original blade runner. In any case, his brusque “lack of affect” provides one of the long-standing puzzles of the film: is he, too, a replicant? Certainly Ford’s perpetual grumpiness (it sometimes seems his default acting position), his curdled cynicism, put up barriers to feeling that suggest it is as disturbing for him as it is for the hunted Leon or Roy. Though some still doubt, it seems clear that Deckard is indeed a replicant, his imaginings and memories downloaded from some database, his life as transitory as that of his victims. However, as we watch Blade Runner, Deckard doesn’t feel like a replicant; he is dour and unengaged, but lacks his victims’ detached innocence, their staccato puzzlement at their own untrained feelings. The antithesis of the scowling Ford, Hauer’s Roy is a sinister smiler, or someone whose face falls at the brush of an unassimilable emotion.

Read the entire article here.

Video: Blade Runner clip.

April Can Mean Only One Thing

April-fool-Hailo-app

The advent of April in the United States usually brings the impending tax day to mind. In the UK, when April rolls in, the media goes overboard with April Fool’s jokes. Here’s a smattering of the silliest from Britain’s most serious media outlets.

From the Telegraph: transparent Marmite, Yessus Juice, prison release voting app, Burger King cologne (for men).

From the Guardian: Jeremy Clarkson and fossil fuel divestment.

From the Independent: a round-up of the best gags, including the proposed Edinburgh suspension bridge featuring a gap, Simon Cowell’s effigy on the new £5 note, grocery store aisle trampolines for the short of stature.

Image: Hailo’s new piggyback rideshare service.

A New Mobile App or Genomic Understanding?

Eyjafjallajökull

Silicon Valley has been a tremendous incubator for some of our most important inventions: the first integrated circuit, which led to Intel; the first true personal computer, which led to Apple. Yet this esteemed venture capital (VC) community now seems to be self-medicating with ever more trivial innovation. Aren’t we all getting a little jaded by yet another “great new mobile app” — worth tens of billions (but with no revenue model) — courtesy of a bright young group of 20-somethings?

It is indeed gratifying to see innovators, young and old, rewarded for their creativity and perseverance. Yet we should be encouraging more of our pioneers to look beyond the next cool smartphone invention. Perhaps our technological and industrial luminaries and their retinues of futurists could do us all a favor by channeling more of their speculative funds into longer-term and more significant endeavors: cost-effective desalination; cheaper medications; understanding and curing our insidious diseases; antibiotic replacements; more effective recycling; cleaner power; cheaper and stronger infrastructure; more effective education. These are all difficult problems. But therein lies the reward.

Clearly some pioneering businesses are investing in these areas. But isn’t it time we insisted that the majority of our private and public intellectual (and financial) capital be invested in truly meaningful ways? Here’s an example from Iceland — its national human genome project.

From ars technica:

An Icelandic genetics firm has sequenced the genomes of 2,636 of its countrymen and women, finding genetic markers for a variety of diseases, as well as a new timeline for the paternal ancestor of all humans.

Iceland is, in many ways, perfectly suited to being a genetic case study. It has a small population with limited genetic diversity, a result of the population descending from a small number of settlers—between 8 and 20 thousand, who arrived just 1100 years ago. It also has an unusually well-documented genealogical history, with information sometimes stretching all the way back to the initial settlement of the country. Combined with excellent medical records, it’s a veritable treasure trove for genetic researchers.

The researchers at genetics firm deCODE compared the complete genomes of participants with historical and medical records, publishing their findings in a series of four papers in Nature Genetics last Wednesday. The wealth of data allowed them to track down genetic mutations that are related to a number of diseases, some of them rare. Although few diseases are caused by a single genetic mutation, a combination of mutations can increase the risk for certain diseases. Having access to a large genetic sample with corresponding medical data can help to pinpoint certain risk-increasing mutations.

Among their headline findings was the identification of the gene ABCA7 as a risk factor for Alzheimer’s disease. Although previous research had established that a gene in this region was involved in Alzheimer’s, this result delivers a new level of precision. The researchers replicated their results in further groups in Europe and the United States.

Also identified was a genetic mutation that causes early-onset atrial fibrillation, a heart condition causing an irregular and often very fast heart rate. It’s the most common cardiac arrhythmia condition, and it’s considered early-onset if it’s diagnosed before the age of 60. The researchers found eight Icelanders diagnosed with the condition, all carrying a mutation in the same gene, MYL4.

The studies also turned up a gene with an unusual pattern of inheritance. It causes increased levels of thyroid stimulation when it’s passed down from the mother, but decreased levels when inherited from the father.

Genetic research in mice often involves “knocking out” or switching off a particular gene to explore the effects. However, mouse genetics aren’t a perfect approximation of human genetics. Obviously, doing this in humans presents all sorts of ethical problems, but a population such as Iceland provides the perfect natural laboratory to explore how knockouts affect human health.

The data showed that eight percent of people in Iceland have the equivalent of a knockout, one gene that isn’t working. This provides an opportunity to look at the data in a different way: rather than only looking for people with a particular diagnosis and finding out what they have in common genetically, the researchers can look for people who have genetic knockouts, and then examine their medical records to see how their missing genes affect their health. It’s then possible to start piecing together the story of how certain genes affect physiology.
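
This knockout-first approach is, at heart, a reverse lookup across two datasets: start from the people carrying a disabled gene, then pull their medical records and look for patterns. As a rough illustration only, here is a minimal Python sketch of that join; the tables, values and column names below are hypothetical stand-ins and bear no relation to deCODE’s actual data or schema.

import pandas as pd

# Hypothetical table: which participant carries a knockout in which gene.
knockouts = pd.DataFrame({
    "participant_id": [1, 2, 3, 4],
    "gene": ["MYL4", "ABCA7", "MYL4", "GHR"],
})

# Hypothetical medical records: one row per participant per diagnosis.
records = pd.DataFrame({
    "participant_id": [1, 2, 3, 5],
    "diagnosis": ["atrial fibrillation", "Alzheimer's disease",
                  "atrial fibrillation", "type 2 diabetes"],
})

# Reverse lookup: join knockout carriers to their diagnoses, then count
# how often each diagnosis turns up per knocked-out gene.
carriers = knockouts.merge(records, on="participant_id", how="left")
counts = (carriers.groupby(["gene", "diagnosis"])
                  .size()
                  .reset_index(name="n_carriers"))
print(counts)

The point is the direction of the query: instead of asking what Alzheimer’s patients share genetically, you ask what happens to the people who are missing gene X, and let the medical records supply the answer.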

Finally, the researchers used the data to explore human history, using Y chromosome data from 753 Icelandic males. Based on knowledge about mutation rates, Y chromosomes can be used to trace the male lineage of human groups, establishing dates of events like migrations. This technique has also been used to work out when the common ancestor of all humans was alive. The maternal ancestor, known as “Mitochondrial Eve,” is thought to have lived 170,000 to 180,000 years ago, while the paternal ancestor had previously been estimated to have lived around 338,000 years ago.

The Icelandic data allowed the researchers to calculate what they suggest is a more accurate mutation rate, placing the father of all humans at around 239,000 years ago. This is the estimate with the greatest likelihood, but the full range falls between 174,000 and 321,000 years ago. This estimate places the paternal ancestor closer in time to the maternal ancestor.
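
The arithmetic behind such dating is simple enough to sketch. Two male lineages that split from a common paternal ancestor T years ago each accumulate mutations independently, so the expected number of differences between them is roughly 2 × (rate per site per year) × (sites compared) × T, which can be inverted to estimate T. The numbers below are invented purely to illustrate the shape of the calculation; they are not the figures from the deCODE papers.

def tmrca_years(observed_differences, sites_compared, rate_per_site_per_year):
    # Time to the most recent common ancestor, assuming mutations accumulate
    # independently along both diverging lineages.
    return observed_differences / (2 * rate_per_site_per_year * sites_compared)

# Hypothetical inputs: 10 million comparable Y-chromosome sites, a rate of
# 3e-10 mutations per site per year, and 1,400 observed differences.
estimate = tmrca_years(observed_differences=1_400,
                       sites_compared=10_000_000,
                       rate_per_site_per_year=3e-10)
print(f"Common paternal ancestor roughly {estimate:,.0f} years ago")

Refine the mutation rate, as the Icelandic data allowed, and the date shifts with it; that is exactly how the estimate moved from around 338,000 years to around 239,000.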

Read the entire story here.

Image: Gígjökull, an outlet glacier extending from Eyjafjallajökull, Iceland. Courtesy of Andreas Tille / Wikipedia.

Women Are From Venus, Men Can’t Remember

Yet another body of research underscores how different women are from men. This time, we are told that the sexes generally encode and recall memories differently. So, the next time you take issue with a spouse (of a different gender) about a — typically trivial — past event, keep in mind that your own actions, mood and gender will affect your recall. If you’re female, your memories may be much more vivid than your male counterpart’s, but not necessarily more accurate. If you (male) won last night’s argument, your spouse (female) will — unfortunately for you — remember it more accurately than you do, which of course will lead to another argument.

From WSJ:

Carrie Aulenbacher remembers the conversation clearly: Her husband told her he wanted to buy an arcade machine he found on eBay. He said he’d been saving up for it as a birthday present to himself. The spouses sat at the kitchen table and discussed where it would go in the den.

Two weeks later, Ms. Aulenbacher came home from work and found two arcade machines in the garage—and her husband beaming with pride.

“What are these?” she demanded.

“I told you I was picking them up today,” he replied.

She asked him why he’d bought two. He said he’d told her he was getting “a package deal.” She reminded him they’d measured the den for just one. He stood his ground.

“I believe I told her there was a chance I was going to get two,” says Joe Aulenbacher, who is 37 and lives in Erie, Pa.

“It still gets me going to think about it a year later,” says Ms. Aulenbacher, 36. “My home is now overrun with two machines I never agreed upon.” The couple compromised by putting one game in the den and the other in Mr. Aulenbacher’s weight room.

It is striking how many arguments in a relationship start with two different versions of an event: “Your tone of voice was rude.” “No it wasn’t.” “You didn’t say you’d be working late.” “Yes I did.” “I told you we were having dinner with my mother tonight.” “No, honey. You didn’t.”

How can two people have different memories of the same event? It starts with the way each person perceives the event in the first place—and how they encoded that memory. “You may recall something differently at least in part because you understood it differently at the time,” says Dr. Michael Ross, professor emeritus in the psychology department at the University of Waterloo in Ontario, Canada, who has studied memory for many years.

Researchers know that spouses sometimes can’t even agree on concrete events that happened in the past 24 hours—such as whether they had an argument or whether one received a gift from the other. A study in the early 1980s, published in the journal “Behavioral Assessment,” found that couples couldn’t perfectly agree on whether they had sex the previous night.

Women tend to remember more about relationship issues than men do. When husbands and wives are asked to recall concrete relationship events, such as their first date, an argument or a recent vacation, women’s memories are more vivid and detailed.

But not necessarily more accurate. When given a standard memory test where they are shown names or pictures and then asked to recall them, women do just about the same as men.

Researchers have found that women report having more emotions during relationship events than men do. They may remember events better because they pay more attention to the relationship and reminisce more about it.

People also remember their own actions better. So they can recall what they did, just not what their spouse did. Researchers call this an egocentric bias, and study it by asking people to recall their contributions to events, as well as their spouse’s. Who cleans the kitchen more? Who started the argument? Whether the event is positive or negative, people tend to believe that they had more responsibility.

Your mood—both when an event happens and when you recall it later—plays a big part in memory, experts say. If you are in a positive mood or feeling positive about the other person, you will more likely recall a positive experience or give a positive interpretation to a negative experience. Similarly, negative moods tend to reap negative memories.

Negative moods may also cause stronger memories. A person who lost an argument remembers it more clearly than the person who won it, says Dr. Ross. Men tend to win more arguments, he says, which may help to explain why women remember the spat more. But men who lost an argument remember it as well as women who lost.

Read the entire article here.

Heads in the Rising Tide

King-Knut

Officials from the state of Florida seem to have their heads in the sand (and other places); sand that is likely to be swept from their very own Florida shores as sea levels rise. However, surely climate change could be an eventual positive for Florida: think warmer climate and huge urban swathes underwater — a great new Floridian theme park! But, remember, don’t talk about it. I suppose officials will soon be looking for a contemporary version of King Canute to help them out of this watery pickle.

From Wired:

The oceans are slowly overtaking Florida. Ancient reefs of mollusk and coral off the present-day coasts are dying. Annual extremes in hot and cold, wet and dry, are becoming more pronounced. Women and men of science have investigated, and a great majority agree upon a culprit. In the outside world, this culprit has a name, but within the borders of Florida, it does not. According to a  Miami Herald investigation, the state Department of Environmental Protection has since 2010 had an unwritten policy prohibiting the use of some well-understood phrases for the meteorological phenomena slowly drowning America’s weirdest-shaped state. It’s … that thing where burning too much fossil fuel puts certain molecules into a certain atmosphere, disrupting a certain planetary ecosystem. You know what we’re talking about. We know you know. They know we know you know. But are we allowed to talk about … you know? No. Not in Florida. It must not be spoken of. Ever.

Unless … you could, maybe, type around it? It’s worth a shot.

The cyclone slowdown

It has been nine years since Florida was hit by a proper hurricane. Could that be a coincidence? Sure. Or it could be because of … something. A nameless, voiceless something. A feeling, like a pricking-of-thumbs, this confluence-of-chemistry-and-atmospheric-energy-over-time. If so, this anonymous dreadfulness would, scientists say, lead to a drier middle layer of atmosphere over the ocean. Because water vapor stores energy, this dry air will suffocate all but the most energetic baby storms. “So the general thinking is that as [redacted] levels increase, it ultimately won’t have an effect on the number of storms,” says Jim Kossin, a scientist who studies, oh, how about “things-that-happen-in-the-atmosphere-over-long-time-periods” at the National Centers for Environmental Information. “However, there is a lot of evidence that if a storm does form, it has a chance of getting very strong.”

Storms darken the sky

Hurricanes are powered by energy in the sea. And as cold and warm currents thread around the globe, storms go through natural, decades-long cycles of high-to-low intensity. “There is a natural 40-to-60-year oscillation in what sea surface temperatures are doing, and this is driven by ocean-wide currents that move on very slow time scales,” says Kossin, who has authored reports for the Intergovernmental Panel on, well, let’s just call it Chemical-and-Thermodynamic-Alterations-to-Long-Term-Atmospheric-Conditions. But in recent years, storms have become stronger than that natural cycle would otherwise predict. Kossin says that many in his field agree that while the natural churning of the ocean is behind this increasing intensity, other forces are at work. Darker, more sinister forces, like thermodynamics. Possibly even chemistry. No one knows for sure. Anyway, storms are getting less frequent, but stronger. It’s an eldritch tale of unspeakable horror, maybe.

Read the entire article here.

Image: King Knut (or Cnut or Canute) the Great, illustrated in a medieval manuscript. Courtesy of Der Spiegel Geschichte.

The Big Crunch

cmb

It may just be possible that prophetic doomsayers have been right all along. The end is coming… well, in a few tens of billions of years. A group of physicists proposes that the cosmos will soon begin collapsing in on itself. Keep in mind that “soon” in cosmological terms runs into the billions of years. So, it does appear that we still have some time to crunch down our breakfast cereal a few more times before the ultimate universal apocalypse. Clearly this may not please those who seek the end of days within their lifetimes, and for rather different — scientific — reasons, cosmologists seem to be unhappy too.

From Phys:

Physicists have proposed a mechanism for “cosmological collapse” that predicts that the universe will soon stop expanding and collapse in on itself, obliterating all matter as we know it. Their calculations suggest that the collapse is “imminent”—on the order of a few tens of billions of years or so—which may not keep most people up at night, but for the physicists it’s still much too soon.

In a paper published in Physical Review Letters, physicists Nemanja Kaloper at the University of California, Davis; and Antonio Padilla at the University of Nottingham have proposed the cosmological collapse mechanism and analyzed its implications, which include an explanation of dark energy.

“The fact that we are seeing dark energy now could be taken as an indication of impending doom, and we are trying to look at the data to put some figures on the end date,” Padilla told Phys.org. “Early indications suggest the collapse will kick in in a few tens of billions of years, but we have yet to properly verify this.”

The main point of the paper is not so much when exactly the universe will end, but that the mechanism may help resolve some of the unanswered questions in physics. In particular, why is the universe expanding at an accelerating rate, and what is the dark energy causing this acceleration? These questions are related to the cosmological constant problem, which is that the predicted vacuum energy density of the universe causing the expansion is much larger than what is observed.

“I think we have opened up a brand new approach to what some have described as ‘the mother of all physics problems,’ namely the cosmological constant problem,” Padilla said. “It’s way too early to say if it will stand the test of time, but so far it has stood up to scrutiny, and it does seem to address the issue of vacuum energy contributions from the standard model, and how they gravitate.”

The collapse mechanism builds on the physicists’ previous research on vacuum energy sequestering, which they proposed to address the cosmological constant problem. The dynamics of vacuum energy sequestering predict that the universe will collapse, but don’t provide a specific mechanism for how collapse will occur.

According to the new mechanism, the universe originated under a set of specific initial conditions so that it naturally evolved to its present state of acceleration and will continue on a path toward collapse. In this scenario, once the collapse trigger begins to dominate, it does so in a period of “slow roll” that brings about the accelerated expansion we see today. Eventually the universe will stop expanding and reach a turnaround point at which it begins to shrink, culminating in a “big crunch.”

Read the entire article here.

Image: The Cosmic Microwave Background (CMB) from nine years of WMAP data. The image reveals 13.77-billion-year-old temperature fluctuations (shown as color differences) that correspond to the seeds that grew to become the galaxies. Courtesy of NASA.

PowerPoint Karaoke Olympics

PPT-karaoke

It may not be beyond the realm of fantasy to imagine a day in the not-too-distant future when PowerPoint karaoke features as an Olympic sport. Ugh!

Without a doubt, karaoke has set human culture back at least a thousand years (thanks, Japan). And PowerPoint has singlehandedly dealt killer blows to creativity, deep thought and literary progress (thanks, Microsoft). Surely, combining these two banes of modern society into a competitive event is the stuff of true horror. But this hasn’t stopped the activity from becoming a burgeoning improv phenomenon for corporate hacks — validating the trend in which humans continue making fools of themselves. After all, it must be big — and there’s probably money in it — if the WSJ is reporting on it.

Nonetheless,

  • Count
  • me
  • out!

From the WSJ:

On a sunny Friday afternoon earlier this month, about 100 employees of Adobe Systems Inc. filed expectantly into an auditorium to watch PowerPoint presentations.

“I am really thrilled to be here today,” began Kimberley Chambers, a 37-year-old communications manager for the software company, as she nervously clutched a microphone. “I want to talk you through…my experience with whales, in both my personal and professional life.”

Co-workers giggled. Ms. Chambers glanced behind her, where a PowerPoint slide displayed four ink sketches of bare-chested male torsos, each with a distinct pattern of chest hair. The giggles became guffaws. “What you might not know,” she continued, “is that whales can be uniquely identified by a number of different characteristics, not the least of which is body hair.”

Ms. Chambers, sporting a black blazer and her employee ID badge, hadn’t seen this slide in advance, nor the five others that popped up as she clicked her remote control. To accompany the slides, she gave a nine-minute impromptu talk about whales, a topic she was handed 30 seconds earlier.

Forums like this at Adobe, called “PowerPoint karaoke” or “battle decks,” are cropping up as a way for office workers of the world to mock an oppressor, the ubiquitous PowerPoint presentation. The mix of improvised comedy and corporate-culture takedown is based on a simple notion: Many PowerPoint presentations are unintentional parody already, so why not go all the way?

Library associations in Texas and California held PowerPoint karaoke sessions at their annual conferences. At a Wal-Mart Stores Inc. event last year, workers gave fake talks based on real slides from a meatpacking supplier. Twitter Inc. Chief Executive Dick Costolo, armed with his training from comedy troupe Second City, has faced off with employees at “battle decks” contests during company meetings.

One veteran corporate satirist gives these events a thumbs up. “Riffing off of PowerPoints without knowing what your next slide is going to be? The humorist in me says it’s kinda brilliant,” said “Dilbert” cartoonist Scott Adams, who has spent 26 years training his jaundiced eye on office work. “I assume this game requires drinking?” he asked. (Drinking is technically not required, but it is common.)

Mr. Adams, who worked for years at a bank and at a telephone company, said PowerPoint is popular because it offers a rare dose of autonomy in cubicle culture. But it often bores, because creators lose sight of their mission. “If you just look at a page and drag things around and play with fonts, you think you’re a genius and you’re in full control of your world,” he said.

At a February PowerPoint karaoke show in San Francisco, contestants were given pairings of topics and slides ranging from a self-help seminar for people who abuse Amazon Prime, with slides including a dog balancing a stack of pancakes on its nose, to a sermon on “Fifty Shades of Grey,” with slides including a pyramid dotted with blocks of numbers. Another had to explain the dating app Tinder to aliens invading the Earth, accompanied by a slide of old floppy disk drives, among other things.

Read and sing along to the entire article here.

Circadian Misalignment and Your Smartphone

Google-search-smartphone-night

You take your portable electronics everywhere, all the time. You watch TV with or on your smartphone. You eat with a fork in one hand and your smartphone in the other. In fact, you probably wish you had two pairs of arms so you could eat, drink and use your smartphone and laptop at the same time. You use your smartphone in your car — hopefully, and sensibly, not while driving. You read texts on your smartphone while in the restroom. You use it at the movie theater and at the theater (much to the dismay of stage actors). It’s with you at the restaurant, on the bus or metro, in the aircraft, in the bath (despite the risk of electric shock). You check your smartphone first thing in the morning and last thing before going to sleep. And, if your home or work life demands it, you will check it periodically throughout the night.

Let’s leave aside for now the growing body of anecdotal and formal evidence that smartphones are damaging your physical wellbeing: finger, hand and wrist problems (from texting), and neck and posture problems (from constantly bending over your small screen). Now there is evidence that constant use, especially at night, is damaging your mental wellbeing and increasing the likelihood of additional, chronic physical ailments. It appears that the light from our constant electronic companions is not healthy, particularly as it disrupts our regular rhythm of sleep.

From Wired:

For more than 3 billion years, life on Earth was governed by the cyclical light of sun, moon and stars. Then along came electric light, turning night into day at the flick of a switch. Our bodies and brains may not have been ready.

A fast-growing body of research has linked artificial light exposure to disruptions in circadian rhythms, the light-triggered releases of hormones that regulate bodily function. Circadian disruption has in turn been linked to a host of health problems, from cancer to diabetes, obesity and depression. “Everything changed with electricity. Now we can have bright light in the middle of night. And that changes our circadian physiology almost immediately,” says Richard Stevens, a cancer epidemiologist at the University of Connecticut. “What we don’t know, and what so many people are interested in, are the effects of having that light chronically.”

Stevens, one of the field’s most prominent researchers, reviews the literature on light exposure and human health in the latest Philosophical Transactions of the Royal Society B. The new article comes nearly two decades after Stevens first sounded the alarm about light exposure possibly causing harm; writing in 1996, he said the evidence was “sparse but provocative.” Since then, nighttime light has become even more ubiquitous: an estimated 95 percent of Americans regularly use screens shortly before going to sleep, and incandescent bulbs have been mostly replaced by LED and compact fluorescent lights that emit light in potentially more problematic wavelengths. Meanwhile, the scientific evidence is still provocative, but no longer sparse.

As Stevens says in the new article, researchers now know that increased nighttime light exposure tracks with increased rates of breast cancer, obesity and depression. Correlation isn’t causation, of course, and it’s easy to imagine all the ways researchers might mistake those findings. The easy availability of electric lighting almost certainly tracks with various disease-causing factors: bad diets, sedentary lifestyles, exposure to the array of chemicals that come along with modernity. Oil refineries and aluminum smelters, to be hyperbolic, also blaze with light at night.

Yet biology at least supports some of the correlations. The circadian system synchronizes physiological function—from digestion to body temperature, cell repair and immune system activity—with a 24-hour cycle of light and dark. Even photosynthetic bacteria thought to resemble Earth’s earliest life forms have circadian rhythms. Despite its ubiquity, though, scientists discovered only in the last decade what triggers circadian activity in mammals: specialized cells in the retina, the light-sensing part of the eye, which, rather than conveying visual detail from eye to brain, simply signal the presence or absence of light. Activity in these cells sets off a reaction that calibrates clocks in every cell and tissue in a body. Now, these cells are especially sensitive to blue wavelengths—like those in a daytime sky.

But artificial lights, particularly LCDs, some LEDs, and fluorescent bulbs, also favor the blue side of the spectrum. So even a brief exposure to dim artificial light can trick a night-subdued circadian system into behaving as though day has arrived. Circadian disruption in turn produces a wealth of downstream effects, including dysregulation of key hormones. “Circadian rhythm is being tied to so many important functions,” says Joseph Takahashi, a neurobiologist at the University of Texas Southwestern. “We’re just beginning to discover all the molecular pathways that this gene network regulates. It’s not just the sleep-wake cycle. There are system-wide, drastic changes.” His lab has found that tweaking a key circadian clock gene in mice gives them diabetes. And a tour-de-force 2009 study put human volunteers on a 28-hour day-night cycle, then measured what happened to their endocrine, metabolic and cardiovascular systems.

Crucially, that experiment investigated circadian disruption induced by sleep alteration rather than light exposure, which is also the case with the many studies linking clock-scrambling shift work to health problems. Whether artificial light is as problematic as disturbed sleep patterns remains unknown, but Stevens thinks that some and perhaps much of what’s now assumed to result from sleep issues is actually a function of light. “You can wake up in the middle of the night and your melatonin levels don’t change,” he says. “But if you turn on a light, melatonin starts falling immediately. We need darkness.” According to Stevens, most people live in a sort of “circadian fog.”

Read the entire article here.

Image courtesy of Google Search.

3D Printing Magic

[tube]UpH1zhUQY0c[/tube]

If you’ve visited this blog before, you know I’m a great fan of 3D printing, though some uses, such as printing 3D selfies, seem dubious at best. So when Carbon3D unveiled its fundamentally different, and better, approach to 3D printing, I was intrigued. The company uses an approach called continuous liquid interface production (CLIP), which seems to construct objects from a magical ooze. Check out the video — you’ll be enthralled. The future is here.

Learn more about Carbon3D here.

From Wired:

Even if you have little interest in 3-D printing, you’re likely to find Carbon3D’s Continuous Liquid Interface Production (CLIP) technology fascinating. Rather than the time-intensive printing of a 3-D object layer by layer like most printers, Carbon3D’s technique works 25 to 100 times faster than what you may have seen before, and looks a bit like Terminator 2‘s liquid metal T-1000 in the process.

CLIP creations grow out of a pool of UV-sensitive resin in a process that’s similar to the way laser 3-D printers work, but at a much faster pace. Instead of the laser used in conventional 3-D printers, CLIP uses an ultraviolet projector on the underside of a resin tray to project an image for how each layer should form. Light shines through an oxygen-permeable window onto the resin, which hardens it. Areas of resin that are exposed to oxygen don’t harden, while those that are cut off form the 3-D printed shape.

In practice, all that physics translates to unprecedented 3-D printing speed. At this week’s TED Conference in Vancouver, Carbon3D CEO and co-founder Dr. Joseph DeSimone demonstrated the printer onstage with a bit of theatrical underselling, wagering that his creation could produce in 10 minutes a geometric ball shape that would take a regular 3-D printer up to 10 hours. The CLIP process churned out the design in a little under 7 minutes.

Read the entire story here.

Video courtesy of Carbon3D.

We Are All Always Right, All of the Time

You already know this: you believe that your opinion is correct all the time, about everything. And, interestingly enough, your friends and neighbors believe that they are always right too. Oh, and the colleague at the office with whom you argue all the time — she’s right all the time too.

How can this be, when in an increasingly science-driven, objective universe facts trump opinion? Well, not so fast. It seems that we humans have an internal mechanism that colors our views based on a need for acceptance within a broader group. That is, we generally tend to spin our rational views in favor of group consensus, versus supporting the views of a subject matter expert, which might polarize the group. This is both good and bad. Good because it reinforces the broader benefits of being within a group; bad because we are more likely to reject opinion, evidence and fact from experts outside of our group — think climate change.

From the Washington Post:

It’s both the coolest — and also in some ways the most depressing — psychology study ever.

Indeed, it’s so cool (and so depressing) that the name of its chief finding — the Dunning-Kruger effect — has at least halfway filtered into public consciousness. In the classic 1999 paper, Cornell researchers David Dunning and Justin Kruger found that the less competent people were in three domains — humor, logic, and grammar — the less likely they were to be able to recognize that. Or as the researchers put it:

We propose that those with limited knowledge in a domain suffer from a dual burden: Not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it.

Dunning and Kruger didn’t directly apply this insight to our debates about science. But I would argue that the effect named after them certainly helps to explain phenomena like vaccine denial, in which medical authorities have voiced a very strong opinion, but some parents just keep on thinking that, somehow, they’re in a position to challenge or ignore this view.

So why do I bring this classic study up now?

The reason is that an important successor to the Dunning-Kruger paper has just come out — and it, too, is pretty depressing (at least for those of us who believe that domain expertise is a thing to be respected and, indeed, treasured). This time around, psychologists have not uncovered an endless spiral of incompetence and the inability to perceive it. Rather, they’ve shown that people have an “equality bias” when it comes to competence or expertise, such that even when it’s very clear that one person in a group is more skilled, expert, or competent (and the other less), they are nonetheless inclined to seek out a middle ground in determining how correct different viewpoints are.

Yes, that’s right — we’re all right, nobody’s wrong, and nobody gets hurt feelings.

The new study, just published in the Proceedings of the National Academy of Sciences, is by Ali Mahmoodi of the University of Tehran and a long list of colleagues from universities in the UK, Germany, China, Denmark, and the United States. And no wonder: The research was transnational, and the same experiment — with the same basic results — was carried out across cultures in China, Denmark, and Iran.

Read the entire story here.

Hyper-Parenting and Couch Potato Kids

Google-search-kids-playing

Parents who are overly engaged in micro-managing the academic, athletic and social lives of their kids may be responsible for ensuring their offspring lead less active lives. A new research study finds children of so-called hyper-parents are significantly less active than peers with less involved parents. Hyper-parenting seems to come in four flavors: helicopter parents, who hover over their child’s every move; tiger moms, who constantly push for superior academic attainment; little-emperor parents, who constantly bestow material things on their kids; and concerted-cultivation parents, who over-schedule their kids with never-ending after-school activities. If you recognize yourself in one of these parenting styles, take a deep breath, think back on when, as a 7-12 year-old, you had the most fun, and let your kids play outside — preferably in the rain and mud!

From the WSJ / Preventive Medicine:

Hyper-parenting may increase the risk of physical inactivity in children, a study in the April issue of Preventive Medicine suggests.

Children with parents who tended to be overly involved in their academic, athletic and social lives—a child-rearing style known as hyper-parenting—spent less time outdoors, played fewer after-school sports and were less likely to bike or walk to school, friends’ homes, parks and playgrounds than children with less-involved parents.

Hyperparenting, although it’s intended to benefit children by giving them extra time and attention, could have adverse consequences for their health, the researchers said.

The study, at Queen’s University in Ontario, surveyed 724 parents of children, ages 7 to 12 years old, born in the U.S. and Canada from 2002 to 2007. (The survey was based on parents’ interaction with the oldest child.)

Questionnaires assessed four hyper-parenting styles: helicopter or overprotective parents; little-emperor parents who shower children with material goods; so-called tiger moms who push for exceptional achievement; and parents who schedule excessive extracurricular activities, termed concerted cultivation. Hyperparenting was ranked in five categories from low to high based on average scores in the four styles.

Children’s preferred play location was their yard at home, and 64% of the children played there at least three times a week. Only 12% played on streets and cul-de-sacs away from home. Just over a quarter walked or cycled to school or friends’ homes, and slightly fewer to parks and playgrounds. Organized sports participation was 26%.

Of parents, about 40% had high hyper-parenting scores and 6% had low scores. The most active children had parents with low to below-average scores in all four hyper-parenting styles, while the least active had parents with average-to-high hyper-parenting scores. The difference between children in the low and high hyper-parenting groups was equivalent to about 20 physical-activity sessions a week, the researchers said.

Read the entire story here.

Image courtesy of Google Search.

Humor Versus Horror

[tube]wMP5VCVIekQ[/tube]

Faced with unspeakable horror, many of us usually turn away. Some courageous souls turn to humor to counter the vileness of others. So it is heartwarming to see comedians and satirists taking up rhetorical arms in the backyards of murderers and terrorists. Fighting violence and terror with more of the same may show progress in the short term, but ridiculing our enemies with humor and thoughtful dialogue is the only long-term way to fight evil in its many human forms. A profound thank you to these four brave Syrian refugees who, in the face of much personal danger, are able to laugh at their foes.

From the Guardian:

They don’t have much to laugh about. But four young Syrian refugees from Aleppo believe humour may be the only antidote to the horrors taking place back home.

Settled in a makeshift studio in the Turkish city of Gaziantep 40 miles from the Syrian border, the film-makers decided ridicule was an effective way of responding to Islamic State and its grisly record of extreme violence.

“The entire world seems to be terrified of Isis, so we want to laugh at them, expose their hypocrisy and show that their interpretation of Islam does not represent the overwhelming majority of Muslims,” says Maen Watfe, 27. “The media, especially the western media, obsessively reproduce Isis propaganda portraying them as strong and intimidating. We want to show their weaknesses.”

The films and videos on Watfe and his three friends’ website mock the Islamist extremists and depict them as naive simpletons, hypocritical zealots and brutal thugs. It’s a high-risk undertaking. They have had to move house and keep their addresses secret from even their best friends after receiving death threats.

But the video activists – Watfe, Youssef Helali, Mohammed Damlakhy and Aya Brown – will not be deterred.

Their film The Prince shows Isis leader and self-appointed caliph Abu Bakr al-Baghdadi drinking wine, listening to pop music and exchanging selfies with girls on his smartphone. A Moroccan jihadi arrives saying he came to Syria to “liberate Jerusalem”. The leader swaps the wine for milk and switches the music to Islamic chants praising martyrdom. Then he hands the Moroccan a suicide belt and sends him off against a unit of Free Syrian Army fighters. The grenades detonate, and Baghdadi reaches for his glass of wine and turns the pop music back on.

It is pieces like this that have brought hate mail and threats via social media.

“One of them said that they would finish us off like they finished off Charlie [Hebdo],” Brown, 26, recalls. She declined to give her real name out of fear for her family, who still live in Aleppo. “In the end we decided to move from our old apartment.”

The Turkish landlord told them Arabic-speaking men had repeatedly asked for their whereabouts after they left, and kept the studio under surveillance.

Follow the story here.

Video: Happy Valentine. Courtesy of Dayaaltaseh Productions.

Household Chores for Kids Are Good

Google-kid-chores

Apparently household chores are becoming rather yesterday. Several recent surveys — no doubt commissioned by my children — show that shared duties in the home are a dying phenomenon. No, I hear you cry. Not only do chores provide a necessary respite from the otherwise 24/7 videogame-and-texting addiction, they help establish a sense of responsibility and reinforce our increasingly imperiled altruistic tendencies. So, parents, get out the duster, vacuum, fresh sheets and laundry basket, and put those (little) people to work before it’s too late. But first of all, let’s rename “chores” as “responsibilities.”

From WSJ:

Today’s demands for measurable childhood success—from the Common Core to college placement—have chased household chores from the to-do lists of many young people. In a survey of 1,001 U.S. adults released last fall by Braun Research, 82% reported having regular chores growing up, but only 28% said that they require their own children to do them. With students under pressure to learn Mandarin, run the chess club or get a varsity letter, chores have fallen victim to the imperatives of resume-building—though it is hardly clear that such activities are a better use of their time.

“Parents today want their kids spending time on things that can bring them success, but ironically, we’ve stopped doing one thing that’s actually been a proven predictor of success—and that’s household chores,” says Richard Rende, a developmental psychologist in Paradise Valley, Ariz., and co-author of the forthcoming book “Raising Can-Do Kids.” Decades of studies show the benefits of chores—academically, emotionally and even professionally.

Giving children household chores at an early age helps to build a lasting sense of mastery, responsibility and self-reliance, according to research by Marty Rossmann, professor emeritus at the University of Minnesota. In 2002, Dr. Rossmann analyzed data from a longitudinal study that followed 84 children across four periods in their lives—in preschool, around ages 10 and 15, and in their mid-20s. She found that young adults who began chores at ages 3 and 4 were more likely to have good relationships with family and friends, to achieve academic and early career success and to be self-sufficient, as compared with those who didn’t have chores or who started them as teens.

Chores also teach children how to be empathetic and responsive to others’ needs, notes psychologist Richard Weissbourd of the Harvard Graduate School of Education. In research published last year, he and his team surveyed 10,000 middle- and high-school students and asked them to rank what they valued more: achievement, happiness or caring for others.

Almost 80% chose either achievement or happiness over caring for others. As he points out, however, research suggests that personal happiness comes most reliably not from high achievement but from strong relationships. “We’re out of balance,” says Dr. Weissbourd. A good way to start readjusting priorities, he suggests, is by learning to be kind and helpful at home.

Read the entire story here.

Image courtesy of Google Search.

The Damned Embuggerance

Google-search-terry-pratchett-books

Sadly, genre-busting author Sir Terry Pratchett succumbed to DEATH on March 12, 2015. Luckily, for those of us still fending off the clutches of Reaper Man, we have seventy-plus works of his to keep us company in the darkness.

So now that our world contains a little less magic, it’s important to remind ourselves of a few choice words of his:

A man is not truly dead while his name is still spoken.

Stories of imagination tend to upset those without one.

It’s not worth doing something unless someone, somewhere, would much rather you weren’t doing it.

The truth may be out there, but the lies are inside your head.

Goodness is about what you do. Not who you pray to.

From the Guardian:

Neil Gaiman led tributes from the literary, entertainment and fantasy worlds to Terry Pratchett after the author’s death on Thursday, aged 66.

The author of the Discworld novels, which sold in the tens of millions worldwide, had been afflicted with a rare form of early-onset Alzheimer’s disease.

Gaiman, who collaborated with Pratchett on the huge hit Good Omens, tweeted: “I will miss you, Terry, so much,” pointing to “the last thing I wrote about you”, on the Guardian.

“Terry Pratchett is not a jolly old elf at all,” wrote Gaiman last September. “Not even close. He’s so much more than that. As Terry walks into the darkness much too soon, I find myself raging too: at the injustice that deprives us of – what? Another 20 or 30 books? Another shelf-full of ideas and glorious phrases and old friends and new, of stories in which people do what they really do best, which is use their heads to get themselves out of the trouble they got into by not thinking? … I rage at the imminent loss of my friend. And I think, ‘What would Terry do with this anger?’ Then I pick up my pen, and I start to write.”

Appealing to readers to donate to Alzheimer’s research, Gaiman added on his blog: “Thirty years and a month ago, a beginning author met a young journalist in a Chinese Restaurant, and the two men became friends, and they wrote a book, and they managed to stay friends despite everything. Last night, the author died.

“There was nobody like him. I was fortunate to have written a book with him, when we were younger, which taught me so much.

“I knew his death was coming and it made it no easier.”

Read the entire article here.

Image courtesy of Google Search.

The Internet 0f Th1ngs

Google-search-IoT

Technologist Marc Goodman describes a not-too-distant future in which all our appliances, tools, products… anything and everything is plugged into the so-called Internet of Things (IoT). The IoT describes a world where all things are connected to everything else, making for a global mesh of intelligent devices, from your connected car and your WiFi-enabled sneakers to your smartwatch and home thermostat. You may well believe it advantageous to have your refrigerator ping the local grocery store when it runs out of fresh eggs and milk, or to have your toilet auto-call a local plumber when it gets stopped up.

But, as our current Internet shows us — let’s call it the Internet of People — not all is rosy in this hyper-connected, 24/7, always-on digital ocean. What are you to do when hackers attack all your home appliances in a “denial of home service attack (DohS)”, or when your every move inside your home is scrutinized, collected, analyzed and sold to the nearest advertiser, or when your cooktop starts taking and sharing selfies with the neighbors?

Goodman’s new book on this important subject, excerpted here, is titled Future Crimes.

From the Guardian:

If we think of today’s internet metaphorically as about the size of a golf ball, tomorrow’s will be the size of the sun. Within the coming years, not only will every computer, phone and tablet be online, but so too will every car, house, dog, bridge, tunnel, cup, clock, watch, pacemaker, cow, streetlight, pipeline, toy and soda can. Though in 2013 there were only 13bn online devices, Cisco Systems has estimated that by 2020 there will be 50bn things connected to the internet, with room for exponential growth thereafter. As all of these devices come online and begin sharing data, they will bring with them massive improvements in logistics, employee efficiency, energy consumption, customer service and personal productivity.

This is the promise of the internet of things (IoT), a rapidly emerging new paradigm of computing that, when it takes off, may very well change the world we live in forever.

The Pew Research Center defines the internet of things as “a global, immersive, invisible, ambient networked computing environment built through the continued proliferation of smart sensors, cameras, software, databases, and massive data centres in a world-spanning information fabric”. Back in 1999, when the term was first coined by MIT researcher Kevin Ashton, the technology did not exist to make the IoT a reality outside very controlled environments, such as factory warehouses. Today we have low-powered, ultra-cheap computer chips, some as small as the head of a pin, that can be embedded in an infinite number of devices, some for mere pennies. These miniature computing devices only need milliwatts of electricity and can run for years on a minuscule battery or small solar cell. As a result, it is now possible to make a web server that fits on a fingertip for $1.

The microchips will receive data from a near-infinite range of sensors, minute devices capable of monitoring anything that can possibly be measured and recorded, including temperature, power, location, hydro-flow, radiation, atmospheric pressure, acceleration, altitude, sound and video. They will activate miniature switches, valves, servos, turbines and engines – and speak to the world using high-speed wireless data networks. They will communicate not only with the broader internet but with each other, generating unfathomable amounts of data. The result will be an always-on “global, immersive, invisible, ambient networked computing environment”, a mere prelude to the tidal wave of change coming next.

In the future all objects may be smart

The broad thrust sounds rosy. Because chips and sensors will be embedded in everyday objects, we will have much better information and convenience in our lives. Because your alarm clock is connected to the internet, it will be able to access and read your calendar. It will know where and when your first appointment of the day is and be able to cross-reference that information against the latest traffic conditions. Light traffic, you get to sleep an extra 10 minutes; heavy traffic, and you might find yourself waking up earlier than you had hoped.
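That cross-referencing is easy to picture in code. Below is a minimal, hypothetical sketch of the rule just described, working backwards from the first appointment through a commute estimate to a wake-up time; first_appointment_today and estimated_commute are invented stand-ins, not calls to any real calendar or traffic service.

```python
from datetime import datetime, timedelta

# Hypothetical stand-ins for whatever calendar and traffic feeds a vendor would wire in.
def first_appointment_today() -> datetime:
    """Pretend calendar lookup: first meeting at 9:00 this morning."""
    return datetime.now().replace(hour=9, minute=0, second=0, microsecond=0)

def estimated_commute() -> timedelta:
    """Pretend traffic lookup: current door-to-door estimate."""
    return timedelta(minutes=35)

def wake_time(routine: timedelta = timedelta(minutes=45),
              slack: timedelta = timedelta(minutes=10)) -> datetime:
    """Work backwards from the first appointment: commute + morning routine + slack."""
    return first_appointment_today() - estimated_commute() - routine - slack

print("Alarm set for", wake_time().strftime("%H:%M"))
# A lighter commute estimate pushes the alarm later; a heavier one pulls it
# earlier, which is exactly the behaviour described above.
```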

When your alarm does go off, it will gently raise the lights in the house, perhaps turn up the heat or run your bath. The electronic pet door will open to let Fido into the backyard for his morning visit, and the coffeemaker will begin brewing your coffee. You won’t have to ask your kids if they’ve brushed their teeth; the chip in their toothbrush will send a message to your smartphone letting you know the task is done. As you walk out the door, you won’t have to worry about finding your keys; the beacon sensor on the key chain makes them locatable to within two inches. It will be as if the Jetsons era has finally arrived.

While the hype-o-meter on the IoT has been blinking red for some time, everything described above is already technically feasible. To be certain, there will be obstacles, in particular in relation to a lack of common technical standards, but a wide variety of companies, consortia and government agencies are hard at work to make the IoT a reality. The result will be our transition from connectivity to hyper-connectivity, and like all things Moore’s law related, it will be here sooner than we realise.

The IoT means that all physical objects in the future will be assigned an IP address and be transformed into information technologies. As a result, your lamp, cat or pot plant will be part of an IT network. Things that were previously silent will now have a voice, and every object will be able to tell its own story and history. The refrigerator will know exactly when it was manufactured, the names of the people who built it, what factory it came from, and the day it left the assembly line, arrived at the retailer, and joined your home network. It will keep track of every time its door has been opened and which one of your kids forgot to close it. When the refrigerator’s motor begins to fail, it can signal for help, and when it finally dies, it will tell us how to disassemble its parts and best recycle them. Buildings will know every person who has ever worked there, and streetlights every car that has ever driven by.

All of these objects will communicate with each other and have access to the massive processing and storage power of the cloud, further enhanced by additional mobile and social networks. In the future all objects may become smart, in fact much smarter than they are today, and as these devices become networked, they will develop their own limited form of sentience, resulting in a world in which people, data and things come together. As a consequence of the power of embedded computing, we will see billions of smart, connected things joining a global neural network in the cloud.

In this world, the unknowable suddenly becomes knowable. For example, groceries will be tracked from field to table, and restaurants will keep tabs on every plate, what’s on it, who ate from it, and how quickly the waiters are moving it from kitchen to customer. As a result, when the next E coli outbreak occurs, we won’t have to close 500 eateries and wonder if it was the chicken or beef that caused the problem. We will know exactly which restaurant, supplier and diner to contact to quickly resolve the problem. The IoT and its billions of sensors will create an ambient intelligence network that thinks, senses and feels and contributes profoundly to the knowable universe.

Things that used to make sense suddenly won’t, such as smoke detectors. Why do most smoke detectors do nothing more than make loud beeps if your life is in mortal danger because of fire? In the future, they will flash your bedroom lights to wake you, turn on your home stereo, play an MP3 audio file that loudly warns, “Fire, fire, fire.” They will also contact the fire department, call your neighbours (in case you are unconscious and in need of help), and automatically shut off flow to the gas appliances in the house.

The byproduct of the IoT will be a living, breathing, global information grid, and technology will come alive in ways we’ve never seen before, except in science fiction movies. As we venture down the path toward ubiquitous computing, the results and implications of the phenomenon are likely to be mind-blowing. Just as the introduction of electricity was astonishing in its day, it eventually faded into the background, becoming an imperceptible, omnipresent medium in constant interaction with the physical world. Before we let this happen, and for all the promise of the IoT, we must ask critically important questions about this brave new world. For just as electricity can shock and kill, so too can billions of connected things networked online.

One of the central premises of the IoT is that everyday objects will have the capacity to speak to us and to each other. This relies on a series of competing communications technologies and protocols, many of which are eminently hackable. Take radio-frequency identification (RFID) technology, considered by many the gateway to the IoT. Even if you are unfamiliar with the name, chances are you have already encountered it in your life, whether it’s the security ID card you use to swipe your way into your office, your “wave and pay” credit card, the key to your hotel room, your Oyster card.

Even if you don’t use an RFID card for work, there’s a good chance you either have it or will soon have it embedded in the credit card sitting in your wallet. Hackers have been able to break into these as well, using cheap RFID readers available on eBay for just $50, tools that allow an attacker to wirelessly capture a target’s credit card number, expiration date and security code. Welcome to pocket picking 2.0.

More productive and more prison-like

A much rarer breed of hacker targets the physical elements that make up a computer system, including the microchips, electronics, controllers, memory, circuits, components, transistors and sensors – core elements of the internet of things. These hackers attack a device’s firmware, the set of computer instructions present on every electronic device we encounter, including TVs, mobile phones, game consoles, digital cameras, network routers, alarm systems, CCTVs, USB drives, traffic lights, gas station pumps and smart home management systems. Before we add billions of hackable things and communicate with hackable data transmission protocols, important questions must be asked about the risks for the future of security, crime, terrorism, warfare and privacy.

In the same way our every move online can be tracked, recorded, sold and monetised today, so too will that be possible in the near future in the physical world. Real space will become just like cyberspace. With the widespread adoption of more networked devices, what people do in their homes, cars, workplaces, schools and communities will be subjected to increased monitoring and analysis by the corporations making these devices. Of course these data will be resold to advertisers, data brokers and governments, providing an unprecedented view into our daily lives. Unfortunately, just like our social, mobile, locational and financial information, our IoT data will leak, providing further profound capabilities to stalkers and other miscreants interested in persistently tracking us. While it would certainly be possible to establish regulations and build privacy protocols to protect consumers from such activities, the greater likelihood is that every IoT-enabled device, whether an iron, vacuum, refrigerator, thermostat or lightbulb, will come with terms of service that grant manufacturers access to all your data. More troublingly, while it may be theoretically possible to log off in cyberspace, in your well-connected smart home there will be no “opt-out” provision.

We may find ourselves interacting with thousands of little objects around us on a daily basis, each collecting seemingly innocuous bits of data 24/7, information these things will report to the cloud, where it will be processed, correlated, and reviewed. Your smart watch will reveal your lack of exercise to your health insurance company, your car will tell your insurer of your frequent speeding, and your dustbin will tell your local council that you are not following local recycling regulations. This is the “internet of stool pigeons”, and though it may sound far-fetched, it’s already happening. Progressive, one of the largest US auto insurance companies, offers discounted personalised rates based on your driving habits. “The better you drive, the more you can save,” according to its advertising. All drivers need to do to receive the lower pricing is agree to the installation of Progressive’s Snapshot black-box technology in their cars and to having their braking, acceleration and mileage persistently tracked.

The IoT will also provide vast new options for advertisers to reach out and touch you on every one of your new smart connected devices. Every time you go to your refrigerator to get ice, you will be presented with ads for products based on the food your refrigerator knows you’re most likely to buy. Screens too will be ubiquitous, and marketers are already planning for the bounty of advertising opportunities. In late 2013, Google sent a letter to the Securities and Exchange Commission noting, “we and other companies could [soon] be serving ads and other content on refrigerators, car dashboards, thermostats, glasses and watches, to name just a few possibilities.”

Knowing that Google can already read your Gmail, record your every web search, and track your physical location on your Android mobile phone, what new powerful insights into your personal life will the company develop when its entertainment system is in your car, its thermostat regulates the temperature in your home, and its smart watch monitors your physical activity?

Not only will RFID and other IoT communications technologies track inanimate objects, they will be used for tracking living things as well. The British government has considered implanting RFID chips directly under the skin of prisoners, as is common practice with dogs. School officials across the US have begun embedding RFID chips in student identity cards, which pupils are required to wear at all times. In Contra Costa County, California, preschoolers are now required to wear basketball-style jerseys with electronic tracking devices built in that allow teachers and administrators to know exactly where each student is. According to school district officials, the RFID system saves “3,000 labour hours a year in tracking and processing students”.

Meanwhile, the ability to track employees, how much time they take for lunch, the length of their toilet breaks and the number of widgets they produce will become easy. Moreover, even things such as words typed per minute, eye movements, total calls answered, respiration, time away from desk and attention to detail will be recorded. The result will be a modern workplace that is simultaneously more productive and more prison-like.

At the scene of a suspected crime, police will be able to interrogate the refrigerator and ask the equivalent of, “Hey, buddy, did you see anything?” Child social workers will know there haven’t been any milk or nappies in the home, and the only thing stored in the fridge has been beer for the past week. The IoT also opens up the world for “perfect enforcement”. When sensors are everywhere and all data is tracked and recorded, it becomes more likely that you will receive a moving violation for going 26 miles per hour in a 25-mile-per-hour zone and get a parking ticket for being 17 seconds over on your meter.

The former CIA director David Petraeus has noted that the IoT will be “transformational for clandestine tradecraft”. While the old model of corporate and government espionage might have involved hiding a bug under the table, tomorrow the very same information might be obtained by intercepting in real time the data sent from your Wi-Fi lightbulb to the lighting app on your smart phone. Thus the devices you thought were working for you may in fact be on somebody else’s payroll, particularly that of Crime, Inc.

A network of unintended consequences

For all the untold benefits of the IoT, its potential downsides are colossal. Adding 50bn new objects to the global information grid by 2020 means that each of these devices, for good or ill, will be able to potentially interact with the other 50bn connected objects on earth. The result will be 2.5 sextillion potential networked object-to-object interactions – a network so vast and complex it can scarcely be understood or modelled. The IoT will be a global network of unintended consequences and black swan events, ones that will do things nobody ever planned. In this world, it is impossible to know the consequences of connecting your home’s networked blender to the same information grid as an ambulance in Tokyo, a bridge in Sydney, or a Detroit auto manufacturer’s production line.
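The 2.5 sextillion figure, incidentally, is simply the device count squared: 50bn objects, each in principle able to address any of the 50bn (counting ordered pairs), gives 2.5 x 10^21 potential interactions. A two-line sanity check:

```python
devices = 50_000_000_000      # Cisco's estimate of connected things by 2020
interactions = devices ** 2   # ordered object-to-object pairings
print(f"{interactions:.1e}")  # 2.5e+21, i.e. 2.5 sextillion
```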

The vast levels of cyber crime we currently face make it abundantly clear we cannot even adequately protect the standard desktops and laptops we presently have online, let alone the hundreds of millions of mobile phones and tablets we are adding annually. In what vision of the future, then, is it conceivable that we will be able to protect the next 50bn things, from pets to pacemakers to self-driving cars? The obvious reality is that we cannot.

Our technological threat surface area is growing exponentially and we have no idea how to defend it effectively. The internet of things will become nothing more than the Internet of things to be hacked.

Read the entire article here.

Image courtesy of Google Search.

Luck


Some think they have it constantly at their side, like a well-trained puppy. Others crave and seek it. And yet others believe they have been shunned by it. Some put their love lives down to it, and many believe it has had a hand in guiding their careers, friendships, and finances. Of course, many know that it — luck — plays a crucial part in their fortunes at the poker table, roulette wheel or at the races. So what really is luck? Does it stem from within or does it envelop us like a benevolent (mostly) aether? And more importantly, how can more of us find some and tune it to our purposes?

Carlin Flora over at Aeon presents an insightful analysis, with some rather simple answers. Oh, and you may wish to give away that rabbit’s foot.

From Aeon:

In 1992, Archie Karas, then a waiter, headed out to Las Vegas. By 1995, he had turned $50 into $40 million, in what has become known as the biggest winning streak in gambling history. Most of us would call it an instance of great luck, or we might say of Archie himself: ‘What a lucky guy!’ The cold-hearted statistician would laugh at our superstitious notions, and instead describe a series of chance processes that happened to work out for Karas. In the larger landscape where randomness reigns, anything can happen at any given casino. Calling its beneficiaries lucky is simply sticking a label on it after the fact.

To investigate luck is to take on one of the grandest of all questions: how can we explain what happens to us, and whether we will be winners, losers or somewhere in the middle at love, work, sports, gambling and life overall? As it turns out, new findings suggest that luck is not a phenomenon that appears exclusively in hindsight, like a hail storm on your wedding day. Nor is it an expression of our desire to see patterns where none exist, like a conviction that your yellow sweater is lucky. The concept of luck is not a myth.

Instead, the studies show, luck can be powered by past good or bad luck, personality and, in a meta-twist, even our own ideas and beliefs about luck itself. Lucky streaks are real, but they are the product of more than just blind fate. Our ideas about luck influence the way we behave in risky situations. We really can make our own luck, though we don’t like to think of ourselves as lucky – a descriptor that undermines other qualities, like talent and skill. Luck can be a force, but it’s one we interact with, shape and cultivate. Luck helps determine our fate here on Earth, even if you think its ultimate cause divine.

Luck is perspective and point of view: if a secular man happened to survive because he took a meeting outside his office at the World Trade Center on the morning of 11 September 2001, he might simply acknowledge random chance in life without assigning a deeper meaning. A Hindu might conclude he had good karma. A Christian might say God was watching out for him so that he could fulfil a special destiny in His service. The mystic could insist he was born under lucky stars, as others are born with green eyes.

Traditionally, the Chinese think luck is an inner trait, like intelligence or upbeat mood, notes Maia Young, a management expert at the University of California, Los Angeles. ‘My mom always used to tell me, “You have a lucky nose”, because its particular shape was a lucky one, according to Chinese lore.’ Growing up in the American Midwest, it dawned on Young that the fleeting luck that Americans often talked about – a luck that seemed to visit the same person at certain times (‘I got lucky on that test!’) but not others (‘I got caught in traffic before my interview!’) – was not equivalent to the unchanging, stable luck her mother saw in her daughter, her nose being an advertisement of its existence within.

‘It’s something that I have that’s a possession of mine, that can be more relied upon than just dumb luck,’ says Young. The distinction stuck with her. You might think someone with a lucky nose wouldn’t roll up their sleeves to work hard – why bother? – but here’s another cultural difference in perceptions of luck. ‘In Chinese culture,’ she says, ‘hard work can go hand-in-hand with being lucky. The belief system accommodates both.’

On the other hand, because Westerners see effort and good fortune as taking up opposite corners of the ring, they are ambivalent about luck. They might pray for it and sincerely wish others they care about ‘Good luck!’ but sometimes they just don’t want to think of themselves as lucky. They’d rather be deserving. The fact that they live in a society that is neither random nor wholly meritocratic makes for an even messier slamdance between ‘hard work’ and ‘luck’. Case in point: when a friend gets into a top law or medical school, we might say: ‘Congratulations! You’ve persevered. You deserve it.’ Were she not to get in, we would say: ‘Acceptance is arbitrary. Everyone’s qualified these days – it’s the luck of the draw.’

Read the entire article here.

Image: Four-leaf clover. Some consider it a sign of good luck. Courtesy of Phyzome.

Nuisance Flooding = Sea-Level Rise

Government officials in Florida are barred from using the terms “climate change”, “global warming”, “sustainable” and other related language. Apparently, they’ll have to use the euphemism “nuisance flooding” in place of “sea-level rise”. One wonders what literary trick they’ll conjure up next time the state gets hit by a hurricane — “Oh, that? Just a ‘mischievous little breeze’. I’m not a scientist, you know.”

From the Guardian:

Officials with the Florida Department of Environmental Protection (DEP), the agency in charge of setting conservation policy and enforcing environmental laws in the state, issued directives in 2011 barring thousands of employees from using the phrases “climate change” and “global warming”, according to a bombshell report by the Florida Center for Investigative Reporting (FCIR).

The report ties the alleged policy, which is described as “unwritten”, to the election of Republican governor Rick Scott and his appointment of a new department director that year. Scott, who was re-elected last November, has declined to say whether he believes in climate change caused by human activity.

“I’m not a scientist,” he said in one appearance last May.

Scott’s office did not return a call Sunday from the Guardian, seeking comment. A spokesperson for the governor told the FCIR team: “There’s no policy on this.”

The FCIR report was based on statements by multiple named former employees who worked in different DEP offices around Florida. The instruction not to refer to “climate change” came from agency supervisors as well as lawyers, according to the report.

“We were told not to use the terms ‘climate change’, ‘global warming’ or ‘sustainability’,” the report quotes Christopher Byrd, who was an attorney with the DEP’s Office of General Counsel in Tallahassee from 2008 to 2013, as saying. “That message was communicated to me and my colleagues by our superiors in the Office of General Counsel.”

“We were instructed by our regional administrator that we were no longer allowed to use the terms ‘global warming’ or ‘climate change’ or even ‘sea-level rise’,” said a second former DEP employee, Kristina Trotta. “Sea-level rise was to be referred to as ‘nuisance flooding’.”

According to the employees’ accounts, the ban left damaging holes in everything from educational material published by the agency to training programs to annual reports on the environment that could be used to set energy and business policy.

The 2014 national climate assessment for the US found an “imminent threat of increased inland flooding” in Florida due to climate change and called the state “uniquely vulnerable to sea level rise”.

Read the entire story here.

Image: Hurricane Floyd 1999, a “mischievous little breeze”. Courtesy of NASA.

The Power of Mediocrity

Over-achievers may well frown upon the slacking, mediocre souls who strive to do less. But mediocrity has a way of pervading the lives of the constantly striving, 18-hour-a-day multi-taskers as well. The figure of speech “jack of all trades, master of none” sums up the inevitability of mediocrity for those who strive to do everything but do nothing well. In fact, the pursuit of the mediocre may well be an immutable universal law — both for under-achievers and over-achievers, and for that vast, second-rate, mediocre middle ground of averageness.

From the Guardian:

In the early years of the last century, Spanish philosopher José Ortega y Gasset proposed a solution to society’s ills that still strikes me as ingenious, in a deranged way. He argued that all public sector workers from the top down (though, come to think of it, why not everyone else, too?) should be demoted to the level beneath their current job. His reasoning foreshadowed the Peter Principle: in hierarchies, people “rise to their level of incompetence”. Do your job well, and you’re rewarded with promotion, until you reach a job you’re less good at, where you remain.

In a recent book, The Hard Thing About Hard Things, the tech investor Ben Horowitz adds a twist: “The Law of Crappy People”. As soon as someone on a given rung at a company gets as good as the worst person the next rung up, he or she may expect a promotion. Yet, if it’s granted, the firm’s talent levels will gradually slide downhill. No one person need be peculiarly crappy for this to occur; bureaucracies just tend to be crappier than the sum of their parts.
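To see why that promotion rule drags a level downhill, here is a toy back-of-the-envelope simulation (mine, not Horowitz’s): a single weak incumbent sets the bar, everyone in the candidate pool who clears it gets the expected promotion, and the level’s average talent slides toward the midpoint between the bar and the ceiling.

```python
import random
from statistics import mean

rng = random.Random(1)

# Toy numbers: talent scores in [0, 1]; 0.50 is the one "crappy" incumbent.
senior = [0.95, 0.90, 0.88, 0.85, 0.50]
candidates = [rng.random() for _ in range(200)]

bar = min(senior)                                # "as good as the worst person the next rung up"
promoted = [t for t in candidates if t >= bar]   # everyone who clears the bar expects, and gets, the move

print(f"promotion bar set by weakest senior: {bar:.2f}")
print(f"senior average before: {mean(senior):.2f}")
print(f"senior average after:  {mean(senior + promoted):.2f}")
# The average drifts toward (bar + 1) / 2, roughly 0.75 here: the weakest
# incumbent, not the strongest, ends up defining the level's future quality.
```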

Yet it’s wrong to think of these pitfalls as restricted to organisations. There’s a case to be made that the gravitational pull of the mediocre affects all life – as John Stuart Mill put it, that “the general tendency of things throughout the world is to render mediocrity the ascendant power among mankind”. True, it’s most obvious in the workplace (hence the observation that “a meeting moves at the pace of the slowest mind in the room”), but the broader point is that in any domain – work, love, friendship, health – crappy solutions crowd out good ones time after time, so long as they’re not so bad as to destroy the system. People and organisations hit plateaux not because they couldn’t do better, but because a plateau is a tolerable, even comfortable place. Even evolution – life itself! – is all about mediocrity. “Survival of the fittest” isn’t a progression towards greatness; it just means the survival of the sufficiently non-terrible.

And mediocrity is cunning: it can disguise itself as achievement. The cliche of a “mediocre” worker is a Dilbert-esque manager with little to do. But as Greg McKeown notes, in his book Essentialism: The Disciplined Pursuit Of Less, the busyness of the go-getter can lead to mediocrity, too. Throw yourself at every opportunity and you’ll end up doing unimportant stuff – and badly. You can’t fight this with motivational tricks or cheesy mission statements: you need a discipline, a rule you apply daily, to counter the pull of the sub-par. For a company, that might mean stricter, more objective promotion policies. For the over-busy person, there’s McKeown’s “90% Rule” – when considering an option, ask: does it score at least 9/10 on some relevant criterion? If not, say no. (Ideally, that criterion is: “Is this fulfilling?”, but the rule still works if it’s “Does this pay the bills?”).

Read the entire story here.